Marjorie: All right, everyone.

Jay Wright: Before I jump into anything here, I will introduce the panelists. Let's just go to this one here and jump to it. But I think they're all on camera. So Thoibi Rublaitus or Thoibi, I don't see you on my screen, but I'm sure you're here.

Thoibi Rublaitus: I'm here. I'm here. Good afternoon, everybody.

Jay Wright: OK, and Ryan Burke.

Ryan Burke: Good afternoon, everybody. Welcome to the summit.

Jay Wright: All right. John Russell.

John Russell: I'm here, Jay. Thanks for having me. Glad everyone's here. Hope you enjoy this.

Jay Wright: And last but not least, Kathleen Porter.

Kathleen Porter: Good afternoon, everyone. Good to see you all.

Jay Wright: OK. Well, as you can see by your cameras, we've got a superstar panel here. My starter will be, hey, I got a lot of PowerPoints here in the last hour or two. That sounds like, oh my gosh, oh my gosh, but no, it was the opposite. Everybody was kind of spicing it up a little bit this time. So I dare say I'm more jazzed now than I was a couple hours ago, looking at what some of you have got cooking.

So we have some great panelists. I'm going to kind of be-- I was thinking about, hey, I see Francisco on my screen. And that little pumpkin anecdote sounded straight from a Charlie Brown special. So to stick with the Charlie Brown special, I'm going to be like Lucy with the football and pull it away and start talking myself for a few minutes before letting you dig into the superstar panelists.

The reason why is I just wanted to talk a little bit about local performance goals. That is the title that we gave this workshop panel. We called it CAEP local performance goals. I've got to say, most of you have been to these workshops, a lot of you many, many times, so this is really boring review, but not necessarily for all of you. So I always feel like it's good to get everybody's foot on the rail and at least know the basic concept that we're talking about. The four panelists, of course, have gone through this process and are working on great solutions, having already done this analysis.

But when we talk about the local performance goals concept, there are several workshops we've done over the last few years that speak to local performance goals. When we speak to it, a lot of times it's called, quote unquote, "NRS" performance goals. That's because we have the WIOA II NRS data on the CASAS Data Portal. When we do workshops, the data portal typically is a great way to be able to show the concept. But overall, you're using some combination of that data portal as well as all of the accountability reports you know and love in TE, different ways to determine goals based on your strengths, based on your areas of improvement, and so on.

Just in case you're interested, here's kind of a directions page to the data portal. Here's the link at the top. Here's the screen with the different ways to access the different reports. We've got WIOA II California data going all the way back to, I believe, the '04-'05 year. You can access NRS Table 4 and Table 4B, the persister table. We have follow up data. We have lots of really good stuff there if you're interested.

Overall, there is no one way or the highway. But in general, we say use the data portal at first. That's a good way to identify strengths and weaknesses. Once you kind of have a handle on where you're strong and where you're not, then go into TE and use the reports there so you can take advantage and drill down, isolating your data in other ways and so on.

Here's a screenshot of the data portal. This is solely just to make sure everybody kind of understands the concept. A couple of years old, but still the same process. What we're showing is we've got the state goals and averages. You can generate your own agency's data. If you're WIOA II funded, take a look at your percentages across the six levels of ABE, six levels of ESL, compare and contrast. Some areas you're going to be doing really well. Other areas, not so much. You're just looking for areas of strength as well as areas of improvement.

So here's just a couple of examples to illustrate the concept. That red arrow is ASE low. We're just showing that as an example where we're well below average. You can see that 22.6 is below the state goal and below the state average. That might be an area we want to consider targeting, because we need to improve that area. Over there, the green arrow shows a lower level of ESL where we're doing well. You can see there we're above the goal and above the state average. So that's an area we can probably consider a strength.

This is just showing-- this is looking at another table. This is persistence. So again, just to make sure everybody's foot on the rail, what we've been looking at through these first couple of slides are NRS table 4, which shows pre post test performance. That is the percentage of students in each level that make a gain between the pretest and post test. What we've been saying in these workshops for several years now is a real good way to start trying to get at what your areas of strength or weaknesses might be and what the first step toward developing a solution might be is to compare that performance with your agency's persistence.

In federal reporting land, persistence relates to the percentage of qualified students that stick around long enough or quote unquote "persist" long enough to at least complete a pretest and a post test. So what we're doing in this slide is having already identified a couple of good areas and a couple not so good areas in our performance. We're now comparing those areas of low performance to the persistence rates.

So here looking at it, we have one of each to the left. And that red arrow you might say kind of confirms our suspicion. You can see our performance in ABE intermediate high is not very good. Our persistence rate is also not very good. So it suggests that low persistence rate most likely is a really big reason why we're achieving low performance.

On the other hand, that green arrow is another area that we've firmly established as an area of low performance. But you can see that persistence rate is OK or pretty gosh darn good. What that's telling us is, yeah, we're still low in this area, but it's not because we're not dutifully pre and post testing everybody. We're doing well.

So it typically suggests there are other reasons for low performance. Those other reasons could be a long list. But in general, if the persistence is low, that means clean up our pre/post testing pairs, clean up our data. If the persistence is OK or pretty good, then that means we more likely need to look in the classroom and improve things there.

This is just showing another example of some areas that we might want to improve. What I'll point out with the data portal feature is that number in parentheses shows the number of students that qualified for each level. If you look at these numbers, obviously we're looking at a large agency here. But just pointing out that's another big issue to look at when we're identifying strengths and weaknesses.

Again, we're consolidating about an hour's worth of information in five minutes here. But to tie it neatly in a bow, comparing our performance to our pre/post test persistence allows everything to fall nice, neat, and tidy into this little 2 by 2 grid. We might be doing great in both, that's the upper right, or poorly in both, that's the lower left, or we could be mixed and matched.

In general, we're focusing on those two squares to the left. That is, if persistence and performance are both low, that's an indicator that we need to focus on fundamentals, generally speaking: clean up our data, get more students pre and post testing. If we fall in that upper left, that is, our performance is low but our persistence is pretty good, then that suggests we need to improve, but getting more students tested isn't going to really help that much, because we're already doing well in that area. So that suggests we need to look at the classroom and improve performance from that point of view.
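To make that 2 by 2 grid concrete, here is a minimal sketch of the decision logic, using hypothetical thresholds and level numbers rather than actual CASAS data portal output or any official CAEP rule:

```python
# Hypothetical sketch: classify each NRS level into the 2x2 persistence/performance grid.
def broad_strategy(performance_pct, persistence_pct, state_goal_pct, persistence_ok_pct=70.0):
    """Return the broad strategy suggested by the quadrant a level falls into."""
    low_performance = performance_pct < state_goal_pct
    low_persistence = persistence_pct < persistence_ok_pct
    if low_performance and low_persistence:
        return "lower left: focus on fundamentals -- clean up data, get more students pre/post tested"
    if low_performance:
        return "upper left: persistence is fine -- look at the classroom to improve performance"
    if low_persistence:
        return "lower right: performance OK, but shore up pre/post testing"
    return "upper right: area of strength"

# Made-up agency numbers loosely echoing the earlier examples
levels = {
    "ASE Low":           {"performance": 22.6, "persistence": 48.0, "state_goal": 38.0},
    "ESL Beginning Low": {"performance": 52.0, "persistence": 81.0, "state_goal": 45.0},
}
for name, d in levels.items():
    print(f"{name}: {broad_strategy(d['performance'], d['persistence'], d['state_goal'])}")
```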

In more recent workshops, and this is really, I think, getting at what our four superstars are doing, we've been talking a little bit more succinctly about: OK, great, we've evaluated performance and persistence, and we've identified a broad strategy to move forward. Now that we've identified it, what do we do now? So we've had some workshops over the last year or so that have kind of taken it to the next step and looked at what I just call quantitative and qualitative issues.

From the quantitative point of view, that's looking at reports in TE, using what I call hotspots. That is isolating the data to find pockets of especially strong or especially weak performance. Isolating those areas and figuring out exactly where those hotspots need to be improved. I'll point out another cliche I've been using is called diamonds in the rough. That's a hotspot that's positive where you might have some really needy areas, but you have that one random class or program that's doing well.

But I've been saying Sesame Street: which one of these is not like the others? A lot of times, finding what looks different from similar programs or classes or students holds the key to where you might need to improve or where you might want to focus your efforts.

On the qualitative side, we've been talking about getting more feedback from your staff and students. Making sure you have clear channels of communication so you can do exactly that. Find out what your staff are saying. Find out what your students are saying. Bring them into your data evaluation process. And then develop that culture of data at your agency, where staff and students alike are aware of it and know what you all need to do to move forward.

So I'm going to get out of my PowerPoint. These are the areas that the panelists are going to talk about. That first bullet is what I just did. We'll have a couple that talk about student focus, some that get into regional collaboration, and then Kathleen will go fourth, and that will be that final bullet about building the culture of data.

Let me go ahead and stop sharing. I really want to stop talking and have our panelists begin. So this is the one I think that we want here for Thoibi. So I think I have-- yep, Corona-Norco. So here's Thoibi. She's first up. We'll blow up the PowerPoint. And I'll just say, take it away, Thoibi.

Thoibi Rublaitus: Thank you, Jay. Thank you for allowing me to be part of your panel. And I have done this session a few times last year and the year before, but I've spruced it up just a little bit, like Jay said. And for those of you who've been in my sessions before, I'm sorry for the repetition, but I think we all learn something every day. So here we go.

So we are adult education. We're all adult educators, and we know the value of andragogy. So when we talk about data and collecting data, we need students to participate in the process of collecting data, which means not only our pre and post tests but also agreeing to have us call them back and give us their progress.

And to do all of this, how many of you will agree with me that people hate to test? Students don't like to test. All of us. We don't like tests, whether it is an academic test or going to the doctor to get our yearly numbers. So just to allay that fear of tests a little bit, it makes sense to use andragogy to understand why we do what we do, for our students to know why we test them and how the test will affect their lives, and to help them direct their own learning, take responsibility for their own learning, use their prior experience, and understand what they come in with, so that we can help them better and so that they can take agency as well.

Efficacy of learning is a huge deal nowadays. As we know, most of our students come back to us after not being a student for many years. So it's challenging for them to come back and just be engaged and be good students. So engaging them, or helping them to understand why we do what we do, really helps them and motivates them.

And also it gives us some way of orienting them to the learning and solving problems with us as they go along. And so that's why I truly believe in, and this is something we practice at Corona-Norco, the learner centered approach to all the things we do. Not just the academic teaching, but also our data collection and our goal setting. Next slide, please, Jay.

So a while ago, we started this whole concept of paying it forward. And it also came from a little bit of the experience with which I came to adult education as an adult educator: when students understand why or how they are able to get free classes, they are more prone to, or interested in, partnering with us. So we share with them that we get federal and state funds.

And so why do we get free classes? So that our lives are enhanced. Not just the students on our campuses, but also beyond. When they earn more money, they pay more taxes to the government. And that also gives them a little smile, like I'm smiling right now, so that then they become more involved in their test processes.

We also share with them that testing allows us to have that accountability. They also get their own accountability. They came to us with a purpose, with a goal, and they have an accountability to us to complete their work. And so to show their growth, they need that accountability, that partnership of learning and growing. So the tests are not a reflection on them at all, or on their learning. It's more to help our teachers understand what to teach them or how to teach them. What is the area they need more support in? And the gains that they make also bring funds to us as a school, which helps us to support more students in the future.

So we also share with our students that the work they do this year is going to help people two years later. This is something I used to do in the past, just sharing with my students-- when I was a teacher, that is. I haven't told you my history. I've been in adult ed for 15 years, 10 years of which as an ESL teacher. So I shared this with my students. And then over time, we shared it with the ESL program, and now we're sharing it with ABE and CTE as well.

And then what makes me really happy is that just last week, our paraeducator came and told me she's going and talking to our students in Spanish and telling them the same thing. But she does it in her own way. And the concept is taking root on our campus. And that's what we call paying it forward. When we do something, we are helping people two years in the future. Next slide, please, Jay. Thank you.

So it's basically a shift of culture. And a lot of times I've been asked how long does it take? Yes, it took a long time. I told you I've been in adult ed as an adult ed educator for 15 years.

So it started in my third or fourth year at Corona-Norco, and now it's kind of a practice in our school. Our school orientation packet has a small little clip of me talking about it. But we are also seeing that other staff members are doing the same. The teachers are doing the same, and the change is coming from within, slowly and gently. And so our students are our learning partners.

And I don't know if you've seen this report. CASAS makes it easy for us. And they have this brand new report that started just last year. We now provide each student a personal score report after our pretest. So immediately students know where they stand. And then if you go to the next slide, please Jay.

So in our student handbook, we have a page that is dedicated completely to tracking their CASAS score and setting their goals. So the students get to write them. This is for an ABE / ASE class. Students get to write down what their score is at the pretest, both in reading as well as math, and then what their goal is for the next test. So this is a practice that's now quite prevalent. Next slide, please Jay.

So we use some learner centered tools. And this number one tool is a student performance report that we print out and give to all our teachers and students. This is the one that goes to the student. The student gets to see their performance, how much they scored, and which test they took. And so they get to write that down in the previous sheet that I showed you. And also this has the content standards.

So they get to see, for this example, the student could see that fourth or fifth row from the top, the 0%. That's the area they need to focus on first. And it is also an important area to focus on because there are three items from that content standard. So for a teacher to have reports like this really helps them to see what major areas their students are struggling with. And those are the content standards they can teach and pay more attention to. Next slide, please.

The second tool is something we've used in the past. We have revised it to the first page that I showed you already. Again, this is another score sheet that the student gets to see. They get yeses and no's. In the past, I used to also share that lower level students understand yeses and no's even if they don't know how to read the competency descriptions. But the first slide that I showed you today is a modern, newer version where it's much easier for lower level students to see. Jay, next slide, please.

So this is a sample of what we had started long ago. And we're tweaking it a little bit now. But this gives you an idea of how students can record their own progress: the first test, CASAS test number one; what areas they need to improve on; CASAS test number two. And at the bottom, what's most important is that if a student has reached their level of completion for a year, they don't have to be tested again for that year at least. Or if they want to be promoted, we can promote them to the next level. So students get agency over their own learning.

And that's the end of my part. Jay, one more slide just to show that all of this is about students being empowered, engaged, and self directed learners who reach not only the school's goals but also the students' goals. Thank you.

Jay Wright: All right. Well, thank you. Let's go ahead and transition from Thoibi to Ryan. I'm going ahead and getting his PowerPoint up. I'll try to make this as quick and seamless as possible. I believe this is his cover slide. So hey, take it away, Ryan.

Ryan Burke: Yeah, that one's on me, actually. Mine says Sweetwater on it. It's a yellow one.

Jay Wright: Makes it easy.

Ryan Burke: Although I can just get out of line and get back in and let somebody else go if we want.

Jay Wright: No, you're good. You're good.

John Russell: Nice try, Ryan.

[laughs]

Ryan Burke: OK. Well, I'll start with the introduction while Jay is taking care of that. My name is--

John Russell: Come on. I'm rooting for you. You got this, Jay.

Ryan Burke: Do my best. My name is Ryan Burke. I am the Director of Adult Ed for Sweetwater Union High School District.

We have approximately-- well, at our biggest point recently we were around 11,000 students. Back in the old days, we were up above 20,000. But more recently it was around 11,000, then we dropped down into the sixes, and now we're bouncing back, hoping to get closer to that 10,000 mark this year as we bounce back from the pandemic. Sweetwater--

Jay Wright: Euleen asks, where is Sweetwater?

Ryan Burke: Yeah. We're located on the border right South of San Diego. So we're right there on the border. So there's a lot of community need here, and we're happy to fill it. We have a large program which does afford us some opportunity to do some things that not all providers may be able to do, because we do have more funding to work with due to scale.

But I hope to share with you about what we do. And then hopefully you are able to connect with us and we can learn from you and you can pull from us. We're happy to share all the time. So any luck on finding me, Jay?

Jay Wright: Are you not seeing it? Your second slide is now on my screen. It's showing.

John Russell: No, you're still showing Thoibi's.

Jay Wright: All right. Well, sorry, my error then. Let me start then. I thought I had you on here. Sorry about that. I didn't realize you were hemming and hawing because of me. So here, is this better?

Ryan Burke: Buying time, dude.

Jay Wright: Here I was thinking I was all set and I thought I had you ready and I didn't. Sorry about that.

Ryan Burke: We're good, we're good. OK, so a big part of what we're trying to do is make sure that we start off on the right foot. I mean, we're talking about student services, student supports. The one thing we want to do is make sure we're starting off on the right foot.

Things that we do to make that happen, we test before anybody learns anything. So that's part of the testing system. It's a pre post test. Sweetwater used to get up and start teaching and teaching and then try to fill in the tests whenever we could. Now we take a week at the-- or even a week and a half sometimes and take a deep breath at the beginning of the year, make sure everybody's tested before we start teaching, and take care of all the housekeeping that way.

Another thing is getting people into the program they want at the level they need. And in our institution, that also means making sure they're in the pathway that they need. And we'll talk about ESL pathways in a little bit. But other things that we've done is our oral exam process. We added that to make sure that we're getting-- a lot of times for students, the reading and writing happens more naturally. It happens first, the reading especially, because it's controlled pace. You can see the word. You can really think about it. It's not coming at you 100 miles an hour the way the spoken word does.

So a lot of times, students pick that up first. But what that does in a testing environment is it can give you some false scores, as many of you have experienced, some false highs. And really it's not totally false. It's showing you how well they read. But that's not always an indicator that students are at that level in all the modalities, in the ability to speak and write in particular. Writing is a big one.

The last thing we talk about for getting students off on the right foot is we set student goals right away. I know we just talked about that. We try to make sure that our student goals are student driven. Your goal can be, I want to get an apartment. We don't want to control other people's goals.

We don't want to give them a menu and say, hey, here's the list of goals that you're allowed to have. So we want authentic goals. And as I mentioned there, no offense, Jay, but sorry to my CASAS overlords on that. We like to have the goal be authentic, and then we try to fit it into what CASAS says that goal is. Next slide, please.

The other thing that we also started doing is-- that's our goal: 100% of our students are going to be post tested, or we know why they are not. So either they've been post tested or we know what happened. And that's a tall order. I mean, outside of the system, anybody not in adult ed would be like, yeah, sure, that's easy. You just want to make sure you look at all your test scores and know the students that didn't test and why. That sounds pretty easy.

But for those of you who are in adult ed, all of us, I guess, it's hard. It's hard to keep up. They disappear. They come in and out of the program. They drop out for four weeks and then they resurface, and the testing happened while they were out. And now we're not on that push right now, because we're into civics or something else, some other test.

So we keep track of it. Our testing technicians are using that-- see that little spreadsheet there? That's just a snippet of a spreadsheet that we're using. That's kept in real time for every class, every student. It has student name and more information to the left that we've cut off for privacy. It does have EL civics and other tests on the right also.

So it keeps going to the right. But the purpose of that is that we always have a real time quick look. We don't have to pull a report. We don't have to do anything. Just look at this document that our teachers are accessing, our testing technicians are accessing, our admin, and our office staff. Everybody has access and uses this regularly. So it's a great way to have real time data.

The other thing that we do is when students do disappear and we don't have an answer as to what happened, we just got an empty cell here, we don't leave it empty. We try to get an answer. So our community relations facilitators are calling students and they're asking them, hey, what happened? How come you're not still coming to the adult school?

And then they're listening for the response in an effort to steer them and say, oh, well maybe you weren't getting what you needed. It wasn't hard enough. You already knew all that. Well, maybe you need to go to a different level. Or maybe you're in the ESL community pathway, but what you really want is to find a job. So maybe let's talk about getting into the workplace pathway.

So conversations like that-- if you don't reach out, it's just going to be an empty cell. And we're trying to avoid that at all costs. We're OK with not post testing 100% of our students. We know that that's not possible. But we just want to know why we didn't. And we've got a long ways to go still. We're working on it. We're hoping for a big improvement this year in this area. Next slide, please. I'm going to try to speed up, because I know--

Jay Wright: I skipped one there, sorry.

Ryan Burke: Oh, that's right. I can't go that fast, but I'll try to speed up anyway. We also have program design as a key when we're thinking about student outcomes. So we want to make sure that our students can get into exactly what they need when they need it, the fastest possible pathway to where they want to go. So we really need to know what they want before we even start talking about classes. That's not an easy process, and we'll talk about that in a little bit. But I'll show you a little bit more about ESL pathways in a moment.

But we also offer dual enrollment for ABE and ESL. What that does is push the academic language and accelerate the academics and language development. Then same thing: CTE plus ESL enables people to get to the workplace faster. CTE and ABE through VABE, same thing. Gets them to the workplace faster. And then citizenship can partner with any of our programs.

But one of the keys is making sure that your office staff is ready to field that. That when you have interested students coming in that your office staff knows how all these programs work together. And we're going through that a little bit right now. We've got a training series in Sweetwater with our classified staff members that starts tomorrow, actually, going through some of that decision tree.

If a student says this, well OK, what do they need and what can we start offering them? And going through and mapping out the conversations to really try to emphasize dual enrollment so students can get where they need to be faster. And next slide.

ESL pathways, that's the program that we've had going on. We started it the year of the pandemic, which is a pretty crazy idea. But we had done all the pre-work. We had spent like a year setting it up.

And it got to the point where it was time to turn on the pathways and go and the pandemic hit. And we're like, should we do it? And we decided, heck yeah, we're going to do it. Let's go. So we are now in our, what's that make it, the third year that we've been doing ESL pathways. And next slide, please. Let's explain a little bit about what that is. So go ahead. You're going to have to click through these, Jay.

So teacher says, what are your goals? And if you're not specific-- keep on clicking, Jay. There's going to be several bubbles popping up. I want to learn English, I want to learn English. So these are ESL students that have enrolled in the class. All right, so next slide, please Jay.

The key question becomes, why do you want to learn English? Let's see some of the reasons the students want to learn, Jay. They need a job. College. Road signs. Driver's license. Academics. Have experience. Get my GED. Help children. My children in school. And next slide one more time, Jay. And one more time. Yeah, that one was I need a promotion.

Then the teacher's pressing again, no really, why? Let's get more details. OK. So now you'll see some of these examples popping up a little bit further, a little deeper. So you have the degree from your native country that you're not able to use here. Citizenship. How to use your email. How to communicate with service orders and bosses and all of those things.

All those things that would be a tremendous challenge to do in a second language. And some of them it's just, hey, I want to be able to navigate the grocery store, the street signs, and talk to my kid's teacher. So there's all those different possibilities. And you just need to really find out from your students why they want to be there. And maybe it's not the first surface answer they give you. Digging a little deeper often tells you a lot more of the story. So next one please, Jay.

So we ended up-- and I'm not going into the whole spiel of it, but we are on the CDE website coming up, I guess after this conference, as one of the model programs. And it's the ESL pathway program that we have here. We let students self select into one of three pathways: the community pathway, the workplace pathway, or the academics pathway.

And again, this is about student success and persistence, which is going to connect all your metrics. We recognized that students' needs were not being met in the same classroom if we didn't have context. So if all the students are at the beginning level in a classroom and that's all you know, that's fine and good and you can do a good job there. But if their goal is to go to college whereas a lot of the students' goal is really to just get up to a functioning citizenship level of English, that's two really different needs there.

And so I saw that happening for a couple of years, where the academic students were the ones that I felt like suffered the most, because in our program they made up about 20% of the student body at the beginning. And the workplace folks who self identified were closer to 30%. So about half of them were either academics or workplace and the other half were community. And when we say community, it's, like I said, street signs, schools, grocery stores, laws, all the things that you need to know as an American citizen, the basics.

So those three needs, again, are just not the same. So if a student is looking to go to college and they're in the same room as the students who are just interested in that community level of rigor, the challenge here is how do you meet both students' needs without completely turning off the community set because it's too rigorous, too hard, or without underserving the high academic contextual needs of somebody that's trying to go on to San Diego State or Southwestern College?

And again, there's that other group of workplace students. They're in there. They want to be able to learn how to read work manuals, communicate through email. They want to get a job, so they need resume writing skills. All those types of things. So there's a very specific focus.

So we did survey our students. We went through a full process. I can't get into all that. But it took us a little while to land on those three pathways. But since we've been running, it's been great. It's not perfect. We're still learning as we go, of course. But it really has helped give a real focus for our teaching staff and our office and placement staff in how we contextualize the learning to make it most relevant for students. That in turn impacts college success, persistence, and community happiness. All right, and next slide, please.

Couple last pieces. Lesson design and directed collaboration. A couple of things that we have at Sweetwater that I think are impactful. We're close to the goal on this. We want every single student, I mean, every single lesson to have a communicated learning intention, success criteria, and a formative assessment. The teams are creating them. Nobody's telling them what they need to be. The teams are using their curriculum, their competencies, their standards, and deciding what is the most important thing I'm going to teach this week, that we as a team are going to teach this week. And that is what we write the learning intentions around.

Success criteria, we're saying what would it look like if somebody could master this? What would it look like if somebody-- what's an example of somebody mastering that in the context of whatever it is? So what is the success criteria? What are we expecting for a workplace ESL student around that learning intention? And the common formative assessment, they agree to it. They create it. They write it. They decide on what it's going to be.

Again, this takes planning time. This takes investment. We make that investment. It's not an investment in the-- it's an investment in the teachers and the students. It's not one or the other. We're investing the teachers' time, and the impact is direct on the students. And it is a huge impact.

We used to have what I called independent contractors in all of our classrooms around the district, around the division. And I could walk into all the classes. They had all different priorities, all different. And it could be the same level. And that was fine and good. It was teachers doing what they could. But once that teamwork happened, the team decisions on what is important became really powerful for them as a group. And we don't tell them as admin what it should be. It's hands off. They decide what it is.

Thursday night, Data Knights. We gather in job alike teams. We look at those formative assessment results that the students took. And then they're talking in their teams again. What are we going to do on Monday? We had this number of students that got it. What are we going to do with them that's an extension that doesn't get into a new competency, a new standard? Because if you start teaching new stuff to the students that are performing at the highest level, they start racing out farther. You're stretching your gap even wider. So you can't do that.

So how do you make this meaningful? You need to give them extension work that is directly related to the competency and standard that they have mastered but a higher level of application. You might look at different ways that you can go to a higher level of DOK, having them create things that relate to that standard rather than just respond and receive them. So a lot of different things you can do.

But then we're also looking at, what did students miss on the common formative assessment? And we're bringing that into the Monday reteach for the students that are not ready for that extension, that need another touch. So we're doing that Monday mornings. And it's a great way to start the week with a refresher of what you did last week and so that we can start building this week.

Another learning piece for this year is integrating the student competency performance report and the competency performance summary into those decisions, into those planning cycles. Up to this point, we've been relying on curriculum and teachers' expertise to make those decisions. We're training this year on getting the results from CASAS into that as well, so that we're getting informed from multiple directions.

The last slide I'd like to talk about is our adults with disabilities. And this is a statewide issue. In '17-'18, 2017-2018, I point out here that the LAO, the Legislative Analyst's Office, which is a California state office, identified that we had 12.5% of students in the K-12 system with disabilities. Then the National Center for Ed Stats put it at 15% last year. And as any K-12 educator will tell you, it does seem to be increasing. Whether it's an identification thing or there's something going on beyond that, we can't speculate here. But the numbers seem to be growing.

And so I ask you, what percentage of adult ed students are working with a disability? We don't have stats like that, which is a total bummer. And you'll probably never know, because they don't self identify. They don't come with an IEP. There's no, thank goodness, scarlet letter telling you. But that doesn't mean we sit and do nothing about it. We as educators of adults need to do something about this.

The last thing we need to do, like I say here, is replicate what they've done in the K-12 system. It's so broken, and many of you came from the K-12 system and know special education teachers who are worn out and have been whittled down to a nub because all they do is fill out paperwork and grind through all that stuff. And there's always pressure, there's lawsuits, there's all that stuff.

We don't need any of that, folks. I'm not advocating for that. We don't need that. But we've got to look at providing services. Because my guess is that in our population in adult ed, we're dealing with numbers that are higher than that 12.5 to 15 percent. My guess is that a higher percentage of students that are in adult ed did not move on to college and career at the other end of high school, or came from another country and have disabilities that were never identified in their homeland. So there are all those issues going around, and I just want to advocate real quick.

So I took that time on this slide. It's not OK to do nothing. We as a group, I know there's 70 something people in here, we can be the change that we need to see in adult education and lead for this country and show people how it should be done, because that's what people need. Our students need it. Not providing supports is not OK. The student body has the needs. So we've got to start searching for services and supports and doing what we can.

At Sweetwater, we have the Disabilities and Access Resource Center. That is our DARC Center. There's a presentation on that tomorrow by Kenya Bradson, our program manager. I believe it's at 12:30. But we find ways. And it's not so dogmatic like the way it can be in the K-12 system. So that's that. Those are student supports that we're providing at Sweetwater and some of the program designs and structures that we're providing to try to impact positive student achievement.

Jay Wright: All right, well thank you very much, Ryan. I'll try to do a better job switching screens than I did the first time around. So we're looking at John Russell here next. And I just want to make sure that I get the right one here. OK, I think this is it right here.

John Russell: I always feel awkward at these things, because-- nope, that's not it.

Jay Wright: OK, hang on. It needs to show. It's not showing everything. There we go. This is the one. This is the one we're looking.

John Russell: But you're still showing-- there you go.

Jay Wright: All right. So anyway, I'll blow it up.

John Russell: But like I say, when Thoibi speaks, or Ryan, and I see Carol is on here, and anybody that's been on these panels where I'm on, I always feel like I'm bringing up the rear and that Jay just does this because he likes going to dinner with me. But I'll just assume that you'll get something out of this.

And kind of setting the stage for this, I think a lot of what Thoibi and Ryan and I think also Kathleen talk about are strategies. A lot of what I'm going to talk about is what are the ways that we know that we have something that we need to be working on? I think we've got 70 some odd people here. But when you're in the thick of a school year, I think a lot of adult eds do not spend time doing what they need to do in terms of looking at data. So if you can go to the next slide, Jay.

So the agenda for this is looking at vision versus goals in terms of what do we want to try to do and what do we want to try to achieve in terms of what we want our data to look like. And I'm going to focus on vision a little bit, just because it's my personal pet peeve. Sorry. If you can hold on one second. I promised I would time this so that I'm not going to-- I just refuse to go over.

And then what are the goals from your three year plan and your annual plan? And obviously those are going to be different, but how are you quantifying those goals? And then I'm going to make a pitch for a consortium data document as a repository for all the data that you are gathering as a consortium. And so I have this as an institution and as a consortium.

I'll back up and tell you that I am the Assistant Principal at Monrovia Community Adult School. We're a little old adult school in the San Gabriel Valley of Los Angeles County. I'm also the Program Director for the Citrus College Adult Ed Consortium and the Pasadena Area Consortium. So got two consortia that stretch from Pasadena to Pomona along the 210 corridor and are trying to help everybody in those institutions, member institutions, to be more effective. Because I know member effectiveness is something that's really important.

And Neal answered a question talking about-- and the question, and I'm sorry, I did not see who it was from, is CAEP going to be like WIOA, performance based? And that is something that could-- Neal was saying could come down the pike. And I believe that everybody would have to sit. This has been talked about since the beginning, and I've been around since AB86.

Performance is going to be a part of what's getting us funded. So with these metrics in the three year plan that we had to complete, it seems like every year the state more and more is wanting us to be accountable for what we're doing with these funds.

And then what I'm going to do is just kind of go through some of the metrics that are tracked in our document, our consortium document. The document is not entirely finished. I wish it were by today, but it's pretty close. And I'm going to give you the hyperlink to our website where you can pull it up and take a look at it and then see if there's any questions. All right, so go ahead, Neal. I'm sorry, Jay.

[laughs]

OK. So many institutions say they have a vision statement, and they do not have a vision statement. The thing with a vision statement is you should be able to close your eyes and quantify it and see, is this where our entire organization is going to go? Steve Jobs is not my nerd Jesus. I'm not a huge fan of his. But I'm a huge fan of his vision statement.

His first one was a computer for the rest of us at a time when nobody had computers. And then it became an Apple on every desktop. And that turned into an Apple in every pocket. Five words. Very clean, elegant, simple vision statement. And it's quantifiable. There are 7 billion pockets or 7 and 1/2 billion pockets on the planet. I'm trying to get an Apple into every one of them. Another one of my favorite vision statements is all men are created equal.

What typically happens in education is people rewrite a mission statement and it's a bunch of gobbledygook and it's not a vision statement. The vision is if you did your mission statement perfectly, what would be the perfect outcome? So in terms of our consortium and my school, we have vision statements around data. And actually interestingly enough, Ryan's got one of them.

We look at participation, persistence, and gains. And then for CTE, we're looking at completion and employment. But the first row is for ESL and ABE/ASE. So what we want is 100% participation, 100% persistence, and 100% gains in EFL. And so if you're keeping track at home and you've got this CAEP summary, what it's saying is that whatever your number is in M, that is equal to your number in E, which is equal to your number in C, and it's equal to your number in D.

So ideally, everybody that walks in the door gets to 12 hours of instruction, takes a pretest and a post test, and either graduates or gets an EFL gain. That is the vision. Obviously we're not going to get there. But if we don't state this is the goal, this is the vision, this is where we're headed, then we don't have an opportunity to try to get as far as we can.

And so in terms of CTE, it's 100% completion and 100% employment. So everybody that shows up at the door becomes a participant. Then they become a postsecondary achiever, which means, and that's self reported, that they've gotten their credential or certification or license. And then they go to work, which is in J.

So that is kind of the vision for our consortium. It's been shared with our TE data coordinators. It's been shared with member reps and the consortium, and they are sharing it with their teachers and stakeholders. Okie doke, next slide, Jay.

So when we-- vision. Yeah, vision. Wouldn't it be great to have 100%? But in the three year plan, the annual plan, you have SMART goals. These are not our SMART goals. They're just examples. And these are not the greatest SMART goals, but they are SMART goals because they are specific, measurable, attainable, realistic, and timely.

So individually, we're all going to increase participation two and a half percent every year. So that's what it looks like with an individual member whose participation rate is 63.5% and then they move it up to 68.5%. And then we're going to move persistence 3% per year. And people are going to increase performance 2% per year. So yeah, we're not getting to 100%. But if we don't state that that's what the vision is, then we're not going to get to the goals that we believe are more realistic. So you can go to the next one, Jay.

All right. So the importance of some kind of consortium data document. I call them annual data reviews for the Citrus College Consortium. The PAC calls it-- Pasadena Area Consortium calls it an annual data report. It provides a single locus to aggregate and analyze data. And so for the consortium, it helps us to understand the community we serve.

So having done the report, I was able to just copy and paste and put it into the three year plan. Three year plan was pretty simple, because I just copied tables and the explanation and highlights from what's in the report. And it just allowed-- I mean, the three year planning process was pretty simple in terms of building out the plan. It allows us to understand and measure are we effective? Are the members effective? And how do we look at that? It fosters deliberations among board reps with CFAD, the three year plan, the annual plan, the budget and work plan, all of those.

For stakeholders, you can use it to inform stakeholders, boards of education, consortium partners, staff, info sessions for CTE, many areas where stakeholders can find access to the information that they want. What are your pass rates? What are your placement rates? And so those are just a number of ways.

And then reports. I use it for WASC. I often just copy and paste and put it in our WASC report. Or nowadays with WASC, they just want you to hyperlink to a lot of the data that you're presenting instead of filling up a report with 150 pages of tables. So I just hyperlink to our data review and they are able to go accordingly. There's a hyperlink if you get a hold of the slide. And go to the next one.

So what do we look at? Participation. So this is just something that's across the consortium. You can see there are three different sections there that everybody recognizes: the NRS, which is literacy; the CAEP outcomes; and then the total services. Those are people that just walk through the door. Our '21-'22 data is down a little bit because our community college has not gotten me the data.

However, you see the trend is not great. And hey, Meredith, yes, we were talking WASC a year ago. So you can see that the trend has not been great from pandemic. But I will say that it did improve last year, and we're hoping that it improves again this year. So from this table, which is in our annual data review, I can get the next-- if you can go to the next slide, Jay. Hello Jay. OK, great.

And so then what I've done is show a participation rate, which is basically our CAEP total divided by the enrollee total. And we look at it.
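As a minimal worked example of the participation rate John describes, with made-up counts rather than any member's actual numbers:

```python
# Hypothetical counts: participation rate = CAEP total / enrollee total.
caep_participants = 1_250   # made-up CAEP total (students reaching participant status)
total_enrollees = 1_900     # made-up enrollee total (everyone who walked in the door)

participation_rate = caep_participants / total_enrollees
print(f"Participation rate: {participation_rate:.1%}")  # prints: Participation rate: 65.8%
```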

Jay Wright: Hey John?

John Russell: Yeah?

Jay Wright: I think this was for the previous slide, but Susan asked what's the meaning of the asterisk. I think that was on the previous slide, perhaps.

John Russell: OK, go back. I don't know.

Jay Wright: Sorry. I'm running rampant. Sorry. I didn't mean to do that.

John Russell: That's OK.

Jay Wright: But this one there's a few asterisks next to a couple schools.

John Russell: So the asterisk is there because, in another part of the report, the total services unduplicated number has a non-CAEP program in there. And so what that asterisk is saying is that the number that's in total services unduplicated is minus community classroom, or people that are not in CAEP programs. So I didn't include the asterisk when I put the chart in there. You can go to the next one. I'm going to try to pull through this as quickly as I can, because I only have two minutes. Go, next.

So looking at persistence again. I'm looking at this from the CAEP summary. You can look at different-- the CAEP summary is looking at ESL, ABE, CTE. We look at it over four years. And there you see your persistence rate's not quite 100%. But that's where we want to get to. And then if you go to the next one, Jay, you can see-- if you go to the next one. Too far. Back, back, back. There you go. No. Back. There you go.

And so you can see the total persistence. We had a couple of members who just fell off the map during the pandemic. So you can see those zeros that are in that table. And then we have our charts that show persistence by member. And then if you can go to the next one. So we've got participation, persistence, and now performance.

This kind of looks similar to what the CASAS data portal has. However, I change it around, because I think English as a second language should be the first six levels and then you should get into ABE. And then what happens is you can see, based on the state average and then based on the goals, how people did, how institutions did. And where they did well, they're in black. And where they didn't, they are in red. And so you can see it compared to the state average and compared to the state goal. If you can go to the next one.

Then we look at individuals. And so you can see individuals' performance over four years. I get this from table 4 in our CAEP or NRS numbers. So if you go back to the other one real quick, Jay. One of the things that you can see, this is '21, '22. That data is not out yet. So what I did-- the other problem with that data-- so I'm over. Damn it. So I'm going to go one more minute. And just--

Kathleen Porter: John, take your time.

John Russell: OK. I don't want to leave you with no time, Kathleen.

Kathleen Porter: Don't worry. Don't worry. Take our time.

John Russell: One of the problems with the CASAS data portal is that if you're not a title II school, guess what? You're not in there. So I built this by hand in Excel. So it gives me these numbers and then I copy and paste it into Word. So you can go next. Next slide, please.

And then these are individuals' performance, again. You can see that while Azusa's persistence was not as good as Monrovia's, their performance was better for the '21-'22 year. And then I put those charts into the annual data review. And then the next one.

So this is one way. And Jay has talked about this, and I don't know if Neal wants to weigh in. But this is one of the ways that we're sort of measuring effectiveness internally. These are going to get a little whacked out because 2021 was a pandemic year and '21-'22 is coming out of the pandemic year. But what we do is take the CAEP funds, we take the hours of instruction, which come from TE, and then we report that, which we should have reported by September 30.

It needs to be certified by December 1. We take the hours and the CAEP students and we divide our CAEP funds by the instructional hours and by the number of students. And it gives us this metric of CAEP dollars per instructional hour. You can see 2021 was not great. Although we were pretty good.

For other folks, I think one of the things is that doing this constantly helps us to understand that there are data problems. I mean, it's one thing to have everybody on the same page and say, hey, we're supposed to be doing this, and we're not. The problem is that not only do you want to use the suggestions that Thoibi has, that Kathleen's going to have, that Ryan has, but you also want to make sure that your coordinators, your TOPSpro coordinators, are getting the correct data.

There's no way, if you look at Azusa in 2021, that their hours are right. 20,204 hours with 408 students. That's not right. So somebody screwed something up. But we can't go in and change the hours, because the state said, hey, you're supposed to report what's in TOPSpro. You can't just make it up and go on top of TOPSpro. Anyway, you can see things are better for '21-'22. But it's something that we look at. We go, hey, why are you at $94.23?

And so, Jenny, just to let you know, the formula is just: I'm taking the CAEP program funds and dividing them by the instructional hours. So the second column is divided by the third column, and that gives you the CAEP dollars per instructional hour. And then to get the CAEP dollars per student, I'm taking the program funds in the second column and dividing them by the number of students.
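Here is a minimal sketch of those two cost metrics, using made-up figures rather than the consortium's actual funds, hours, or counts:

```python
# Hypothetical figures: CAEP dollars per instructional hour and per student.
caep_program_funds = 1_200_000.00   # made-up CAEP allocation (the "second column")
instructional_hours = 85_000        # made-up reported hours from TE (the "third column")
caep_students = 950                 # made-up CAEP student count

dollars_per_hour = caep_program_funds / instructional_hours
dollars_per_student = caep_program_funds / caep_students

print(f"CAEP $ per instructional hour: {dollars_per_hour:,.2f}")   # 14.12
print(f"CAEP $ per student: {dollars_per_student:,.2f}")           # 1,263.16
```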

So we look at this as a consortium and go, dude, seven grand per student, really? What's going on there? $94 an hour? And Claremont's not been somebody that's been a problem in the past. It's just that the pandemic really affected them. Anyway, that's something that's really-- and the community college, they don't get any money.

[laughs]

So we're one of seven consortia in the state where the community college gets nothing. And we like it that way. So we don't really look at their data as closely as we should, because they don't get any funding. Anyway, I want to leave some time for Kathleen. I'm sorry. I go over every freaking time. I'm sorry, Kathleen.

Jay Wright: Thank you very much there, John. Let me go ahead and switch screens. Third time's going to be a charm here, hopefully. OK, does everybody see Kathleen's presentation?

Kathleen Porter: We can see it, Jay. Yup.

Jay Wright: All right. Third time's a charm. All right, great. Let me blow it up. Take it away, Kathleen.

Kathleen Porter: Thank you, Jay. And thank you to all my other presenters. I feel really lucky to have learned so much this afternoon, and I hope I don't let you down.

So I definitely don't feel like I'm a data expert. I think I'm just somebody who's been practicing at this for a long time. And I'm probably struggling with some of the same obstacles that you are.

So if you can click to the next slide, I'm going to focus this afternoon really on four things. I'm going to talk about my agency's journey toward creating what I would call a data driven culture. I'm going to talk about the use of a data dialogue protocol. I'm going to tell you a little bit about the protocol itself that we used and then how we're kind of taking things and carrying things forward. So you can click to the next slide.

OK. Either one. So I just want you to look at these little graphics for a minute. And you can put in the chat anything that they remind you of. So we've got two on this page. We're looking at somebody looking at an object, and it looks like a stick; somebody else is looking at a nine. There's a guy on an island who is excited because he sees a boat. There's a guy in a boat who's excited because he's seeing land.

And Jay, if you want to go to the next four. OK, so then we see one guy is seeing four logs. Another guy is seeing three. And then we have this graphic where some people look at it and see a young girl and some see an old man. So thank you, Beth. Yeah, perspective matters. Thank you, Thoibi. Exactly. So sometimes we're looking at the same set of data, but because of our perceptions, we're seeing things differently. And so Jay, if you can go to the next slide. And just keep clicking. We'll go ahead and put the four learnings up.

So we always start with this when we have our data dialogue conversations. First of all, that we know that there are many ways to see the same information. And some of that is perspective. Some of it is just what draws your attention. Some of it might even be how you want to confirm beliefs that you have. So you look for particular things in data.

So we know there's no one right way to see data. Like with the old man and the young girl, you can choose to see it in multiple ways. And we know that the language that we use can help build understanding. So other people who are looking at the same data sets have the opportunity to see it your way after you explain your thinking to them. So if you want to click to the next slide.

This is one of my favorite quotes of all time. If you want to change the culture, you have to change the conversation. So as we start to think about wanting to create a data driven culture, we want to change what people are talking about. How are they expressing what they're seeing? How are they sharing? How are they making meaning of things that they're seeing? And what people are talking about actually starts to become the culture in your organization. And so next slide, Jay. That was backwards I think. Yep, here we go.

And we know that data points are discussion points. And so in our organization, we've been doing this now for, I'm sad to say, about 10 years. Really intentionally using data dialogues to help drive our culture and to focus it on this data-driven concept. So if you want to click to the next slide.

I just want to talk for a minute about how we implemented this and what problems we were trying to solve, so you know why we adopted this protocol. First of all, we had a lot of turnover in staff, and we lost a really key staff person who we used to lovingly call our data queen. And so when she retired, we sort of noticed that data hadn't really been a part of our whole culture.

It resided in one corner. It was one person's responsibility. And so when she retired, we really felt a need to build our capacity, to make sure that different staff members were seeing and learning and taking responsibility for this, so that it wasn't just our data queen who had responsibility for data.

So the problems we were trying to solve, we really wanted to improve our student outcomes. We were looking at our data and we were thinking we could do better than this. And so what was driving us was definitely student outcomes. And so when I first started the data dialogue, we have data dialogue luncheons, but I really love the Data Knights.

We invited our whole staff to attend. Not everyone attended; I just let it be optional. So I invited absolutely everyone, and we met at lunchtime, about 10 times the first year, from 12:00 to 2:00, just about once a month. And we organized the meetings by starting on time and ending on time.

We brought lunch in, so nobody had to worry about that. And then we paid people. We paid people for their time, and we also paid for the lunches. It was a pretty nominal cost, but it really started to shift our culture and our understanding of our student data. If you can click, Jay, to the next slide.

So I'm actually going to drop this little chart in the chat, so you'll have a chance to take a look at what data we started looking at. We actually created a calendar for the year and said, OK, we want to look at these data points this time of year. And then we have other people that might want to look at particular things more frequently.

But when we started implementing this data dialogue protocol, one of the things that you close with in the protocol, you start to think about, well, what other data do I need to see in order to confirm my beliefs or to confirm my understandings or to provide more insight into this?

So sometimes we'll let these data dialogue meetings that we have determine what we see next time. So we have an idea of what we want to see, but we also will let the actual conversation and the protocol determine what we want to see. So if you go to the next slide, Jay.

We started looking at our results. In the first year, just looking at our quantitative results, we doubled our payment points in EL Civics. Our ASE payment points increased by more than 30%. We increased our number of paired scores. And although we didn't increase the number of students enrolled, we had better outcomes with the students we were serving. We saw more persistence with students. We saw better outcomes.

And then the qualitative results were, I think, really about the culture pieces: having our teams work together, not being afraid to share data publicly. Because that can sometimes be scary. It's just like with our PLCs. You're feeling like, oh, if I share my outcome data, maybe I'm not doing as well as somebody else, and is there something wrong with me? And so that whole sense of teamwork really came through.

And even when people sometimes jump to the conclusion that, hey, there's something wrong with me, I'm not doing something well, other people can see data in a way that shows you that, oh, maybe there's more to the picture. There are more variables that you need to look at. So it really helped our teaching staff and our classified staff collaborate and become more of a team. So if you want to go to the next slide.

I think we talked about this: what to consider. How to start with trust so that our staff members are more comfortable sharing data publicly. Having the whole team look at data, rather than just looking at your own data, really helped. That was something that worked. And then, like I said, dedicating time. When I invited people to come to the meetings, I invited them to commit to the whole year.

So they didn't have the option of saying, oh, I can come, but I'm not going to come next month. You committed for the year. And so that's kind of how we approach that. And Neal, yeah, I'm with you. Quality is everyone's job. That's what it became is this shared culture of helping students succeed and improving our structures, improving our instruction, and improving our culture so that we could really support students.

So what didn't work? What we've learned over time is to identify smaller chunks of data to examine in meetings, because when you follow this protocol, it's pretty detailed, and you want people to have time to really digest the data. And the time constraints, of course. You want to size it to the time that you have. We usually do 90 minutes, and you can cover a lot of ground in 90 minutes.

The other thing that didn't work for us was deviating from the protocol. There were a couple of years, maybe a couple of meetings where we kind of went away from the protocol. And we just had conversations. And we went back to using the formal protocol. The protocol for us really worked well.

So if you want to click to the next slide, I'm going to talk a little bit about what that protocol is. And I'm also going to drop this link in the chat for you. This is from School Reform Initiatives; this is their data driven dialogue protocol. It's something that our whole district started using about 10 years ago. We had trainers come out and train our district-wide staff on looking at data with this particular tool. We were also using it with our unions in interest-based problem solving. So again, it's always about coming to shared understandings before you start to solve problems.

So in this particular protocol, you can see that you start out with predictions. If I told you we were going to look at enrollment data over the past three years, what do you predict we're going to see? Are you going to see enrollment increase? Are you going to see enrollments decrease?

We then actually distribute the data. We give people a chance to look at the data. They can ask clarifying questions on how the data is presented at that point. But once we know what we're looking at, then they go visual. And going visual is where people spend the most time. When they go visual, they're marking up the data. They're highlighting data. Maybe they're making their own charts. But that's when people start to make meaning of the data. They can do that in a small group. They can do it with one or two. They can do it on their own. Whatever they want to do.

And then we get into the next phase, which is observations. Observations are where people just observe the facts. This goes back to that piece that people can see the same data in multiple ways, so we really spend time focusing on what you can observe. The sentence starters here are "I can count that..." and "I observe that..."

Really, really the facts only. You're not saying because. You're not saying, oh, well, this might have been because of COVID or this might have been because we didn't offer that class. You're not explaining anything. You're just looking for the data to tell you what it says.

And then phase four, finally, is when you get to the inferences. And so when you get to the inferences, you start thinking about, oh, additional data that would help my understanding of this might be x. So I'd like to know who was teaching in our program that year. Or I'd like to know what was going on. Did we change how we advertised a course? Whatever it might be. The inferences are what you do at the very end.

And then finally, you get into implications for teaching and learning. And those are your actionable next steps. OK, so who needs to know about this data? How are we taking this to a larger audience? How are we making meaning of this as a whole team? So Jay, do you want to click for me?

This is just a high level overview of how we spend a 90 minute period. And I mean it. We follow the protocol. This is how we do it. And finally, if you want to go to my next slide, Jay.

So we have continued, as I said. We've had some little ups and downs in how we have adhered to the protocol. But we're back doing it now. We do it every month. We have a schedule. We also took this protocol to our consortium. So we're doing quarterly data dialogues with our consortium partners, which has really been helpful. It's starting to move our whole consortium in a different way.

Some of the data that we look at has included our equity data. So can it inform our equity practices? Can this data inform what we need to do for professional development with staff? All of those different kinds of pieces have started to come forward. And I can end here, Jay. Then we can open it up to questions.

Jay Wright: Thank you very much. I guess that was our fourth and final panelist. So I'll just sort of open up to everybody, whether it's a question for Kathleen, John, Ryan, Thoibi, or myself, or anybody else, for that matter. Anybody with any questions?

Kathleen Porter: Oh, I see Janae asked, do you use the same protocol for the consortium meetings? And the answer is yes. And we pull consortium-level data that we can disaggregate to show different member districts.

Jay Wright: All right. Thank you. The rest of the chat seems to be warm thank-yous. Thank you. So before I turn it over, I just want to extend my thanks to all four of you. I kind of feel like all four of you hit it out of the park and then some. We're talking about goals and raising that bar, and I kind of feel like all four of you did that here today. We're talking about implications and vision statements and definitely raising it to a better level each time. So thank you all for that. And I'll turn it over to Marjorie for any closing remarks.

Marjorie: Cool. Thanks, Jay, and thanks, everyone, for presenting. There is a lot of valuable information there. I just want to let you know that the presentation may be posted depending on evaluations. So I'm going to put the evaluation link in the chat. Please fill that out. If the evaluations are really good, then this could be remediated and posted on the site. So please fill that out.

Take care, everybody. I think that actually ends today's day one of the CAEP Summit. So thanks, everyone, for coming. I will leave the room open a little bit just in case anyone wants to grab that link for the evaluation. I'll post it again. I know there's some chats coming through, so I'll wait a little bit. But other than that, thank you, everybody, and we'll see you tomorrow morning.

Jay Wright: Thank you.

Marjorie: There are some questions in the chat also. Where can we find the PowerPoint presentations? I've been looking through the site, because that question has come up a few times. I know that there are some posted if the presenters have submitted their presentations, but I will ask the CAEP team and find out. OK.

I will go ahead and post that link again. I will give you a few more moments to click on that. There are a lot of thank-yous in the chat there. All right. Take care, everyone. Have a great day.

Jay Wright: Thank you so much, everybody.