Jay Wright: OK. Thank you very much, Veronica. I did screen share. I guess, for sanity's sake, I'll make sure that everybody sees the screen in the manner in which they would expect to see it. That is, we've got the slide with the list of four presenters. Is that what everybody sees at the moment? Yes or no?

Thoibi Rublaitus: Yes.

Jay Wright: OK, great. Just making sure. I've had a history of messing up screen sharing before, so I'm making sure it's not messed up here.

So we have our all-star panel. We've been joking we've done this quite a bit now, at least five times. I think I've done it seven times, although maybe with a slightly different lineup each time. But Thoibi, Carol, and John have been the most frequent fliers, I think, on these NRS performance goal panels. They've all appeared multiple times.

So what we'll do is I'm going to start off with about five minutes of overview at this point, where another part of our joke is we're thinking everybody's probably had a chance to do this one before, because we did it at last year's CAEP Summit. We've done it at a couple of COABEs. We've done it just for kicks on the CASAS website. We've done it in several different ways with the same bill. But here we go.

Just out of curiosity, have you-- any of you here willing to admit that you're here for a second showing? You've seen Carol, and you've seen Thoibi. You've seen this presentation from start to finish, but you just couldn't resist coming on for a second time? Just wondering, because we have done this a lot.

In any case, this will be the order. But I guess, I'll start-- Oh, yes. Frequent flyer. OK. Somebody's sticking up-- Thank you, Dana. So I'll keep going here. And I'm babbling about nonsense and realizing I better move on.

So anyway, here's what we'll talk about. I'll do an overview. We'll talk about some of the NRS issues. I think going panelist by panelist is probably better than following the agenda, though.

I'll just say I will do an overview of that whole NRS performance goal concept for those of you who have never heard this. There is a minimum amount of knowledge that I really do think helps in understanding what the panelists are talking through. So I will do five or 10 minutes of goal setting 101 to get started.

And then Carol will start off. Her specialty will be engaging staff in this process. And then Thoibi will pick up and talk about different ways of engaging students. And then John Russell will bat cleanup. And he'll basically work more from an external point of view, that is, how to look at this in terms of collaborating with your CAEP consortium, collaborating with your WIOA partners, and so on.

So I'll stop talking, and I'll just say-- Yeah, it was off-Broadway right? Yeah. Now we're on Broadway. We've got the gigantic marquee that you all were able to see via vFairs. So, much bigger marquee this time than what we get for other conferences for sure. OK. And I'm trying to keep myself from talking, so I'm just going to click and move to my first slide, here, to plug myself in so I, mercifully, get moving.

So the whole idea here is called NRS local performance goals. This is a topic that yahoos like myself have been talking about for 15 years now. The feds have been imploring all states to do this for about that same amount of time.

That is, we're looking at NRS Table 4. We're not going to get into the gory details of that right now. If you don't know all the gory details, you should be fine. But Table 4 is the federal table that looks at the 12 federal levels-- six for ABE, six for ESL.

And it basically looks at measurable skill gains. With WIOA there are multiple ways of dealing with measurable skill gains now, but historically, that relates to making gains between pre- and post-test. What we've been talking about for years is looking at these levels, identifying your strengths, identifying your weak spots, identifying those areas of need, and looking at a game plan to try to improve in the areas that represent areas in need of improvement.

So I'll just say, is it just Chris that's having the screen problem or is it everybody?

Veronica: I believe it's just Chris.

Jay Wright: OK. Just making sure. All right. So I'll move on then, Chris. Sorry. It's just you. I'm not sure. I think maybe you need to log back in. Yeah. It is a black hole anyway, though, whether it's our screen or something else.

So anyway, you can use the NRS tables and TE to identify strengths and needs. You can also use the CASAS data portal. When we look at these examples, we'll use the data portal. We do feel like it's a little more user friendly. You can look at multiple years, multiple agencies, on one screen rather than spreading it all out on the kitchen table, so to speak.

So this slide-- again, we're not going to spend a lot of time on data portal 101 here either. But here's the link to the CASAS data portal with an arrow showing here's the areas that we're looking at. These are the links that you would click to get into the areas we're showcasing here this afternoon. And so what we say overall is we suggest using the data--

[garbled voice]

Veronica: Jay? Jay?

Jay Wright: Yeah.

Veronica: OK. Go ahead. Sorry.

Jay Wright: I unwittingly got muted, I think. Anyway, so what we say is use the data portal when you're troubleshooting. Somehow-- I don't know if I can-- OK. I can't go back. There we go. Use the data portal when you're troubleshooting, and then use the NRS tables and TE so you can drill down once it comes to figuring out ways to make improvements and so on.

So the process-- here we go-- is, again, we're looking at the data portal. We're reviewing our NRS performance across all 12 levels. So here's what the data portal looks like from a recent year. You can see it's got the goal for the current year. It's got the goal and average from the previous year.

So what we're doing is we're comparing our own agency's performance to the statewide average and the statewide goals from the current and previous year. We can go blow by blow and we can see, in some cases, the percentage is better than the average. In other cases, it's worse than the average. So for starters, we're really just looking and seeing which areas are we looking good, which areas are we looking not so good.

The one quick data portal 101 point I will bring up is that the numbers in parentheses at the bottom are the number of students represented. So we're, obviously, looking at a large agency here. So, for example, we have 147 ABE beginning literacy students. We have 71 ASE high students, and so on. Again, the numbers in parentheses are just showing the number of students at each level.

So, again, we're looking at our strengths and needs here using the CASAS data portal. So just getting into a brass tacks sort of example, we're blowing it up so you can see it a little bit better. And we're skipping a few steps. But in any case, we're showing you one good area and one area in need of improvement.

The green arrow to the right represents an area in which we're doing well. You can see we're at 63%. The average is at 60%. So that's an area where we're looking pretty good.

On the other hand, ASE low with the red arrow-- we're well below the statewide average and well below the goal, so that suggests an area where we might need to improve. So, again, look at how far below the average we are. Look at the total number of students in that particular area. You might need to use some local factors, as well. But look at the degree of need and how far below the average we are when we're looking at targets.

So here is-- once we've identified our areas of strength and areas of weakness, another good thing that we like to look at is persistence. So when we do the longer presentations here, we'll look and compare performance to persistence. Persistence, in terms of NRS performance, means, of those who qualify for federal reporting, what percentage of students stick around long enough or "persist" long enough to at least complete a pre-test and a post-test. There are different ways of measuring this based on testing, based on attendance hours, based on lots of other things. But just know when we're looking at federal reporting, in WIOA Title II land anyway, persisters are those who complete a pre-test and a post-test.
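The persistence rate described above is a simple ratio. Here's a minimal sketch of the calculation; the record fields and the sample roster are hypothetical, not taken from any CASAS or TOPSpro export.

```python
# Sketch of the pre/post persistence rate: of students who qualify for
# federal reporting, the share who completed both a pre-test and a
# post-test ("persisters"). Field names here are invented for illustration.

def persistence_rate(students):
    qualified = [s for s in students if s["qualifies_federal"]]
    if not qualified:
        return 0.0
    persisters = [
        s for s in qualified if s["has_pretest"] and s["has_posttest"]
    ]
    return len(persisters) / len(qualified)

# Hypothetical roster: three students qualify for federal reporting,
# two of them completed both tests.
roster = [
    {"qualifies_federal": True, "has_pretest": True, "has_posttest": True},
    {"qualifies_federal": True, "has_pretest": True, "has_posttest": False},
    {"qualifies_federal": False, "has_pretest": True, "has_posttest": True},
    {"qualifies_federal": True, "has_pretest": True, "has_posttest": True},
]
print(persistence_rate(roster))  # two of three qualified students persisted
```

In practice the data portal and TOPSpro Enterprise reports do this for you; the sketch is just to make the denominator explicit-- only federally reportable students count.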

So what we can also do with the data portal is look at our persistence rate-- so that's what this slide is looking at. So we've already noted these two areas-- ABE intermediate high and ASE low-- are both areas in which we need to improve in terms of performance. They're both bad areas for us. So we're looking at these two bad areas and comparing it to the persistence rate.

So over here to the left, you might say we're sort of seeing what we're expecting to see-- that is, a lousy persistence rate. I'm saying that's what we're expecting to see, because we've already determined that our performance rate is not that good in ABE intermediate high. So this is confirming our suspicion where low persistence seems to be the main thing that's contributing to low performance here.

On the other hand, we're also underperforming in ASE low, but you can see our persistence rate there is actually pretty good. It's not great, but it's definitely not bad at all. It's well above the statewide average.

So the point here is, sometimes we're performing below average, but persistence isn't necessarily the reason. So on the left hand, red arrow, that's an area in which persistence probably is the big explanation for our low performance. Where, over here to the right with the green arrow, there's some other factor, potentially, that seems to be contributing to that low performance.

And then, again, what we've talked about is applying strategies at the agency and the student level to improve our pre/post persistence rate, and strategies at the student and agency level to improve our overall areas of performance. Kind of oversimplifying here-- what we've looked at in previous sessions-- again, we're rolling an hour and a half into five minutes-- is that, at the end of the day, in all these different areas, persistence is either good or bad. Similarly, in all these different areas, performance can be good or bad.

So, again, at the end of the day, that allows us to sweep it nice, neat, and over-tidily into four areas on this quadrant-- that is, persistence high or low, performance high or low. Generally speaking, we're looking at the left-hand side, where performance and persistence are both low. And then other times, our performance is low, but our persistence is actually pretty good. And then we get into lots of suggestions, based on these areas, with strategies to try to move our way forward based on our persistence rate in tandem with our performance rate.
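The four-quadrant sort above can be sketched as a tiny classifier. All of the level names, rates, and averages below are made up for illustration; the point is just that comparing each level's two rates against the statewide averages puts it in one of four boxes.

```python
# Hedged sketch of the quadrant sort: classify each NRS level by whether
# its performance and persistence are above or below the statewide average.
# All numbers are hypothetical, not real statewide figures.

def quadrant(performance, persistence, perf_avg, persist_avg):
    perf = "high" if performance >= perf_avg else "low"
    pers = "high" if persistence >= persist_avg else "low"
    return (perf, pers)

# Invented example levels and statewide averages.
levels = {
    "ABE Intermediate High": {"performance": 0.42, "persistence": 0.51},
    "ASE Low":               {"performance": 0.45, "persistence": 0.74},
    "ESL Beginning Low":     {"performance": 0.63, "persistence": 0.72},
}
PERF_AVG, PERSIST_AVG = 0.60, 0.70

for name, d in levels.items():
    perf, pers = quadrant(
        d["performance"], d["persistence"], PERF_AVG, PERSIST_AVG
    )
    print(f"{name}: performance {perf}, persistence {pers}")
```

With these invented numbers, ABE Intermediate High lands in the low/low box (persistence likely explains the low performance), while ASE Low lands in low-performance/high-persistence, which is the case where some other factor is at work-- matching the two arrows discussed above.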

OK. So that's my quick little overview just to look at it from an NRS point of view. Before I move on to Carol, I'll give a little plug that we'll do this review very quickly again and apply it to CAEP data collection in a session we're doing Thursday, I think at 8:30. I believe Carol and Thoibi are listed. Carol and Thoibi are welcome to join me on Thursday morning. But by definition, they're not required to attend.

The one I'm doing Thursday will be the deeper dive where we'll look into all these and spread it out. And the emphasis will be using those CAEP reports, not necessarily NRS, given that it's the CAEP Summit. So I'll push myself off the cliff so I stop talking and yield the floor to Carol here. You better jump in, Carol, or I'll think of something else to talk about, and we'll never get out of this mess.

Carol Hirota: Thank you, Jay, very much. Thanks.

Jay Wright: I will slide click.

Carol Hirota: OK. Thank you. So on this slide and during this portion that I'm talking about, it's really a focus on data and also the accuracy. And it's also the who. It's the process and the procedures that are part of this system, but it's about the people who make this system work. So it's a lot of kudos to my colleagues out in Stockton, who have been working on this year after year and continue to do this.

The first thing, obviously, is your customer service-- ensuring that your counter, phone, and email services are responding to your students continuously. And how you register them-- you want to make it really friendly, whether it's online or they're walking in and doing their registration.

And how you're entering everything on your TOPS forms or in your management information system-- that goes over to TOPSpro Enterprise. Your data integrity-- that report is critical. It's having everybody's eyes on data integrity, on the accuracy of what you're collecting.

And if you're not collecting something really well, go figure out why and how you can do it better. And every time-- it's kaizen, continuous improvement. What can I do better?

So you want to also support your teachers and, mainly, your students to do CASAS pre- and post-testing. And so having makeup tests when students aren't there is really important. And I know our office staff coordinate that with the teachers and the students.

Supporting your students in registering for their exams online, whether it's the GED or HiSET. And also-- I learned this from another consortium-- I believe Iris is on this call-- she observes students who are testing and are going to be taking the GED test. And if they do not have the skills, she lets the teacher know: hey, you're really going to have to work on these computer skills for them to take the essay part.

So that kind of collaboration is really critical and, of course, your accuracy on your attendance, your TOPS forms, and your data entry, sharing your quarterly and annual data results, and exchanging feedback back and forth, really looking at that data together. You can't be afraid to look at the data and say, oh, I don't want all the teachers looking at each other's data. Everybody looks at the data.

You want to also support your professional learning. There's CASAS Institute. There's CCAE. There are CALPRO events and OTAN. Attending all of those is really, really good.

The critical thing is communication-- your staff meetings with your office staff and your faculty meetings, which include your teachers and counselors. And, in Stockton, their teachers have a professional learning community. They have that built into their collaboration time, so it works really well. So they are meeting weekly and, of course, daily communication.

So, on the next slide, it's really important for the data team people, the office people, your assessment people, your data people to work very closely with your teachers and counselors. You need to share those data results in TOPSpro Enterprise. There are weekly, monthly, quarterly, and annual monitoring reports that you can share with your teachers-- and, of course, use the professional learning community.

If your teachers are using a PLC process, you can embed a data team process within that system so they are constantly looking at their data and what they're doing to help students learn. The whole idea is student learning-- not just what teachers are teaching, but how are students learning. And they rely on all of this data that they receive from the data technicians.

Next slide. So part of this system is setting this up at the very, very beginning. And I use this. This is from 2017, but this has been going on for several years.

This is the first day you meet with your teachers. You start looking at your summative data, and you ask things like: What statements can you make about this data? What is communicated about this? What are some questions you have about the data? What can we celebrate, and what do we need to improve for the following year?

Let them ask these questions, and ask more questions. And exchange information, so that they're looking at the real data together. Then, afterwards, make sure you give them time to set up their data teams. In Stockton, they set up their data teams by ESL, CTE, Adult Basic Education, Adult Secondary Ed, and high school equivalency. Sometimes they all meet together, but meeting separately helps them look at their data and their data goals. And I'm going to share some of that with you in a few minutes.

So on the next slide, we're going to be looking at some of the WIOA data that we look at. In Data Table 4, you can easily put in all of the dates that you want so you can compare from year to year. So, on this one, you can see Stockton-- it's up to 2019-20, but we've done it for like 10 years.

So you know exactly which areas are doing really well, and you compare it with your state goal and the state average. Because we want to know, not only did we reach the state goal, but how do we compare with other schools in California. So that's really critical. And I'll show you a local report that we use later on.

The next slide is the Persister Report, critical in making sure that your students are there long enough to be able to demonstrate some learning. So what you want to do is take a look-- and we always use 70% as the minimum for our students-- and, as you can see, our beginning literacy areas-- and that's for ABE-- are some of our lowest. So what are we doing with those 7, 4, 17-- those first students? Critical.

And then there were 45 students that were at 60-- what-- 69% on beginning basic and ABE. What did we do with those 45 students? That's a lot of students to lose in an Adult Basic Ed program.

So taking a look at all of that-- and, Thoibi, I have to thank you for your session that you had, because I think you've got some-- had some really good ideas on how we can take a look at some of that data and use it efficiently and more effectively so we can make improvements.

So the next report is a local report. It goes up to 2018-19, but what this report does is track WIOA. It's a WIOA chart.

So for Stockton School for Adults, it's their percentage in each of the program areas or the entering educational functioning levels, what the performance goal was for the state, and what was the performance against all enrollees in California. So what we want Stockton to do is make sure that they've hit all the performance goals and that they are above the California performance against all enrollees.

That is really important, because we want to make sure that we are doing our very best at this school to provide services for students. And it's critical. So using this is really important for our teachers to take a look at.

OK, next slide. That's a local one. The other one-- and it's actually a longer page-- but for CAEP we use this as a program worksheet with your manager summary-- your annual manager summary-- or you can use it for your quarterly data. You can take a look at it.

But you can see there's Column B, Column C, your totals. Anyway, it's a template, and you put your numbers in there. It will give you your persistence rate. It will also give you your program persistence. And if you look at the entire worksheet, you can also find evidence of student learning-- how many student learning gains there are.

But I believe this worksheet got cut off. It was created from one of Jay's workshops. So we put it in this template, and I know Delta C-- our Adult Education Alliance-- that consortium has been using this. We've also been using this worksheet in our targeted technical assistance with our coaches when we're evaluating some of the consortia we're working with.

So, anyway, it's good. So you look at persistence and also learning gains.

The next slide goes into a little bit of what the teachers do. I talked about professional learning communities and the teachers setting up their data systems. You'll see some of the data systems in the next few slides, but quarterly there's a data team process going on. They put down their section number, and then the level-- what data they're going to be reviewing.

They analyze this data quarterly. They put down their data, their SMART goal, and their management strategies. And in the fourth week, they do a reflection and comments so they can do better in the next quarter's round of the data team process. But the whole idea is that they're checking on student learning, and they want to make sure that they are paying attention to what students are doing. So this is a formalized data team process embedded in the professional learning community.

I asked if Stockton was able to follow up on this during the pandemic, and it's been a little harder to do this exact same process. However, the teachers have all set up their data systems, which you will be able to see in the next few slides.

The next one-- here's another one where a teacher takes a look-- flip back, please. They take a look at paired scores, learning gains, their annual goals, how many students, and their SMART goals. They analyze it. They list the negative variables affecting the outcomes and the positive interventions affecting the outcomes.

Next slide. These are all the discussions that they have at their professional learning communities. And future interventions-- you'll see like, conferences, flexibility, emailing, communication, modifying expectations, following up. So those are really important kinds of things that we can do to have a positive impact on student learning.

The next slide is another one-- this was for high school equivalency. I wanted to share with you-- one of the things that I like is that this teacher tracked not how many students passed the GED, but how many subtests were passed. And so this was for the first quarter, wanting x amount of subtests passed-- oh, this would be by February of 2020.

So she would evaluate all of the subtests and where students had areas of need and needed more support. So she wasn't just looking at whether they passed or did not pass the GED, but at each and every single test. So that was really critical. Her data system is very, very involved. And I don't believe I was able to make a picture of that one, because it's so involved.

But the next slide gives you an example of some of the things that the teachers do. They all have their data system. The one where you see the green shaded areas, those are the names of the students and the ESL-- additional assessments for the EL civics project that they're doing-- how many students passed some of these modules-- we call them modules, although they are additional assessments. But this is another way to demonstrate learning gains in your ESL program besides just your CASAS level gains when they take the test.

The other one is data on-- you can see what level, how many gains they made in CASAS, and if they completed anything, and if they had a drop code. So teachers are expected to have a data system so they are tracking their students and tracking their learning. And this is what they use when they have those discussions in those professional learning community sessions.

The whole idea is that this is a system where teachers work very closely with the data technicians. And they, obviously, have support from their administrators-- Jeff Dundas and Bryan Wright-- who make sure that they have all of the technology and access to all of the reports they need so that they can take a look at their student learning. So it's really critical.

And they meet very closely with counselors. And-- just a little plug-- we have three counselors in Stockton working in the areas of high school diploma and high school equivalency, one in ESL, and one with transition-- making sure that students get over to the community college or employment. We have found that having credentialed counselors has really complemented the team in Stockton. So, thank you.

Jay Wright: Hey, Carol? A question came through chat wondering how you calculate graduation rate when you're doing this.

Carol Hirota: How we calculate graduation rates?

Jay Wright: Yes.

Carol Hirota: So, there are several ways. We have one chart that just lists numbers-- how many students graduate or complete their high school diploma or high school equivalency. So that number is tracked every year. The other way we do it is, if you look at Learning Gains, it will tell you how many students passed and how many did not. You can look that up in one of your reports. You can use your manager summary for that, which would be your CAEP manager summary.

Jay Wright: Thanks.

Carol Hirota: Thank you.

Jay Wright: OK. So, anything else, Carol? The next slide is Thoibi's. Anything else you wanted to present, or should we go ahead and move along to Thoibi?

Carol Hirota: Yes, please, Thoibi. Thank you.

Jay Wright: Thoibi, you are on.

Thoibi Rublaitus: Thank you. Thank you. Thank you, Jay. And thank you, Carol. And, just a little introduction before we get into deeper-- Carol has shared a lot of great work that they have done. And every time I listen to her, I learn something new.

And what she shared was mostly from a bigger perspective. I am a principal at a school that serves about 2,500 to 3,000 students in good times. And we are trying our best to get back to our previous enrollment, as well. What I'm going to share, mostly, is the student-centered approach.

So, as the slide here says, we all, as teachers and administrators, do a lot to serve our students and to provide the best possible education that they need to meet their goals. But we are serving adults. And we have all learned this in our training courses and credential courses about adult learning as opposed to K-12 learning-- adults have a different way of learning. And andragogy is what teaches us that.

Adults have a different way of learning. They have to be self-motivated. They need to know what they are doing. They need to build on their previous knowledge. They are responsible for their own growth.

They have immediate needs. And we have to serve their immediate needs, or they'll walk away. And we all know this, and we've heard this often-- that they vote with their feet. They also have problems in their life, so we have to understand their problems and focus on building on what they already have and developing the motivation that they have by coming and enrolling themselves in our courses.

So as we consider all that and work with our students, it is important-- we have seen this in our practices-- that there be constant communication with them from the get-go, to get them to understand why they are taking the tests they do. Because with CASAS testing-- I'm sure you will all agree with me-- a lot of students, the moment you say test, are already put off.

But if they know why they do the test with us, then they are more inclined to do it willingly. And so we help them be the owners of their own learning by communicating with them from the very beginning and also empowering them to track their own learning.

So, on this slide, you see those two sheets. The yellow sheet on the top is a report card that our team created a few years ago-- it's about 5, 7, 10 years ago now-- that we provide for our students. When we just hand out the report, some will look at it deeply, but not everybody pays attention to everything in detail. They want to know, what is in it for me? And when they write it down themselves, they are more involved.

So that report gives students a worksheet that says, this is your goal-- CASAS test number one, number two, number three, number four-- because we give tests every quarter. The students get to write down which test they took and what their score was. Then, when they do a post-test, they get to see, OK, I took the next test and my score went up by three points, or four points, or 15 points, whatever. They get to write it down-- they set their goals for themselves.

Then the onus is not only on the administrators and the teachers, but it is a shared ownership. And so that has helped our students understand why we take the test and how they're tracking their own test. It also helps them to come back and then say to us, oh, I took the test, but I didn't take the other test. I want to take the other test.

Because each student has this worksheet. And when the students write it down, they're comparing with each other-- or at least paying attention-- and saying, oh, she's got three tests, and I got only one test. Why is that, right? Or, she's got a score, and I haven't got one.

But that's not where we stop. I think we also have to get our students to understand-- and we try to do this on a regular basis with our EL students, particularly, and the ABSE teachers are also trying to do it in different ways-- that, hey, we are adults. We don't compare ourselves with each other.

We compare with our own work. What I did on the first test, I compare to the second test, not to my neighbor. Because adults all come in at different entry levels, they don't all take the same test. They don't all have the same beginning level. So that's another constant communication we try to have with our students, as teachers and as administrators, which helps them understand the why, what, and how.

The other thing we also try to do is with the teachers: getting the teachers to look at the data-- like Carol mentioned, the data teams, or the PLCs, that meet over the data on a regular basis. In the beginning, it was a little difficult, but over time, the teachers got to share data with each other.

And at the PLCs, during the data discussion, they look at their own class report. Our CASAS data technician provides the data report-- the performance report-- for students, as well as teachers. The teachers get their own packets.

So the second sheet on the slide, the white one in the background, is the one where the teachers do the analysis of their class report-- where they look at the kind of tests the students take, how many students took which test, and what the areas of growth were that most students had.

So when I was a teacher-- this was a while ago-- I got all our students to complete their yellow sheets. Then I collected all the yellow sheets and used them to go through and look at how many students took test number 81R, how many took 81RX, how many took 83R, et cetera. Then I tallied them and saw how many students had goals to meet for their next test.

So that helps give me a data-driven instructional design. I look at that and say, OK, 30 students took 81RX, and 50% of them did not do well on question number 18, for instance. And then we get to see that, OK, the competency in question 18 was reading maps. So then I teach more on reading maps for those students.

Similarly, all the intermediate level teachers get together and say, OK, these are the questions my students constantly have a problem with. And then they get together to create a lesson that meets those students' needs-- the majority of the students' needs. So we're doing some kind of analysis using the yellow worksheet and the white one.
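The item tally described above is essentially a frequency count across a class. Here's a rough sketch; the students, question numbers, and the 50% threshold are all invented for illustration-- they don't come from any real CASAS test form.

```python
# Hedged sketch of the teacher's item tally: count how often each question
# was missed across a class, then flag questions missed by at least half
# the students as candidates for targeted instruction. All data invented.
from collections import Counter

class_results = [
    {"student": "A", "missed_questions": [5, 18]},
    {"student": "B", "missed_questions": [18]},
    {"student": "C", "missed_questions": [7]},
    {"student": "D", "missed_questions": [18, 7]},
]

missed = Counter()
for result in class_results:
    missed.update(result["missed_questions"])

threshold = len(class_results) / 2  # missed by at least half the class
trouble_spots = [q for q, count in missed.items() if count >= threshold]
print(trouble_spots)  # questions to map back to competencies and reteach
```

The last step-- mapping each flagged question number back to its competency (e.g., reading maps) and building a lesson around it-- is the part the teachers do together in the PLC.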

Jay, the next slide, please. Thank you.

So all this is born out of the mission of our school, right-- to provide learners more educational opportunities and to help them transition. So for all that, we prepare our students. And in this whole big arc of preparing our students to be productive members of the community, there is something we have built at our school over time: a culture of change and the ethical rationale.

Let me preface this by saying that this came because I understand most of our EL students, especially, are immigrants from other countries. So a lot of them come to this new country, and they get free education, right? And the free education-- why do we get free education? You get nothing for free. And how can we get this free education, what does it mean for us, and what do we do with it, is a very big philosophical question, right?

And so, when I was a new teacher at Corona-Norco, I had just come from India. And I was shocked that our students are all getting free education, myself included, right? And then, the more we think about it, the more we realize that it's a matter of paying it forward.

So when we have to test students with CASAS, a lot of our teachers and ourselves included-- CASAS is tough. Why should we do it? Well, we have to do it. Why? Because we're getting free education.

How can we afford a free education? It's the funds that we get from the federal, and the state government, and the region, that allows us to provide this free education. And why would they give us this free education? It's because-- and students laugh at this when I share this with them.

Just two weeks ago, I was walking through classrooms and talking to them about the importance of taking their post-test, because we were having post-tests last week. And so I tell them, hey, why do you get to do the test, and why do you do the test, and how is it important? And then I share with them, if you do well, and you get a good job, you pay more taxes to the state. And that's good for the state. But then, that's not only good for the state-- that's where they laugh.

You pay more taxes. That's why they're giving you free classes. But that's not where it stops. It goes beyond that.

I share with them, students two years ago did very well, and today you get free education because those students brought in the funds to be able to teach you today. Similarly, you do well this year, and the students two years later will benefit from it. So we are paying it forward.

We're getting it for free, but for a good reason-- to improve the state, and the government, and our country together. Now this is our country, because we live here. And then, we are also helping more people to serve more people in the future. So this idea of civic responsibility is something that is important for our students to understand.

And this also plays out in ABSE classes, as well. If you tell the students, do the test, do the test, they're like, OK, why do I have to keep doing this test when my goal is to get a GED or a HiSET? But if I say that, when you do the test, you get to see your own improvement-- that helps you to grow. Even if you didn't get your GED, you already made some gains, which itself is a motivation, a pat on your back.

And then, similarly, from the teacher's perspective, our teachers can find out which areas of growth they need to focus on with you, right? Because when you do the test, each question is related to a competency. And if you did not know the answer to a question, then the teacher needs to help you on that competency. And that's why we need to do this test.

So this whole idea of changing the mindset, getting our adult students to understand why they are doing what they do and how it helps them, not just the school, brings us more performance and persistence-- or, at least, students doing the test. So that's the idea of this slide.

Also, from a teacher's perspective, there's the performance-based funding. For some time, as administrators, we were afraid to let our teachers know about all the detail that goes into it. But, as adults, if they know why we do what we do, why we have to work on the data, why we have to get outcomes so that we can get our performance funding, it helps them, as well as us.

Of course, we all know it's to help students grow and reach their goals. But, above all of this, to help them grow better, we need the funds. And to be able to provide the support they need, we need more payment points. And to get the payment points, we have to get the CASAS testing done.

And those are some cultural discussions. It's difficult, sometimes, but sharing that helps everybody understand the why. Like, Carol always shares a list of professional development books that she uses with her team. Today, I didn't see it, Carol, somehow. That sharing, and learning together, and growing together-- one of the books she has on her list is Simon Sinek's Start with Why, right?

So knowing the why helps everybody to be more participative and to have a shared vision as a school. I think, with that, my part is up. Thank you.

Jay Wright: OK. John still isn't here, so I think I'll just move along and have to wear my John hat, here. That's just the way that one is, but when he gets in, I guess I'll let him take over. But I'll just keep talking.

This is the part from the Citrus Consortium, and it focuses on looking outward, working with other stakeholders, and also looking at the geographic region that you're in. So this is from his consortium's official annual publication, OK? So this is what they're doing, by community.

A big part of this presentation relates to presenting information by zip code, where he rolls up the data, and gets these consortium and agency level averages, and compares the agencies by their NRS levels. So this first one is just looking at the NRS tables-- the six levels of ESL, six levels of ABE, ASE-- looking at the average, looking at each individual agency, and showing the agencies within the consortium, looking at the pluses and minuses there.

I'll also point out with this-- as, hopefully, he hops in here soon-- but anyway, looking at this, he also is tracking which agencies in the consortium are going down, which are going up, and, in particular, which ones are moving up and down in relation to each other. This is looking at overall enrollment, looking at the different agencies in the consortium, and looking, by percentage, at which program comprises the bulk of enrollment at each individual agency and each individual community.

So you can see ESL is almost all of the population at the college. It's most of Claremont's. In other agencies, you can see it's not quite as dominant a program. So it's looking at, by program, which ones are most important by agency and by community.

OK. Not sure how I got there. OK. I guess there was more for John to present. And, obviously, he's not here, so I guess I'll really just have to open it up for questions. He has a lot more to talk about on this. I just have the slides.

I guess I'll just open it up for questions and see if he shows up here in the next couple of minutes. If he doesn't, we just might be minus a panelist here. Sorry about that. He was going to show up about 1:50 and come in for the cleanup role.

I'll just say, any questions here for anybody on any of this? I'll try to guide us through the external part-- any questions for Thoibi, any questions for Carol, or anything for myself here. I was expecting a final panelist and didn't get it, so I'll just make it Q&A-- anything about this process that you would like to dig deeper into. Lots of ways we could go.

OK. They're saying they're having trouble with that access. I'll just say, I've put my email in, and Carol sent it. There does seem to be an issue. Maybe there's something with this Zoom. But the area where I normally upload documents when I'm in my own Zoom definitely is not working in this Zoom.

I'm guessing TAP must have disabled that feature so people can't post during the summit. I'm not quite sure what that might be, but I'm not able to post the way I normally do that when I'm doing my own Zoom. I'm assuming that's a difference between the CAEP Zoom and the Zoom that I use from CASAS.

OK? OK, great. OK, thank you. Veronica's looking into that one. All right.

So, Carol or Thoibi, Neil's question-- have you been looking at any of this? I guess I'll say, Thoibi, from a student point of view, or Carol, from a staff point of view-- whether you've looked at any of these things in terms of the new CAEP three-year planning goals and all that?

Thoibi Rublaitus: So we haven't deeply looked into the CAEP planning goals yet. But when we did our planning goals for our NRS-- the SIP plan-- we'll be using a similar process. But, again, it's the same thing about looking at data as a team, as well as picking the teachers and staff, the data team, our data technician, et cetera, to work together on looking at what we have done, where we have met our state level goals, and where we want to go next, and whether it's practical to take 5 percentage points above that as a goal or--

For our NRS SIP goal, we were thinking that when we had 35 students achieve the diploma in the year of the pandemic, the following year we could make at least 70 to 80. And that was the goal we set. It looks like it's going to be still challenging, because the pandemic is still not over. But we have been able to get at least 15% so far. And it is quarter one, so there are still reasons to feel quite optimistic that we may be able to meet our goals. So, similarly, for CAEP, we'll be doing something similar to what we did for our SIP goals. I don't know if that answers the question.

Carol Hirota: So, in regards to mandatory metrics for three-year planning for the CAEP, I think what we're going to be doing-- because we looked at this when we were doing our coaching-- is we looked at the CAEP manager summary. And on there, there are some really important things in there that document learning gains and different ways to document learning gains, as well as employment.

And then on the right-hand side, there's also services. So as we're trying to improve learning gains, we also need to make sure that we are always improving our services to support services to students.

So that one document, that one data piece, can really give a lot of feedback to your consortia, because you can look at it as a consortium or you can look at it as individual members of the consortium. And so I think that that's really critical, because in NRS and WIOA we have different ways of looking at student learning, which is-- there's, like Jay just shared, a bunch of information that's online. But with CAEP, I think we're just working through some of the new systems on how we are demonstrating learning gains, as well as learner persistence and number of students that we're serving.

But also, how are we serving our students with student support? So I think that's critical. Should this-- should we be holding each other accountable? As far as member effectiveness, I believe that the consortium members should all be holding each other accountable to make sure that we are serving as many students as we can, looking at the gaps that you're not serving in your consortium area, as well as-- are the students learning anything, and am I documenting any of that, and what kinds of support services do they need?

And I can tell you, in my experience, working with the community college has really helped us improve our support services to students as we transition them to work or to a community college. So there's just a lot of things that are going on. But I really think you need to use some data piece to give yourself some feedback as to whether you're learning something-- that your students are learning something or that you are being effective as a member, as well as a consortium.

Jay Wright: OK. Thank you for that, Carol. And it looks like John is here. So we've got time. So I'm going to go ahead and flash this information up here on the screen and let him start talking. I was thinking there might be a little bit of an interim, but there really wasn't. So--

John Russell: Yeah, I apologize for that.

Jay Wright: Jump on it, and say whatever it is you think you can come up with here. I think the best thing is just scroll through this report at this point.

John Russell: Yeah. Thank you, and I apologize, team. I am in the middle of a-- I'm chairing a WASC visiting committee, so I'm sure that everybody in this breakout session understands what that's about. It's actually, also, an opportunity for, what I used to say in the classroom, was a teachable moment.

So as a precursor, to set the table for what we're about to look at, what I would say is that this document helps me immensely when I am-- because I'm usually the self-study coordinator. I've been a chair on about six visiting committees, and using this, having this document, makes it so easy when I'm writing our self-study report.

Now this is a report, this data report, and these are the highlights from the data report that's on our website. And this template has been updated over time. And it probably still has some work that needs to be done. The fact is that we're doing it.

But it becomes-- because I use it for our school, as well as for the consortium-- and then, writing the WASC report really is very simple, because I just copy and paste from our annual data review. Anyway, with that--

So I do apologize for being late, but I'm the least entertaining and informative of the people that are on here, because Thoibi and Carol are absolute rock stars. And what they do is phenomenal. So I'm just little old Citrus College Adult Ed Consortium. We're a pretty small, little consortium, and Monrovia is a small school. But we do like what we do with our annual data review.

So the annual data review helps us understand the community we serve, so that we can look at those macro gaps that the legislation asks us to. And then, Carol was talking about consortium effectiveness. We're able to look at which members are effective, and we do look at that. And we do have members-- we have some tough conversations around that.

But we also look at it as a consortium and how're we doing as a consortium. And we use this for member funding, and it drives our three-year logic models. This annual data review drives a lot of what we do.

And then, people can use it for school board updates. We use it for our information sessions. And then, like I mentioned, it's great for WASC reports.

So, go ahead and, Jay, move on. The one thing that I will say, and everybody has seen this slide, but the one thing that I will say is-- actually, no. Sorry, go ahead, go ahead, Jay. I was going to make a point, but--

One of the places where we start is looking at how, as a consortium, we're doing against the state averages and the state goals. So this is something where I've taken the CASAS data portal, which Jay shared with you earlier, and I take it, I flip it around. I always complain about the fact that CASAS starts with the ABE and ASE and then works down to the ESL. So I flip it around to where it starts with beginning literacy.

We put the California averages-- I have another slide where we do the California goals, as well. And then the bottom section, those bottom four rows, are from an Excel spreadsheet where we compare-- just a simple formula subtracting what each member's EFL gains were compared to the state average. And where it's black, we're above. And where it's red, we're below.
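That spreadsheet comparison can be sketched like this-- the member names, NRS levels, and EFL-gain rates below are made up purely for illustration:

```python
# Hypothetical EFL-gain rates (%) by NRS level: state average vs. two members.
state_avg = {"ESL Beg Lit": 48.0, "ESL Beg Low": 52.0, "ESL Beg High": 50.0}
members = {
    "Monrovia": {"ESL Beg Lit": 51.5, "ESL Beg Low": 49.0, "ESL Beg High": 55.0},
    "Duarte":   {"ESL Beg Lit": 44.0, "ESL Beg Low": 53.0, "ESL Beg High": 47.5},
}

# The spreadsheet formula: member rate minus state average, per level.
# A positive difference renders black (above the state); negative renders red (below).
diff = {name: {lvl: round(rates[lvl] - state_avg[lvl], 1) for lvl in state_avg}
        for name, rates in members.items()}

for name, row in diff.items():
    flags = {lvl: ("above" if d >= 0 else "below") for lvl, d in row.items()}
    print(name, flags)
```

The point of the color-coding is exactly this last step: the sign of the difference tells you at a glance where a member sits relative to the state, level by level.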

And so it's funny, because Jay always says, these are the top performers in the state. Unfortunately, I mean, I'm just-- I'm doing a WASC for Torrance Adult School, and it's outrageous what their-- when you look at this for them. I created this document for them, and it's really staggering what they're doing in terms of their EFL gain and their persistence gain. And so-- but I will say that they didn't have this document, and they couldn't really quantify it where it's so readily visually telling you, OK, we're below here, and we're above here.

In their document, they're above everywhere, and in some places, astonishingly so, especially in their persistence. The only reason that they are not at 100% persistence is because they don't pre and post their CTE classes. That's it. Their persistence for-- and I don't mean to bring my WASC in, but it's just where my head is at-- their persistence for ESL is at 98.3%, and their persistence for ABE is higher than that. It's about 99.1.

So it's amazing what they do with their persistence. And then their EFL gains are pretty astonishing, too. So we're not a high performer, but what I will say is we know where we're not and what we need to do to work on it.

OK, so you can slide on down, Jay. And so this is the same thing except for 2018-19, so you can keep moving.

And then we look at persistence rates in the same way. And so you can see some of us are doing better than others in persistence rates. And we have some areas for growth. But a lot of it is that we don't have enough data, and we don't have a big enough sample size.

And the other piece of this is I don't have 2020 and 2021 numbers, because I have been too depressed to actually do them, because I don't want to see what that data looks like. But eventually, I'm going to need to get over the shock and depression of it, because I know all of us have seen plummeting enrollment and subsequent issues around persistence and performance.

So, anyway, we look at it from a global point of view, but the key is that, looking at the black and the red, we're able to sort of see-- this is something Jay did, but it's a spreadsheet that I've created. I can share that with anybody that wants it. You can keep going, Jay. And there's 2018-19. Keep going. There's our total unduplicated enrollment. We pulled this from TOPS.

Jay Wright: What--

John Russell: Go ahead.

Jay Wright: The question came up a few minutes ago-- which report did you base this Table 14 on? Somebody was asking about that right as you came in.

John Russell: OK, great. Yeah. So, when you go-- As a program manager, I am able to pull the reports. There's a demographic report-- I'd have to go into TE and figure out what report it is-- and it comes from being a program manager and having access to everybody. And what-- let me see if-- Is that unduplicated?

Jay Wright: Yeah. Yeah. Karina is asking, so I know you know about the TE consortium-level report-- so I want to say this is probably a variation on the demographic summary right here and, to some extent, a variation on the CAEP summary. But these are just consortium-wide enrollment totals. I'm assuming, probably, from Column E, the CAEP summary, would be my guess.

John Russell: Correct. Although-- yeah, Column E in the CAEP summary is duplicated. And what this chart does-- and it's corrupt, and don't get me started, because we set ourselves up for failure with how we've set things up in ASAP and how it comes over into TE-- is that this unduplicated report gives you the lowest level of what somebody comes in at.

So if you have an ESL student who does ESL and eventually goes and maybe gets a HiSET, then it's going to count them unduplicated in ESL. They'll be duplicated in another report. There's a demographic report that-- I'd have to get into TE and look, because--

Jay Wright: So a short response is it's not something you just printed straight from TE. It's something you had to--

John Russell: Absolutely.

Jay Wright: --craft together from a few different things. It's not readily available something she can run now and see how she's doing compared to you or whatever?

John Russell: Yeah. So I take all the stuff that's in TE, and I don't just use them. I don't like the interface, necessarily. And I redo them, so that they're-- and then, once I have them redone, then it's just easier to redo them. But I just don't like just downloading and using TE's formats, like the CASAS data portal. I completely have changed that. And it means that there's manual work that I have to do, but I like the way that my reports are set up rather than the way TE--

Jay Wright: Somebody's getting on your case for that one.

John Russell: OK. And, actually, what I have found, Karina, is that I think that we're going to drop this report or this table. Because I think that unduplicated is not as meaningful as duplicated. So I think we're going to look more at the duplicated.

The problem was that one of our members-- and the director of that school is no longer with us, so I don't mind saying it-- she just went in and checked a box in ASAP, just because she had her ESL, that immediately gave her like 2,400 enrollments in a program. And then, suddenly, her unduplicated enrollment was double, because she checked this box.

And so I was like, well, that's not-- that doesn't have integrity. And when I tried to push back, it's a political dance in trying to look at data, because it can get really testy, especially if funding dollars are tied to it.

So you can go past this, Jay. And that's the next year. Keep going. OK.

So this is a really important table that I have pulled together from our CAEP summary tables across all the members, except Citrus does not have a CAEP summary. We have a great partner in Citrus. They-- I don't want to say it's because they don't take any money, but that's part of it. So they don't ask for money. And they've actually found ways to help us with their Strong Workforce money, so they're an incredible partner.

And we've had a great partner with Michael Wangler. Ivon McCraven is there now. But Michael has left, and so we've been doing a lot of work in terms of transitions. But they don't get into our CAEP money, so that's fantastic.

But if you look at this, this is a summary from 2017 to 2020. And we actually used this again when we were doing CFAD negotiations for 2021-22, because everybody's enrollment for 2021 was not great. So the percentages were thrown off a little bit. And we looked at the literacy-- which should really say NRS, but that's the way it's listed on the CAEP summary-- and then the CAEP students, and then services. And so we looked at those three, looked at our total enrollments, and then said, OK, for these years, these are the averages and the number we've served.

And then you can scroll down, Jay. And then, we use that to look at macro gaps, which are, I mean, huge. Because when we all pull census data, you're going to see massive amounts of gaps in terms of the number of people in the region, five years and older, that don't have a diploma or speak English less than very well.

So-- you can scroll up from there, Jay. That's one area we look at it. We also look at what we call micro gaps. And there was a-- Carol was on-- no, Thoibi was on-- a breakout session today to look at, specifically, this.

How do we lose people? How do they come in the door, and then we lose them? We don't get them to 12 hours. So while persistence looks at the number of CAEP students that have received pre- and post-tests, these, in our consortium, are what we call micro gaps.

How did they come in for-- they came and registered. They obviously walked in the door, and then we didn't get them to 12 hours. Now, a lot of this is because we have career centers, or job centers, or one-stops, and people come in and they maybe just want to take a typing test. Or they come in and check a job board, and they don't stay for 12 hours.

But there is a huge amount of students that, for whatever reason, come and go. And I know Thoibi had a brainstorming session today about what do we do to keep them.

And what I will tell you, again, bringing up the WASC that I'm on at Torrance-- these numbers are phenomenal. So we're at 38, 38, and 42.5. Again, some of that is going to be because we have a one-stop or career center where what people are looking for is not to come for 12 hours but just to get those job-seeking services. At Torrance, their percentages are 22, 15, 17-- phenomenal.

But here's the thing I'll also say-- I don't think that they knew that. I don't think that they had really taken a look and analyzed the data the way that we do at our consortium. And I'm able to go, hey, you're doing this, but you don't even know that you're doing it. So they've had a tremendous amount of success in keeping people beyond the 11 hours. And so what I was telling them, I said, you should figure out what you're doing, and bottle that, and sell it, and come do a breakout session at the CAEP.

So anyway, you can move on from that, Jay. All right. This is how we used that table previously to do our funding.

So we looked at the NRS, CAEP outcomes in total services. Everybody said, OK, let's use the middle total. And we, basically, said-- we looked at two years of enrollment, figured out what the total number for that was, which was 4,914-- you can see that in the middle section. We did the percent of enrollment for that. And then we came up with-- this is for 2021 funding--

[alarm]

That's a timer to let me know that I'm talking too long, because I always do. So I'm going to put this for five minutes.

But what you can see is that after we pull out consortium overhead, which is the fiscal agent fee, and the program director fee, and a couple of other fees, then what's left we divide up. We multiply each member's percentage of enrollment times what's left of the total CFAD amount, and that's how we get our CFAD amounts. And it's been a process that has worked, but we're now going to start to use member effectiveness-- how much your EFLs are growing-- as part of that discussion on a go-forward basis.
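A minimal sketch of that allocation arithmetic-- the dollar amounts, overhead figure, and member enrollments below are hypothetical, chosen only so the enrollment total matches the 4,914 mentioned above:

```python
# Hypothetical figures: total CFAD allocation, consortium overhead (fiscal agent
# fee, program director fee, etc.), and two years of combined member enrollment.
total_cfad = 1_000_000.00
overhead = 120_000.00
enrollment = {"Monrovia": 2950, "Azusa": 1200, "Duarte": 764}  # sums to 4,914

distributable = total_cfad - overhead          # what's left after overhead
total_enrolled = sum(enrollment.values())

# Each member's share: its percentage of enrollment times the distributable pot.
allocations = {m: round(n / total_enrolled * distributable, 2)
               for m, n in enrollment.items()}

# Sanity check: the shares add back up to the distributable amount (within rounding).
assert abs(sum(allocations.values()) - distributable) < 1
```

Whatever the real numbers are, the structure is the same: overhead off the top, then the remainder split strictly in proportion to enrollment.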

You can go to the next one, Jay. And then, this is a huge one that we look at. We look at how much did you get in CAEP funds, and what were your total instructional hours. So we're all required to report this. This comes from the Trailer Bill. And so I pull this from TE-- TE gives us instructional hours-- so that we have a consistent source. Because some people go to ASAP, and ASAP hours are just all over the map, as are their reports. Their enrollment reports-- I don't know what they do, but the reports never seem to be trustworthy.

But we look at this, and we look at the number of instructional hours, the amount of funds. And then we say, these are the number of students that you had. And we look at how much was your CAEP dollar per instructional hour. We do that for everyone.

And then it becomes, who's an outlier? Who-- the old Sesame Street song-- one of these things is not like the other one. If somebody is a big outlier, then we really have some hard conversations about that. In this case, Duarte is getting such a small amount that it's not a huge issue, because it is such a small amount.

But we also get to look at how much we are spending per student that walks in the door. And this provides an opportunity for us to ask questions around our effectiveness. What are we doing? Are we effective? Are we spending the money wisely?
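The cost-per-hour comparison can be sketched like this-- all the member figures are hypothetical, and the 1.5x threshold for flagging an outlier is an arbitrary assumption, not anything the consortium has stated:

```python
# Hypothetical member figures: CAEP dollars received and annual instructional hours.
funds = {"Monrovia": 610_000, "Azusa": 240_000, "Duarte": 30_000}
hours = {"Monrovia": 41_000, "Azusa": 15_500, "Duarte": 900}

# CAEP dollars per instructional hour for each member, and a consortium-wide rate.
cost_per_hour = {m: funds[m] / hours[m] for m in funds}
consortium_rate = sum(funds.values()) / sum(hours.values())

# Flag outliers: members spending well above the consortium-wide rate per hour.
# Flagging a member starts the conversation; it doesn't decide it.
outliers = [m for m, c in cost_per_hour.items() if c > 1.5 * consortium_rate]
print(outliers)  # → ['Duarte']
```

In this toy data, the small member with very few instructional hours surfaces as the outlier, which mirrors the Duarte example in the discussion: a high per-hour rate that matters less because the dollar amount is so small.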

And Jay has talked about, there's rumblings at the state. And I know the bill died in committee, but this, effectiveness, is coming down the pike. And this is one of the things that we've been looking at to measure effectiveness before. We want to get out ahead of it. So I don't know if you want to add anything to that, Jay.

Jay Wright: I'll bring up a question that Richard brought up about five minutes ago on the previous table. I think it relates to what you and I were talking about, is how do you deal with the duplication. So I think he was referring to this chart right here, where you've got these aggregated totals by program.

So he's noting that, in TE, we provide the duplicated totals by program. We unduplicate it once we total it all up. Is there any formula you use at Citrus to unduplicate, or do you just use those duplicated totals like you would get on the TE CAEP summary?

John Russell: That's a great question.

Jay Wright: Richard, you can jump in if I got it wrong.

John Russell: That's a great question. No. Right. No, we're not unduplicating it-- and that's one reason why I said I think we need to ditch the unduplicated and look more at the duplicated. Because, in many instances, the duplicated is a better reflection of what's happening than the unduplicated. So Richard brings up a very good question.

Jay Wright: Thank you, Richard. I was going to say, I thought I might have bludgeoned your question a little bit, but OK. Thank you.

John Russell: And then, yes, Kay, I will share these. I'm going to-- I have a folder, and what I will do is share the things that I use in Excel to create these documents because everything, right now, on our page is in PDF. And I will provide the Word or the Excel spreadsheet that we use to create it. All right. Go, Jay, because I swore I was not going to talk this long. And I'm already doing it again.

Jay Wright: This is it. The big one here is 34, though? I know this is the crescendo here.

John Russell: Yes-- saving the best for last. Come see our breakout session tomorrow, please. Come see our breakout session. It is called targeted marketing.

One of the things that we use-- it's not the primary focus of the breakout session, but what we have done is leverage Title I funds. Now, Baldwin Park, Hacienda La Puente-- those big institutions that have Pell grants for their programs-- they don't need that session. But we're just a little old school. And what I'll tell you is, I start in a deficit mode every year, and I've got to go hustle for money. And what we have done is completely changed our model by leveraging Title I Workforce Investment funds.

And what you can see is that in 2016-17, we're on the ETPL. Most of you are. If you're not, you should get on there. And we're on I-TRAIN.

And we started in 2016-17 at $20,000 in Title I fees, and since then, we have collected over a million dollars in Title I fees. It's brick by brick. It's hard work. We have a tremendous staff here that allows that to happen, and we have great partnerships with America's Job Centers across the region.

And if you come and find out tomorrow, you'll see how we are going to where students are, which is-- door hangers don't work. Catalogs are nice. They get mailed out. That's not what's working.

What works is, you go on Instagram, Facebook, go into social media, Craigslist. We actually have a marketer that goes and finds people. And I don't know what he does.

I don't want to. I don't want to know what he does. I don't know what Russian servers he used or what he does with cell phones, and I don't want to know. But all I know is that people show up and they want to take part.

And then, what we do is a lot of the pre-counseling. And you'll hear all about this tomorrow, because I'll have our AJCC partners in there. But the end result is that we have leveraged a tremendous amount of Title I fees for our budget for our CTE classes. And we believe that's the model way of using--

It is called Targeted Marketing-- let me get the exact name. Neil, can somebody look that up? Targeted-- I should get the exact name. Look for Targeted Marketing.

And sorry, it's not tomorrow. It's Thursday at 1 o'clock. Thursday at 1 o'clock. Thursday, at 1 o'clock, Targeted Marketing and the CAEP Model for Expending CTE Funds or the Model for Expending CAEP CTE Funds.

What we do is we're proposing a model as to how they should be done, at least for smaller schools like us. So that's-- this last piece-- we're now spreading that expertise to our members in our consortium. And there's a lot of very exciting things that are happening because of that, because of the partnerships that we're making.

And the last thing that I'll leave you with is that the feds and WIOA said, here are your five titles. Title I, you have to have MOUs with all these other partners. And part of-- the main person that they have to have an MOU with is Title II.

And we all know that, as Title II schools, we have to go sign that MOU to get our Title II WIOA funds. And we don't want to work without those funds. And they don't get funded and they're out of compliance if they don't have us sign.

So the feds have said, hey, you gotta get in the same room and figure things out: start to provide wraparound services, basic education, and ESL to help people go and get jobs, and help these Title I AJCCs-- which are funded by Title I fees-- get their people the necessary academic skills so that they're successful in their trainings.

The state, through AB 104, has said, hey, you should be leveraging your partnerships. And the primary partnership you should be leveraging is the AJCC. So the state is saying that, and the feds require the Title I AJCCs-- they're mandated-- to provide financial aid.

We can do that with CAEP funds. And the way we do it is we charge a certain amount of tuition, and then we lower the tuition. We say, yeah, you're getting financial aid. And then their Title I funds help us.

The state and the feds want us to do that. We're supposed to be working together in order to get an understanding of what industry wants and how to provide the trainings that are helping industry, that are helping the job centers, the Title I recipients at WIOA. And so we go into that in depth on Thursday at 1 o'clock. But what I'll tell you is the million dollars of funding that we've received over the last four years has made a tremendous difference in our operations, because we're able to spend more money on ESL and ASE programs and make CTE more of a standalone model.

Anyway, I don't know. I'm just--

Jay Wright: So, first off, Thanks to Jodi. She posted your session, so it looks like that's taken care of. So thank you, Jodi.

John Russell: Thanks, Judy.

Jay Wright: And then, the other thing I'll just bring up about this issue is, this is a big hot topic here for 2021-22, kind of one of the anomalies of some of the year-end data we got in 2021 where, unsurprisingly, most Title II agencies are reporting some difficulties during the pandemic with communicating with their Title I partners and so on. But somewhat surprisingly, the amount of money Title II agencies have gotten out of this has increased.

So we have kind of been looking for Title II agencies like John's that have actually been doing this, to figure out where that's coming from. It's figuring out a positive, not a negative. There are a few agencies that are actually cracking this code now, finally, after five years.

So I'll just say, if you know you're doing this, we definitely would like to hear from you too. Because, from our point of view, the number of Title IIs that are cracking that Title I code still tends to be few and far between. So we really are looking for Title II agencies that have figured this out a little bit.

And I think a lot have cracked the code on finding out who their Title I is, but taking that to the next level where it's actually related to actual revenue has been where the few and far between comes in. So if you feel like you're doing it, we definitely would like to hear from you. And this definitely is going to get to be a bigger and bigger topic over the next year, is working with your Title I to crack some of this Title I finance.

John Russell: Yeah. The money's there. And they don't want to give it to private schools. They would prefer to give it to public schools. They want to give it to us.

And there are a lot of private schools out there that are figuring it out. They are certainly figuring out a way to feed from that trough, intensely. Now, we're not set up as a Pell Grant school. If you are, more power to you that you've figured it out beyond what we have. We've talked about it, but it requires a real fundamental change in how you provide services, because the classes have to be a certain length, and we've been more of a short-term operation. But in any case, please stop by. Stop by at 1 o'clock on Thursday.

Jay Wright: And I'll say thank you, Carol. You're adding some good stuff from San Joaquin that fits right into this. So thank you for that. So I'll say, Veronica, did you have anything you needed to bring up to close?

Veronica: Yes. Can everyone hear me?

Jay Wright: Yep.

Veronica: OK, great. Yes. So if you all do not have any more, we have about two minutes left. But I did post a link to the evaluation so that you can go ahead and rate today's session. When you click on that evaluation link, just note that the evaluation link is used for all sessions today. So you'll move past the CAP update and choose from the dropdown menu for this particular session. And that is how you would rate that session-- or excuse me-- this session that you are currently in.

Jay Wright: OK. All right. Yeah. Thank you. Neil is going to take care of the talk about the data police-- all right, good. The sheriff's going to get you on being late, too. Thank you, Neil.

John Russell: Thanks, Neil. You know, what can I-- I'm sorry, Neil. I was doing a WASC visiting committee. Shoot me.

Neil Kelly: The sheriff's going to get you.

Jay Wright: All right. Well, thank you very much, Carol. Thank you very much, Thoibi. Thank you very much, John. And Veronica, as always-- great session, as always.

Hopefully, the fly-by-night nature this time was OK with everybody. I think everybody is so used to this by now that people would expect nothing less than a little fly-by-night. So I'm going to chalk that up as expected.

And thank you very much, everybody. Lots of great sessions here tomorrow and Thursday from each and every one of us, I got to say. So thank you very much, everybody.

Carol Hirota: Thank you, Jay.

John Russell: Thanks, Jay. Thanks, Carol. Thanks, Thoibi. You all rock.

Veronica: Thank you, everyone.

Carol Hirota: Thanks, John. Yeah, thanks, Thoibi.

Veronica: There is a little bit of a break.

Thoibi Rublaitus: Thanks, Carol. Thanks, John. Thanks, Veronica.

Veronica: You're welcome. The next set of breakout sessions starts at 3 o'clock PM. So if you haven't checked the agenda, check it to see which session you want to attend next, and log on at 3 o'clock. I know Carol and her targeted technical assistance team will be hosting a session. I'll be the tech host for that session as well.

So thank you all very much for your time, and, again, join another session, visit the exhibitor booth, take a photo, and complete the evaluation, please, as well as look at other areas in the conference platform. Have a great day, and see you all soon.

Jay Wright: Thank you much.

John Russell: Thank you.

Thoibi Rublaitus: Bye bye.