And here we go. The attendees are starting to come in. We've got one. [laughter] Sometimes, those links are a little hard to find, and we do have to hit the Refresh button every once in a while so we get the Join button. We're up to two, yay! I mean, it was just-- here we go.

All right. And we're starting to fill in. It's looking good. All right, everybody. My name is Melinda Holt. I am the tech host today for this webinar, NRS Performance Goals-- doo, doo, doo, doo, doo, doo, doo.

Everybody come in. Have a seat. There's lots of room up front. Everybody just move towards the front. You'll get much better seating there. They're nice and cushy seats towards the front, hard chairs in the back.

And we're going to go over a few housekeeping items really quick here. So Jay, if you could release Share, thank you. This will be really quick, folks. And you've probably seen these before, but we have to do this every time.

We are recording this webinar. All of the recordings will be available on the VFairs platform as soon as they are downloaded, rendered, and uploaded. So that takes time. Please give us two to three business days, if not more. If you don't see the recordings on the VFairs platform, contact TAP after about three days.

Audio-- you control your own audio. It's one of the few things you do control here in webinar land. If you cannot hear, or if your sound is really soft or really loud, turn your volume up or down. If the sound appears to be coming from your speakers instead of your headset, check the little caret next to the Audio Settings button. There you can control your output, which is what you're listening to.

The webinar Chat will be used, as well as the other button down there, the Q&A. Keep the webinar Chat for general chat, and make sure that All Panelists and Everyone is selected. If you just choose All Panelists, then the panelists are the only people who will see your chat. So if you want everyone to view your chat, select All Panelists and Everyone. That's just for general chit-chat.

Or if the presenters ask you to say "yes" or "no," use the Chat. All other questions should be typed in the Q&A. So to get ready, go ahead and click on the Q&A button and pull it off to the side. When you have a question, that's where you can type your questions related to the presentation, please.

You can also Upvote someone else's questions, so you don't have to type it. You let them do the work. You Upvote it. And then the viewers will see that more people really want that question answered.

You have View Options. This is the other thing that you can control during a webinar. You can fit to window, or you can increase the size of the presentation. So if things look a little dim, you might want to increase your setting. You could also Exit Full Screen. That way, you can multitask while you're listening to the presentation.

There will be an evaluation at the end of this webinar. When we end the webinar, you'll see a Continue button. You click on that, and it should open the evaluation. Today it should open the evaluation so you can actually provide input.

Please don't edit, all right? And we hope you're having a great summit. I'm going to go ahead and stop sharing and hand off to our lead presenter, Jay Wright from CASAS. Jay?

Hello, everybody. Hopefully, everybody can hear me OK. I'm sharing my quick little PowerPoint here.

We can hear, and we can see.

Going to get into full screen here. Hopefully, everybody can see what we need to see.


But I want to start by introducing my all-star panel. I have Steve Curiel from Huntington Beach Adult School. I have Carol Hirota from the Delta Sierra Consortium. I have Thoibi Rublaitus from Corona-Norco Adult School. And I have John Russell from the Citrus Consortium.

Just in the interest of giving full credit where credit is due, Branka Marceta from the CAERC Capital Consortium was a fifth panelist. She did do a lot of the preparatory work with the other four panelists, but she was triple-booked. So she was unable to actually be a panelist.

Got to say, we have way more than enough already, though, to fill the time with the panelists we have. Because we have a great panel, there is a short little presentation I was going to do just so everybody's 100% clear on what the heck we're talking about when we use the term "NRS Performance Goals." But I really wanted to get the panel out here in front, so I wanted to introduce them.

Don't worry about this crazy bunch of questions here, but this is what the panelists had to face by being on here. So a good appreciation for the panelists, as I gave them something like 20 pages of this sort of stuff. Just wading through all the questions I overloaded them with was a big heavy lift in and of itself.

But back to this slide, which is a lot easier to view, I wanted to have each panelist introduce him or herself and kind of talk about how they've implemented this over the years. It was no accident who I picked. I knew exactly who I wanted to pick for this panel because they've shown superstar qualities for a very long time.

We've had lots of detailed discussions, each one of these panelists, about their data and about how they use their data to excel at their agency and at their consortium. So I'll start with Steve at the top. He appears at the top of my screen, and he's at the top bullet on this slide.

Hi, everyone. Just happy to be here. And of course, any time Jay asks me to pitch in and talk about my school, I'm happy to do so. We've been using these performance reports, the NRS tables, the TE tables, ever since I jumped into ESL, what, 12 years ago. I wasn't an ESL person; I was more of a CTE administrator.

And so I started learning about the NRS tables, and some of you folks are new to this. I will say, honestly, it takes a little bit of time. You've got to be talking with, definitely networking with, your colleagues, finding out where your data is in terms of how you're performing compared to other schools.

And, let's see here, one of the things that I highly recommend is attending the workshops that CASAS and CDE put on. Those have been crucial and beneficial to me and my staff as we've learned about the performance goals and the changes as they come in. So with that, I'll save the rest of my comments for my next section, where I've got a whole section just to myself.

OK, thank you. Carol, if you could go next?

Hi, my name is Carol Hirota, and I'm representing Stockton Unified School District, which is Stockton School for Adults. I'm currently the Delta Sierra Adult Education Alliance Executive Director, but formerly the director of Educational Services, Adult Ed.

I've been in adult education since 1995. I am a credentialed speech-language pathologist, so data is very important, and that's always been a priority. I will be talking about our journey with our colleagues in Stockton.

But I will tell you, I think the change process started when Marian Thacher came from OTAN and created a technology integration program project with our faculty. And I think the other significant professional learning opportunity was the professional learning communities and the PLC Institute, and the process. And I will be talking about that with the data teams, so thank you.

Thank you. Thoibi?

Good morning, everybody. My name is Thoibi Rublaitus. I am honored to be part of this team. And before I go ahead with anything, I'd like to share that I'm a relatively new principal. However, I have been with Corona-Norco Adult School for about 13 years now.

And under the leadership of my director, Jodee Slyter, we've had a very good data team here. And I've been part of the data team. And today, I will be sharing with you a little bit of how we do what we do with a student-centered perspective. Thank you.

Thank you. John?

I'm John Russell. I'm at the Citrus College Adult Ed. Consortium. I'm also assistant principal of Monrovia Community Adult School. I don't know that we're necessarily superstars in our results, but what I will say is that we're superstars in looking at the data. So we do pretty extensive deep dives into our data as a consortium. That fuels a lot of conversations as to how successful we're being as individual member institutions and as a consortium.

And you know, I'm just happy to be here to share that process and answer any questions as to how we look at all these things and say, maybe our results are not spectacular. But we know what's fueling and driving our decisions. And so with that, I'll just pass it back to Jay.

OK. Thank you, everybody. I just wanted to get everybody in before I got into the material. So I'm going to be super brief, but we have this title, NRS Local Performance Goals, and I know some of you have been into this and know it, but it's a little bit of an ambiguous title. So I wanted to spend five minutes or so running through some slides, just so everybody knows what we're all talking about.

So for starters, we have the NRS goals. NRS stands for National Reporting System. Most of you know this, but just to be clear, that refers to the feds. To be even more clear, the NRS isn't really the feds; they're contracted by the feds to provide regular evaluation at the national level, very similar, I like to say, to what we at CASAS provide at the state level: standardized evaluations and standardized accountability services.

So when we talk about NRS goals, we refer to our state goals that we have in California. To make a long story short, California and all other states are required to establish goals with the NRS, or the feds, every year, so that's a negotiation process. So we have our statewide goals.

Once we know what our statewide goals are, we look at our local data to establish local goals, so we can, obviously, all be superstars and be above average. Or if we're not above average, we can get closer to average, depending on where our data is and how we look at it, ABE versus ESL and so on.

So I don't want to get too detailed, but I'll just say you can use the NRS tables in TE, and/or you can use the CASAS Data Portal, to look at your local data and compare it to statewide performance totals, as well as the statewide performance goals we establish every year with the federal government. That's the name of the game.

When you look at how the feds measure California and compare us to everybody else, that's the way they do it: they set goals for us and look across these 12 levels on the federal tables to see whether we meet or exceed our goals in as many of those 12 areas as possible.

OK. So here's the screenshot for the CASAS Data Portal and a link. Some of you, I'm sure, have looked at it. But we have data dating all the way back to the '04-'05 year, so it goes way back for everybody. If you've been WIOA Title II for any period of time over the last 15 years, your data rests here. We've got Table 4, Table 4B, and persistence results dating back some 15 years on the Data Portal. It's really user-friendly for comparing and contrasting your data with state totals, with your neighbors, with other large or small agencies, and so on.

So in the training, we get into that a lot more. But just to summarize: in general, if you're just trying to see how you compare with the Joneses, so to speak, we say use the Data Portal, because you can stack up your data year to year or against neighbors or whatever and really get a good view of how you compare with everybody else. Once you determine what your strengths and needs are, then we suggest getting into TE, using the NRS tables and taking advantage of features like drill-down and right-clicking, so you can really get to the bottom of areas where you're strong and areas where you're not so strong.

So here is a screenshot of the CASAS Data Portal. This is what it looks like if you've never been there before. Again, we show the goals, and we show the averages. And you can see where it says Agency, we whited out the agency. But you can use it to add your agency totals, or other statewide totals, county totals, and so on, and see how you compare across the six levels of ABE as well as the six levels of ESL.

I'm not going to get into gory details here, but you can see you can stack it up. You can look at areas where your percentages might be better and areas where your percentages might be worse. So here, moving quickly, here's an example with the red arrow. That might be a below-average area that we would want to target: ASE Low.

Don't worry about all the details for now, but it shows we're at 22.6. The goal is 37 or 39 this year. So you can see that's an area where we're way below average. That might be an area where we need to improve.

Just to keep the concept going, ESL Beginning Literacy is an example where we're strong. We're above average. We're doing really well. It includes a pretty good number of students.

So that would be an area where we might say, yeah, we're on the right track with that. So we're all going to have some areas where we're doing really well and others maybe not so well relative to how we do with others, relative to how we do across our other areas. That's how we can decide our areas to set as goals year to year.

So here's an example with persistence. We look at performance data, but we can also look at persistence data. We'll talk about different ways of measuring persistence with the panel, but in federal reporting land, it's just the percentage of students at each level that stick around long enough, or do, quote, unquote, "persist" long enough, to at least complete a pre-test and a post-test.

We've done lots of workshops where there's alternative ways of looking at persistence, but the NRS data just looks at the number of students who qualify for each of those 12 levels and looks at the percentage of those who persist with the pre- and post-test. So here is another way of looking at that same concept.
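For those who like to see it concretely, the NRS-style persistence rate described above boils down to a simple ratio per level. This is a hypothetical sketch (the field names and list structure are invented for illustration; the real figures come from the NRS tables in TE):

```python
def persistence_rate(students):
    """Percentage of students in a level with both a pre-test and a post-test.

    `students` is a list of dicts with boolean 'pre_test' and 'post_test'
    keys -- a made-up structure purely for illustration.
    """
    if not students:
        return 0.0
    persisted = sum(1 for s in students if s["pre_test"] and s["post_test"])
    return 100.0 * persisted / len(students)
```

So a level where half the qualifying students have a matched pre- and post-test would show 50% persistence.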

We've got one example, ABE Intermediate High, where we're not persisting very well, and another one here, ASE Low, where we are. This example presumes we've got two areas where our performance is low and we're looking at targeting them because we're a little disappointed in our results. So we talk about looking at persistence as a first way to go: if we know we're particularly weak in an area, we evaluate whether we've been able to achieve persistence, yes or no.

So an example of where it's "no" is with this red arrow. We're performing weak in ABE Intermediate High, and we see that we also need to improve persistence. So this is a good indicator of what we need to look at first: we need to get everybody tested and/or improve our demographics or instructional hours data first, and then start looking at improvement in the classroom.

The alternate example is ASE Low, where we know we're struggling. But as you can see by this graphic, persistence does not explain it. Our performance is weak, but our persistence is actually pretty good. So we know we're not going to get very far by correcting persistence, and we thereby want to look at other reasons, typically reasons in the classroom.

So what we want to do then is evaluate our data. This is just an example of looking at it longitudinally. Once we identify areas where we know we want to improve, and we start looking at some of the reasons or areas we need to improve in order to get that level up, we can start digging in our data.

We can dig down on the Data Portal by looking at longitudinal results, look to see whether it's always been a problem spot, whether we've improved over the years, whether maybe it was a good spot earlier, and for whatever reason we've gone down, lots of different explanations. But we want to look at the long game here and kind of develop a short-term and long-term plan for how we might improve.

The other big issue with persistence: some of you have probably been to these workshops where we talk a lot about this basic 2-by-2 diagram. Once you determine performance high or low and persistence high or low, this quick little graphic kind of helps point out what we need to do in order to improve things.

Long story short-- again, I want to be quick and turn it over to the panelists-- if we know we need to improve and persistence is also low, that suggests we need to improve our data and our testing. We need to fix demographics and hours of instruction and get everybody pre- and post-tested.

If our performance is low, like in that upper left, but persistence is actually pretty good, then we know we've gotten everybody tested. We know we've done pretty well with our demographics and instructional hours data. But yet, our performance is still low. So that means we need to start looking in the classroom in order to get those levels to improve.
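The 2-by-2 heuristic can be summarized as a tiny decision rule. This is just a sketch of the workshop's general guidance, not an official CASAS procedure:

```python
def next_step(performance_ok: bool, persistence_ok: bool) -> str:
    """Map a level's 2x2 performance/persistence quadrant to a suggested focus.

    A hypothetical encoding of the workshop heuristic described above.
    """
    if not performance_ok and not persistence_ok:
        # Low performance, low persistence: fix the data first --
        # demographics, instructional hours, pre-/post-testing.
        return "improve data collection and testing"
    if not performance_ok and persistence_ok:
        # Low performance despite good persistence: look in the classroom.
        return "review instruction and curriculum"
    # Performance is meeting the goal; keep doing what works.
    return "sustain current practice"
```

So a level that is weak on both axes points to data and testing fixes, while a level that is weak only on performance points to instruction.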

So this graphic is just a real generalized way to show everybody what steps they might need to take to improve that particular level based on what our data is telling us. So that was a very quick-- that summarized about six hours of NRS Local Performance Goals workshops in 5 or 10 minutes. Hopefully, that told you something instead of nothing. I'm not sure, but I wanted to provide a really quick overview for everybody, just so you know what we're talking about overall.

So here's the outline. We talked about those first two: the second bullet is when the panel introduced themselves, and the first one is what I just got through doing. So each panelist is going to have a moment in the sun, starting with Steve, who's going to talk about using NRS levels and how they can apply across the board in different areas. So I'm going to turn it over to Steve now. I think you might be muted.

Steve, you're muted.

Sorry about that.

There you go.

Sorry about that. Yeah, so over at Huntington Beach Adult School, we've been-- actually, I've been really into the data tables that I look up on the Data Portal. I won't say it was surprising, but it was something I had wanted to see: some kind of statewide data and local data that was in many ways absent from some of the other programs I was overseeing before that.

And so when I started looking into the data, I saw exactly what Jay was pointing out: data that interrelated persistence and outcomes. It was a concept that, at that time in our programs, was not really connected. And it happened that right around then, the next year, we were hit with the Great Recession. Our district was in the news, I think one of the first ones to make some major cuts, and I had to face some difficult decisions in terms of layoffs and figuring out how to preserve our program.

And so what I did was-- me being new to the school, I didn't have a good feel for all the staff. Outside of what our district had negotiated with the unions, there were some flexibilities, some things that I had to decide in terms of staffing, and I based our layoff decisions on persistence. That was one of the easiest ways to preserve our program, to look at our teaching staff, and to ultimately protect the greatest number of students.

And that was a very difficult time, but that became the year when our data became the driving force in a lot of things we would do in that department, and then later on, when I became principal, schoolwide. A lot of people didn't like that approach, but at the time it was what I felt was best, and it really turned our school and our programs into more data-driven entities.

And so we have built in, through those years, processes around reviewing the data every year, both at the administrative level and also at the department level, where I think it's more critical to do so. And in the review, we looked at performance levels by class. And that means by teacher.

And it is scary; at first, there was some pushback. But really, the idea was, if we're going to figure out how to do better, we need to see who's doing best. It's much the same way Jay picked this panel. I'm honored to be on it, and he picked us because he's recognized from our data that we're doing well. And we do perform well in our outcomes.

I did the same thing with our department, and not for pointing out that someone's not doing their job, but more importantly for pointing out those who are doing well. And how do we learn from them? What are they doing?

And most of our meetings look at, who are our top performers? And we have them lead discussion groups around persistence, how do they get the persistence up, and then what other strategies they use in terms of curriculum and targeting specific questions. I think somebody else will get more specific about all the different reports you can get at CASAS. But really, looking at the data and then deciding, making curriculum decisions from there.

One of the things that I've always been a fan of is looking at low-hanging fruit. What are the easiest things we could address and focus on as a team, as a school, that don't cost us a lot of heartache and headaches? One of the easiest ones right at the beginning was persistence: how do we improve our persistence? We invested a lot in persistence, so that was one right off the bat.

Next thing we did was we looked at, how many paired scores are we getting? And so looking at the number of students that we have in our program, and tracking who's missing a paired score, and looking at strategies for those teachers that had the best performance in terms of paired scores, that was the next step. And so we continue to look for those things that were easy to identify and track, and then actually implement some strategies to address.

And so like I said, we review-- our ESL department is probably our highest performing; that's where we tend to produce the best outcomes. That's where they have regular meetings, and the teachers are discussing the outcomes and performance levels, sharing ideas, breaking into groups by level, and having the best performers lead those discussions. I've got some questions here. I want to check to make sure--

There is one question, Steve, from-- let me go back out.

Which great teacher--

Isn't there a risk at misalignment when you are setting goals based on two-year-old data?

So yeah, you could look at it that way. But the-- I won't say it's inaccurate. One of the things that I've gotten used to, though it took some getting used to, is data where the outcomes or the results, in terms of revenues and grant amounts, are based on two-year-old data.

So there is the issue: we're looking at data on an annual basis. At the end of the year, when you submit your data, whoever your CASAS person is can give you your TOPS reports, your Table 4 report, Table 4B, and persistence reports. You don't have to wait for the data to be posted on the CASAS Data Portal; you have that data live.

In fact, we run those reports quarterly to look at them. Sometimes they don't get to the department because the data's a little fuzzy, in the sense that stuff's not cleaned up; you haven't made sure all your data integrity information is cleaned up. But we can look at the data right after we end the fiscal year. More importantly, we do look at persistence, which is live data, or attendance persistence, the paired scores report.

One of the other things we did early on, as a leadership team, was decide that no student would enter a program, start a class, or get any instructional hours until they had a pre-test. So that was a mandate in both the diploma, or ASE, program and ESL. We took care of that right away, so it's pretty much not an issue for us anymore.

And so with that, there's another question about whether the majority of our teachers are part-time or full-time. Like most adult schools, the majority of our teachers are part-time. We have invested in full-time teachers.

I would say we have about the same ratio always: about a fifth of our teaching staff, 20%, is full-time. I try to keep it right around there in all our programs if I can, just to build some consistency, some long-term commitment, and some folks that I can go to consistently to talk about current data and data from years past.

And so, let's see here, anything else? We do implement a testing calendar that everyone, and I mean everyone, from teachers to classified staff-- our receptionist knows about it. Things change during test week, when we do test students for our post-tests and for EL Civics.

And so that's been something that's been very helpful, because support staff-- and I know somebody's going to talk about them later-- are critical to this whole piece, and they need to be in the know about what's going on. As far as communication, people are calling in, and staff want to be aware if it's testing week. They want to know what kind of questions they're going to be getting so they can prepare for that.

And the Data Integrity Report-- that is definitely a support staff report that we review on roughly a quarterly basis: around December, then March, then near the end of June, to make sure all our data is clean and that we're getting all the dates of birth and all the other data that's reported in the NRS tables.

And we're always looking to beat the previous year's numbers. So if we're missing a certain percentage of dates of birth, we make sure the next year it's less than the year before. But most of our NRS data is pretty clean, pretty good.

And I think with that, I'm going to conclude and hand it over to Carol for the next--

Hi. So in Stockton, we've been collecting and really looking at our data since around 2003. Every year, in the summer, we would put up our percentages for the 12 program areas, or the 12 educational functioning levels, and then we would put the performance goal that the state had established.

The only data we would have, six months later, would be comparing ourselves with all of California. But we have our own data, so before the school year starts, we are able to create our goals by looking at our previous year's data. We can also look at how many students are in each of the different educational functioning levels, and that actually triggers conversation with our faculty and our staff, because we want to make sure that our data integrity is accurate.

So I'll go into a little bit more about how working on this kind of data and the performance-based payment grant for WIOA has influenced the way the school, the faculty work with their professional learning communities.

So I could go ahead and talk about this. So when I share data results with staff, they are actually super critical in this whole process. They are really our rock stars, our office staff. We have a school secretary, and we have six student data technicians. So they are the ones entering the data.

And they are more than just the student data technician. They are also the registrar, the attendance clerk. And they are our number-one customer service right off the bat, the counter, phones, email. And so they are constantly looking at how registration, TOPS Forms, test data is entered into our MIS.

They look at that Data Integrity Report to make sure it is accurate, and they support teachers and students to ensure there's a pre- and post-test and to schedule make-up tests. They also support students in making sure they know how to register for the GED or HiSET exams. They're looking at accurate attendance, TOPS Forms, and data entry, so critical.

We share quarterly and annual data results, and we exchange feedback with each other. I think the other critical thing that's really important is to support your office staff and send them to professional learning opportunities, like CASAS Institute, or professional organizations like CCAE. They also have regular staff meetings. And sometimes, they'll attend a faculty meeting with teachers and counselors and the administrators. And of course, reinforcing daily communication is essential.

So those are some of the things that we think are really important. Oh, OK. And then, sharing data results with teachers and counselors: we can start with weekly monitoring reports in TOPSpro Enterprise that the teachers can have, so they know which students have paired scores, which students have completed EL Civics, which ones have had learning gains, those kinds of things, and what areas and gaps they might have within an ESL class.

It is much more difficult in our adult basic ed classes and our adult secondary education classes, with the high school diploma and GED, because of so many gaps. And also, we have found-- and this is more of a subjective observation-- that our students in our ESL classes are very diligent when they take the CASAS test.

Some of our students-- not all, but some of our students in our ABE and ASE programs-- they kind of just go through the test, and they're done in 15 minutes. And so we try to encourage them and remind them that the reason why we're able to offer these classes for free is based on our learning gains and student performance, so we do talk about that. Looking at quarterly reports and the annual reports, and of course the professional learning community data team process-- and I can talk about that some more later on too.

Oh, OK. And then, on this one-- the reason why I shared this page is here's a typical day when they come back to school. We look at the data from the previous year. We not only look at the WIOA data. We look at course completions, high school diplomas, and number of GED certificates. And then we interpret and compare the data. We celebrate the data.

But here's the kaizen part of continuous improvement. We ask our teachers to have this discussion. What are factual statements that can be communicated based on this data? What are some of the questions that you have about the data? What do we celebrate? And what do we need to improve for the next year?

So we're doing this on day one, when all of the faculty come back together. And then, of course, they set up their data teams, and they set up their format and their team goals in Google Drive. Every other week, they meet in a faculty meeting, and they meet in PLC for 90 minutes on the alternating weeks. So basically, our teachers are meeting together every week.

So that is actually a really nice feature in continuing professional learning communities. It is actually part of the teacher contract. Our teachers are part of the teacher union, and they're full-time teachers. We also have part-time teachers or actually hourly teachers that work for us during the day or in the evening.

So those are some of the things that we talk about. And looking at communities of practice, this one year, we looked at creating learning spaces. How can we improve our learning experiences for adult learners by changing our classroom environment? So it's just some things to start that discussion for our faculty to engage. So that's just one example.

Oh. And the reason why I showed this Table 4 is that when we look at this data-- because we chart it, and you'll see it in the next slide-- we're really low in the ABE/ASE area. We really struggle there. We don't struggle in the ESL area.

But let's take, for instance, Beginning Literacy. The state goal is 55, and the state average is 54, and we're at 25%, with 4 students in that educational level.

So we can look at how many hours those students were in, but it isn't as much of a priority as the levels with more students. Take ASE High, for instance: the state average is 43%, and we have 132 students at that level.

But the other area that I think is significant is if you go to ABE Intermediate High, we have 387 students. We were at 40%, and the state average was basically 42%. So those are some things. They're close, but still we want to pay attention to make sure that we're maximizing our opportunities for students. So we also look at our numbers, not just the percentages.
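The prioritization being described here (paying attention to levels that are below the state average, weighted by how many students are enrolled) can be sketched roughly as follows. This is an illustrative sketch, not an actual TOPSpro report; the figures are the ones mentioned above, and the ASE High local rate, which was not stated, is an assumed placeholder.

```python
levels = [
    # (level, local pass %, state average %, enrolled students)
    ("Beginning Literacy",    25, 54,   4),
    ("ASE High",              38, 43, 132),   # local 38% is an assumption
    ("ABE Intermediate High", 40, 42, 387),
]

def priority(level):
    _name, local_pct, state_pct, enrolled = level
    gap = max(state_pct - local_pct, 0)   # points below the state average
    return gap * enrolled                 # weight by how many students are affected

for name, local_pct, state_pct, enrolled in sorted(levels, key=priority, reverse=True):
    print(f"{name}: {local_pct}% vs. state {state_pct}%, {enrolled} students")
```

Under this weighting, a level only 2 points below average but with 387 students ranks ahead of a level 29 points below average with 4 students, which matches the reasoning in the discussion.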

But we do compare our percentages from year to year to make sure that our teachers have an understanding that we're not just looking at goals, but we also look at the California state average. We think that we want to be above the California state average. So we pretty much just state that right out.

Hey, Carol, there's a question from Serina, asking whether all teachers, full-time and part-time, are included in your weekly PLCs.


If you do, how do you arrange their schedule?

Right. So it is scheduled every Tuesday for our full-time teachers. Our hourly teachers are all invited to attend, and we pay them. Some of them are able to attend.

But if they work in the evening, and they also are at another district site, they are not able to participate, because they work in the K-12 system. During the day, they have to participate in their PLC or their faculty meetings at their sites. So we don't get 100% hourly and full-time, but we get 100% of our full-time faculty at these meetings.

Thank you.

You're welcome.

Jay, can I make a comment about ABE and ASE? Those--


--are also lower. Those are areas that I've been looking at for a few years now, trying to improve those numbers, because our performance is typically low in those areas too. And I'm not sure why. Obviously, if you have a state average, you have some folks that are above the average, doing well. I'm always looking for the folks that do have those higher averages, to see what they're doing differently.

I don't have solid answers right now. But I know one reason we always get pushback from our ASE teachers is that the students-- and I think Carol said this-- they're not as interested in taking these tests. The tests don't really have value to them when they're working in a credit economy. They just want to get their classes completed, get their credits, and graduate.

So that's always a challenging conversation with our staff, and trying to integrate the testing into the program is not as seamless or as easy as it is in ESL. And then the other compounding factor is that our ABE and ASE students tend to be more the independent study participants. They're coming and going. They're not staying for direct instruction, and so it makes it harder to start tackling some of those standards in the instruction.

They're basically taking courses that are designed to get them through history, and math, and science as fast as possible. So I'm just throwing that out there. Other folks are probably struggling with those areas too, because the ASE program does present some different challenges. And I think it might be tied to the format of the program in terms of how we deliver instruction to students.

OK, thanks. For Steve, for Carol, or any of the panelists, Linda asks, what specific data are you analyzing with your data teams?

Linda, that's a great question. It has kind of evolved. I was just looking at the agenda for each of the different years on how we got started with the PLCs. Formative assessments are critical for teachers to do some comparison and get feedback on the teaching and the learning that is going on in the classroom.

So yes, we look at both. This is the annual data that we look at, and quarterly data, and then we look at the monitoring reports that are set up in TOPSpro Enterprise. But as far as day-to-day assessments, formative assessments are really critical. Those are teacher-generated. Some teachers use a lot of electronics now; they will use an app to get some feedback before the students walk out the door.

In our--

Thank you.

In our program, we started off with that whole idea of the low-hanging fruit. We started first with persistence-- improving persistence rates in our programs-- with the idea, as Jay had explained, that persistence is a good indicator, or can be an indicator, of better or lower performance.

So really making sure we had good attendance practices on the teacher's part-- what were they doing? How are they interacting with students, welcoming students at the beginning of the day, making sure they knew their names? Things of that nature can really encourage others to keep participating, keep attending classes.

Then we moved on to paired scores. Where are we getting the paired scores? In many of our ESL classes, that responsibility, in many ways, falls on the teacher who is administering the test-- encouraging students to show up on the day of testing, and then following up with the students who missed it.

And so we started focusing on paired scores. That definitely relied on more data, because teachers were tracking the students through reports that we gave them-- these students are still missing a post-test, for example.

Then we moved on to the individualized reports, where teachers would be given a list-- the number of students who are missing a test, or the highest-missed questions-- whatever the name of that report is.

But TE will give you reports for your teachers: which test questions are the students missing the most? And teachers can go back and try to hit on those questions, knowing that it'll benefit the most students and hopefully pick up a few more points.

All right, thanks. Carol, are you still on this one? Or are you on the next--

No. No, please go to the next slide--


--quickly. And I'll go through these slides quickly. So here's an example of-- since 2003, this is how we set up our data, using the EFLs on the left-hand side. And then we have Stockton School for Adults, the Performance Goal of the state, which is on the Data Portal. And then, usually in January, we can see how our performance stacks up against all enrollees in California. And we put that data in.

So this is a good comparison for us to see where our areas are. I have to say, though, we do struggle between ABE and ASE. ESL has never been an issue. So you can go to the next slide.

And so it's really important to look at your performance on the Payment Points Summary. And there are lots of areas. These are questions that we ask our teachers to take a look at, to see how they can make some improvements. So, for example, for EL Civics: how many students have one assessment, versus how many have two, and how many have three?

So the following year, we would say, OK, we want a few more students that have one assessment completed and passed. We would like to increase the 400, and then see how many can do three per year. So it's those kinds of things that get teachers into the conversation about how they schedule the EL Civics participation in the lessons. OK.
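The tally being described-- how many students completed one, two, or three EL Civics assessments in a year-- can be sketched with a simple counter. The roster below is invented purely for illustration.

```python
from collections import Counter

# Hypothetical roster: EL Civics assessments completed and passed per student.
assessments_passed = {
    "student_a": 1, "student_b": 2, "student_c": 1,
    "student_d": 3, "student_e": 2, "student_f": 1,
}

distribution = Counter(assessments_passed.values())
for n in (1, 2, 3):
    print(f"students with {n} assessment(s) passed: {distribution[n]}")
```

Comparing this distribution year over year is what drives the scheduling conversation with teachers.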

Whoops, sorry.

Oh. And the reason why-- OK, so we really like this Persister Report because we believe that learner persistence really does help your learning gains. So you can go on the Data Portal and look at your Persistence Report. And this is just a basic one for WIOA. And it has it by the years, and it goes by percentages.

And then you have the final one in the final category. And of course, the goal here is 67%. We always like to be above 70% because we think that's really important. We don't want to go below 70. So CALPRO has some amazing learner persistence communities of practice and resources. I highly recommend you check those out.

And then here's a California Adult Education Program worksheet that correlates to your Summary Report in CAEP. All you have to do is put the data in based on the columns, then do the computation, and you will come up with your persistence rate. And you want 70% or higher. It measures persistence in adult literacy programs-- program persistence-- and it goes in these columns, and then outcomes performance.
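The worksheet computation described above can be sketched as below, under the assumption that the persistence rate is simply persisters divided by enrollees, with 70% as the local target (the stated state goal is 67%). The column totals here are hypothetical.

```python
def persistence_rate(enrollees: int, persisters: int) -> float:
    """Percent of enrolled students who persisted in the program."""
    if enrollees == 0:
        return 0.0
    return 100 * persisters / enrollees

# Hypothetical column totals from the worksheet.
rate = persistence_rate(enrollees=250, persisters=180)
print(f"persistence rate: {rate:.1f}%")
print("meets the 70% target" if rate >= 70 else "below the 70% target")
```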

So this is a really valuable worksheet. We did this one worksheet in our consortium. We haven't really done data teams with our consortium as a group. Thoibi may be able to explain a little bit more about that.

But what I did like is some of the principals went back to their school sites, and they started looking at this, because it's the discovery and the discussions with your faculty and your office staff on, how can we make this better? So this worksheet I believe came from Jay and CASAS, so we just put it all on one page.

And one of the things I'd add also is that when you look at the data, sometimes you scratch your head. Why did it go up? Why did it go down? And keep in mind that a lot of the data or all the data is based on student behavior and how they're attending classes.

And so in years when the economy is doing really well and everyone's working, that's when it's harder to get our students that post-test, because they're in and out of classes. They're just busy. And then in times when they're not as busy-- maybe they're not employed-- then we see a difference.

So there's a lot of factors that go into this. And so I learned over the years not to be too critical with our teachers and push them too hard, especially when it's the students and what's going on in the environment that's impacting them. And they're making the right choices from their perspective as far as income, and so forth. So just keep that in mind. There's a lot of factors that impact our data.


Yeah. And so what I wanted to share here is something that was set up for our data team process. This is one teacher, one section of ESL. What we do is analyze the data: she describes what she's analyzing and creates a SMART goal. What are her instructional and management strategies? What are the results? And then the review date, and then a reflection.

So you will notice that the time period went from August 2019 to November 2019. We try to get in at least three or four data team cycles, where they specifically look at data. And each teacher will identify which area they really want to see learning gains in. But we feel that when teachers focus on this, it's really helpful. And this is some of the discussion that they have at their professional learning communities.

So this is an example of one. If you go to the next one, Jay-- that one was for ESL; this one is for mathematics. This is our ASE, adult secondary ed, teacher: same kind of data, wanting to make sure that there are some level gains, and then doing a SMART goal analysis, looking at the outcomes, like low attendance, transportation, housing. Documenting all of those kinds of things about their students is really important.

And then if you go to the next page, on this one-- future intervention. So what are you going to do to be supportive of the students? So this is the process that the faculty use in their professional learning communities and their data team process.

It's really helpful to have a template where they can put all this down. It helps them create and think. So yeah, it's been very nice, and they share this information with each other.

Here's GED test prep. I believe the year is meant to be 2017-18. Anyway-- improving from the prior year, looking at how many subtests, looking at variables again, and corrective action. And this time period, I believe, was probably from November to February.

OK. And then here's another data collection system that our teachers could use. Here's an ESL teacher on the left, simply marking how many students passed an assessment in English Literacy and Civics Education. And then the other one is another documentation system showing whether students are having learning gains in their CASAS scores.

We let teachers create their own systems that work for them. The whole idea is that they show evidence of learning, and that's really important. We're focusing on evidence of learning and creating student outcomes. That shifts the focus away from just the teaching: if you look at your student data, then you're focused on student learning.

So the next one-- this process didn't just happen overnight. What I did is give you some resources that supported our faculty, administrators, and counselors-- the process we used with professional learning communities. It wasn't just one thing. It was Solution Tree.

We had data team training with Brandon Doubek, who was with Creative Team Leadership. Also, looking at the DuFours' work-- and Leading Change by John Kotter is amazing. Because if you don't have an urgency to change, then change won't happen. So I highly recommend John Kotter's book.

Stuff on cultural proficiency-- Robert Marzano has a lot of things on research-based strategies that work with students. A simple one is just doing a comparison, as opposed to just doing something low level. In Bloom's Taxonomy, you know you want to get up to a higher level.

The Data Teams process is supported by things like Leaders Make It Happen! An Administrator's Guide to Data Teams, by Brian McNulty. Angela Peery's book is one of my favorites-- The Data Teams Experience. And there's also The Big Picture: Looking at Data Teams Through a Collaborative Lens.

Our district also brought in Solution Tree, as well as Brian McNulty. So as adult educators, we had access to all of the same professional development as our K-12 colleagues. And then, of course, the College and Career Readiness Standards and the ELPS-- the English Language Proficiency Standards-- are really important.

I think Linda asked about formative assessments. This book is great. It's called Transformative Assessment by James Popham. And then Viviane Robinson really focuses on students.

And-- oh, where did I-- oh, John Hattie. I missed John Hattie. It's on the first page, on Visible Learning. I think that has been really instrumental: we look at his guidance to teachers on what works for students. So those are just some things.

And then, of course, human-centered design-- we really benefited by using this kind of an approach because it focuses on our students. So what do our students go through that we can make it better? And I actually interviewed students, and it was very helpful for me because it gave me feedback that I didn't normally have. So that's it. Thank you.

OK, great. Thank you. So obviously, Carol focused a lot on engaging staff. Thoibi's got lots to say, but a lot of her focus will be on engaging students. So with that, we'll pass the baton over to Thoibi.

Thank you, Jay. And thank you Steve and Carol, all the wonderful things you do. We have learned so much from you over the years as well. And of course, I have to say that anything I do is all based on what our team at Corona-Norco does and led by our director, Jodee Slyter, for so many years.

Like I said in my introduction, what I'm going to focus on is the student-centered approach. Besides all the other things that we do, we realized that one of the most important things is for the students to understand the importance of taking tests. We've also heard from our students: we don't learn what we're doing in the test, so why do we have to take the test?

And that's where we have this approach of a constant communication with our teachers and our students. And in terms of students, our students are adults. We have to remember that they're not kids.

And so if we help them take responsibility for their own learning, they are better engaged. It makes a much bigger difference than us just saying, take the test because it's important. We have to understand that they come from a perspective of, what is in this for me?

And so we share the reports with our students. Right after they've taken a pre-test, we make sure that students get a report of it. And even during COVID, I thought that, well, we might have forgotten to do it. But when I checked with our data lead, she said, well, we provided the test report to every student this year as well, especially ESL.

And so we get the students their test report, but learning to read the report is another very important aspect. So over the years, we have trained our teachers and our students to read the CASAS performance report-- the CASAS report that shows individual student performance-- and the students look at those. And this here, the yellow Student Progress Report, is what we use.

When the students look at their report, we tell them the basics of it. At the beginning ESL levels, they can't read and understand every detail. But at least they know what is a yes and what is a no. So on their report, they look at the yeses, and they look at the nos.

And I would tell them, choose three yeses. And put them in this little worksheet that you have. Choose three nos, the top three nos, and put them in the report, in this worksheet. So here is an example of how a student puts the CASAS test that they took, the three yeses they had, and the three nos that they had.

And then the teachers compile these to see the entire class's top nos. Those are the ones the teachers would use to teach the students before the next test, which is the post-test. So they look at the pre-test report, then the post-test report. And if a student has already made their gains for that year, there's a check mark at the bottom that says Level Completion.

So for that particular student, we don't need to worry so much about getting a post-test, but more about the student's entire learning-- focus more on civics and other learning. So that's the student part: students taking responsibility for their own learning.

The second thing that we also do, like Carol and Steve discussed before, is teacher PLCs-- all teachers have them. And during the PLCs, teachers look at the student performance reports as well. All teachers get the summary report for their classes.

Looking at the summary report-- the second worksheet that I provided, the white one at the back-- the teachers do an analysis of which test most of their students took, and select at least three competencies that more than 50% of the students in their classes are missing. It's a competency report that they use: they write down the description of the competencies, and then each teacher knows what kind of competencies they need to focus on in their class.
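The PLC analysis being described-- from per-student results by competency, pick the competencies that more than half the class missed-- might be sketched like this. The competency names and results are invented for illustration.

```python
# Hypothetical per-student results (True = answered correctly) by competency.
results = {
    "Read a bus schedule":       [True, False, False, False, True],
    "Interpret a pay stub":      [True, True, True, False, True],
    "Follow written directions": [False, False, True, False, False],
}

def competencies_to_focus_on(results, threshold=0.5):
    """Competencies missed by more than `threshold` of the class, worst first."""
    picks = []
    for competency, answers in results.items():
        miss_rate = answers.count(False) / len(answers)
        if miss_rate > threshold:
            picks.append((competency, miss_rate))
    return sorted(picks, key=lambda pair: pair[1], reverse=True)

for competency, miss_rate in competencies_to_focus_on(results):
    print(f"{competency}: {miss_rate:.0%} of the class missed it")
```

Sorting worst-first gives each teacher the short list of competencies to emphasize before the post-test.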

And we have learned over the years-- and most of you, I'm sure, have seen it as well-- that learning to read the directions has been a challenge for most of our students. And of course, that's what the teachers would teach more. Or in the higher-level EL classes, in the 86R and 85R booklets, students are always struggling with the questions about summonses and jury duty, because most of our students have not had that experience yet-- they are new immigrants.

Either they are still not citizens, or they haven't had the chance yet, so those are the areas we focus on. So it's the analysis of the data that the teachers do during the PLCs, and that's how they choose what to teach in their classes. It's data-driven instructional design. Jay, the next slide, please.

OK. And while you're at it, David asks whether your goal review process is the same for ASE as it is for ESL.

Very good question. With ASE and ABE, it's a little different, because high school diploma students especially do not look at all those competencies in as much detail as the ESL students do. However, the teachers still get a report that gives them an idea of which students really need more support so that they can get their basic level gains. Does that answer the question?

I think so, yes. But I will add-- Angela asked, do teachers get paid extra for when they do this process, or is it incorporated into their regular time?

It's incorporated in their regular monthly PLCs. So the PLCs-- our PLCs are two or three hours. And every month, we have one. So part of the PLC-- about 15 minutes would be housekeeping work. But most of the time is focused on teachers in groups, looking at data, looking at learning.

OK. And Karen asked, which report or reports specifically do you share with your students?

It's called the student's performance report, and I can send more details--

I think it's the student performance by competency, just eyeballing. That'll just be like casual observation.

Mm-hmm. That--

All right, thank you.

Thank you. And also, for the post-test, it's a different one. We used the Gain report, where they see yes or nos, whether the students have done pre and post, already made the gain or not. So there are three different types of reports, and the teachers also get two different types of reports.

Moving forward, I'll be happy to share the names of the reports after asking our data technician. So moving on to the next slide-- again, it's mostly a student-centered approach that we take. I grew up in India, and then I did my teacher training in London, where I learned the whole learner-centered approach. Everything I've done in my career has been the learner-centered approach.

And so again, this is what we developed here at Corona-Norco as well. As a new immigrant-- I came here in 2007-- when I started working at Corona-Norco, I learned a lot about the new culture. And our students are mostly in that position as well; up to 60% of our students are ESL students. And that's the approach that we have taken here: learning from the culture and giving back to this community.

So the whole idea of paying it forward really, really is something that we have kind of gently inculcated in our students. And so I have heard many times, why do I need to take the test? And so what we started telling our students-- our teachers and students have now started putting this as part of something that we do here-- is that students who work with us this year will help pay for the students two years in the future.

And so that whole idea of, why do we take a test: the state and the feds have provided this opportunity to learn for free. And if we are learning for free, what is in it for us? As we learn today, we get better jobs and opportunities, and we will be able to pay back to the state in taxes.

And we all grow better with this. This whole idea of civic responsibility is such an important part of what we try to share with our students. Because, like I said before, these are adults. If they see the value, then they are more receptive to working with us on this. So that's another thing that we really try to instill in our school.

And the teachers as well-- they see the performance data on a regular basis. And now all our teachers know that we are paid by performance. So it's important that we perform.

And so even our student handbooks and the digital badges that we initiated here at the school have all been about getting our students' buy-in. As they get their badges, they feel like, oh, OK, this was my goal. I set this goal. Now I've met the goal. I get a badge. The next goal I'm going to set is this.

And how does this all help? This helps the community. What I do this year will help the students two years in the future. And this all works very well with our mission statement at our school. And that is to make each and every student become a productive member of the community, so the whole civic responsibility.

And as we are preparing students for college and career, they're gaining confidence. They're learning life lessons. They are learning to set goals. They are learning to speak better, and that's all part of giving them more confidence in this new world. And I personally feel that a cultural change makes a lot of difference in the way the school-- everybody-- puts in their very best.

As you can see, I'm quite passionate about it. But I'll end there.

OK. Janice asks-- what is it? OK. I thought it was going to-- OK. She's giving you a compliment: what an inspiring way to help students embrace their role. Sorry, I was ready for a question. But it's a compliment, which is still good to bring up.

With that, I pass on the baton to John. Is that what I should do?

Yes, that's correct. So again, Carol is engaging staff, Thoibi engaging students. John is going to talk a little bit about engaging outside stakeholders. And do I need to show that Word document, or are you OK without any show and tell?

I think I'm fine. I think I'm good.


I have the unfortunate honor of the last leg of this race, and I am certainly not a cleanup hitter. My gosh, everyone was brilliant. So I will try to do justice to what's just transpired.

So in looking at the last piece of this panel-- how we aggregate all of the data and share it with stakeholders-- what we do in the Citrus Consortium is create an annual data review. And there are various stages of it that allow each individual institution to use it for whatever purposes they need.

So for example-- and Jay's got some of the highlights of the review up. Usually, it's a good 60 pages. And it has community data. It looks at the cities that we're serving students from. It looks at the NRS and CAEP data. It looks at the CAEP summary in terms of all the different pieces-- whether it's services, or CAEP outcomes, or NRS students.

So it's a really, really comprehensive document, and each individual institution uses it for a variety of things. So for example, one of our institutions was going through FPM. And they took all the demographic data that was in there, and they used it for their demographics.

So not only are we looking at the CASAS Portal. One of the things that Jay had asked me to share with you is other sources that we're looking at. So we're looking at census data. We're looking at all the other reports that are available as a consortium manager. So anybody that's in a project, a program director, or consortium manager position, you should have access to CASAS TOPSPro Enterprise reports for your entire consortium.

And so I pull those reports and create extensive demographics, race, age, gender. And also, look at a number of different things that you can pull up from those reports and also from the census.

So Jay, if you wouldn't mind scrolling down-- those are the cities that we look at. And then keep going below that. One of the things that we do look at is the primary cities that we serve, and their population characteristics in terms of English language acquisition, poverty, and then high school diplomas-- individuals over the age of 25 that have earned a high school diploma or equivalency.

Scroll down. Look at poverty rates. Look at current unemployment rates. Obviously, those unemployment rates are much higher. And so we use that to inform gaps, but I like Steve talking about low-hanging fruit. You can go down a little bit further.

We also look at enrollment by program level across the consortium. You can see that we're at 57% right now in ESL across the consortium, across our six members. We're not a huge consortium. I know probably some of your schools might be bigger than our entire consortium.

But we had about 5,000 in services. And then in CAEP programs, we had about 4,800 students that we served that were over 12 hours of instruction and that we got demographics for. So if you can move down a little bit, Jay?

And then that's last year. We do this for three years running. The report's pretty massive, and so people are able to pick and choose what they want from it. And scroll down, Jay-- this is something that we look at over three years, where you can see what our literacy totals were and what our CAEP totals were, alongside the NRS.

And so I saw a question that was looking at the difference between NRS and CAEP. And what I will tell you in terms of performance outcomes, what we have found is that there's very little difference. But obviously, in terms of enrollment, a lot of students don't necessarily pre and post. There's a whole conversation as to whether or not we should pre and post CTE students, and we've fallen on either side of it.

But when we've pre and posted our CTE students, our performance rates have gone down. But our persistence rates went up. So we've looked at that as to whether or not that's a procedure that we still want to do.

And then, you can scroll down a little bit. One of the things that we do is look at all of these and try to determine gaps. There are major ones, what we call macro gaps: we look at how many students we're serving in the CAEP outcomes-- the program students, not the ones that just come in for services, but the ones that come in, actually stay for 12 hours or longer, and are actually considered a student.

So we look at those students over a three-year period and how many we've served compared to the census results that we've looked at. But a more important one is micro gaps. And if you can scroll down-- and that deals with persistence. Why are people coming in the door and not staying?
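A macro-gap comparison of the sort described-- students served over a three-year period against a census estimate of the eligible population-- might look like the sketch below. All of the numbers are hypothetical, and summing across years may double-count returning students; this is only meant to show the shape of the calculation.

```python
# Hypothetical figures: census estimate of eligible adults in the service
# area, and CAEP program students served in each of three years.
eligible_adults = 42_000
served_by_year = [4_800, 5_100, 4_600]

served_total = sum(served_by_year)          # note: may double-count returners
gap_pct = 100 * (1 - served_total / eligible_adults)
print(f"served {served_total:,} of {eligible_adults:,} eligible adults")
print(f"macro gap: {gap_pct:.1f}% not yet served")
```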

And so in our consortium, we have data coordination meetings with everybody that's the TOPSPro Enterprise coordinator at each site. And we're working together to look at how to improve persistence and how to improve performance. And the things that we look at everybody shares, and then we try to come up with solutions together as a consortium.

So in terms of sharing this out, members use this to write WASC reports. I chair a number of WASC visits; I know a lot of people on here do.

A lot of people use the data to write their own WASC reports or their own FPMs, or to share with board members within the district. Members in the consortium are able to find their own data in here without going through TOPS to do it. Because we've created these tables, they're able to go in, pull their specific data, and create their own reports for their own sites.

And we also share this at our board meetings, so it's on our agendas. It's in our minutes, and it's also on our board website. You can check out and go to the Data page. You can see the entire report there.

So there's a number of different ways that we get this out to the public. Now, one of the things that we talked about when we were preparing for this is, do we use any of this to market? And how can we use this--

Hey, John, before you get too far into that, Dana asked, you reference literacy on a couple of these tables. What do you mean when you reference the word "literacy?"

Yeah, let me get into that. My apologies. So this was set up to talk about the NRS tables and how you compare across the state within NRS. NRS is the National Reporting System. That means it's the strictest definition of a student: a pre- and post-test, demographics, and more than 12 hours of instruction.

So one of the things that we do is look at the CAEP Summary. There are three sections in the CAEP Summary table. The left section is the Literacy, or NRS, section. That is what I mean by the strictest definition of a student.

So a lot of students in the middle section, which is the CAEP outcome-- they don't necessarily pre- and post-test, but they certainly have outcomes for us. They get jobs. They increase wages. They transition to post-secondary institutions.

And so that Literacy section is going to be a smaller enrollment number than the CAEP Outcomes section. And then the farthest section to the right of the CAEP Summary table is the CAEP Services. That's everyone who walks through the door and has received some kind of service, whether it's testing services or counseling services.

In our case, we have a really large number of services because some people come to our Career Center, and they just want to take a typing test. This is non-COVID obviously. So we have 600 people that come through our Career Center. And probably 400 of them don't get 12 hours of instruction, because they're coming in for very specific goals. So that is something that affects our persistence because we capture them as having come through and received services, but they don't have the goal of wanting to stick around for 12 hours.

However, that Literacy section is people who pre- and post-test, have demographics, and have more than 12 hours of instruction. Does that answer that question?

I think so, yes.


Yes. Yeah, it is-- since I do a lot of WASC, I developed this in a way that our members can use it for their WASCs. The other thing about using the CAEP Summary versus the NRS is that some of our members are not WIOA Title II. Some of them were WIOA Title II and weren't renewed. I won't go on about how that makes me feel. I'll leave that just to dangle out there. But even though I thought their application was very compelling-- that's the last I'll say to that, says Forrest Gump.

In any event, some of those institutions still continue pre/post-testing. They're doing everything they did as if they were still WIOA Title II institutions. And so we're still able to capture that data.

But we also use that CAEP Summary data-- the CAEP Outcome, which is the middle section of the CAEP Summary table. That's what we use to determine funding. We looked at enrollment through that. And that's how, as an institution-- if you scroll down, Jay, I think that shows that.

Yeah, so there it is right there. This looks at, over two years, the total of Literacy, which is the NRS students-- that's the strictest definition-- the CAEP Outcome, which is students who did not pre- and post-test but earned over 12 hours of instruction; and then the Services, which is unduplicated-- people who walked through the door and received some kind of service.

And so if you scroll up just a little bit-- we're past it, sorry. I'm sorry, down. Yeah. Down, down, down, down. My bad. Right there. That shows you-- and this is in our ADR. It's on our agendas and minutes. It's explained as to how we do funding.

And what we did is look at the CAEP Outcome enrollment data. We did percentages of the total amount that was available, and that's how we did our CFAD. And our community college partner has suggested that we now look at our performance outcomes and include that in the algorithm. And I think that's something the state has said all along in the local control funding-- that performance should be a part of it. It's just that how that's defined is really left up to each consortium.

The last thing that I'll talk about very briefly, so that we have time for questions, is in that CAEP Outcome data-- well, actually, there are two things. That CAEP Outcome data talks about other things. One of the things that we do is set goals for transitions. Now, last year, we didn't hit our goals. We were struggling with hitting goals because of the pandemic.

But we have a really strong partner in Citrus College and Michael Wangler and his team at Citrus. And they are using strong workforce dollars. They have funded a transition specialist, a navigator, which is the model that many people are using. And our navigator's really, really a strong Citrus counselor and instructor.

And we're starting to see dividends with that. Even in virtual classes, people are starting to take non-credit counseling-- one-on-one counseling that counts as classes-- in order to start to transition and learn the things they need to learn to be successful academically at the next level.

And then the final piece of-- Jay, you can scroll down-- keep going. Let me see-- yeah. This is one of the-- somebody had mentioned, how do you track CTE? For us, we really focus on the AJCCs. I've got two minutes. OK, I'll go one more minute.

Just really quick, this is just aggregated from Monrovia right now. I'm in the process of getting the '19-'20 data for all of us because we're still getting employment outcomes. But we look at funding sources.

We at Monrovia have gone from $20,000 a year in '16-'17 from AJCCs to about $400,000 in four years. And that expertise is being shared across the consortium. So we're making a goal to get around $600,000 in AJCC funds for everybody in the consortium.

And so one of the things we do is look at: how many people were enrolled? How many people completed? How many people got employment? How many people got 100% financial aid? And how many got some level of financial aid? And then we share that as a consortium and set goals for what we want to serve in the CTE world.

So there are a lot of different ways that we use these other sources to create goals, and we use them in our logic models. Our logic models-- and I'll end with this-- our logic models were very simple: increase persistence and performance, and increase CTE funds leveraged from AJCCs. Those were two of them in our three-year plan. And the data that we gather allows us to do that.

OK. Real quick, Pam is just asking if you can send a copy of your consortium data review.


And then liaise or whatever--

And I'll put in-- I'll type into the Chat-- right now, it is a draft. But if you go to ccadulted-- I can never remember if it's dotcom--, and it's on the--

Cindy found it and posted it in the Chat.

OK. Thanks, Cindy. Yeah. So I will tell you that there's still a little updating that needs to be done. We have a board meeting this Thursday, so then a new draft will be up. Towards the end, it gets a little off-kilter. But it'll be all sorted out. If you go by the end of this week, the newest version-- volume 2-- will be out. It'll be the latest and greatest.

OK. Well, thank you. I guess I'll wrap it up here. But before I do, I guess I'll just throw out to all four panelists if anybody has any parting words, any words of wisdom, anything that you wanted to get a chance to get out. Maybe I hurried you too much, and you had 30 or 60 seconds of stuff that you wanted to say that you didn't get a chance to say.

Yeah, I'll take a stab. I think one of the things I would point out, to everyone who's shy about data, is that the data we collect now really defines who we are. That's what we use it for at our school. And then for advocacy purposes, working with CCAE and CAEAA, that data is critical for the conversations that we have with legislators.

And so really, it validates us as a program, adult ed. I would say our reputation has increased many fold in the last few years just because we've had the data to show. We get the outcomes. And we always hit on the fact that we get very low funding compared to other programs to do that.

So make sure you use that data for your conversations with your administrators, your district-level administrators, and board members. Data talks, and it tells a story. So don't be afraid to go there. And definitely, you get to pick what data you share. So look for your highlights, where you do well, and make sure you share that with as many people as you can, including your local legislators.

OK, thank you. One quick thing, Carol-- Linda is just asking if she can observe one of your team meetings. And John, I cut you off when you were getting into marketing, and a couple of people have brought up that they wanted to hear about that. So sorry I cut you off.

No worries. Love to do that. So in terms of our marketing, where this data is used is primarily in information sessions. So whenever we have an information session, I always say, hey, here's our data from 2019. These are the people that took our pre-certification CNA classes or pharmacy tech. These are how many people got jobs, and this is how many people were funded.

Now, we don't necessarily put it out there. But the reason we don't is because we do really targeted marketing. The firm that I have doing marketing is on all sorts of social media platforms-- YouTube, Instagram, Twitter-- I don't even know, and I don't know what Russian servers he uses.

But all I know is-- and if you haven't seen The Social Dilemma, my gosh, everybody should, because it is pretty powerful. Although, I will say I am now benefiting from it, because he does all sorts of crazy, manipulative things on Facebook. And then 75 potential security students show up at our site.

So I am the problem-- part of the problem, a very small part of the problem. But we have a magnificent marketer, who is online doing all sorts of ads that then drive traffic to our websites. You can go to our website, which is-- two Ts-- N-O-C-O-S-T-T-R-A-I-N-I-N-G.ORG.

And you can't use the word "free." That's illegal. We say "no cost." He is on every single platform that then drives them to that, and then that is shared across the consortium as leads for CTE, as well as for our other programs.

But where this data is most significantly used is in information sessions, at least the CTE information that we have. Other information is put on flyers. And a lot of the aggregated information-- there's so much of it-- primarily comes through agendas, minutes, board reports, and meetings with stakeholders across the region. I hope that answers it.

Jay, I just wanted to make a comment that one of the things I think is really important is to have a vision with your team, your school team. And definitely involve your district team, like Steve said, because they are your best advocates. It's about creating your programs and your systems internally to be very strong.

And to do that, use your professional learning opportunities-- definitely CALPRO, CASAS, and OTAN. And we take advantage of district professional development. We think that is just critical. But really, looking at your system and creating a collaborative culture on your campus that focuses on student learning will really help improve your student data.

I can [inaudible]. And I'd like to add also that we are all creatures of habit. And so creating some kind of routine and habits really helps. And I know I haven't done a great job-- we haven't done a great job this year, because of COVID and everything.

But just today, I realized we've been doing all this. Why do we have to stop during COVID? The numbers may be a little lower. But still, if we continue to practice what we have always been doing on a routine basis, it helps everybody to be cohesive.

OK, great. Thank you so much. So the last slide-- and a couple of people asked about this in Q&A-- is, if you're interested in this topic, I glossed over a lot of the technical details at the beginning. Like I said, I did six or eight hours of training in five minutes. So I don't know if it'll be that many hours.

But we've got some of these workshops on November 16, 17, and 18 in the PM. I think a couple at 1:00 PM, a couple at 2:00 PM. But we'll have three days. The first day, we'll be just using TE reports and the CASAS Data Portal.

The second day, we'll look at some of the things our four panelists have talked about, converting these data points into other areas. And then there'll be a qualitative/quantitative one that we'll do on the 18th. So if you're interested in any of this, we've got some workshops on those three successive afternoons, November 16, 17, 18.

And then the last thing is, here is my email on the last slide. A couple of you asked about things from the four panelists. So if you're not sure how to get in touch with a panelist, I'm happy to liaise. Just send an email to me, and I'll be sure to get some communication going with any of the four panelists if you would like. Feel free to check in with me if that's easier.

And I'll just give a Laurel and Hardy handshake to all four panelists. I knew you'd be superstars. You exceeded expectations. Everybody's lavishing you with praise, like I knew they would.

You're the top of the line in these areas. I wouldn't have picked you otherwise. So thank you very much for being such a great all-star panel. And thank you, Melinda. I'll turn it back over to you. Thank you.

Alrighty, folks. It's lunchtime. So if you lost a minute of lunch, it's not my fault. [laughs] It's OK, not to worry. I'm going to hit the End button. The evaluation won't open up. You have to hit the Continue button first.

So I'm going to hit a button, then you're going to hit a button. Then you'll get the evaluation. It should come up for you so you can fill it out. Please, please, please fill out evaluations because we give those to the presenters eventually after we look at them. All right. Everyone have a great lunch. Sessions start back up at 1 o'clock. See you then.

Thank you very much.


Thanks, everybody.