JAY WRIGHT: Hopefully everybody can hear me OK. I put it up here on the cover slide, so I'll just go straight to the agenda. As I believe Veronica noted, this is part 3. The three sessions go together somewhat, but they are three separate sessions, so if you didn't do parts 1 and 2, it's probably not the end of the world. It's not mandatory or anything like that. All three in some way or another relate to the CAEP reports in TE.

We started on April 28, where we were less about TE and more just talking about outcomes. Most of that discussion was defining all the different CAEP outcomes, and looking less at how you would mark them and more at what sort of student scenarios might transpire that would result in you marking specific outcomes: clarifying scenario X versus scenario Y, and what each outcome, quote unquote, "looks like." Then we did one just a week ago where we started getting more into TE reports. That one looked at targeting special populations. We focused on that a lot because of the goal setting activities everybody is going to have to do in NOVA starting this summer, with lots of focus on barriers. There's nothing mandatory on immigrant integration, but it is one of the options. So we started looking at some different ways of looking at, quote unquote, "special populations," looking at ways in TE you can isolate special populations, and identified some reports you can use that will give you that sort of information.

Now, fast forward to May 28, part 3 of 3, and we're going to talk about goal setting using TE. We'll start with the concept of NRS performance goals. My guess is that lots of you have probably attended those sessions. Got to say, we've done tons of sessions over the last year on NRS performance goals, more on the WIOA federal reporting side than the CAEP side. Quite frankly, we did a lot of them a year ago because we were webinar crazy for a couple of months when we were first into COVID. Then we did a bunch more early this year as we were ramping up to that set that a lot of you completed a few weeks ago for CDE. CAEP, you might say, is on the exact same bandwagon with moving more toward performance and goal setting. So we'll be looking at a lot of those same performance goal concepts, only instead of the focus being on NRS federal reports, we'll look at the same concepts through the lens of CAEP state level reports. So we'll review NRS performance goals and then start looking at how you might adapt those concepts to CAEP reports and CAEP reporting metrics.

Then we'll get more specifically into different ways to measure CAEP performance and persistence. That turns back the clock a little bit to the last in-person data dives we had in spring 2019. I know a lot of you remember those. We did a bunch of trainings in spring of 2019 where we started looking at those alternative metrics: looking at the different sections of the CAEP summary, looking at enrollment to see who is an enrollee and who's not, looking at attendance to see who has 12 hours and who doesn't. That is, a variety of ways of looking at performance and persistence from a CAEP point of view, with students that may or may not have all the same data metrics as NRS students would. And then most of the time will be on that second to last bullet, that is, we'll look at CAEP performance goal examples.
My feeling is that that's by far the bullet everybody will have the most interest in, or at least that's my prognosis. So hopefully that's where we'll be spending the bulk of the time. And then for reference, we'll look at the 19-20 Statewide CAEP summary. It's not going to be a real data dive, but it is there and has been shared, so I think it's OK to use it as a kind of reference. We'll point out how some of these metrics look at the state level. There's all kinds of comparisons you could do, but we'll at least give you a couple for starters by looking at that summary.

And so I'll stop talking. I'll look for a thumbs up or thumbs down again. Do you hear me? Do you hear me clearly? Does this sound more or less like what you were expecting to get yourselves into this afternoon? Et cetera, et cetera, et cetera. OK, there's a few thumbs up. That's good enough, so we'll start.

Predictably enough, we're going to start this presentation by talking about the CAEP summary, fresh with a brand spanking new screenshot, not unveiled at other presentations, I don't think. Anyway, we're dividing it into the three sections. We've talked about those three sections for four years now, so we're skipping the overview and digging right in, you might say.

We're starting with that left hand literacy gains section. That's where we look at our pre/post outcomes. The pre/post outcomes section for the most part comes straight from NRS table 4, or most of it does, with the exception that it allows non-NRS programs. Column B shows enrollees from the NRS federal tables, column C shows you everybody that completed a pre/post pair, and then column D shows those who achieved a pre/post level gain. So the left hand section, just giving you a review here, is all about what's happening in terms of pre and post testing.

Then the big meaty middle section relates to CAEP outcomes. You can see the number of enrollees in column E. I'll stop here; this should be review for a good 80% or 90% of you, but let's just get everybody set, let's warm everybody up, insert your favorite cliche here. So again, middle section, CAEP outcomes: you can see enrollees in column E, the enrollees in the CAEP outcomes section. Very, very similar to, but not exactly the same as, that left hand pre/post test section. Both the left hand and middle sections require the demographics. Both sections require the 12 or more hours of instruction. But the difference is the left hand section relates to testing, so it requires testing; the middle section doesn't, so it doesn't. That's the big difference. I'll also point out this year we added that new column F for passed I-3. We talked a good bit about that in the presentation last week. That's the new column. You can see all the other columns in this section slid backwards by one letter, but it's in the same order and basically serves the same purpose: reporting all the outcomes and organizing them according to those areas of AB 104.

OK, so moving right along to the right hand section, that's your services only section. That shows the folks who only received services, or those who might have done other things in the CAEP programs but have less than 12 hours of instruction. So everything falls here if it doesn't have the demographics or hours.
We have the total number of enrollees in this section in column M. To clarify who really received services versus who is just parked here because they didn't meet some of the other requirements, column N stipulates those in column M that actually received services. Then Oscar, Papa, Quebec, and Romeo, columns O through R, all basically look at the different types of short term services. OK, and I think that was enough review for just the summary. Nobody will be surprised that we'll be talking a lot about it over the next hour and change, but that's just the review we've done many times.

Next, we've got the CAEP Data Integrity report. I spared you the slide with the 20 text boxes talking through all of that. But just to summarize, I will say, like I always say, that top summary information is very critical to understanding the CAEP DIR, and really all the DIRs work that way, but this CAEP version especially. It casts that big wide net, looks at everybody that has any conceivable association whatsoever with CAEP reporting, and then goes backwards and weeds out those that are not enrolled in one of the seven CAEP programs. As it weeds some of the students out, it identifies some of those it's weeding out that might still have some valuable fruit inside. Hey, we need to toss it out, it doesn't meet specs, but hey, it sure is a nice looking fish, maybe we need to figure out a way to put it on our menu, so to speak. In other words, those that might have outcomes that we nevertheless cannot retain on the report. So it discards those and comes up with that number in bold at the bottom that serves as the denominator for all 27 items. And then the CAEP DIR digs into 27 items that are important for state and federal reporting. The top 10 are the showstoppers. There's a bunch, roughly items 10 through 22, that are not really showstoppers for CAEP but get into things like periods of participation and integrated education and training, which are important topics, but topics that quite frankly relate more to federal WIOA reporting than state level CAEP reporting. And then at the end we have items 23 to 27 that get into CAEP specific outcomes.

OK, and then here is a sneak preview. This is coming, I believe, Friday; if it's not Friday, it'll be sometime next week after the holiday weekend. I'm not sure of the exact release date, it's not there yet, but coming soon: a new CAEP hours report in TE. So this is what it looks like. I was going to use this for some of the calculations, but there are still some issues being worked out with it, so I won't. Making a long story short, I am doing some of those calculations some of you remember from a couple years ago, and I'm going to continue to do them the old fashioned way using the CAEP summary. Some of you remember, you can do all the calculations, but there are a couple that are a little tricky. Once we have this CAEP hours report, my feeling is we won't have anything all that tricky anymore. But again, it's not quite ready for this presentation. So I'm going to show you the CAEP summary ways of doing it, because I've got to admit, I'm not sure exactly how this new report is going to look in terms of numbers. If I'm just showing you what it looks like, that's fine, you can see it. But I have to admit I get a little less confident if I dig in to show you exactly how you're going to calculate the numbers to get inside baseball type figures like I'm going to go into with this example.
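Before moving on, to recap the three-section logic from the summary review in one place, here is a minimal sketch in Python form. This is pseudo-logic, not actual TE code, and the field names are illustrative:

    # Minimal sketch of the three-section logic described above.
    # Field names are illustrative; this is not actual TE logic.
    def caep_summary_section(has_demographics: bool, hours: float,
                             has_prepost_pair: bool) -> str:
        if has_demographics and hours >= 12:
            # Middle section (CAEP outcomes, columns E-L); students who
            # also pre/post tested show up in the left-hand literacy
            # gains section (columns B-D) as well.
            return "left + middle" if has_prepost_pair else "middle"
        # Missing demographics or under 12 hours: right-hand services
        # only section (columns M-R).
        return "services only"

Again, that's the same one-sentence rule from above: testing drives the left section, demographics plus 12 hours drive the middle, and everything else falls to the right.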
For now, no, we just have the four consortium level reports. If you want to email me, I can certainly send this up as a request, though; it is something that could maybe be added to that list. For now, though, it will just be an agency level report, but it's a good suggestion.

OK, sorry, I spent way too much time making excuses for why I'm not using it right now. But just so you know, it's going to be, hopefully, a simpler report that itemizes out what have been called the participants versus the adults served. A lot of people use those terms, but I got to say I've heard them from Neal more than I've heard them from anybody else, so at least for now we're going to attribute them to Neal. Anyway, it will give you those adults served versus participants, looking at those same three sections of the CAEP summary that we've just reviewed. That's right, somebody's ears are burning red after that. Anyway, that report will look at it in terms of: who has 12 or more hours? Who has 1 to 11 hours versus 0 hours? Obviously, that's really, really important. It can serve as the section to help you figure out who's there just because they received a short term service and really didn't have any CAEP enrollment, versus those that did have CAEP enrollment but just had something, whether it was hours or missing demographics or whatever, that prohibited the person from getting into one of those other sections. So for troubleshooting, and you might say categorizing students in different buckets, this report will hopefully be really, really helpful.

OK, and then here's a view of the consortium manager reports. Again, we have four now: the three we've had forever, demographics, CAEP summary, and barriers to employment. Relatively recently, I forget exactly when, but in the last six months anyway, we've added the CAEP DIR to the consortium level reports. And then here is what I just said, proof in the pudding. It doesn't aggregate everything, because that whole common denominator issue really is a nightmare at this level. But it will allow you to compare and contrast all the different agencies in your consortium, putting it on spreadsheets, whatever makes it easier for you to look at and review everybody together.

OK, so I think that was the table setting section of this presentation. I'll just stop. Any questions? Sorry, I'll sanity check everybody again because I'm coming up for air here for a few seconds. Anybody want to indulge me? You still got it. All right, we all got it, we all got it. If you didn't have it after 20 minutes, we'd be in trouble sitting here, wouldn't we? So all good, we're all good.

Anyway, let's start here with NRS performance goals. Sorry, another shameless, annoying, self-promotional comment: how many of you have been to those sessions? Maybe it was a year ago, maybe it was more recent than that, maybe it was way back when we were doing them in person. But how many of you have been to one of those NRS local performance goals presentations? Same type of presentation, but it focuses on the same concepts from an NRS federal reporting lens rather than a CAEP state level reporting lens. If you have, you've seen these nifty screenshots, that is, we tell you to look at the NRS tables. It is a crazy series, that's right, it's up and down, it always goes seven games, doesn't it? They always go to triple overtime, don't they? You never think those games are going to end.
Anyway, we have the screenshot, we tell you to look at the NRS tables, in particular NRS table 4. And we also refer you to the CASAS Data Portal, which for NRS summarizes your agency's table 4 performance, your 4B performance, your NRS persister rate. And now, recently, it also includes information about your students' responses to the employment and earnings survey. So lots of stuff there. The big caveat, of course, is that the Data Portal is there just for WIOA II NRS reporting agencies. Not necessarily everybody here is one, but most of you are WIOA II. So for most people on this line, again, this is at least somewhat helpful. So anyway, from an NRS lens: look at the Data Portal and/or look at the NRS tables to identify agency strengths and needs.

We're not going to spend a lot of time here, but table 4 is the NRS reporting table that's used to report on measurable skill gains. It reports using the 12 federal educational functioning levels, 6 for ABE, 6 for ESL. The student places into a level with a pre-test, and the gain is based on their pre/post test progress. We've aligned our CASAS test levels to those 12 levels. So based on how your students score on their pre and post tests, they'll be placed into the federal levels, and hopefully a lot of them will make enough progress on their pre and post tests to achieve a measurable skill gain. And then the report gives you percentages for each of those 12 levels. That is, at the end of the day, at each of those 12 federal reporting levels, what percentage of students at each level made enough gain between pre-test and post-test to move to the next federal level? There's a lot of blanks I didn't fill in; this is a whole hour presentation unto itself. But hopefully everybody gets the basic gist of how table 4 works.

This is not how we do everything CAEP wise. But if you look at that left hand part for pre and post testing, you might say this represents what's, quote unquote, "under the hood" in that pre/post section of the CAEP summary. It basically is using the results from the NRS tables, more specifically results from NRS table 4, to report out on those columns B, C, and D that you all know and love on the CAEP summary. On the federal reporting side, there's also table 4B. It basically is set up a lot like table 4, but it only includes those students that completed a pre-test and completed a post-test. For federal reporting, you always use table 4, not 4B. But there are some that might cite, quote unquote, "fairness" issues from a classroom point of view, saying if you're measuring an agency or a state, maybe you use table 4, but if you're measuring student performance or classroom performance, you really ought to use table 4B. That is, you really ought to only evaluate those students that completed a pre/post pair when looking at whether the agency or the class or the student is performing well or not so much.

And then here is a screenshot of the CASAS Data Portal. I split it up, maybe that was the wrong choice, but I made the choice already: I moved the persister rate back a slide or two so I could show you the CASAS Data Portal screenshot for table 4 first. This is just an example. Again, another 45 minutes I'm consolidating into two or three minutes: this is a screenshot from the CASAS Data Portal focusing on NRS table 4 results. It's just part of one agency's performance, where we're looking at a few levels and just showing you how the concept works in general. When you generate this on the Data Portal, it'll show you Statewide goals and Statewide averages.
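Before going further with the Data Portal, here is the table 4 arithmetic just described, as a minimal sketch. The counts are invented for illustration; only the per-level division itself is the real method:

    # Table 4 concept: at each federal level, what percentage of students
    # gained enough between pre-test and post-test to advance a level?
    # Counts below are made up for illustration.
    placed = {"ASE Low": 120, "ESL Beginning Literacy": 45}    # placed by pre-test
    advanced = {"ASE Low": 27, "ESL Beginning Literacy": 21}   # achieved an EFL gain
    for level, count in placed.items():
        print(f"{level}: {advanced[level] / count:.1%} made a level gain")

Table 4B is the same arithmetic with a smaller pool: the denominator only counts students who completed both a pre-test and a post-test.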
The Data Portal has some 15 years' worth of California performance data from which to compare. So you can run your own agency, compare to the state, compare to the neighbors, et cetera. In this example, the agency is comparing 2018-19 local performance to Statewide goals and averages. At the end of the day, we found one area, ASE low, represented by the red arrow, where we really need to improve, and another area, ESL beginning literacy, represented by the green arrow, where we're doing pretty well. We're keeping it super simple for now and just saying we know ASE low is an area of weakness because our percentage is way below the state goal and way below the state average. We know ESL beginning literacy is looking good because we're above the state goal and above the state average. There's 20 more minutes of this we could get into if we wanted, but for now we're just keeping it simple, looking at strengths and needs: how can we use this to reach some straightforward conclusions and start looking at where we need to improve versus where we might be able to pat ourselves on the back?

And here is the NRS persister rate. This is the one I almost automatically got into earlier; I said I might rearrange this just because I can't really seem to get out of my own way. But the persister rate basically compares NRS table 4 with table 4B. It looks at it from the lens of the same 12 federal levels, 6 for ABE, 6 for ESL. Column B shows how many at each level placed into table 4; column C shows, of those, how many also placed into table 4B. And then that persister percentage there in column D is simply column C divided by column B. That is, of everyone at each level that qualified for NRS federal reporting, what percentage of those students stuck around, or quote unquote "persisted," long enough to at least complete a pre-test and a post-test?

Here is another section of the CASAS Data Portal. Again, this is a CAEP training, so I'm showing you this but not spending a lot of time on it the way we would in WIOA II trainings. But in any case, here is the persister section, where we compare our persistence, rather than our performance, to the Statewide average. You can see we're looking at two areas that we already identified as areas of weakness for this agency. One is ABE Intermediate High, the other is ASE Low. So in this example, we're assuming we already did the homework. We already know, 125%, these are areas that we need to improve. So one thing we always talk about is that a good first step is to look at persistence. If you know an area is an area of weakness, the one thing you want to do right off the bat is look at persistence, that is, have we completed pre/post pairs for the bulk of these students, yes or no? If so, there's one series of steps we might want to follow. If not, there's another series of steps we might want to follow.

Here in this example are two areas of weakness. The one to the left, ABE Intermediate High, you might say confirms our suspicion: our persistence in that weak area is very low. So what that means is, knowing that our persistence is exceedingly low, our first step is going to be to get more pre/post pairs and/or engage in basic data clean up. We need to get our demographics, hours of instruction, and pre/post pairs in place. We need to worry about all those things first, and then once we get all those ducks in a row, we can move forward and start looking at things like classroom performance.
The green arrow to the right shows our persistence is, surprisingly enough, already pretty gosh darn good. So yes, it's an area of weakness, but not necessarily an area of weakness explained away by a lack of pre/post pairs, because as you can see, persistence for ASE Low in this example is actually well above the state average. So in this example, we might want to look at things like classroom instruction and look at ways to directly improve our students' scores on the actual pre and post tests.

OK, so there is about 120 minutes' worth of training frantically wrapped up into a neat little bow in five minutes. So, that obnoxiousness notwithstanding, does everybody more or less follow me on these concepts related to NRS performance goals? I know a lot of you have been in these sessions before, so I do think most of you understand this from previous sessions. But I'll just say, hey, jump in if you don't understand this in basic terms, or if you're wondering how retrofitting it for CAEP is going to work. OK, thank you for the positive remarks.

So this is that goofy graphic. I know a lot of you have seen this; we've been using it for a while now. At the end of the day, you look at those levels and you identify areas of strength and need. At every level, performance is going to be good or bad, and at every level, persistence is going to be good or bad. That allows us this nice little oversimplified grid, that is, everything falls into one of four quadrants. And we can look at each quadrant and basically show: where are our priorities, and what do we need to do? The short answer is, if we're low on both, we focus on data and testing first, and then we move on to instruction after we fix the data issues and the testing issues. If we are low performance but high persistence, that suggests our data and testing are already in good order, so we don't pass go, we don't collect $200, we just go straight to the classroom and start working on fixing instructional performance. OK, hopefully that makes sense.

Yes, that's true, this graphic applies to CAEP just like it does to NRS. As long as you're willing to look at it from 40,000 feet, then yes, the way this graphic applies to CAEP is no different at all from how it applies to NRS. Maybe if you dig into the weeds it's not so much the same, but certainly at 40,000 feet this graphic is the same for CAEP as it is for federal reporting.

OK, so I already asked whether everybody is doing all right, so I won't burden you with that question again. So here we're going to get into CAEP. Here is a little bit more of what everybody probably signed up for. What we just spent the last 5 or 10 minutes doing is table setting with that NRS approach to setting goals. We've talked many times about how you can use the same approach for CAEP. So this is a lot like those data dives from a couple of years ago, but, I got to say, with a more exhaustive list, and it will focus in on other types of details, you might say, probably a little more goal setting oriented than we did in the past as well.

Before I dig in, I'll also reference that supplemental handout. I believe Veronica mentioned two handouts at the beginning of the session, not just one. One is the presentation, of course, that we're reviewing right now. That's the one that we're going to be spending 99.9% of our time covering here.
The second one, which we're really going to be covering like not at all today, is called the supplemental handout. The screenshots are all old; everything about it is old, other than it was cleaned up and there were some garbage slides that we eliminated just so it's cohesive and at least makes sense. So, old screenshots, and if that bugs you, delete the handout. The supplemental handout is 0% mandatory and 125% gravy. It gets into a lot of stuff we got into a couple of years ago, like Excel spreadsheets and really detailed calculations, slides that do nothing but walk through calculations, and those calculations are still the same. A lot of that stuff is the same, so it's just given to you as-is from a couple years ago. Everything about it is factually accurate, other than a lot of screenshots being old. So again, if looking at old screenshots bugs you to high heaven, my advice is just delete it and don't worry about it, because it's not mandatory. It was purposely passed along in the something-is-better-than-nothing spirit, to give you something supplemental. If that bothers you, delete it and just focus on this handout that has all updated stuff rather than stuff that's two years old. But there's a lot of material in there, like the Excel material, that I really wasn't interested in updating but is still good information, so I passed it along as-is for everybody's consumption, or not.

So, moving back to the feature presentation. Here are a bunch of different things we're going to cover, looking at goal setting from a CAEP lens. The top three are the ones we covered two years ago in that in-person data dive whistle stop tour we did in 2019.

We'll be looking at enrollment criteria; in the 2019 series, that's what we called the kick-the-tires metric. That is, looking at all the students that have any connection whatsoever to CAEP reporting: what percentage of those students enrolled in a CAEP program? That is, they showed up for orientation or to receive a short term service, they kicked the proverbial adult ed tires, and ideally they hung in there, went into a class, and actually accrued an hour or more. If so, they meet the enrollment criteria. That's one way of looking at CAEP persistence.

The second way is looking at it from a little longer game perspective, that is, looking at attendance related criteria. That's the participants versus adults served. That is, of those that accrued at least an hour of instruction, what percentage hung around, or persisted, long enough to accrue 12 or more hours of instruction and thereby qualify for CAEP outcomes and qualify for NRS pre and post testing? So in any case, that's the attendance one.

And then the third one that we covered a lot a couple of years ago was looking at pre and post testing: basically using the exact same concepts of persistence that we just covered through the NRS lens over the last 5 or 10 minutes, but getting your calculations from the CAEP summary rather than the NRS persister rate.

And then we'll get into a few others. Targeting CAEP outcomes is another one we've looked at in previous years, where you just summarize the total number of outcomes at your agency or your consortium and divide it by the number of enrollees to get an outcomes percentage. We'll review that one and different ways of applying it for goal setting. We'll look at the CAEP DIR, and again, that will be from a goal setting point of view.
More so than drilling down, we'll look at some specific goal setting examples using the NRS model. We'll double back to some goals for special populations, like we discussed one week ago. And then we'll briefly cover consortium level reporting and how to set this up with external stakeholders. I'll just say these bottom four bullets, you might say, are all presentations unto themselves in a lot of ways. NRS performance goals, of course, has been done many times. Defining goals for special populations was done at least once, because we just did it last week. The other two we haven't really fleshed out into sessions that focus just on them. But I will tell you right now, those bottom two bullets, like the two above them, could easily fill an hour, hour and a half presentation by themselves. We might really want to look at doing exactly that here in the summer or early fall. So a lot of this discussion is a big bad excuse to say we're looking at these, but we're not going to go too far, because it's 30 or 45 minutes each that we really don't have now. But we will start looking at how you can set goals at the consortium level, as well as ways to set goals with external stakeholders.

One more bird walk that's not on the slide: we have had some panel presentations with some folks in the field that have done an exceptionally good job with goal setting. Some panelists like to talk about this from a WIOA II federal reporting point of view and use things like table 4. Others are CAEP directors that really use CAEP reports to talk about goal setting. We've done that a lot here this spring. There will be another session on June 23 at the CASAS Summer Institute, and we'll probably be doing this again here in late summer or early fall. All kinds of stuff we'll be doing later, so stay tuned for more information.

OK, so starting with CAEP goal setting. And again, we'll start with enrollment criteria. For starters, you can use the CAEP summary or that new TE CAEP hours report. I mentioned I was really going to dig into that one here, but it's not quite ready yet, so we'll have to do that one at the next CAEP data dive. I am showing you what this new report looks like here on the screen, and I will use it for table setting. But from this point forward, when I actually show how you might calculate this kind of stuff, I'm going to stay conservative and stick with the CAEP summary, so as to ensure the way I show you is something that's accurate, rather than something that's inaccurate and just sets you on nothing other than a wild goose chase.

So here is the look and feel of that new report. And the question for the new report is the exact same question as the one for the old report, that is, the kick-the-tires question: when we look at the overall number of students available for CAEP reporting, how many actually make it into a CAEP instructional program? That is, how many students accrue at least one hour of instruction? And now, here is the CAEP summary, just to go back and look at it that way. Kind of a translation: same question, different graphic, using the CAEP summary to measure enrollment.

And now, here is a computation. When you're using the CAEP summary, I'll admit it's a little bit complicated to do this calculation. With the new report, I think it will be straight reading of columns rather than having to dig out particular cells. But what we're doing here is starting with that N/A row.
So, does everybody see, and we'll go through this somewhat painstakingly just to make sure everybody gets it: first off, does everybody see that number 79 listed under column M, in that row labeled N/A, in the services section? Thank you, Chris, you jumped in. OK, people are jumping in, yes, thank you, everybody. All right, so everybody sees that. So does anybody know what that 79 represents? What is it about those 79 students in that cell? Anybody have a quick answer for me? Anybody know?

Outside CAEP, we'll say, yeah, no program area. Good. Right, they are not part of any program. Right, right, right, good. OK, different ways of explaining it, but right: they make the report, but they do not have any program. And generally it's going to be a student that received a short term service but has no enrollment, not even the evidence of anything that relates to CAEP enrollment. So what we're doing is taking the total unduplicated students in this section and basically subtracting those 79. So those in this section that are enrolled and have a connection to a CAEP program, divided by everybody in that section, gives us our program enrollment rate.

My apologies, that's a little bit of irrational exuberance: this agency is doing really, really well, 98% of their learners are enrolled in a CAEP program. I will say in '19/'20 these percentages were higher. Hard to say whether it's for positive, getting-more-students-in-class sort of reasons, or it might have been kind of an inside-out negative, where it just means you have a lot fewer stray students because of COVID. You say tomato, I say tomato, glass half full, glass half empty, how many cliches can Jay fit in a bottle here? I'll stop, but in any case, there were a lot of agencies with high percentages in this area in '19/'20. I'll just say, if you want a more realistic number, you might look at this for '18/'19. Hard to say, but that is the program enrollment rate. Here's the math on the slide right here. If you want to see, at the end of the day, of everybody who makes the report, what percentage have some enrollment connection to one of the seven CAEP programs, here's where you can figure out that information using the CAEP summary. And I'll go out of my way again to say this is the same metric that we reviewed in 2019. This is totally Pete and Repeat, admittedly with new screenshots and slightly different terms. I think we've gotten more consistent with our terms, so I do feel a lot better about the terms now than I did before. Yes, unmute, ask away.

PEGGY: Hi, Jay, it's Peggy.

JAY WRIGHT: Yes.

PEGGY: So I'm struggling a little bit about 98.8% being a good number. And the reason why I'm struggling is I'm thinking about the fact that we are trying to pull in as many students as we can into the adult schools and then refer them to our colleges if they don't fit the adult school. And we're coming up with a whole way to transition them, and we have outreach and all of that. And so I could see in our consortium that that number may be down to 96 or 95, but that's because we'd be serving community members by helping them set goals and go to another agency that is not an adult school. So my question is, am I looking at this wrong here? Is this going to be kind of a new measure the state--

JAY WRIGHT: No, I think you're looking at it correctly.
Sorry, so I'm going to give three goofy answers. The first one is that at the end of the presentation we have some Statewide slides; I think the Statewide percentage was 96. So we were a lot higher than we used to be. We don't have any set levels, Neal clarified that. So what we're showing is not what you're officially responsible for, it's just kind of a rule of thumb, for reference only, just to see how you compare with the overall, quote unquote, "average." I'll go back to the how-many-cliches-can-Jay-jam-into-a-bottle-today thing: the short answer is, in '19/'20 this metric was a lot higher than it was in '18/'19 or '17/'18. I do think that COVID had a little bit to do with it. It is a glass half full, glass half empty thing. Maybe it's just because we're a lot cleaner, we have a lot fewer stray students, we're getting everybody in class, we're doing a better job of managing our data. Maybe it's looking at it more half empty, where we just don't have as many students coming in to kick the tires. One more cliche for this explanation: we battened down the hatches. We're only serving those that are all the way in already, so we don't have those coming in off the street kicking the tires much in the last year or so.

So I'll just say, Statewide, at a lot of agencies, the last year or two these percentages are a lot higher than when we did this presentation two years ago in spring of 2019. I forget what the percentage was that we presented a couple of years ago, but I remember it being more in the low 80s, I think 82% or something like that. I don't know if that's exactly right, but it was a much lower percentage a couple of years ago than it is now. We increased this metric by quite a bit in '19/'20. So there are a lot of different answers, none of which is the silver bullet you're looking for, but all of which are possible.

I guess the other thing I'll just say, related to Neal's response, is if you know your agency or your consortium has something specific that might make numbers appear a little higher or a little lower, that's what I've always called a do-your-homework issue. If a number looks weird, but you've done your homework and you know the number looks weird because that's just how you roll at your agency and/or your consortium, then fine. You've done your homework, you've taken the steps you need to explain that number. Maybe you need to take steps to change it, maybe you're 100% happy with the way you're doing things, and you may just need to know that, yeah, that number is always going to look a little bit different because you've decided to say tomato instead of tomato, so to speak. Does that at least make sense, Peg? Yeah. All right, thank you.

All right, so I'll move on here. Good question. Thank you, Peggy, that was great. That really helped to break things up a little bit. I think everybody should be thanking you, because it does feel a lot better having broken up the dialogue a bit, right?

Anyway, the next one we're calling CAEP participants. This is another one that we talked about extensively at the data dive two years ago. So again, there's that nifty screenshot of the new report, and here's the question we asked two years ago, every bit as relevant now: of those that enrolled in a CAEP program, that is, those that have at least an hour and have official program enrollment, how many stick around, or quote unquote "persist," long enough to accrue at least 12 hours of instruction?
And, you might add in parentheses, also qualify for CAEP outcomes and NRS reporting. So here we're going to go back to the CAEP summary to make sure our calculations are correct. Here's that outcomes section. Again, we're going to look at those unduplicated enrollees in column E, that is, those that have the demographics and 12 hours. You can see how many show up in column E. And then we're going to divide it, and it's a little tricky here. We're not dividing it by the unduplicated total in the right hand section represented by column M; we're putting in a little bit of a twist and making it a little trickier than anybody wants. What we divide it by is that number we used in the previous metric, that is, the number of unduplicated students in column M, but after we subtract those in N/A. So I think, Maureen, you dutifully and graciously typed it in: it was that 6771 minus 79. That's correct, thank you, Maureen. That is, we're using that as the denominator here, not the total in M. Because again, those that are services only, we already kicked out. We've already eliminated them, or voted them off the island, so we're now measuring against those that were still left standing, and we're looking at the 12 or more hours. And Neal says yes, different regions, not all are the same, your mileage will vary, yes indeed.

But in any case, just to go back to the metric, you can see here 4372 divided by 6692, and that percentage shown is not right. I think I used the same one twice, sorry, that one I've got to change. I had the right percentage in another version and I overdid it with the editing. Anyway, that's not 69, it's more like 67, I think. Sorry, that was a typo; I used the exact same percentage and pasted it twice by mistake. I'll have to go back and send a new PowerPoint out. I think I pasted it twice because I have another version that had the right number in it. Yeah, thank you, everybody's jumping in. But you can see the right numerator and the right denominator. Yeah, I failed math, I used the same result as the other one, sorry about that. But in any case, hopefully you can see how you do it. You're looking at those that land in the middle section, comparing against those that are in the right hand section. But again, that little caveat or twist here: to calculate it correctly, you need to eliminate those who are in the N/A cell, that is, the 79 that we addressed at the beginning of this discussion, before you calculate the right percentage.

OK, I'll stop here and just make sure everybody's hanging in all right. Sorry I put that stupid number in; I'm going to leave it and just send a corrected version back to Veronica after. But does everybody understand these metrics OK, yes or no? Good quiz, that's right, giving you wrong answers and everything to keep you on your toes, right?

OK, so in any case, the third and final one from that data dive two years ago was looking at it the old fashioned NRS way. That is, if we're doing it the federal reporting way, we're looking at pre and post testing. I'll send that corrected deck to Veronica. Yeah, it was just a stupid mistake, I hit Control-V too many times. Giving you way TMI, I think. Anyway, CAEP pre and post testing: here is where we measure persistence exactly as the feds or NRS would do it, but we're getting our calculations from the CAEP summary rather than NRS.
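To pin down the two rates just walked through, here is the arithmetic as a minimal Python sketch, using the figures quoted in this session. The variable names are mine, not TE's:

    # Program enrollment rate (kick-the-tires metric): of everybody on the
    # CAEP summary, who has a connection to one of the 7 CAEP programs?
    total_on_report = 6771   # column M unduplicated total
    no_program = 79          # column M, N/A row: service only, no program
    enrolled = total_on_report - no_program           # 6692
    enrollment_rate = enrolled / total_on_report      # ~98.8%

    # CAEP participant rate: of those enrolled, who persisted to 12+ hours
    # (and thereby qualifies for the outcomes section)?
    participants = 4372      # column E unduplicated enrollees
    participant_rate = participants / enrolled

    print(f"enrollment rate: {enrollment_rate:.1%}")    # 98.8%
    print(f"participant rate: {participant_rate:.1%}")  # 65.3%

For what it's worth, on the quoted figures the corrected participant rate works out to about 65%, so the pasted-in 69% on the slide was indeed the typo.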
Bottom line, if we are a WIOA II agency, of course we can use NRS table 4 or the CASAS Data Portal, and that will work just fine, thank you very much. But again, we're looking at CAEP, so we should use the CAEP summary. The other, probably more important, reason is that CAEP obviously includes CTE, workforce prep, and those other programs that are not included in NRS federal reporting. So to have it be apples to apples, we might well want to consider using this method instead of the NRS way, so as to include all of those CAEP students that are not in ABE/ASE or ESL.

Again, apples to apples suggests that we use this left hand literacy gains section. We're looking at column B to show how many qualify; column C is basically looking at it from the, quote unquote, "table 4B" point of view; and column D is looking at it from the table 4 point of view, so to speak, where it's EFL gains. So it's the same approach we would use in a report such as the NRS persister, but setting it up in columns by program rather than by NRS level, and of course including all those non-WIOA II students. So if you want to look at persistence, we would simply look at column C divided by column B. That would give us the persistence rate.

That 70% is the rule of thumb I've been giving everybody for a while. If you look at the last year or so, that 70% is wrong; in COVID land, obviously, that went down. I'll say it went down, but not as much as people think. At least in '19/'20 it went down only a little, to like 66 or 67, kind of within the margin of error, I would say. 2020-21, we'll see. People think it's a lot worse, but I think it's only a little bit worse. Either way, I'm going to use 70% as the rule of thumb, because you're going to be looking at goal setting as we move into next year, that is, as we move into opening up, when we'll start doing full fledged testing again and all that. So 70%, a really generic rule of thumb. That's been our persistence rate for WIOA II for almost 10 years now; it's really been steady at 70% for a good long while, COVID notwithstanding. I dare say from 2012 to 2018 we were stuck around 70 for that 6 or 7 year run.

OK, anyway, so targeting outcomes. So this is another metric, wait a second, there was something-- OK, sorry, this one I skipped. It's the same exact slide, but it's column D divided by column B. So column C divided by column B gives you persistence; column D divided by column B gives you performance. If you use column C, that gives you the percentage in each program with a pre/post pair. Column D, Delta, divided by B, Bravo, will show you the percentage in each program that achieved an EFL gain. So you can do it either way, for persistence or performance, again giving you a very CAEP specific figure but using the NRS method of focusing on pre and post-test results.

And then here is one where we're fleshing it out more. This is one I remember covering, but not in nearly as much detail as those other three: a CAEP performance metric where we look at outcomes. It was a real simple metric. Some of you really liked it, I think, because it was simple. Some of you looked at it skeptically and would say it's not that accurate of a metric, another you-say-tomato, I-say-tomato issue. Some of you really love it though, so it's definitely at least in the something-is-better-than-nothing category.
We look at this outcomes section, we note the number of enrollees in column E, and then we add the unduplicated totals from columns F through L together and divide by that unduplicated number in column E. That gives us what we call the outcomes attainment rate. Now, I see what I did here, I pasted the wrong number into this last one too, so it's off; this one is actually about 90%. But a lot of you are going to be 1.0-something here: if you have a really good consortium, you've got more outcomes than you have students. You're just taking that total number of outcomes and dividing it by your number of students.

OK, so the next one is using the CAEP DIR. This will be in a lot of the guidance you'll get in the summer on different ways you can do things in NOVA; we'll obviously list the DIR. You can preferentially use the CAEP DIR; I reckon you'll be able to use other DIRs too, but we'll focus on the CAEP DIR for now, just for this snippet here, where right now we're looking at those prescribed ways on the CAEP summary. But again, just like we talked about with the set for WIOA reporting, I'm pretty sure you'll have the same flexibility when you look at CAEP goal setting, where you'll have a number of different ways to look at goals. Everything we've looked at here related to CAEP goals so far has used the CAEP summary, but this is just to show you that you can also look at that CAEP DIR.

What I'll also show is, here's a download. It's on the CASAS website; I'll send it to Veronica so we can post it on the CAEP website also. Here's the link to our website at CASAS where I know we have this document. That is, there's a document that has 10 quarters of Statewide DIR performance, at the specific item level. It basically goes from the start of the '18/'19 program year through the second quarter of 2020-21. So across 10 different quarters, you can look at end of year for '18/'19 or '19/'20 as well. So again, all kinds of ways to compare your own agency or your own consortium to Statewide performance at that link. And I'll just say that's the framework for how I suggest you use the DIR. Maybe there are ways of doing this that you know that I don't.

On the figures: I have a version where all of this is right. There were a bunch of typos. This is probably way TMI, but if I just hadn't spent time this morning second-guessing things, everything would have been completely right. I got nervous Nellie this morning and made a bunch of edits, and quite frankly I shot myself in the foot and messed it up; it's all kind of askew. If I could just undo what I did this morning and go back to my 4:00 PM yesterday version, everything would have been fine, quite frankly. So I'll send this out at the end of the presentation; I'm not going to spend 10 minutes redoing those calculations now. Just go back to the website after the presentation if you want to get into the weeds with that. Anyway, way TMI.

Anyway, here's the way you could use the DIR. Mostly, you would just compare, like we did with the Data Portal for NRS, where we look at averages and just say, yeah, we're going to be equal to or greater than the norm.
So in this example, we're at 34.3% of students without 12 or more hours of instruction. You'll just have to take my word for it, I'll probably mess this up too, but the Statewide average in 2018-19 was 24.1%. So we'll just look at it and say, yeah, at this agency we're pretty messy, we're not recording hours. For whatever reason, this is an area where we need to do a little bit better. So our goal this year: we're way below average, so we'll just work our way up so we're at least an average agency this year. By end of year, we'll be at that average of 24.1%. Just giving you an example; obviously, you could apply the same logic to any of those 27 items on the CAEP DIR.

OK, I'll just take any questions here. Some of it is the data figures, some of it is pointing to that link. But before we move to the last bit, everybody hanging in there OK? I'll take that as a yes, I don't see anybody objecting. OK, no questions. Thank you.

All right, so the next item up for bid is NRS performance goals. We talked a little bit about how that's done early on, so we're not going to repeat it. But what we will say is, obviously, many of you are WIOA II NRS agencies. Just as obvious, I think, is that CAEP is closely aligned to NRS and WIOA II. So any agency can actually do this, but I got to say it really only makes sense for those of you that are actually funded for Title II. If you're funded for Title II, then you have things like NRS reports and payment points that you're already using, and you can refer back to resources such as the CASAS Data Portal. Again, lots of really good ways to measure your local performance and compare it to the state. So if you are WIOA II funded, and/or have been funded for a while, all of those resources that WIOA II agencies use, you can of course use for CAEP reporting. If you do so, you need to realize that you're only measuring your students in ABE/ASE and ESL; you're obviously not measuring your students in the non-WIOA II programs. So most likely you'll want to combine this with some other metrics that do include all of your students in CAEP reporting. But certainly, as one way to measure, you can start using things like the Data Portal and focus on performance in ESL and/or performance in ABE. And obviously you'd have a great deal of additional resources at your fingertips to have more metrics, more data evaluation, and so on.

So if you were to do that for CAEP, you'd really just use the same method as in the NRS performance goals sessions. Here, we're just reverting back to what we showed you when we were highlighting NRS performance goals at the beginning of the session. We pointed out that ASE low is an area where we need to improve. So here's an example of a goal: in this case, all right, we will improve our NRS performance in ASE low from 22.6% to 25%. Just as an aside, that's a totally made up, arbitrary percentage by me. That isn't me clicking Control-V in the wrong place again; that was just me making up a goal. I'll just say that 25% is completely made up, maybe it's right, maybe it's wrong. But I did that on purpose to say, with this type of NRS goal setting, as well as all the different kinds of goal setting you might do, whether it's for NRS, whether it's for the CDE set, whether it's for NOVA and CAEP, whether it's for all of the above: when you set goals for all of these things, you're never under the gun to reach the "goal," quote unquote, or reach the average.
You always have the flexibility to set the goal where you want it. So sometimes that might mean you want to set the goal better than average. If our performance in ASE low is 50%, well, obviously it would be rather stupid to set our goal at 37% and say we're going to go backwards. If we were at 50% in ASE low, we might instead want to set that goal at 55% for next year. Conversely, in this case we're not just below average, we're way below average. You might say we're a borderline basket case as far as ASE low is concerned. So in this case, we are going to target it, we are going to set a goal for it, but we're going to be super conservative, because we know it's a big stinking mess right now. So we're going to be cautious and just say, yeah, we're going to really look at this, we're going to really work on getting better, but for now we're just going to get a little bit better, in hopes of cleaning it up this year and maybe making better strides in future years, with this year being the first year where we get things all cleaned up and good enough to move forward, whatever, whatever, whatever.

OK, I'm going to stop and just say I bird walked and I really didn't pause. That was a card-carrying Jay Wright special: five different things in one extended run-on sentence. So I'm going to stop and say, hey, is everybody still following me nonetheless? Thank you, Kareema, quick response. Everybody is jumping in, thanks very much. I know you'd all get the lasso out and wring my neck if I wasn't making sense, right? Yes.

OK, so here's another tip if you want to look at it using NRS related metrics rather than our homespun CAEP metrics, so to speak: we do have the CAEP tables in TE. Got to say, we haven't spent a lot of time on these. As we move into goal setting, though, you might say the time has finally come to start promoting them a lot better than we have. How many of you have used these, just out of curiosity? They've been in TE circulation since the summer of 2017. That said, we haven't spent a lot of time talking about them, because, I've got to say, a lot of folks, myself included, like the CAEP summary better for a lot of reasons. That's not to say these can't be helpful; it's just that in this myopic mind of mine, they're good, but not as good as the CAEP summary. But I'll just say, as we get into goal setting, we'll start using them more. I'll point out they've been around for four years now. Bottom line is, if you like the NRS way of looking at these metrics and you want it done for you with those preset NRS calculations, this would be a good way to do it. Instead of using the CAEP summary and doing all that crazy column calculating, you might want to just go ahead and use the CAEP tables, which have some of it built in for you. And the other advantage is they include those CAEP students in programs like CTE and workforce preparation that did pre and post testing but would never be included in any of the NRS reports. So I'll add, where these might especially be helpful is if you're a CAEP agency that's used CASAS pre and post testing for students outside of ESL and ABE/ASE. These would be especially helpful because they would include those CTE students that tested, and include them in the percentages, allowing you to evaluate all your CAEP programs in terms of pre/post gain, not just those in the NRS programs.
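Pulling the CAEP summary arithmetic from the last few slides into one place, here is a minimal sketch. The counts are invented for illustration; only the column letters and the divisions are the real method:

    # Pre/post metrics from the left-hand literacy gains section, per program.
    b_qualified = 500      # column B: enrollees qualifying for pre/post reporting
    c_prepost_pairs = 350  # column C: completed a pre/post pair
    d_efl_gains = 115      # column D: achieved an EFL (pre/post level) gain

    persistence = c_prepost_pairs / b_qualified   # C / B, ~70% rule of thumb
    performance = d_efl_gains / b_qualified       # D / B

    # Outcomes attainment rate from the middle CAEP outcomes section.
    e_enrollees = 4372                                  # column E unduplicated
    outcome_columns_f_to_l = [880, 610, 540, 720, 450, 390, 345]
    attainment = sum(outcome_columns_f_to_l) / e_enrollees  # ~0.90 here

    print(f"persistence {persistence:.0%}, performance {performance:.0%}, "
          f"attainment {attainment:.2f}")   # attainment can exceed 1.0

Note the attainment figure is outcomes per enrollee rather than a true percentage, which is why a strong consortium can land at 1.0-something.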
OK, moving along here, I guess. So the next item up for bid is defining goals for special populations. If you want some detailed examples, there are four, count them, four pretty good examples in the presentation we did a week ago. We looked at things like the barriers, we compared them by EFL, we compared them by CAEP program; I think we looked at labor force status for one example. I can't remember what they all were, but we used four examples and really looked at special populations. And in particular, we used that new ad hoc NRS crosstabs report. So we did, for example, crosstabs where we basically just use that mover: we select what field we want to populate the x-axis, we select a different field to populate the y-axis, and then presto, abracadabra, hocus pocus, there comes our ad hoc NRS crosstab report. How many magic words can we come up with? Which one did I miss? Presto, hocus pocus, abracadabra, which one did I miss? There's still an obvious one, fill in the blank, who's going to help me out? I know somebody is going to. Fantastico! All right, fine, I haven't heard that one, I like it. Voila, there's the one I was looking for. Or maybe it's just shazam. All right, I think I like fantastico better. Diana, I will definitely use fantastico the next time I get myself into that magician situation, that's for sure.

Anyway, you can use that ad hoc NRS crosstabs report in a variety of different ways. Here's an example that we used: in this case, we looked at barriers and MSG. I have these two listed on the screenshot just because they were next to each other. Got to say, for a lot of reasons, cultural barriers, you could use that, but that would not be one I'd recommend. But person with a disability, I think, is an outstanding example of one you might use for barriers. And that's the one I'm really focused on here, where I'm just keeping it simple: one axis is has an MSG, doesn't have an MSG. So you can see we've got 35 individuals with disabilities, 12 of them have an MSG, so we can just say, yeah, we're going to improve from 34% to 40% of those with a disability obtaining an MSG. So that gets directly at that example. I think a lot of you, like myself, have pointed this out: maybe at your agency you're doing a good job in terms of performance on table 4 and performance on the CAEP summary and so on, but maybe once you dig a little deeper and look at your special populations, your performance gets worse in a hurry. So if that sounds like you, look at your special populations, and you can set goals like in this example, where we're focusing on our students with a disability.

OK, and then consortium level reporting. We showed we have a few of those reports; again, per some of the questions, we still do need to build up the number we have available at the consortium level. But obviously, a way you could do goal setting is to look at it from a consortium point of view rather than an agency point of view, and we're just showing some examples. I got to say, when I got into this, this is one that is a stand-alone presentation, I think, for later in the summer; I couldn't really do it justice without taking 30 minutes to get into it. So I'll just say, for now, yes, you absolutely might want to look at those consortium level reports. If you're doing more with them than I am, I'd be happy to hear some of the wonderful ways you're doing this.
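Circling back to that crosstab example for a second, here is the goal arithmetic as a minimal sketch, using the figures from the slide; the rounding logic is mine:

    import math

    # Students with a disability, from the ad hoc NRS crosstab example above.
    subgroup_total = 35
    with_msg = 12
    current_rate = with_msg / subgroup_total      # ~34%

    # Goal: raise the subgroup MSG rate to 40%. How many more MSGs is that?
    # Integer percent math avoids floating-point surprises in the ceiling.
    goal_pct = 40
    target_count = math.ceil(subgroup_total * goal_pct / 100)   # 14 students
    additional_needed = target_count - with_msg                 # 2 more MSGs

    print(f"current {current_rate:.0%}; need {additional_needed} more MSGs "
          f"to reach {goal_pct}%")

Framing the goal as "two more students with a disability earning an MSG" can make a percentage target feel a lot more actionable at the classroom level.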
OK, and then consortium-level reporting. We showed a few of those, again in response to some of the questions. We still need to build up the number of reports we have available at the consortium level. But obviously, one way you could do goal setting is to look at it from a consortium point of view rather than an agency point of view, and we're just showing some examples. I've got to say, when I got into this, I realized this one deserves a stand-alone presentation, I think for later in the summer; I couldn't really do it justice without taking 30 minutes to get into it. So I'll just say, for now, yes, you absolutely might want to look at those consortium-level reports. If you know more ways than I do, I'd be happy to hear some of the wonderful ways you're doing this. This is one for me to develop, I think, into a more cohesive, concrete presentation later this summer.

But here are some examples of ways you could do consortium-level goal setting. In this case we'll say everybody in consortium X will have no more than 27% of learners missing the 12 or more hours of instruction. A lot of this is in the way you write it. Maybe we want, consortium-wide, an average of 27% or better. In this case I didn't write it as well as I should have; I should know better from my IEP experience in a past life, because you've got to write these exactly. What I was really looking for, I think, is this one, where we said, OK, this agency is going to improve to 34%. That's an IEP way of writing a goal that I like. Here, we're basically setting the bar at 27%. It's an odd way, but it really is a good way to get everybody over the bar: what you're basically saying is every single agency will be at 27% or better. At the individual agency level we'd say we're going to be at 34% or better, but as a consortium director, come on, give me a break: most agencies will do great, but there'll be that one sticky wicket that doesn't quite make it. So we'll set the bar a little lower for everybody, and we'll know that if every agency gets over that 27% bar, we've got a good shot at averaging that 34% at the consortium level. Hopefully you can see the method to the madness; there's a quick sketch of that floor-versus-average arithmetic below. That's one way: set the bar low enough that every single agency in the consortium has a really good chance of getting over it. Or we can go the other way and just set an aggregate percentage, where here we used another metric: all agencies in our consortium, at the aggregate level, will be at 70% or better pre/post test persistence over the next year. You say tomato, I say tomato; they're different ways of looking at it from the consortium level, where you can say all agencies will do at least this, or all agencies in the consortium will average at least that. Of course, there are many other ways you could sprinkle that love around to measure things at the consortium level.
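Here's that floor-versus-average idea as a quick Python sketch, with made-up agency rates; the 27% floor and 34% average targets come from the examples above, while the agency names and numbers are invented.

```python
# Two styles of consortium-level goal, checked against invented agency
# rates: a "floor" goal (every single agency clears 27%) versus an
# "average" goal (the consortium as a whole averages 34% or better).
agency_rates = {"Agency A": 0.42, "Agency B": 0.37, "Agency C": 0.30, "Agency D": 0.29}

floor_met = all(rate >= 0.27 for rate in agency_rates.values())
average = sum(agency_rates.values()) / len(agency_rates)

print(f"every agency over the 27% bar: {floor_met}")
print(f"consortium average {average:.1%} (34% goal met: {average >= 0.34})")
```

Note the two checks can disagree: a consortium can clear the floor at every agency and still miss the average, or hit the average on the strength of a few strong agencies while one slips under the bar, which is exactly why how you write the goal matters.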
OK, and then other considerations. If you're a veteran of that NRS performance goals series, yes, this slide looks suspiciously familiar, but there are lots of ways of doing it. The slide was written for NRS, but it applies to CAEP too. Integrated Education and Training: admittedly that applies a little more to WIOA Title II than it does to CAEP, but nonetheless it's a good goal for CAEP. If you are an IET agency, there's absolutely no question you could put all your eggs in that basket and say, yeah, our goal in '21/'22 is to develop a state-of-the-art IET program; we've got it up and running, but it's not as good as we want, and this year our focus is on making it the best anybody's ever seen. Obviously that's a great idea, and obviously it indirectly improves your CAEP program; I think most people would say it dramatically improves your CAEP program. So certainly a good idea if that's what you want to do. We've talked about things like geographic data, and in some of the panels I've done we've had panelists who do a great job with that. Again, look at the communities in your region and develop goals by region. And tie it to your 3-year planning; that's obviously what we're getting into with CAEP 3-year planning.

You could also look at things like your MOU with WIOA partners: how are you collaborating with Title I, Title III, or Title IV? Some of you have formal goals you're setting locally through FPM and WASC, and I've heard recently those are now actually aligned. So if you've had a Federal Program Monitoring visit from CDE and/or a WASC visit recently, maybe look at some of the things they identified. I've brought this up as a good way to handle NRS goal setting; if you're doing it for CAEP, it's equally valuable in my opinion. That's another way to do goal setting. Here's another obnoxious cliche: if you want to kill two birds with one stone, this might be some low-hanging fruit, where you look at what the WASC evaluators said you needed to improve. You may agree, you may disagree, but you've got to improve it whether you like it or not, so use that area as an area for improvement for CAEP. Again, kill two birds with one stone. Aligning to regional priorities is one we've talked about in the NRS performance goals series, where again you're looking at your region in the aggregate. Sometimes it's what you do for CAEP, looking at the college and the different K-12 districts and how you're collaborating. Sometimes it's your WIOA partners, sometimes it's local employers, other times it's local government, and yet other times it's all those outside stakeholders, but in each case you're looking at how you're collaborating. Look at results that might involve good effort from you but might also rely upon good effort from somebody else.

OK, coming up for air. We've got 8 minutes left. I've got to say we're right on cue -- OK, 7, but anyway, we're right on cue nonetheless. I'll just say, any questions? Hopefully you can see, with every single area I covered, how I could have gone at least half an hour on each and every one of those line items. You might say all of those line items were covered unsatisfactorily, because all of them had a lot more room for growth, but I tried to give a little bit of attention to each of them. Again, Neal's probably already stated in the chat that we'll be doing lots more training, is my understanding. This is really just a first salvo. We'll be training at Summer Institute, we'll be training in the middle of the summer, at the end of the summer, in the early fall, at the end of the fall; we'll be training all over. Isabel, send me an email if you don't mind, and I can connect you with somebody at CASAS for that. Very good question, thank you. OK -- all right, thank you, Diana; thank you, Connie. So there are lots of other things you could explore. I'll stop for any questions, and just say thank you, Diana, Connie, Isabel, great questions, great comments confirming understanding. Any questions before I do these last couple of slides?

OK, so I'm not going to dig into it, but I'm just showing the Statewide CAEP summary from last year. I wanted to do this because whenever we look at things like goal setting and these comparisons, nobody can really do it without saying, hey, wait a minute, I want to see if I'm above average or below average. For WIOA, I can always say, well, just look at the Data Portal; I have an easy out. For CAEP, I don't have an easy out. So you might say here's my best stab at giving myself an easy out, where I'm showing you the Statewide CAEP summary.
So if you want to see how well you're doing on HSD or HSE, or passed I-3, or maybe you want to focus on how many received support services, or the percentage with a pre/post pair, there are all kinds of different ways to do this. But if you want something to hang your hat on and a method for comparison, here's something instead of nothing: the CAEP summary for 2019/2020. So again, here are our Statewide persistence and Statewide performance rates for '19/'20. You can see they're a little lower than usual, but not as bad as everybody expected. So if you want to compare yourself to '19/'20, here you go. Again, everything you see here is on the slides. And then here's the outcomes rate. Again, we were almost at one to one. You might remember from two years ago, some of you calculated a 1.0-something and thought you'd calculated wrong. Well, no, you didn't: if your rate is better than 1, that just means you're from an agency or a consortium that's doing a really good job with this. You can see Statewide we've made good strides here; in 2018 I think we were in the low 80s on this one, and now we've gone from the low 80s to the mid 90s. And then here, services: our Statewide enrollment rate was 89%. The persistence rate was -- sorry, that's the participants rate, not persistence, I'm talking too fast. The participants rate, meaning of everybody on the map, how many had 12 or more hours of instruction, was 69%. So I'm just giving you some Statewide averages as a rule-of-thumb reference. Sorry, I can't get out of the way of my own cliches this afternoon.
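For anyone who wants to see the arithmetic behind those statewide figures, here's a back-of-the-envelope Python sketch; the counts are invented, only the resulting 89% and 69% echo the numbers above, and the outcomes piece reflects one plausible reading of why that rate can land above 1.0, namely that a single student can earn more than one outcome.

```python
# Back-of-the-envelope versions of the statewide rates, with invented
# counts. Only the resulting 89% and 69% mirror the figures quoted above.
adults_served = 100_000  # everybody "on the map"
enrollees = 89_000       # enrollment rate: 89%
participants = 69_000    # 12 or more hours of instruction: 69%

print(f"enrollment rate:   {enrollees / adults_served:.0%}")
print(f"participants rate: {participants / adults_served:.0%}")

# One plausible reading of an outcomes rate above 1.0: total outcomes
# earned divided by enrollees, where one student can earn several outcomes.
outcomes_earned = 92_000
print(f"outcomes rate:     {outcomes_earned / enrollees:.2f}")  # > 1.0 is fine
```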
And that's it; that's everything I had, and we're right at the end. Sorry if that was a fire-hose treatment. Yet again, I'll just say what this presentation has shown me is that we have lots and lots of room for growth, and lots of different ways in which we need to develop more training. Some of you have sent me some really, really good examples for this, and there were some of you I even blatantly told I would use yours in this presentation, but I didn't, out of lack of time. I will be looking at this more for future presentations. If you have examples of goals like the ones in this presentation, maybe you think you've got a great brainstorm, the greatest idea you've ever had, send it my way and I will be sure to try to incorporate it for the benefit of everybody. And if you want to chat and go over this, I would gladly do that; it always helps me a lot more than it helps you, because it gives me ideas for future presentations. OK, Veronica, or maybe it's Mandilee or maybe it's Howie, take it away, please.

VERONICA PARKER: Thank you, Jay, and thank you, everyone, for participating in this afternoon's webinar. As we mentioned, we will receive the updated slide deck from Jay, and we'll be sure to send that in the follow-up email along with the recording as well as the link to the evaluation. My colleague, Mandilee, has posted in the chat a link to the evaluation, so if you do have a couple of minutes at this time, please go ahead and complete it. It only takes about 1 to 2 minutes, but it gives us a lot of information we can use for planning purposes with Jay and CASAS and the rest of our contractors as we continuously plan and put out professional development over the next few months. There is also a link to register for upcoming webinars, so be sure to check that out; we are always adding more webinars to our list of offerings, so check back continuously. We also communicate via the newsletter, so if you are not registered for the newsletter, please go ahead and register so that you can receive all of the latest communication regarding our upcoming offerings. And additionally -- I just lost my train of thought -- you'll receive a follow-up email tomorrow with the link to the recording as well as the evaluation and the PowerPoint presentation with a supplemental form. Oh, and I just remembered what I was going to say: in your chat, you have also received a link to our student data reporting page on the CAEP website. That's where we house all of our webinars related to CAEP data reporting. So if for any reason you missed, or want to go back and look at, part 2 or even part 1 of the CASAS data dive series, they are posted there and available for access, whether you want to reference them in the future or pass them along to colleagues who may want those materials. So again, thank you all very much for your time and your participation, and we look forward to seeing you soon. Have a great afternoon.