Jay Wright: All right, thank you. I guess I should share here-- I have the PowerPoint here. I'll blow it up. OK. And so I'm eyeballing the participants. It definitely looks like we have a lot of people that have been to these sorts of sessions in the past. A lot of you, I think, have probably been to some of these sessions we've done here over the last month, where we're talking about performance goals from a federal perspective.
A lot of these concepts, of course, are the same. The way to evaluate them isn't too different. The idea here is we're trying to relay roughly the same sorts of concepts we've been talking a lot about in these NRS performance trainings, but trying to apply them, to some extent, with more of a CAEP spin. What are some CAEP reports you can use for this? How does this all relate and interact together?
So a lot of you have been to these sessions over the last couple of months. So you know I have an unabashed affinity for analogies. This time my analogy is going to be the ice cream scoop, where we're trying to do a double-dip cone here. So today is kind of like that first scoop. We've got a nice full barrel of ice cream, so we're just going to stick that scoop in, scrape out whatever we get, and have a nice heaping single scoop.
And then next week, hey, we're going to make a double scoop. So we're being a little bit more deliberate in how we're scooping, because there won't be as much left in the bin when we need to do that double scoop. But we still want to get a nice hefty dollop for that double scoop. That's my analogy. We're not separating it into qualitative and quantitative, or separating it by persistence and performance. We're talking about all of this both times, but we'll just kind of collect a little bit more along the way when we do it the second time here next week.
Not unlike that whole circle of life or house of mirrors sort of stuff we've talked about with our data in other sessions. So here is the agenda. We'll start with just some basic updates. Some of you have been to TE network meetings and WIOA network meetings, so you've heard about this over the last few months. But it's been about four months, I think, since I've done anything with TAP-- kind of the longest TAP gap I've had really since COVID hit in early 2020, I've got to say.
So it's been a while since I've done much of anything for TAP. So a little bit of reset on some of the new nifty stuff we have in TE for CAEP reporting. That's really where we need to start. So we'll do it here at the beginning of the first presentation. We'll reintroduce those metrics. You know, Howie, Mandilee, I think you talked about this a little bit in that CAEP session yesterday that Veronica and Myra and Neil did. I think you kind of hit the reset button on what the mandatory metrics are, the optional metrics.
Kind of want to reset that one again. Want to align some of what we're talking about with CAEP reports and TE to those metrics. There's a million ways we could do that, of course. Seems like the best way is let's take what's right here in front of us and come up with some ideas related to that. And then we'll start looking at those TE solutions for consortium and agency level metrics. You know, again, going through the different metrics that CAEP and NOVA have kind of already laid out for you, kind of one by one.
Some will touch on a lot more than others obviously. But looking at those metrics, giving you some solutions in TE for which specific reports to use for those metrics, maybe different ways of looking at it. And then that fourth bullet is where we're hopefully going to dig into something new and not just recycled information from last fall or from some of the other workshops we've done, where we'll have our TE solutions related to those CAEP metrics and CAEP reports.
But then furthermore, we'll kind of provide some tips, a little bit of a shtick with it. You can decide whether you like it or hate it, I guess. But we'll try to look at it from the, hey, here's what our metrics are, here are some specific TE features and TE reports you can use for each metric. And then, oh, by the way, as you're doing that, here are the kinds of questions you want to be asking yourself, asking your staff, asking your partners, whatever, when trying to figure out what's going on with your data and establishing all of this stuff.
And then we'll look at some optional suggestions. Again, looking at it in terms of TE solutions and data evaluation tips. And then we're not going to do the second half until next week. But we'll kind of combine it. A lot of you have done these other ones where we look at agency level and student level solutions. So we'll look at what we've introduced here in this first session and kind of transpose it over some of those agency and student level solutions we've talked about quite a bit when we've done NRS performance goals.
And what do you know? That yields a whole bunch of other questions we should be asking ourselves, a whole bunch of other tips we might benefit from when we're trying to solve these sorts of issues. So I guess I'll just start by saying, does that sound more or less like what you were hoping to get, or expecting to get yourselves into, this afternoon? Yes or no? Any comments, questions, concerns about any of that? OK, well, a few of you are going to go ahead and bravely say, yeah, that's exactly what you were thinking. Thank you for that. OK.
So here, we'll start by-- sorry, I'm going the wrong direction already. So OK. So we'll start with some of our reports updates. So how many consortium managers do we have here? Either the ones who are running the consortia, or maybe you're not running the consortia, but you're the TE consortium level data manager for your consortia. How many of those do we have here? Nobody jump at once here, I guess. Maybe we don't have many of those. Maybe this isn't something we need to talk about much.
Oh, right here. OK, there's Janie. OK, you think you-- OK, so we're coming out slowly but surely. So Connie, Chris, yeah, you're pretty close to it. So I'll count you, Chris. And I'll count you, Connie. Yeah, I know you belong, Cathy. All right. So plenty. All right. So I was kind of feeling like I was preaching to the choir at the beginning. Hey, wait a minute. Maybe I'm not. All right, slowly but surely you're coming out of the woodwork.
So I'll just say I'm pointing you out and making a big fuss about this because-- and I know some of you are the very specific people I'm referencing here-- there's been a bit of an influx, I've got to say, of consortium managers asking for really the same old reports we already have in TE, but looking to have that consortium level view in addition to, or instead of, the agency level view.
So sorry, I've got to step out for a minute and kind of wonder: as consortium managers-- Sherry, Cathy, Connie, and Chris, you qualify-- would you agree that that need has increased lately, number one? And if so, would you be willing to proffer a guess as to what it is? I kind of feel like this CAEP reporting phenomenon is not new. That's been a pretty constant flow for five years. But I've got to say the spin where you're looking for things at the consortium manager level, that's unique to the last year.
Anybody want to proffer a guess or a theory on why that might be? Or am I just out to lunch and imagining stuff that hasn't really been happening at all? Any of you with anything to share about that, seeing as how we're outing our consortium managers at the moment? Yes, yes. OK, so Connie, you're just giving the who's-buried-in-Grant's-tomb answer that, hey, we're doing three-year planning, so of course we're asking more for it. Is that what you're telling me?
OK, Cathy, all right. You're solidifying that a little bit. Thank you. Anybody else? Anybody agree with Connie or have anything else to bring up on why that is? Sorry, I'll never get a better chance than this to get a good answer. So sorry. OK, Janie, Sherry, everybody's jumping on. All right, Connie, you've got a fan club. Great. All right.
Good enough. I mean, it's not a biggie. I'll just never get a better chance to get this question answered than now, so sorry, I've got to take advantage. So to that end, we do have some new consortium manager tables. The I-3 summary is not really new-- at the agency level, that's the same old, same old we've had available for about a year now. But the consortium manager version has only been available for about a month, six weeks maybe. So we're saying it's new because now it's at the consortium manager level, not just the agency level.
Then another question-- you're supposed to get questions answered here, not be answering my questions-- but if any of you looked at this enrollment by zip or enrollment by zip and demographics-- oh, wait. We've got something new. OK, all right, so you're just kind of giving more information than just what Connie said. A little deeper answer. Thank you, Sherry.
So if any of you happen to have looked at the new student enrollment by zip or enrollment by zip and demographics, these two reports really are new. I think they came in the build about two weeks ago-- two Fridays ago, the 18th, I think, was the build when these middle two bullets came in. All right, thank you, Cathy. You've looked at it and liked it. So there's one that just looks at enrollment by zip and city. That was what was requested.
You know, I don't think Ryan's at Mt. SAC anymore, but this is one he requested. If he was at Mt. SAC, I'm sure he'd be here. But he brought this up quite a while ago, wanting to figure out, hey, which communities within my consortium or region are well represented. Which ones are not well represented? He was noticing patterns but wanting to be sure that's right. That's what he wanted. Kind of facing that COVID marketing problem. Hey, we need to reach back out. So which communities do we need to reach out to and so on?
So we have one by zip and city. That one, I think, is pretty easy to manage. That's one you can run right away-- ready to use, fresh off the shelf, whatever cliche you want. It just gives you a breakdown of enrollment by zip code, which is what we at CASAS thought would be the best way to do it. But a drill down?
That's a good question. Could you send me an email on that? I mean, right now I think it's mostly a consortium manager report, so I think the answer to that one is no. And that's because we only have it at the consortium manager level, not the agency level at this point. We go out of our way to say no drill down at the consortium manager level, only the agency level. So I think we kind of need to get to where this is an agency level, not just consortium manager report. But I can kind of look into that.
Send me an email though, just so I don't forget. Cathy, that's a good question. But we have zip and city, which just allows you to get a breakdown of here's the communities that are contributing the most, or not, to our enrollment count. I think, Cathy and Chris, you both said, hey, it's kind of intended for marketing. I think that was Ryan's request. He just wanted to figure out which communities he might need to do more with. Hey, we know we should really be serving this community. It's one of the largest communities represented in our consortium, but for whatever reason, we're not reaching students in that community.
Maybe we need to do a better job. Whatever. We added a more complex one by demographics that I'd say is way more informative, but more difficult to use because it's really big. I'll show you a screenshot of the PDF file just so you can see. We're kind of looking to see if maybe there are ways to put it in more bite-size morsels. But it's taking everything you know from that second bullet and just adding a three-dimensional variable, where we're looking at student enrollment by zip, and now we're also looking at the third dimension.
It's breaking it down male/female, race/ethnicity, age range, whatever. Are there patterns within these communities where we've got maybe more males than females? Or more age ranges or ethnicities represented, et cetera, et cetera? You know, where you can break it down and see if there's maybe demographics-related issues to these communities that might shed more light on this kind of information. So anyway, that's also new.
And then there's one that I think is brand spanking new, called Butterfly Hours. I'll say that's really, in my opinion, a lot like the ones we've already been offering related to enrollment by hours and service enrollment by hours. But it's got a little bit of what I'll call a cutesy shtick. It has different hours ranges than our official CAEP reports do. And similarly, it's kind of set up with cutesy titles, like the different stages of the butterfly. It was introduced by LAUSD. I know they think their stages or phases are a lot better than the ones we have in our CAEP reports.
There are some people in different consortia that strongly agree with them. So we know it's not just a hey-it's-our-way-or-the-highway sort of thing. We know a lot of people have warmed up to this, so we added it in TE for everybody that likes that. So anyway, these are our consortium manager reports. Again, we've focused on this a lot because that's what you've been asking for a lot here lately. It's more at that consortium manager level.
OK, and then here's just another way of looking at it. Some of these are available right there on the TE menu. Some, I know, we kind of need to do a little better job with our own alignment. But some of them are available, but not directly from the TE menu. So if you're having trouble there, you might just need to go to those general CAEP manager tables, and you can see all of the ones that are available at the consortium manager level or in that report selection screen in the setup window. So here's just another way of looking at all the different manager reports that are available.
OK, I'm sticking with the setup window here. Again, for some of you, this is going to be super duper helpful. For some of you, this is going to be nothing more than authentic frontier gibberish, I realize. But just to show you, for those of you that are dialed into TE, these are two really, really useful check boxes in the setup window. One allows you to look at chart analysis for some of this.
At the agency level it's useful, but not always. At the manager level, I'd say it's more consistently useful probably to be able to get those nifty charts. You're more likely to want to present data at board meetings or whatever. You're less likely to want to drill down or whatever. So that chart analysis is probably going to be helpful.
Another one that's important to managers is this checkbox called Aggregate Multiple Agencies. For most reports-- I'm not sure if this is true 100% of the time-- but usually by default this checkbox will not be checked, meaning when you run it at the manager level, you're simply going to get separate reports for each of those agencies in your consortium. So I see Sherry's picture on my screen, so I'll pick on her because she's the perfect example here, where she's got 22 agencies in her consortium. So all she's going to get are 22 different pieces of paper, one for each of the agencies in her consortium.
That might help, but more likely she might want to check the box here called aggregate multiple agencies, so she can just get that one aggregated report that has all the information from her consortium all locked into one nifty, handy dandy report. You say tomato, I say tomato on this one. You can have separate reports for each agency, or you can have one aggregated report for your whole consortium. Either way, if you want that one nifty aggregated report, you need to make sure that the box at the bottom of the slide is checked.
If it's not, you're guaranteed to get it separated by agency, not aggregated for the whole consortium. Does that make sense to everybody? Especially if you don't know nothing about TE. Just wondering. Any of you that don't know nothing about TE, did that help, or was that totally three minutes you'll never get back? Just wondering. No news is bad news probably here. OK, thank you, Cathy. All right. Well, Connie, I know you use TE. But anyway, thank you, everybody.
So anyway, here's the zip code and city. This one, you know, I think that if you run this at the agency, there is some drill down here now that I'm looking at the copy. But again, I don't want to boldly make proclamations that I'm not sure about. So I'll still give you the same answer, Cathy. But you can see it gives you the enrollment by program.
This is just kind of a bean count. This, I really think, captures most of what Ryan requested last year. It's just kind of figuring out, hey, by zip, who is showing up at our classes. Maybe by program. Maybe our ESL students are coming from a different spot than maybe ASE. Maybe that's different than CTE, whatever. So in any case, this is kind of what it looks like. It allows you to break it down and see where your students are coming from.
Here is what I was talking about when I say unwieldy. Obviously, I'm being cutesy and showing you this as screenshots. You know, hey, this is a good reminder that if you're trying to hire somebody for your marketing department, I'm probably a bad hire, because I'm just going to scare everybody away rather than draw them in. I'm using vinegar here, not honey, aren't I? But in any case, this is what our enrollment by demographics and zip does.
Again, it's three-dimensional. We can't figure out a way to make paper three-dimensional, so we just have to spread it out more and more and more until it becomes a crazy, you know, wallpaper report, rather than anything we can ever print out and spread out on the kitchen table. But I use this a little bit to be cutesy and a little bit to show how much information there is by demographics and zip. We know that there's a fountain of information here. We feel way more than 100% confident that the information on this report is really valuable.
But at the same time, we're equally sure that if it's like this, it's not going to help you much. But it's three-dimensional. It's kind of hard to get it all. It's just too much information to make it easy. So maybe there's ways to make it more bite-size morsels. Maybe there's ways to prioritize and use some of this, but not all of this. If you're finding any of this to be more useful than others, we're still happy to get feedback on ways that we might have more options to make this more useful and a little easier to manage. Just say it.
Well, it gives you-- well, you can break it down by agency. I think if you run it for the consortium level, it doesn't. But you could go back to that setup window. You can run this by agency. So Cathy, you could run this with that box unchecked, and then you'll get a report for Ventura, a separate one for Oxnard, a separate one for Santa Paula, et cetera. Hopefully, you're getting my gist there, Cathy. So you can do this, but I know that's not what you're asking. So you can certainly add this to give you a little better detail probably than what my answer gave you.
OK, then here's what you get when you check that. You get more than this. This is just the top row or whatever. It gives you more than just pre/post. But pre/post, as you know, is perfect for numbers, so I've got to say the pre/post charts usually look a lot snazzier than the other charts do. It's not just CASAS people making a big deal about pre/post and ignoring everything else-- there are charts for all of these different types of gains, not just pre/post. But hey, here's some positive marketing, where I know your average agency just has niftier, snazzier data for pre/post than for other areas.
All right, so I'll stop right here. We're kind of changing gears a little bit but sticking with quote, unquote, new reports. I'll point out, though, that I'm talking about reports that are really not new. The last time we did CAEP TAP training was on December 3 of last year. At that time, these were new reports, but that was almost, what, four months ago? So they're definitely not new anymore, but here they are: one called CAEP Enrollees by Hours, one called CAEP Service Enrollees by Hours.
Again, these were new in 2021. Now that it's 2022, they're not so new anymore, but it's been a while since anybody's heard from me, and I know a lot of you have not used these. So here you go. Here's the Enrollees by Hours. We did these specifically because we thought they'd be really, really useful for this CAEP goal setting activity. That's the area where we're moving more people-- you know, we're moving more people out of the adults served bucket, we're moving more people into the enrollees bucket, we're moving more people into the participants bucket.
So we're trying to get more students to have 12 or more hours. We're trying to get more students to have some hours instead of no hours. So this tracks the progress for students according to those three buckets that CAEP is using for those mandatory metrics. It's basically taking the CAEP summary and breaking it down into those same basic three categories you already know and love on the CAEP summary. That is, the literacy gains or pre/post bucket, the CAEP outcomes bucket, and the services only bucket on the right hand side.
For the left hand side, it's a moot point because everybody has to have 12 hours, like it or not. So we don't waste your time or the programmer time by breaking down something that ain't going to help any. But we do break down the outcome section. We do break down the services section into three additional categories. That is, those three buckets you've been hearing so much about for goal setting: the 12 or more hours bucket, the one to 11 hours bucket, and the zero hours bucket.
Here's how that breaks down with those students that qualify for the outcome section of the CAEP summary. Here's how that information breaks down for the students that are in the services section of the CAEP summary. So you can do that detailed tracking according to enrollment and hours.
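If it helps to see that bucket logic written out, here's a minimal sketch. The three cutoffs are straight from the slide, but the student records and field names are invented for illustration:

```python
# Minimal sketch of the three CAEP hours buckets, assuming each student
# record carries a total of instructional hours. Field names are
# hypothetical, not actual TE export columns.
students = [
    {"id": 1, "hours": 40},
    {"id": 2, "hours": 6},
    {"id": 3, "hours": 0},
]

def hours_bucket(hours):
    """Classify a student into the three goal-setting buckets."""
    if hours >= 12:
        return "12 or more hours"
    elif hours >= 1:
        return "one to 11 hours"
    return "zero hours"

for s in students:
    print(s["id"], hours_bucket(s["hours"]))
```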
So I'll stop here. Does that explanation make sense? Yes or no? I kind of went expedited on that one. I'm stopping and smelling the coffee all this time, and now I'm hurry, hurry, hurry. Did that help? Hopefully. So we're going back to the CAEP summary here. This report obviously isn't new. You know, have we all seen and been familiar with the CAEP summary? Yes or no? I know I'm asking it in such a way that if your answer is no, you're already officially scared away, I admit.
But that said, does everybody already feel like they're familiar with the CAEP summary? Vaguely yes. OK, so we've got a vaguely here. We don't have any no's, but we do have a vaguely. OK, so here it is in all its glory. I'll point out literacy gains to the left, outcomes in the middle, services only to the right. Columns B, E, and M are the main ones. OK, we sound like we're talking about royalty there-- the Black Prince, maybe. Was that the prince that was there in like the 1300s? I think he died a premature death. Or maybe it was Henry II, the Vague.
Anyway, so we've got those three outcomes or those three sections of the CAEP summary. The different areas based on the criteria we know and love. Again, we're using NRS data in that left hand side for pre/post. We're using non-NRS data in the middle, but we are using all of those NRS parameters, like demographics and 12 hours of instruction. And then over to the right, we're ignoring all of that. No hours required, no demographics required. Not even CAEP enrollment is required. As long as you show up somewhere, you'll show up in that right hand services section.
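If it helps to see those criteria written out, here's a loose sketch of the three sections as just described. To be clear, this is not the actual CAEP summary specification-- the field names are invented, and the real qualification logic lives in TE:

```python
# Loose sketch of the three CAEP summary sections described above.
# Field names are invented for illustration only.
def caep_summary_sections(student):
    sections = []
    # Left: literacy gains, based on NRS pre/post testing data.
    if student["has_prepost_pair"]:
        sections.append("literacy gains")
    # Middle: CAEP outcomes, using non-NRS outcome data but keeping the
    # NRS parameters: demographics on file and 12+ hours of instruction.
    if (student["caep_enrolled"] and student["has_demographics"]
            and student["hours"] >= 12):
        sections.append("outcomes")
    # Right: services only. No hours, no demographics, not even CAEP
    # enrollment required; showing up with a service is enough.
    if student["received_service"]:
        sections.append("services")
    return sections

print(caep_summary_sections({
    "has_prepost_pair": True, "caep_enrolled": True,
    "has_demographics": True, "hours": 40, "received_service": False,
}))
```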
I just bring this up to get our foot on the rail. And I do think a couple of these columns are only like a year old. There's that new I-3 column, column F. And we added that career services column all the way over to the right. Those don't really affect your data results much, but they are new in the last year.
And then I'd be remiss without talking about the CAEP DIR. We haven't talked about the CAEP DIR in a while, so here you go. The big thing I think is worth bringing up is there were some new items related to things like phone number and address and Social Security number. A couple of you talked about how you've been dialed more into WIOA lately. You know, I guess probably I have too in some ways.
So with these new DIR items-- we've talked about them lots for NRS and WIOA, but we haven't really brought them up for CAEP as much. Those same new items that you're already loving a lot, I'm sure, related to data matching and the follow-up survey, we added to the CAEP DIR, just like the NRS DIR.
The bigger issue with the CAEP DIR, though, for goal setting, I think, is not with the report itself. But we do have that document with three years' worth of annual and quarterly statewide averages. I know some of you, in looking at these data points, have decided, you know what, maybe LaunchBoard isn't it for us, and hey, that TE CAEP summary really isn't for us either. We want to use the DIR because it's the one we're using every quarter. It has a lot more resonance to everybody at the ground level, so we want to make it the DIR, so everybody's on board.
The more esoteric stuff we get in the CAEP summary, or other CAEP reports in TE, or any of the data from LaunchBoard, seems more like pie in the sky to us. We're going to stay grounded and use the DIR. There are no real, you know, targets for that, though, so instead you can use averages. We've got a document that summarizes three years' worth of averages for CAEP. There's a similar document that summarizes three years of averages for WIOA NRS. So pick your poison. There's some really good baseline information you can get from this link.
OK, so I'll stop here. We're at a transition point. Everybody hanging in there? Yes or no? Or kind of? I'm officially out of my hemming and hawing phase here. We're done kicking the tires and introducing. We're going to start diving in from this point forward. Everybody feel like you know what we're all talking about here? You've kicked your tires. So is it too much air, not enough air? Most likely, it's plenty of air, but it's like at about 45 PSI-- it's going to explode, it's so full of hot air. That's probably where you're at. Too much air or not enough, right?
Do you have a cast on your foot now because your tires were rock hard, they were so full of hot air? All right. No taking the bait there. All right. So we've set up. We've done our coffee table book about coffee tables. So now we need to begin our work focusing on improvement. A lot of you, I know, have been on these performance goals sessions for NRS. So we talk about, yes, we need to target your data to the areas of need. We're going to spend the next hour talking about that.
And then we're also going to be talking about some of these cutesy titles. Again, it's not like I invented these names or anything. These are all real common, you know, English language words. But I'll just say these are not from any textbook. These are not terms that are used all the time for this. I did kind of make these terms up to a certain extent. Full disclosure right off the bat. There is a made up element to this. This is kind of how I see it. Words I like to use, not necessarily words everybody likes to use.
But anyway, these are some really good concepts, in my opinion, you might want to employ when you're reviewing the data. So we're assuming at this point we've figured out the steps in TE. We know the reports. We know how to generate them. Maybe we don't know much more than that, but we know how to get the reports. So now that we know how to read the reports, what do we do with this onslaught of information?
Well, here's just some basic tips. Longitudinal data is going to be a point all through today and next week. Always good to know your history. Usually when I say longitudinal data, that means, hey, you don't need to go back forever, but go back a few years, especially now that we're in the COVID era. This is always true. But in the COVID era, you might say, way more true even than normal times. What I would say is normally, I would say look back about two or three years and you're fine.
Because we're in the COVID era, you may need to work it back more like four or five years, not just two or three years, knowing that 2020-21 and 2019-20 are going to be a little bit out of whack because of COVID, of course. So look at your data longitudinally. Have we always been this darn good? Have we always been this stinking bad? Maybe we're always up and down. Maybe we're really stinking bad now. But if you go back five or 10 years to the glory days, we were an award winning program. What on Earth happened?
Maybe we were an award winning program until 2020, and then we fell like a ton of bricks because of COVID. Whatever. That isn't necessarily going to help us find the answers. But knowing that context is going to be mandatory if we really want to know what the right answers are. The solutions for a once hallowed program that's come crashing back to Earth are probably a different set of solutions than for the program that's always been a basket case and never been anything other than lousy, for example.
I'm being a little flippant with my remarks, but I'm being that way on purpose to try to make it resonate. And again, this is where made-up Jay terms come into play. One term I've really taken an affinity to is called hot spots. Another one is called diamonds in the rough. Kind of the same concept. That is, when we identify our areas for improvement, we know that ABE is our weak spot, for example. ESL, we're great. CTE, we're great. But ABE, we're beyond lousy, so that's where we really need to work and focus on improving things.
So we're going to say, for the sake of argument, ABE is kind of a hot spot. That's where we need to improve. Within our ABE program, we'll find hot spots within. That is, within our ABE program, there'll probably be a few classes, a few teachers, a few groups of students, that are way worse than others. That's kind of like the fire that we need to put out. Everybody in ABE is a little below average, but oh, my gosh, these two classes are way worse than anybody else. That's the fire we might need to put out right away, where our data is noticeably worse than even our already poor performing areas that we already need to improve, so to speak.
Similarly, we'll find diamonds in the rough. Hey, ABE is a basket case, but here's that one token teacher or one token class, or maybe one token cluster of students, where even though we know we stink at ABE, this proud but substantial minority is doing a bang-up job. Even though ABE overall needs lots of improvement, that's also something we really want to look out for. How is it that these students or these teachers or these classes are doing so exceedingly well amidst the rest of this, where ABE is just a conflagration that needs a lot of firefighting or whatever?
How are these students or these teachers doing so great when everything around them is so stinking [audio out]? What is it that explains those contradictions? What-ifs are another one. Hey, what if these students just made a little bit more gain on their post-test? What if we just got a few more hours? What if we just cleaned up this one small cluster of our data? Would that allow us to do a lot better? Look for things like that. Are there any what-ifs out there where, if we just made a few changes around the margins, that would probably make a big difference in our data? Ask those what-if kinds of questions to see if there's anything like that that might flip the script for you.
And then there's another concept I like called neighbors. This is one that I've got to admit might be a little bit better for NRS evaluation than CAEP, because it's kind of looking at it from that NRS EFLs perspective. But it's kind of setting it up as the wrong side of the tracks, where, hey, you might say our high levels of ABE and ASE are on the wrong side of the tracks. Everything else at our agency is outstanding, but our ASE and high-level ABE are really rough. OK.
Hang on on that one. There's a lot more to follow on that. If we feel like at the end of the presentation we've got nothing to go by, then we'll kind of backtrack on that. But yes, we're presuming we've done some of those initial areas, like what we do for NRS. I'll back up. To some extent, there are some real basic ones we do for NRS that might be good to do. I'll keep that in mind. Sorry, somebody asked me a private question. But it's a really good one, so I felt like I needed to speak to it.
Anyway, so neighbors-- that's a little bit more NRS related. So it probably comes up as, hey, how are our high levels of ABE and ASE so bad? But you look at ABE III, intermediate ABE, and everything's fine. Look at neighbors: if ASE is really, really bad, but everybody in ABE is fine, why is one good but not the other? You'd think they'd be about the same. Now, comparing low level ESL with ASE, well, that doesn't help at all. That's apples and oranges, and if it's apples and oranges, it ain't going to really help.
But if it's tangerines and oranges, then hey, you don't expect it to be the same, but you kind of expect oranges to at least be a little bit like tangerines. How on Earth can tangerines and oranges come off more like apples and oranges? You'd expect tangerines and oranges to be similar, but they're not. Why are our tangerines really, really good but our oranges are so stinking lousy? You'd kind of expect tangerines and oranges to both be good or both be lousy.
I'll just say, did that analogy make any sense at all, or is that one that I just have to admit went over like a lead zeppelin? Any instant feedback there? Has everybody followed that on neighbors? Nobody's listening. All right, never mind. All right. So let's go on here. All right. Thank you. I can always count on Connie to answer my goofy questions.
Anyway, so we're going to change gears here now. Those are just kind of things to be thinking about. That'll keep rearing its ugly head for the next 50 minutes, like a lead zeppelin. Yeah, you know, I was thinking L-E-D, but hey, it can be lead. You know, whether it's Led, L-E-D, or lead, L-E-A-D, it's both. Anyway, that was a John Entwistle quote, seeing as how you asked.
Anyway, goal setting and targets. So here are our official CAEP targets. So we have mandatory consortium level metrics, mandatory member level metrics, and then that list of 10 optional metrics. I think Neil broke this down a little bit yesterday. If not, I know he has in the past. If not, I know I have in the past. But for now, we're just giving you this basic list. We'll kind of break down these different sections as we go along.
So we will break down the first one right away. That is the mandatory consortium level metrics, where we get a list of four barriers, I'll say. Mandilee or Holly, can you possibly confirm or deny that I've got the right barriers? I'm pretty sure I do. But again, the mandatory ones are kind of the biggest-- whoops, didn't mean to do that-- low English literacy, low literacy, low income, and long-term unemployed. You don't need to do all four, but you do need to choose one.
And the one that you choose, to my understanding, is pretty flexible-- whether you want to set it up around serving more individuals with that barrier, or whether you want to orient it more toward improving performance for those that have that barrier. More on that in a minute. And then that enrolled adults metric, that's the reportable individuals. So you have adults served. That's basically everybody who shows up on the CAEP summary.
One subset of adults served is what CAEP is calling reportable individuals. The reportable individuals are those that get at least one hour of instruction. So here the saying goes: if they only have one hour of instruction, is it Miller time yet? Well, absolutely not. But if they have one hour of instruction, we're all willing to agree that's better than having zero hours of instruction. And we know there are a lot of CAEP students with zero, so we're keeping the bar really low, because it's mandatory, number one, and we're sticking with the consortium level, number two.
So all we're really doing is looking to move people over from the adults served at zero hours bucket to the reportable individuals, or one-plus hours, bucket. Does that description of the metric at least make sense? We're not talking about any analysis yet, but just describing these metrics is sometimes inherently confusing in itself. Does everybody feel like you at least understand what these metrics are? OK, so we shall move on. Thank you, Connie and Diana.
So here are our TE solutions. We have two barriers to employment reports. One for NRS, one for CAEP. The short answer is these two reports are, of course, not exactly the same, but you certainly got to admit they're very, very, similar. The difference of-- back to more who's buried in Grant's tomb sort of answers. The short answer is NRS barriers to employment helps you for your NRS reporting. Your CAEP barriers to employment report helps you for CAEP reporting. That, of course, helps you like not at all.
But you can see by the screenshots: for NRS, it's set up like Table 4, where it gives you the barriers broken down by those 12 levels-- that is, the six levels for ABE and six for ESL. CAEP barriers to employment is set up more by program: here's how many have ABE, ESL, CTE, workforce prep, and so on. Why provide two sets? Well, obviously the safe answer here is, use CAEP if you're not sure, because you're setting goals for CAEP.
But digging a little deeper-- my two cents, this is not the one right answer, just my opinion-- if you're looking at basic numbers, that is, you're not really worried about how they're performing, you just want to commit your agency or consortium to serving more individuals with barriers, I would definitely say the CAEP report is a lot better if you're just trying to do bean counting. You absolutely want to make sure those students in workforce prep and CTE and adults with disabilities are included here. You want to capture everybody. So use the CAEP report. Make sure those non-WIOA programs are included if that's what you're looking for.
On the other hand, if your focus is hey, we already know we're doing a great job with barriers, but we need to focus more on the level of performance for those students with barriers, then that NRS style report might be more helpful, where you can focus on specific areas. Hey, you want to focus on just your higher levels of ABE. We want to stick with low level ESL students or whatever. Then using that NRS version of this report might be more useful because it's a lot easier, I think, to track performance when we can dig down and look at those NRS EFL's.
Does that at least make sense? Yes or no? You know, I hadn't really brought that explanation out before. Did that make any sense? I'm really curious about that, since I hadn't really broken it down like that before. I've really sort of presented these as all the same in every presentation I've done up to this point in time, as far as I can remember. So there you go.
So if you want more detail, where if your focus is really on performance-- that is, I know some of you don't really need to show that you're serving lots of students with barriers. A lot of you know that you're already doing a spectacular job with that. By definition, those low literacy and low English literacy and low income comprise the overwhelming majority of who we serve already. We already can fly the flag proudly there. But hey, when we look at outcomes, employment, pre/post gains, all that stuff, hey, we're not so great there.
So then you might want to run the ad hoc NRS cross tab that allows you to pick and choose whatever result you want. You can run it by barrier, obviously. You can run it for just about everything else under the sun too. But I'll present this as the solution here for barriers to employment, where if you want to run it and show here's how many students by barrier got jobs, here's how many students by barrier earned a high school diploma, or anything else for that matter, you can basically run that NRS ad hoc cross tab report and set it up so you can organize it by barrier or you name it.
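If it helps to picture what a cross tab is doing, here's a toy version of the concept in pandas. This is not the TE report itself, just the same idea of a result broken down by barrier, and all the data here is invented:

```python
import pandas as pd

# Toy illustration of the cross tab concept: one row per student, with a
# barrier flag and an outcome. All data is invented; the real numbers
# come straight out of the TE report.
df = pd.DataFrame({
    "barrier": ["Low Income", "Low Literacy", "Low Income",
                "English Language Learner", "Low Literacy"],
    "outcome": ["HS Diploma", "Got Job", "Got Job",
                "HS Diploma", "No Outcome"],
})

# Count of each outcome by barrier, like picking a result on the
# NRS ad hoc cross tab and organizing it by barrier.
print(pd.crosstab(df["barrier"], df["outcome"]))
```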
So I'll just say, is this enough? I mean, that could be its own 15 or 20 minutes, and we don't have to dig in and go step by step. There are several other places like that, so I'm not going to dig in any more than I am here. But here's the report, with a few screenshots to give you the general gist of where you can go with it. That puts us in transition here.
So here's the first salvo here with questions we can ask. A few more of these are going to show up. Boy, oh, boy, aren't you excited? So starting with barriers. So that's our first big ticket option here. Hey, barriers to employment, like it or not, we're going to need to address with this goal setting. So here's some barriers to employment, specific questions we really ought to ask our consortium, our team, or at least ourselves. Hard to say which. But anyway, what's our track record overall?
Some of you, I know, have been doing a bang up job with barriers since its inception in 2016. Some of you are sending me handouts that you developed back in 2016 to show what a wonderful job you were doing collecting barriers. Hey, look at us. We're doing a great job. Share this with others that aren't doing a great job. So there's some of you, I know, have done really, really well with collecting barriers for five years. So hey, knowing that is good.
If you know you've done a great job collecting barriers for five years, for example, you probably don't need to set a goal to make sure you're collecting barriers for every student because you're already doing that. You've already done that for five years. So that suggests you really need to focus on performance because you're already there in terms of collecting it.
Others of you, maybe in 2016, were, barriers? What barriers? What are those? You've jumped on the bandwagon more recently. So then maybe you really need to focus on collecting it for everybody and just being able to show evidence that you're serving these students. That's really what you need to do right now. You can worry about performance later, once you've figured out how to collect it for everybody that has them.
Then over here, we can look at some deeper questions within our group. Are there any data groups within our consortium that might be better or worse? Hey, we think we're doing great with barriers, but looky there: for ABE and ESL, we're like at 99.99%, but all our non-WIOA programs, CTE and workforce prep, we're just barely over 50-- five-zero. So maybe we target our non-WIOA programs to get them up to speed with our WIOA programs.
Or maybe it's not like that. Maybe it's just some head scratching issue where everybody is great, but there are those laggards in ABE again bringing everybody else down. So maybe we need to focus on them. Maybe some barriers have better representation than others. Hey, here are these barriers we keep collecting, but these other barriers, which should really look just as good, we hardly ever report.
Maybe we dig down in terms of performance. Most of our barriers, there's really not much difference. But for low income, for whatever reason, their pre-post gain rates are way lower than everybody else. Their job and wages rates are way lower than everybody else. Why are these barriers so much worse when other barriers, quite frankly, we can't really notice any stinking difference?
And then here's our what if. Are there any that appear underreported? We know our data is fine. But hey, wait a minute. We need to go outside the box for these green questions. We're in an economically depressed area. We're in an area that's been economically depressed for as long as I've lived in this community, and that's for a long time. We should have low income students coming out of our ears. But we only have 14 low income students out of 500 reportable students.
That can't be right. There's got to be some basic step we're missing, where there are students we should be reporting that we're just flat out not reporting. There's no way that we can be only 3% low income in an area that's been economically depressed since the 1960s or whatever. That just can't be. So those are also questions we should be thinking about. Moving right along.
OK, so moving on to the next topic. We're looking at enrollment here. So for enrollment, that's that Enrollees by Hours report. Here we're breaking down, you know, that services only section of the CAEP summary-- all those enrollees. Here's how many of those have 12 or more hours. Here's how many have one to 11. Here's how many have zero.
What we're looking at for this particular metric, of course, is where the red and green arrows show up. We have this many in one to 11 hours, this many in zero hours. This consortium-wide metric relates to getting them out of the zero hours bucket and at least into that one to 11 hours bucket. That's what we're trying to do. Back to Diana's tire kicking: hey, we're at least getting them to where they're in a CAEP funded program, as opposed to not being in one. Those are the two columns we're comparing.
So again, the same thing here. We're comparing the one to 11 column with the zero column. I'll just say, here's the math on the slide-- talking out the math just confuses everybody, so I'll leave it at this: if you're trying to figure out these percentages, the slide I'm showing you right now incorporates columns F, G, H, and I together. That's how you compute these percentages for the one to 11 hours and the zero hours and so on.
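To make that math concrete, here's a small sketch. It assumes, per the description above, that column F is the services total and G, H, and I are the 12-plus, one to 11, and zero hours buckets; the counts themselves are invented:

```python
# Rough illustration of the percentage math described above, assuming
# column F = services total and G/H/I = 12+, 1-11, and 0 hours buckets.
# The counts are invented for the example.
col_F_total = 200    # services section enrollees overall
col_G_12plus = 50    # 12 or more hours
col_H_1to11 = 60     # one to 11 hours
col_I_zero = 90      # zero hours

assert col_G_12plus + col_H_1to11 + col_I_zero == col_F_total

pct_1to11 = col_H_1to11 / col_F_total * 100
pct_zero = col_I_zero / col_F_total * 100
print(f"one to 11 hours: {pct_1to11:.1f}%, zero hours: {pct_zero:.1f}%")
# The consortium level metric is about shrinking that zero hours share.
```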
OK, and then here's the CAEP summary. I'm kind of stepping out here. So actually, let me stop here for a minute. A weird time to stop and sanity check, but let me sanity check for a minute. How are we looking? Is everybody hanging in here? Yes or no? How is this looking here? Good or bad?
Diana Batista: More or less. But I have a question. Can you go back one slide?
Jay Wright: So I'm sorry.
Diana Batista: Can you go back to the TE hours? Because when I printed my hours, I got NOVA program hours. So what is that report?
Jay Wright: This is from TE. This is our TE Enrollees by Hours. There's a very similar one called TE Services Enrollees.
Diana Batista: All right. So that's not a CAEP report.
Jay Wright: No, it is a CAEP report. If you're looking straight at the menu, this might be one where you need to go into that setup window and go to that report selection. I know it's something we need to do better, but it is what it is. There are a couple of these that you need to access by going to that report selection in the CAEP tables report setup window-- not all of these are right there on the TE menu.
Diana Batista: OK. It looks like everyone is--
Jay Wright: What you're asking.
Diana Batista: Yeah, everyone else is on track. So I'll keep playing around. Thanks.
Jay Wright: OK. So anyway, here's the CAEP summary. So I'm stepping out here for a minute. Yes, I am officially wasting everybody's time. I'll say, I don't know if this is still a problem. But it was in November, December, when we were first talking about CAEP Enrollees by Hours and CAEP Service Enrollees by Hours.
So what we're trying to do is show you how to connect Enrollees by Hours and Service Enrollee by Hours to the CAEP summary. So admittedly, this is assuming that everybody is pretty good with the CAEP summary. It's assuming you know nothing about those new reports. But it is assuming you're already pretty knowledgeable about the CAEP summary. I know that's not true of everybody, but I think that's probably the default situation with your average CAEP user in TE.
So what we're doing is we're matching the CAEP summary with the CAEP Enrollees by Hours and just matching columns. So we'll start with the easy example. And quite frankly I think the easy example is what causes all the confusion. In the first example, it's super easy. We're looking at column B on the CAEP summary. That is the column that has the total number of enrollees in that literacy gains section and comparing it with that Enrollees by Hours. And what do you know? That's also column B. So it's kind of coincidentally super duper easy and super duper convenient, where column B on the CAEP summary is exactly the same information as column B on the Enrollees by Hours.
So that kind of leads you into a false sense of security, perhaps. So now, we're going to look at the next example. The next enrollees bucket appears as column E. That is, the number of enrollees that show up in the CAEP outcome section of the CAEP summary. And here I go again being a little bit obnoxious: I want to find out, are we still hanging in there? Do we still know what we're talking about right now? Yes or no? Sorry, I'm going to be a little bit of a stickler on this one. OK, thank you, Corinne, Maureen.
All right. So I just want to make sure. So again, column E on the CAEP summary is the same information as what appears as column C as in Charlie on this new Enrollees by Hours. So you can see there's the-- are you showing parts of an overall? Yeah, I'm showing the CAEP summary on the left, the CAEP Enrollees by Hours to the right. So column E, that's the CAEP outcome section of the CAEP summary. Column C, that's the CAEP outcome section of that newer CAEP Enrollees by Hours. Hopefully that's your question, Connie.
And so final example, here is column M in the services section of the CAEP summary. That equates to column F in this new Enrollees by Hours. Again, 338 enrollees on CAEP summary column M is the same as 338 enrollees on CAEP Enrollees by Hours column F. Again, I don't know if many of you have been using this much lately. But back there four or five months ago when these reports were new, this question did come up a couple of times. And again, the column letters don't match, so it does cause a bit of confusion.
But this is just to show you this is where that data comes from. If you're familiar with the CAEP summary, but you really don't know these newer reports, that's kind of a good way to get your foot on the rail to show you, yes, this is all data you already know and love. It's just taking it a step deeper than what we get from the CAEP summary. Yes, those are-- no, those are duplicated.
Sorry, let's go back here. So good question. So looking at the CAEP summary overall, I'm showing you the full poop here. I think the full poop is necessary for this one. So let's just look at column E because we've been dwelling on it a little bit. So I'll say, everybody, but especially Karen, you see the 220 enrollees in ESL and the 215 in ABE and all that. So everybody, but especially Karen, do you see this? Yes or no? Thank you.
So those are our column E numbers. You could use B or M; it's all the same really. But I'll just go on to say we've got 220 in ESL, we've got 108 in CTE, and so on. So you add that all together, and it equals 571. That's our duplicated count. So again, it's intentionally set up to be duplicated within program and duplicated across programs. So if you have somebody enrolled in ESL and they're also enrolled in CTE, they'll show up in that 220, and they'll also show up in that 108. That's on purpose.
So you get that duplicate number of 571. If you look at the bottom, that's where we get the unduplicated. You can see 571 overall, 76 in two or more programs. So the unduplicated number is all the way to the bottom. That's that 466. So good question. But I'll make sure. Do you understand the answer now, Karen? Everybody? Thank you. Great. Thank you for responding. Everybody else is free to answer, but you asked the question, Karen, so I was looking for your thumbs up more than anybody else. Thank you.
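For anyone who wants the duplicated versus unduplicated logic spelled out, here's a tiny sketch with made-up enrollment records. A student in two programs counts twice in the duplicated total but once in the unduplicated one:

```python
# Tiny sketch of duplicated vs. unduplicated counts, using made-up
# enrollment records. Each (student, program) pair is one enrollment.
enrollments = [
    ("s1", "ESL"), ("s2", "ESL"), ("s3", "ESL"),
    ("s1", "CTE"),               # s1 is in two programs on purpose
    ("s4", "ABE"),
]

duplicated = len(enrollments)                         # counts s1 twice
unduplicated = len({student for student, _ in enrollments})
extra = duplicated - unduplicated                     # extra enrollments

print(duplicated, unduplicated, extra)                # prints: 5 4 1
```

And if I'm reading the screenshot right, 571 duplicated minus 466 unduplicated is 105 extra enrollments spread across those 76 students in two or more programs, which tells you some of them are in three or more.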
So moving back to our little list of stupid questions over here. So here's another round of them. For the one-plus hours, the obvious one: has this always been an issue? Have we always had a lot of those students kind of clunking around without POPs if it's NRS? A lot of students clunking around the services only section if it's CAEP? Is this something that's always been a problem around here? Or is it something that we haven't really noticed before?
And I got to say this. All of these could relate to COVID. But this one, I think, is especially one that could be a big issue now that's crept up since COVID. Maybe five years ago, everything was a tight ship, super clean. But COVID at some agencies, I think, has probably created a lot more clutter in some cases. So the COVID question, really, I think, relevant to this particular issue.
And then again, just like we looked at with that issue with barriers and general performance, are there specific programs that really look bad? Hey, ESL runs a really tight ship. You know, CTE runs a really tight ship. But that ABE and ASE, everything is all over the map. Why is ABE so messy when everything else is kind of neat and tidy? And then here's the big one. This is the outside the box question: are these students really service only students, or does this just mean we've got dirty data? It could be either one, but we can kind of look.
Well, somebody's got a big one here. I got to look because it's a big one. OK, you're just helping. All right. So sorry. That was the one question I had to look because I thought maybe I was missing something big. I think you're trying to help everybody else. So anyway, outside the box.
So back to that service only section. Sometimes you might have a lot of students at your agency with zero hours because it's reflecting what you're doing right. That is, you just have a lot of students that are service only. You have a lot of students that came for CPR training, and that's all they intended to get. That's just something you've got to know about your agency. You know, that is, are you offering a lot of those short term services? And so thereby, it makes sense that you've got a lot of students that are receiving services but not instruction.
Or is it more like, that can't be-- we do services, but we only do a little bit of services and a whole lot of instruction. If so, it shouldn't really be that we have a large number of service only students. It's probably way more likely to just be incomplete data. So hey, do we have a lot of good reasons for this? Or does it just mean we've got a lot of data that needs to be cleaned up? You be the judge of that. But I'll just say that's the big question here.
Sometimes we get a lot of zero hours because we just have a lot of students with zero hours. Other times we get a lot of zero hours because we just need to clean up our data. I'll stop there. Does anybody know what on Earth I'm on about with that green question? Or is that just another two or three minutes nobody's ever going to get back? Got to ask. OK, thank you. All right. Somebody does. Good.
OK, so let's move on to our member level metrics. Again, these are mandatory at the agency level. Ones related to funding, I'm not going to get into that one. But the number of enrolled adults that become participants, so that's just moving the bucket one over where instead of being worried about moving them out of zero and into one to 11, now we're worried about moving them out of one to 11 and moving them into 12-plus. So once again, that's another job for the new CAEP Enrollees by Hours report.
The report is the same, but the highlighted arrows are slightly different. We're looking over here at the 12 hours and one to 11 hours buckets. I'll point out, though, there's a little bit of a sleight of hand here. You could do it different ways. My opinion: if you're looking at that first metric, where you're moving them from zero to one to 11, my recommendation is use that services section. So I'll back up. Again, use the services-- that is, F, G, H, and I-- for that one. Whereas for this next one-- sorry, I need to go further back.
Use G, H, and I over here. Use that section called services to measure the first metric. But now that we're looking at the next metric, I'm suggesting we want to use outcomes for this one. So use columns C, D, and E for this one, not G, H, and I like the first example. Again, if we know that we're looking at one to 11 versus 12-plus, we want to look at students that have hours, and the outcomes section eliminates those with zero hours. For the first one, we were looking at zero versus one to 11, so the services section, to some extent, de-emphasizes those with 12 or more hours.
It just allows us to get to the heart of the matter better by looking at this outcome section for the second metric. That's a little bit of an advanced tip. Anybody know what on Earth I'm talking about with that? That one, I'm not sure about. But I will point out: for the first metric, I say use the services section. For the second metric, I'm going out of my way to say use the middle outcome section, not the services section. Again, the concept is the same. We're looking at moving them out of column D and over to column C. That is, getting more and more students to accrue at least 12 hours' worth of instruction.
So here's a way where I'm doing the math. I find that when I try to explain the math, it just messes everybody up. But I am going to leave this example so you can go back and look at it. It's easy to figure out when you look at it; it's hard to figure out when you're listening to my confusing explanation. So I'll go back.
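To make that math concrete, here is a minimal sketch, assuming the bucket layout described above (one bucket for 12-plus hours, one for one to 11 hours). The variable names and figures are made up purely for illustration; this is not an actual TE export format.

```python
# Sketch of the participant-rate math: what share of enrollees with any
# hours have crossed the 12-hour threshold to become participants?
# All figures below are illustrative.

hours_12_plus = 480    # students with 12 or more instructional hours
hours_1_to_11 = 120    # students with 1 to 11 instructional hours

enrollees_with_hours = hours_12_plus + hours_1_to_11
participant_rate = hours_12_plus / enrollees_with_hours

print(f"Participant rate: {participant_rate:.1%}")  # Participant rate: 80.0%
```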
Here's a bunch more stupid questions for you. You know, obviously, with 12 hours already being an issue, you've heard a lot of information here in the last six months or so. Lots more information has come out about 12 hours. Maybe you've heard somebody like Thoibi Rublaitus talk about it. Digging deep: is this something that you've always had a lot of trouble with? Or is this something new? COVID could be rearing its ugly head with 12 hours. I kind of feel like COVID is probably rearing its ugly head more with that first metric we talked about than this one. But it's still a big problem with this one too.
And then dig deep. Is our big problem that we have a lot of people with zero? Is that the majority of those students with less than 12? Or do we have a good handle on that, and most of our students with less than 12 are a true hours issue, where they have hours; it just happens to be a number less than 12? We've looked at this a lot more lately. Generally, if you've got a big number in one to 11, that suggests a different strategy than what you would do if it's just a bunch of people with zero hours.
And then, again, break it down. Does ABE do better than ESL? Does ESL do better than CTE? Are there any differences between high level students or low level students? Maybe it's by zip code. Maybe it's a special program. Turn it over and over, and find out where the pockets of strength and pockets of weakness are. And then, hey, the what if. If we just tacked on a couple hours to every single student, would that magically improve our performance? Or is it a deeper rooted issue than that?
OK, and then here are the optional metrics. Again, somebody yesterday said, gee, why aren't we doing more on this? So again, you can do everything. You can do pre/post. You can do employment. You can do transition. You can do high school diploma. You name it, you can do it. But again, these are all optional with the idea that, hey, some of you might be doing I-3, but some of you aren't. Some of you might have all ABE and ESL, so you're doing pre/post-testing. Some of you might be more CTE dominant, so it's not as relevant. Whatever.
So we'll pick and choose a few of these. We'll start with diploma and HSE. That's obviously a bread and butter category. If you're doing HSE for CAEP purposes, use the CAEP Summary. Column H gives you the total number of high school diploma and HSE outcomes. So a few data mining tips here. Again, it's column H. You know, we looked at some of those setup window options earlier.
I'll just say, this view on this screen right this moment is the default view in TE. The default view in TE is set up to break down your program areas exactly the same as the way they set it up in NOVA. That just to us seems like the default way to do it. So right now, if you just run it, it'll run it just like NOVA. But if your agency or consortium decides to target HSE or HS diploma outcomes, I would strongly advise to not set it up like NOVA.
Go to the setup window and break it out to make sure that you've got separate columns for ABE and ASE. Furthermore, I'd actually say break it down even more than that, so it's like the real world, not the CAEP world. You know, CAEP is just kind of different from the other 99% of the planet, which treats ABE, HSE, and HS diploma as completely different categories.
We can stick in CAEP land for most of this, but for these outcomes, we really have to be like the other 99% of the planet and have these broken down into separate programs. That is, ABE is one row, HSE is another, and HS diploma is yet another row. If we're looking for high school diploma outcomes, we probably need a row that shows our information for instructional program high school diploma, and only that program.
So that's what I would recommend you do. If it's HSE, you need to look at program HSE, and so on. That's a tricky little suggestion. It's not that important for every single outcome, but I've got to say it really is for this one. And then, again, you can go to that outcome summary. That breaks down every individual outcome if you want to itemize it out between HSE and diploma, or you can also itemize it out between GED and HiSET or things like that.
OK. Whoops, I went the wrong way. Sorry. OK, so here is our batch of stupid questions again. Again, longitudinally, what's our history with HSE and diploma? Maybe we've offered it forever. If so, have we been good with this forever, or not so good? For some of us, I know we've been offering this for just a short while, so we don't have a track record. That's fine, but knowing that obviously is going to be an important factor to consider when setting goals.
Are there any programs outside of HSE or diploma? That is, are we magically getting CTE students earning diplomas? Are we magically getting ESL students earning diplomas? That's where we want to look outside the usual programs. What happens when we break it down into separate programs? Does our data change a lot when we make that setup window change I just suggested? If it's a big change, well, then we really need to dig into that and figure out where the differences are.
And then here's our outside-the-box question. Are there students maybe just a subtest away from passing? Are there learners in high school diploma with just a few remaining credits? Some of you will be able to find that information easily. For some of you, that may be pie in the sky. But look: how many of our students in diploma or HSE are really close to completing? Are there large numbers of quote, unquote, close calls where it might really make sense for us to close the gap and help them cross the finish line, so to speak?
OK, so here is the next one, post-secondary. We can look at post-secondary achieved. That's column I. It's a similar answer to HSE, just a different column. You can use the outcome summary to target specific outcomes. I've got to say, for this one it really depends a lot more on what your agency has been doing. Some of you, I know, have been very dutiful bean counters. You've really been making your list and checking it twice to figure out what the heck the difference is between occupational certification and occupational licensure, for example.
Others of you have just been dumping it all into one bucket willy-nilly. Hard to say what you've done. But I've got to say, this is the ultimate input/output type thing: if you carefully organize this, then you can carefully check it out. If you've just put it all in one place, make sure you check to figure out which of those different bubbles or boxes you've been using. To keep it consistent, use that outcome summary to itemize it out.
Again, for some of you, I think this will really be a good thing to track. For others, it might be really tricky. Same for transition. Again, it's the same information I'm giving you, but a different column. In this case, column L. You can differentiate potentially between transition to college and transition to CTE. So here are, again, the same questions. For transitions and post-secondary, in my opinion, the process you're using is going to matter a lot more here than in other categories.
That is, have you always been reporting these transitions the same way? Or have you been manipulating or making adjustments to the way you've been doing this? If you've been making a lot of adjustments, I'm not saying you're wrong. Because I've got to say, I guess the point I'm making here is things like HSE, diploma, pre- and post-testing, enrollment, those are things we've been doing forever and ever and ever. In our data, things like transitions and post-secondary are comparatively newer developments.
So your average agency out there really has it down, in terms of what the best way is to track pre- and post-test results, what the best way is to track HSE, what the best way is to track basic student enrollment and hours of instruction. But for transitions and, you know, CTE outcomes, that's a newer development. So we're at a lower stage in the process at most agencies with this. So figuring out our system has been difficult. If you've been kind of trying to find your footing with this, that's going to affect your history.
You just need to know that what we did in 2018-19 for transitions and post-secondary might have been a different method than what we're doing now. For things like high school diploma and pre- and post-testing, I wouldn't expect very many people to have that issue. The way you've done pre- and post-testing has probably been the same for 20 years. The way you've been doing high school diploma and HSE has probably been the same for 20 years.
I'll stop. Sanity check. Does everybody know what I'm talking about here, or is it more authentic frontier gibberish? Not really sure which one it is with me. Let me stop. Just take a breath of air, if nothing else. See if anybody takes the bait. Same since caveman days. OK, you're helping me out with that. Thank you. OK. We're OK.
All right, we're creating charts. All right. Thank you. Great, great, great. OK. So I'm just bringing it up. Yeah, with some of these transitions, there's a big issue where, when we're looking longitudinally, we need to look at the process, not just the data. What transition has been our focus? This has really been more about what your agency and your consortium have decided is the priority.
Some of you are just into all transitions. But a lot of you, when you're saying, yeah, we're all about transitions, a lot of times that really means college. Other times that really means CTE. It depends on which transitions have been a big deal for you. All right, the data gathering is different. It's difficult because you've got those 12 different bubbles that all seem like the same thing. So a lot of times it's just deciding which bubble we're using at West Contra Costa, which bubble we're using at Berkeley, and sticking with it.
That's what I mean by process. A lot of you are figuring that out. Some of you have done it the same way for a while. Anyway, look at the different areas. College versus CTE. You know the others. Again, kind of the same question: are there maybe some specific transitions that have been priorities for a while, that we know we've been capturing well for an appreciable amount of time? If we've been capturing a specific transition the same way for a while, I'm just saying that might present itself as a better item for goal setting than an area of transition that we've only been capturing well for the last six months.
But until we really got into '21-'22, we know the way we were doing it was a mess, so we know we don't really have any baseline worth the paper it's printed on. Again, if you're in that position, you're just like everybody else. You're not in trouble. That's totally A-OK overall, but it does mean that it's probably a bad area to set as a goal, because you don't have reliable baseline data. That's back to my pontificatements.
Hopefully, that makes sense. OK, thank you. Janie gets it. Thank you very much. Marie, short answer: if you're doing EL Civics, got to [audio out]. This is another one where, hey, if we've done this, [audio out]. So again, in CAEP land, that's called passed I-3, column F. That relates to those COAAPs. It also relates to those citizenship assessments. So we've got all kinds of detailed I-3 reports.
The good news is that AB 2098 has mandated a lot of things we've had to do for this. So because of that, we've done a lot related to I-3. We've had that 2098 umbrella we've been able to use for coverage to get a lot of good work done in this area, if I do say so myself. So there are lots of good detailed I-3 reports at both the agency and consortium levels to review that information.
Here's another math example, where we're just comparing the number that achieve this outcome against the enrollees. I'll just say, whether you're doing HSE, post-secondary, or transitions, the math is basically the same as what I'm showing you here for I-3.
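Here is a minimal sketch of that ratio, assuming the outcome-versus-enrollees comparison described above. The helper name and figures are illustrative only, not part of TE or the CAEP Summary.

```python
# Outcome rate: students achieving an outcome divided by program enrollees.
# The same ratio applies whether the outcome is I-3, HSE, post-secondary,
# or transition; only the column you pull the numerator from changes.

def outcome_rate(outcomes: int, enrollees: int) -> float:
    """Share of enrollees achieving a given outcome."""
    return outcomes / enrollees if enrollees else 0.0

# e.g., 85 I-3 outcomes among 500 ESL enrollees (made-up figures)
print(f"I-3 rate: {outcome_rate(85, 500):.1%}")  # I-3 rate: 17.0%
```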
OK, and then here's a bunch of questions related to I-3. First off, I should have asked the initial question: is our agency or consortium doing EL Civics? That's kind of what I'm asking. Those of you that have been doing EL Civics since it started in 2001-02 are going to have a different way of doing this than those of you that really haven't done EL Civics and are doing it only because you want to get numbers in I-3 for CAEP. Have you done anything new that's I-3 related?
Or are you really doing the same thing you've always done for EL Civics and hoping that through osmosis some of those EL Civics outcomes will transpose? I know there's a lot of you in both buckets there. It's not right or wrong, but you should just know which way you're going. And then, just like we do for pre/post, are there classes, teachers, or groups of students that look better or worse than others? Are there any specific COAAPs or I-3 categories that are better or worse than others?
OK, and then, kind of like the what-if: hey, if every student did just one COAAP, would that make any difference? Yes or no? Again, just saying. So there are lots of ways we can tackle I-3.
OK now, pre/post-testing. We've got 5 minutes. So again, we're looking at this left-hand section. These are examples that we've shown a lot over the years. When we're looking at our pre/post persistence and performance rates, we use that left-hand section of the CAEP Summary. We look at columns B, C, and D: the number of enrollees by program is in column B. And specifically, it's the number of enrollees that qualify for NRS and have that qualifying pre-test.
So the numbers that have a pre- and post-test are showing up in column C. The numbers that make the gain in completed NRS level are showing up in column D. So if we want to figure out persistence, we're just comparing column C to column B. Again, here's the math. Column C divided by column B equals our persistence rate. For that one, I can say 70% or better is a good way to look at it.
Again, in the COVID era, that's not true. So for the COVID years, I don't think we can really give you an answer. But for 2019 and earlier, for 10 years or so, from about 2011 to 2019, our persistence rate was right around 70%. So in general, 70% is still a good number to use. And then if we're looking at performance rate, we're doing column D as in delta divided by column B as in bravo. That's our performance rate. That's showing the percentage that didn't just complete the pair, but actually made the gain to move to the next level.
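Here is a minimal sketch of both rates, using the column definitions described above (column B: enrollees with a qualifying pre-test; column C: enrollees with a pre/post pair; column D: enrollees who completed an NRS level). The counts are made up for illustration.

```python
# Persistence and performance rates from the CAEP Summary's left-hand section.

col_b = 400  # qualifying NRS enrollees (pre-test on file)
col_c = 290  # enrollees with a pre- and post-test pair
col_d = 180  # enrollees who completed an NRS level

persistence_rate = col_c / col_b  # 72.5% -- at or above the ~70% pre-COVID norm
performance_rate = col_d / col_b  # 45.0%

print(f"Persistence: {persistence_rate:.1%}")
print(f"Performance: {performance_rate:.1%}")
```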
OK, and then here are more questions. If we want to focus on that, and we're WIA/WIOA Title II, we can go to the data portal and really do a lot to figure out our history, in terms of whether we've done well historically or not. And then we can dig into the myriad NRS performance questions we've looked at in other sessions, when we dug deep for NRS. Specific teachers, classes, special programs, and so on.
If we're looking at it from an NRS point of view, we can look EFL by EFL. Maybe there are EFLs that really look better or worse. And then here's what I like. In NRS land, we used to call this chasing the 209s: hey, if every student just did one point better on their post-test, would that make a big difference in our NRS performance data? Or would it not move the meter much? Those are good what-if questions to ask. Do we have a lot of close-but-no-cigar students? Or is everybody a long way away?
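Here is a rough sketch of that "chasing the 209s" what-if. The cut score of 210 and the list of scores are hypothetical; real NRS cut scores vary by test form and level.

```python
# What-if: if every post-test were one point higher, how many students
# would cross into the next NRS level?

next_level_cut = 210
post_test_scores = [209, 205, 209, 211, 208, 209, 195]  # made-up scores

# Students exactly one point short today would cross with one more point.
close_calls = sum(1 for score in post_test_scores if score == next_level_cut - 1)

print(f"{close_calls} of {len(post_test_scores)} students are one point short")
# 3 of 7 students are one point short
```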
Sometimes solving that question just gives us good context to tell us what we might need to do next in order to effect improvement. OK, so I'll just say-- how did these questions-- is B-- Yes, column B. And a pre-test. Right, that qualifying pre-test. Column B basically shows those that qualify for NRS. Short answer is yes, Janae. The difference is that pre-test, though. Column B requires the pre-test. Column E and that outcome section completely ignore everything related to pre- and post-testing. So yes, Janae, you're right.
OK, so quick question. What about these stupid questions, like I'm showing you on the slide now? Is this just kind of telling you the obvious? Does it give you food for thought or not really? Haven't really done it this way before. What do you think about this process? I'll move on while you answer.
But there are just a couple of areas I'm not going to dig as deep on. Employment and wages, that's kind of hard on LaunchBoard. It's sticking with data match only, so it is kind of sitting around and waiting two or three years for that data match to finally show up. In TE, I know you can't really use self-report for this. I've already heard people like Myron, Karen, and Neil poo-poo this. So I wouldn't really use employment. You might have noticed I didn't use those columns for employment and wages because I already know that's kind of being poo-poo'd.
If you want to use TE, though, instead of LaunchBoard, you can use your follow-up employment and earnings survey data. That's getting deeper into the weeds. I don't know how that will work for you. But I'll just say, hey, you can go into those CAEP quarterly survey results if you're doing really well on the survey.
Another good one is to use CAEP outcomes, because it's so large. I'm not going to dig into this. OK, it depends on which cartoon. So looking at these CAEP outcomes, again, if there are any specific ones you really think would be useful for this, I'll mine it and use it for future presentations. But again, if you want to go by outcome, here you go. Especially in the transition and post-secondary sections is where you really might want to specify which transition outcome and/or which post-secondary outcome, based on what your agency or consortium has already prioritized.
CAEP DIR: here is an example of actually parlaying those DIR numbers into percentages. If you're WIOA funded, here's that data portal question. Nora, I think you asked the question early on about baselines. I was probably making a lot of assumptions there. But if you are WIOA funded, go to the CASAS data portal. That gives the averages and goals we've had as a state, dating all the way back to '04-'05. I probably did make a few assumptions on that.
I'll just say, though, for CAEP reporting, we do not have targets. We do not have defined performance goals. So there is a little bit of needing to define your baseline yourself, as far as what that is going to be. My default answer is there are no state level baselines for CAEP. You need to use your agency level baseline and/or your consortium level baseline, which, quite frankly, may mean you've got nothing yet. You've got to build your own baseline and then build off it. That's kind of why I have all these goofy questions and all these sidebars, I guess.
And then another example is if you want to focus on special populations. Here are different ways of looking at it. Maybe we're going to serve more learners with disabilities. Or maybe we're going to make sure that individuals with disabilities perform better. Those are just two different ways to frame that. And then here are some tips about the NRS cross tab. I'm over time here. These were ones that I really wanted in here, just so you can see them. If you want more detailed ways of running this, there's a bunch of examples in these last five or six slides. I'm going to give this a mercy killing now.
And I'll just say, the second go-around is going to be the BNA crane. BNA means [inaudible]. So that giant crane we have there, it's all the way back in the up position. But it's going to go down and do another swoop. A lot of these things we're tackling are going to be tackled again, but it's going to be a deeper dig than what we did this time. So if you missed any of this, you'll get another chance to catch it a week from now.
We're going to do a deeper dive. We're going to dig into these gaps. Chris, you were the gap master, so we're going to look more at these gaps. We're also going to look more at these hot spots and diamonds in the rough. And we're also going to look at it from the point of view of what we need to do when we're talking to our students and staff, not just what we need to do when we're looking at our data reports. And get deeper and deeper. So you can see Mandilee's already done some shameless promotion.
So on that point, I'm five minutes over already. Thank you for bearing with me and hanging in there. I'm sure I'll talk to you all sooner rather than later. And here you go. Take it away, Mandilee. Truly sorry I ran over.
Mandilee Gonzales: No problem. Thank you so much, Jay. And thank you everyone for joining us. I did pop in the chat an opportunity to fill out the evaluation. We would really appreciate it if you just take a few minutes to do that. Along with the registration link for next Thursday's session, again, with Jay.
So we really appreciate you guys being here. I did get a couple of requests for the PowerPoint. And once I can finish remediating that, we'll go ahead and post that to the website and share out, along with the video. So that'll take a little bit of time, but you will see it shortly. All right.
Jay Wright: Thank you.
Mandilee Gonzales: Thank you, guys. Bye.
Jay Wright: Take care.