Jay Wright: OK, hopefully, everybody can hear me. And here I'll blow up the PowerPoint. So you heard Veronica say that this is kind of a Part 2. I'll start with an obnoxiously self-serving question to everybody, and a little bit just to make sure everybody hears me, whatever. How many of you were in the session that we called, quote-unquote, "Part 1," that took place on April 28th? Just curious.

OK, a bunch of you say yes. And to be clear, it's not required. In fact, they're a bit different. I think most-- obviously, a lot of you were in that first one. So you might remember, we called that first one the Cocktail Napkin Session because it was all of those little notes that had been recorded on matchbook covers and cocktail napkins over the last two and a half years.

And so that one happened organically where it just wasn't planned at all. It's what emerged. You might say this is the one that was originally planned that got blocked. So that's what we're doing here today.

And we're also doing it tomorrow-- or next week, on the 25th. That is, Parts 2 and 3 will be more directly related to TE, probably a little bit more like what everybody expects. I'll also add, the one we did a couple of weeks ago, we barely made it in an hour and a half. I kind of think this one will be a little quicker or easier.

So here is the overview. We started the Data Dive on April 28. That's the date that I wasn't thinking of a minute ago, where we really detailed all the outcomes. That was a little bit on CAEP Reporting. But quite frankly, we didn't spend a lot of time looking at TE stuff at all.

On April 28th, we really spent 98% of our time just reviewing those outcomes and trying to clarify outcomes, not really looking at TE Reports. That said, here on May 18, we will be looking at TE reports. And we'll be focusing on Barriers and Equity. I'd say, quote-unquote, "special populations" is the highlight of this session.

We'll be looking at some different ways to highlight special populations in TE, things like barriers to employment, I-3 that we've talked a lot about, and so on. And then we'll get more into Performance Goals next week. I'm sure a lot of you have probably participated in NRS Performance Goals, some of the regular sessions; we've had lots of really good panel discussions the last few months.

So it'll be that same NRS performance goals concept. But it will not be looking at NRS Reports, of course, because it's a CAEP TAP webinar. It will be looking at CAEP reports like the CAEP DIR, CAEP Summary, CAEP Program Hours Report, and so on, using those kinds of things to establish goals for CAEP reporting. So the concept is just like NRS, but the particulars, of course, will be different.

OK, and then here's the agenda for today. We'll review a little bit of what we discussed a couple of weeks ago in Data Dive Part 1. And then here is getting into what we'll talk about today. Obviously, we'll be looking at the CAEP Summary, the CAEP DIR, and an upcoming new one, the CAEP Hours Report. We'll also review those Consortium Level Reports a little bit.

And then we'll get into, not sure what the right title is, but we're calling it Special Features to Isolate Priority Populations. So we'll look at barriers to employment reports, and we'll look at some basic steps you can take to filter any TE Report for specific subsets. For the example, we'll stick to barriers to employment. But it's not all about barriers to employment; it's filtering reports for barriers to employment or pretty much any other field: demographics, education level, outcome, labor force status, you name it.

You can filter reports a lot of different ways. So we'll look at a couple of examples that will hopefully illustrate the larger concept, where we'll look at how you can filter for specific subsets. There's a new report called NRS Ad Hoc Cross Tabs. I admit the title is probably less than tantalizing, but it does allow you to dial up your own report. I admit that beyond the title being a little bit unappealing, it's also a little bit oxymoronic to use an NRS report in a CAEP presentation.

But we call it what we call it because it basically looks at NRS Reports, and it basically includes every single possible TE field under the sun that relates to NRS Reporting. So we'll look at that and dial up your own reports with it. And then, we've also talked about immigrant integration, or I-3, reports in a couple of previous workshops.

We will not cover it in nearly as much detail as we did in those workshops in March. But we do want to cover it. We really do feel like I-3 relates strongly to this concept of barriers and equity, and again, in particular, looking at special features to isolate priority populations.

So I'll mercifully move along from the Agenda page. Sorry, I digress. Inquiring minds insist on knowing: does this sound more or less like what you expected to get yourselves into? I'll encourage people to say no. If that's a big-time no, I really do want to hear it. And I can adjust accordingly if there are any disappointed, exasperated sighs.

OK, you have no idea what you were getting yourself into. So how the heck can you complain if you didn't know what you were getting yourself into in the first place? All right. I like that. That's an exceedingly honest answer.

So we'll get moving. We'll start with the slide that I'd bet 98% of you have seen 10, 20, probably close to 50 times by now. That is the AB 104 CAEP Outcome slide. Here are the six areas of AB 104, and again, it itemizes out by category exactly which outcomes fall under each of those six categories of AB 104.

No, I'm not going to detail this list and read them off blow by blow, but I'm just using this common slide as a kind of review, because, of course, that's what we did in detail in Part 1. So this is addressing it a little bit more succinctly. We talked about those ABE, ASE, and ESL outcomes, looking at things like high school credits, trying to distinguish and identify ways of marking EL Civics and other types of outcomes you might encounter in EL Civics, and other types of outcomes you might encounter in ASE.

We also looked at trying to distinguish workforce preparation outcomes versus CTE, with the short answer being: if it's something short term, if it suggests that great progress was made but nothing was actually completed, that suggests workforce preparation outcomes or some of those CTE literacy gains we've been talking a lot about. If it's longer term in nature, if it suggests something that is maybe a finished product, then it's probably one of those postsecondary outcomes related to CTE.

We went into detail related to transitions. That continues to be a really big discussion starter here lately. And then we also spent a little bit of time trying to clarify ways of recording short term services. So I will say we barely made it. There were a lot of outstanding questions. I probably did make a little bit of a getaway from the crime scene before everybody was able to frame me, so to speak.

So I will stop here and say, hey, a lot of you were at that session. So any lingering questions, or maybe questions that weren't lingering then but are lingering now that you've looked at the material from a couple of weeks ago and kind of applied it to what you're doing at your agency? I will just stop now, I guess, to see if the answer's no. Then I'll know everybody is just happy-go-lucky with what happened on April 28, and we'll stop talking about April 28 and start talking about May 18.

OK, nothing new. So I'm going to take that as, hey, everything's hunky-dory from the last time we met. So we'll start with the CAEP Summary. It's a different slide, but the same information that I know we've talked about many, many times, that is, the three sections of the summary.

Again, I'm not going to get too much into this because this should be review for pretty much everybody. As I'm fond of saying, the CAEP Summary is the main report you submit within the year. It's the main report that you use to track the outcomes. I like to say it's divided into three sections, simply a left-hand section, middle section, and right-hand section.

The left-hand section is literacy gains. That is the specific outcomes generated from pre- and post-tests. The short answer is that data comes directly from Federal Table 4, so it uses the same criteria the NRS Federal Tables use. The middle section, CAEP Outcomes, includes all of the CAEP outcomes that are on the table except those literacy gains that are accomplished via pre- and post-testing. So basically, all outcomes except pre/post-testing.

The middle section requires all of those demographics and requires 12 or more hours of instruction. But it's the section that goes out of its way to exclude pre- and post-testing outcomes, so it also excludes the requirement for anything related to pre- and post-testing.

And then the right-hand section is the services section. So that includes services-only students; it also includes students that have some kind of affiliation with one of the seven CAEP programs but don't necessarily have the 12 hours of instruction, and/or don't necessarily have all of the required demographics. If you relate to CAEP reporting but don't meet all of those requirements for outcomes, you'll still make it on the report; you'll fall into that right-hand Services section.
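
Just to make that sorting logic concrete, here's a rough sketch in Python. This is not how TE actually does it, and the field names are made up, but it shows the basic idea of which section a student lands in (the left-hand literacy gains section comes straight from Federal Table 4, so it's left out here):

```python
# Rough sketch only -- made-up field names, not TE's actual logic.
REQUIRED_DEMOGRAPHICS = ["gender", "race_ethnicity", "date_of_birth"]

def caep_summary_section(student):
    """Decide, roughly, where a student lands on the CAEP Summary."""
    has_demographics = all(student.get(field) for field in REQUIRED_DEMOGRAPHICS)
    enough_hours = student.get("instruction_hours", 0) >= 12

    if has_demographics and enough_hours:
        # Middle section: qualifies for CAEP outcomes reporting.
        return "CAEP Outcomes (middle section)"
    # Right-hand section: related to a CAEP program but missing hours
    # and/or demographics -- services only.
    return "Services (right-hand section)"

# Example: a student with only 8 hours falls into the services section.
print(caep_summary_section({
    "gender": "F", "race_ethnicity": "Hispanic",
    "date_of_birth": "1990-05-01", "instruction_hours": 8,
}))
```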

And I'll just add, over time we've made more of a distinction there. If you look at columns M and N, column Mike and column November, I should say, we've tried to make a better distinction between those two types of students; more on that shortly. That is, we look at the total number that qualify for that section and compare it to the number that actually received services.

So getting into this in a little bit more detail, this should be review for most of you. Again, the left-hand literacy gains section includes all your results from pre- and post-test gains. Again, it includes information directly from Federal Table 4. Column B just gives you the number by program. Column C gives you the number in each program that completed a pre-test and completed a post-test.

And then Column D shows, of those that completed a pre/post pair, how many of those students made enough of a gain to complete a level on Federal Table 4. Again, this is all related to pre- and post-testing. Here is the middle section that relates to all of the other AB 104 CAEP outcomes.

I'll start by saying here is Column F. This is probably the column that generates most of the questions lately. Column F is the column that's new, Passed I-3. I-3 is the little acronym we use to refer to immigrant integration indicators, say that 10 times fast. Immigrant integration indicators, or I-3; that's why we call it I-3, I guess.

So Column F includes those new outcomes. I-3 outcomes relate to EL Civics COAAPs. More on that toward the end of the presentation. But I point that out first to show you what's new. Columns G through L, definitely not new; they just had different column letters last year.

Column F was Other Literacy Gains, Column G was HSE, and so on.

OK.

Hello? I think some-- OK, I guess that went through straightaway. So we had different column headers before because we moved I-3 into that F position. But everything other than I-3 is the same columns as always, even though the letter designations are different.

OK, and then here's our services only. Again, it's those individuals that receive short term services, and all of those enrollees that don't meet the basic requirements for outcomes. Again, basic requirements such as demographics, or 12 or more hours of instruction. We use Column M to basically identify all those students by program.

Column N is what we use to say, OK, of those that sort of got deposited into that right-hand section, here's the number of those that actually did receive a service. And then we have those right-hand columns for each of those different services categories. I'll add here, unprompted, getting a little bit into the weeds already, I guess you can say, that we also have this information down at the bottom where we have the not applicable. That is, you might have some students that receive services but don't necessarily have any evidence that they relate to any program.

In most situations, the students receiving services probably did other things too. So they have an entry and an update, and they have a bunch of other information. So if they do, TE will go out of its way to assign those students to a program.

But as I said, we don't require any of that for services. We don't require a program. We don't require demographics. We don't require anything for services other than ID and what you record as a service. So that NA number is those that have services but do not necessarily have any evidence that they are related to a CAEP program.

OK, I'll stop here. I'm transitioning. OK, I see the note from Veronica. Hopefully, that's not for me. I'll just say, with that note from Veronica, and since I am at a convenient transition spot, let's use this opportunity as a sanity check here. Everybody hanging in there? Any questions? As usual, I'm just babbling in one overly long soliloquy.

Yeah, not me. OK, we'll take the quiz now. OK? OK, everybody is hanging in there. That's enough. All right.

So here is a new report. I've got to admit I forgot to ask the exact timeline. But I'll just say, if you look right now, this will not be in TE, but it will be in TE very shortly. I'll just say, as evidenced by the screenshot, there may be a few minor changes from the screenshot, but I really think the screenshot is what this report is going to end up looking like.

We're calling it, simply enough, the CAEP Hours Report. But this is one that I at least have talked about a few different times; that is, you've heard people like Neil, I think, talk about the importance of distinguishing students with 12 or more hours, versus those students that have one to 11 hours, versus those students that have zero hours.

When we did CAEP Data Diving a couple of years ago in person, some of you might fondly remember that was a really big issue in the previous CAEP Data Dive series that we did in labs, where we had a lot of activities set on having you parse out this information. That was a little bit in the absence of a simple report like this one. Now that we have a simple report like this one, hopefully, this will be easier.

But I'll just say-- you might say the burner was turned up on this because of a lot of those things that people like Neil and others have been talking about related to NOVA. With NOVA, we're all going to be required to set performance goals starting this summer.

So some of those relate to looking at these three brackets, that is, the bracket with everybody that qualifies with 12 or more hours, versus the bracket of those that have hours but don't have 12, versus the bucket of those that might be services only without any hours. What if you have agency level access? Yes, I'll just say nobody has access to this right now.

Again, I'm giving you a little bit of a sneak peek here. I do think it should be available very, very soon, because this is what it will look like. But I'm pretty sure everybody that tries to go into TE and looks at it at this moment is probably disappointed because that's not something that's in TE yet. But again, very soon.

But I want to give everybody a preview because I really do believe this is what it's going to look like. There may be a few little wording changes or column changes that can happen, sure. But again, this is a sneak preview that we'll use for both outcomes and services.

Like you've heard people like Neil and others say here lately, there will be some required goals everybody will need to do. Some of them will be optional, but whether it's required or optional, there is a lot of language that's going to be bouncing around related to how you're managing these three brackets. And it's pretty simple, I think, but a lot of it will relate to, hey, if you've got a lot of services-only students, a big goal will be to move them into programs and instruction. If you have a lot of students receiving instruction that don't have 12 hours, then there will be a lot of effort to move them to get more hours so they'll have enough hours to qualify for official CAEP reporting and, obviously, qualify for all those outcomes.
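
And just to spell those three brackets out in the plainest possible terms, here's a tiny sketch with made-up records, nothing TE-specific, of how you'd bucket students by hours:

```python
# Rough illustration of the three hours brackets -- made-up data, not TE output.
students = [
    {"id": "A01", "hours": 0},    # services only, no instructional hours
    {"id": "A02", "hours": 7},    # adult served, but under 12 hours
    {"id": "A03", "hours": 40},   # full participant, 12 or more hours
]

def hours_bracket(hours):
    if hours >= 12:
        return "12+ hours (participant)"
    if hours > 0:
        return "1-11 hours (adult served)"
    return "0 hours (services only)"

# Count how many students fall into each bracket.
counts = {}
for s in students:
    bracket = hours_bracket(s["hours"])
    counts[bracket] = counts.get(bracket, 0) + 1
print(counts)
```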

And I'll just say thank you, Neil, for the reinforcement. That actually does make me feel like, yeah, we're on the right track here. And to pinpoint that reference between what you've probably heard Neil and others say, I'll also reference that WestEd workshop from a week ago. You heard Blair and Jessica talk about this, that whole concept of participants versus adults served; that is, being a full-fledged participant with 12 or more hours versus merely being an adult served that makes the reporting but doesn't necessarily qualify for outcomes.

OK, I see questions from others. Sorry, I missed that somehow. But, OK, yeah. All right. No, that is a relevant one. I guess I'm not sure what it is, but, yeah, what is butterfly?

I'll take that for counsel. I don't have a good example set up on hours, but I will check the chat. I guess what I'll do is-- the short answer is you could do that. What I'll point out is I have examples that relate to specific student characteristics but not hours. But there are similar steps you can probably use in TE where you filter by number of hours rather than filter by student characteristics.

So I'll give you that as a clue. But that said, I'm not sure if the examples we're doing here shortly will be close enough to give you the feel. OK, so interesting. All right, great. Zero hours is, fittingly enough, the big donut, I guess, right?

So anyway, I will take that for counsel. And that is a good example to consider. I'm not sure I can come up with a good example here on the slide. But I guess I do feel like I could probably take that for counsel and include an example like that in the workshop we do here a week from now on May-- it's May 25th.

Yes, I'm familiar with the concept. But like Diana, I had missed the butterfly analogy. So I'll take that; it's a good idea for an example next week. And I will point out that there are lots of ways you can be creative in TE. The amount of filtering and sorting you can do is, in my opinion, overwhelmingly and exponentially greater than any one person would ever want to do.

So you can definitely run reports and filter that way. You can do things like run any report and use the Navigator bar on any report setup window. And you can filter for instructional hours the same way you can filter for things like barriers to employment or student demographics, and so on.

Speaker 2: Hopefully, there's an answer there some--

Jay Wright: OK, so here, I'll just say this does not answer your question, Paul. But this does kind of get into a similar area as the question. I'll just say, in terms of flavor, F-L-A-V-A, we're on the same wavelength. And the path we're taking now is, again, we're sitting at a data dive. So I'm using this to dig deeper into the summary.

Here are some common report setup window options that you may know all about, or you may not know about, but I'm just sort of bringing this up. There are some good things you might want to consider.

The one that I'll really bring to your attention is the one at the top called CAEP Program Areas. This is probably most critical for that CAEP Program Hours Report. But you can use this option for all of those CAEP reports that show up in that CAEP Tables option on the TE menu.

That is, in CAEP land, CAEP considers ABE to be basically what I think most people would consider ABE, high school diploma, and HSE all rolled into one. I'll just say CAEP considers that one program. Most of the rest of the free world considers that three separate programs. So because CAEP considers that one, when you go into NOVA, usually it adapts to the CAEP view of the world, where that's one program.

So the default will be to give you that simpler NOVA format, with the assumption that that's what most people want to use the report for. But you have the option to do what we did in this graphic, where it itemizes that out and shows ABE, high school diploma, and high school equivalency as three separate rows on the report. There's another option where you can have CTE all broken down, or you can roll up CTE to where it includes CTE, workforce prep, and pre-apprenticeship in one.

Again, you say tomato, I say tomato here, but there are different ways of doing it. Again, this will be more for next week. But when we get into things like, gee, how can we look at our data and parse it out for goal setting, this will be a really useful option. You might wonder why ESL is not here: because with ESL, there's no real conflict. ESL is a standalone program for CAEP and NOVA reporting, and it's done exactly the same way in TE as it is in 99% of the rest of the world.

So there's no conflict with ESL like there is for ABE and ASE, so there's no option for it in the setup.

Speaker 2: Hopefully, that makes sense somehow.

Jay Wright: So that's one thing you can do here in the setup window. Again, we're digging in the weeds on purpose. Another good one to consider is down at the bottom. When you're looking to drill down on the monitor, when you're looking at test results and drilling down, the default is it'll just show you those that already completed a level. I've got to say most people like it the other way. You might want to check it so you can see everybody that has a pre/post pair, not just the ones that made a gain.

Again, this explains what I just said about CAEP Program Hours. Again, how are you breaking out CTE? How are you breaking out ABE and ASE?

Again, to your question, Jared: in other programs, CAEP and the rest of the world are in sync. But again, with CTE, and also with ABE/ASE, maybe not necessarily. So again, you can set it up for CAEP Reporting, or you can set it up in other ways that might make a little bit more sense with most of the rest of the things you might be doing at your agency.

Also, just to clarify, I'm not sure this is as useful as it used to be, but that checkbox where it says Use NOVA Format for Hours, just to explain what we're talking about here: that box will be checked by default. When you're running that CAEP Program Hours report, by default, it'll just give you that simple format that's required for NOVA. That is, these programs will be all rolled up.

It's not going to worry about splitting hairs and parsing out the hours by multiple programs. It's just going to do it in the simplest manner possible. So you get the information that you can copy directly into NOVA and wipe your hands of that requirement.

But some people really want to dig deeper. They really want to see proportionally how many of those hours belong in ESL versus how many of those hours belong in career tech ed or whatever. So you can uncheck that and get the report to break out those hours in a more complicated way if you wish to do it that way.
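
To illustrate what I mean by total versus proportional hours when a student is in more than one program, here's a little sketch. The even split is just an assumption for illustration; the actual proportioning in TE may work differently:

```python
# Hypothetical example: one student with 90 hours enrolled in two programs.
# Assume an even split for illustration -- TE's actual proportioning may differ.
student_hours = 90
programs = ["ESL", "CTE"]

total_hours = {p: student_hours for p in programs}            # duplicated view
proportional = {p: student_hours / len(programs) for p in programs}

print("Total hours by program:", total_hours)          # {'ESL': 90, 'CTE': 90}
print("Proportional hours by program:", proportional)  # {'ESL': 45.0, 'CTE': 45.0}
```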

OK, so moving on. I got deep there a little bit. I guess I'm going to stop here for just a minute. Any questions here? I realize I'm probably talking about way too many things at once.

Everybody hanging in there? Sorry, it occurs to me I should be asking this.

Speaker 2: OK, thank you. All right. I'll just have to. OK, thank you.

Jay Wright: All right. So anyway, I'll just say I use this to show the program hours. Again, here you can see ABE, high school diploma, and high school equivalency are shown on separate rows. CTE, workforce prep, and pre-apprenticeship are shown on separate rows. Again, different ways of running it.

Again, I'll reiterate the reason why I'm making such a big stink and deal about this. For one, it helps to know how to run the report. Two, when we get into goal-setting-related examples in the presentation next week, a real obvious way in which you might want to be able to do that is to hone in on specific programs.

For example, maybe a priority at your agency and your consortium is to focus on high school diploma attainment. If that's your focus, I'm going to say it's an exceedingly good idea to be able to break down the CAEP Summary and other CAEP reports to isolate those students specifically enrolled in the high school diploma instructional program. If they're getting lumped in with ABE or lumped in with high school equivalency, I've got to say that makes your data decidedly less interesting and decidedly less useful if it's just bundling it up like that when you're trying to use your data for that specific purpose.

So if you're looking at things like that, you might find it to be a lot better to run this report the way the real world looks at instructional programs, rather than looking at it the way CAEP does, that is, breaking it down into more granularity by specific program. I'll just say, if you're looking at GED or HiSET results, that would be another example where running the report this way would probably be a much, much better idea than lumping all those programs into one row. Hopefully, that makes some sense with what we're talking about.

OK. And then here's the example on that CAEP Program Hours Report. Again, this is just to let you know what I'm talking about. The simple way just gives you two columns; it doesn't break down all these programs. But in some cases, you might have students that are in more than one program.

So if getting that level of detail is important, you can expand that to show total hours versus proportional hours and kind of see how those hours are broken down, especially if you have a lot of students enrolled in more than one program. OK, I'll switch here to Data Integrity. I'm sorry, I keep checking in. I feel like I'm going in too many different directions. So I'll do another sanity check, even knowing that I just did this two minutes ago.

So everybody with me? Again, sorry, when I get in the weeds, sometimes it's hard to really see the sky. You can't see the audience, whatever. OK, these are experienced users. Give me a break. All right, message received.

OK, so we're looking at CAEP Data Integrity. I'm really just using this screenshot as a segue to wave my arms up and down and say, OK, we're not talking about the CAEP Summary anymore. We're talking about data integrity.

Starting with a slide that many of you have seen 10, or 50, or 150 times. That is, when we look at CAEP Data Integrity, my line, and I've continued with it for a long time now, is to understand the DIR.

Step one is to get a really strong understanding of that top section. If you understand what's going on in the top section of the DIR, in my opinion, it's pretty easy to figure out what's going on with the bottom 80% or 90%. But if this top section titled Summary Information is nothing other than authentic frontier gibberish to you, most of the DIR is going to be authentic frontier gibberish as well. You want to be able to sort this out.

So in sum, in review, just to help make sense of this obnoxious number of little text boxes I have on the slide: again, you've got the summary information. That is, there's some reconciliation that's necessary before processing all this data that's needed to display the DIR.

So as you've heard me say before, I'll use the same tired cliche too: TE likes to cast that wide net, that is, that fishing analogy. We're casting that wide net. We're getting everything we possibly can get. We're really looking at getting every last fish in the sea.

Again, if we get a bunch of kelp, if we get a few rusty license plates, no big deal. That's going to happen. Our goal is to get everything we possibly can and basically collect everything so we have all the information available to start; that is, the Students in Services section.

Any student with any relationship whatsoever to CAEP is included in that top number. So it includes the big number you can see at the top, and it kind of works its way backwards. Kind of a bad example, it's a little too clean for our own good, but we've got those students in the Services section.

Of those, there's going to be a certain number not enrolled in the CAEP programs. In this case, there's not many; we're too clean for our own good. But there are going to be four students that we need to eliminate because they don't really have any CAEP program enrollment.

So we're going to go 1,318 minus 4. The way to see where we get the 4 is the indentation: Students in Services section at the top; indentation, students not enrolled in the seven programs; indentation. And then we've got those five rows listed there; those correspond to each of those areas in the CAEP outcomes.

So all this is doing, with that extra indentation, is TE taking a little extra time to smell the coffee, basically looking and saying, OK, we've got a few of these students that we're eliminating here. So while we're eliminating those students, let's kind of look at them and see if there's any good fruit there. While we're eliminating, we're seeing, hey, again, no good examples here.

But as it does it-- hey, maybe a few of them earned a high school diploma, maybe a few of them got a job, maybe a few of them made a transition. So if there are any like that, we'll know that there might be some real tasty fruit in some of those we're taking to the curb. It's just giving us an FYI that those we're kicking to the curb we might want to look at a little more carefully, because we're basically letting some good data get away, so to speak.

That depends on how you do it. But that's possible, you might, because again, when you proxy, obviously, you then need to follow up and provide that enrollment. So if it's a test in a vacuum, it might well do that. Sometimes you might already have that program attached with the test. If so, then the test should align to some kind of CAEP program enrollment. But obviously, that's not going to be true all of the time.

OK, so anyway, we're eliminating a certain number of students. So you can see 1,318 minus 4 equals 1,314. It's also eliminating the concurrent students. So again, we arrive at 1,309. Again, not a great example, but for computation's sake, it's fine.

So again, it's doing that reconciliation at the top. It's coming up with that number, 1,309. It's kind of the final answer after we weed everything out. So the bottom line is, once it processes all that information at the top, it uses that final number, in this case 1,309, as the denominator for all 27 items in that data integrity report.

Again, that's important for the DIR. It needs to make sure it's calculating all 27 of those percentages consistently. The only way it can do that is by doing this reconciliation at the beginning and making sure it can apply that same denominator across all 27 items on that DIR equally.
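
If it helps, the arithmetic itself fits in a few lines, using the numbers from this slide (the one hypothetical count at the end is just there to show how an item percentage would be figured):

```python
# DIR reconciliation from the slide (numbers from the example, logic simplified).
students_in_services_section = 1318
not_enrolled_in_seven_programs = 4
concurrent_students = 5   # 1314 - 1309 in this example

dir_denominator = (students_in_services_section
                   - not_enrolled_in_seven_programs
                   - concurrent_students)
print(dir_denominator)    # 1309 -- the denominator for all 27 DIR items

# Every DIR item percentage then uses that same denominator, for example:
missing_date_of_birth = 26          # hypothetical count for one DIR item
print(round(100 * missing_date_of_birth / dir_denominator, 1))
```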

One quick aside I'll also bring up with the DIR, and this is just because it's come up a lot in the last couple of weeks in CASAS tech support: it has to do with the difference between the DIR and the CAEP Summary.

I'll say the more common question, quite frankly, is the difference between the NRS DIR and payment points. But that question is basically the same as one that also comes up on the CAEP side, which is to ask the difference between the CAEP Data Integrity and the CAEP Summary. The short answer is the CAEP Summary, or the payment points report, reflects-- again, the exact X's and O's are a little different, but the concept is the same for the CAEP Summary reporting.

Just like for payment points or the CAEP Summary for WIOA II agencies, those reports basically apply those basic guidelines before displaying anything on the report. There are those hoops that the students need to get through in order to qualify for the report; that is, the student has all the required demographics and has 12 or more hours of instruction. The CAEP Summary and payment points insist that you have all that information before they allow the students to appear on that report at all.

On the other hand, the DIR is the opposite. The DIR goes out of its way to include everybody. It includes students with less than 12 hours of instruction, it includes students that are missing required demographics, and so on.

That's necessary because, again, part of what the DIR does is give you that complete number of students missing date of birth, missing 12 hours of instruction, and so on. Obviously, the only way it has a chance of calculating those totals accurately is if it includes those students with missing demographics and/or the ones that don't have 12 or more hours of instruction.

So that was unsolicited, but that's a question that's really come up a lot here in tech support for whatever reason. So I'll just say-- I think I said it last week, but I'm not sure. Probably-- Veronica's rolling her eyes at me now. But I'll send it to Veronica.

I think I sent her a few supplemental reports. If not, I'll add it with the PowerPoint so they can have it posted on TAP. I know this is something we have posted on the CASAS website, but I'll send it to TAP to make sure it gets posted in the same area as this PowerPoint and the other Data Dive PowerPoint.

But there is a relatively new Cleaning Up Your CAEP Data Integrity document. It's not really hot off the presses, but it was one that we updated in January, I believe, so it is up to date. And it basically includes some really gory-detail information on what I've spent the last five minutes boring you with, that is, telling you column by column, row by row, what each of those items means, relating column numbers to column numbers and row numbers to row numbers, really getting in the weeds and explaining what all these items mean.

OK, and then I also wanted to bring up this. This is what we talked about a lot a year ago as we were in COVID land, where the big question a year ago was, maybe we've been doing all this, and now COVID has hit. What are the issues where COVID is going to be a big problem, versus other areas where, quite frankly, we don't think it should really affect your data much?

So these are some examples where, we admit, COVID might make your data a little funkier than usual. Obviously, 12 hours of instruction; getting people tested at all; obviously, getting people with pre/post pairs; HSE and high school diploma, where some agencies might be affected in getting some of these outcomes; having an outcome but not qualifying, especially because they may not have 12 hours of instruction. Those are items that also might look a little funky this year because of COVID issues.

On the other hand, there are some things that we feel shouldn't really be affected that much. All of those demographics, gender, race and ethnicity, date of birth: COVID shouldn't affect that at all. Primary and secondary goal, barriers to employment, ditto. Again, sometimes some of those earned outcomes should be kind of the same. I admit it's a little underhanded to put that one on both lists. But I've got to say it's equally likely to be affected or not, depending on the circumstance.

OK, so moving along here. Moving here to Consortium Manager Reporting. This is a screen-- well, actually it's a change of screenshot. But again, this is just showing how to do consortium level reporting.

I see your point saying-- though, I would encourage you to send an email if you think there are other things that would help at the consortium level. I will say we sort of sat on those three for a while. We did add a fourth one there a few months ago. The one we've added most recently is that Data Integrity report, and it was just a few months ago that we added that to the consortium manager report menu.

Here is the slide we've used for a while. That is, here is the consortium level Demographic Summary, just to give you a taste of what consortium level reporting looks like. So again, this is the Demographic Summary. It's giving you really detailed information for all the different demographics. This is just an example where we're looking at highest level of diploma.

So you can see it gives you the agency ID, and it gives you the number and percentage, in this case, for all the different categories under highest diploma. So again, this allows you to look at it agency by agency, as well as compare and contrast all the different agencies in your consortium. The one little hang-up, I admit, is just because of space saving. In this case, we have a really large consortium; we did that on purpose.

The only way we could get it all on one page is to list by ID rather than agency name. So I admit, sometimes you do need to know what your own agency's ID is. OK, again, data diving.

Here is another in-the-weeds screenshot. One thing I'll bring up about, again, the consortium level CAEP Summary and Barriers to Employment: you have two ways to do it. Again, you say tomato, I say tomato.

You can run it page by page for each agency, or you can get an aggregated view. That's always the Freudian slip I make every time, aggregated versus aggravated. Well, there you go.

But anyway, you can click Aggregate Multiple Agencies. That's what basically triggers the report behavior. If you check that box, that will basically give you a one-page report that aggregates all of your agencies into one neat and tidy report with all of your agencies rolled into one. Or, if it's unchecked, then it will give you a multiple page report, so you can look at all the different agencies in your consortium, page by page. Again, kind of a you-say-tomato, I-say-tomato issue. Again, the way you govern that, though, is by digging deeper in the setup window and checking that box or leaving it unchecked.

OK, and then here is just the proof of the pudding, to show that it also includes the DIR. Here's that consortium manager view. And I'll just say we did a larger consortium on purpose. Again, you can generate it so it puts them all together. You can see it's 28 pages because, again, it's two pages per agency, since each agency spills over.

So this is giving us about a 14-agency consortium here, 28 divided by 2. And you can see over here to the left, you can just use the Navigator if you want to navigate from agency to agency to agency.

OK, I think that's the guided tour portion. Of course, I was just honest as usual and said this would be quicker, but I've rambled and rambled and rambled. And I've got to say it's exclusively my fault. You haven't really peppered me with many questions at all. I've just been rambling. So sorry, I was a false advertiser.

All that said, I'm kind of at a transition: is everybody hanging in? Feel free to throw the tomato can at me now if you like, as I certainly deserve. OK, so moving on: Barriers to Employment.

So I'll just say, how many of you have heard people like Neil Kelly, Carolyn Zachary, Javier Romero, other state-level people talk about the importance of barriers to employment, talk about how the whole world, including the state, including the feds, have really been looking a lot more carefully at barriers to employment? How many have heard that illustrious information?

OK, a few of you are willing to admit you have. Not sure if you have or you're just humoring me. But either way, I'm going to take the money and run and say, yes, of course, everybody's heard how stinking important that is.

So to get everybody's foot on the rail, there are two reports in TE that summarize and display barriers. I think most of you know this. Even Neil has heard it; he's heard himself talking, and he's used a bunch of exclamation points to describe that to you, I bet.

Anyway, so I think everybody knows these reports. Just to ask, though, out of curiosity-- Rosalie, your question is perfect timing here: are we under oath? And here is one where you really shouldn't feel bad about a negative answer; if it's a negative answer, I really want to hear it.

But how many of you have used either the NRS Barriers to Employment or the CAEP Barriers to Employment? Doesn't matter which one, if you've used one or the other. OK, thank you. Lydia chimed in quick. OK, mostly yeses chiming in so far. OK.

So go figure. The first five or six people to respond all said yes. Now watch, everybody will say yes as soon as I say that. But as we got past the first five or six, then the nos spilled in. OK, it looks kind of fair and balanced here. Half are saying yes, half are saying no.

I've got to say that's a good ratio. That's about what I thought. But I'll just say, FYI, there are two reports in TE that do that. These are the two menu options; they're there for whichever you prefer. Again, this will be the cliche of the day.

If you prefer to say tomato, you can use the top one. If you're more of a tomato person, use the bottom one. They're very similar. Just so you can kind of see, the one for CAEP uses CAEP programs, like on the screenshot I'm showing you right now. And then the screenshot I showed you to introduce the topic is the NRS example: same exact barriers, very similar, but it gives you that information by NRS level rather than by CAEP instructional program.

OK, so I'll just say, with barriers, here is the quote-unquote, easy way. What's come up-- back to that obnoxious question reminding you that everybody's talking about it, and so on-- relates a little bit to this. That is, I think, in terms of collecting barriers, and making sure we have barriers attached to each and every student, we can look at our data and show, yeah, we have lots of students with lots, and lots, and lots of barriers. Everybody does a great job on that. If you run those two Barriers to Employment reports, it will verify that everybody's doing a great job.

But when you hear state and federal level folks talk, what they're really concerned about more is looking at your data by barrier, rather than looking at whether students have barriers. That is, when you're looking at pre/post learning gains, when you're looking at employment data, when you're looking at transitions data, et cetera, et cetera.

What state and federal officials want to know is, are all the students with specific barrier categories achieving outcomes at the same rate and the same level of success as everybody else? That is, are students with disabilities getting jobs as much as all of the other students are? Are homeless students making pre/post-test learning gains as adeptly as everybody else does? Are migrant workers transitioning the way everybody else is?

Again, I'm just throwing different examples out there. But the idea is, what the feds would say is, they can see that lots of local agencies and states are making progress. So they're basically turning up the heat a little bit and saying, yeah, that's great that everybody is a success story, but taking it to the next level, we want to make sure that all subsets are also doing that. I'll tie it to a lot of loftier presentations, where you've heard lots of people at the state level talk about that concept of equity. That is, we want to be able to target special populations and be able to say that these special populations are performing great, just like our students are in the aggregate.

Again, you don't necessarily always have to look at it in terms of identifying barrier categories. But I will point out this is the main field we already have built into our data, and it is a quick and easy way to address that important concept of equity and find specific groups of students at your agency that fit that profile. If you take a little bit of an expanded view of equity to include all of these different barrier categories, my hunch is that just about any adult education agency, community college or adult ed, is going to have at least one or two barriers that are represented very well.

It should be pretty easy for any agency to find a few of these categories that are relevant. It will vary from agency to agency, but it will certainly be relevant. OK, so sorry, I'm getting on my soapbox. I'll try to stop that. So anyway, this is just showing you how you can filter any report in TE by a specific barrier.

I'll point out you can do this filtering by any category, not just barriers. If you're looking at other categories, the specific Navigator bar option will be different. When you click that funky little funnel icon at the top of the lister, you'll get a different laundry list if you're looking at education level than you will if you're looking at barriers to employment. But from a mechanical point of view, the clicks will be a very similar sequence.

If you're looking at filtering by barriers, refer to the report setup Navigator, like on the screenshot. In this specific example, barriers is included in the In Program Years lister. So you would click In Program Years, and that will give you the In Program Years lister.

You can right-click on any column to add Barriers to Employment. I think that was a little fix we made, so you don't have to do that bottom step. But either way, in your In Program Years lister, that's where you can get the Barriers to Employment column to appear. And you can see, by clicking that little icon, that will give you the list.

In this case, we check low income. That will filter the lister to show only those students with the barrier of low income. If we filter it that way and then generate, that will generate this particular report, but only for those students that have the barrier of low income.
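
If it helps to see that filter-then-generate idea outside of TE's windows, here's a minimal sketch in Python with invented student records, not anything pulled from TE: filter to one barrier, then run whatever summary you want on just that subset:

```python
# Invented records for illustration -- the filter-then-report idea, not TE's code.
students = [
    {"id": 1, "barriers": ["Low Income", "English Language Learner"], "got_job": True},
    {"id": 2, "barriers": ["Low Income"], "got_job": False},
    {"id": 3, "barriers": [], "got_job": True},
]

# Step 1: filter the "lister" down to students with the barrier of low income.
low_income = [s for s in students if "Low Income" in s["barriers"]]

# Step 2: generate the "report" for that subset only, e.g. an employment rate.
rate = sum(s["got_job"] for s in low_income) / len(low_income)
print(f"{len(low_income)} low-income students, employment rate {rate:.0%}")
```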

So that was a gigantic mouthful. It might have been a little too much in the weeds for a slide, but I'll just say, battle tested: did any of that stick? Yes or no? It's simple, no problem, right?

OK, sorry, I see your question. So Jennie, you're wanting less clicking and more soliloquy, if I'm reading it right? Jennie, I'll let you sort it out. I'll say, Jennie, maybe you might want to come on and talk if you don't mind?

Speaker 3: Sure. Thank you. You just made a comment at the beginning of that description. And you just said that we need to look at barriers as barriers, not as something-- and it was sort of separating how we should be looking at this. And I'm new to this--

Jay Wright: No, that's a great question.

Speaker 3: I just wanted to know if you could repeat that so that I can just get the overview. That's the point, sir.

Jay Wright: And I will be a little obnoxious and restate it the way I just did, that you're wanting a little bit more from the soliloquy, because I think that is the part of the presentation you're asking about. So what I said is this, and I am going to be even more obnoxious with detail. We started collecting barriers five years ago, when we started WIOA; it was a new field. I've got to say the state did a spectacularly good job responding to the field. Within a year, everybody was collecting this field. A lot of you had developed extra handouts to make sure you were reaching out to all students.

So everybody's collecting this in your data. So for a couple of years, that's the way we looked at barriers: just, hey, is everybody collecting it? Is everybody dutifully adding this when they collect for every student, yes or no? By now, I think it's safe to say just about everybody answers yes.

What's come up here in the last year or two, however, is looking at barriers in a little bit more detail-oriented fashion. Instead of only looking at it in terms of, are you collecting this field as an agency, yes or no, we're looking at it by barrier. That is, we're looking at it to segment our data, to look at data for a specific barrier.

So back to this example: we're using this report and filtering it to include only those students with the barrier of low income. So, in this case, we're looking at our data, and we're kind of hypothesizing: yeah, we're doing a great job at our agency. But we admit our higher-income students always seem to get jobs and make learning gains; our low-income students always seem to struggle with those outcomes a lot more.

So we're going to start running our CAEP Summary. We're going to run outcomes reports. We're going to run NRS tables, whatever it is, and start filtering, so we can look at outcomes for just those students with the specific barrier, and at the end of the day be able to see how those students with the barrier compare to our population overall.

That's what the feds have been saying for a few years, and as I referenced, state-level people are now saying exactly the same thing. That's the way we're moving: to where we're not just showing in our data that our students are doing a great job, but we can take the next step and show that students in specific special populations are also doing a great job.

OK, thank you, Jennie. OK, so moving on to the NRS Ad Hoc Cross Tab. Some of you diehards that have been in that statewide TE Network meeting have received some information on this report before. If you're one of those lucky warriors, we originally called it the NRS Ad Hoc Federal Tables Report, I believe. After we played around with it and fixed it, we changed the title to NRS Ad Hoc Cross Tab.

I admit the title is not that catchy, but it does very clearly illustrate exactly what the report does. It's from the Federal Reports menu. It's using NRS in the title simply because that's the basis for the report. What the report is doing is pulling every field under the sun; that is, any field in TE that could conceivably be used for any of the 14 NRS tables is included in this NRS Ad Hoc Cross Tab.

It goes out of its way to include all the CAEP-specific fields as well. So it's basically a report where you can dial up your own categories and run a report of your own choosing, so to speak. So again, we go into the Federal Reports menu, and again, select NRS Ad Hoc Cross Tab.

And you can see, when you open it, you navigate to the setup window. The setup window has two really strikingly obvious features, one called Row Categories, another called Column Categories. Those two items do exactly what the titles suggest. Row Categories allows you to govern the rows of the report. Column Categories allows you to govern the columns of the report.

And basically, you use those movers to select what you want on the x-axis versus what you want on the column axis. Here in this one, we selected CAEP Program for the x-axis, and we selected Barriers to Employment for the y-axis.

I guess we're just showing examples without the setup window. But what I'll just say here is, in the setup window for this first example, we would select CAEP Program and we would select Barriers to Employment. We would just select row and column by selecting one field each in those two little mover features on the screen.

And then here are some examples of different specific finished products that you might do, just sort of throwing out examples. We've spent the last 15 minutes talking about barriers to employment, so I'll just say we've probably got to use barriers as one of these examples, given that we spent 15 minutes on it. So, fittingly enough, that's example number one: we're running CAEP programs and barriers.

So here is the way where you can see, by CAEP program, the number by program that has that specific barrier. It shouldn't surprise you that we've got a high number of English language learners in ESL, for example. If you're more concerned with something maybe deeper in the weeds, like, say, homeless or migrant worker, you can see the number that show up by barrier for each CAEP program. Again, you can get in the weeds as much as you want. You can drill down as well.
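
Conceptually, this report is just a cross tabulation of two fields. A quick sketch with made-up data, not actual TE output, shows the shape of what comes out:

```python
# Conceptual sketch of a program-by-barrier cross tab -- made-up data, not TE output.
import pandas as pd

records = pd.DataFrame([
    {"caep_program": "ESL", "barrier": "English Language Learner"},
    {"caep_program": "ESL", "barrier": "Low Income"},
    {"caep_program": "ABE", "barrier": "Low Income"},
    {"caep_program": "CTE", "barrier": "Migrant Worker"},
])

# Rows = CAEP program (x-axis), columns = barrier (y-axis), cells = student counts.
print(pd.crosstab(records["caep_program"], records["barrier"]))
```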

Speaker 2: OK, sorry.

Jay Wright: OK, this is just another example. This one is a little bit more NRS, you might say. This looks at the NRS levels rather than the CAEP programs, and it's looking more specifically at MSGs as on Federal Table 4.

If you want to look at it more from an NRS point of view, this would be a better example. Here is one that's a little bit of a crazy level of detail, but this one is looking at all of our education results by age range. As you can see, there's no way to include all those education results and have it display as anything other than a big stinking mess. There are just way too many education results to have it not be a big stinking mess.

So I'm here to tell you, with some of these options, no matter what you do, 125% guaranteed, big stinking mess, exclamation point, exclamation point. That's just the way it's going to be. So here's a good example where it's a 125% big stinking mess. We've got, what, 20-25 education results running from left to right.

So you can see, good luck deciphering all of those column headers. So I'll just say, what it does for you, knowing in advance that that's going to be a problem no matter what, is it gives you all of the education results, so you can read left to right and get it all spelled out in longhand here. So you can easily figure out what all these Ds are doing and what all these truncated words mean, knowing that you have no chance whatsoever of figuring that out otherwise.

OK, here is just another example, Labor Force Status by CAEP program. Here we ran program on the y-axis instead of the x-axis. And those are four examples. We could give you many, many, many more, but I've been babbling, keeping it rolling along, and false advertising and everything else. So I think we need to move on to the I-3 reports here. But I will stop: any questions, anything?

Sanity check. OK, everybody is getting tired of these sanity checks. OK, so here is the menu. Here is the demonstration, proof of the pudding, whatever. Here under the TE State Reports menu.

That's where we have our Immigrant Integration Indicators reports. That is right down there at the bottom. It has EL Civics in the title. There's the full-fledged name, EL Civics Immigrant Integration Indicators, and the EL Civics I-3 Summary.

OK, starting a little bit-- a little dizzy, but hanging in there. We'll talk about dizzy. We're going to give you an eye-popper, the bell ringer here, right off the start, as far as dizziness goes. Dizzy.

We're going to start with the statewide CAEP Summary. I've got to say maybe it was dumb of me to slot this one in here right off the bat. But I wanted to do two things with this. One is I wanted to make the connection between these I-3 reports that we're dealing with here and that new column on the summary, that is, that column called Passed I-3, where we're looking at that information from EL Civics COAAPs.

So number one, this is just to cement that linkage; that's what we mean by I-3. And when we're talking about it, this is the information we're getting into with that new column on the CAEP Summary that we've referenced a couple of times already but not really explained all that well.

I'll also use this as a segue to say, when you look at the state level: last year, we ran the statewide data for this particular column. And I'll just add, we had a lot of these outcomes without even knowing it last year. That's because all of these come out of EL Civics COAAPs. A lot of you, I dare say the majority of you, have been doing EL Civics for a very, very, very long time.

When students pass those COAAPs, it will show up as an I-3 outcome. So if you're a high-performing EL Civics agency, pretty much by osmosis, you should start generating good results in this Passed I-3 column. If you're a CAEP agency and you do not do EL Civics, you don't do WIOA II, I'll just point out you're now officially invited to start doing EL Civics COAAPs.

I'll get into this here as we go through these slides. But there are other people statewide, at CASAS and other places statewide that are not CASAS, that have done a lot of research and had a lot of discussion on this topic. And so we've now allowed you to use the EL Civics COAAPs. We allow you to use them for all programs, not just ESL.

So if you're CAEP but not WIOA II, that means you're CAEP but you're not EL Civics, you now have the opportunity; you're now cordially invited to the wacky world of EL Civics COAAPs. That's because we've related a lot of that content in COAAPs to the same content that a lot of experts and professionals in the field have identified as priority areas in the area of immigrant integration.

What I'll just say is yes, these are duplicated numbers. The I-3 numbers are duplicated just like all the other outcomes are. I truncated this screenshot because I was focusing on I-3, but since you asked, you can look at other CAEP Summary screenshots for proof. Note those numbers in blue. I truncated it because I really was just trying to show you I-3 here, not all the gory details.

But below that NA is where TE takes those steps from duplicated to unduplicated. The 116,000 is the duplicated number. The 87,000 is the unduplicated number. It's kind of a tricky little sleight of hand here, a little bit fact and a little bit circumstance, but it sure makes it easy to explain. I'll point out that 87,758 unduplicated number, and not far away at all is that 87,720 in ESL.

99.999% of these outcomes from last year were generated from California EL-2 EL Civics, so it really ought to be 99.9% ESL. In this example, thank goodness, you can see it pretty much is. Hopefully that answers your question, Karen.
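Just as a rough illustration of that duplicated versus unduplicated step, here is a small sketch in Python with made-up records. This is not the actual TE logic or data, only the counting idea: every COAAP pass counts toward the duplicated total, while each student counts once toward the unduplicated total.

# Minimal sketch of duplicated vs. unduplicated counting (hypothetical records).
# Each record is (student_id, coaap_number_passed); not real TE data.
records = [
    ("S001", 1), ("S001", 5), ("S001", 47),   # one student, three COAAP passes
    ("S002", 2),
    ("S003", 48),
]

duplicated = len(records)                        # every pass counts: 5
unduplicated = len({sid for sid, _ in records})  # each student counted once: 3

print(duplicated, unduplicated)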

So moving along here, sorry. The Immigrant Integration Indicators report gives you fairly detailed information. Here's a screenshot. It gives you a list, student by student. It gives you the Immigrant Integration Goal Area, and then within that goal area, it lists the specific EL Civics COAAPs passed by each student.

So here's an example. We've got two students listed, and they both covered the same three immigrant integration areas. To be more precise, let's look at this one student.

This student has passed five different COAAPs this year. Those five COAAPs cover three different I-3 goal areas. You can see a couple in the area of community, a couple in the area of digital literacy, and so on. We'll be explaining this more soon.

OK, so here's I-3 by Class. You can generate it at the class level and get the same information. Again, it lists those specific COAAPs passed by student, and again relates each one of those civic objectives to those goal areas. We'll be getting to more explanation here in a minute.

Before we dig deep, though, here's the other one of those two reports. We've got the EL Civics I-3 Summary. This one is more of a big picture report.

So here we're listing by Immigrant Integration Goal Area. We're looking at each area and showing it by agency: the total number attempted by goal area, how many of those attempted were passed, and then the percentage of COAAPs passed.

So you can see some of these where we're doing really well, and others where we have a surprisingly low pass percentage. So again, we can look by COAAP, and I'd use this as a good way to troubleshoot.
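Just to make the arithmetic behind that summary concrete, here is a rough sketch in Python with made-up attempted and passed counts. It is not the report's actual query, just the pass-rate calculation by goal area.

# Rough sketch of the I-3 Summary arithmetic; the counts below are invented.
summary = {
    "Economic Security": (120, 95),  # (number attempted, number passed)
    "Digital Literacy": (80, 31),
    "Community": (60, 52),
}

for area, (attempted, passed) in summary.items():
    pct = 100 * passed / attempted if attempted else 0
    print(f"{area}: {attempted} attempted, {passed} passed, {pct:.1f}% passed")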

Sorry, was there a question?

Speaker 3: OK, no question.

Jay Wright: OK, so here we're just highlighting that you can drill down on these, again, if you want to dig deep on them.

On the first one, I'm not sure how much drilling down there really is to do. But on the second one, I think the drill-down will be very applicable. That is, you're looking at number attempted and number passed, so you can drill down and get a little bit more detail, find out which students passed, which students didn't, and pull up some of that EL Civics information.

So on we go. Sorry, I don't know if I'll do this one again, but I probably should have brought this up earlier. Here's that AB 2098 Framework. The presentation we did in March was an hour and a half on nothing but I-3.

This recaps what LIS did originally and what the AB 2098 work group kind of redid a couple of years ago, looking at the areas that professionals and experts in immigrant integration identified 5 to 10 years ago. In particular, that group called LIS in the Bay Area looked at this in detail, did a lot of research, and came up with some core areas that they thought were the biggest priorities in immigrant integration.

Here are some of those areas. What I'd just say is that other professionals statewide did that research. What we did at COSUS is we reached out to them, they reached out to us, we partnered up a little bit, and we came to the conclusion that a lot of those priority areas identified for immigrant integration are maybe not 100% the same, but they are 98%, 99% the same, as the areas we've been looking at for close to 20 years in California EL Civics. So we looked at those three areas, matched them to the 50 or 60 EL Civics COAAPs, and the short answer is we found a clear trend, as evidenced by this chart. Here's that chart listing those three areas and giving you the COAAP numbers that relate to each of them.

So for example, one of the key immigrant integration areas is economic security, not surprising once we did the research. And we agreed that the first six COAAPs on the menu all relate somewhat to economic security. So if a student attempts and/or passes COAAP 1, 2, 3, 4, 5, or 6, that generates data related to economic security. You can take the bottom row as another example: we believe that COAAPs 47, 48, and 73 relate to digital literacy, which is another one of the important areas for immigrant integration.

So that's another example, with digital literacy. And it started as a collaboration between LIS and COSUS several years ago. LIS did a lot of this on its own. I believe Paul Daley, you're here; I know you were one of the key people from long ago. But a lot of folks, Bob Harper, Ilsa Pollet, and lots and lots of others, did a lot of research on immigrant integration and came up with the areas that were determined to be the priorities.

And so the COSUS team looked at it and made these charts to relate the two. So when students pass COAAPs, we feel strongly, I've got to say, that they've taken significant steps toward improving-- thank you, Paul --toward improving in those areas of immigrant integration.
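As a rough illustration of what that chart is doing, here is a small sketch in Python. The COAAP numbers below come from the two examples just mentioned on the slide; the data structure and function are hypothetical, not how TE actually stores or applies the crosswalk.

# Rough sketch of mapping COAAP numbers to I-3 goal areas (illustrative only).
COAAP_TO_GOAL_AREA = {
    1: "Economic Security", 2: "Economic Security", 3: "Economic Security",
    4: "Economic Security", 5: "Economic Security", 6: "Economic Security",
    47: "Digital Literacy", 48: "Digital Literacy", 73: "Digital Literacy",
}

def goal_areas_for(passed_coaaps):
    # Return the distinct I-3 goal areas covered by a student's passed COAAPs.
    return {COAAP_TO_GOAL_AREA[c] for c in passed_coaaps if c in COAAP_TO_GOAL_AREA}

print(goal_areas_for([2, 5, 47]))  # {'Economic Security', 'Digital Literacy'}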

OK, so here is another one. I'll just say this is something we also worked on a few years ago. A couple of you have surprised me, caught me off guard, whatever, by asking me for this. Most of the work, I've got to say, really focused on EL Civics.

But there were some smaller, less notable efforts where we also looked at outcomes and services that we already collect for CAEP reporting. So here's an example where we went through the CAEP data dictionary and highlighted some of the outcomes that are in the data dictionary and in TE, highlighting the ones that we felt related to that chart.

So there's no report on this, and no official credit for it. But if you're interested in recording outcomes that you think are beneficial for your immigrant students, we have kind of color-coded the data dictionary to show those priorities. And I'll just say we did something very similar with services, as illustrated by this slide, where we also found that a lot of those individual services relate to a certain number of COAAPs.

OK, so this one, I'm not going to go step by step. But it's one of those things where if I don't include it, I'm a mess. So if you are one of those agencies that has not been doing EL Civics over the years, obviously the learning curve for getting COAAPs in is difficult. So here's the slide that shows you how to manually enter COAAPs in TE.

If you're an EL Civics agency, this is no big deal. But if you haven't been doing EL Civics, I admit it's non-intuitive. Here are the detail-oriented directions. Probably too little too late now, but this is directly from a webinar we did in March, which hopefully was a little bit better in terms of walking through this for people that wanted to get started.

And so if you are one of those agencies that's getting started with this and you're in that oddball position where you're manually entering this and kind of fumbling your way around, I don't know if it helps or not, but when you're manually entering COAAPs, this, from my two cents, is the number one, number two, and number three issue.

The most problematic, most confusing issue is that when you do this, you're basically going into the test list and creating a new test, only you're not creating a new test, you're creating a new additional assessment. Most of the time when you're entering something manually, you're entering a test; in this case, though, you're entering an additional assessment. So it's really, really important to go in and change the assessment from fixed form, which is the TE term for a pre- and post-test form, to the EL Civics additional assessment form. That's what allows you to switch it from a regular pre- or post-test form to an EL Civics form.

OK, we've got five minutes. 404. What's 404? I'm not sure what the 404 is. It sounds like the room number for the human resources department, doesn't it?

Anyway, so here is a preview of next week, 5/25. That's a week from now, when we'll get into goal setting. And on that 404, I've got to say, yup, it's one digit away from the LA parking lot.

What I'll just say is a lot of this session and next session relate to each other a lot more closely than this one does to the one we did a couple of weeks ago. We'll use a lot of this information related to CAEP Summary, CAEP Hours, and CAEP DIR to get more specifically into goal setting.

I'll make a note: we'll have an hours-related example. That's a good example. A couple of you have given me some other examples, and we'll try to do those too. But we'll basically look at the idea of goal setting.

We'll incorporate some of those NRS performance goals examples we've talked about in many other NRS and WIOA II workshops. And then we'll take a stab at sticking that chocolate bar into the jar of peanut butter, where we'll use that NRS performance goals concept, apply it to CAEP reporting, and give you CAEP sorts of results, the same sort of results we talk about a lot when we look at the data portal and NRS performance goals.

But instead of providing the kinds of examples that are great for NRS but not so great for CAEP, we'll use examples related to things like the CAEP Summary. We'll, of course, try to include a barriers example and an I-3 example as well, to make sure it relates well to what's going on with both workshops.

OK, I think that's it. Yeah, I'll just say I am finished. Any questions? Anybody with any shots you need to fire my way? You've been waiting all this time, and you've got a couple of minutes to do it.

I won't be able to get away; you can shoot me now if you want. Otherwise I'll turn it over to Manderley? Veronica? Hawley? Not sure which one of you is taking over.

Speaker 4: Thank you, Jay. And I don't see any questions in the chat-- oh, it looks like Neil has one. Can non-ESL students enroll in a COAAP?

Jay Wright: Yes, I did mention that, but I was talking about it along with a million other things. That's the difference between CAEP and California EL Civics. Most of you who are EL Civics already know: if they're not in ESL, forget it.

Those EL Civics COAAPs, and everything else in EL Civics, only count for learners enrolled in ESL. For CAEP reporting, that's a discussion that's been had several times at the state level, and everybody believes, come on, immigrants can be in all programs, not just ESL. So I-3 outcomes apply to all programs, not just ESL.

So likewise, if you're a non-EL Civics agency in CAEP and you want to get started on this, this might be an invitation. Similarly, if you've been doing EL Civics all this time and you've been itching for this to apply to ASE, or ABE, or Career Tech Ed, or whatever, here's your golden opportunity. Thank you, that's a good question to bring up.

Speaker 4: Hey, and I think those are all of the questions. If anyone has any last questions, definitely feel free to post them in the chat. And also in the chat, Madeley has posted a number of links, including the evaluation link.

We just transitioned to a new registration site, so the links are a little different; our apologies for any confusion. But the evaluation link is in the email, as well as links to our upcoming webinars. Now, WestEd originally had a webinar scheduled for tomorrow on CB 21. That webinar has been postponed.

And so you will be receiving communication about that. We will also send communication once the new date has been set, and we will automatically register anyone who had previously registered for tomorrow's webinar. So be on the lookout for additional information regarding the rescheduling of that particular webinar.

We also have one on Friday, which is another one of our CAEP Deeper Dive State Priority Webinars. This one is based on the state priority of program development, and the topic for Friday's webinar will be California Adult Education's role in California's economic recovery.

Our presenters are Jennie Mollica and Peter Simon. They have also invited panelists from The Care Consortium, The Glendale Learning Consortium, as well as The Sonoma Consortium. So you'll be able to hear about practitioners' experiences as they prepare for California's economic recovery through adult education.

And then we'll be back next Tuesday with Jay for the third part of his Data Dive series. Later this month, as was mentioned in the chat, there will be a webinar on member effectiveness and the processes and plans that will be implemented as of July 1. That will be with Neal Kelly and Dr. Caroline Saccery of the CAEP Office, as well as COSUS and CAEP TAP. Registration for that webinar will be released tomorrow with the newsletter, so you will be able to register at that time.

If you are not currently subscribed to the newsletter, we definitely invite you to subscribe, because that's where we send our communication regarding all things CAEP. That's kind of your first point of information, along with the California Adult Education website. So if you aren't subscribed, I would definitely invite you to subscribe so that you don't miss out on any pertinent information related to CAEP.

Tomorrow, you will receive an email with the evaluation link as well as the recording and a link to the PowerPoint presentation. Please use that for future reference, and share it with others who may not have been able to participate in today's webinar and would like to learn about this information as well. I'm checking the chat, and I don't see any additional questions. But again, if you have any questions, definitely feel free to contact TAP or even COSUS, and we'll be able to provide assistance.

So that is all I have for this afternoon. Thank you so much, Jay, as well as to the TAP team and all of our amazing attendees for today's webinar. Hope everyone has a great afternoon.