Blaire Wilson Toso: Hi. Thank you, Veronica. Welcome, everyone. We're delighted that you're here. We are doing the webinar, Why Do My Data Reports Look Different? Next slide, please, Jessica. So before I go further, I wanted to introduce the people who'll be presenting on this webinar. We're delighted to have Jay from CASAS join us. But I think I'll let Jessica say a few words and Jay introduce themselves, and then we'll turn it back to me. Jessica?

Jessica Chittaphong: Hi, Jessica Chittaphong here. I'm the Adult Education Pipeline Dashboard Manager for the LaunchBoard tool. I've been supporting the LaunchBoard for a few years now, and happy to present a little bit more about the data today.

Jay Wright: And hi. I'm Jay Wright from CASAS. Just eyeballing the group, I think I know most of you. I've done a lot of training for CASAS on this stuff pretty much since ground zero, I think. So good to see you all today.

Blaire Wilson Toso: Super. Thanks. I'm Blaire, and I work on the CAEP project at WestEd and have a long history in adult education. I started out as an ESL teacher and have taken that adult education route, which wanders and meanders, as many of you most likely have. And we're delighted you're here. And let's just jump into it, Jessica. Thank you.

So the goals for today: we get a lot of questions when we do our work, and I'm sure Jay gets them as well, about the difference between the AEP dashboard data and the CASAS TOPSpro Enterprise data. They don't always align, and people are curious about that. So we're hoping to talk about some of those key features, the different uses for the AEP dashboard and the CAEP TOPSpro Enterprise reports or summary reports, and to look at some of the metrics and how they map between TOPSpro Enterprise and the way they're shown on the Adult Education Pipeline Dashboard.

And we just want to say, we've been doing an onslaught of professional development recently, so if this isn't what you expected and you were hoping for something more like an introduction to the AEP dashboard, or what changed between 3.0 and 4.0, we do have other webinars that have been recorded, and they're on the CAEP website. So just go ahead and check those out as well. Next slide.

So one of the things we'd like to do is outline what's the same, what's different, and how the data comes together. We have three data sets: COMIS, TOPSpro Enterprise from CASAS, and the AEP dashboard. The institutions that report into COMIS are the noncredit college WIOA II programs as well as some noncredit college non WIOA II programs that still live in that noncredit realm. They report the data that will come into the dashboard.

TOPSpro Enterprise looks primarily at the WIOA II funded noncredit community colleges and the K-12 adult schools. They collect all that data, and then it gets put into the AEP dashboard. So what's represented on the AEP dashboard is all of those institutions together, which means that if you're just looking at WIOA II, or just at noncredit community colleges that are non WIOA, the numbers won't necessarily match what you're seeing in one or the other system. We all have coding resources that are available to you.

Also, the other difference, which informs the way you might want to use some of the data, is how regularly they're updated. COMIS gets updated each semester and yearly. TOPSpro Enterprise goes quarterly, and you get reports based on that. But the AEP dashboard is only updated yearly. So we always like to say that the AEP dashboard is not a real time tool, and that if you want to do more ongoing review of what your data is saying, COMIS and TOPSpro Enterprise are probably better places to look in more real time, because that data gets entered at different times throughout the year.

Similarly, data can be tracked on a semester basis for COMIS, on an ongoing basis within TOPSpro, and, again, yearly for the AEP dashboard. And in line with that is where you can verify and review your data. You get to take a look at your data in COMIS: you go into Data Mart and look through it, and on their reporting, you can verify it and then submit corrections.

Whereas with TOPSpro, you have a much more frequent opportunity to review your data and submit it on a quarterly and yearly basis. And with the AEP dashboard, you really only get to look at your data yearly, as I said, so that would be after the fact. It's a little bit more difficult to update your data on the dashboard, because updates should take place in the systems where you are directly reporting your data, which are COMIS and TOPSpro Enterprise.

Jay Wright: The one thing I could add here, if you don't mind, is just on our side, obviously you can update your data whenever you want, but we say quarterly for official purposes just because you do those quarterly data integrity reports. So from an official point of view, we would call it quarterly updates. But I'll back up and say if any of you are wondering, hey, wait a minute, I go in whenever I darn well please to update and review my data, continue to do so. And of course, you can, but quarterly is just more from an official reporting point of view.

Blaire Wilson Toso: Super. Thank you. Are there any questions before we move on?

Jessica Chittaphong: Blaire, I wanted to point out one thing as well for folks who are not familiar with COMIS. We're focusing on the noncredit community colleges, but it is the reporting system for all community college reporting, credit included. Just wanted to point that out.

Blaire Wilson Toso: Thanks, Jessica. See, with the three of us, you're going to get all the details. And likewise, I did want to say that if people have questions, feel free to raise your hand or to unmute and ask the question as well if you don't want to put it in chat. That's totally fine, too. I think the three of us are very informal as presenters, especially in this presentation. Next slide.

Jay Wright: Sticking with that mode, I'll just say one other thing: what may not make sense to everybody is that Participant Individual Record Layout, or PIRL. That's the document that the feds put out about five years ago when WIOA started. So there was a little bit of, hey, where do you find the baked goods in the supermarket? Hey, in the bakery section. That doesn't tell you much. Where's the bakery section?

So that's kind of what I was thinking when I said the WIOA II data dictionary. It's not really explanatory enough. So just so you know, the background behind that is what the feds call the Participant Individual Record Layout, which is their detailed definition of all the fields you're required to include for federal reporting and what options you should have for each one of those fields. So root cause wise, that's probably really where all these fields come from, just FYI.

Blaire Wilson Toso: Super. Thanks. And I see, Jessica, you're answering some questions in the chat about the data sets we track as well as the lag time for when data is posted in AEP after EOY. Jodi, that often depends on how the data gets cleaned up and the calculations are put in place. But usually, if EOY means June, it will typically happen in late February or early March. Jessica, can you confirm that for me?

Jessica Chittaphong: Yeah, so the dashboard usually publishes data on the previous academic year. So right now, in our latest build, we have the '19-'20 year. We get the files from both CASAS TOPSpro and the COMIS folks at the end of the calendar year, basically. So for '19-'20 data, we received that at the end of 2020. Certain metrics will lag behind by one more year, and we can talk a little bit about that further in the presentation.

Blaire Wilson Toso: Super. Thank you. Go ahead and move to the next slide. So some more of the key features: the adult learner population. Again, there's a little bit of a difference between COMIS and TOPSpro Enterprise, and the AEP dashboard works at combining them. So the whole adult learner population is represented, regardless of whether you're WIOA II funded or maybe just in the noncredit community college CAEP related courses.

It's not all noncredit community college participants; you have to be in one of the CAEP programs to be included. Each of us has a different way of verifying and validating data, which is there in the box. Many of them are automated, and we work through that to ensure that the data is as clean as possible when it comes in and that we aren't duplicating our results, and the AEP dashboard also verifies through a variety of measures.

That's one of the reasons, going back to the data sets we track, that we verify transitions through COMIS: to ensure that students who have transitioned to credit classes are actually enrolled in credit classes, as opposed to just a self report, or maybe data misentered by the person doing data entry at the community college. We track all of these regularly in COMIS and TOPSpro, whereas on the AEP dashboard we track them annually.

So then another part of the question is, what's the difference in how we want to use these different data sets? You can really use COMIS and TOPSpro Enterprise for ongoing continuous improvement efforts, for tracking outcomes, and for planning. And then when you go over to the AEP dashboard, you can use it for planning, evaluation, trends, outcomes, and equity. But we do not suggest you use it for ongoing continuous improvement unless it's built in at particular points in your continuous improvement plan, because it doesn't give you data that you can track regularly across the year. So that would be one of the ways that we would use that.

The data informs different reporting measures: COMIS and TOPSpro Enterprise, as you all probably know, inform CAEP and WIOA NRS reporting. And then the AEP dashboard informs the CAEP yearly legislative reports as well as the three year planning process.

Data display. In COMIS, you have great, robust charts. TOPSpro Enterprise provides a more robust list of ways to see your data: data charts, dashboards, and drill downs. As does the AEP dashboard: you can go onto the dashboard and see data charts, graphs, and drill downs, as well as new comparison levels among institutions. Jay, Jessica, anything I missed on that?

Jay Wright: I'll just add I think the way you have it arranged is good. But yes, we have the different data integrity reports for the different types of reporting. We obviously emphasize that quarterly report is a way to have agencies validate their own data and make improvements as each agency sees fit. So we have a DIR for CAEP and a DIR for NRS and a DIR for California Payment Points. I'll also just add in that middle section there is an automated process, whether you're scanning, whether you're importing, or whether you're doing manual entry. It's under the hood. But as soon as any data gets entered into TE, it's automatically validated. And my guess is that's what COMIS means by logic check. Just a different term for a similar process, I believe.

Blaire Wilson Toso: Yes, yes. Thanks, Jay. Yeah, those are all automated to support data verification and validation. Super. Thanks. Jessica, should we go to the next slide? I'm going to turn it over to you.

Jessica Chittaphong: Thanks. I was muted. How many times a week do you hear, you're muted, Jessica, right? So we wanted to talk a little bit more about how we do the verification and validation process once the data gets loaded onto the adult education pipeline dashboard. The dashboard relies heavily on student information, not only to do the data matching between the different data sources but also to ensure that we're capturing outcomes across the different data systems.

So we are talking a little bit about transitions and how to count them. I think Burr had posed that question in the chat. For transitions, we're really looking at student enrollments, seeing students start in one place and end up in the community college system, looking at specific course enrollments. That's how we track transitions, and that's how we can verify that they're actually happening within the community college and adult ed systems that we're working under.

For enrollments, we have a number of different student files that we look at across the data sets. That's a really important source for tracking where students are and in which particular year. One of the pros of having the dashboard is that we have historical information that we can pull in. Tracking students from one year to the next is really how we use the enrollment information, and we can track them there.

For earnings data, the adult education pipeline really relies on the state's match with the Employment Development Department's UI wage file. Now, there are limitations to that specific wage file in that it only counts students with social security numbers. But for those students, it's a very accurate match and can give us a lot of different information. And we're able to find students in both the CASAS data set and the COMIS data set. So if they're missing their SSN in one data set but have an SSN listed in the other, we're able to find that SSN and connect it to that student so that they show up in the earnings and employment metrics.

Now, talking about timelines, we've touched a little bit on the receipt of the data files and why we only get yearly data files. That's historically been so that we can ensure the completeness of the student information, at least on the COMIS side of things, the community college side. The schedule for updating your student files is different from the schedule for updating your award files. So we wait for the full year to pass so that we get not only the student information but also whether or not they were able to achieve an award for that particular year. We wait for all that to trickle in, and that's why we have that one year time lag.

Time lags also appear in a couple of different metrics. So let's go back to the employment and earnings metrics. We only really want to track employment and earnings for students who have left the system. They're no longer enrolled; they're no longer getting services from our institutions or from the community college institutions. So they're fully in the workforce, and that's when we would want to track them. And to be able to do that, we actually wait a year to see if they show up in either of the two systems. If they don't, then we count them in the employment and earnings metrics.

So that was a lot of information. So I see a lot of questions pop up. Blaire, do you want to highlight a couple for me?

Blaire Wilson Toso: Yeah. There's one. Kelly's asking if it's showing up in the noncredit LaunchBoard, and is it the employment placement match with the EDD? And so yes with caveats, right?

Jessica Chittaphong: Yeah, so the AEP dashboard is only counting the noncredit enrollment, noncredit students who are in a CAEP program. And yes, we are using the EDD matching.

Jay Wright: What I'll try to do here is there's this other question that I kind of feel like it might need clarification. But Sharon asked about adult core indicators. I'm thinking maybe you're asking about core performance, but I'm way less than 100% sure. I'm going to pretend, though, that I am sure and say, yeah, that's your question and just say, yeah, we have been doing that employment and earnings survey in TE starting in the third quarter.

It's been on a total pilot basis this year, but next year we'll use that same employment and earnings survey for CAEP, like what we've been doing for WIOA II the past couple of years. We had a training in March that covered some of the details. So I know most of you have been doing it. I know on the college side, there's a survey that you do. So my point is, one, Sharon, we have been implementing employment and earnings on TE for CAEP.

And then bigger picture, we're doing that on the K-12 side. And I don't know a lot about it, but I know there's this employment survey they do on the college side. So the idea is to use multiple methods to get employment data, sort of like what the Neil and Jay show has said forever. We also have the self reported employment outcomes. We know that's not always terra firma, but we're using data match, we're using self report, we're using multiple kinds of surveys to try to build it at several different locations to get employment information, knowing that if we put our eggs in any one basket, it'll be very incomplete.

Jessica Chittaphong: Thanks, Jay. Totally support that. And yes, we've actually mapped out some of the metrics that we use with the EDD file with a couple of the metrics that are showing up on your CAEP summary report, which we'll be talking about a little bit later. We are talking about how the noncredit version of the CTOS survey can get incorporated and how that can add to the information with the TE survey as well. So that's definitely going to be up for discussion probably this summer. And we'll see what we can make sense of in the next build of the adult education pipeline.

Blaire Wilson Toso: I just want to add to that, Jessica, that as we talk about metrics and incorporating survey data into the AEP dashboard, it's not always an immediate process, because we, meaning WestEd, who oversees the AEP dashboard, don't necessarily have control over that process. It's informed by members in the field; there are work groups that inform metrics. And at the chancellor's office, they approve any new metrics or any new ways that we're going to ingest data into the adult education pipeline. So we're really hopeful that it's going to push forward, but I just didn't want anyone to think it's immediate. It might not be ready for next year, because there are these processes that we have to follow for the adult education pipeline.

So we're just going to move quickly through a couple of slides. Some of the big questions we get are: how are metrics identified? How are they calculated, added, changed? As I just mentioned, most of the time our metrics are identified by work groups and input from the field, discussion with our collaborators at CASAS, COMIS, and the chancellor's office, and also the researchers at community colleges. So there's a broad group of people who inform our process. And then, as I said, we make recommendations, enter into discussions, and then they go through a formal approval process. So it's not something we do willy nilly; even when we know it's a good idea, it's a guided and informed process.

And then we update our metrics based on input from the field and on items we identify that we think need to be refined to better reflect the data that's being ingested into AEP and then shown for you all to use. So we look at ways to allow for adjustment so that the metrics align with needs and outcomes, reflecting new policy (that's sometimes how a metric gets added) or new information or added detail from different people. Like the surveys: we're working on seeing how we can incorporate additional data to make it a really robust, comprehensive data set. Next slide.

So one of the topics we were asked about is, how are the metrics identified? What do we know, and where is it coming from? In the metric definition dictionary, which is linked from our AEP dashboard, we identify, for example, the TOPSpro data source elements. These are what we get, and that's how we identify and match across the COMIS and TOPSpro data sources. Those elements inform our calculations, so let's go to the calculations.

And in there, it will tell you exactly how these are calculated. We use the calculations to identify comparable outcomes in both data sets, so that both the TE data and the COMIS data are represented and institutional differences in programming are accounted for. In other words, for different measures, we have to look at how they compare and meld them. If you're ever curious about how we do that, it's in that metric definition dictionary under the TOPSpro calculations. And just to be clear, there is a COMIS section as well that clearly states how those work.

Super. And I think I'm going to hand it over to Jessica. She's going to talk really deeply about the metrics. And Jay will also participate as we talk about how we match the different data sets.

Jessica Chittaphong: Thank you, Blaire. Yes. So we're definitely going to be focused a lot more on how we've interpreted the CASAS data export for the CAEP program into the adult education pipeline. Just wanted to remind everyone that we get essentially a special export from CASAS TOPSpro Enterprise. Jay and Debalina help support that as well.

And some of the data elements, and you might have noticed some of the naming in the metric definition dictionary that Blaire just walked through, might not be familiar to you. That's because that's just how the data element is presented in this particular export that's provided for us to support the CAEP program. Some of the data elements are pulled directly from the update records or the entry records, but some of them are pre-calculated for us by the CASAS team. So I just wanted to point that out.

And the definitions that we'll be going over in a second are specifically the TOPSpro Enterprise calculations. But please remember that these are often layered with any outcomes that the student may have achieved in the COMIS data sets, so in your partner community college or elsewhere. We're highlighting the TOPSpro Enterprise data in this presentation, but any difference in numbers may be related to the student achieving the outcome in COMIS as well.

So let's get started. So the first definition that we wanted to go over is really how do we identify students in programs from the CASAS export? And for those of you who are familiar, we're really looking at the instructional program. And so those listed on the left are what have been identified as the California Adult Education programs. And we get those as part of the student record that comes over.

The high school diploma and the high school equivalency programs are grouped together, and that's how we identify adult secondary education as its own program. And so we're really using a lot of information. We look at the instructional program first, and then we look at how many hours of instruction the student received in a particular program year. In the export that we receive, hours are reported to us as a total per program for that student in the year.

If the student has 12 or more hours, they're identified as a participant, meaning they met that specific 12 hour threshold. The participant definition is very important, because it is used as the main denominator for most of the outcome metrics that will be displayed later on. So we really only want to track outcomes for students who met that 12 hour threshold. If the student had one or more hours, or they were flagged as having any type of service, then they are counted as a reportable individual, or adult served.

This is a broader category of students, because we do want to measure how many students we're able to serve, with or without them engaging in 12 hours of instruction. The reportable individual metric is used as the denominator for the barriers to employment metric; that's where it shows up in the dashboard. So I'm going to pause here and make sure that--

Jay Wright: One thing I'll just add real quick is I think you're just trying to talk about how the bundle is set up from TE to LaunchBoard. But this is a good example, and I'll just say this one hour versus 12 or more hours we've talked a lot more about here in recent months. And I think it will be part of a new NOVA. So just so you know, there is a new report coming up that will provide more clarity and distinction between those with one or more hour versus those that are participants in 12 or more hours. I do think this distinction is something we're going to be concerned with a lot more over the next year.

Jessica Chittaphong: Thanks, Jay. Yeah, and just to answer Jodi's question about the hours of service: we are not collecting hours of service. This is specifically hours of instruction. The service is just a check mark for whether they got the service or not in that particular year.

OK, great question, Kelly. We have a question about COVID concerns and how that's impacted positive attendance hours on the community college side. On the community college side, we track the hours of instruction using positive attendance hours. And for those of you who know, positive attendance hours are generally in person, which has been very disrupted by COVID shutting things down and the switch to online instruction. For this past build, since we only have '19-'20 data, we were able to capture the fall data pretty well, but the spring data is essentially what is messing us up on the COMIS side of things.

So the way we are addressing that for this build is that we actually got rid of our thresholds for spring 2020. So you'll be able to be counted as a participant even if you don't meet the 12 hours for spring 2020. Now, there are larger conversations above my head about how we should approach this for 2021 and tracking that on the dashboard.

I think the state is engaging with some non-credit field practitioners to come up with some solutions and some guidance for that. So definitely keep that in mind. But it is something that we've tried to alleviate for this particular build as well. So hopefully that answered your question. All right.

Blaire Wilson Toso: I think you've got all the questions, Jessica.

Jay Wright: Yeah, I'll just say Neil added a few comments from a state perspective.

Jessica Chittaphong: Yeah. More research needed. All right. So going forward, these are what we identify as progress metrics. And I've laid out a few ways that we've dealt with how we use the CASAS data for tracking completion of one or more educational functioning levels, completing the workforce preparation milestone, and completing the occupational skills gain.

So for the educational functioning levels, we're really limiting our student universe to participants. Now, if you remember, participants are students with 12 or more instructional hours. And these participants need to be from the ABE, ESL, or ASE program areas. And they have to have a qualifying pre-test and the other requirements needed for the NRS table. That's something that CASAS pre-packages for us; there's a specific flag that flags that for us.

And then they're flagged as having completed the level. Again, this is calculated from the pre-test and a post-test. On our side, we just get a one or a zero, essentially saying whether or not the student was able to complete a level for that particular program year. So for the educational functioning levels, there are some limits to the student universe, but they come to us essentially in a prepackaged way. We do look at specific program areas for that.

Jay Wright: You mind if I add something here? I'll just say, with apologies to Cox Communications, we've kind of got it to where we put it together in what you might call quote unquote "bundles," and I think that works better. This example is the one I like to talk about the most, because it deals with NRS. There's obviously lots and lots of NRS minutiae that goes into all of those cells in table four. So instead of trying to put it all together in an export, we sort of pre-package it and call it completed level, so that all the NRS minutiae is built into the package before we pass it over, and that seems to work better.

I'll just add, we've had to boost this up a little bit; we've had several meetings since the beginning of 2021. And for my two cents, there's been a bit of a shift in how we transfer the data between TE and LaunchBoard: a little less minutiae and a little more focus on bundles. That is, we package it in bundles for things like the CAEP summary. So we just pass that as a bundle to somebody like Jessica, and that's a lot easier to put into LaunchBoard, I think. And it also makes sure that those little inconsistencies that people have mentioned will be minimized.

Jessica Chittaphong: Yeah. Thanks, Jay. So are there any questions about the ESL one? Because I know that's been a point of contention sometimes. Doesn't look like there's any right now.

So for the workforce milestones and occupational skills gains, we're just looking at all participants, no matter what CAEP program they're in. And we're looking at specific learner results corresponding to workforce preparation and occupational skills gains. These should align with a lot of the presentations that Jay's done, in terms of the bubble charts, which have definitely been super helpful for me.

OK. So talking a little bit more about: how does MIS show gains? Great question, Maureen. There are two ways that MIS shows gains. There's the SA07, the student assessment element, which is similar to the pre and post-test. But SA07 is going to be rebranded to something called AAO3 (Blaire, you can correct me on the number), which is going to be an adult ed specific file. But essentially, it's going to do the same thing. So that's how they track assessments on the MIS side. There's also, of course, progression, which we track using an element called CB21.

Now, a few years ago they aligned the CB21 levels with the federal educational functioning levels. And so we look at the course enrollment from last year to this year. Yeah, Doreen, sorry, it's not meant to be an MIS specific presentation, but since the questions were asked: CB21 is the course element related to how far below transfer level a course is. And with the passing of AB 705 on the community college side of things, there was a lot of effort to make sure that the CB21 levels are aligned with the educational functioning levels at the federal level for math, English, as well as ESL.

And so what we look for is course enrollment year over year. If you're starting at one CB21 level this year and in the next year you're enrolled at a higher CB21 level, that's what we consider a course progression, and that would be counted as an EFL gain. And yes, thank you, Neil. We do have a recorded webinar specifically for COMIS coding. So that should be a great resource.

Jay Wright: I wanted to jump in and answer Jodi's question, because I think what I said might have thrown her off and made her ask this question. Yes, we do include CTE in the CAEP reports for pre and post-testing. For NRS, obviously, that's a big time no way. But what we do is use NRS logic to place CTE. Making a long story short, they generally show up on the ABE side of those CAEP tables, in this case CAEP table four. So know that if you're testing CTE for the reasons you mentioned, yes, that should be captured. It's captured in that TE export, that nifty little bundle I mentioned. We're using NRS logic, mind you, but on the CAEP reporting side, yes, it does include those non WIOA programs if you happen to be testing those students.

Jessica Chittaphong: Thanks for the clarification, Jay. All right. I don't think there's any other questions. Blaire, please stop me if I missed any.

So let's jump into transitions. For transitions specifically to ASE programs, we're looking at students moving from ABE and ESL into ASE. And as I mentioned before, we look at enrollment in one of these two instructional programs, high school diploma and high school equivalency. Again, the benefit of having past years' data is that we can track specifically first time transitions: students who transition for the first time in that year. That's what we're counting here.

For a transition to post-secondary, the student universe is meant to match what we're tracking for the ESL gains. So we're looking at participants in specifically ABE, ESL, and ASE who get flagged as having a transition to post-secondary. Again, this is one of those packaged data elements that we get from CASAS. And from my understanding, it's a mix of learner results as well, learner results from the work and the education buckets.

Jay Wright: Can I clarify something here? Because this is actually a long awaited slide for some. I don't know if people realize it. But when we talk about TE CAEP outcomes and we talk about transition, and I know you've seen them whether you'd admit it or not, those little diagrams that have the lines going from the blue boxes to the red boxes and all that. There's one that we always list that we always sweep under the carpet that people like me say is always quote unquote "under the hood."

There are no checkboxes for you to mark this. So Jessica's giving a good synopsis here of those transition outcomes that require no checkboxes. This is the under the hood part. It's not really something that you see in a TE report. But in these exports that we send to LaunchBoard, we have some of these bundled groups that will show who's transitioned to ASE and who's transitioned to post-secondary. So I'll just say, for some of you who are trying to keep us in check, this is the slide you've been waiting five years for, and now you finally have it. This explains what that under the hood answer that I always give really means.

Jessica Chittaphong: Yeah. Thanks, Jay.

Blaire Wilson Toso: Jay, you got affirmation that people have seen them.

[laughs]

Jessica Chittaphong: Yeah. Nice. So we have the transitioned to post-secondary flag, which Jay just explained a little bit more. And transition to post-secondary also includes enrollment in a CTE program. So the post-secondary transition includes all of that as well as enrollment in a CTE program, and that's something we track on both the CASAS side and the COMIS side. Again, we're looking at first time transitions specifically.

For the transition metrics, I do want to say that on the adult education pipeline, we allow for students to transition in the following year, and that's why the transition metrics lag by one year. So right now the dashboard is supposed to have '19-'20 data, but the transition metrics only show up through '18-'19. That's because we give students this year or the next year to transition, which gives them a little bit more runway so that we're able to capture this outcome more accurately.

So the next slide is really just breaking the post-secondary transition up into transition to CTE and transition to credit college specifically. These two metrics are probably not new to CASAS folks using the CASAS reports, but they are two new metrics that will be displayed in the adult education pipeline. Transition to CTE looks at enrollment in a CTE program or those specific learner results.

For transition to college on the CASAS side of things, we're looking at the learner result transitioned to credit specifically. But again, we're relying a lot on the data match to be able to track students that go into a community college credit course. Looks like there are no questions?

Blaire Wilson Toso: No questions, Jessica.

Jessica Chittaphong: Thank you. So this is where we're going to get into the nitty gritty of the CAEP summary report and my interpretation of which metric matches best with which column in the summary report. Hopefully this will help clarify some things. And I want to give a disclaimer. I'm sorry, we're data people, not designers, so these next couple of slides may give you a bit of a headache. But hopefully this will still be useful information.

Blaire Wilson Toso: Hey Jessica, Emma just did ask about transition to college. Is that data entered by the adult school or the college?

Jessica Chittaphong: Yeah, so if we're looking at the CASAS data exports for the learner results, I believe that it's entered by the adult school.

Jay Wright: Sorry, I answered that, but I answered it to Emma by mistake instead of everybody.

Jessica Chittaphong: Thanks, Jay. So yeah, the learner results are-- I think Jay probably answered your question. But we also make sure that we're tracking the enrollment in the community colleges, and that would be college data specifically; we're looking at the enrollments in credit colleges. And we only count the transition once: if a student transitions in both places, they only get the one transition. So hopefully that's clear. Clear as mud.

Blaire Wilson Toso: So yeah, and there's also a question: why not report the transition data the first year and then update it the second year? And also, to Burr's point, yes, they are self reported, which is one of the reasons why we do the match for verification.

Jessica Chittaphong: Yes. And we realize there are a couple of pros and cons, Steve, with displaying partial data. Because the partial data would have to get updated the following year, we felt that might cause more questions than it would be helpful. So that's why we went with the plan of just waiting for the second year to pass before we publish the data.

Blaire Wilson Toso: Thanks. I think that that's it.

Jessica Chittaphong: All right. So for the CAEP summary data and AEP data: in the CAEP summary reports, you're able to see all of the outcomes by program area. This is just a breakdown of how we've aligned the program areas on the adult education pipeline for your outcome metrics. You can drill down by the four main program areas, which are ESL, ABE, ASE, and CTE. But it's not available for every metric, I would say.

For the program enrollment metrics, we're only counting students if they're a participant with 12 or more hours. So the numbers that show up for the program areas in the adult education pipeline are based on the number of participants in that particular program area, not all reportable individuals. That's one clear difference. Another difference: for participants enrolled in workforce reentry (or workforce prep, I think it might have been renamed) or pre-apprenticeship, we're taking that number out of the participants enrolled in CTE. So again, not out of the full student universe, but specifically out of the CTE program area.

Blaire Wilson Toso: Jessica, there is a question about state match of data. Emma would like to know, how would we have access to that data at the practitioner level? Are you talking about-- yeah.

Jessica Chittaphong: So I'm looking at how would we-- state match data. Are you asking for student level data so that you can look up specific students? If that interpretation of the question was right, I think my answer would be for the LaunchBoard, it's a public-- [reading chat comments] the state level data match. Between CASAS and COMIS data sets? Neil referred to it. Sorry Neil. What did you say?

Blaire Wilson Toso: While you respond to that, I get to answer the easy questions here. Jessica, on that one, as far as the difference between the two rows: the upper row is CASAS data, the TE data that is reported into TE and how it shows up on your summary report. And the bottom row, the bluer one, is how it actually gets represented on the adult education pipeline.

So if you're looking at the number that you report into CASAS and how it shows up on your CAEP summary report, that will be in that upper row, where you've got 1,534 enrolled in ESL. In AEP, after the matches and the deduplication, the number that shows up enrolled in ESL is that second row. And this is really to demonstrate how things get reported and how they end up showing up on the dashboard.

Jessica Chittaphong: Thanks for explaining the table, Blaire. I forgot to go over that. Yes. I'm sorry, I'm still not understanding the state data match question. For the TE COMIS data match, I'm not sure we can share that particular information. At the state level, we can definitely provide counts that come from COMIS and counts that come from TOPSpro Enterprise separately. But again, I'm sorry if I'm still not sure what the question is.

Oh, you're asking about transition data specifically: who ended up going from one place to the other. Yeah. I'll just say that the AEP dashboard is meant to be a public dashboard, and that's why not all information is readily available. If student level data specifically for transitions is required, I think that's something that needs to be worked out with the state, and that's not something we can provide from the LaunchBoard. It needs to be something that the chancellor's office or the CDE requests.

For the other dashboards, not the adult education pipeline, we have been able to provide student level data to specific colleges through a service called Data on Demand, where we work with the chancellor's office MIS department directly to give that information to them, and then they facilitate the distribution to the specific colleges. So hopefully that process is clear.

OK. So again, sometimes on the dashboard you'll see some stars show up, and you'll see the stars show up on this table as well. That's because the dashboard is public data, and anyone can access it. So to protect students, we suppress any cells that are less than 10. If there are fewer than 10 students in that particular outcome or in that particular drill down category, they may not appear on the dashboard. That's because of federal laws that protect student information and limit what can be publicly displayed. So if you see a little star there, that's probably why. And it does affect smaller institutions more, unfortunately, but there's really no way around that particular rule.

We are running short on time, so these slides hopefully can be a great reference for you. Again, this is a little bit more focused on the adult education pipeline, so it explains what the differences are. We talked a lot about EFLs already. For the outcomes, we've matched each row you'll see here with the metric that's really connected to that particular column. Sometimes they'll be blank, because we don't report everything from CASAS into the adult education pipeline.

And I would love to go through this in detail, but I think we're short of time. So I don't really want to take up the remainder of the time to do this. So Blaire, would you like to summarize and close us out and make sure that we've answered all the questions?

Blaire Wilson Toso: Sure, thank you. I think we have-- and I'm just scrolling back up. Yes. So in summary, there are those notes that Jessica was going through (had we had more time; we'll remember an hour and a half next time). But in summary, the differences often come down to who's included in the two different data sets. It's really important to remember that we are combining two data sets, and neither data set belongs to the adult education pipeline, which also makes it a little difficult to trace across some of those pieces that you were asking about, Emma. And also because it's a public data set that anybody can access.

So the differences are who's included and how it's reported. For the data sources, we're matching different data sources, making matches between different metrics, and trying to make them equivalent. The way we define those is sometimes different between COMIS and TOPSpro, so again, it's a question of how we make sense of them and make them comparable between two systems that don't necessarily report the same way or use exactly the same nuances. The reporting structures are different between COMIS, the community colleges, and TOPSpro, and then AEP again has different reporting structures. So they're going to look different.

And also the oversight of them is different. AEP has different requirements on how we match and report data, as Jessica alluded to earlier. Similarly with our validation and deduplication: some of that self report data doesn't get into the AEP, because we have conditions under which we have to meet a match. For example, matching that enrollment into the community college credit course, or the completion of a course, or the social security number match.

Those are all ways that we at the adult education pipeline have to validate the data. And there are different systems for reporting and for that validation occurring in CASAS, in COMIS, and once again in AEP. And one of the larger differences, which you've already noted and we've discussed, is that while we're working on it, there is no survey data included in the AEP as of yet. Stay tuned. We're hoping that will happen. Next slide.

Right at 2 o'clock. Go back and review the AEP purposes, which is one of the questions that we always get, and explore how you want to use it. We do not say that it's a singular, all purpose tool, and that's why we want these differences to be really clear, so that you can use each data set as appropriate for your own planning and continuous improvement.

So it's 2 o'clock. We're going to move right to thank you. Our emails are there; please do not hesitate to email us any additional questions, or reach out if you would like a deeper walkthrough or explanation. We do walk through people's CAEP summary reports and compare them one on one, by institution, or by consortium. So reach out to us if you'd like additional information. And I'd like to take a moment to thank Jay for joining us and really helping us review the PowerPoint, our data, our points, and the way we're making sense of the CASAS to AEP mapping.

Jay Wright: Thank you.

Blaire Wilson Toso: Yes, Kelly, I reiterate what you say. So much data, so little time. Thank you all for your questions and for thinking about data. We really appreciate that.

Veronica Parker: All right. Thank you Jessica, Blaire, and Jay for joining us this afternoon, and thank you all for participating in today's webinar. In the chat, we have posted several links, one being the evaluation link. Please take just a couple of minutes to complete that evaluation, especially if there are topics that you would like WestEd to explore. They are in the planning stages of their next PD for the fall. So if there are things that you feel are still left uncovered, let's take the opportunity to address them in the evaluation so we can get those responses over to them.

We also posted a URL for the student data reporting section on the Cal Adult Ed website. That's where we will post the recording for this webinar as well as the PowerPoint presentation. We will also send a follow up email tomorrow that will include the recording, PowerPoint, as well as the evaluation. So if you know of people who were unable to join or you would like to utilize those resources for future reference, they will be available to you.

WestEd will be back with us I believe it's next Tuesday or Wednesday. Next Wednesday, I believe. But you'll receive notification. If you haven't registered, please be sure to register. We have posted the URL of our registration page as well as a link to register for their next webinar. CASAS will also be joining us. Jay in particular will be joining us next Tuesday for part two of his data dive series. So again, if you haven't registered for that, please be sure to do so.

So thank you all very much for your time, your participation, and we will see you next time. Have a great afternoon.

Jay Wright: Thank you.