Blaire Toso: Good afternoon, everyone. We are one minute into the afternoon on Friday. So thank you very much for joining us. I know that many of us, if we had the option, might be out enjoying the sunshine or going out for a coffee with our friends.

But we're instead going to talk about something we think is really exciting: the AEP 5.0 dashboard has been released with updated data. So welcome. Jessica, who is our AEP guru, will be walking through the dashboard and noting some of the changes and differences that have occurred.

And we'd like to hold questions-- I know everybody has lots of curiosities and questions about this and how it's going to affect your data and maybe what you're seeing. If we can hold those to the end, we have plenty of time built in for questions. And we'd appreciate that. Thank you. Next slide, please.

So, who is here today? The WestEd team that works on the CAEP dashboard build and the professional development associated with it. That's myself, and then there's Jessica Keach and Ayanna Smith. And you'll hear from each of us today. Next slide, please.

We're delighted that Mayra Diaz and her colleague, Lindsay Williams, are here. And I'd just like to pass it over to them.

Mayra Diaz: Hi. Good afternoon, everyone. Just want to welcome you all to today's webinar presentation. My name is Mayra Diaz. I'm the CAEP program lead with the Chancellor's Office, and my colleague Lindsay Williams and I work collaboratively to support the statewide adult education programs. In today's presentation, you will get to hear about the Adult Education Pipeline, which has now gone live. So I will pass it over to Blaire and Jessica. Thank you.

Blaire Toso: I think it's Ayanna. You're up on deck.

Ayanna Smith: Yes. So just a quick overview of today's objectives. The goals today are to review the changes implemented on the adult education pipeline. Specifically, we're going to highlight some new metrics and features, key metric changes, as well as improved user experience features.

Blaire Toso: Thank you. So we're just going to level set. I think almost everyone on here will know this, but we always like to go back and review: what is the purpose of the adult education pipeline? And it's really there for improving education and informing practice, with the ultimate goal of economic mobility for adult learners.

And that is along an educational or a career or work pathway. But in order to do this, we look at the learner journey and the program data across the multiple years, as well as the multiple audiences that might use the dashboard. And for us, we really are looking at administrators, consortiums, program administrators, instructors, and legislators.

But we also know that other people will take a look at this because the data is open and available to researchers and the general public, which is really a wonderful gift that helps people understand what adult education is all about. And as we all know, we're in the throes of three-year planning, and the dashboard has been used a lot for program planning, as well as to track progress.

Not just on your annual plans and your updates. It's also about what's going on in your program. So we think those are all really important. But we also think that one of the best things the adult education pipeline data does is to prompt questions, and then answer those questions in one way or another.

But really, we believe it's there to help you explore your data and identify things that make you go, hmm, or aha. And then go from there. And also, we use the dashboard to help people identify where you might be struggling, where you might need some technical assistance, as well as for us to reflect on our own procedures. Next slide, please.

The highlights, which Jessica is going to go into further, are those pieces that combine the data to offer this holistic consortium view, as well as regional, institutional (college and district), and statewide views. Right? There are all these different views, and so we can look at the data in different ways.

And we can look at those as they are aligned to the student journey, which includes everything from the entry, the progress, transitions, completion of credentials, and employment. And we've been asked before does a learner have to meet all of those spots on that transition? Do they have to make the full journey in order to be identified and reported on the dashboard?

And the answer is no. As soon as you begin to enter data in MIS or TOPSpro and it gets submitted to us, we will register them, regardless of how short or long that journey is. It ties together multiple data sets. It's very unique in that way, where we're pulling MIS data and K-12 TOPSpro data-- TOPSpro data actually comes from anyone who's WIOA II funded, or who is choosing to use it.

And then also, that EDD wage data. We're really proud of it, and people are always wondering about the complexity of it and what it does. Well, it's really the only complete source of college noncredit and K-12 adult education student data and outcomes, represented holistically.

It is that data source for all of the different institutions and consortia. And we do like to point out that this is not real-time data. That when people are asking us for their data in the instant, this is not the source for that. You will want to look at some of your own reports or other reporting sources. This is really a snapshot of where you have been across the years and where you might want to think about going. Next slide, please.

As you know, the dashboard is refreshed each year to add another year of data. And as I said, this is once yearly. And again, to reiterate, it provides an over-time view of the data, as opposed to a real-time view. And the yearly update process allows for full data sets from CASAS and MIS to be populated.

And it accommodates the different reporting structures. For example, the community colleges submit their data, but not to us. They get the opportunity to review it, and then the data gets uploaded and passed along to WestEd and ERP to populate the dashboard. Next slide, please.

And so why do we do this? It's not only to add more data; it's also that we have to realign to the other suite of dashboards on the LaunchBoard. We also receive a lot of feedback from the field. Maybe we've identified a coding error. And then also, when we look at our definitions-- and this is truly about that feedback from the field, or conversations with our partners and stakeholders-- sometimes something has changed, or we need to make a definition more precise.

So we are not just continuously updating the data. We are continuously reviewing the work-- based on feedback from the field, from stakeholders, and from our own processes as systems change-- and updating the dashboard to reflect that. And now, I know, I get the lead-in part and people can kind of sleep through this part.

[laughter]

You know, I'm going to hand it over to Jessica, who will really talk about the exciting new features and views of the dashboard.

Jessica Keach: Great. Thank you, Blaire. I'm going to start by walking us through some of the new features in AEP 5.0-- that's what we refer to it as, the new version of the dashboard that is live right now. So if you go to the next slide: our top five institution charts.

So previously, you were able to view the top five institutions based on the percent of students who were meeting that outcome. Users are now going to have the ability to toggle between number and percent of learners for the top five institution charts. This is going to help you identify institutions that you may want to reach out to or collaborate with to emulate promising practices. Next slide, please.

OK. So the Program Type Drilldown, this is a feature you're already likely familiar with, but we've added this option to additional metrics. So as you've seen, this feature allows you to disaggregate by ABE, ASE, ESL, and CTE, depending on the metric. Next slide, please.

OK. So now I'm going to walk us through some key metric changes that you can expect to see in AEP 5.0. So the first one I want to talk about is reportable individuals. So in order to align reportable individuals with the California Adult Education Program, the one-hour threshold used in the calculation has been restricted to one hour in a CAEP program area course.

So you can see the highlighted changes in the definition at the bottom of the screen appear in blue. So I'm going to go ahead and just read that definition aloud. So "the number of learners who have one or more hours of instruction or positive attendance hours, across all enrollments in an Adult Education Program, and/or who receive services at a K-12 Adult School or non-credit services at a community college."

So previously, any non-credit enrollment was included, excluding supervised tutoring and study skills TOP codes. And additionally, that asterisk you can see with non-credit services: previously, credit-only students who had received non-credit services were being included in this metric. And that will no longer be the case.

So if you go to the next slide, we can talk about what this looks like, and what this change means in the data. So you're going to see a more accurate reflection of noncredit community college CAEP students. It's going to result in a reduction of reportable individuals for noncredit community colleges, across institutions and across years.

Reportable individuals will also count all learners who receive some kind of service provided through an adult education program, except students that only had credit enrollments. So this results in a reduction of reportable individuals for specific noncredit community college institutions that previously observed inflated counts. And finally, you're going to see a reduction in the number of students with barriers to employment across metrics, because reportable individuals is the denominator for those metrics.
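
To make the rule concrete, here is a minimal sketch of the reportable-individuals logic as described above. The field names (hours_by_program, received_services, and the enrollment flags) are hypothetical illustrations, not the actual AEP code or schema:

```python
# Rough sketch of the AEP 5.0 reportable-individuals rule described above.
# All field names here are hypothetical, not the real AEP data schema.

CAEP_PROGRAM_AREAS = {"ABE", "ASE", "ESL", "CTE"}

def is_reportable(student: dict) -> bool:
    """One or more instructional hours in a CAEP program-area course, or
    receipt of services, counts -- but credit-only college students do not."""
    # AEP 5.0 excludes students who only had credit enrollments.
    if student["has_credit_enrollment"] and not student["has_noncredit_enrollment"]:
        return False
    # The one-hour threshold is now restricted to CAEP program-area courses.
    caep_hours = sum(
        hours
        for area, hours in student["hours_by_program"].items()
        if area in CAEP_PROGRAM_AREAS
    )
    return caep_hours >= 1 or student["received_services"]
```

The sketch captures both changes at once: the narrowed hour threshold and the exclusion of credit-only students, which together produce the reduced counts described above.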

So we're all aware that COVID has had an impact on many aspects of our lives. And data and reporting are no exception. So I want to acknowledge that the Chancellor's Office recognizes the continued limitation that the noncredit community has faced in reporting student attendance hours in SX05 for noncredit distance education courses. Last year's build of the adult education pipeline-- that was version 4.1-- included one term of COVID-impacted data.

The Chancellor's Office consulted with the noncredit field and determined that the benefit of maintaining the hour-threshold requirement for noncredit students during COVID-impacted terms did not outweigh the cost of excluding those students due to their institution's inability to report student attendance hours. So therefore, during spring 2020, only an enrollment record (versus reported hours) was required for noncredit community college students to be counted in metrics that require an hour threshold. This year's dashboard build carries over this decision for the 2020-21 year of data.
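
A minimal sketch of that coding exception, with hypothetical record fields and term labels (an illustration of the rule as described, not the actual dashboard code):

```python
# Terms where the noncredit hour-threshold exception applies, per the webinar:
# spring 2020 (from AEP 4.1) and the 2020-21 year (carried over into 5.0).
COVID_EXCEPTION_TERMS = {"2020-spring", "2020-21"}

def meets_hour_threshold(record: dict, threshold: int = 12) -> bool:
    """Apply an hour threshold (e.g. 12 hours for participants), waiving it
    for noncredit community college students in COVID-impacted terms."""
    if record["is_noncredit_cc"] and record["term"] in COVID_EXCEPTION_TERMS:
        # Only an enrollment record is required, not reported hours.
        return record["enrolled"]
    return record["hours"] >= threshold
```

Note that the waiver is scoped by both institution type and term, which is why, as discussed below, K-12 data and non-COVID years are unaffected.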

So you can see the impacted metrics in the blue box on the right. And the most important ones, being the ones that you're probably most familiar with, are reportable individuals and participants.

So what does this mean in the data? You're going to see a specific impact to noncredit community college student data. It's a consistent approach: it was the same approach used in AEP 4.1 for spring 2020, and it treats all colleges the same. It also aligns metric definitions between dashboards across the LaunchBoard. So this is the same strategy, or solution, that was applied in SSM to count noncredit students.

The way that the coding exception manifests itself at the institutional level does vary by institution. So for example, some institutions show expected declines in participant counts compared to pre-COVID years, whereas other institutions may show some unlikely increases. It's really institution dependent how this exception manifests itself in your data.

Additionally, the coding exception means that the participants metric, AE 202, may include students that did not complete the 12 or more hours for noncredit community colleges. This allows a broader universe of students to be able to obtain outcome metrics. So outcome counts are going to be available for all institutions. However, because there's a broader universe of students, you may observe lower rates of outcome attainment.

So this is a look at how we're flagging this on the Dashboard. So you're going to see in notes on metrics where the coding exception has been applied. So for example, Reportable Individuals has this big note underneath that talks about everything I just explained.

And then any metric that uses participants as a denominator-- which is all of the outcome metrics in the Dashboard-- you'll be able to see, when you click on that blue question mark for Additional Information, you'll be able to see the information around the coding exception as well. So everything is very transparent and very clearly flagged on the dashboard.

Some additional changes that we've made: the age groupings have been adjusted to align with the National Reporting System categories. So I want you to note that these bands do differ from other Dashboards in the LaunchBoard, but are really more specific to the adult learner population. For English Language Learners, AE 305, the COMIS calculation has been updated to align with "ever enrolled in ESL," using the ESL program area definition in Participants in ESL, which is metric AE 100.

For Long-Term Unemployed Reportable Individuals: in AEP 4.1, a match against the EDD UI wage file was conducted to determine employment status and count students as long-term unemployed. This process has been removed from the metric calculation.

And then, low literacy reportable individuals-- I want to flag this one, because this is different than what we discussed in our pre-release webinar. But in AEP 4.1, students ever enrolled in ABE or ASE were included as low literacy reportable individuals. And after feedback from that pre-release meeting, consultation with the field, and in alignment with the way CASAS TOPSpro reports students for WIOA Title II reporting, ESL has also been added to that calculation in AEP 5.0.

And then finally, in terms of the metrics, I want to talk about some calculation fixes. So the first one I want to mention is the transition to postsecondary. This is AE 602.

Previously, the TOP code for Health Professions, Transfer Core Curriculum was being coded as CTE. Additionally, the calculation was counting students' first time in CTE based on the first time they were a participant in CTE, rather than on any enrollment in CTE. This has been corrected in AEP 5.0. These last two fixes were actually identified through feedback from the field, and we were able to squeeze them into the last two weeks of the build.

So I want to pause for a second and just acknowledge the really important work and the close attention to detail that you all pay to this dashboard and to this work. Because of that, we're able to investigate each question and look into the code with our coding partners, and those questions often lead to improvements in the Dashboard. So I just want to say thank you before I talk about these two changes.

So for Completed a Postsecondary Credential-- AE 625-- a coding error has been corrected to align it to the definition listed in the MDD. In the prior build, the code did not capture all intended certificates and awards. We've also adjusted the time horizon for the COMIS calculation to align with the time horizon for earning those awards in the TOPSpro calculation-- so, earning an award in the current or any subsequent year. Those are now aligned across both data sets.

And then, Earned a Postsecondary Noncredit CTE Certificate-- a coding error has also been corrected here to align with the definition that's listed in the MDD. So in 4.1, the code did not restrict this metric to noncredit certificates in CTE. It was strictly noncredit certificates. So we've added that restriction to CTE. And that's now been corrected in the new version of the Dashboard.

I think this is the last group of things I want to talk about. We've really focused on user experience and clarity in definitions for this build. So if you'll move to the next slide, I can show you: we've done a comprehensive review and alignment of metric descriptions.

We've revised the metric definition dictionary, which I really encourage you all to take a look at. And we've improved our tooltip descriptions. And that's where you click on that blue question mark and more detail pops up. So we encourage you to visit those, view them, provide us feedback.

And then, we've also made several visual improvements in the dashboard to help you navigate it better, provide more clarity, and tie things together when necessary. So on the right, you can see an example of where we've indented some metrics in the left navigation. And that appears across the different components of the student journey.

We have several resources. I'm going to turn it back over to Blaire, and she's going to walk you through some of the ones that we have available to understand these changes and these updates.

Blaire Toso: Thank you. And I am noting people's questions. We can return to those at the end-- just so you know that I'm actually pulling them out, and I have half an answer, but I can't present and type at the same time. So let's turn to the resources.

So we do have resources that will support these changes. I think this goes to Janay's question: are these noted on the Dashboard? As Jessica said, some are noted on the dashboard, explicitly up front where you click on the question mark. We also have a Changes in Definitions document that's posted to the Adult Education Pipeline Dashboard.

If you click on this, it will take you there. And Jessica will show you later, but it's at the end of each page that you access on the Dashboard, regardless of where you are. When you scroll to the end, you can click "Click Here To View Resources," and then "Find Out More About the Data." You will see that it's updated with the changes in definitions for April 2022. And that's where you'll find all of this information that Jessica is providing; it's posted there as a document that you can refer to.

The other way to explore that is the Metric Definition Dictionary, which we really think is the primary source of information for all aspects of the Dashboard, including your agency crosswalks, terms used and explained, and, importantly for this discussion, the definitions and calculations for all metrics on the dashboard. This is really great for coding, but it's also for understanding how things are calculated, and it will really provide you the information for both the CASAS and the MIS data.

It allows you a view into how the metrics are understood, how they're comparable and made as equitable as possible when you're putting two systems together, and then gives you instructions on how you will want to code to ensure that your data is included on the Dashboard. And as I said, if you scroll to the bottom of the page, you'll be able to find that on every page there is.

So just quickly, as a summation-- I forgot to say, we also have a coding guide and other resources available to you. There are several series of infographics that talk about CASAS versus MIS data and how it shows up on the LaunchBoard. We have those resources in the Adult Education Resource Library. And there are some really long ones; if you don't want to dig through those, we do have some summative infographics. So you might want to explore that resource library.

So on to a summation of what the changes will look like on the AEP Dashboard, before Jessica goes into a walkthrough of them. The changes are implemented for all years shown. So when you ask, are these just for 2021? Nope. They're retroactively applied.

You'll see those updated numbers, the new displays, and some of those new metrics, as well as the additional information behind those question marks. We're continually trying to make the Dashboard as transparent as possible. And that's where you can find a lot of the information, particularly if you're curious about what the numerator or the denominator is, what a lagging metric is, how something is calculated-- that information is often included in the question mark. The question mark is used because, if that much information were displayed beside each metric, it would crowd out other relevant information.

And as usual, we are always encouraging you to use your data for planning. Use it to ask questions, identify trends, identify gaps, set goals. And then, as we've added those top fives, or that comparison feature, you can use it to identify thought partners: other institutions who are either like you or different from you, who are seeing the same trends, or who are showing upward trends or strong outcome trends that you're also interested in. You can reach out to them as well.

So we'll now go into the questions and the discussion. And I wanted to first address Steve's question, when he had asked, who do we refer to when we talk about the field and getting feedback from them.

Steve, we get that in all sorts of different ways. One is very much on the webinars, when people ask us questions or provide us feedback. For example, Jessica noted that when we did our pre-launch, people who joined that webinar had some questions. And then we had a discussion about the way numbers were being identified and compared between institutions and consortia and different data sets. And so we took that feedback and were able to refine some of the calculations.

We also get it through TA requests and questions that are passed along to us via SCOE TAP, through one-on-one TA calls with consortium or institution directors, and through other conversations with people. We also asked if people wanted to participate in a review of the Dashboard when it was in staging, prior to release. And so we had a group of 10 people who provided feedback on the Dashboard in that way.

So Steve, when we talk about feedback from the field, it comes in a lot of different ways. And people also contact us directly to ask questions and also give insights and make suggestions about what they would like to see on the Dashboard.

Jessica Keach: OK. So I think we're going to go to questions and discussion right now. I know there have been several questions in the chat.

Blaire, do you want to just read those out to me? Or, do you want me to go through and read them? How do you want to do that?

Blaire Toso: Yep. I have them. I have them right here. And the first one-- I'm not sure we'll be able to answer your question, Dana. I think it's more about the update of data into Nova.

So she was saying that some of the figures for her consortium were already populated in the three-year plan, which don't match what is in LaunchBoard. Is this due to the update? And will that data be updated on the Nova site?

Jessica Keach: Yeah. So my understanding of the data in Nova is that it will be updated. But I would definitely defer to SCOE TAP on that, in terms of what's going to be appearing in Nova. But we're working right now to get the data that is from AEP 5.1 to Product Ops, who will be updating Nova.

Blaire Toso: Super.

Dana Galloway: Can I ask another question kind of related to that?

Jessica Keach: Sure.

Dana Galloway: OK. So I've been digging around in our data. And I'm trying to figure out where the figures in LaunchBoard are coming from, as far as the adult schools are concerned. So I'm looking in TOPSpro, and most of the time, I'm able to find the exact same figures, which makes me really happy. But other times, I cannot for the life of me figure out where they came from.

So what would you suggest? Maybe have a conversation with one of you guys individually? Or--

Jessica Keach: Yeah. I would say when you can't figure out why, your first resource is the Metric Definition Dictionary. So there are some differences. But of course, please reach out to Blaire and myself. And we're happy to take a look at your CASAS summary report, as well as the data that's in LaunchBoard, and then kind of walk through it and problem solve to see what the difference might be. So please, feel free to reach out to us.

Dana Galloway: Yeah. That would be great, because we do need to know where it's coming from in order to set our targets so we can keep track of it. OK. I will do that.

I'll try using the Metric Definition Dictionary a little bit more. And I'll dig around and use the tips. And if I still can't figure it out, then I'll contact you guys. Thank you.

Jessica Keach: No problem. Thank you.

Blaire Toso: Could I add to that? When you reach out to us with those questions, invariably, our first response is, can you send us your TE summary report so that we can see the numbers and do a little bit of research on our own? So when you submit those questions, when you're looking at your data, it's really helpful if you include that information for us. I can guarantee you it'll be our first question back to you.

Dana Galloway: OK.

Blaire Toso: So it streamlines that process. Thank you.

Dana Galloway: Great, thanks.

Blaire Toso: Yeah. We had another question. It asks, will CTE classes for credit show up if K-12 students transferred or were co-enrolled in for-credit CTE courses?

Jessica Keach: I'm trying to understand more about what you mean. Whoever asked that question, if you want to maybe come off mute?

Adele McClain: I'm unmuted. I'm happy to answer the question. I'm with the Victor Valley College Consortium region. And I'm Apple Valley Adult School. My name is Adele.

When we started this process, there really weren't any noncredit classes, except for a few ESL. And if students didn't fall into that need, there wasn't a lot to refer them to. So we kind of did a lot of referring to CTE courses, because of course, that would help us with the WIOA metric getting them employed.

However, those numbers never show up-- just the few students I had that were co-enrolled, or enrolled afterwards, in the adult ed ESL. So that's why I was concerned. I had even reached out to Peter Simon. I said, hey, look, I have all this data. It's personal data. It doesn't have a match in MIS with anything. And why are my numbers so low, when I recognize my students when they graduate or pass on for having completed CTE credit-bearing certificates? So that's why.

Jessica Keach: Yeah, that's helpful. I think for answers regarding your specific institution, it would be the same thing. Reach out to us directly. And also, which metric you're talking about makes a difference, as well. So if we're talking about transition to--

Adele McClain: Correct, to postsecondary.

Jessica Keach: Transition to postsecondary. OK. So a transition to postsecondary includes transition to CTE, which is CTE regardless of credit status.

Adele McClain: Oh, it would be there. So that'll work.

Jessica Keach: OK, great. Well, if you have any other questions, you can feel free to just reach out to us.

Adele McClain: OK.

Blaire Toso: Just a couple more questions. One: is there a date for the update? For-- I believe, Guillermo, you are asking about moving the data from the Dashboard and having it reflected in Nova.

And that would be either-- I apologize. I don't mean to deflect. But it will be probably SCOE TAP or the Chancellor's Office. I know they've been working really hard on getting some bugaboos out of there. So I don't know if somebody would like to respond?

Mayra Diaz: I can try--

[interposing voices]

Mayra Diaz: --on that.

Guillermo: I wanted to follow up my own question, there. Yes, we realized the data was inflated, and we notified CAEP that it was inflated, because we have other metrics to identify data. So when we projected our numbers going forward, they're different. They're quite different.

But we also wrote in the very beginning, because it's the only place you can write anything, to indicate these numbers are inflated. And we also stated what metrics we would use to identify our data going forward.

So if you change it, it would then behoove us to remove that statement, because then it would be in line. So it is important to know if you're going to make that change, and whether it's going to be made in the three-year plan. We sort of need to know this, because I'm the one entering the narrative. So I need to know when that's going to happen, or if it's going to happen, or whether to leave it alone.

Mayra Diaz: I think in regards to that specific question-- and thank you for your question. We can go back and follow up specifically with Product Ops to see what is going to be incorporated. I know that we're, as we speak, in the middle of all of those updates. And so communication should be coming out soon, hopefully. But we are in the middle of those updates.

And we'll be following up with some additional communication and clarification. So we can take that back and just be able to dig in a little bit further to specifically be able to address your questions. So thank you for that.

Speaker 1: Just to clarify on his question, too: we have a very similar thing going on with us. But the narrative won't be deleted out of there, correct-- just the pre-populated data piece? Your narrative should stay in there. Is that correct?

Guillermo: Not sure if you're asking me. No, the narrative we wrote in there. And it'll stay in there, so long as those numbers are inflated. But if they're realigned to the correct numbers, I can go back in and remove it. So I don't have to say that it's inflated, because then, it would be correct. But if it stays that way, yeah, I'm not going to remove that statement.

Speaker 2: And from the CAEP TAP office, I can speak to-- when Nova is updated, just like with the governance section and the CFAD-- many consortia had already inputted all of their data. Then, on the back end, the governance section was added. Everything that had previously been entered stayed. The update only affected the section that had to be updated. So if you've done your narrative and the update happens, the narrative should remain in place as you originally input it.

Speaker 1: OK. Thank you both for clarifying. That makes sense on my end. So I know what to do.

Blaire Toso: I'm also thinking that, when you're thinking about the numbers, the numbers are populated into Nova from the Dashboard. So if you wanted to take a look at what's going on with your numbers, it might be interesting. I don't know precisely how they will be imported into Nova, but they should be the same, since that's what's used to populate it. So it could give you a preview of what's going to be in Nova.

Jessica Keach: All right. Were there any other questions? I was looking through the chat, specifically for the Dashboard release.

There was a question: can you clarify whether the definitions for student enrollment and student participant counts continue to be the same? Specifically, is a participant still a student with 12 or more hours of attendance?

So the definition remains. The coding exception applies only to COVID-impacted terms-- just spring 2020 and the 2020-21 year. This isn't a permanent solution, and it's not a definitional change. It's applied to a specific period of time; anything outside that period does not get the coding exception.

And I also saw there was a question around completing what I was saying about the transition to postsecondary. I put that definition in the chat. So you can view that.

It's also AE 602 in the Metric Definition Dictionary. So Transition to Postsecondary is: among participants in ESL, ABE, and ASE, those who transitioned to postsecondary by enrolling, for the first time at any institution, within the same or a subsequent year, in either a K-12 adult education or community college CTE course-- which is regardless of credit status-- or a non-developmental credit college course.

And then the college CTE is noncredit and credit. That's correct.

And to be clear, the participant exception does not apply to K-12 data. That's correct. The reporting challenges were specific to noncredit community college institutions. So the coding exception does not apply to the K-12 data, only to noncredit data. And we've made sure to put those notes right there on the Dashboard, so hopefully your users will be able to read them.

And it says what I've explained, but that's a good point. So be careful when interpreting that data, as there's different coding exceptions applied.

Speaker 3: Just to add a little bit to that conversation, if you're looking at data and you have that asterisk, and you really can't compare, does it make sense to even show the data, at the risk of creating confusion? Because you're not really looking at true data the way we've been looking at it in previous years. So in terms of trend data, or any kind of planning data, you don't want to even look at that data, because it is skewed.

Jessica Keach: Yeah. So just from my personal research perspective, data always has caveats. And so just because there is that caveat, I wouldn't say we shouldn't look. Regardless of any of the coding exceptions that have been applied, across the board at the statewide level, we've seen drastic declines in enrollment for our adult education providers and our adult education students. And so I think regardless of what coding exception has been applied, it's important to look to see what strategies we, as a system and as consortia, can employ to really engage some of those lost adult learners.

But I definitely hear what you're saying. Your point is well taken. And that's why we've really tried to make sure to be clear in all of the documentation and on every visualization to include that note.

Speaker 3: And just as one further step, I think you might have said this, and I might have missed it. Has the issue about recording hours in noncredit been addressed, so that we don't have to ask for further exceptions? Are they tracking? Is the college noncredit side reporting hours to be able to track the 12 hours, if it's online, or whatever?

If we go into another shutdown of colleges (some colleges still don't have in-person instruction, I think), has that been addressed? Or is this going to continue until COVID is over, which I don't think will be for quite a while?

Jessica Keach: Yeah. So I'm not with the Chancellor's Office. I do know that there have been some changes to SX05. And there are some ongoing conversations on how to make sure that institutions are able to report. So it's not a permanent solution.

Speaker 3: Thank you.

Mayra Diaz: Thank you, Jessica. And just to chime in, that is definitely an area that we understand is a big issue. And we at the Chancellor's Office are definitely diving into that and trying to explore some solutions. And as communications and solutions become available, we will definitely be sharing them with the field.

But we do recognize that these unprecedented times with the pandemic have had effects we're just seeing the results of now. And so we're trying to dive in to see what we can do to address some of those issues. It is something that is on our radar. We're seeking feedback, and we're looking to further explore some possible solutions. And we will be communicating that with the field, as well.

Jessica Keach: Thanks, Mayra.

Are there any other questions that came through the chat, Blaire, that anyone has about the new data?

Blaire Toso: I don't see anything. Anybody else have any questions? You're welcome to unmute and ask further questions.

And I have to apologize. Jessica is not going to do a walkthrough. You all know the AEP Dashboard; that is for one of our overviews, and this is not the overview. So you all do not have to sit through us showing you how to use the Dashboard, which is clearly something you already know, since you're asking such good and precise questions. So my apologies.

Jessica Keach: All right. If there are no more questions, should we wrap it up? I can put my email in the chat. So if any questions come up, feel free to reach out to me, as well as Blaire, and any of the folks on the--

Speaker 4: I have one comment. It may be something you've thought about before. But talking to superintendents, it would be useful if we had a tool that compared LaunchBoard to the Dashboard, because that is the language our superintendents, i.e., our voting officers, speak. And all of us are adult education administrators. So we're always having to translate that piece.

I'm just saying, it might be useful, and probably not that difficult to do. It's something we each do, probably, and don't even think about. But it would be useful to say, this is what you compare as a K-12, and this is what you compare as a K-12 adult ed. Just food for thought. Since nobody else is talking, I'm putting it out there.

Jessica Keach: Are you referring to the California Data Dashboard for--

Speaker 4: Oh, absolutely. So that's what superintendents have to live and breathe by. And they look at mine and it's like a foreign language to them.

Jessica Keach: Well, thank you so much for that feedback. That's really helpful.

Speaker 4: No problem.

Holly: OK. And with that, I think we'll turn it back over to Amanda Lee to close us out.

Amanda Lee: Thank you, Holly. I actually tried to start, but I was on mute. So I apologize.

So thank you so much to the Chancellor's Office and WestEd for being present today. We really appreciate all of your work and your taking the time.

Attendees, thank you for coming and asking very targeted questions. It does help improve all of our work. I did post an evaluation link, so please take a few moments, if you will, and give us some targeted feedback. It helps us be better and informs the professional development that we set up.

As you can see, we have some upcoming webinars. There is a link posted. So when you can, please feel free to register. We have the next one on May 10 with Adult Education Pipeline Data for Continuous Improvement in the Three-year Planning.

And with that, I will go ahead and close it out. Thank you all for your time this afternoon. And enjoy the rest of your day.

Holly: Thank you, Amanda Lee.