Blaire Toso: Hi. Good afternoon, everyone. Three minutes into the afternoon on a Friday, and we just want to say thank you so much for joining us. We're really excited to start talking about the new features and updates to the Adult Education Pipeline dashboard, and we want to share these and prepare you for what's coming up.
As Veronica said, we're going to go ahead and hold questions as we go through the items, and then we can scroll back to answer any questions that we want to revisit or discuss more thoroughly. We will be doing another presentation once the dashboard goes live, so we can do an actual walkthrough and show you what's going on, what these changes look like, and what the data looks like. We'll do that in late April.
I think that's been moved to April 29. We are also looking forward to releasing the dashboard itself, which should be on the 22nd of April. So stay tuned; we'll announce it as soon as it's confirmed. Next slide, please.
So today, from WestEd, there is me. I'm a senior program manager. I work on the CAEP contract for the Adult Education Pipeline dashboard, and I also work on other adult education initiatives at the national and local levels. And I'd like to turn it over so that Jessica and Ayanna, my co-presenters, can introduce themselves.
Jessica Keach: Sure.
Blaire Toso: Jessica?
Jessica Keach: Yeah, thanks, Blaire. My name is Jessica Keach and my pronouns are she, her, and hers. I'm a senior research associate here at WestEd. I manage the Adult Education Pipeline dashboard as well as other data tools and resources to support data-driven decision making, and I work with Blaire on adult education. I'm excited to be here.
Ayanna Smith: Hello, everyone. Thank you for joining us. My name is Ayanna Smith. I am a program coordinator. I'm new to WestEd, and I'm here helping and supporting Blaire and Jessica with all of the upcoming PD related to CAEP, so I look forward to continuing to work with you all.
Blaire Toso: Thanks, Ayanna. And it's really a wonderful team. It's a small and mighty team that you're looking at, but there's a whole host of people who support us behind the scenes in providing the professional development, building the dashboard, responding to questions, and providing other kinds of TA. And I'd like to turn it over to one of our biggest supporters and the real force behind the initiative we've got going on today, and that's Mayra Diaz.
Mayra Diaz: Hi, everyone. Good afternoon, and happy April 1, April Fools' Day. On behalf of the Chancellor's Office, I would like to welcome you and thank you for participating in today's webinar presentation. My name is Mayra Diaz and I am the CAEP program lead with the Chancellor's Office. My colleague Lindsay Williams and I work collaboratively to support the statewide adult education program at the Chancellor's Office.
She's unable to join us today, but she and I work together to support the hard work that goes into the adult education program. In today's presentation, our partners from WestEd will be covering the new features and updates to the Adult Education Pipeline dashboard. We hope that you enjoy today's presentation and learn something new that may help facilitate the work you do to support the California Adult Education Program. Thank you.
Blaire Toso: So, very quickly, our objectives today. We're just going to review the changes implemented on the Adult Education Pipeline. It looks simple, but this has taken months and months; we started even before the last dashboard was released. So it's a long process, but I like the way it looks, so nice and clean: we've updated calculations, added new metrics, and built some new features and tools.
And to review and set the stage for the work that we're doing today, I want to talk a little bit about the purpose. The Adult Education Pipeline has multiple purposes. It's there to improve educational practice and think about economic mobility for adult learners. We use it to look at the full learner journey, and at program and student data across multiple years, so that we can have a more longitudinal idea of what programs and learners are doing.
It is developed to serve multiple audiences: administrators, instructors, legislators. It's very unique in that the data set that sits under the dashboard is also available to the public and to researchers. So that's many, many audiences, and that's why we are always rethinking how we present it. We also intend it to be used by consortia for program planning, which I think is not unique and is certainly at the front of all of your minds right now.
We also hope that the dashboard doesn't just provide metrics but also helps you prompt and answer some of your key questions, and spurs you on to ask good questions. And it's also used to identify consortia who might be struggling and may need some extra technical assistance from your CAEP tech providers.
So some Adult Education Pipeline highlights: as I mentioned, the metrics are aligned to the student journey. It combines multiple data sets to offer a holistic view at the regional, institutional, college district, and statewide levels. And then it also allows users to view that full student journey.
No matter how long or short that journey is. So when we're talking about these changes, the perspective to keep is that we're not just talking about one item; we're talking about a full journey, and about how to express that and support you all in identifying where you might see some opportunities and some gaps.
It is the only complete source of both community college non-credit and K12 adult education student data. It also provides legislative reporting metrics for us, and it identifies key metrics for you all to explore.
And lastly, one of the things we're really proud of, and that has been very informed by the field, which you'll hear a little more about, is that the AEP dashboard allows a more refined understanding of individual metrics. For example, being able to disaggregate by race and ethnicity, to look at a metric from a first-time or returning student perspective, and to have a comparative view between years, institutions, and consortia.
Next slide, please. And as you know, the dashboard is why we're all here. It's refreshed every year to add another year of data, and this occurs only once yearly. It is really meant to provide an over-time view of the data as opposed to a real-time view. This is why it's used for planning: you can have a view across the years and then think about how you might want to project forward.
And the yearly process allows for the full data sets from CASAS and MIS, with their different reporting structures, to be accommodated and reflected in the dashboard. For example, our timing aligns with the community colleges and their review system, because they get to review their data, make any corrections, and resubmit. So we wait for that full process and then use it to refresh the dashboard with the new year's data.
Next slide, please. Why do we make these changes? Why do we go through this effort? Well, obviously, to update the data and provide you with yet another year of it so we can continue to move forward. It's also to align to other dashboards. As you may know, the AEP is one of a suite of dashboards on the LaunchBoard, and they share data amongst themselves in the suite.
For example, the Adult Education Pipeline shares data with, and informs the building of, the Student Success Metrics dashboard, and those data points need to align before we can launch those dashboards. We also receive feedback from the field, which helps us identify ways in which the data can be better represented, clarified, or fine-tuned to your needs.
We also capture feedback through a pre-launch field-testing process, which we've just wrapped up for this year. This is where a team of consortia directors, community college and K12 adult education directors, instructors, and data staff test the dashboard and provide feedback. Some of that is integrated immediately, to make sure the dashboard is well prepared for you all before the launch; the rest, depending on the complexity of the changes, is held for the next year.
So as I mentioned, we're already using some of that feedback to plan forward to 6.0. Also, technical assistance, when we have one-on-one conversations with you all in the field, really helps to inform and clarify the work that we do and what we're representing on the dashboard for you all.
Sometimes a coding error is also identified that skews the data or doesn't represent the CAEP universe. For example, age groupings didn't carry over at one point, so we went back, reviewed those, and adjusted them for this coming year. And then we also review and revise definitions so that they align to the CAEP requirements. With that, I'm going to turn it over to Jessica, who's going to walk you through all the detailed information.
Jessica Keach: Great. Thank you, Blaire. I'm excited to be here today. I'm going to start by talking about some new views and features that you can expect to see in the upcoming release of the Adult Education Pipeline, what we call AEP 5.0. So if I refer to AEP 5.0, know that that's the new dashboard with the new data coming in the next month.
Next slide, please. OK. Many of you are probably familiar with a feature in the dashboard that allows you to identify the top five institutions based on the percentage of students obtaining a particular outcome metric. After feedback from the field last year, we've now incorporated a toggle that allows you to view the top five institutions based on the number of students completing that metric. This is available at the statewide and region levels, as you can see on the right of the screen.
This is what you'll see in the new version of the dashboard. You'll be able to switch between top five institutions by percent of students and top five institutions by number of students completing the outcomes. The list on the screen shows the top five institutions and where you'll see this new feature appear: participants who transition to post-secondary; participants who earn a high school diploma, GED, or high school equivalency; those that completed a post-secondary credential; those that completed a post-secondary non-credit CTE certificate; employment two quarters after exit; employment four quarters after exit; and annual earnings compared to the living wage.
Next slide, please. OK. This is also a popular feature in the dashboard, where you're able to drill down by program area: ABE, ASE, and ESL students. We've added this feature to more metrics. It currently appears on several metrics, but now you'll also be able to drill down by program area on the participants metric, which is those that have received 12 or more instructional contact hours in a CAEP program.
Participants who completed an immigrant integration milestone, employment two quarters after exit, employment four quarters after exit, and participants who earned a diploma, GED, or high school equivalency. All right.
Now, I'm going to walk you through some of the key metric changes that you'll see in AEP 5.0. The first is one of our big, significant alignments this year: in order to align reportable individuals (commonly referred to as adults served, or the AE 200 metric) with the California Adult Education Program, the one-hour threshold used in this calculation has been restricted to one hour in a CAEP program area course.
This is important for our non-credit institutions that offer non-credit courses and programs outside the adult education program areas, for example, courses for older adults. So if you look at the bottom of the screen, you can see the definition being used in AEP 5.0; the blue text and hovers show the big changes you'll see in this metric.
So the definition is the number of learners who had one or more hours of instruction or positive attendance hours across all enrollments in an adult education program area (previously, this was any non-credit enrollment) and/or who received services at a K12 adult school or non-credit services at a community college. And you'll see that there's an asterisk next to non-credit services.
So previously, in AEP 4.1, the metric allowed credit-only students who received non-credit services to be included. The metric was really not looking at enrollment when it was counting students based on services; they could have had any enrollment or no enrollment at all. That resulted in some inflation for a few specific non-credit institutions, and we've adjusted this now.
So previously, credit-only students were included; now that won't be the case. Students who are credit-only and receive a non-credit service at a community college will no longer be counted as reportable individuals. This really helps account for some of the inflation that a few institutions saw.
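As a rough illustration of the revised AE 200 logic described above, here is a minimal sketch in Python. The field names (`credit_only`, `caep_program_area_hours`, `received_adult_ed_service`) are hypothetical stand-ins, not actual CASAS, COMIS, or dashboard field names; the real build works from the underlying data systems.

```python
# Illustrative sketch of the AEP 5.0 reportable-individuals (AE 200) filter.
# All field names are hypothetical, for illustration only.

def is_reportable_individual(student: dict) -> bool:
    """Return True if a student would count as a reportable individual in AEP 5.0."""
    # Credit-only students are excluded, even if they received a non-credit service
    # (they were included in AEP 4.1).
    if student["credit_only"]:
        return False
    # One or more instructional hours, now restricted to CAEP program area courses
    # (previously, any non-credit course counted).
    if student["caep_program_area_hours"] >= 1:
        return True
    # Otherwise, a service received through the adult education program still counts.
    return student["received_adult_ed_service"]

students = [
    {"credit_only": False, "caep_program_area_hours": 3, "received_adult_ed_service": False},
    {"credit_only": False, "caep_program_area_hours": 0, "received_adult_ed_service": True},
    # Hours only outside CAEP program areas (e.g., an older-adults course) no longer count.
    {"credit_only": False, "caep_program_area_hours": 0, "received_adult_ed_service": False},
    # Credit-only student with a non-credit service: counted in 4.1, excluded in 5.0.
    {"credit_only": True, "caep_program_area_hours": 0, "received_adult_ed_service": True},
]

print(sum(is_reportable_individual(s) for s in students))  # 2 of the 4 students count
```

The last two records in the sample list are the cases the 5.0 change removes, which is the "deflation" effect Jessica describes for a few non-credit institutions.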
Next slide, please. OK. So what is the impact of these changes? What you're really going to see is a more accurate reflection of non-credit community college CAEP students. That means you'll see a reduction of reportable individuals for non-credit community colleges, because before, we were counting any student that had an enrollment or one or more hours in non-credit. Now, we're narrowing that down to just non-credit students in the California Adult Education Program, in those CAEP program areas.
We'll also be counting all learners who received some kind of service provided through the adult education program, except for students that only had credit enrollments; that was the asterisk we just saw on the previous screen. This is also going to result in a reduction of reportable individuals for non-credit community colleges, specifically for some of those institutions that saw really inflated numbers in AE 200.
This is also going to reduce the number of students with barriers to employment across metrics, specifically for non-credit community colleges, because reportable individuals is the denominator for those metrics. So foster youth and individuals experiencing cultural barriers are all counted out of reportable individuals. If we're reducing the number of reportable individuals, we'll also see a reduction in the counts for those barriers metrics.
OK. I want to take a minute to talk about the COVID impact. We're all very well aware that COVID has had a significant impact on many of our lives, and data and reporting have not been exempt from some of these impacts. I want to acknowledge that the Chancellor's Office recognizes the continued limitations that non-credit community colleges have faced in reporting student attendance hours in SX05 for non-credit distance education classes.
In last year's build of the Adult Education Pipeline, version 4.1, which contained one term of COVID-impacted data (spring 2020), the Chancellor's Office consulted with the non-credit field and determined that the benefit of maintaining the hour-threshold requirement for non-credit students during COVID-impacted terms did not outweigh the cost of excluding students whose institutions were unable to report, or had difficulty reporting, student attendance hours for those courses.
Therefore, for spring 2020, which is what you see in the dashboard now, only an enrollment record, rather than a number of hours reported, was required for non-credit community college students to be counted in metrics that require an hour threshold. After continued consideration and consultation, version 5.0 is going to carry over this decision for the 2020-21 year of data. That's the new release of the dashboard that you'll see later this month.
The impacted metrics, the metrics that have historically relied on an hour threshold, are in the blue box on the right. We have reportable individuals, which has that one-or-more-hour requirement. We have participants, the students with 12 or more instructional contact hours. Another metric that's not as commonly used is students with 1 to 11 instructional contact hours. And the final one is students with an enrollment in an adult education program who received services.
Next slide, please. OK. So what is the impact of this logic, what we're calling the COVID logic? It's specifically going to impact non-credit community college student data, and it's consistent with the approach used in AEP 4.1 for spring 2020 data. It treats all colleges the same, avoiding an instance where a college is disproportionately impacted. That might be a case where an institution shows very few students and participants because it wasn't able to report hours; we're avoiding that. So we're not going to have institutions that are disproportionately impacted based on their challenges in reporting.
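The COVID logic can be sketched as a simple conditional; this is a hypothetical illustration, with invented term labels and field names rather than actual dashboard identifiers: during the COVID-impacted terms, an enrollment record substitutes for the hour threshold, but only for non-credit community college students.

```python
# Hypothetical sketch of the "COVID logic" for hour-threshold metrics.
# Term labels and field names are illustrative, not actual dashboard identifiers.

COVID_TERMS = {"spring-2020", "fall-2020", "spring-2021"}  # 2020-21 carried over in 5.0

def counts_as_participant(student: dict) -> bool:
    """AE 202 participant check: 12+ hours, or, for non-credit community college
    students in a COVID-impacted term, an enrollment record alone."""
    if student["hours"] >= 12:
        return True
    return (
        student["institution_type"] == "noncredit_community_college"
        and student["term"] in COVID_TERMS
        and student["has_enrollment"]
    )

# A non-credit college student with an enrollment but unreported hours counts
# during a COVID term; the same record in a pre-COVID term would not.
s = {"hours": 0, "institution_type": "noncredit_community_college",
     "has_enrollment": True, "term": "spring-2020"}
print(counts_as_participant(s))                           # True
print(counts_as_participant({**s, "term": "fall-2019"}))  # False
```

Students meeting the 12-hour threshold are unaffected either way; the waiver only adds students who could not have their hours reported.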
It also aligns the metric definitions between dashboards. Blaire mentioned earlier that a lot of this data is present in other dashboards, including the Student Success Metrics. The same method of counting students is being applied elsewhere, so it's the same logic being used in AEP that's being used in other dashboards in the LaunchBoard.
Additionally, the participants metric that you see on the dashboard may include students that did not complete 12 or more hours for non-credit community colleges. This strategy allows for a broad universe of students to be available to obtain outcome metrics, because remember, outcomes are only tracked for participants (AE 202), students with 12 or more hours. So you'll be able to see outcome counts for all institutions; this goes back to some institutions not being disproportionately impacted.
However, because the universe of students available to obtain outcome metrics during the time period is broad, you may observe lower rates of outcome attainment, because AE 202, which is the denominator for all of those outcome metrics, may be broader or bigger. You may see slightly lower rates, but the counts will be retained.
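To make the arithmetic concrete, here's a small hypothetical example (the numbers are invented for illustration, not taken from the dashboard) showing how the same outcome count over a broader AE 202 denominator produces a lower rate while the count itself is retained:

```python
# Hypothetical illustration: the same outcome count over a broader participant
# universe (the AE 202 denominator) yields a lower rate, but the count is retained.

outcomes = 150  # students attaining the outcome (this count is unchanged)

participants_strict = 500  # hypothetical: only students verified at 12+ hours
participants_covid = 800   # hypothetical: COVID logic admits enrollment-only students

rate_strict = outcomes / participants_strict
rate_covid = outcomes / participants_covid

print(f"{rate_strict:.1%}")  # 30.0%
print(f"{rate_covid:.1%}")   # 18.8% (lower rate, same outcome count)
```

This is why comparing rates across COVID-impacted and non-impacted years should be done with the counts alongside them.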
Finally, we have notes. We really wanted to be clear in our documentation of these challenges, so folks are aware of the data they're looking at. You'll see notes on any metric that is no longer applying the hour threshold, and we also have hover-overs and tooltips. Anywhere that's impacted, you're going to see that detailed definition, so we can really promote understanding of the data that's in the dashboard.
OK. Additional changes. Under barriers to employment, the long-term unemployed reportable individuals metric (AE 309) used to conduct a wage match. After consultation with the field, it was determined that that's no longer an appropriate way of capturing this particular student barrier, so we've removed the UI wage match. Only students reported in TOPS through barriers to employment and students reported in COMIS through SG17 are going to be counted in this metric.
For low literacy reportable individuals: previously, students who were ever enrolled in ASE were being counted as low literacy in this metric. That has been removed. Only students who have been enrolled in ABE, or students who have been reported as low literacy through TOPS barriers to employment or COMIS SG20, are going to be counted in this metric.
And then finally, as Blaire mentioned earlier, we are aligning the age groupings to the National Reporting System categories. Note that this is a difference from the age bands you'll see in other dashboards in the LaunchBoard.
Next slide. OK. Finally, we've really tried to focus on improving the user experience. If you'll go to the next slide, I'll share some of the things we've done. We've done a comprehensive review and alignment of metric descriptions, and you'll see that reflected in the revised MDD, the Metric Definition Dictionary, that appears on the LaunchBoard; it'll appear there once we release 5.0.
We've also reviewed all of the tooltip descriptions. There's a little blue button on the dashboard (I think it has a question mark, or it might have an "i"), and when you click on that button for each of the metrics, a hover pops up and provides more information on how to interpret the data. We've reviewed all of those, and they will be improved in the new release.
And then finally, we've made visual improvements. One example is indenting some metrics on the left navigation, which you can see on the right of the screen. Underneath participants in career technical education, you'll now see indented CTE subprograms, and this is the case for several metrics that have submetrics across outcomes in the dashboard.
All right. Now, I'm going to turn it back over to Blaire, who will walk you through some of the resources for understanding these changes.
Blaire Toso: Thanks, Jessica. So as you can see, there were several changes, and as Jessica said, we're trying to be as transparent as possible, both in listing those changes on the dashboard and in the MDD. There's also a changes-to-definitions document. Both of these are identified on every page; if you scroll, you can find these two important-- sorry. I completely lost my train of thought, everyone. My apologies.
So if you scroll down, it's listed on every single page of the AEP dashboard. You'll see this information here in the screen capture. Scroll down to the bottom of the page, go past "click here to view resources," and go to the caret where it says "click here to find out more about the data in the Adult Education Pipeline dashboard." There, you're going to find the changes and definitions for AEP, April 2021. That's where you'll have all of these listed out exactly as Jessica was showing you, but fully delineated: it identifies the changes, explains the purpose of each change, and provides the rationale and the effect on the dashboard.
Also, the Metric Definition Dictionary is not a new resource, but it is one that we always consider the primary source of information for all aspects of the dashboard. There you can not only see how those changes are implemented and where they're noted, but also find all the information about agency crosswalks, terms used and explained, and, most importantly for this discussion, the definition and calculation for every metric on the dashboard.
And if you're unfamiliar with the Metric Definition Dictionary, it provides calculations and information for both CASAS and MIS data, so you'll be able to view how the metrics are understood to be comparable, and it gives you directions for how you might want to code your data to ensure it's included on the dashboard. This will really give you the full understanding of the changes Jessica's been talking about, what they mean to you, and how you might want to adjust your coding practices.
Next slide. So in summation, these changes are implemented for all years. You'll see the updated numbers and another year of data added to the dashboard. Some of the displays have been updated, new metrics may be added, and any additional explanatory information will be directly accessible on the dashboard.
And as Jessica was mentioning, there's new language that clarifies or highlights information on the metrics, and we have signaled where those changes are. For example, the changes to enrollment are signaled on the dashboard as well. So it's very transparent about how these calculations are being addressed.
Next slide. And a quick reminder: we know you're in the throes of finalizing your three-year plans, and we suggest, as you probably already are, that you use this data to ask questions, identify trends, identify gaps and successes, and set goals. We also want to highlight that you can use the dashboard to identify thought partners as you go through this, because in those top five institutions charts, you can identify others, possibly similar institutions in your region or in the state, that are showing strong trends in those top five categories.
And we encourage you to reach out and network with them, see if you might be able to tap into their knowledge and expertise and share yours as well, and then become thought partners, identify maybe some new networks, or think about your data and apply it differently.
Next slide. So this is a short, sweet presentation, and we wanted to open up for questions and discussion at this point. Yes, Annabelle?
Annabelle: Hi, good afternoon, everybody. I have a couple of questions. My first question is around the one-hour threshold. I understand that this began in spring of 2020 and is now being extended through 2020-21. Is this only due to COVID and the difficulty community colleges have had in tracking positive attendance hours online? And after 2020-21, is it going to go back to the 12-hour threshold? That's my first question.
Jessica Keach: Yeah, that's a great question. You're correct in that the logic and the hour-threshold changes apply to spring 2020 and then the 2020-21 year. We understand that there are still challenges in reporting non-credit distance education positive attendance hours, so that's going to be an ongoing conversation at the Chancellor's Office with the field and with CDE, and no decision has been made at this time. But Mayra is on this call, and if you have thoughts about this, I encourage you to reach out to her. Maybe she can drop her email in the chat, and any feedback we get on this, we also share with her.
Annabelle: OK. And I understand. As a consortium director, I'm housed at the college, so I hear these conversations all the time about the difficulty around this, and we have a research analyst as part of our consortium, so we're pretty tuned into that. However, in our consortium it's been a harder sell, or not that it's a sell, but it's been harder for our adult school partners to understand this, because it feels very unfair to them that they're measured on a 12-hour threshold and the community college is measured on a one-hour threshold.
So do you have any thoughts around that, or how I could better communicate this to them? Telling them it's temporary is an easier sell, but like you just said, this is an ongoing difficulty for the community colleges. So any thoughts on that?
Blaire Toso: So, yes. Thanks, Annabelle. It is difficult, and it is temporary, and there is ongoing research. There's been a lot of research and investigation, and what this is allowing people to do is come to a solution that will serve not just the immediate term but the long term, because COVID has impacted that transition to online learning.
And so thinking through that in a deeper way is what's at hand here; it's not simply a decision. It's a decision that's been made as a temporary one as they move forward in finding a solution that will work for now and beyond. So I do think it's temporary. I don't think this will be the permanent solution, but because it involves a whole data system, they're really taking their time: talking to a lot of the community college non-credit programs, to researchers, and to the researchers in the Chancellor's Office. It isn't a decision being made on a whim. It's really taking thought, and we will be running comparisons against previous years, when this threshold was not in place, to see how it's impacting the numbers and to inform the conversation from that position as well.
Annabelle: Yeah, and one of the things that's been really difficult when we're working with different kinds of agencies, like an adult school and the community colleges, is, you know, we've always heard around data that we want to compare apples to apples and oranges to oranges, and you can't in these two very different systems.
And so this kind of agitates that whole concept, so I just wanted to add that. And the other thing that maybe you all could consider, which I think would be helpful for the consortia, and I remember Randy used to talk about this some years back, and I know there was work on this, is around crosswalks.
If we want to do apples to apples and oranges to oranges, what could a potential crosswalk look like? Especially for a college that doesn't use CASAS: what does a literacy gain look like? I know this information is out there, but it's not presented to us in a crosswalk, so in our own consortium we do our best to build crosswalks, but it takes a lot of time and figuring things out, and sometimes we're not even sure if the coding is accurate and up to date at the college.
And so it's just really complex, and this could be something that you all could really support the field with. And I have another question, but I'll pause because other folks might have questions.
Blaire Toso: I just want to say thanks, Annabelle. I appreciate the suggestion; I think it's a great one, and we'll take it back to the team. Really appreciate it. And please know that we all understand this is complicated and that it is not being taken lightly. There is a lot of surrounding discussion, as Mayra can attest; she gets even more of it because she's in the Chancellor's Office. It is a hot topic.
Annabelle: OK. Well, good that you all share our pain. We are one in our pain.
Mayra Diaz: Thank you for your question. We value hearing this information because it helps us understand the struggles and challenges you all are facing, and of course, this was brought up because of the pandemic. It makes us go back to the drawing board, talk to our leaders, and recognize what some of these challenges are and what we can do.
So we're definitely working on trying to arrive at a possible solution, or at least be able to narrow down what these challenges are. And we value that. I put my email in the chat box; please feel free to email me with any questions or concerns in regards to this, because this information helps us take it back to the drawing board and present it to our leaders.
Annabelle: Awesome. Can I squeeze in my other question? OK, and this is probably for Jessica. In the presentation, the PowerPoint, you were talking about low literacy and what qualifies as low literacy. You mentioned ABE and ASE, but you didn't mention ESL.
Jessica Keach: Yeah, that's a good question, and we have had that question previously; I know Blaire can speak to it. What I mentioned is that previously we had included ASE individuals: students who had ever been enrolled in ASE were marked as low literacy. That will no longer be the case.
There will be folks who are reported as low literacy through the barriers to employment metrics, or folks who were ever enrolled in ABE. But we also recognize that there are various levels of ESL, and just because someone is enrolled in ESL doesn't automatically make them low literacy.
So we haven't included that at this point, but again, please send us your feedback. We're continuously collecting that information and it goes into next year's build. So I don't know, Blaire, if there's anything else you want to add to that?
Annabelle: I have a comment on that.
Jessica Keach: Blaire, you're on mute.
Blaire Toso: OK. I was just going to say that we have investigated that, and we've looked at the way the different systems report. At this point in time, we are trying to align with the systems, and ESL comes in without being defined by language level or literacy-- it doesn't come in as literacy levels. So because of that, it goes into the overarching group rather than the literacy level barrier.
Yeah, and I totally understand-- I do a lot of work with skilled immigrants and internationally trained professionals. So it's something we will keep in mind as we begin to refine these definitions and continue to have these conversations, and we would be happy to engage in a really directed conversation with a group of people-- maybe a small focus group-- to think through how we might identify and refine these pieces, because yes, ESL is not a unified, unitary marker for literacy level.
Annabelle: And I do understand that-- you can't define low literacy by someone's lack of English literacy. However, this has implications for us right now for our three year planning, because two of the student barriers that we are supposed to be selecting are low income and low literacy.
And so we're trying to capture that. It's a little bit easier, I think, to capture on the Adult School side, but on the college side, they are suggesting that we use both low income and low literacy so that we can capture ESL students. They are telling us that in their coding, all ESL students are coded as low literacy, and low income is coded for students who have filled out a FAFSA or a College Promise application, which excludes students enrolling in non-credit and/or ESL classes.
So I'm now a little confused about how we are capturing low literacy students or ESL students if they don't fall into one of those three metrics that we're being asked for right now in our three year plan.
Jessica Keach: To speak to how the colleges are reporting that data: it would probably differ based on what college institution you're at. What we're taking is the MIS data-- I think it's SG17 or SG20-- and it's just a one or zero flag: they flag the student or they do not. So how your institution flags students as low literacy-- I think that's more of a local conversation.
And again, to Blaire's point, for these metrics, engaging a group in the field to really look deeper into them and see what some of the practices are across colleges would be a really good thing. But the definition as it stands is those that are marked through the barriers to employment in TOPSpro, or reported through that SG element, or, for low literacy, ever enrolled in ABE.
Annabelle: OK. Thank you.
Jessica Keach: And feel free to reach out to us. I can put my email in the chat as well. I know we've connected before but please feel free to reach out to us and we can talk through the definitions more or see if there's a way that we can provide additional assistance in helping you get to the numbers that you need for planning.
Blaire Toso: And I did want to add that it isn't just at the college level where ESL becomes marked as low literacy. In TOPSpro, oftentimes if ESL is marked as a barrier, those students are also considered low literacy-- if I recall correctly. I believe we've had this conversation recently with the folks at CASAS.
So we can go back to them and make sure that that is aligned, and that we are once again merging the data and doing an apples-to-apples comparison-- which is sometimes more like apples and bananas squished together to make an orange. That's why, when we look at the metric definitions dictionary, we can see how we create those calculations so that, if they aren't exactly the same, they are on par-- that they're equitable measures.
And Dana, that's interesting to hear-- I'm sorry, I'm looking at your message in the chat-- because before we revised our definition to remove ASC students from the low literacy tag, we checked in with CASAS, as we do whenever we make these changes, to make sure that we are aligning with their definitions.
So we'll go back and verify that, because the response that we received is that ASC was not tagged as low literacy. So thank you for bringing that up, Dana.
Dana: Yeah.
Blaire Toso: Oh, go ahead Dana.
Dana: That was one of my confusions in listening to this: you're talking specifically about LaunchBoard, and yet the systems that we use, like TE, may not be doing the same thing. So if we could get some clarification-- because I look at TE and run my reports, and then I look at LaunchBoard, and they don't match up. If there could be some consistency, and if we could get clarification on whether the definitions are the same in both systems or not, that would be great.
Jessica Keach: Yeah, I was going to just briefly read out the new definition-- that might be helpful. And for context, Dana, what we do for all of these metrics is look at the TOPS data and the COMIS data, but we also understand that students may appear in both, so there's a merging of those records.
So for low literacy reportable individuals-- this is AEP 5.0-- if a student comes from TOPSpro, we're looking to see if they have ever been flagged as low literacy under barriers to employment, or if they have ever enrolled in a basic skills ABE program area. And that would be at any time up to and including the selected year, at any institution.
Or they met the COMIS definition: students who have ever been flagged as low literacy-- that's through SG20-- or ever enrolled in ABE at any time up to and including the selected year at any college, or they were flagged in TE as being low literacy. That's how we look across systems for those students.
So again, what we're looking at in LaunchBoard is two sets of data: whether students were flagged in their respective systems as low literacy, or whether they were ever enrolled in ABE. So that's the definition, but I definitely hear what you're saying in terms of alignment-- we're continuously working on that. I put my email in the chat, so please feel free to reach out to me if you want to walk through any of those definitions. And I think Jason has his hand up.
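The either-system rule described above could be sketched roughly as follows. This is only an illustration, not the actual LaunchBoard implementation; the field names (`ever_flagged_low_literacy`, `ever_flagged_sg20`, `ever_enrolled_abe`) are invented for the sketch and are not real TOPSpro or COMIS schema.

```python
# Hypothetical sketch of the AEP 5.0 low-literacy rule described above.
# A student counts as low literacy if EITHER system's test passes; records
# cover all years up to and including the selected year, at any institution.

def is_low_literacy(topspro_record, comis_record):
    """Return True if the merged student record meets the low-literacy rule.

    Each record is a dict of cumulative flags, or None if the student
    never appears in that system.
    """
    if topspro_record:
        # TOPSpro: ever flagged under barriers to employment,
        # or ever enrolled in a basic skills ABE program area.
        if (topspro_record.get("ever_flagged_low_literacy")
                or topspro_record.get("ever_enrolled_abe")):
            return True
    if comis_record:
        # COMIS: ever flagged through the SG element,
        # or ever enrolled in ABE at any college.
        if (comis_record.get("ever_flagged_sg20")
                or comis_record.get("ever_enrolled_abe")):
            return True
    return False

# A student who appears in both systems is merged into one record pair,
# so a match on either side is enough:
print(is_low_literacy({"ever_enrolled_abe": True}, None))  # True
print(is_low_literacy(None, {"ever_flagged_sg20": True}))  # True
print(is_low_literacy({}, {}))                             # False
```

The point of the sketch is the "or" across systems: because the two data sources are merged per student before the rule is applied, a flag in either TOPSpro or COMIS is sufficient.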
Jason: Hi, good afternoon. Happy April Fools. I was just wondering, considering the differing methodology: right now in NOVA, the 19-20 data is from AEP 4.0. Will the 20-21 data be reflective of the new methodology? If so, will the 19-20 data be updated in NOVA to also reflect the new methodology?
Because I know folks are already completing their plans, and a lot of folks were using 19-20 as their baseline, and with the reportable individuals definition changing a little bit, that can really change how you set your targets too. So I'm wondering how that would look in NOVA, how it would be reflected, and how we have to look at it for our three year planning.
Jessica Keach: I can answer that, I think. Like Blaire mentioned earlier, when we make these updates-- like the change to the reportable individuals definition-- they are made for all years across the dashboard. It is my understanding that the data that's in NOVA will be updated with the new release of the LaunchBoard and the new definitions.
We also recognize that that is challenging because you're going through three year planning, and I'll defer to Mayra and other folks who are more engaged in providing technical assistance on that issue. But our goal is to make sure that the data is consistent across time, so when we change a definition, we apply it historically.
Jason: Yeah, we noticed that when we were just messing around with it, because what's already in NOVA is already very different. So we were just wondering how we have to work with that and what we would have to tell our consortium.
Mayra Diaz: Yeah, and that's where the NOVA enhancements come in-- we're working on it so that once AEP 5.0 is released, there's a smooth transition into NOVA and you can access the data. And there will be some enhancements in NOVA. In a previous webinar, we talked about how the annual plan link is going to be inactive for now, so that the enhancements can be made in NOVA to allow some of these connections to sync. We will also be providing some additional communication on that, so hopefully that should be coming out soon. Please keep an eye out for it from our STOE team.
Jason: Awesome. Thank you so much.
Blaire Toso: Thanks for your question. Chris, you have your hand raised?
Chris: Yes. You mentioned people who are marked as low literacy in TE by fulfilling a certain requirement, but I was just wondering-- it led me to thinking-- is there a commonly accepted definition of what low literacy is from that point? Because I'm not sure that, at least in the past, our counselors-- the people who are marking that-- upheld a commonly accepted definition of what it might be.
I know at one point it was probably: if you were in an ABE class, you were low literacy. But beyond that, I'm not sure how we thought of it. So it seems like it's approaching a chicken-and-egg situation: you're marked as low literacy-- well, what is low literacy? You're low literacy, so obviously that's what we're marking you as.
Blaire Toso: Thanks, Chris. That's an excellent point. The AEP dashboard does not set those common definitions. Those definitions are taken up by your data systems or your coding. So for example, CASAS would set TE's definition of low literacy, and then there is also low literacy as it's defined in MIS.
But your point is very well taken-- it might be interesting to go back and set a common definition that can be broadcast out to MIS and CASAS, and to see how they match up, because it would certainly allow people to code in alignment with something. So that's an excellent point. Thanks, Chris.
Jessica Keach: Yeah, and I'll just note-- again, excellent point-- I believe that in MIS, and I can look it up and put the variable in the chat, it's an individual low literacy mark, one or zero. And these SG elements are actually relatively new-- probably three years old, maybe a couple of years-- and they were created to support reporting for the Adult Education Program.
And I believe North Orange actually did a little bit of work around trying to come up with some common definitions. So I think that's definitely something the field should continue to pursue, as well as working with the Chancellor's Office on defining some common definitions, because I don't think that they currently exist. So are there any other questions?
Blaire Toso: We really appreciate you posing difficult, complex questions, which evidence both the thought that you all put into the work that you do and reflect the work that we do in trying to reflect the data back to you. As I said, we'll take the questions that you've asked and some of your concerns, pull them through, and look at how we start fresh and begin to revisit our definitions and apply these questions to our work.
And we'll have those conversations and make sure we circle back with CASAS TOPSpro as well as with MIS and ask some of those questions, and pose to the field that maybe a group needs to come together to create some clear definitions of what low literacy is, along those lines. And we'll capture the answers that we can provide at this time and make sure that they go out when we send out the PowerPoint. Oh, I'm going to hand it over to Ayanna. Thanks, Ayanna.
Ayanna Smith: Sure. So now I want to share with you all that we do have some more webinars coming up over the next couple of weeks, after the dashboard officially launches. So if you want more information about how to better utilize the dashboard with your three year planning, and some of the ways we suggest or recommend using that data, please feel free to join us.
We have all the dates up on the screen. There are some individual webinar opportunities such as this one, and then we also have some two-part opportunities. For those, just make sure that you register for both part one and part two so that you get all of the information.
And if someone can post the registration link in the chat-- it's the same place where you registered for this webinar. All of those should be online for you to go ahead and sign up.
Jessica Keach: OK. I think that's it.
Blaire Toso: Yeah. Thank you all for joining us and have a wonderful weekend. Thank you very much.
Ayanna Smith: Thank you, everyone.
Mayra Diaz: Everyone have a great weekend.