Blaire Toso: Thanks, Veronica. I appreciate the introduction and all the housekeeping rules. And I see people posting who they are or where they're from in the chat, so welcome to all of you. Really appreciate you joining us today to learn more about the Adult Education Pipeline dashboard.

So this is really an overview of the purpose, the key metrics, features, supportive resources, and a walkthrough of the dashboard so that you'll have a general understanding of the dashboard and how it's informed and built.

And just as a note, we oftentimes refer to it as the AEP dashboard or AEP. So that's our shorthand as we go through this. Your very first acronym of the day. Lucky us, right? So thank you.

Next slide, Ayanna. So today we are from WestEd. And I am Blaire Wilson Toso. I'm the senior program manager, and I oversee the CAEP work on the WestEd side of things, and that work really circles around and highlights the Adult Education Pipeline dashboard.

Well, we are also going to hear from Jessica Keach who is very deeply embedded in the build of the dashboard. And all of your data questions, she will happily answer or definitely research and get back to you.

And then we also have Ayanna Smith who has recently joined our team and is a force in and of herself. And she will be supporting this webinar, but look forward to hearing from her in a more substantive way in the future. So that is our team. And I'd like to pass it over to the Chancellor's Office.

Mayra Diaz: Thank you, Blaire. Good afternoon, everyone. On behalf of the Chancellor's Office, we would like to welcome you and also thank you for participating in today's webinar presentation. My name is Mayra Diaz. And I am the new CAEP program lead with the Chancellor's Office.

I'm excited to join adult education and work to lead the wonderful programs that serve to provide opportunities for our unique students. I want to acknowledge all the hard work that you have all done during these challenging times to continue to serve our students while maintaining operation of these programs and services.

Our outstanding and expert partners from West Ed will provide you with an excellent overview of the Adult Education Pipeline dashboard and provide valuable information to assist the field as they are working on their three-year plans. Spring is in full swing, and we understand it is a busy time for CAEP as work is being conducted to produce reports and meet important deadlines.

We hope you find today's presentation valuable and are able to take this information to assist you in your role. And I would also like to introduce my colleague from the Chancellor's Office that has joined the team in supporting CAEP efforts, Lindsay Williams.

Lindsay Williams: Good afternoon, everyone. Thanks for attending. And it's Lindsay Williams. I am the new program assistant for the CAEP program at the Chancellor's Office. I look forward to working with you all here in the future. Thank you.

Blaire Toso: Thank you. Thank you, Mayra. Thank you, Lindsay. We really appreciate that you're here and are such strong collaborators and supporters of the CAEP work. And we appreciate your attendance. And any insights that you want to add to the presentation, please feel free to just pause and pop it in the chat or just interrupt us.

And that goes for the full audience as well: we are a fairly informal group as we present. Today it's an information-laden presentation. I see some of the names, and I know some of you will be familiar with what we are reviewing today, but we are going in a little bit more deeply, and I hope that this will provide you with some additional information as you go forward in your planning.

So with that said, please feel free to post questions in the chat, raise your hand as we go through the presentation. At certain points we may ask you to unmute so that we can have a fuller discussion.

And then after this orientation to the dashboard, we hope that you will join one of our other presentations, such as the three-year planning presentation or the session on identifying equity gaps in your programming that will be offered later this spring.

This is our agenda for the day. We've got the introductions and a LaunchBoard overview. We'll look at the Pipeline, key terms, dashboard features, and then walk through some of the other pieces of the dashboard, inclusive of a live demonstration. Slide, please.

So first, what we'd like to know is a little bit about you all. And we're going to kick it off, because we know that data is often seen as complicated, as a compliance task, or used for oversight, and sometimes it's obscure. And so we'd like to start off by switching up the discussion and think about data as a tool to inspire and inform us.

So, for example, on this slide, we can see that 4.5 million-- can you click on to the next piece? Yes. Approximately 4.5 million people over the age of 18 don't have a high school degree.

And then if we look at the data up in the blue oval, the AEP dashboard identifies that CAEP programs supported nearly 470,000 adult learners to access 12 or more hours of education, which we know for adult learners is a sizable investment of their time. So during that time, they're building literacy, language, digital, and workforce skills.

And knowing that a little bit of education for adults goes a long way, we are using these two data points to inspire us in our work to figure out how to reach more of that 4.5 million learners who could access and benefit from adult education services.

And if you think about how data inspires us, it's inspiring to know that 4.5 million possible participants offer a lot of opportunity to us as adult educators to make a large difference in the adult education population. Next slide, please.

So we'd like to take a moment. Many of you have already put your name and institution in the chat. But we'd like to also know what data point or key idea about adult learners compels you in the work you do in serving them. And if you could please type your answers in the chat.

So another way of framing it is how does data inspire you in the work that you do?

Thank you. "Program improvement and access," yes.

Thanks, Emma. "The number of students that are college educated."

Oh, interesting, Doreen. Thank you. "Data can help people believe." I like that. Yes, "Data can help identify services that will accommodate them." Cathleen is saying, "Watching individuals make learning gains, completed levels, improving English."

Yeah, so all of those pieces. Sometimes we flip it and think, oh, it's just showing a number, but that number is also a measure of what people have achieved. And then when there are gaps, instead of saying there's a gap, we can say, wow, that is a space that we can fill. "Validating the efforts that are paying off," exactly, thank you, Jim.

All right. Keep putting them in there. I find it inspiring to find out how people are using data. Yes, data can show successes and areas that need to improve, right? Needing to improve doesn't mean that you haven't been successful. It's just an opportunity, again, to move forward. All right. Thank you. Can we go to the next slide?

So we'd first like to introduce the Adult Education Pipeline dashboard by talking about the LaunchBoard, which is the home of the Adult Education Pipeline. The LaunchBoard is a suite of tools that supports California Community Colleges' statewide initiatives as well as adult education programming. It helps to track student progress across education systems and towards economic mobility.

The AEP dashboard is one of that suite of dashboards. All of these dashboards, including the Adult Education Pipeline, serve to inform professionals, policymakers, and the general public-- because it's public, open to anyone who would like to come and look at the data there-- about the work that is done in adult education and community colleges.

A really unique aspect of California's work is that it's available to the public, so anyone can come in and view the really excellent work that's going on. But it's important to know that these dashboards don't exist in isolation.

For example, the Student Success Metrics dashboard and the AEP dashboard both offer data on noncredit learners. However, the intent and presentation of the data will be different depending on the purpose of each of these dashboards. And as I said, the AEP focuses specifically on the adult learners as defined by CAEP.

I'd like to mention that these dashboards are built through partner collaboration. While WestEd provides program management for all the LaunchBoard dashboards, we work with Educational Results Partnership to design data visualizations and data metric calculations. We work closely with the Chancellor's Office to continue to make it a robust dashboard and to identify data points that will inform people in their work.

And we also are informed by talking to the field, for example yourselves, and getting feedback from other agencies like the RP Group and the Centers of Excellence. So it is not a singular effort. It's many people informing it. And it's also used to support other work. Next slide, please.

So now we are going to focus specifically on the Adult Education Pipeline, which, as we all know, is centered on the adult learner. Next slide, please. The AEP is meant to serve multiple audiences, as I mentioned: administrators, consortia, practitioners, and policymakers primarily.

It's been developed to help us explore questions, not just to track outcomes, but to explore questions that we have about the system, about consortia, about programs, and about the learners. It's also been developed to support program planning which is particularly useful as the AEP dashboard identifies key aspects of the learner journey, from enrollment to transitions, educational gains, and employment outcomes.

It also gives practitioners and decision makers access to more data about students and programs. So it's meant to be a fairly deep dive. The dashboard can answer questions like, what type of students are in my program, how many students leave and earn a living wage, how do my success outcomes compare across different student populations or programs.

And we want to be very clear about the fact that the data in the Adult Education Pipeline is not real-time data. People ask us this, and it is not real-time data. This is purposeful, as we go through a process that ensures reliability and validity as delineated by the Chancellor's Office.

And that's really so that we can track outcomes longitudinally as well as year by year. And we'll cover much more of this in depth as we go through some of the key metrics. Next slide, please.

The dashboard is also important as it helps us understand alignment with the Vision for Success goals. Through our dashboard, you can easily identify completion of credits, credentials, certificates, and skill attainment. You can use the data to focus on filling equity gaps as well as achievement gaps. And these can all be viewed by different locales: region, consortium, and institution.

The AEP also allows us to look at the beginning of the route for some of the adult learners for transitions to further educational experiences. So you can begin to see how these align with the success goals. And even though we may not be fully looking at the transfer to UC or CSU, we do track those initial transitions that students are making from one CAEP program to another or into additional post-secondary and training opportunities. Next slide.

So we stand here and talk about the value and the validity and, oh, that you might want to use the dashboard in this way or that way. But as I said up front, we also get a lot of feedback from the field. And we do want to know how are people actually using it, so you don't have to rely just on us to understand how the dashboard and the data can be used.

And so we reached out continuously to consortium directors, administrators, researchers, and practitioners to better understand how this work informs what you all are doing in the field. We also engage a team of field testers to review the adult education dashboard prior to the yearly launch, which is coming up in the next month.

And that effort is going on right now. We've got practitioners, administrators, and consortium directors testing what the 5.0 launch is going to look like and gaining feedback from them. So in reaching out to the field, we've received a variety of responses about how the AEP helps inform their work.

And it's usually to highlight data from the Adult Education Pipeline to explain how data from TE and MIS are summarized in many different ways. So people use it to get a holistic view of what's going on in their consortium. They have said that they use it to identify some of the similar pieces that you all posted in the chat: trends, gaps, outcomes, growth and success, and transition numbers.

And you can see how they also use the AEP to explain their work to other people in the field and how it might relate to their work, especially for stakeholders. And you can see on the slide that some of these are direct quotes from people who have gotten back to us or whom we have talked to in helping support their efforts.

And just a quick chat question: has anybody used the Adult Education Pipeline to prompt questions? Yes, Ray, we will be identifying where the data is pulled from. So if you all have any ways that you've used the Adult Education Pipeline, please go ahead and post it in the chat, and we'll pick that up later. Next slide, please.

So, Ray, there are multiple data sources that inform the Adult Education Pipeline. And we use them to track students at different parts of the journey. The data sources, while they may be different, are matched by identifiers like SSNs or the combination of a name, date of birth, and gender.

And Jessica will talk specifically about how we go about this process of deduplication, so we aren't getting an overrepresentation of the same students with different outcomes from different institutions.

The main sources that we use for the Adult Education Pipeline are MIS, employment and earnings data, and data from CASAS TOPSpro Enterprise. And so we use all three of those sets, but our primary data sets are MIS and CASAS TOPSpro Enterprise.

And here you can see how we tie all those data sets together: we work with the existing data systems. And we do not require additional data reporting from any institution or consortium, right?

We use the data sources that people are already reporting to. And the main ones, while there might be the wage data, the main ones really are CASAS TOPSpro Enterprise and the Chancellor's Office Management Information System, which we refer to as TE and MIS.

So as most of you know, all the programs that are funded by WIOA II are required to report into CASAS TE. And that's for K-12 adult schools as well as noncredit community college programs.

Those data also get reported to the California Department of Education, to CDE, whereas CAEP-funded, non-WIOA II, noncredit community college programs report through MIS exclusively. And that's why we tie in both of those: we need to represent all the institutions who serve adult learners under the CAEP system.

The data for these two data systems are combined on the Adult Education Pipeline using a series of metric definitions where we set up an equitable comparison between the two different data systems so that if you are reporting outcomes, they are counted on a like system and reported as a unit to your consortium.

And then you can see them separately: if you're from an institution that reports into TE and not into MIS, those data will be reported under the metrics as they appear in your CASAS TOPSpro data submission, and similarly for institutions reporting through MIS.

And we, as I mentioned, we go through a yearly update. And currently we are finalizing the update for this year which will be the 2021 data. And we are testing that to make sure that the data is both being calculated properly and showing properly. And people want to know why we update yearly.

It's not simply to refresh the data, to add another year of data; it's also to align to other dashboards. We also incorporate feedback from the field or from any other sources, both through the user testing and as we engage in conversations. And we also identify suggestions through our professional development activities.

If we hear something in our technical assistance where we're working one on one with an institution or consortium, we incorporate that information. Sometimes we identify a coding error or a glitch and so that needs to be corrected so that data can be more precise.

And then we also look at precision of definitions, which means we refine the definitions as the field changes theirs. So, for example, through conversations with the field we may be informed that the field has a different definition as they have worked with their data.

It might be changes in the way CASAS TOPSpro is collecting data, for example if they've added different metrics. They recently added the immigrant integration metric, so we want to refresh and make sure that that data is provided on the dashboard.

Also, if there are changes in MIS. For example, last year we had an addition of an intentional high school equivalency attainment flag, and so we updated for that as well so that people have the best way of reporting their data and seeing their data. Next slide.

So in making the dashboard user friendly and student centered, we've structured it to focus on the learner. We have structured it to focus on the learner journey, and that is where it starts, with students and programs, and then it looks at the progress that learners make while in programming: progress, transition, success, and employment and earnings.

But I want to be clear that if your student does not achieve one of those successes or a transition, they are still represented on the dashboard. If a student has no gains or no transitions, they will still be counted in students and programs. And if we get an employment and earnings outcome for them, that will also be visible on the dashboard.

One thing to note is that many of the later ones, the progress, transition, success, and employment and earnings metrics, are based on participants, those students who have accrued 12 or more hours. So in other words, we capture their journey no matter how long or short. Next slide.

So quickly, before I hand it over to Jessica, who'll be diving into the real meat of the dashboard, we wanted to frame the high-level features of the dashboard so that when these slides are available, you can return to them and see how the next section fits together. So, briefly, we offer the visualization by region, consortium, and institution, by program year.

We have six high-level metrics on the tiles, and then you're able to dig deep into them. We offer the AEP score card that is easily accessible as a tile. So you can just quickly look in and see how you're doing on your success metrics, which are also reported to the legislature.

Each page has summary infographics, and then we have detailed data charts. And we also have the feature where you can disaggregate most metrics by race, ethnicity, gender, age, program, and also by first-time and continuing status. And we've also added a comparison view on some of these features.

So you can identify how you're doing from year to year, or you can identify how other institutions compare and who you might be able to identify as a resource if you have a question about an area in which they are performing strongly.

All right. Now I'm going to pause and ask if there are any questions. I saw that, Ray-- thank you. I think Jessica responded to your question. And thanks, Doreen, who said she did a presentation using the data for professional development. Great. Good to hear that. Really appreciate it. All right. I am going to go off camera and hand this over to Jessica.

Jessica Keach: Thanks, Blaire. Good afternoon, everyone. My name is Jessica Keach, and my pronouns are she, her, and hers. And I'm going to start by walking us through some slides on the Adult Education Pipeline's key terms and some important processes.

All right. We'll start out with something that is fundamental to the dashboard. As Blaire shared earlier, the Adult Education Pipeline takes two separate data sets, data from TOPSpro Enterprise as well as data from MIS and merges them together to provide information on adult learners. Well, as you can imagine there are students that appear in both data sets.

The LaunchBoard uses a derived key which is based on a combination of last name, first name, date of birth, and gender to identify unique students across multiple data sources. Now, there's some confusion on what that means and what that results in in the dashboard. So I'm going to be really clear that ultimately a student will only appear once within a specific locale on the dashboard.

So, for example, if a student attends a community college and an adult school that are within the same consortium, if you are viewing the consortium view on the dashboard, that student will appear only once. However, if you're viewing at the institution level, that community college data or the adult school data, that student will appear in both those independent views.

So as you get higher in level of location, at the consortium level, at the region level, at statewide, we're only seeing those students' outcomes counted once. And as you go lower into disaggregation at the institution level, you're seeing those students captured in each of those views. Next slide, please.
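To make that deduplication idea concrete, here is a minimal sketch of how a derived key could be built and used to count a student once per locale. The field names and normalization rules are assumptions for illustration, not the LaunchBoard's actual implementation.

    # Illustrative sketch only: builds a derived key from last name, first name,
    # date of birth, and gender, then counts each student once within a locale.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class StudentRecord:
        last_name: str
        first_name: str
        dob: str          # e.g. "1990-05-14"
        gender: str       # e.g. "F"
        institution: str
        consortium: str

    def derived_key(rec: StudentRecord) -> tuple:
        """Combine identifying fields into a single matching key."""
        norm = lambda s: s.strip().lower()
        return (norm(rec.last_name), norm(rec.first_name), rec.dob, norm(rec.gender))

    def unique_count(records, level: str) -> dict:
        """Count unique derived keys within each locale at the given level."""
        seen = {}
        for rec in records:
            locale = getattr(rec, level)        # "institution" or "consortium"
            seen.setdefault(locale, set()).add(derived_key(rec))
        return {locale: len(keys) for locale, keys in seen.items()}

    records = [
        StudentRecord("Lopez", "Ana", "1990-05-14", "F", "Adult School A", "Consortium X"),
        StudentRecord("Lopez", "Ana", "1990-05-14", "F", "College B", "Consortium X"),
    ]
    # The same student appears once at the consortium level but in both institution views.
    print(unique_count(records, "consortium"))   # {'Consortium X': 1}
    print(unique_count(records, "institution"))  # {'Adult School A': 1, 'College B': 1}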

OK. So two key student type definitions in the dashboard are reportable individuals and participants. Reportable individuals are students with one or more hours of instruction or positive attendance hours, and/or students that have received services at a K-12 adult school or noncredit services at a community college.

Participants are students that have received 12 or more hours of instruction or positive attendance hours in an adult education program area. And like Blaire mentioned before, participants is the most commonly used denominator in the dashboard. And we're only tracking our outcomes for participants. The number of hours in this participant calculation is cumulative across CAEP program areas and across institutions.

One final thing that I want to note here is that in the upcoming release of the Adult Education Pipeline, the reportable individuals definition of one or more hours will be aligned to one or more hours in a CAEP program area. Currently, it's counting one or more hours in any noncredit course, and this is specific to community colleges.

So we'll have a webinar, an upcoming webinar. We'll actually have two, one before the release and one after, that helps really dive into some of these definitions. And I really encourage you to attend that to learn more about the details behind what goes into these metrics. Next slide, please.

OK. Here's a look at our CAEP program areas. These are the program areas that fall under CAEP. First, English as a Second Language; the Adult Basic Education; Adult Secondary Education; Career Technical Education which is the umbrella for workforce preparation, commonly referred to as workforce reentry, pre-apprenticeship, and short-term CTE.

We also have programs for adults with disabilities and programs training to support child school success. And on the right, you can see a screenshot from the dashboard that looks at the participants in each of those program areas. So you're able to get down to that program area view. Next slide, please.

OK. I'm only going to spend one slide talking about math. But let's refresh on what a denominator and a numerator are. So the denominator, which is what we see in the bottom of a fraction, is the total number of students who can be considered for, or identified in, a metric.

The numerator, so it's on top of the fraction, is the total number of students who meet the criteria of the metric. And these concepts are really important when you're looking at the proportion of students who are meeting specific outcomes. So we'll take an example, the proportion of students that are achieving an educational functioning level.

The denominator in that metric is the number of participants enrolled in ESL, ABE, or ASE programs. And the numerator is, out of those participants enrolled in ESL, ABE, or ASE, the number that completed an EFL gain, gaining an EFL level by pre-test/post-test or by course progression in the same program area. OK, that's enough math. Let's go to the next slide.
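To make that concrete with a purely illustrative calculation (the numbers below are made up, not dashboard figures), the proportion shown for the metric is simply the numerator divided by the denominator:

    # Purely illustrative numbers, not dashboard figures.
    participants_esl_abe_ase = 1000   # denominator: participants enrolled in ESL, ABE, or ASE
    achieved_efl_gain = 250           # numerator: those who gained an EFL level
    rate = achieved_efl_gain / participants_esl_abe_ase
    print(f"{rate:.0%}")              # prints "25%": the proportion displayed for the metric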

All right. These are lagging metrics. Just so we're clear, lagging metrics are metrics that by definition take time to measure beyond the time frame of currently available data, right? So an example would be our employment and earnings metrics.

We have data that's currently available, but we need to wait until the next year of data comes in to check whether the students in our current data set have met a particular outcome. And in the case of employment and earnings, we need to wait until we get the next year of data to see if they have exited the education system and then look for their wages and employment records.

On the dashboard, if you're trying to view an employment and earnings metric as well as some of our transition metrics, and you're trying to view it for the most recent year, you'll be prompted to view an earlier year. And that's how you'll know you're trying to look at a lagging metric when that data isn't available for the most recent year. OK, next slide.
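As a small illustration of that rule (the year labels and cutoff below are assumptions, not the dashboard's actual logic), a lagging metric for a selected year can only be shown once the following year's data has been loaded:

    # Illustrative sketch of the lagging-metric rule described above.
    LATEST_YEAR_LOADED = 2020   # assumption: program year 2019-20, labeled by its end year

    def can_show_lagging_metric(selected_end_year: int) -> bool:
        """True only if the year after the selected year is already loaded."""
        return selected_end_year + 1 <= LATEST_YEAR_LOADED

    print(can_show_lagging_metric(2020))  # False -> dashboard prompts for an earlier year
    print(can_show_lagging_metric(2019))  # True  -> metric can be displayed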

Now I'm going to walk through some highlighted dashboard features. And I'm going to do this at a pretty high level, because these are just screenshots that we'll be looking at. And at the end of the presentation, I'm going to do a live demonstration. But I'm going to try to stick to some of the key points. So let's go to the next slide.

OK. We'll start off by asking ourselves the question, how do I find my CAEP score card metrics? So as Blaire mentioned earlier, we have these data tiles on the home screen that correspond with the student journey.

The AEP score card metrics are on the top left data tile. And you can see that where it says AEP Score Card. And on the slide, you can see there's a red box around it. So in order to get there, I'm going to select my location and then I'm going to click View AEP Score Card. That's going to take us to the screenshot on the right.

So a long, well, not so long, a list of the metrics required to be reported to the legislature annually. These are your key metrics that you really want to focus on, and it will provide a high-level overview of your outcomes for the year that you've selected. You'll also be able to click on those little Vs or carets and expand to see time trend information.

This information is for reportable individuals, participants, educational functioning levels, workforce preparation milestones, transition to post-secondary, information around completion, employment as well as earnings. OK. Next slide, please.

All right. So when you're setting filters and you're looking at the AEP dashboard, your high-level filters appear at the very top. Here you're able to select your locale, your institution, as well as the academic year of data you want to see, and also this will be really important when you're trying to look at lagging metrics. And then you'll click View. A really important thing to remember is you have to click View to refresh the data with your applied filters. Next slide, please.

All right. We talked a little bit about the data tiles. This is a close up of the Student and Programs data tile. So you can see that there's the live graphic that corresponds to a component of the student journey. And then each data tile will have that blue View button that you can click on to jump into more detailed data.

And here is a look at the detailed data. So you can see on the left, we've clicked on the View Students and Programs button, navigated to the detailed data, and now we're able to look at each of our metrics over time. So this provides you with the time trend, so you can compare not just your current year outcomes but compare them to previous years as well.

OK. Most progress, transition, completion, and employment metrics can be disaggregated by population and program. And you can see the yellow, orange, and purple boxes around Drill Down, Program Type, and Student Type. So you're able to select from each of these dropdowns to look at your data in a more detailed way.

Specifically, I want to point out the drill down function that allows you to disaggregate metrics by gender, race, ethnicity, and age group. This can be a really useful tool when you're trying to understand equity and student outcomes.

You may also want to compare your disaggregated data to your local demographics to see whether there are learners in your community who you are not serving. Are there learners not adequately represented in your programs? And you can do that by visiting the CAEP Fact Sheets. And we have several resources on the CAEP Fact Sheets.

If Blaire or Ayanna can you put that link in the chat? That would be great. It's a great place to look at your local demographics and the demographics of your community and compare them against the demographics of the population you're serving in your programs.

OK. So let's put this all into practice. Say we want to answer the question, who is achieving transition outcomes in my program? And let's say I want to specifically focus on transition to CTE. I can visit this metric by navigating from the Transition tile which you see on the left, click on that blue button, View Transition, and navigate to the Detailed Data tab.

And then I can select Transition to CTE and disaggregate my outcomes by race and ethnicity. It's pretty small, but in this specific example, I can see that American Indian/Alaska Native students as well as Pacific Islander and Native Hawaiian students have lower rates of transition to CTE compared to other student groups.

So this illuminates the potential need to provide services or additional supports to these target populations. And I also really encourage you to look at this data across time. So to look to see if certain groups have consistently lower rates of transition to CTE or any other metric year after year. And this can be helpful in really making sure that you're confident about some of the inequities that you're seeing in your data. Next slide, please.

OK, our top five charts. This is an opportunity to help you identify institutions you may want to collaborate with or learn from or reach out to see if there are specific strategies that institutions are implementing that are really working well.

So it's showing us the top five institutions across the state based on the percentage of students that are achieving a specific outcome. Here we're looking specifically at employment four quarters after exit. There's also a top five chart for transition to post-secondary. You can look at the top five institutions across the state as well as within your region.

And in the next edition, the upcoming release of the Adult Education Pipeline, you'll be able to toggle between the top five institutions based on the percentage of students that achieve that outcome as well as the number, the volume of students that achieve that outcome. So it'll be two different view options in the new dashboard release.

OK, tool tips. So anywhere you see this blue question mark, this little blue button, you can click on it, a pop up will appear, and you'll be able to see more detailed information on how to understand the data that you're seeing. And we're hoping these are helpful in providing more clarity on definitions. For the next release of the dashboard, we've done a full review, so you'll see those refreshed as well.

This is my favorite view and my favorite feature of the dashboard because this, the detailed data comparison feature, helps us really answer the question compared to what. So you can see your outcomes for your students, but what does that mean in context? And this feature allows you to answer that question.

So you can compare the outcomes for students in your consortium at your institution to the same outcomes for students across the state or in the region. So I really encourage you to explore this view and see how it can provide more context to your data.

Finally, there's an export feature that you can use to export the aggregate data that's in the dashboard. And this is if you want to work with it in a different way and manipulate it in a different way.

OK. So next section, only two slides, what does coding have to do with it? I want to briefly take a moment to talk with you about coding and why it's so important. And when I say coding, I'm referring to the way your institution enters and categorizes data and the important codes that are assigned to things like courses and programs.

OK. So we know that accurate coding ensures more reliable results. I want to encourage you to engage with your research and/or consortium staff and colleagues to review the definitions in the Metric Definition Dictionary. It's available on the CAEP website or on the dashboards website. You can download it, and you can view it live in the dashboard as well.

It's really important. You'll be able to see the new Metric Definition Dictionary in one of our upcoming webinars around when we release the dashboard, and you'll be able to view the new, refined definitions. It's really important to understand how students are getting counted in these particular metrics.

And specifically for our community college noncredit programs and partners, it's really important to understand how students get counted as reportable individuals as well as participants. It'll be important to review your noncredit course category codes, CB22, and then also review your TOP codes, because those are important in determining how students get into different program areas as well.

OK. This is just a brief example of why coding matters so much. So I went into COCI, the Chancellor's Office Curriculum Inventory. And I searched for noncredit courses with computer in the title. And these three courses, well, many courses, but these are three of the courses that popped up, three different colleges, three different CB22 codes.

So one of these courses is coded to elementary and secondary basic skills. One is coded to courses for older adults. And one is coded to short-term vocational, with two different TOP codes. OK. So based on these codes alone and the definitions that we have, students would get categorized into different program areas.
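As a purely hypothetical sketch of why those codes matter, the routing of a course into a CAEP program area can be thought of as a lookup on its CB22 category. The mapping below is invented for illustration and is not the dashboard's actual crosswalk; the Metric Definition Dictionary has the real definitions.

    # Hypothetical mapping for illustration only, not the dashboard's crosswalk.
    CB22_TO_PROGRAM_AREA = {
        "English as a Second Language": "ESL",
        "Elementary and Secondary Basic Skills": "ABE/ASE",
        "Short-Term Vocational": "CTE",
        "Workforce Preparation": "CTE",
        "Courses for Older Adults": None,   # not routed to a CAEP program area in this sketch
    }

    def program_area_for_course(cb22_category: str):
        return CB22_TO_PROGRAM_AREA.get(cb22_category)

    # The three "computer" courses from the COCI example would land in three
    # different buckets, which is why reviewing these codes matters.
    for cat in ["Elementary and Secondary Basic Skills",
                "Courses for Older Adults",
                "Short-Term Vocational"]:
        print(cat, "->", program_area_for_course(cat))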

These codes may be completely accurate. But we want to make sure and encourage you to continue to review these codes and make sure they're aligned with the curriculum in your courses and encourage you to work directly with your local curriculum approval process if these codes need to be changed.

And you can reach out to your research office or your instructional support services offices, and they can get you in contact with the right folks to do some deep dives into the codes that are currently assigned and in the system for your programs and your courses. OK. I'm going to turn it back over to Blaire to chat briefly about the resources that are available. And then I'll be back to do a live demonstration.

Blaire Toso: Thanks, Jessica. So we know this is a lot of information, and not just in this presentation, but in using the Adult Education Pipeline dashboard in general. So we have developed a host of resources to support your work in understanding, navigating, and applying the information provided by the AEP dashboard. Next slide, please.

So, some of these resources: we'll be doing a series of CAEP webinars. We are offering several two-part sessions, so we can really dig in deep and have an opportunity for people to ask questions and explore data. We've developed how-to guides, data dictionaries, infographics, and other written resources.

And, also, we are very available. We are available through email. We'll do institutional or regional training, or we can do check ins with you. It is always a pleasure for us to be able to interact with you. And we also always learn from these interactions that we have.

It helps us to continue to push and be really attentive to the fine details, to understand what questions the field has, and understand how data can be represented and talked about with the field. So we consider ourselves also a resource. Next slide, please.

And among these resources, there's a resource library which you can access by clicking on Click Here to View Resources. It's at the bottom of each page on the AEP dashboard. You'll see it when Jessica does the live walkthrough, but it's right there. You click on it, and then you can search for a host of different written documents that help you understand the data.

There are deep dives like the CAEP Adult Education Pipeline dashboard coding guide. It covers a lot of the information that we talked about here, really in-depth and related to your work of how you might want to ensure that your data is represented on the dashboard.

One of the things that we really hope is that people are able to submit data, and then be able to really see the full representation of their students and the work that you all are doing on the dashboard.

There's also an AEP dashboard overview, which we have a little icon for there, and that's a much shorter, more condensed version of the Pipeline dashboard coding guide, but it also gives you some great graphics and some really easy walkthrough steps for how to navigate the dashboard.

We also have one that looks at why your data on the dashboard might look different from your TOPSpro data, a comparison of MIS to TOPSpro data. Those are all pieces that are there. And then we also have things like the Adult Education Pipeline FAQ. When you get the PowerPoint, you can also just click on these. These would all be live links in your PowerPoint presentation. Next slide, please.

So one of probably the heaviest and hardest pieces or at least the piece that people don't frequently use because it's such a dense resource is the Metrics Definition Dictionary. This is also linkable, it's accessible by the link on the AEP dashboard. And really it is what we call the one-stop resource for understanding the structure, the metrics, and other details that inform what you see on the dashboard.

You can see how it covers the data definitions: the different data points, how they're displayed, and any limitations or caveats. We also offer key source documents, such as the agency crosswalks, which show how your institution or your consortium gets identified into a region or a district, and the definition of a living wage and where you can find it.

And all metrics appear there, with their definitions and how they relate to COMIS or CASAS TOPSpro metrics. It will tell you how we are calculating each one and any notes that are of importance. And, Jessica, I don't know. Did you come on because I'm handing it over to you, or did you want to call attention to the question in the chat?

Jessica Keach: Yeah. I saw that Marina asked a question about the completed six or more college credit units. And I'm going to have to look more deeply into it. So I put my email in the chat. So if you'll just send me that question via email, I can look and connect with our coding partners to make sure I'm giving you the best, most accurate response.

Blaire Toso: All right. Super. I'm going to hand it over to Jessica to do the most fun part, the AEP dashboard itself. While it seems really heavy as we talk about all these pieces and components, it's a really fun tool to play with as well.

Jessica Keach: That's right. Thank you, Blaire. All right. So I'm sharing my screen. Just give me a thumbs up. I think everyone can see it. What we're looking at now is the home page of the Adult Education Pipeline.

So here you can see these data tiles that we talked about previously. We have our CAEP Score Card data tile. We have our five data tiles that correspond with the student journey-- students and programs, progress, transition, success, and employment and earnings. And then our last data tile allows you to export consortium data.

Let's start by picking a locale and visiting the AEP score card. So I live in San Diego, so I'm going to pick the San Diego-Imperial region. And you can just start typing and it should filter for what you're looking for. So I'm going to select San Diego-Imperial.

Now, remember what I said earlier. You can see that I made the selection, but nothing has changed. This is really important: you need to click on this View button to apply your filters. So once I click it, you can see that the data has been refreshed. I'll show you that again.

So let's go to a different region, South Central Coast. Nothing has changed. But once I click View, now you can see that the data has been refreshed. And we're seeing the results of the filters that we've selected in the top.

So I'm going to go back to San Diego-Imperial. And I'm going to visit my score card metrics for the region. Click here on this blue button, and it's going to take me to the Adult Education Pipeline score card.

At the first level, you can see the outcomes and the data for the year that's been selected, 2019-2020. And remember we have those lagging metrics, so our transition metrics and employment and earnings are not going to appear for the most recent year.

And you'll know that because they have a dash in the new score card, in the 5.0 version. We're going to have symbols that are the same across the board here. So you're going to see that the data will not show for those lagging metrics.

OK. And I can actually select a prior year. Now that data has been populated for those lagging metrics. So if you want to make sure you are seeing all of these full metrics and all of the data, you can select an earlier year. I'll go back to 2019-2020.

And, as I mentioned earlier, you can click on any of these carets to expand the data element. So now you can see the time trend chart for each of these elements. You can see it and can close it back up. I'm going to show you with reportable individuals.

So in 2019-2020 in the San Diego-Imperial region, there are about 73,000 to 74,000 reportable individuals. You can see that it's fluctuated a little bit over time. And when you hover over the line on these time trend charts, you can see more detailed information. So you can see 73,730.

You can see past years as well about 107,000 in prior years. And this is also the case for metrics that are calculating a proportion. You'll be able to see the percentage when you hover over. So I really encourage you to explore that option.

Let's return to the home page and revisit our question about the number of students that transition to CTE. So to find data related to transition, I'm going to find my Transition tile. And I can click on this Transition tile. But, remember, this is a lagging indicator. And we know that because it's prompting us to select an earlier year. So I could click right into the View Transition and get to that detailed data.

But I'm going to show you, again, and reinforce it. We've got to click this View button to make the filters apply. And now you can see all of our data tiles have been populated. So I'll click on View Transition. I'm going to navigate to the detailed data section. And then I'm going to find the metric I'm interested in, which is transition to CTE in the left navigation.

So you have a definition here, "Among all ESL, ABE, and ASE participants, the number of students who transition by enrolling in either K-12 adult education or community college noncredit or credit CTE course for the first time at any institution within the selected or subsequent year."

Here's our time trend chart. You can see that while the number of students has declined over time, the percentage of students meeting this outcome has actually remained relatively stable: 14% in 2018-2019, 14% in 2017-2018, and 13% in 2016-2017. Now I'm going to drill down to learn more about students from different demographic groups and how they're transitioning to CTE.

So I'm going to select race and ethnicity. So this chart is showing us raw numbers, but when you hover over the bars, you can see percentages. Now I'm going to do that. And I'm just going to pause for a moment to allow you to look at this data and digest it.

So in looking at this data, I'm able to identify groups with lower percentages of transition to CTE. And so this includes our Asian students in the San Diego-Imperial region. Their transition to CTE rate is about 13%. Our Hispanic students rate of transition to CTE is about 13%. And then our Black and African-American students is a little higher but still lower than other groups at around 17%.

And, again, I'm going to want to repeat this process for different years to identify groups that have consistently lower rates of transition over time, to see what opportunities for service I can think about and work with partners to identify in serving these populations that may be experiencing lower rates of transition.

Now, remember my favorite tool, the detailed data comparison that answers the question compared to what. So we know that certain student groups are experiencing lower rates of transition compared to others within the region, within my program. But what does that look like in comparison to the state?

So I'm going to scroll up, click Detailed Data Comparison, select Statewide, select the appropriate year, and click View. Our data has been refreshed. It takes us back to the time trend view. But I want to disaggregate by race and ethnicity. I really want to dig into some of these equity questions and see how students are performing and meeting outcomes.

So we can see here, I'm going to hover over each of these groups. Do you remember our Asian students, our Black African-American students, and our Hispanic students were experiencing those lower rates? So I'm going to check in comparison to the state what that looks like here. So you can see for our Asian students, we're actually seeing slightly lower rates than statewide figures.

I see the same thing for our Black African-American students. And the same thing for our Hispanic students. But also what's important to me and what I noticed here is that our rates are slightly lower than the state, but these inequities that we're seeing in our region are apparent statewide as well.

So you can also scroll down to see this detailed data table at the bottom if you want to take a screenshot or pull that data for your reports or presentations to the board. And I really encourage you to take this data and have those conversations. This data is a starting point. It's not an ending point.

And you can use it to facilitate conversations with practitioners, folks that are serving students, to help devise better programs, better services, and more opportunities to improve these outcomes, and also to inform further research. And you can inform that further research by looking at this data and developing qualitative approaches to capture the student voice.

It's really important to take the quantitative data but also to let it inform the questions you want to ask students through focus groups and surveys, to really collect that student perspective and understand the experiences of students.

So the last thing I want to show you is this transition, or I'm sorry, is the top five institutions for students who are transitioning to post-secondary. So I'm going to click on this. It's going to show me that we cannot display this metric because we're still in our detailed data comparison view.

So something helpful that the dashboard does, it will prompt you. If you've kind of navigated yourself into a place where the data is not going to be able to show, it will tell you why. So I'm going to go up. I'm going to remove the detailed data comparison. And I'm actually going to select Statewide because I want to see the top five institutions across the state.

So I've selected Statewide. I'm going to click View. And here you can see the top five institutions that are transitioning students to post-secondary. And these may be institutions that you want to reach out to and collaborate with, so you can better understand the practices and the projects that they're engaging in to support students in transitioning to post-secondary.

So I'm going to go back to the home screen, and I'm going to open it up for questions at this point. And, Blaire, I'm not sure if there's any more slides but happy to have a conversation about the dashboard, answer any questions, click on any views that you'd like to see.

Blaire Toso: Your timing is perfect. We were just about to move to questions.

Jessica Keach: Oh, good.

Blaire Toso: We've scheduled this to be a little bit of a longer webinar because we have found that our 60-minute webinar, as you can tell, we run up to this point and there's no time for questions, so we are happy for discussion. And as Jessica noted, if we don't know the answer, we won't provide you with one. We will find out the answer and get back to you.

So, as you can probably tell, there's a great deal of knowledge that goes into this. So I think that's great. Are there any questions, any curiosities, anything that you would like to look at as far as exporting data or showing where those resources live? Anything that you don't know quite how to get to, or you're wondering why it looks the way it does?

Cathleen Petersen: Hi. This is Cathleen Petersen from Garden Grove. Can you go ahead and show us the trail to get to the resources?

Blaire Toso: Absolutely.

Jessica Keach: Yes, let me show you that. OK. So we're on the home page here. And if you scroll down, you can see at the very bottom there is this link that says Click to View Resources. So I'm going to go ahead and click on it. And this is going to take you to our resources page and there's a whole host of resources like Blaire said.

We mentioned earlier the fact sheets. Here's a guide on how to understand the CAEP fact sheets, how data align, so it's really just right here at the bottom. And let me show you how it appears at the bottom actually of every page. So anywhere you are on the dashboard you should be able to go to the bottom and find this link to Click to View Resources.

Cathleen Petersen: Thank you.

Jessica Keach: No problem.

Blaire Toso: Jessica, did you want to talk a little bit about exporting the data for people who really like to get their hands dirty, so they can export it into a form that they can manipulate and do their own calculations?

Jessica Keach: Sure. So let's go to the Home tab. And you can View Export. And so you can select the consortium's data, community college district. You can select Include Institutions or not. And then you can specifically export just your score card metrics or you can export all metrics. And then you click on this button Export to CSV. May take a few moments. Let's see. Over here.

Let me make sure I'm sharing everything. Here we go. And this is the export that you get. So if you are a researcher or if you are interested in manipulating this data in some way, this is a good opportunity to do that. To be clear, this is not student level data. You're exporting aggregate data.

And then you can just look and see. Here's the consortium name and locale, the year of the data, and the metric ID. You can find the metric ID in the Metric Definition Dictionary. But the file also has the title of the metric, the description of the metric, the source, and then the different levels.

So you can see if it's data for overall or for a particular subgroup of students, first-time or returning. And then you have your value, which is your numerator; the denominator, which is the number of students considered for the metric; and then the percentage.

So you'll also be able to see if there is data missing or hidden because of privacy concerns and that's what these FERPA flags are and what they mean.

Blaire Toso: The FERPA flags are another reason why data may not show up: if you do not have enough students, and displaying them would compromise their anonymity, we go ahead and hide those.
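For anyone who scripts against that export, here is a minimal sketch of reading the file and skipping privacy-suppressed rows. The column names are assumptions based on the fields just described (metric ID, value, denominator, percentage, FERPA flag), so check the actual headers in your download before relying on them.

    # Minimal sketch for working with the aggregate CSV export; column names are
    # assumptions, not the file's guaranteed headers.
    import csv

    def load_unsuppressed_rows(path: str):
        """Return rows whose FERPA suppression flag is empty."""
        rows = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row.get("ferpa_flag", "").strip():   # skip privacy-suppressed cells
                    continue
                rows.append(row)
        return rows

    # Example: recompute each percentage from value (numerator) and denominator.
    for row in load_unsuppressed_rows("aep_export.csv"):
        value = float(row["value"] or 0)
        denominator = float(row["denominator"] or 0)
        if denominator:
            print(row["metric_id"], f"{value / denominator:.1%}")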

Jessica Keach: Are there any other questions? Anything else anyone wants to visit? OK. I think Emma has a question?

Emma Diaz: So, hello. Thank you for this great presentation. I have kind of a general question, and it goes back to the metrics for the three-year plan. We came up with an issue for the community college, in the sense that when we collect FTES, it's not always per student.

How are you identifying the number of students that you're selecting for community college, because I'm having that issue and working backwards with FTES to be able to put that into the metric section of the three year plan?

Jessica Keach: So for the Adult Education Pipeline, we aren't looking at FTES. So there are specific calculations to calculate FTES. And for the Adult Education Pipeline, as it is right now, what we're looking at for reportable individuals is the number of students who have one or more hours in adult education on noncredit specifically. That's what's in the dashboard right now or--

Emma Diaz: Yes. And how do you calculate that? That's what I'm trying to figure out is how do I calculate the number of students? Because that's the metrics in the three-year versus FTES, which is how we report the program area for hours, versus you're saying right now that the way it triggers is you have in that category the number of students who have been in a class for more than 12 hours. So I think all of those are kind of mixed in my head right now of how do you get to the number of students.

Jessica Keach: Yeah, it's confusing. So it's different for different data sets. In CASAS, the data is provided aggregated to the student level. There's a column that says number of hours and a flag for program area. So it's pretty straightforward for CASAS.

For MIS, there is an MIS data element called SX05 where institutions report the number of positive attendance hours at the student level that a student is attaining in a course. So we look at that data across all of the noncredit courses.

In AEP 5.0, that will be restricted to hours in a CAEP program area. And if the student has met the number of hours requirement, then they are counted as one, one student. So regardless of whether they took 10 classes or one class, it's really based on the student and the number of hours associated with that student.

There's also calculations that take into consideration services that a student received. So the best thing to do would be to look at the MDD, the Metric Definition Dictionary, to try to replicate those calculations. And if you want, you can just reach out to us, and we can help walk you through that and try to navigate some of the differences you might be seeing.

Blaire Toso: Jessica, do you want to bring up the MDD just so that you can walk through, or we can walk through, how it's broken out in that way? It's a great resource, and last year we took a really intentional look at it. Because when you open it up, it's pretty overwhelming. But if you know how to read it, it's actually really easy. You just have to understand what you're looking at, I think. So if that works, it might be helpful.

Jessica Keach: Yeah. So on every page of the Pipeline, you'll be able to see this box at the top that says, interested in how the data is calculated? See the Metric Definition Dictionary. So I'll go ahead and click on it. And then you can see-- you can click on any of these, but what I think you're talking about is reportable individuals or participants.

I will take participants because it's a little more straightforward, if I can find it. Which...see, y'all aren't the only ones. OK, right here. AE 202, students with 12 or more instructional contact hours. So you can see the description here.

And then in the MDD, there are always two components of the calculation, because the Adult Education Pipeline is looking at two different data sets, TOPSpro and COMIS. So like we mentioned, TOPSpro is pretty straightforward.

So what we're looking at is a student that has greater than or equal to 12 hours across the program year in any of these selected areas, and is enrolled in any of these selected areas, so these flags are equal to one. Let me go to MIS.
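For the TOPSpro side, where the data already arrives at the student level, a minimal sketch of that filter might look like the following; the hours column and the program area flags (esl_flag, abe_flag) are hypothetical names, not the real TOPSpro fields.

import pandas as pd

# Hypothetical student-level TOPSpro-style extract: total hours plus one
# flag per program area (names like esl_flag are illustrative only).
students = pd.DataFrame({
    "student_id": ["A", "B", "C"],
    "hours":      [30, 6, 15],
    "esl_flag":   [1, 1, 0],
    "abe_flag":   [0, 0, 1],
})

# Participant: 12 or more hours across the program year AND enrolled in at
# least one selected program area (any flag equal to one).
area_flags = ["esl_flag", "abe_flag"]
is_participant = (students["hours"] >= 12) & students[area_flags].eq(1).any(axis=1)
print(int(is_participant.sum()))  # 2 -> students A and C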

So here's the COMIS calculation. So the same thing: an adult education student, 16 years or older, that has 12 or more positive attendance hours in the year, summed across all noncredit courses in a recognized adult education program area, which are the program areas that we talked about earlier: ESL, ABE, ASE, CTE, adults with disabilities, or parenting support.

So here is the SX05 value that I was mentioning. So it's taking students' course records where the credit status is N, which means noncredit, and where CB03, which is the TOP code, is not supervised tutoring or study skills.

That's a specific change that was implemented last year to make sure that we weren't inadvertently including students who were taking noncredit courses to support their credit enrollments. And then it takes the number of hours across all colleges and all adult education program areas.

So these lines up here are counting the students that had 12 or more hours. And then this section here is a definition of those program areas, and it's based on the courses and their codes, which is why I mentioned earlier that coding is so important.
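On the COMIS side, that same logic can be sketched from course-level records. Again, every column name, the TOP code values, and the program area labels below are illustrative placeholders rather than the real COMIS data elements; the actual supervised tutoring and study skills TOP codes would need to be filled in from the MDD.

import pandas as pd

# Hypothetical course-level COMIS-style extract (all column names and code
# values are illustrative stand-ins, not the real data elements).
courses = pd.DataFrame({
    "student_id":    ["A", "A", "B"],
    "age":           [25, 25, 17],
    "credit_status": ["N", "N", "N"],                 # N = noncredit
    "top_code":      ["TUTORING", "100100", "100100"],
    "program_area":  ["ESL", "ESL", "ABE"],
    "sx05_hours":    [9, 5, 40],                      # positive attendance hours per course
})

# Placeholders: replace with the actual supervised tutoring / study skills
# TOP codes and the recognized adult education program areas.
EXCLUDED_TOP_CODES = {"TUTORING"}
AE_PROGRAM_AREAS = {"ESL", "ABE", "ASE", "CTE", "AWD", "PARENTING"}

eligible = courses[
    (courses["age"] >= 16)
    & (courses["credit_status"] == "N")
    & ~courses["top_code"].isin(EXCLUDED_TOP_CODES)
    & courses["program_area"].isin(AE_PROGRAM_AREAS)
]

# Sum SX05 hours per student across all qualifying noncredit courses,
# then count students whose total reaches 12 or more hours.
total_hours = eligible.groupby("student_id")["sx05_hours"].sum()
print(int((total_hours >= 12).sum()))  # 1 -> student B; student A's eligible hours total only 5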

Emma Diaz: So one more question on that. If you scroll back up, please, to where it says 16 plus: for CAEP reporting it's 18 years old, so how would we aggregate or separate that data out for our reporting in our three-year plan?

Jessica Keach: Yeah, that's a great question. I'm going to let Blaire speak to the history of including 16 or older versus 18 or older, but at this point, the data that you're looking at in the dashboard is for students that are 16 or older. There is an ability to disaggregate by age group. But I'd have to look more deeply into that, Blaire, and how you might be able to translate that into the three-year planning.

Blaire Toso: Yeah. I think you would have to disaggregate by age in the metric that you're looking at, so that you can drop those out, I think. The history on it is that many CAEP programs are also WIOA II funded, which allows for 16-year-olds, so they are included in the dashboard.

And so we wanted to make sure that programs could represent the full demographics of who was being included. So I think, traditionally, we have disaggregated by age with a distinct 16 and 17-year-old group, which disappeared from the dashboard last year and got collapsed into 19 or less.

The new refresh will directly address that. We're not sure where that change was made in the coding process last year, but in the updated one, that is fixed. Remember when we talked about why we update yearly? It's so that we can find those errors and correct them.

So for your planning, once it's released in mid to late April, you will be able to look at that, disaggregate, and just take those numbers out of your reporting. However, what we've also found is that the number of 16 and 17-year-olds is very, very small, even among programs where those ages are allowable under WIOA II.
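Until the refresh restores the distinct 16 and 17-year-old group, one rough way to approximate an 18-plus count from student-level data would be a filter like the sketch below; as before, the table and column names are hypothetical.

import pandas as pd

# Hypothetical student-level data with an age column (illustrative values).
students = pd.DataFrame({
    "student_id": ["A", "B", "C"],
    "age":        [17, 22, 45],
    "hours":      [20, 14, 30],
})

# Keep only students 18 or older before counting, to approximate the
# 18-plus definition used for CAEP reporting.
adults = students[students["age"] >= 18]
print(int((adults["hours"] >= 12).sum()))  # 2 -> students B and C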

Emma Diaz: OK. Thank you.

Blaire Toso: Thanks for your questions. And I did want to note that alongside the update of the dashboard, we do update the MDD, so those are aligned. And you can reliably use the MDD aligned to the most updated version of the dashboard because it also reflects all those changes. Are there any other questions?

And, Emma, thanks for asking that. We know that counting students is difficult, and we're trying to make sure it's represented well. So thanks for asking that question. I think that's one of the more important questions we can explore because it matters for how students are represented on the dashboard. It is confusing, Emma. Yeah.

All right. Well then, I think we're a little early. Ayanna, if you can share the screen one more time, just so that we can go through the last couple of slides. So we've done that. I just wanted to say we talked about the resources.

We've got some upcoming webinars, and those are, yes, Get Ready for AEP 2021, Continuous Improvement and Three-year Planning, and then we'll be looking at our Education Pathways Through Education to Workforce. And that is really centered on another dashboard that matches up educational CTE offerings in your region as they align to the labor market, and it looks at the experience one needs and the living wage.

So those are the three individual ones, but we're really excited to be offering two-session professional development opportunities. We've received the feedback that people really just want to dig in and look at their data, or have some facilitated conversations among colleagues that are centered on these pieces.

So we've got one about using the Adult Education Pipeline, the CAEP fact sheets, and other data resources for three-year planning. And we're pulling in a couple of people who are leading these initiatives in their consortia so that we can have some real, ongoing examples of what people are doing in the field and how they're using the data and other resources to inform planning.

And then we'll be running one that explicitly looks at equity in CAEP programming, how to think about that, and how to use these tools to inform your thinking. It's somewhat similar to what Jessica was showing you earlier about how you can dig down into that.

And then Creating the Career Pathways is related to the Education to Workforce dashboard that I was just talking about. And thanks to Holly, who posted some of those links in the chat, which we really appreciate. And then also, Holly, I think you probably want to say something about the feedback form?

Holly Clark: Yes, absolutely. Let me get my camera on. So I have dropped in the chat the evaluation form. We would love your feedback. I know the West Ed and Chancellor's Office folks would love your feedback on this session. They spent a lot of time preparing it for you.

But we always look for ways that we can improve: if we haven't met your needs, let us know how, and if you would like to see something slightly different, let us know. We do use these evaluations to inform our future events and how we manage those.

So please, everyone. I will drop it in the chat one more time for you just so it's the most current. If everyone could please just take a moment and give us your feedback, I appreciate it. And then, Blaire, do you have anything else you guys want to go over?

Blaire Toso: No, just to say that we really are open to any feedback, whether it's in that feedback form or to us directly, here or by email. Questions that you think of after the fact, please send them on. We really appreciate that.

And I'd also like to take a moment again to thank the Chancellor's Office, Mayra and Lindsay, for supporting this effort, and also to Holly and Veronica, the whole SCOE TAP team who supports us in getting these webinars seamlessly presented, so thanks, Holly.

Holly Clark: Of course. And Blaire, and Ayanna, and Jessica, thank you so much for your time today. And, honestly, looking at our events calendar on my other screen, I'm just going to give you a thank you right now because you guys are giving us a lot of your time to get this information out to the field. We appreciate everything you provided today. It was very helpful.

And we do also appreciate, of course, Mayra and Lindsay from the Chancellor's Office, their partnership with this to help make it smooth. We will have the recording available and the PowerPoint available on our website. However, as you're all aware it does have to go through a remediation process with an outside vendor that takes a few weeks.

But as soon as we get it up, we will send a notice out to everyone who has registered to let you know that it is available with a link of where to find it. And with that, I think that we will go ahead. Everyone just found 10 extra minutes of your day. So we will go ahead and let you enjoy those 10 minutes. And I will close out if we have no other questions.

Blaire Toso: Thank you to everyone who showed up, participated, asked questions, and listened. We would also like to extend a thank you for being interested in this work and using it to inform your own work. Much appreciated.

Holly Clark: Yes. All right. Thank you, everyone. Have a great rest of your Thursday. Bye bye.