Blaire Wilson Toso: I am, thank you. Thanks, Veronica, and we would like to also extend a welcome to you. We appreciate you all attending. We know that noncredit coding is not always the most exciting topic--it's not like going to a musical. But it is important, and there's a lot of information that we're going to share today, as well as sort of walking you through some of the processes that we use. We'll be looking at-- just keep moving. I just want to introduce myself.
I'm a senior program manager at WestEd, and I'm accompanied by my colleague, and just a delightful person to work with, Jessica Chittaphong. If you want to unmute and turn your camera on, just say a quick hello, Jessica, and introduce yourself.
Jessica Chittaphong: Hi, everyone. Welcome. Thanks for joining us today. Jessica Chittaphong here. I'm a project coordinator at WestEd, currently on the adult education dashboard. I've been supporting the LaunchBoard data tools now for a few years. So happy to be here.
Blaire Wilson Toso: And she will answer all the technical questions that you will have. So she's fabulous, but we'll just go ahead and get started. So really what we're hoping to accomplish here today is to learn a little bit more about the AEP dashboard and the build process, and to explore the MDD--the metrics definition dictionary--which is huge. We're hoping to demystify that a little bit so that it becomes a resource that you all can use. We're going to highlight some key metrics.
We're also going to spend a little bit of time highlighting some of the new metrics and our process for updating the metrics, and maybe why, and then briefly touch on features like the export, and the process for correcting data. And that's basically what we'll wrap up with. We're really excited. Part of this is to set the stage for the launch of the adult education dashboard 4.0, which will be launching April 26, so we're really excited about it. But it also means that people really are quite, quite busy. So that's where we are today, and we'll entertain any questions that you might have, even if they're not exactly on topic.
So please feel free to use the chat or open your mic. Jessica and I are both very informal presenters, so we are fine with being interrupted. And sometimes it's easier, because when people use the icon to raise their hand, which is very polite, I tend to miss it. So feel free to say, excuse me, I have a question. That's all super for us. I also wanted to say that the key metrics we're highlighting are not necessarily brand new metrics, but they are metrics that sometimes go underused.
And so we're hoping that by talking a little bit more about them, we bring them to the forefront, so that as you begin to think about coding and planning to use your data, those are more forward in your mind. So I'm going to hand it over to Jessica. One of the things we're trying to do is be very transparent about our processes. So she's going to talk a little bit about what goes into the dashboard build process and why we build a new dashboard at least once a year.
Jessica Chittaphong: Feel free to go to the next slide, if that's OK. All right. So I apologize for the simplicity of this slide. As you can see, I'm a data person and not a designer, but we've tried to map out the key activities for the adult education build that happens throughout the year. And I just want to emphasize that this is a pretty long process, and it's very iterative. And it happens simultaneously with a number of other LaunchBoard data tool builds as well. So keep that in mind while I give an overview of the process.
We start the year really in July, with continuing trainings on what was built in the previous year, as well as starting the planning process with the Chancellor's Office team and some other strategic leadership as well. And we work a lot with partners, including both CASAS and our coding partners from Educational Results Partnership, to really plan and review what worked and what didn't work from the last build process and figure out what we want to implement for the upcoming year.
So that happens throughout the fall. And between fall and winter, we really take the time to update the technical documentation needed, specifically for our coders, and to support resource development for the field as well. We receive annual data files from both CASAS TOPSpro Enterprise and the Chancellor's Office management information system in December and January. So the CASAS folks give us a specific data export, usually in mid-December, and then we get an export from the Chancellor's Office sometime in mid-January to incorporate into our new builds.
During that time, we're setting up the foundation and doing all the coding, and then we're able to test and do quality assurance in early spring. And then ideally our releases happen sometime in the spring, usually end of March or mid-April. And then the rest of the year we try to do trainings with you all. What you see here are just the key activities. There are a lot of little steps that go into that. And as you can see, if one thing goes wrong, everything else gets shifted. So that's just part of the build process and something we have to deal with as we update annually.
So next slide, please. This slide I've just stolen from the Chancellor's Office site, to try to make sense of what the colleges submit versus what we get as part of the LaunchBoard team. And since this is pretty much noncredit focused, we're going to focus a lot on the MIS data elements. If you look at the yellow arrows, that's where I've flagged which files we really need from the MIS system to incorporate into the Adult Education Pipeline.
And so as you can see, the colleges may be submitting at different points in the year, but we really only get the data once, in January. So we wanted to point this out, and I wanted to flag the specific data files that we rely on to really power the build. And it's mainly going to be the student files as well as the course file.
Blaire Wilson Toso: I'm sorry, Jessica, when do we get the CASAS file?
Jessica Chittaphong: We get the CASAS file usually mid-December.
Blaire Wilson Toso: It is worth pointing out, because I know that you all receive some reports, the CAEP reports, which summarize your processed data on a quarterly basis. But we receive the full file only once a year. All right, any questions before we move on? So the dashboard is really structured to focus on the learner, and therefore it's designed to follow the learner's educational journey. It starts with students and programs, which is where the learner is and their enrollment, and then it looks at the progress they make while in programming, then transitions, then on to success, and ultimately employment and earnings.
And we call these the key student progress metrics. They all show up here underneath. And you can see how they're categorized, and I'm sure all of you are familiar with them already, but that's just how they break out on the dashboard, and we collect data on each one of those. We also don't want you to lose sight of the fact that we really see this as the learner journey. And we start out with the learners in the programs. So, for example--and we'll talk about some of these--a student who enrolls in CTE, and then how they journey on throughout their trajectory.
But as such, we also know that all the learners are different, with different goals and different trajectories. So we also account for entry and exit points that track across the academic year and/or their journey, right? So they have the arrows; that means that they exit. But just because they exit, we like to point out, there is also data. The data still shows up. Should they either make an educational functioning level gain or end up being enrolled, those will show up in the different parts of the dashboard. Just because they exit, it doesn't mean that they don't get tracked in the system.
So also, as a coding note, it's really important to collect the data in that first bucket of the journey, the learners and programs, because these data points also affect how data is attributed to learners and your institution across the year in the dashboard. So for example, ensuring that the learner's attendance or enrollment is captured influences whether the participant metric is activated, right? And the participant is different from the reportable individual, which Jessica will talk a little bit more about later.
We try to make the dashboard as complete as possible, and we also identify new outcomes to measure, or we receive feedback from you all as the field to add additional measures that are meaningful for charting the student's progress and the possible relationship to the services. For example, we just held field testing on this dashboard and got some really great feedback, both on what worked and on some of the pieces that people would like to see, which we can then think about and see whether it can be incorporated into the dashboard. And that's one of the reasons why, when we hold these and say, oh, there are new metrics--that's part of the rationale for that.
So the MDD is your go-to resource. And it does definitely appear to be unwieldy, we know, due to the amount of information in it. However, if you know how to come at it, it becomes less overwhelming. For me, I put these stairs here because when I first started working on the AEP, I did not want to touch the MDD at all. And yet as I explored more, it always reminds me of going into the different parts of the library and opening and exploring a secret. And that's what the MDD should be.
It shouldn't be so overwhelming; rather, it should be perceived as going into that basement room in the library and realizing that it's an entrée into the dashboard coding process. It gives you absolutely all the information that you will need to know. So for example, you have all the data points, you can find all the institutions that are tracked for the dashboard, it will tell you about the displays, and then any of the limitations or caveats, as well as this part where it lists all metrics.
It lists all the related COMIS and TOPSpro Enterprise fields, so if you're wondering what you should be coding or where those data points are coming from, it will tell you that, and it will also give you the calculations. Each of these has a metric ID, which is the number--in this example, three--and then the label, Reportable Individuals. Also, in the naming conventions, they all have numbers, and they're sequenced so that they start with the lower numbers in those initial buckets and then continue to move up, so that they're actually mirroring the learner journey by structuring the trajectory with increasing numbers.
And once again, I'd just like to make a plug: remember that what you code in the journey affects other counts, so you want to make sure that your information is complete. So I'm just going to walk you through the information about a metric that you're going to find, and this is part of why the MDD appears so long--because we take every metric and break it down this way. It will include that metric ID and the label, and the following fields, which I'm going to walk through. There's a description, and there are the student types.
So this one is about participants in ESL, so we're looking at adult education ESL. Display is what it will look like on the dashboard. And then, importantly, these are the data sources that we use to inform this metric, right? So we pull data from both TOPSpro and from COMIS. Another piece that is important is that it will show you the breakdown for each of those sources, right? So here is the TOPSpro side. This will tell you where the data is coming from, and then this is how it will tell you how we get the information onto the dashboard--what we include in that calculation.
So here's the calculation for the participant. And then, because it's for ESL, we're also looking for somebody who's enrolled in ESL. And as I said, it's a lot of information. For COMIS, similarly, we have that information, and it allows you to also cross-reference that you're completing all the fields needed to correctly calculate outcomes on the AEP dashboard, right? And because we have access to the COMIS definitions dictionary, these are all live links.
So if you have any questions about this when you're looking at going into COMIS, you can click on that. It will take you over into the COMIS section, into the COMIS definitions dictionary, and you'll be able to access and understand what's going on there. Cheryl's asking, if there's a discrepancy in COMIS, what's the process for resolving this in terms of the metric used in the LaunchBoard? That's a great question. Thank you, Cheryl, and it's one that I know we wrestle with. As I said, I'm going to let Jessica answer that, because I think she'll be able to tell you more about how that works in the calculations.
Jessica Chittaphong: So, Blaire--I guess I'd like to know what specific discrepancy you're referring to. But in terms of counting students, we count students in either data system. So if you're not flagged as an ESL student in CASAS but you are flagged as an ESL student on the COMIS side of things, you will be counted as an ESL student on the LaunchBoard. Not sure if that was what you meant, but that's how we do the counting.
Blaire Wilson Toso: Or, for example, another example would be the high school diploma program and the HSE, which is acquired through either the GED or the TASC or the HiSET. While they register differently, we look at both of those fields. The HSE usually only appears in COMIS. So when reviewing that calculation, we merge those two so that they both become one number. We see them as the equivalent.
And so that's how we adjust: we make equivalencies between the two systems and then merge them together so that they both appear on the dashboard as a single number. OK, I'll just continue on, and please, again, post in the chat if you have questions or if that didn't quite clear it up. Please feel free to continue that discussion. I just wanted to be clear--people are always curious about what's the denominator. And so then we look at--that's listed here as well. And then the numerator is obviously those who meet that criteria.
It also will tell you how things are displayed on the dashboard. So for example, for this one you'll be able to see it in percentages and distinct counts. It will also tell you where you can see and disaggregate the information. You'll be able to see this disaggregated by gender, race, ethnicity, age group, and then whether there's a program type or student type breakdown. So it won't break it down by CAEP program, but you will be able to see if they are first-time, returning, or continuing students. And it also tells you how it aligns with the SSM, if that's important for you all. So Jessica, again--there are two separate systems, so is it maybe duplicating students?
Jessica Chittaphong: I was just trying to respond in the chat. But just quickly, we do a data match between the two systems using student information, specifically their name, their date of birth, and their gender, so that we can find the same student in both data systems, and that way we can make sure that we're providing unduplicated counts.
Blaire Wilson Toso: They don't get counted twice. You only get counted once, because we make a match in the system--if you appear in both systems, we make a match and merge them that way. Thanks, Neil. The last section on this will be the notes, which will tell you any helpful pointers, or update you on how a metric may have changed, and just give you any additional information. So--show us the example you're showing now in the PowerPoint, in the LaunchBoard? I'm not sure--did you want to say more about that?
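[Editor's aside: the cross-system match Jessica describes might be sketched roughly as below. All field names and record shapes here are illustrative assumptions, not the real CASAS or COMIS file layouts.]

```python
# Sketch: match students across two data systems on name, date of
# birth, and gender, so a person appearing in both collapses to one.

def match_key(record):
    """Build a simple match key from the identifying fields."""
    return (
        record["name"].strip().lower(),
        record["dob"],          # e.g. "1990-04-12"
        record["gender"],
    )

def unduplicated_count(casas_records, comis_records):
    """Count distinct students across both data systems."""
    seen = set()
    for record in casas_records + comis_records:
        seen.add(match_key(record))
    return len(seen)

casas = [{"name": "Ana Ruiz", "dob": "1990-04-12", "gender": "F"}]
comis = [
    {"name": "ana ruiz", "dob": "1990-04-12", "gender": "F"},  # same person
    {"name": "Bo Chen", "dob": "1985-01-30", "gender": "M"},
]
print(unduplicated_count(casas, comis))  # 2, not 3
```

The real match logic is certainly more robust than a string key, but the idea is the same: one key per person, counted once.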
DR. OLINGER: It was two slides before that, kind of. What does that look like? No, the next one. Yeah, what does that look like in the LaunchBoard?
Blaire Wilson Toso: So we had not planned to walk through what the display looks like. I can make a plug: we'll be having two webinars next week that will really walk through the dashboard. But maybe if we move through some of this fairly quickly, we can pull it up really quickly and demonstrate what that looks like. Unless, Jessica, do you think you want to just pull it up and show them what the drill-down looks like, or shall we just move on through?
Jessica Chittaphong: Let's make sure we can get through the content and then I can circle back to it if we have some extra time. Is that OK?
DR. OLINGER: Yes, that should be great. Thank you.
Blaire Wilson Toso: Thanks, Dr. Olinger. So this is probably something that you already know about and have in place, but we just want to really suggest, if you don't already do so, that you have a data collection and entry plan in place that lays out the roles, the responsibilities, and the routine of who collects your data, how it gets collected, when it gets collected, and who actually enters and submits that piece.
The one thing that people tend not to do when they're putting together a data collection and entry plan is that they orient the people who are actually involved in the data collection and submission, but they don't necessarily provide an orientation to all staff working in the program. So we think, in part, too, that it's really helpful if you offer an overview of how everyone's work is part of the data system. It also helps to keep people informed on the ways that they might be able to support the data collection by identifying or keeping clean records.
And they might even have a different way of talking about how we might be able to ensure that the data is kept clean. We find that the staff who collect the data may not entirely understand how that translates to the work and the role of the institutional researcher or the data manager, for example. So, just a plug for a data collection and entry plan.
Jessica Chittaphong: Thanks. Before we move on, I think there are a lot of questions about getting training on COMIS in general for new staff. Our purview is really the LaunchBoard and the Adult Education Pipeline, but I know there's been a lot of discussion about providing a lot more resources for the field regarding COMIS on the noncredit side of things, and maybe we'll talk a little bit about efforts there. But we are kind of supporting the Chancellor's Office in thinking about that for the future.
Yeah, even though the reporting schedules are pretty set, I think there are a lot of local processes as well that are not set by the Chancellor's Office that need to get worked through. I'll just say that.
Neil: Jessica, this is Neil. I would just say that in the recent survey--we did a noncredit survey of professional development needs--one of the top needs was COMIS training, so we will bring this up to leadership at the Chancellor's Office and see how we want to meet that need. So more to follow on that.
Blaire Wilson Toso: Super. Thanks, Neil. I think that's one of the things that prompted us to start thinking about promoting a data collection and entry plan, because in some of our conversations, we do know that there's a little bit of a gap in how, when, and who's collecting that noncredit data--to Emma's question about where the priority lies. So thanks, I think that that's super, and I'm glad people are talking about that. All right, Jessica, sorry we barged in--forward on to you.
Jessica Chittaphong: Thanks, Blaire. Yeah, so in this next part, we're going to do a deep dive into a few select metrics, just to walk folks through how the metrics are defined and organized, and maybe a little bit on how to think through them. Essentially, this is a summary of what you can find in the metrics definition dictionary, but broken out in a little bit more of an easy-to-understand format.
So the first couple of metrics I'm going over are really the foundational metrics. These are the metrics that are calculated and used as the starting point for a lot of our other metrics. The first metric is the reportable individual, or adults served, metric. And the criteria for that is that we're looking for students who are 16 years or older, and they either had one hour of enrollment in adult education or noncredit, or they received noncredit services or services from adult education, regardless of their enrollment status.
So on the CASAS side, we get a flag if the student was enrolled in the program, and then we get a flag if they received any of these particular services. On the COMIS side, we're really looking at student enrollment in noncredit courses. And what you see linked here is the specific data element that is associated with that concept. Blaire mentioned the Chancellor's Office definitions dictionary--so if you're really focused on noncredit coding, definitely click the link and use that as a resource, because that is definitely our backbone for all of these calculations.
So what we use to flag a noncredit course is the CB04 data element. And then we also look at whether or not they had one or more positive attendance hours, and that is flagged in the SX05 data element. For this year, what's new is that we are specifically excluding tutoring and supervised study skills from the calculation. That was a decision made by a working group last fall, which decided that those weren't really in line with the adult education programs that we're trying to measure. So those are going to be excluded.
Additionally, we flag students who had a disability using the SD01 flag, because one of our adult education programs is, obviously, adults with disabilities. And then for CTE, we're flagging pre-apprenticeship status using the SB23 student flag--there's no course flag associated with pre-apprenticeship yet, so that's what we ended up with. For noncredit services, we look at the student data record, specifically data elements SS16 and SS20.
Now, that just seems like a really long grocery list, but that is how we have to work through this to get the information we need for this particular metric. So I'm going to pause here to make sure that there are no questions regarding reportable individuals.
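[Editor's aside: as a rough illustration only, the COMIS-side checks above could be sketched like this. The record layout, field names, and the combined service flag are simplified assumptions; the authoritative rules are in the MDD.]

```python
# Illustrative sketch of the reportable-individual criteria: age 16+,
# plus either (a) one or more positive attendance hours (SX05) in a
# noncredit course (CB04), excluding tutoring / supervised study, or
# (b) a noncredit services record (SS16 / SS20).
EXCLUDED = {"tutoring", "supervised_study"}

def is_reportable(student):
    if student["age"] < 16:
        return False
    enrolled = any(
        e["cb04_noncredit"]
        and e["sx05_hours"] >= 1
        and e["course_type"] not in EXCLUDED
        for e in student["enrollments"]
    )
    served = bool(student["received_services"])  # stand-in for SS16/SS20
    return enrolled or served

# A student with only a tutoring enrollment but a service record still counts:
student = {
    "age": 34,
    "enrollments": [
        {"cb04_noncredit": True, "sx05_hours": 5, "course_type": "tutoring"}
    ],
    "received_services": True,
}
print(is_reportable(student))  # True
```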
Blaire Wilson Toso: And Jessica, just on the 16-plus--did you already answer that? Does the 16-plus overinflate the numbers for CAEP reporting, since CAEP is 18-plus?
Jessica Chittaphong: Yeah. There are not that many students who are 16 to 18. Right now, that's how we're getting the data from CASAS, and we are trying to match those two data systems, so that's why we defaulted to 16 right now. So I would say it's theoretically inflated, but based on the data, not by much.
OK. So the next metric is probably one of our main metrics, because it's used as the denominator for all of our outcome metrics. The easiest way to think about it is that it's all of the adults you serve, but they have to meet a 12-hour threshold to qualify as a participant. And the 12 hours need to specifically happen in one of the California Adult Education Programs listed in point A under criteria. The way we do this for CASAS is that they get flagged, and then we look at the program hours. Then, for the COMIS side of things--
Again, we're looking at student enrollment specifically in noncredit courses in one of the identified CAEP program areas. We're counting the positive attendance hours using SX05 across all of the CAEP programs. And then the following data points are how we define the specific program areas that we are counting. So for ESL, there are two ways that we do this: we look at the CB22 data element, and it needs to be an A or a B; or we look at specific TOP codes using the CB02 data element, which is the TOP code element, and we look for the TOP codes listed.
So it could be either the CB22 or the specific TOP codes--that's how we count ESL. For ABE, we're looking at the CB21 data element, specifically the values C through H, and the TOP code is the number listed there. For ASE, we're also looking at CB21 course codes, but at the A through C level, with the TOP codes listed there. For CTE, we're looking at three possible scenarios for being a CTE program: we can look at the CB22 code of J, we can look at any vocational TOP code, or we can look at the SAM codes that are clearly occupational and above.
For adults with disabilities, there's only one way to qualify, which is CB22 equals E. And for K12 Success, we're looking at CB22 in the F category, with the specified TOP codes there. One thing to note is that there was a lot of discussion with the field regarding how to capture SX05, the positive attendance hours, during spring 2020, when COVID really threw a wrench into the data collection process and a lot of colleges were unable to consistently report on SX05.
The solution that we came up with is that the 12-hour threshold was suspended for the spring 2020 term for colleges. So that specific criterion will not be part of the spring 2020 term calculations. And we're in further discussions on how that's going to apply for next year's build. All right, so the next one is a pretty tricky one, because there are a couple of ways to get an educational functioning level gain.
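[Editor's aside: before moving on, the participant logic just described can be tied together in a highly simplified sketch: classify each enrollment into a CAEP program area, sum SX05 hours in those areas, and apply the 12-hour threshold except in spring 2020. The classification below is only a stand-in for the full CB22 / CB21 / TOP-code rules in the MDD, and the spring 2020 handling is an assumption of the sketch.]

```python
def in_caep_area(e):
    """Very rough stand-in for the CB22 / CB21 / TOP-code rules."""
    if e.get("cb22") in {"A", "B", "E", "F", "J"}:  # ESL, AWD, K12, CTE
        return True
    if e.get("cb21") in set("ABCDEFGH"):            # ABE / ASE levels
        return True
    return bool(e.get("vocational_top_code"))       # CTE via TOP code

def is_participant(enrollments, term="fall_2019"):
    caep = [e for e in enrollments if in_caep_area(e)]
    if term == "spring_2020":
        # 12-hour threshold suspended; assume any CAEP enrollment counts
        return len(caep) > 0
    return sum(e["sx05_hours"] for e in caep) >= 12

rows = [
    {"cb22": "A", "sx05_hours": 8},   # ESL
    {"cb21": "D", "sx05_hours": 6},   # ABE
    {"sx05_hours": 40},               # not a CAEP area: ignored
]
print(is_participant(rows))                     # True (8 + 6 >= 12)
print(is_participant(rows[:1]))                 # False (only 8 hours)
print(is_participant(rows[:1], "spring_2020"))  # True (threshold suspended)
```

Note how the 40 non-CAEP hours never count toward the threshold: the hours have to land in a CAEP program area.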
So the general criteria here is that, one, we're only looking at participants--folks who have met the 12-hour threshold--who were in ESL, adult basic education (ABE), or adult secondary education (ASE). We're only looking at people from those programs, and then they can complete an EFL either by pre- and post-testing or through course progression in the same program area. So in CASAS, this is pretty much a calculated field: we look to see if they're in one of the three programs, and then we look to see if they're flagged as completing a level.
On the COMIS side of things, we look at whether or not they were in those three program areas, and then we look at the assessment file, SA07, to see what level they were at this year and what level they end up at in the subsequent year. So that's one criterion. The next criterion would be course progression, or course advancement. We track that using the CB21 data element: we look at their enrollments this year to see what CB21 level they are at, and then we check again the following year to see if any progression was made.
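[Editor's aside: the two gain routes might be sketched as below. This assumes CB21 letters run H (lowest) up to A (closest to transfer level), so advancement means moving toward A year over year, and it compares test levels as simple numbers; both are simplifications of the real MDD rules.]

```python
# Sketch of the two EFL-gain routes for ESL/ABE/ASE participants:
# (1) pre/post testing, or (2) CB21 course progression across years.
CB21_ORDER = "HGFEDCBA"   # index grows as the level advances (assumption)

def gained_by_test(pre_level, post_level):
    return post_level > pre_level

def gained_by_course(cb21_this_year, cb21_next_year):
    return CB21_ORDER.index(cb21_next_year) > CB21_ORDER.index(cb21_this_year)

def efl_gain(student):
    return (
        gained_by_test(student["pre"], student["post"])
        or gained_by_course(student["cb21_y1"], student["cb21_y2"])
    )

# No test gain, but the student moved from CB21 level F to E:
print(efl_gain({"pre": 2, "post": 2, "cb21_y1": "F", "cb21_y2": "E"}))  # True
```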
So if there are no questions--the next metric we wanted to focus on is the transition to postsecondary, which is also a bit tricky. We are looking at the same participants from the EFL gains here, who will be counted in the denominator. So again, we're looking only at the ESL, ABE, and ASE programs. And you can transition in two ways: either by enrolling in a CTE course or by enrolling in a non-developmental credit course.
And we time-stamp it, because we're looking for this transition happening specifically for the first time for that student. So for the CASAS TE side, again, we're looking at the same students. There's a flag for whether they had a transition to postsecondary that we count, and then we look at all of our records to make sure that it's happening for the first time in the year that we're looking at. For the COMIS side of things, we look at the same set of students and then see if they end up enrolling in a CTE course--so we're looking at whether the TOP code is marked as vocational, or if they end up in a pre-apprenticeship program, which is something you have to look for in the student file, SB23.
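[Editor's aside: the first-time check might look something like this sketch, where a qualifying enrollment is either a CTE course or a non-developmental credit course. The field names and the year-keyed history shape are made up for illustration.]

```python
def qualifies(e):
    # CTE (vocational TOP code / pre-apprenticeship) or
    # non-developmental credit coursework
    return e.get("is_cte") or e.get("is_nondev_credit")

def first_time_transition(history, selected_year):
    """history maps year -> list of enrollment records for one student."""
    now = any(qualifies(e) for e in history.get(selected_year, []))
    before = any(
        qualifies(e)
        for year, rows in history.items() if year < selected_year
        for e in rows
    )
    return now and not before

history = {
    2018: [{"is_cte": False}],
    2019: [{"is_nondev_credit": True}],
}
print(first_time_transition(history, 2019))  # True
print(first_time_transition(history, 2018))  # False
```

The look-back is the point: a student who already transitioned in an earlier year does not count again in the selected year.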
The second criterion can be met by looking at whether they enrolled in a credit course, and we use the CB04 data element for that. And then, again, we look back in time to make sure that the transition happened for the first time. OK, so the last metric we want to focus on is barriers to employment. And you'll see that there are two charts that are going to be in the Adult Education Pipeline: a chart for 'if ever flagged' and a chart for 'flagged in the selected year.'
The original thinking with these charts is that we believe certain barriers follow you through your life and are more long-term, not temporary like some other economic barriers. So a long-term barrier should be tracked and recorded whenever--if ever--you were flagged as having that barrier. And you can see the distinction here, where we have some longer-term barriers that may affect you throughout your life, and then we have the barriers that may be temporary for you.
Blaire Wilson Toso: So when you're coding, for each enrollment each year, you would want to verify the flag for 'flagged in the selected year,' because that's what gets counted for each year, right? Whereas if they're 'ever flagged,' they don't necessarily need to be reported again.
Jessica Chittaphong: Exactly. And these are mainly flagged using specific student files that you can see in MIS.
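[Editor's aside: a toy sketch of the two-chart idea: long-term barriers count if they appear in any year up to the selected one, while temporary barriers count only for the selected year. The barrier names and the "any year up to now" reading are assumptions for illustration.]

```python
LONG_TERM = {"disability", "low_literacy"}   # illustrative labels
TEMPORARY = {"unemployed"}                   # illustrative labels

def barrier_charts(flags_by_year, selected_year):
    """Return (ever-flagged long-term barriers, selected-year temporary ones)."""
    ever = set()
    for year, flags in flags_by_year.items():
        if year <= selected_year:
            ever |= flags & LONG_TERM
    selected = flags_by_year.get(selected_year, set()) & TEMPORARY
    return ever, selected

flags = {2018: {"disability", "unemployed"}, 2019: {"unemployed"}}
print(barrier_charts(flags, 2019))
# the 2018 disability flag carries forward; unemployed counts for 2019 only
```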
Blaire Wilson Toso: Terence, I don't-- yes, I think the logic is there, but I'm not sure how it's informed. Is there any empirical work or research that supports this decision, Jessica?
Jessica Chittaphong: Yeah, it's definitely a decision that was inherited from earlier versions of the LaunchBoard, and for consistency's sake, that's what was carried over to the Adult Education Pipeline. I'll have to do a little bit more research to see if there was originally any work thinking about this on the LaunchBoard itself, but it's definitely a legacy thing, apparently. So I don't have access to that now, but I can definitely look into that.
Blaire Wilson Toso: Good question. Thank you.
Jessica Chittaphong: OK. So, things to look forward to in the new build, which is meant to launch by the end of the week--so look forward to that. We talked a little bit about the transition to postsecondary metric, and there are two criteria that count as a transition. We're actually providing two new metrics that break that transition out: transition specifically to CTE, and transition to non-developmental credit coursework. That will hopefully support different institutions that are maybe aiming for different things. So that's going to be available in the next build.
We're also introducing something we call the top five chart. These are new charts that are going to be available for a few metrics that tell you, proportionally, which institutions are the top five in reaching these outcome metrics. So right now, if you're looking at a statewide view and you click one of these metrics, you'll see the top five institutions who met it statewide. And then at the regional level, if you select a region, you'll see who the top five are in that region. Hopefully that'll help start conversations for future planning as well.
Blaire Wilson Toso: We're super excited. And a plug for the webinar a week from next Friday: we will definitely be demoing those, and the people who reviewed it really, really found these to be useful and interesting--as Jessica said, for creating conversations and looking to people who might be able to engage in conversations beyond your institution.
Jessica Chittaphong: All right. So we're expecting a couple of new data elements from the Chancellor's Office MIS system, specifically geared toward adult education and noncredit. Our initial goal was really simple, we thought: we really just wanted an award data element for high school diplomas that are provided by the community colleges. Right now, we have a workaround that tracks this on the community college side using SB11, and we decided it might not be the most accurate or elegant solution.
So we were really trying to lobby for a specific award data element that flags high school diplomas. We were also hoping to update the assessment file so that EFL level gains can be tracked within a specific discipline area without having to map that back to the TOP codes. What we ended up with, after working with the MIS folks, is their recommendation to start a whole new file for adult education, and that is going to be the adult education assessment file.
It's new, and it's going to be implemented in the summer, so we're going to have access to those data files. The pro of doing this is that there's a lot of room for growth if we do want more noncredit and adult education-specific data elements added to MIS. The con is that the SA07 file has been moved into this new data file, so it's now going to be named AA02, and that's going to take some getting used to. But I think in the long run it will be helpful for adult education and noncredit, so that when we need new data elements there's a space for them. Blaire, do you want to add anything to that?
Blaire Wilson Toso: Oh, just that we're really excited. I think it goes back to some of those points about being able to carve out more space for adult education in the realm of the community colleges, so that's really exciting for us as well. And I just wanted to touch base with Emma on the transitions that are charted. Yes, they absolutely chart transitions out of adult education sites into the community colleges. Those are not just transitions made within the community college.
Jessica Chittaphong: OK. So this is just a sneak preview of what the data elements will look like in the AA02 file. Here the MIS folks merged our request for having the EFL gains broken out by discipline area with a data value for the high school equivalency exam. So you'll see the EFL gains set out as E1 to E5, M1 to M6, and S1 to S6, grouped by discipline area and by the different levels there. The H1 is specifically for the noncredit high school equivalency exam.
So if a student passed a GED test, this could be where you flag it. Next slide, Blaire.
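To make the data-value scheme just described a little more concrete, here is a minimal sketch of a decoder for those AA02-style codes. The letter prefixes and the H1 value follow what was shown on the slide; the discipline labels attached to each prefix are illustrative assumptions, not official COMIS definitions.

```python
# Illustrative decoder for the AA02-style EFL-gain data values described above.
# The code groupings (E, M, S prefixes and H1) follow the slide; the discipline
# labels below are assumptions for illustration, not official COMIS definitions.

DISCIPLINE_PREFIXES = {
    "E": "ESL",              # E1 to E5
    "M": "Math",             # M1 to M6 (assumed label)
    "S": "English/Reading",  # S1 to S6 (assumed label)
}

def describe_gain_code(code: str) -> str:
    """Return a human-readable description of an EFL-gain data value."""
    if code == "H1":
        return "Passed a noncredit high school equivalency exam (e.g., GED)"
    prefix, level = code[0], code[1:]
    if prefix in DISCIPLINE_PREFIXES and level.isdigit():
        return f"{DISCIPLINE_PREFIXES[prefix]} EFL gain, level {level}"
    raise ValueError(f"Unrecognized data value: {code}")

print(describe_gain_code("E3"))  # ESL EFL gain, level 3
print(describe_gain_code("H1"))
```

The point of the sketch is simply that the new file encodes discipline and level in one value, so a gain can be attributed without mapping back to TOP codes.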
Blaire Wilson Toso: Yes, I just wanted to-- thanks for your patience, Linda. I don't know the answer to this. Jessica, will a local community college be able to pull a list of the students in transition from my adult school to that college?
Jessica Chittaphong: Yeah, so the pipeline itself only displays aggregate data; we don't have student-level data there. There have been discussions about colleges being able to do data exports using Data on Demand, but that's a process we would have to work with the Chancellor's Office to implement. So I'm sorry, we can't pull specific students, but we can give counts.
No problem, Linda. The data's there; it's just a matter of whether we can give it to you or not. Those are the limitations of a public data dashboard. For SP02, we were able to add a specific award data element for the high school diploma that is provided by the community colleges. So there's a new data element in SP02 specifically for the high school diploma; the data value is new, and it's going to be the last noncredit award listed.
Blaire Wilson Toso: I just want to say, we are just thrilled. It's taken a good couple of years of discussions to get these elements added to COMIS. So it isn't that we haven't recognized that they're important; it's just a process of moving more than one system along to join us in thinking about what is important. We're delighted that this will be in there. And this is a great time to give a heads-up to any of your institutional researchers, or whoever is doing that coding, that these pieces are coming down the pipeline for next year's data collection.
I think these may have already been addressed-- or maybe, Jessica, did you? Oh, no, there you are. On to the COVID-19 updates.
Jessica Chittaphong: Oh, yes, I talked about this briefly. This is about how we're treating the spring 2020 term differently from the other terms, because we're trying to mitigate the data collection issues during COVID. Mainly, we're doing away with the hour thresholds, so students who are enrolled, regardless of hours, are going to be counted as individuals and participants.
We still have the TOP code exclusions, and we are still looking at specific program areas, but the hour threshold is not applied for spring 2020, specifically. We'll provide an update on whether this is going to continue for the following year or not.
Blaire Wilson Toso: Yeah, thanks, Neil is talking about the timing being good. It's been interesting to look at the data, and we think it'll be interesting to continue to look at the data for next year as well. And then we just wanted to focus on one quick feature you'll see in the dashboard: an export feature. You can actually export your institution's or your consortium's data into an Excel sheet, so if you want to look at your data more closely, or manipulate it in ways that we aren't doing on the dashboard, that is up to you. You can really get the data that's associated with your consortium or region.
We wanted to let you know that we're working on a coding guide to accompany this. It will not be as detailed as the MDD; we are hoping it will be more of an overview, walking you through some of the key metrics at a higher level, to help you understand the broad nature of the work that goes into coding. It will have some coding tips, some guidance, and a little more explanation of what to expect, what you might want to put in play, and what you need to know in order, again, to be able to move into the MDD's greater detail.
And thanks, Emma, yes. Jessica, the question is whether the same applies in TE: will students enrolled in 2020-21 with fewer than 12 hours be counted in the TE data that goes into the LaunchBoard?
Jessica Chittaphong: Not for this build, Dr. Olinger. Because we get TE data at the program year level, we don't get it broken out by term. So for this year, we're using what they gave us for the 2019-20 data set.
Blaire Wilson Toso: We know that we've given you a lot of information, a lot of technical information partly because we're figuring all of this out and the coding is technical. So we appreciate your patience as we've walked through that. Are there any additional questions? Any points of discussion? Please feel free to unmute yourself as well.
If not, Jessica, I may stop sharing and we can walk through a couple of those features. I'll have to scroll back up. I know that the top five would be a great one. What was the one that was asked about earlier? Oh, the disaggregation, if we can also show how things get disaggregated. So I will stop sharing.
Jessica Chittaphong: Sure. OK. So this is a preview of the site that's going to be live in a few days, actually, so these numbers might not match what you are able to access publicly. You'll be able to get to the site by just googling LaunchBoard adult education pipeline. So just putting that out there. This is what we call the home page, and these tiles correspond to the stepladder that Blaire had mentioned earlier about the different buckets.
So I think we wanted to look specifically at the program areas. Before we get into each metric, we come to the summary view, which is kind of like that infographic, high-level view. We're not really interested in that right now. It's pretty cool, but let's go to the detailed data view. This is the page where we have each metric laid out. The first metric here is participants in the ESL program. What we have here is the time trend view. We started getting the adult education data in 2016-17.
So we've put that as the cutoff for this build; we're only providing information from 2016-17 onwards, just to make it a little more consistent. From here, you'll see the time trend view, and if you hover over it, you'll see the count of participants we could find in this program. Underneath is a little data table that displays the count as well. The drill-downs are available on top of the charts. There's always going to be a drill-down for gender, race/ethnicity, and age group. For race/ethnicity, you'll see the number in each ethnicity that we were able to find in ESL, and hovering over it provides a count for the specific category, as well as the percentage.
So here's how you read this: let's look at American Indian, Alaska Native. Statewide we found 1,400 participants who were in this category, and 9% of them, which is 122, are taking ESL. So that's how you read that particular percentage point. That's one drill-down. You could also have a secondary drill-down that distinguishes between first-time and returning students. First-time means this is the first time the student appears in either system, the TOPSpro or the COMIS data set; returning means they were found in either data set before the selected year. So that's a quick overview of that.
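The percentage reading just walked through (the share of a demographic category's participants who are in the selected program) can be sketched in a few lines. The counts used here are placeholders, not actual dashboard figures.

```python
# Minimal sketch of the drill-down percentage described above: the share of a
# demographic category's participants who are in a given program.
# The numbers below are placeholders, not actual dashboard figures.

def drilldown_percentage(in_program: int, total_in_category: int) -> float:
    """Percent of a category's participants found in the selected program."""
    if total_in_category == 0:
        return 0.0  # avoid division by zero for empty categories
    return round(100 * in_program / total_in_category, 1)

# e.g., 45 of 500 participants in a category are enrolled in the program
print(drilldown_percentage(45, 500))  # prints 9.0
```

This is just the arithmetic behind the hover tooltip: the denominator is the category total, not the program total, which is why the percentages in a drill-down sum across programs rather than across categories.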
If we want to jump to a progress metric, you can just click this little arrow here, and it'll take you to the next section in the student journey, which is progress. You'll see here "completed one or more educational functioning levels," again with the time trend, and the same drill-downs pop up. Additionally, for a lot of these outcome metrics, we also provide a drill-down by program area. So if you're interested in whether ESL folks were able to get this educational functioning level gain, you can just click ESL, and that'll update the chart to display just ESL students.
Now I'm navigating to a top five metric; they don't appear until the transition page. What we can see here is the top five for transitions to first postsecondary. Because we allow an additional year for students to meet that transition, we actually have to go back a year to see data. If we go back to 2018-19, you'll see, statewide, these are the top institutions showing students who transition to postsecondary. And it is proportional, so these values will be very different, and some smaller institutions will pop up, because it's based on the proportion of students they are able to move into postsecondary. OK, looks like we have a few chats.
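The proportional ranking just described, where institutions are ordered by the share of their students who transitioned rather than by raw counts, could be sketched as follows. The institution names and figures are made up for illustration; the actual dashboard computation may differ in details.

```python
# Sketch of the proportional "top five" ranking described above: institutions
# ranked by the share of their students who transitioned, not by raw counts.
# Institution names and figures are invented for illustration only.

def top_five_by_proportion(data: dict) -> list:
    """data maps institution name -> (transitioned, total participants).
    Returns the five institutions with the highest transition proportion."""
    rates = {
        name: transitioned / total
        for name, (transitioned, total) in data.items()
        if total > 0  # skip institutions with no participants
    }
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:5]

example = {
    "College A": (40, 400),   # 10%
    "College B": (30, 100),   # 30%: small institution, high proportion
    "College C": (120, 600),  # 20%: large raw count, middling proportion
    "College D": (5, 20),     # 25%
    "College E": (60, 300),   # 20%
    "College F": (10, 500),   # 2%
}
print(top_five_by_proportion(example))
```

Note how College C has the largest raw count (120 transitions) but ranks below the much smaller Colleges B and D, which is exactly why smaller institutions "pop up" in the proportional view.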
Blaire Wilson Toso: It was mainly about when they're coming up-- oh, thanks, Holly, for putting that into the chat. People were asking when the next webinars are coming up. The full list is in the chat, and you can register at the link that Holly just put in there. So that is where we are, right up at 1 o'clock, and I don't know if anybody has any last questions. Otherwise, I know we're supposed to shut down on time, so I want to take a moment to say thank you. Thank you all.
We're interested in your feedback; that really helps to drive a lot of these changes. So please feel free to send questions to us. In the PowerPoint, the very last slide has our contact information. Feel free to reach out to either of us, and we will respond with the information that you need, or we'll look into it, or we'll send you on to someone who might be better able to answer your questions. So thank you very, very much. We appreciate your participation.
Veronica Parker: All right. Thank you, Blaire, and thank you, Jessica, and thank you all very much for attending today's webinar. As Blaire mentioned, Holly posted a link to the registration page, so definitely feel free to register for the upcoming webinar. The next one is on April 27, and it's for updates regarding the adult education pipeline. So register for those webinars, as well as all of the webinars that we have listed on that particular page.
We also have an evaluation that we will be conducting for Jessica and Blaire. So please take a minute or two to complete the evaluation and let them know what you thought about today's webinar, and also whether there are any additional training topics that you think would be beneficial to the CAEP field as a whole. We look at those evaluation scores and the narratives, and that's how we do our professional development planning. So we use that as a resource to make sure we are giving you all what you need.
So again, thank you all very much for your time. We will post a recording in the next couple of days, as well as the link to the PowerPoint presentation. So if you had to leave for whatever reason or if you would like to share with colleagues, or if you would even like to come back and reference it for future use, you'll be able to do so. So thank you all very much for your time and your participation and everyone have a great afternoon.