Dulce Delgadillo: Great. Thank you. Hi, everyone. I hope you're all doing well. I'm so happy that you were able to join us today for our presentation from the North Orange County Regional Consortium for Adult Education. My name is Dulce Delgadillo. And I am the director of institutional research at North Orange Continuing Education, which is a standalone non-credit institution as part of the North Orange County Community College District, and part of the NOCRC Consortium. Oh, there's an echo.
So I'm not sure. It might be-- not sure why you have an echo.
Mandilee: Oh. We don't hear it on this side.
Dulce Delgadillo: OK, perfect. I just want to make sure. Perfect. All right. So our presentation today for the CAEP Summit is around empowering your consortium to understand its data, and tips and tricks. We're really going to be going and diving into a little bit about just a background of where we come from. We are, again, a standalone non-credit institution. Three of us on this presentation are from the research department. And we have our great director of the consortium.
And I think a lot of the conversation and things that we want to present to you have been our journey in trying to understand the data and how to develop user-friendly tools for a variety of users that will help them understand what they're implementing on the ground, how they're serving our students, or contributing to those outcome metrics on the launch board. And how are the things that you're doing internally as an institution, such as how things are coded, impacting what shows up on those launch boards?
And kind of creating that data literacy within your consortium here at a local level and then also just staying up to date with what is happening at the state level, and really how everybody kind of plays a part and role in this data literacy component and understanding all of these moving pieces. So next slide.
Great. So, like I mentioned, my name is Dulce Delgadillo. We have Jason Makabali who's our senior research analyst. And he specifically really focuses in on the MIS, which is a management information system. This is very well known across our community college partners. But it's specifically for the California Community College system.
We have Janeth Manjarrez, who is our director of NOCRC, and then Dr. Harpreet Uppal, who is our senior research and planning analyst that focuses on the NOCRC component of the NOCE research. Next slide, please. All right. So what are we going to cover today? So we have a couple of learning objectives. So we want to go over what are those, what is the purpose, what we're hoping that you get out of this session. So again, some tools, kind of lessons learned.
We have our NOCRC landscape piece. So really understanding kind of, what does our consortium look like? How do we function? Who is a part of it? We also have some data tools that we've developed in-house to really be able to build up that data literacy within our consortium. So we have some handy dandy Excel files, some portals that are accessible publicly, accessible to everyone, so that we can help you start digging into what are some of the entry points that we were kind of exploring, or avenues that we were exploring, in trying to understand our consortium data.
We will-- are we a Banner school, and do we utilize CASAS? Yes, we are actually both. So at NOCE, we are a Banner school. And we are also a WIOA school. So we have TOPSpro CASAS data. So we will definitely dive into how we've had to navigate both of those landscapes, not only to comply with community college data elements, but also with the feds and WIOA on those components.
Then we'll talk a little bit about the data literacy trainings. And we have some polls in there. And we have some Q&A that we would definitely like to get some feedback from all of you and see kind of what have been your experiences around this realm of CAEP data. Next slide, please.
All right. So we're going to go ahead and start it off with a poll. So I'm not sure how many of you are familiar with this awesome website called Slido. But you can go ahead. And I know everybody right there next to you, or maybe a couple steps away, you have your cell phones. So if you could go ahead and just scan this QR code with your phones, all you got to do is open up your camera on your cell phone, scan it, and it'll actually take you straight to that website.
If you don't want to do that, you can also just log in to slido.com. And it'll literally take you to just a screen where you have to punch in this number. And your number is 845835. And we just want to get a sense of, what sector are you from? Are you consortium staff? So are you an administrator, or admin support, or a staff member that's working directly with students as part of the consortium staff? Are you a K through 12 adult ed, adult schools? Are you a community college non-credit program within a-- within the California Community College system?
Are you community partners offering services out in the community and partnering maybe with your K through 12s or through your community colleges, such as a one-stop shop? Or we also have those others. All right, we got some good participation here. We got 29, 31-- close to two-thirds of participants here are consortium staff. OK, a little bit under a third K through 12 adult ed. Got some community college non-credit. Oh, that proportion is going up. OK, we got about 3% that are other.
And this is really just helpful for us to understand who our audience is and kind of what some of the contextual things we want to make sure to cover. All right. I'll give it maybe about five, 10 more seconds. We got one more coming in. OK. All right. Thank you, everyone, for participating. So as part of this session, we got that last one. All right. A little bit under 60%. So at 58% consortium staff. A little under a third, K through 12 adult ed. And we have some others. And we have some community college non-credit.
So thank you. Thank you for participating in that. Next slide. All right. So what are we hoping that you gain out of this? So again, I mentioned, it's really been kind of this ongoing journey for our consortium and for our research team to really dive into, what are the tools? What do we need to build our capacity around?
And what are some useful lessons learned that we have gone through to be able to really understand not only how our data is coming out on the other end, but what the things are at an institutional or local level that impact what comes out on that launch board, or on those metrics that are being reported out to the state?
So what we're hoping is that you're going to learn how to use CAEP data tools such as the launch board. We're going to be diving into the Chancellor's Office Curriculum Inventory, which is COCI, so some curriculum components. What are some data dictionaries? So we may get a little technical. And so I know some people like that. But we want to make sure to kind of cover all our bases in terms of just how that data shows up in the AEP launch board.
Participants will also learn strategies on how to build data literacy. So we'll talk about the conversations that we've had within our research department with Janeth being the director of NOCRC, and also with just other stakeholders within our consortium. So other staff within our standalone non-credit institution, how do we communicate this with partners that are outside of our institution? So really just some communication and some strategies on how we have approached this data literacy component.
And then we're hoping that you walk away with the confidence to explore your data at a much more granular level. So what does that mean? Obviously, we're all slightly different. We're not all functioning in the same wheel. But what does that look like? What are some avenues that could be helpful for you, or that you're interested in, for your program, or for your consortium, or for your institution, to really dive into that data. Next slide.
All right. We wanted to bring it back to the CAEP Summit theme. We really wanted to make sure that we kind of connected the dots of, how does this really connect to this overall theme of reimagining adult education, and recovery, equity, and transition? And so really, obviously the big word here is data, data driven. We want to be data informed as we're making these decisions and looking at our data. We have heard a lot from other institutions that, due to COVID, we've had to recover a lot of our enrollments.
And so really understanding, how can we use our data to help facilitate those enrollments and increase those enrollment numbers across our programs? Maybe what works for your ESL program doesn't work for your DSS program because the data is telling you something, a story about those two particular student populations. Equity. Equity is something that we try to embed in all of our institutional initiatives.
And so we want to make sure that we are looking at our data through an equity lens. What does that data look like for student populations across race ethnicity, across disabilities, across our LGBTQIA plus populations? So really trying to look at your data through an equity lens and seeing what stories your data is telling you through that.
And then transition. I think transition, we typically understand it as moving from one thing to another. I think now, being in COVID, transition has a completely new meaning. And so really understanding, what does that transition look like for our students, for our consortium, for our institution, and overall for moving the needle on those education gaps that our students experience.
Lastly, the strand around program evaluation. So you can't make decisions without knowing what happened, what worked, what didn't work. And so really, that's where that evaluation component comes, and how important looking at that data. We always advocate that you don't just look at the numbers. There are stories behind those numbers.
So we always make an effort to include qualitative data. So either open ended questions, focus groups, or just hearing the stories of our students so that we understand why these numbers look the way they do, and really capturing those student voices as you are reassessing, re-evaluating the initiatives and the programs that you're implementing.
Next slide. All right. I'm going to go ahead and hand it off to Janeth, who is our fearless leader at NOCRC.
Janeth Manjarrez: Oh, thank you, Dulce. You're too kind. I really, really think this is a communal effort. And how do we get our consortium on board in terms of really understanding the importance of data? As a consortium for adult ed, we are very unique. We are comprised of community college districts, K through 12s, nonprofits, and on our side, we have the Orange County Department of Ed.
So how do we all work together to, one, continue to support our students and adults in our region for academic and student success? And how do we understand data? Or how do we show our partners or voting members the significance of solidifying strategies or activities, CAEP activities, that would have immediate outcomes for our CAEP metrics? And that's a big factor in our consortium.
And as I mentioned earlier, we are comprised of all different entities that come together for the North Orange County region to support adult ed in that capacity. So we do operate with Banner. But we also have our very own Harpreet, Dr. Uppal, who is a full-time researcher and who is amazing. And we're very lucky to have her at a full-time capacity where she connects with our other members.
She is a NOCE team member and employee serving NOCRC or CAEP activities as part of the fiscal agent, I should say. And she works in this case with the North Orange County Regional Occupational Program, which is NOCROP. It's a non-profit organization that supports K through 12 and adult education. And she works directly with them since they input their information through CASAS.
And kind of solidifying the process, we also have a very unique situation where we have-- I do see I think Melissa is here today from Garden Grove Unified School District. And Garden Grove Unified School District is one of the very unique districts where they are part of three consortia here in the southern California area.
So we're, again, connecting to ensure one thing: obviously, continued grant compliance, since we know that CAEP is an ongoing soft grant. And at the same time, connect that data for support, and funding, and obviously being outcome driven. What are our adult education providers? I mentioned them already.
Our roles of our regional partners is to really come together-- hi, Meliss-- to come together and really work together in a cross collaborative manner, whether the K through 12s are supporting us in outreach, and marketing in terms of, are their parents ready to take some of our adult provider classes, courses in this case?
We have three funded members, North Orange County Continuing Education, Garden Grove Adult Education, and North Orange County ROP. Those are our three funded members in our consortium. And the rest are voting members as K through 12 and community college partners since we do belong to the North Orange County Community College District.
And then we have what we call our executive committee, where all the voting members come together on a monthly basis. And we share our activities, our ideas. And then we get consensus, or approval, I should say, for the activities and the budgets related to them. And then finally, we kind of branch out. If you go to our website, and as you see here, our regional partners and voting members are part of our signature line, as you see in our PowerPoint presentation today.
You're more than welcome to visit us at northorangecountyadulteducation.org. And you'll see kind of the way that we work together, our resources; we have all that information accessible to you. And finally, we have an octopus where we have work groups based on the program areas for adult education.
So you have basic skills, disability student support services, CTE. They meet on a regular basis, once a month or bimonthly, depending on the needs and the availability of each work group. And that's where the practitioners, the field experts, come together from all the entities shown here in the PowerPoint presentation to solidify the process, ensuring that the activities are getting met and are representing activities that are related to the outcomes for continuous student success.
And we have made it a priority. And I think that idea emerged-- Dulce, correct me if I'm wrong-- as the data donuts. And this was their department. I am just happy to be part of them, coming together and saying, hey, how do we simplify it? And how do we make ourselves accessible on a monthly basis, or on an as-needed basis, quarterly, as we're solidifying our annual budgets? And we label it Data and Donuts.
And it has worked. We have had those in person pre-COVID, and online during and post-COVID. And again, we are just very lucky to have a full-time researcher like Dr. Uppal, who works together with all these different entities and tries to come to a mutual or common understanding of what the needs are, and what we need to amplify, or expand, or enhance, or retweak so that metric outcomes continue to be met, as well as activities being successful for academic achievement.
I want to make sure that I don't go too far into it. [laughs] Next slide. There it is. The annual report. Yes. So we made an effort to have collaborative open sessions with all of our members and guests about the needs in our region that are obviously related to CAEP metrics for student success. During these discussions, we come up with ideas and possibilities, and we measure them. And we make sure that we approve them and continue and start the work. But it doesn't start without a data measuring process.
And this is where Dr. Uppal comes in, with the rest of the NOCE team, Dulce and Jason, to help us coordinate those efforts and figure out what those templates or data gathering processes will look like. And a lot of the things come and emerge from what we already have and already know: the adult education pipeline launch board. And then obviously, what are our unique needs? Our region has them, just like we all do. We just kind of explore that option.
And I don't know if we're going to talk about it, but there's an evaluation plan that we have also integrated and adopted for our consortium in terms of our CAEP activities for all program areas, and in terms of success and CAEP metric data gathering. I think that's-- next slide. I think that's it for me. And this is where I share it back with Dr. Uppal. Or I could be wrong.
Harpreet Uppal: No, you're good Janeth. I'm going to hand it to Jason. He's going to take from here. And then I will continue on after Jason's presentation. Thank you.
Jason: Yup. Morning, all. So the problem that we have here is in order for us to conduct an evaluation, we have to first understand what our data is, what our data looks like, why it's presented the way in which it's presented. As the research office for our institution, we were tasked with, OK, so this is the data that we're seeing. This is the data that's being presented on all the public facing stuff. Why is this the case? Why does our data look the way it does? What goes into this? What are these calculations? What is even going on here?
So in order to make those informed decisions about how we can use this data to inform what we're going to do as a consortium, we need to understand, what is this data telling us in the first place? And so we decided to dig deep and figure out, what's going on in these calculations? How can we improve our reporting processes? How can we improve the data that we're collecting, the quality of the data that we're collecting? How can we make it so that it's an accurate reflection of what we're doing as an institution so that we can further understand how we can better serve the students that were serving?
So our-- full disclosure here. Since we are community college based, most of what we've done so far is reliant entirely on MIS. We're still working with our TOPSpro data to understand the processes, how it gets in there, and what exactly it is that we're saying in TOPSpro. So the focus of what we're going to be showing off in these upcoming slides will be MIS based.
But we will touch upon TOPSpro related issues as they come up. So next slide, please. So next step was our approach to this. Our approach was first, we need to understand our data. And that's where we come in to dig deep. But we want to know how our data is actually being submitted, what we're actually submitting, how all these calculations are going on in the back end in the first place so that we can see what these numbers are looking at, and how they're being produced.
And so all of that requires a baseline level of understanding of your data. And then from there, as a research office, we wanted to develop tangible tools so that we could report what we found about how all this stuff is being done to all of our stakeholders, and how they can utilize the data so that they can understand what it is that they're doing and how they relate to the bigger picture of our consortium and our student services. So first, we need the tangible tools so that we can have something that they can use, that they can touch, that they can feel, that they can see.
And then once we have those tangible tools, we need to start building data literacy so that people understand how to use those tools. And we need to train folks so that they can utilize them properly. So from there, this is an ever evolving process. This is always ongoing. There's always data. We're always reporting more stuff. So we're always going to be having these discussions with all of our different stakeholders. So we want to look at what our next steps would be, whether-- for example, we've identified problems with how our curriculum is submitted.
So we need to take those discussions over to the curriculum committee to inform them and help them build better decisions into their processes. All of which requires the ongoing trainings and the stakeholder engagement with the data. So all of that just plays a part in how we have to look at our data so that our consortium understands how our data is actually operating a little bit better. Next slide, please.
So first, we wanted to get a feel for-- we asked for roles and representation to see how our-- what our audience looks like. But we were kind of curious. So who's actually presenting-- who's actually submitting their data through MIS? Who's actually submitting through TOPSpro? Who's using both? Who's not really sure? All that kind of good stuff. So please join the Slido poll. Thank you.
All right. Let's aim for at least half this audience. We're at nine right now. Can we get to 30, please, please, please? I believe in you all. We're getting there. It's not actually in Zoom. It's actually through a-- you can either go to slido.com and enter in that hashtag code right there 845835 to join this poll. Or you could scan the QR code up in the top right on your phone and it'll take you directly to it. And then it should allow you to vote in the poll. Sorry about that. Not bad, 29.
Just a little bit, one under. But that's cool. Looks like we're kind of coming to a stop here. But yeah, it looks like the majority here are using TOPSpro. So yeah, since we use both, we are also trying to get a feel for how our TOPS data actually works, and all of our processes that go into place. Because the way we submit our TOPS data is kind of decentralized. We don't necessarily just have one person entering stuff into TOPS. It kind of comes from all over.
And so we're trying to get a feel for how that is entered and what kind of auditing process is regarding that. Of course, we check our DIR and everything to make sure that we're meeting our compliance. Because we are also WIOA funded. But we also want to make sure-- we want to make sure that everything we think it's saying is actually what it is saying. And that's something that we're still investigating on our part. It's interesting. All right. Cool. Good to know.
So yeah. Most are from the WIOA or K-12 adult ed side. Good to know. Good to know. Thank you. All right. Let's go next slide, please. So for us, the example that we're going to be running through today is that, for us to understand our data better, we decided to look into how our courses were coded in MIS. As a community college based adult education provider, a lot of it is reliant on the kind of coding that we use to identify program areas so that these programs are identified as where our students are actually learning.
It's a little bit different from TOPSpro, where you just enter in the box that a student is a participant within ESL, or within the adults with disabilities program, or the CTE program, or high school equivalency, or something like that. It's a little bit more complicated. So first, we had to understand how we were actually submitting our MIS course data. This kind of stemmed from work we were doing prior, when we were looking specifically at our CTE data.
And it just blossomed from there into, hey, let's look at everything. Why not? Because it'll be best to understand how everything kind of flows so that we can understand who is actually being counted into what buckets. Who is being considered for a program area? Who is actually included in these calculations for participants in all these varying program areas? Because depending on the program area and on the participant definition, it does kind of impact what the metrics you're going to see look like.
So from there, we want to get a feel for how we're actually submitting our data through MIS. As a Banner school, all of our stuff is coming from our student information system, Banner. And we want to do an audit of how our Banner information looks, what we're submitting, versus what we actually sent for approval to the Chancellor's Office, to COCI, which is the acronym for Chancellor's Office Curriculum Inventory.
So we want to first see whether there was any alignment between what we're sending up and what they actually have over there. And for the most part, yeah, it has to because otherwise, they would just reject our submission and be like, yo, this is not right. What are you guys doing? This doesn't look like what you're telling us you're doing.
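The kind of coding audit described here can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the field names, course IDs, and values are made up for demonstration and do not reflect the actual Banner or COCI export layouts.

```python
# Hypothetical sketch of auditing local SIS course coding against the
# approved records in the Chancellor's Office Curriculum Inventory (COCI).
# Field names and sample values are illustrative assumptions only.

local_courses = [
    {"course_id": "ESL 101", "top_code": "4930.87", "cb22": "A"},
    {"course_id": "CTE 210", "top_code": "0514.00", "cb22": "I"},
]

coci_courses = {
    "ESL 101": {"top_code": "4930.87", "cb22": "A"},
    "CTE 210": {"top_code": "0506.00", "cb22": "I"},  # TOP code differs
}

def find_mismatches(local, approved):
    """Return (course_id, field) pairs where local coding differs from COCI."""
    mismatches = []
    for row in local:
        ref = approved.get(row["course_id"])
        if ref is None:
            mismatches.append((row["course_id"], "missing from COCI"))
            continue
        for field in ("top_code", "cb22"):
            if row[field] != ref[field]:
                mismatches.append((row["course_id"], field))
    return mismatches

print(find_mismatches(local_courses, coci_courses))
# → [('CTE 210', 'top_code')]
```

In practice the two sides would come from a Banner export and a COCI download rather than inline dictionaries, but the comparison logic is the same.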
And so after doing those kinds of audits and making sure everything is on the up and up, yeah, we see that it is matching with what's there. But the next question becomes, what do they have? And is what they have actually how we want to present our data to the Chancellor's Office? And we want other people to see our data, how we want our students to be looked at? And so that goes into that deeper dive. Next slide, please.
This deeper dive means looking at what exactly they have at the more granular level. This is openly accessible. Anyone can actually go to the Chancellor's Office Curriculum Inventory and pull course level information for any of the colleges within the state, really, and check and see how their courses are actually coded. It's actually a great resource for anyone within the community college system or for anyone outside of it.
If you want to see what kind of pathways are being offered in non-credit adult education, you can filter-- you can go there, you can grab the file of all classes for any college within your immediate vicinity and see well, for example, are there any non-credit classes within parenting? And then you can filter that out.
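The filtering step described here, pulling a college's course list and narrowing it down to non-credit classes in one area, can be sketched like this. The column names, the CB22 category letter for parenting, and the sample rows are assumptions for demonstration, not the actual COCI export format.

```python
# Illustrative sketch of filtering a COCI-style course export for
# non-credit courses in one noncredit (CB22) category.
# Field names and sample rows are hypothetical.

courses = [
    {"college": "Example College", "title": "Parenting Skills",
     "credit_status": "N", "noncredit_category": "F"},
    {"college": "Example College", "title": "Intro Biology",
     "credit_status": "C", "noncredit_category": ""},
    {"college": "Other College", "title": "Child Development",
     "credit_status": "N", "noncredit_category": "F"},
]

def noncredit_in_category(rows, category, college=None):
    """Keep non-credit courses in a given CB22 category, optionally for one college."""
    return [
        r for r in rows
        if r["credit_status"] == "N"
        and r["noncredit_category"] == category
        and (college is None or r["college"] == college)
    ]

parenting = noncredit_in_category(courses, "F", college="Example College")
print([r["title"] for r in parenting])
# → ['Parenting Skills']
```

The same filter could just as easily be done in Excel on the downloaded file; the point is that the export makes these questions answerable without special tooling.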
So we could go on that for a while. But for the sake of time, it's just a really great resource that anyone can access. It's open access. And you don't need to be a researcher to really look at it, as long as you have a question and you understand where the coding is coming from. But yeah, using these resources, we wanted to develop tools that our consortium could use, which Harpreet is going to go into now. So next slide, please. Harpreet, to you.
Harpreet Uppal: Thank you, Jason. Yes. So as Jason mentioned, I'm going to go through some of the things that we have created in house based on the data tools that have been available to us, especially utilizing the California Adult Education Program launch board, or adult education pipeline launch board. We relied heavily on the metrics definition dictionary, which I will show you. I'm not sure how many of you are familiar, but I'll just take you to the slide so you can see how to get to it, and how to access this resource.
And that was-- we used that to really identify how things are calculated and what's included in each CAEP program area. So if we're looking at our participants under career technical education in the launch board, and we compare to our internal data, if there's any sort of misalignment, we wanted to know why that is the case. Why are we seeing these numbers that are not aligning? And where are those gaps?
So in order to identify those gaps, we went and dug a little deeper, and we relied on the resources that were already available to us from the California Adult Ed program. Because, as Dulce mentioned earlier, we do submit our data through TOPSpro as well. We wanted to dig a little bit deeper using the California Adult Education Program data dictionary, which is available on the CASAS website.
Again, we wanted to see, how are we inputting things on the entry form, or on those-- into TOPSpro, and how that impacts the data that gets reflected for our consortium? So I am going to take you right here. As you can see, this is the launch board screen, the home page. And then you can look at your consortium data based on going to consortia and selecting your consortia. But what we were really-- what we were really most interested in was going to the technical definition.
So if you have not seen this before, if you have not accessed this before, you can actually click here. And it will give you the metric definition dictionary, which I have open. And it is a very lengthy, lengthy document. And it gives you every single calculation. For example, if we stop here, this is a metric called participants in English as a second language or in adult basic education. So it will give you a description.
It gives you, for TOPSpro, what TOPSpro elements were used in this calculation, how things were calculated to identify who was considered a participant for one of these two programs, and then--
Dulce Delgadillo: Sorry to interrupt you. We can't see your screen. I'm not sure if you were intending to-- we just see the slide.
Harpreet Uppal: Oh, that's-- oh thank you for stopping me, Dulce. That's really weird. Let me share my screen. So I was-- you guys saw none of that. So I'm going to stop share and share again. So if you can give me one second, please. Apologies. Let me see. I think this should be my full screen. I don't have the option to see if it shows-- let's try this. Now you see the PowerPoint, right?
Dulce Delgadillo: Yes.
Janeth Manjarrez: Yes.
Harpreet Uppal: OK, so if I were to end this, and then, can you-- nope. That's-- yeah, usually I get the option where it tells me, do you want to share the window, or do you want to share the actual desktop screen? But I don't see that option. So that's why-- OK, let me just do this. And let me walk you through all of the documents first. And then I'll get back to the PowerPoint.
Dulce Delgadillo: Yeah, OK. We can see that.
Harpreet Uppal: OK, all right. Perfect. Yes. My apologies. I'm going to just talk backwards now for a minute. But this is what I was talking about in terms of the adult education pipeline launch board. So we can put the link in the chat as well. I'm sure most of you are familiar with this. If you have attended any of the WestEd webinars or their workshops, you're quite familiar with this. So the data tool that we utilized that came from the launch board was the metric definition dictionary. And that's what Jason was pointing out.
We really wanted to see how things were calculated. So if you click on the metric definition dictionary, it will take you to this lengthy document. And then I had stopped right here where I was showing you the metric participants in English as a second language, or adult basic education. And then the way we utilize the information that was available on this dictionary was we looked to see for TOPSpro. What elements were used under this calculation to identify who would be considered a participant in the ESL program, or in the adult basic education?
We then-- because we submit most of our data through MIS, we were particularly interested in the COMIS definition, so the Chancellor's Office MIS elements. What data elements were used? And how were they used in this calculation to really identify how we are coding our courses, and whether students who were enrolled in our courses would get counted toward these program areas that are reflected on the launch board?
So I'll show you, when we get back to the PowerPoint, how we used this resource. And then if you're not familiar with MIS, every single element has its own dictionary entry. It's not necessarily a calculation, but it will tell you what it stands for, and it gives you an explanation. For example, CB03 is for TOP code, and it will tell you how your course can be coded and whether it will be accepted or not.
So that's what we used-- this MDD-- to identify how each element was being used within these calculations. And I'll show you right here. I don't know if I can make this a little bigger so it's easier to read. Yeah, so right here, as you can see, this is, for example, the calculation for those who are enrolled in non-credit ESL.
So at the Chancellor's Office and in the launch board, the way things get calculated is that for any student to be included in the ESL program area, they need to be enrolled in a course, and that course needs to meet these criteria. That's the level we went to, to investigate whether our courses were coded to fit within these criteria. And wherever we saw misalignment-- if a course is an ESL course, and somehow we didn't code its CB22, which is the non-credit course category, as A or B, which are ESL and citizenship-- we asked, why is that the case?
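The course-level check described here can be sketched as a small script. This is a minimal illustration, not the official MDD logic: the field names are assumptions, and the criterion shown is only the CB22 piece of the calculation.

```python
# Hypothetical sketch: would a course's CB22 coding place it in the
# ESL program area? CB22 is the non-credit course category; 'A' (ESL)
# and 'B' (citizenship) are the values discussed above. Real MDD
# calculations involve additional criteria beyond this one element.

def counts_toward_esl(course: dict) -> bool:
    """True if the course's CB22 code falls in the ESL program area."""
    return course.get("CB22") in {"A", "B"}

print(counts_toward_esl({"course": "ESL Beginning Low", "CB22": "A"}))  # True
print(counts_toward_esl({"course": "Auto Tech 101", "CB22": "J"}))      # False
```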
And then we went back to the drawing board. CB03, which I was showing you right here-- if you were to click on CB03, that's the TOP code. If we were to go to CB22, that's your non-credit course category. And this is what I was talking about: A for an ESL course, B for citizenship.
So we went through every single code to identify whether our courses were coded in a way that they would get counted on the launch board. And this was the second tool I had on the PowerPoint screenshot, which we utilized for our TOPSpro data. I'm sure those who submit data through TOPSpro are quite familiar with this data dictionary.
It lays out every single field that we collect on the entry form and update form, and whether it's a WIOA requirement or a CAEP requirement. We go through every single element just to make sure that we are entering and inputting the data as it's indicated here. But as Jason mentioned, we have just started scratching the surface in terms of doing the investigative work behind the scenes: how does our data get impacted, and how does the data look in the launch board?
You may have seen differences in your data-- what you submit through CASAS TOPSpro might not get fully reflected on the adult education pipeline launch board-- and we are really trying to understand why that is the case. It goes back, again, to this question: how are things calculated in the launch board?
So we have done this, the COMIS digging. And now we're on to the TOPSpro digging. And I believe these were all the resources that I wanted to kind of go through. And then let me go back and share my PowerPoint. And actually before I even share my PowerPoint, I want to actually show you the Excel file. Can you see the Excel file that I have on the screen?
Dulce Delgadillo: No.
Harpreet Uppal: No, OK. So I will have to-- yeah. I will have to share back. I apologize. I-- usually, I don't have this issue. But today, I think it's decided that it wants to have this issue.
Janeth Manjarrez: I did see the access to the launch board. But did you mean something else?
Harpreet Uppal: So there should be an Excel file. You should be seeing an Excel file right now. Can you see that now?
Dulce Delgadillo: Yeah, the CB course coding file. Perfect, it's on there already.
Harpreet Uppal: Thank you so much. Thank you. So this is the data tool that we have been talking about-- and it took us 40 minutes to get to this point; sorry we lured you in. This is what we built in-house. We created a document, a massive Excel file with many, many columns that I will show you. It has all of the active courses and any pending courses that are offered within our institution.
And that's why we wanted to give you that background. We are a standalone non-credit institution, we submit our data through MIS, and we use Banner. So as researchers, we have access to our back-end Banner data. I understand that might not be the case for most institutions, especially those who are not using MIS. So while we understand this tool is very much geared toward MIS data, we will come back and talk about one way to apply this process to TOPSpro data.
What we basically did was, because we are able to query our data from the back end, we pulled all of the active courses within our institution: the courses, the program they're tied to, and additional information about each course, like the division code it falls within, the department code, the term when the course was first offered, whether there's an end-term code within Banner, and the most recent term it was offered-- just so we know whether it's a current course or an old course.
And then all of the CB elements, the Course Basic file elements. We pulled every single element that gets submitted to MIS, and they're just massive columns that keep going, and going, and going. Once we pulled this information, we wanted to identify whether our courses were being counted under the CAEP program areas. The way we did that was, we used the metric definition dictionary that I was showing you earlier, and we basically put in these calculations.
So these are just Excel formulas based on the information provided in the metric definition dictionary; we relied on that. For example, this column BA2 that I shared with you earlier is the CB22 non-credit course category code. Here the value is J, and because this is a CTE course, J stands for workforce preparation.
So we laid out the calculation here, mapped to the calculation provided in the MDD, and that helped us identify whether each course would fit within the program area identified on the launch board. We did that for every single course, and then we have these different flags. We wanted to see: is a CTE course that falls within a CTE program in the institution also being identified as a CTE course in the launch board?
And then the sub-programs under CTE-- we also used this for other grants and initiatives, like Perkins and Strong Workforce; we did some of those calculations here as well. And then we looked at our ESL courses. For example, this course is a CTE course, so obviously, we want to make sure that it's counted as true under the CTE flags, but not true under other programs.
So obviously, it's false for ESL, adult basic education, and ASE. And for our courses that are ESL, we want to make sure that all of the courses we have identified as ESL are also being identified as ESL within the launch board. We did that for every single course, and wherever we saw discrepancies-- a case where a course is an ESL course but it's not being calculated as ESL based on the MDD-- that's where we did the digging. That's where we asked, OK, why is that the case?
Is it because one of the CB elements was miscoded? That's when we had those conversations with our program directors and program managers, to get to the point where we could correct those errors and submit our data correctly through MIS, so that it gets reflected accurately on the adult education pipeline launch board.
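The flag-and-compare workflow described here can be approximated outside Excel as well. The sketch below uses plain Python with made-up course IDs, program labels, and CB22 values; the authoritative criteria live in the MDD.

```python
# Sketch of the Excel flag-and-compare workflow: derive launch-board-style
# program-area flags from CB coding, then list courses whose internal
# designation disagrees with the flag. All data here is illustrative.

courses = [
    {"course_id": "ESL 061", "program": "ESL", "CB22": "A"},  # coded correctly
    {"course_id": "ESL 062", "program": "ESL", "CB22": "J"},  # miscoded
    {"course_id": "CTE 110", "program": "CTE", "CB22": "J"},  # workforce prep
]

def launchboard_flags(course):
    """MDD-style flags (simplified): which program area counts this course?"""
    return {
        "flag_esl": course["CB22"] in {"A", "B"},
        "flag_cte": course["CB22"] == "J",
    }

# Discrepancy report: internally labeled ESL, but not flagged as ESL.
mismatches = [c["course_id"] for c in courses
              if c["program"] == "ESL" and not launchboard_flags(c)["flag_esl"]]
print(mismatches)  # ['ESL 062'] -- these courses need their CB coding reviewed
```

Each course ID that surfaces in the report is a starting point for the conversation with program directors and managers about which CB element was miscoded.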
All right. And now I'm going to stop sharing this file. Sorry, one more share. And I'm going to share the PowerPoint now. Let's see. And please let me know if you can see the PowerPoint.
Mandilee: Yes. We can see it.
Harpreet Uppal: All right. Perfect. Thank you so much. Appreciate it. Oh, that's not what I wanted, but I will quickly go to the slide. All right. So this is what I showed you in the massive Excel file: what was done in the file and how we used the calculation. It's a snapshot of how things are calculated in the launch board, what we took from that, and how we built our Excel formulas to identify whether our courses meet the calculation so that they can be counted in the launch board.
And this is where I was talking about how we can still apply this model to TOPSpro data, because what I showed was done all through MIS, right? On the MDD, the calculations are also provided for the TOPSpro data. For example, right here, this is the TOPSpro calculation for participants in career and technical education, and this was the COMIS calculation. For the TOPSpro calculation, they're using the following elements.
Based on the webinars I've attended from WestEd, I believe they get a flat file from CASAS with these fields, and there may be some back-end calculations done by CASAS that we're not aware of. That's the sort of digging we still need to do. For TOPSpro, per the following calculation, in order for a participant to be identified as a CTE student, they need to be coded under the instructional program as career and technical education.
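The TOPSpro side of the check can be sketched the same way. The field name and value below are assumptions based on the description above; any back-end logic CASAS applies is not captured here.

```python
# Hypothetical TOPSpro-side check: a participant counts as CTE if their
# instructional program field identifies career and technical education.
# The field name "instructional_program" and its value are illustrative.

def is_cte_participant(record: dict) -> bool:
    return record.get("instructional_program") == "Career and Technical Education"

record = {"student_id": "0001",
          "instructional_program": "Career and Technical Education"}
print(is_cte_participant(record))  # True
```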
So that's how you use this information-- what was entered for our students in TOPSpro or MIS, and then how that gets used in the adult education pipeline launch board. Because a lot of the time, based on our conversations again with WestEd, we had heard that sometimes you have to get down to the course level to see those discrepancies between your internal data and what's being reflected on the launch board. And that's really why we did this digging, because when we pulled this data internally, we saw these numbers.
But here on the launch board, it's saying something else. Why is that? Because we had access to our internal back-end data, we were able to do those comparisons and identify those gaps. And I know we only have six minutes. So we did have this poll-- and where's Jason's excitement? I don't know if I can cheer you at that level. But please, please, please, if we can have some audience participation, because we really wanted to have you take these tools back.
And we wanted to see: do you have access? Because one of the things we have learned through our presentations is that not all consortia have access to a researcher or are part of a community college district. They might be doing a lot of this work in-house, or the consortium directors or coordinators may be doing this sort of investigative work. So we wanted to understand a little bit about where you're coming from: where do you fit within this type of work?
Dulce Delgadillo: While we wait, really quick, Harpreet: Marina is asking if we've seen any requirement for tracking residency. I'm not sure-- I don't think we have. We know it is tracked in the pipeline, though.
Janeth Manjarrez: Yeah, I can answer that. I know that as part of the AB 2895 work group, in terms of immigrant integration for CAEP, there isn't any real-time or hard data that we can accrue for that type of request at this moment. I know that we worked together to include it in the legislative bill, but we haven't heard back.
As far as tracking internally, we already do what's in existence with other grant requirements, but it is not required for CAEP, so we're not integrating it at this moment. I know that for NOCE, we had a private grant called the Grads to Be program, in particular for that demographic. And we were able to support some funding, since CAEP does allow, to a degree, the opportunity to leverage funds for student success; that was part of it.
Harpreet Uppal: Thank you, Janeth, for jumping in. And yes, thank you, everyone, who participated. It looks like some of you have examined your data, and some of you have access to someone who can examine your data. I know from our previous polls that over 60% of our audience were consortium directors and consortium staff, so I'm sure a lot of this may have fallen on you to do that sort of examination.
So what we wanted-- I apologize, I know we don't have a lot of time-- but we wanted to have a discussion around whether you foresee your consortium applying the MDD to understand the gaps in your data, trying to do similar work to what we have done; whether you even have the capacity to do this type of work; or whether you have done this type of work, or other work where you tried to understand the gaps between your internal data and the consortium data.
So I just wanted to open that space. If anyone wants to come off mic-- I know we only have three minutes-- it would be really good to hear any other best practices that are happening around the consortia. If not, I can always go on to our next slide. I just wanted to give everyone an opportunity.
All right. And then we had an additional question: who within your consortium would you envision taking ownership of understanding consortium data at a more granular level? You can also put it in the chat if you don't want to come off mic. Feel free to answer these questions, or anything related, in the chat.
And lastly, I just wanted to share what we did with this. As a research office, yes, we're quite aware of these data gaps, and we have internal conversations. But we also wanted to bring it to a larger audience, as Janeth was saying earlier. Within our institution, we have been doing data literacy workshops, which we call Data and Donuts. Originally, when we did it pre-COVID, we actually brought donuts to entice people in to have these conversations around data.
We did two data literacy trainings that were very specific to the CAEP program. In Data, Donuts, and Dashboards, we took our workshop attendees through navigating the dashboards: how they can access the dashboards and how they can dig deeper into their data to get a better understanding. That workshop was focused just on dashboards.
Then we did a workshop focused on course coding, where we showed them this Excel file and walked them through the front end, especially with our admin staff, who are actually building these courses and entering this data into Banner: how all of that impacts the data that we see at the end, in the back end, that gets submitted and then reflected on the launch board. We showed them that the data collection being done at the front end, and the quality of that data collection, has an impact on what's reflected and shown on these state launch boards.
So I know I'm out of time. I apologize if I went too fast. But if you have any questions, please feel free to reach out to us. I do have our contact information here, and we can make ourselves available for your questions.
Mandilee: Thank you so much, Harpreet, Dulce, Jason, and Janeth. Thank you, everybody, for joining. Please make sure to fill out the survey. We will be sharing their slide deck tomorrow; you'll find it on the vFairs platform, next to their session, along with the recording. If there was a portion you were not able to attend, or if you want to share it with a colleague, you can always direct them there to view the recording and gain access to the resources.
So with that, we're going to go ahead and close it out with a quick reminder to be sure to visit us in our networking lounge, where you can connect with other colleagues around the state, create your own private chat room, and maybe have a little bit of fun in our photo booth, where we are loving everyone's photos. We'd like to see you there. And right now, if you have the opportunity, we do have an exhibitor-sponsored lunch, so if you want to pop over to that, we'd love to see you there. Thank you, again, for kicking off the CAEP 2021 Summit.
Harpreet Uppal: Thank you, Mandilee. Thank you, everyone.
Mandilee: Thank you. Bye-bye.