Harpreet Uppal: Thank you, Mandilee. Hi, everyone. I'm going to get started by sharing our screen. Let me know once you can see the screen.
Dulce Delgadillo: And we can see your screen.
Harpreet Uppal: All right, perfect. Dulce, do you want to get us started?
Dulce Delgadillo: It would help if I unmute myself, huh? [chuckles] Hi, everyone. Good afternoon. Happy Wednesday.
We're excited to be here. We are the research team from North Orange Continuing Education, NOCE, in North Orange County Community College District. My name is Dulce Delgadillo.
And I'm the Director of Institutional Research and Planning here at NOCE. And I will go ahead and let my colleagues introduce themselves. Go ahead, Jason.
Jason Makabali: Hey, all. I'm Jason Makabali. I'm a senior research analyst over at NOCE. Good to see you all here.
Dulce Delgadillo: And Harpreet.
Harpreet Uppal: Hi, everyone. I'm Harpreet Uppal. I'm also a senior research analyst with Jason and Dulce at North Orange Continuing Education and for our consortium.
Dulce Delgadillo: Awesome. So today, we're going to go ahead and do a deep dive, a bigger deep dive than I think we've done in the past, on consortium data understanding and really get into the weeds and the technical aspects of how it is that we have been looking at our data at a consortium level through a community college institution. Next slide, please.
So this is what we have on our agenda for our time together today. So we're going to do a really quick overview of what the CAEP Summit 2021 presentation was. This was really the starting point of our data literacy journey at NOCE around CAEP metrics and really telling our story about how is it that we started exploring this data. How is it that we utilize some of these tools. And where we are now as a consortium and as an institution in regards to understanding our metrics.
And then we'll jump into really having a discussion around the role of research. And how our small but mighty team has been able to position ourselves in a way to be able to facilitate these conversations around these metrics. And how we hope to utilize this data as part of institutional and consortium planning.
We're also going to go over some tools, so some tangible items that you can take back to your programs and your consortiums, that we have created as part of our consortium to be able to help facilitate those conversations around data. And then we'll go into how we have built this culture of data utilization at our institution and across our consortium, knowing all the ups and downs, and that big red flag of a pandemic impacting our data as well.
But also how we look at it across years, year by year, and for these three-year planning guides as well. And then, lastly, we'll wrap it up with what continuous improvement and planning look like for our consortium. What have we done, and what do we plan to continue to do? Next slide.
So before we jump into the nitty-gritty, we just want to take a pulse of our audience today here at this webinar. And we encourage you to join on this survey really quick. If you could just either take your cell phone and go ahead and use the camera feature to look up this QR code. Or you can log in to slido.com.
It'll ask you right off the bat to put in your number. Go ahead and put in that 8018 853. And please let us know what sector you're from. We want to know who is in our audience today, this early afternoon.
So are you consortium staff? Are you community college non-credit education staff? Are you coming from our K through 12 adult education/adult schools?
Are you a community partner? Maybe a one-stop shop center. Or are you an other, and you don't fit into those categories, but you're supporting this type of work in some other way, shape, form, or capacity? So please let us know.
So we are at eight. We can see mostly consortium staff, all right. It's coming up. We've hit the double-digit mark. And we're going to go on probably for another couple of seconds, minutes.
There we go. Let's see if maybe we can get to 15. We got 14. All right, we got a group. We're almost at 50% of our sample here.
All right, there we go, 50%. I'll take that. It's always a great sample size if we get 50% of our total population. All right, we got more than that.
So mostly, in our audience today, we have consortium staff, so 44%. This is directors, administrators, managers, anybody who is in a supporting staff role. And then, following slightly behind, 1/3 is community college non-credit education. And then slightly over 1/5, we have K through 12 adult ed.
So that was great. Thank you everyone for participating. Next slide. So this is our presentation from October of 2021. And this was what we presented from our consortium.
It was called Empowering Your Consortium to Understand Its Data: Tips and Tricks. And it was really our intent to tell our story of where we started in understanding LaunchBoard numbers and our local data. And how they work together, and how we can validate each of those sources as much as we can within reason.
But also, what are the tools that we developed? And how did we facilitate those conversations? So next slide.
This is just a quick summary of what the session covered. So, like I mentioned, it covered the entities within the consortium and how it functions. And I'll actually go over those pieces again in this presentation.
Pretty much, we at NOCE, North Orange Continuing Education, are a standalone, non-credit institution. We are one of two in the entire community college system. So we don't have any credit courses; no piece of our institution is credit.
We have two credit community colleges, Fullerton College and Cypress College, that are also part of our consortium. But for NOCE, we are only non-credit. And we are a research department that only focuses on non-credit.
And so we recognize that is a huge advantage. And so we have been able to dig into that data. We also covered, again, those data tools, which we will revisit here today.
Excel files. How can you access your curriculum inventory? So there's this awesome tool called [inaudible]. And we'll discuss a little bit about how we utilize all of those free-access tools that anybody can access to really start having those conversations and just getting data into the hands of those who need to have that data in their hands, whether for planning purposes, for faculty, for managers, or for your CEOs, your presidents, your VPs.
And then we also covered data literacy training. So really, what are some of the things that we've done within our institution and within our consortium to build that literacy? Make people feel comfortable with numbers. That's pretty much what it is. Making people feel comfortable with data.
And helping them understand how they can utilize the data and digest it in their way, shape, or form. So we're going to take all of those three pieces and we're just going to do a deep dive into each of those three components that we covered in that first session. Next slide.
So the role of research, like I mentioned: we are a small but mighty team. We are fortunate enough that we are able to dedicate a senior research analyst to just consortium planning. Although, because we are a standalone non-credit institution and we have many other initiatives, we also look at how this initiative, the adult education program, really intermingles with the other initiatives, such as the Strong Workforce Program or your Perkins program, or even other local community college initiatives like the Student Equity and Achievement Program and Guided Pathways.
And so really, as a research department, we have been the lead in not only being able to take that data and validate it to make sure that what's in the AEP LaunchBoard is close to what we have locally, and how we use our local data to replicate those numbers on LaunchBoard, but also to facilitate the conversations. It's not just about getting the data and looking at it, but what do you do with it now that you have the data.
And so that's where that data literacy component comes in. And so not only are we looking at the data, going down the rabbit hole, asking the questions of why are our numbers this way, but it's also about taking it back, producing deliverables and tools that are usable for our audiences and a variety of stakeholder groups, such as our faculty, our administrators, and even our students to some extent, but also our classified staff. And so tailoring it for that.
So the role of research has, I would say, evolved as we have learned more about our data locally, but also as the state has provided more guidance around how non-credit programs should look at their data. And what are those metrics that we are trying to define both at our local level and at our state level, in addition to, obviously, our consortium level, where you have multiple partners in that platform. Next slide.
So a little bit about our consortium. The North Orange County Regional Consortium is made up of three main providers: NOCE, which is the adult education provider. And the vast majority of the students that are being displayed on the AEP LaunchBoard are being served by North Orange Continuing Education.
We also have the North Orange County ROP, the Regional Occupational Program. And so they do mostly career technical education. And then we also have another provider, which is Garden Grove Unified School District. And they really provide support through these one-stop shops in the community.
So really looking at where research fits within this consortium is a bit tricky, because all three of us are NOCE employees. We are a research department as part of North Orange Continuing Education. But, ultimately, we are still supporting the consortium and supporting students that are potentially not even NOCE students but are on the pathway to NOCE, or vice versa, are coming from NOCE and maybe on a pathway to other providers, such as NOCROP or Garden Grove.
So one of the things that I know we struggle with as a research team is getting our hands on the nitty-gritty data of outside partners, so the NOCROP or the Garden Grove data. But our advantage is that we have full access to North Orange Continuing Education data. So we are able to query our data. We are able to pull it from the back end.
We are able to facilitate conversations around processes of how the data is being captured for NOCE students, which is a huge advantage in trying to understand your data. But really today, we recognize that that is not the case for many consortiums. And especially for many community college systems, we know that most of the IR offices are focusing on the credit side data.
And so today what we hope is that we're able to give you tools and language and resources to be able to start digging in. And where you can start asking those questions of how do I make sure that the data on the LaunchBoard is actually reflective of the students that we are serving and all of the work that we are doing as a consortium. Next slide.
And so we do this by providing this ongoing research support through our office to a variety of groups, again, our entire institution and beyond, and to our consortium members as well. And we break up these ongoing research supports into three buckets around data literacy and data quality. So those tools include looking at our curriculum and our course coding.
Making sure that they're aligned. And how do we connect that work with our curriculum committee. Or how do we connect that work with our faculty that are writing the curriculum.
And then we're also looking at our course coding and MIS. How is the course coding that is being done at a local level within our programs by faculty impacting the MIS data and ultimately the AEP LaunchBoard data. So looking at what is actually being pulled into our CTE courses. And I'll give you one example right off the bat.
What we found is that we have an academic program that is called Career Technical Education; it's our CTE program. And so when we were looking at our AEP LaunchBoard data, we saw that not all of our CTE courses were being pulled into it, as a result of course coding. And so there continues to be a data literacy effort across our consortium and across our institution to really clarify that the state is looking at these particular metrics defined by these particular parameters.
And so what are the implications of those parameters on our data. And how do they connect to the curriculum work. How do they connect to the coding work that's being done at a local level.
And then we have that other prong, which is that TOPSpro CASAS data. And so we are a Title II institution. And we are capturing and gathering that TOPSpro CASAS data. So how does that other data source really work in conjunction with this MIS data source. So a lot of data literacy. A lot of moving pieces.
Next, there's this building of a knowledge base. So, again, how do we inform a variety of stakeholders about the importance and all of the nitty-gritty components of all of these pieces that have implications on our AEP LaunchBoard metrics? And that's being done through these pieces, which I'll go into a little bit later.
But things like data and donuts. Or data coaching for a department. Or data coaching for particular stakeholder groups, such as our faculty or our managers, in order to understand that. So that's really part of that.
Not only the data quality and understanding and building the tools, but then now the so what. What do we do with this. And then the last piece is evaluation of our consortium effort.
So, again, that continuous cycle of improvement. And so how are we looking on a regular basis? It doesn't necessarily have to be annually, but how are we looking at our activities and the work that's being done?
And how are we evaluating if it's working, if it's not working, if it's efficient, if it's not efficient. All of those components. Next slide. Now I'm going to hand it over to our senior research analyst, Jason Makabali.
Jason Makabali: Thanks, Dulce. How are you all doing? I'm Jason. And I'll be guiding you through this slide here, I guess.
So we want to give you a little bit of background on why did we decide to embark on this journey of data discovery and understanding. Well, it all stemmed from a question. The question that every president tends to ask their research office. Why do our numbers look this way?
The AEP LaunchBoard first came out pretty recently. And we saw what was there. And then we were just like, OK, that's a great question. We'll get back to you on that one. Yeah, because you obviously don't know what it looks like or how it got there without actually doing some digging.
So we decided to get in there and do some digging. So to start that digging, we're going to tell you our journey. So next slide, please.
So before we get into our journey, it'd be great to know what kind of data you all submit. Do you all submit through CASAS TOPSpro? Or do you all submit through MIS? Or do you submit through both? For a little bit of background on us, we're community college based, of course, so we have our MIS system. But you know, since we are also Title II funded, we do submit our CASAS data, because you've got to get that green, right? So feel free to join in on the Slido, and just have at it.
All right. Can we get to like about 50%? I mean, but sure. You want to call it here? All right. Let's call it here. I mean, it's kind of cool, like a surprise there's no one that's MIS only. That's kind of good to know.
But I mean, yeah, it's cool that we have a great mix over here, you know. I mean, this presentation will focus mostly on MIS-type data submissions and data examinations. But of course, you know, since we do use both, we actually like kind of go into both our CASAS and our MIS data just in very different ways, because they are two very, very different data systems, right?
And it is kind of more about understanding both. But for the focus of this presentation, we're focusing more on how we're going to be understanding our MIS data more so than our CASAS data. So just a heads up. The next slide, please.
So as we mentioned, for us to be able to understand how our numbers actually appear on the AEP LaunchBoard, like how they get there, we kind of have to know our own internal MIS process. Like, how does our data get from here to the Chancellor's Office? You know, that was a great question too. And that's kind of what we had to dig into first.
Our student information system is Banner. We're a Banner school. So we had to figure out, like OK, how does the data that we're collecting, how does the data that our staff is entering into Banner make its way out of Banner and into the Chancellor's Office? So you know, what we decided to do was we're again, as a research office, we are very fortunate that we actually have access to these tools.
For someone who would not like have direct access, it would really help to make connections with like the Community College Research Office, or with the IT department, because they're generally the ones who handle this type of thing, who are in charge of the MIS submissions to see what exactly it is they do. How exactly do they find out what data is being submitted? Like, what is the actual process?
Because you know, what defines enrollment, what defines you know, like all the different elements that are being submitted up to MIS, and you know, that was the first step that we actually had to take. We had to go in there. We went into our Banner system, and we just like scraped through it to find out where exactly all this MIS information was coming from, right?
We were kind of auditing the process, seeing where this process is actually happening, who's in charge of the process, getting to know them better, so that we could coordinate with them to see like, hey, if our numbers look a little bit off, why are they looking the way that they're looking when we're submitting? Because you know, like, if what you're submitting looks off, then when it's transformed, and calculated, and run through all the different methodologies and procedures, that's going to look even more off, right?
So it's really imperative that there's some understanding in how the data is actually being processed, so that your submission is as you know, spot on as you can possibly make it. That was our first goal. So we went in. We tore through everything. We looked to see like exactly how our enrollment files were created, how our student files were created, how our services files were created.
We looked into how we report our courses, our course sections. We tore through pretty much everything. We actually extracted the code to see how exactly Banner works in that sense to create those files. We worked very closely with our district to understand our process.
And we worked with other colleges within our district to try to understand that process, you know? Because ultimately, it benefits everyone to understand what exactly it is you're submitting, right? Because from there, you can actually kind of keep track of how things are actually being done later down the line, or at least get some sort of clarity towards how stuff may be being done later down the line, right?
So that was the first thing we had to do. And that was probably the most important step about it. And we spent a lot of time trying to understand our process. And you know, it's still an ongoing process, you know? Like for example, there are actually new elements that are being added, which we are still trying to figure out, which we're still trying to build upon, right?
You know, we're trying to streamline our process to make sure that we can do things a little bit more efficiently. But you know, from all that, it comes from having an initial understanding of what your MIS process actually is. And that's really where you kind of have to start in order to actually keep moving down the line.
So really, it takes working with like whoever you know, or I mean, if you don't know, I guess we have to ask around and see like who actually is in charge of looking at all this data, who is in charge of submitting all this data, so that you can actually work with them to form that close bond, and see what needs to be done. That way, you can communicate to your constituents like what needs to be done so that they can do what needs to be done, right?
And you know, it's all part of being that well-oiled machine to be able to basically audit your data, right? And so that's kind of where ground zero is, that starting and understanding your MIS process. And then from there, we can go on to the next thing, which is-- next slide, please. We can examine it, right?
So the most direct way to examine the data is through the actual data mart, which I have. Can you please show? Yeah, the login page, yeah. Up right, yeah. So this is where you could actually request what your MIS data is. But since there is personal information, it is behind a login.
And generally speaking, someone in your institution should have access to this to be able to look at this stuff for you if you don't have direct access. So in this case, you would have to work with whoever has this information, so that you can kind of sift through all of this information. You know, again, coming from research, we do have the back end access to be able to look through anything that we kind of need to to be able to audit, right?
But you know, I understand, since we all have different positions here, it would be a matter of finding out who actually has this level of access, who you could actually turn to to be able to look and see what the Chancellor's Office actually has. Because this is where you would actually find what the Chancellor's Office actually has at the raw data level. But if you can't do that, there are other things you can check-- next slide, please. It's the next slide.
I mean, you can do a quick glance through the California Community Colleges Data Mart. It's a little bit more aggregated. It has been run through certain methodologies. But it should give you a snapshot as to what your data might actually look like, to see if there are any sort of glaring oopsies and uh-ohs within your data that might have gone under the radar, right?
Like for example, when we were looking at our Data Mart data one day, we saw that we had absolutely no enrollments for a term. COVID is a heck of a thing, I tell you what. And because of the way the methodologies worked, yeah, the way it kind of looked was like we had no enrollments. So we kind of had to work around and try to figure out what was going on there.
But it's kind of a great start, just looking at the California Community Colleges Data Mart. We'll provide a link to it later on our resources page at the very end. But yeah, it's a good start to be able to look to see, does this data kind of look like what I would expect? Because it's a little bit aggregated, but it's more or less as clean as it can be without actually having the raw data, I guess, is the best way to put it.
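As a rough illustration of that kind of quick sanity check, here is a minimal sketch, assuming a local enrollment extract with hypothetical column names (term, student_id); the idea is simply to flag terms whose counts look implausible (like that zero-enrollment term) before comparing against the public Data Mart figures.

```python
import pandas as pd

# Hypothetical local extract of enrollment records; file and column names are assumptions.
enrollments = pd.read_csv("mis_enrollment_extract.csv")

# Unduplicated student counts per term.
counts_by_term = (
    enrollments.groupby("term")["student_id"]
    .nunique()
    .rename("unduplicated_students")
    .reset_index()
)

# Flag terms that look implausibly low relative to the median term
# (for example, a term showing zero enrollments, as described above).
median_count = counts_by_term["unduplicated_students"].median()
counts_by_term["looks_suspicious"] = (
    counts_by_term["unduplicated_students"] < 0.5 * median_count
)

print(counts_by_term)
```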
And then next. So beyond just looking at our MIS data, there's also the Chancellor's Office Curriculum Inventory. This is actually kind of completely separate from MIS. Although, they kind of live in the same place. The Chancellor's Office Curriculum Inventory, COCI, communicates directly with MIS. However, you know, they're generally coming from different departments.
Generally speaking, your COCI data is being inputted by your curriculum staff at your institution, if you're part of the community college system, right? And generally speaking, this won't have that kind of level of communication that it has with your student information system. So I mean, let's take a look at COCI. But you know, COCI is also very important and very imperative.
Because this is all of your course level data. If you go to the COCI2.ccctechcenter.org, it's open access too, so you can actually look at anyone's courses from here. Click on Courses. And yeah, if you really want it, you can look at ours right now. Just like search North Orange. We ain't afraid of showing nothing over here. We're all friends, right?
So you could look at all of our course coding. And there's a nifty Export to Excel button right there. You click Export to Excel. And then yeah, you could basically see how we code all of our courses. And this is inputted by your curriculum staff, right?
So that's kind of how we kind of audit how our courses are actually going, right? So this is kind of where we see where our curriculum is. And then we have to compare it to our MIS data, which next slide, please. Next slide, yeah.
So this is kind of where the bread and butter is. This is where we have to do our analysis to see like, hey, is the stuff that we're submitting through MIS actually matching what our curriculum officers are actually submitting to the Chancellor's Office Curriculum Inventory? And we want to see what exactly they're kind of submitting to audit it. And this is where all the understanding piece comes in, and where we have to go more into detail about like what's being submitted, and seeing how it's being submitted.
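As a hedged sketch of that COCI-versus-MIS comparison (the file names and column labels below are assumptions for illustration, not NOCE's actual layout), the check can be as simple as merging the COCI course export with the locally submitted CB elements and flagging mismatches.

```python
import pandas as pd

# Hypothetical files: a COCI "Export to Excel" download and a local extract of
# the CB elements submitted through MIS. Column labels are placeholders.
coci = pd.read_excel("coci_course_export.xlsx")
mis_cb = pd.read_csv("mis_cb_extract.csv")

# Merge on the course control number (assumed here to sit in CB00 on the MIS side).
merged = coci.merge(
    mis_cb,
    left_on="COURSE_CONTROL_NUMBER",
    right_on="CB00",
    how="outer",
    indicator=True,
)

# Courses present in one source but not the other.
unmatched = merged[merged["_merge"] != "both"]

# Courses where the top code on file in COCI does not match what was submitted as CB03.
top_code_mismatch = merged[
    (merged["_merge"] == "both") & (merged["TOP_CODE"] != merged["CB03"])
]

print(unmatched[["COURSE_CONTROL_NUMBER", "_merge"]])
print(top_code_mismatch[["COURSE_CONTROL_NUMBER", "TOP_CODE", "CB03"]])
```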
Because what's being submitted and how it's being submitted play into how it's being looked at, and how your numbers will appear on the AEP LaunchBoard. And that's kind of the big thing that Harpreet is going to be discussing in the next slide with our whole course coding Excel spreadsheet, right? So I guess it's off to you.
Harpreet Uppal: Thank you, Jason. Hi, everyone. So I'm actually going to go over some of the actual concrete tools that were created by our office to start the data conversation, right? So all of the things that Jason shared about MIS, about COCI, and us as research team having access to the back end sort of information system, and being able to query all of this information led to us developing this Excel sheet, which I'm actually going to go over the Excel sheet.
But I just wanted to give you a snapshot of what will happen once we pull all of this information. And these are all of our course codes. So if you're familiar with the MIS CB file, these are all our course basic elements. And we pulled every single one of them, because they are used in the AEP LaunchBoard calculations for various metrics to identify specific programs.
So we used the LaunchBoard Metrics Definition Dictionary; we looked at the calculation of each specific metric to identify what's being counted, how does LaunchBoard identify, as Dulce mentioned, the CTE program, what gets included in it, what's being included under adult basic education. Because it's very different from how things are captured in CASAS.
So in LaunchBoard, anything that's pulled from MIS is based on the specific calculations, as you can see on this slide. This is a calculation of how a participant would be identified in the CTE program area, right? So if a student is enrolled in a specific course, and that course meets these criteria, that's how that student would be identified as a participant in a CTE program, and being counted under the Adult Education Pipeline LaunchBoard.
So we pulled all of our active courses for the academic year. And we have been doing this for a few years now, starting in 2019-20. The file in this example is from the most recent '21-'22 academic year, where we pulled all of our courses. And then, based on the different codes that are used for these courses, we try to identify: is this a CTE course? Would that course be identified as a CAEP CTE course? Would that course be identified as a CAEP CTE workforce prep course, and so on and so forth?
We also actually looked at the Strong Workforce definition, because we wanted to see, across initiatives, how course coding impacts whether students would be identified as participants in the specific program areas and specific initiatives. And then the way we did this, the way we were able to identify this true and false, is we used this information from the metrics dictionary calculation, and then we created these Excel formulas. So we converted this language.
So for example, CB22 equals J: the CB22 element in our Excel file was under the BA column. And we tried to see, OK, for that specific cell, for that specific row, OK, is this a J? Then there's this OR statement, CB03 with an asterisk for the top code. Like, any top code that has an asterisk is considered a vocational top code. And that's used to identify whether the course was flagged as vocational.
And then we have an indicator in our Excel file that identifies whether a course has a vocational top code or not. And we were able to use that identifier here. And then there's this OR statement for CB09, A, B, C, and we look to see, OK, the element for CB09 is under this AG column; does it have A, B, or C? And we used all of those different pieces of information to identify: would that course be considered a CTE course? And a student that's enrolled in that specific course would then be counted, obviously taking into account the definition of participant.
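To make that translation concrete, here is a minimal sketch of the same course-level flag in code rather than an Excel formula. It assumes the three conditions quoted here (CB22 = J, a vocational top code, CB09 in A/B/C) are combined with OR, as in the formula described above; check the current Metrics Definition Dictionary for the exact criteria, and note that the file and column names are placeholders.

```python
import pandas as pd

# Hypothetical extract of course-level CB elements; column names are placeholders.
courses = pd.read_csv("course_cb_elements.csv")

# Assumed pre-built indicator for whether the CB03 top code is vocational
# (marked with an asterisk in the top code taxonomy), mirroring the Excel flag.
is_vocational_top = courses["VOCATIONAL_TOP_CODE"].fillna(False).astype(bool)

# Combine the three quoted conditions with OR; confirm the real MDD logic
# before relying on this flag for reporting.
courses["CAEP_CTE_COURSE"] = (
    (courses["CB22"] == "J")
    | is_vocational_top
    | courses["CB09"].isin(["A", "B", "C"])
)

print(courses[["CB00", "CB22", "CB09", "CAEP_CTE_COURSE"]].head())
```

A course-level TRUE/FALSE flag like this is essentially what the columns in the Excel file shown next capture.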
This is just to see just on a level of course coding. So I'm actually going to show you this, the actual file. That's not what I wanted. All right. You should be able to see an Excel file. Let me know if you can see it.
It should have a bunch of columns. All right. So thank you, Dulce. So this is kind of what our office produced. And this led to various conversations with our program directors, with our leadership, and so forth, which we will go over in our data literacy piece. So what we did, as I mentioned, we pulled all of this information from our Banner system.
So we pulled for each course that was active in our system for the academic year. We pulled like-- these are like Banner labels. But this is like a subject code, or a course number, status of the course, what division the course falls in. This is just internal information for our purposes. We wanted to see when that specific course was first offered, what was the most recent term it was offered.
And then, so, a lot of data here. And then these are the CB elements that were pulled from Banner. So if the course had a control number, what was the title that was submitted, what are the top codes. And I'm just going to keep going and going. As you can see, there's a lot of information, and just to show you everything that was pulled.
And this was an internal file. I mean, it began with an internal file, because we were doing the exploration work. And we wanted to have a better understanding of how course coding impacts how LaunchBoard data looks. And that was the kind of initial question, because we wanted to understand why the data looks the way it looks, and then how can we really identify data discrepancies without doing that exploration. So this was really for that purpose to see, OK, what is a course coding issue?
Like, are we not coding our courses right? And maybe that's where there's that gap. And so I'm going to get to the point. There are some blank elements, because there may not be information. But we pulled every single element.
So then once we had all of the information in our Excel file, and then as I mentioned, we had a flag that we created to see whether the top code was a vocational or not. And then this is where we kind of did that whole calculation piece, Excel formulas. And we have some great folks in our team that are brilliant with taking that information from the MDD, and translating it into Excel formulas.
And that way, we can really see, OK, is this information going to be identified in LaunchBoard the way we are thinking, based on the course coding. So as I was mentioning, we looked at, where is our CB03 column? Where's the CB09 column to identify? And that was kind of like trying to go back and forth. That was probably the difficult piece out of all of this.
And then we did that for every single flag, again. Like, would it be identified as an ESL course? Would it be identified as adult basic education, or adult secondary? And then we also had additional columns, because we wanted to see, was there a course that we offered that was not being identified as a key program area, right? Because we do offer emeritus courses.
So we still wanted to do that type of validation as well, to see whether our other courses that may not be CAEP program areas are coded in certain ways where they may be identified or not, right? Or do we have courses where we think they're CAEP program areas, but because of the way they're coded, they're probably not being identified as a CAEP program in the LaunchBoard.
So we did this additional just trying to look at all the information that was available to us. And then we also wanted to see did we have any courses that may fall in more than one program area, right? Just again, based on the way they are coded. So that was kind of all internal.
And then when we tried to present this information to our stakeholders, like program director, or staff members, it was a lot of information. It was a lot of information for us to try to expect people to understand everything and follow us. It was a lot to take in. So we then tried to do an abbreviated version.
The abbreviated version is still pretty dense too. But we did try to make it easier by only giving the elements that were being used in the calculations, right, and not give every single CB element. Because we pulled every single one, but not every CB element is being used in the CAEP calculations.
So we only give the specific ones, like CB03, CB08, CB09. And then the way we labeled things, too: instead of just saying CTE, CAEP ABE, we tried to make it more descriptive. Like, does this class fall under the CAEP ESL program area? Yes or no. Does it fall under an adult basic education program?
So they can really utilize this Excel file to filter, so it's easier. Like, someone can go use this file in their office without our assistance. Like, that was the intention behind this. And then we also try to provide the calculations on the Excel file, so they don't have to go in the LaunchBoard and try to figure it out.
And we don't just give them the Excel file. We hosted individualized meetings, which we're going to go over. So they went in with some familiarity when they had access to this Excel file. And we only gave access to the files at their program level, so that there's no additional information creating-- we don't want to create any sort of confusion.
So we do try to tailor whatever we were providing for our program directors, so that it's useful information. In the end, we wanted them to use this information to really see where the data discrepancies are, right? When we initially started this, we had parenting courses, or courses that are offered under our K-12 Student Success workgroup; those are geared toward training adults who support students for their success, which is one of the CAEP program areas. And we noticed that for one of our main courses, the top code that was being utilized, or that was being used to code that course, was not a top code that was being used in the calculation in the LaunchBoard.
So any student that was enrolled in that course was not being counted under that program area. So this is the program I'm talking about. Our course was not utilizing one of these top codes, and we had to work with our program director, with the faculty.
And this is the part Dulce was also talking about: getting this information back to our faculty, to our academic senate, to show them the importance of course coding. Because, you know, course coding is relevant to the curriculum. And so it happens at the beginning. So going back and showing them how everything, the way they code certain things, their curriculum, their courses, does have an impact on what data gets reflected on the LaunchBoard. So we have to go back and forth, and kind of provide that information, and show them how everything aligns.
And then our faculty were finally able to resubmit that curriculum with one of these top codes, the top code that was relevant for that course and that also falls within this program area. And then we are now able to see the students enrolled in those courses actually being reflected and counted under LaunchBoard.
So that was kind of like one example of where the things that we have been doing. So let me go back to the PowerPoint. And then I'm going to send it back to you, Dulce, if you want to start us on how we have been using this information.
Dulce Delgadillo: Yeah, thank you, Harpreet. So that was a lot. So yeah, it really is creating these tools, and having these conversations within IR, about what makes sense for each of these stakeholders, right? Because what makes sense for faculty when they're having these conversations around course coding, these tools around curriculum development, is going to be a different conversation than you may be having with your managers, or with your VP of instruction, right? So taking that into account has been really key in how to take back that data and put it into practice.
And so going back to this building a culture of data utilization, it's really been around being able to create avenues and platforms to have these conversations. And so our office has really taken the lead in trying to provide those platforms. And so that includes not only going in and doing institution-wide things, such as Data and Donuts workshops, or something on opening day for our school, and just saying, this is why it's important to make sure that students are completing these forms in an accurate manner.
Or if you see a student struggling in completing these forms, which we know from our perspective are feeding into this MIS data, then really focusing our efforts on training the individuals who are entering that data on the front end to ensure that that data is accurate.
So that again, when we're extracting that data, it makes sense. It's reflective of what is actually happening on the ground. So Data and Donuts workshops have really been helpful for that. And then we also do data coaching. So next slide, because I think actually we go into each of these pieces.
Yep, so these are just a couple of the Data and Donuts that we have had across our institution. These are workshops that are open to anybody and everybody. So we encourage students to come, faculty, classified managers, consortium members. And with Zoom, it's even more accessible. We record them. We put the PowerPoints up on our website.
We put any type of resource. We share it out. We blast it through email. We blast it in our committee work just to make sure. And we focus on different things.
So you can see here, these are three of the ones that we've had. So we did one on Data, Donuts, and Dashboards. So our research department has been developing some data dashboards around our own metrics. So understanding those dashboards, in addition to the statewide dashboards, right? So state is doing dashboards for adult education, for student success metrics, which is looking at adult education only, or ESL only students, right?
Anything that's pretty much a data visualization tool, we are walking them through how to interact with it. Again, just getting that comfort level, and meeting them where they are in their relationship with data, in a sense. Data and Donuts, down below, it was really kind of-- and I believe this one was around our CAEP metrics. So again, we really have to start broad in just saying, these are the things that are being measured. Let's not even go into the nitty gritty. Let's just wrap our heads around what an enrolled non-credit student in CTE means, right?
And so we really have found that one of the biggest lessons learned, I think, in starting this journey of data literacy, and a deeper dive into our data, has been that we really need to break these things into small, digestible chunks, right? So we tried several times just to share all of the metrics, right? And we always only had like an hour.
At the end, we found ourselves just rushing through those last couple of metrics. And people weren't really grasping them, because we were repeating ourselves over and over again, right? And so now moving forward, we've really kind of said, OK, let's step back, and let's just look at two or three metrics. Let's just understand what it takes to get a student into the CTE denominator in AEP LaunchBoard.
And that's where we're getting more into the nitty gritty, with things such as the Data and Donuts Course Coding edition. For this one in particular, we actually had our faculty chair of the Curriculum Committee. She came in and she explained to the audience what the curriculum process looks like at our institution, all the way from the start of, we're thinking that maybe we should offer this class because there's a need. So how do we go from that all the way to it being approved and offered at our institution?
And now, we're starting to have those conversations of, let's start having conversations between the Curriculum Committee and research to see what the implications of these codings are on our AEP metrics, or even on other metrics, like in Strong Workforce. That Excel file that Harpreet shared, they have done that for the Strong Workforce Program. They have done that for Perkins. So it's a tool that you can utilize not just for AEP, but across any other adult ed initiative, or technically, you could even do it for credit initiatives. But it is a very helpful tool. Next slide.
So that's at the broader institutional level. And then we found a very wide range and spectrum of data literacy capacities across our institution. So we had some faculty members that were ready to dig into data: give me more, I have all the questions, none of the answers, and I just want to get into the data.
And then we had some individuals in the middle, where they kind of knew about it, but they weren't sure how to interact with it. And especially with the new data tools where you're interacting with data, it was very much around the data coaching of, what are the things that you're interested in? And again, creating kind of that buy in in why is data important, and why we should look at it. And what is your role in all of this, right, whether you're a faculty, a classified, a researcher, an administrator, or a community member, or even a student to some extent.
So individualized meetings, we held adult education provider specific meetings. So I know Harpreet and Jason went over to Garden Grove, and to our community partners to really focus in on their data, and building their capacity, and understanding their own local data, because they're not using MIS data. They're using TOPSpro data. So really, we have to familiarize ourselves with those tools as well, since that is impacting our consortium.
We did presentations during our weekly directors meeting, so again, breaking up those metrics in chunks. We have one scheduled during the summer, where we're going to be asking directors to say, for each of your funded activities, what is the goal? What is the intention? And how can we build a logic model around that goal of your strategy in order to make sure that at the end of the year we measure what should be measured? That we are measuring and using a metric that makes sense for the strategy that you are implementing.
Is the intent for students to transfer? Is the intent to increase the knowledge of students in X, Y, and Z, increase confidence? Whatever it is. But having those conversations around intentionality of these activities that are taking place, and how do we measure them in an accurate manner?
And then we went to our consortium executive committee, so again, tailoring these to a much broader level of what it is that consortium voting members need to see and understand, and making sure that they understand who is being counted, how are these calculations being impacted by how the state Chancellor's Office is defining them. What are some of the implications of these changes that are coming down the pipeline, either due to COVID, or because institutions across the state aren't providing that data, or X, Y, and Z at a local level. And lastly, presenting them at your committees, so at your academic senate.
Our goal is still to present at the Curriculum Committee, or at the Professional Development Committee, or a Budget Committee, right? So how can you link this work to the committee work that's happening outside of your research, or your consortium world? Because ultimately, it is impacting those numbers. So kind of connecting the dots again, and looking at that evaluative cycle, has been very helpful to bring it back to all of those committees to let them know what the work is, and then again, those implications on those metrics. So next slide.
All right. Actually, this is an example of one of the data coaching sessions. I mean, we went into the nitty gritty, right? So we would say, OK, we are looking at this specific code. This is how you get to the top code. This is where it's extracted from, those tools that Jason and Harpreet went through, those publicly available tools. We can say, this is how your top code is defined right now, right? And so why does this matter?
And we would go through every single CB element and say, this is where this code is, and how it's defined, and what it is. And this is where you can find that information in a tool, such as the Excel file that we're providing you. This is why it's important.
Because your CB03, your top code is defining and determining where your students are being placed, and counted for in AEP LaunchBoard, and those are the implications of this. So that as you're doing your new top codes, you're having that conversation already, right? So that's really the intent of these data coaching efforts. Next slide.
And those are the Excel files. Yeah, so we would literally just say, like this is the Excel file. We would highlight just the way we did it for you right now, because we want to make sure that it's connecting to them. Because a column of CB03, and a random string of numbers isn't going to tell our users anything, right? They need to know why is this important. Why are we specifically telling them to look at this? And that's where those definitions and additional resources really play a role in that. Next slide.
And another one that we used was CB09, so our SAM priority code, right? So top code and SAM codes, how are those defined? Why does this matter, whether it's coded as an A, B, C, D, or E? This is also being used as an auditing tool by some of our programs. So should this course be coded as a D? Or should it really be coded as a C, right? Is it really possibly occupational, or clearly occupational?
And I think even with the top code and the SAM code, it's a good example of these being the other two elements that are really important for other adult ed initiatives as well. And then we bring up that Excel file and we say, when you look at your CB09, columns W and X, this is why it matters, right? So connecting the dots for them.
All right. And so for this one, we always bring it back to that adult ed LaunchBoard, right? Because we want to make sure that we bring it back. This is where usually, our users are going to get that data, the adult ed LaunchBoard.
Usually, this is kind of what we're doing our comparisons with for our internal data numbers. And so when we bring it back to showing them, we really have to connect again, how is that coding on the right hand side? So again, we looked at your top code, your CB03, right? What is your Workforce prep code? Are you an apprenticeship? Or are you occupational?
All of those things in combination, so you're not just looking at one thing, because it's a combination of several things. How are those impacting whether or not your ESL student is part of those 5,060? Or your ASE student is part of your 1,461? Because the course coding of the course that student is enrolled in is going to matter. And this is how we're connecting the dots for them. Next slide.
And then the last one here, we really need to bring it in for TOPSpro. Again, because we have those two data sources, we need to make that connection for our stakeholders within our consortium. So you know, why are we doing both of these? Well, we are a WIOA school. How are we utilizing the data?
And so we, as an institution that utilizes both of these sources, have had to take that additional step to say, this is how some of these elements in TOPSpro align with the elements that are in MIS, right? So whether it be your education, or completing a certificate, or, what's it called, a barrier-- identifying a barrier. And so making sure how those align.
And so from our perspective, from a researcher's perspective in a multi-college community college district, we have had to see how does this data actually flow into our existing Banner data? How is it being taken into account? And right now, it's not. They are two completely separate data sets, right?
And so our role right now where we're at with our institution is really making sure that the Dulce Delgadillo in CASAS is the same Dulce Delgadillo in Banner in our student information system. So we have to again, take that additional step of validating between those two sources, and then communicating that to individuals, to the staff that are completing these forms, to the staff that are entering the MIS data through our Banner system. So again, connecting those dots for individuals in our consortium and our institution.
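As a simplified sketch of that cross-system validation (the shared identifier and column names below are assumptions; in practice the match key depends on what both systems reliably capture), the check amounts to merging the two exports and reporting who only shows up on one side.

```python
import pandas as pd

# Hypothetical exports: one from CASAS/TOPSpro, one from the Banner/MIS side.
casas = pd.read_csv("topspro_student_export.csv")
banner = pd.read_csv("banner_student_extract.csv")

# Outer merge on an assumed shared student identifier, keeping track of which
# side each record came from.
matched = casas.merge(
    banner,
    on="student_id",
    how="outer",
    indicator=True,
    suffixes=("_casas", "_banner"),
)

only_in_casas = matched[matched["_merge"] == "left_only"]
only_in_banner = matched[matched["_merge"] == "right_only"]

print(f"In CASAS/TOPSpro but not in Banner: {len(only_in_casas)}")
print(f"In Banner but not in CASAS/TOPSpro: {len(only_in_banner)}")
```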
Harpreet Uppal: Oh, and I just wanted to add to this slide, Dulce, that, as was mentioned previously, our other entities, like ROP and Garden Grove, submit their data through TOPSpro. So while for NOCE we were doing a lot of that work at the MIS level and on course coding, we also try to do data coaching with other constituent groups from those two entities.
And really, kind of like as Dulce pointed out, like showing them like whatever you're collecting on your CASAS forms, right? So this is like that example from CASAS update form on like where you are filling out what are some of the milestones the students may have reached, whether that's a certificate and whatnot, and then how that all aligns with the different metrics, right? So we try to color code things for them, taking inspiration from like Jay Wright's CASAS presentations.
And then really trying to link, OK, this metric, and the way things are captured here, aligns with how you fill in information on the CASAS form, how you collect information, and submit it through CASAS for that student. And then tying it, again, back to the AEP methodology. Because the LaunchBoard, I believe, receives a flat file from CASAS. So the way the elements are here, they may not look exactly like, or be worded exactly as they are, on the form.
So EDU certificate means earned a certificate. So if someone indicated earned certificate on this, that's what it would translate into. And then that's how the information is then being used in the AEP LaunchBoard calculation. And that's how the student gets counted for that specific metric. So again, while they are two different systems, the way the LaunchBoard calculation works, you can read it very similarly, as long as you're familiar with what these different labels are and what they mean. So that was another way for us to build that knowledge base around our consortium members.
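As an illustration of that kind of crosswalk, here is a tiny sketch mapping flat-file-style labels to the plain-language outcomes they stand for. The labels are hypothetical examples for demonstration, not the official CASAS flat-file schema; the only correspondence taken from the presentation is that the "EDU certificate" element stands for earning a certificate.

```python
# Illustrative crosswalk only; these labels are hypothetical examples, not the
# official CASAS flat-file schema.
FLAT_FILE_LABELS = {
    "EDU_CERTIFICATE": "Earned a certificate (update/exit milestone)",
    "EDU_HS_DIPLOMA": "Earned a high school diploma or equivalency",
    "WORK_ENTERED_EMPLOYMENT": "Entered employment",
}

def describe(label: str) -> str:
    """Return the plain-language meaning of a flat-file label, if known."""
    return FLAT_FILE_LABELS.get(label, f"Unknown label: {label}")

print(describe("EDU_CERTIFICATE"))
```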
Dulce Delgadillo: Yeah, yeah, definitely.
Harpreet Uppal: And yeah, and this was a fun slide. We have shared a lot of information, not just here, but with our consortium members as well. And when we have been doing these workshops or individualized meetings, we have learned lessons on this journey, because we're living in this data world; we see things from a different perspective. And data may not come as easily to others, not because it doesn't come easy, but more so because our directors, our stakeholders, or executive committee are not living day-to-day in this data world. They're not as familiar with the methodology, the MDD for the AEP LaunchBoard, as we are, because we're looking at it every day.
So we realize that we really need to break this information into small chunks, right? And that's how we have been doing our data coaching now. We started with, like, let's share everything. Like, let's do a data dump. But we're like, OK, no, that's not fair.
And now, it's back to like, OK, how can we cater toward our audience? And I think we have gotten a better reception around our presentations and the way we have been sharing data. And then having then our stakeholders utilizing that data, so making that data digestible.
So I wanted to now shift gears. And I know that we have only about 27 minutes. So I really just wanted to quickly go over our process. Because whatever we have covered up until now has been around the ongoing support we have been providing to our consortium, to NOCE Institution.
But we wanted to focus on like where research fits within the CAEP three-year plan, right? Because that has been on everyone's radar. Everyone has been working on it. So like, where do we fit in within our consortium? And what have we done to provide that support?
Because I'm sure, as you all know, in the CAEP three-year plan, we have to set targets around several metrics. There are mandatory metrics. There are optional metrics. And how we have provided data, and different types of data, to our consortium members for them to really work and answer those questions around the three-year plan. So we created this planning guide, which I'll quickly share.
Everything in terms of the planning guide and addendum is on our website as well, if you ever have time and want to look at it. And then also what we have done internally to provide information by pulling internal data for NOCE. For '21-'22, as you know, we're still in the academic year. That data is not on LaunchBoard.
We recently got access to the 2021 data. So 2022 data is not even available in LaunchBoard until next year. So like, how can we provide baseline numbers for our consortium leaders to set those targets? So that's kind of what our goal was.
So our planning guide was really to assist with this three-year planning at our consortium. And it was really driven by the guidance document that was provided by the state. We tried to use that information that was in the guidance documents, and especially those guiding questions, and provide data for our consortium members, so that they have like a comprehensive document in front of them as they are trying to understand the data about our communities.
So what do the adults in our community look like? What are their needs? And how can we serve them? And who are we currently serving, looking at the LaunchBoard data? And then trying to provide additional information, like, because we are in Orange County, what does the labor market data look like for Orange County? What are the employers looking for in Orange County?
So trying to get all of that information in one document so that it's synthesized. And then our stakeholders when they're having conversations around the CAEP three-year plan, they can utilize that information. So that was our intention.
So just to give you an example, as I mentioned, we were really using the state document as a tool to align our data information, or whatever we provided in the planning guide. So in the state document, there were guiding questions like, who are our current customers? Where can we find this information? What characteristics define the regional community? Where can we find information?
So we used those guiding questions. And then provided, you know-- and as you all know, we all had access to the fact sheets, where for each of our consortia we were able to identify what the demographic data looks like for the adults within a region. So we utilized all of that information in our tool, which I'm going to go to next. And I'm going to cover this pretty fast, because I do want us to have time for questions at the end, or any discussion.
So this is kind of what our planning guide document looks like. There are a lot of pictures, a lot of charts. Again, this information was pulled from LaunchBoard. And we were just trying to put it in one comprehensive document. As I mentioned, we were not only looking at LaunchBoard data, we were looking at fact sheets.
We try to provide narrative for all of the different pieces of information that we were providing for our consortium members. We pulled information from different reports within our county. And we embedded links in everything so that if people want to delve deeper into whatever was being provided, they could do so.
So what does Orange County Community indicators report looks? What's being indicated there? I'm going to breeze through this. We looked at our Strong Workforce Program, plans for our region. Like, what are the employers looking for? We looked at other regional plans for our county.
Again, just trying to provide as much information as we can, because that was the demand in the three-year plan. Not only look at what the needs of the adults in the region are, but also look at where the need is in terms of the labor market. And what are we going to do as a consortium to meet the needs of the adults, and the workforce needs of the adults within our region?
So after we provided all of this information, we also pulled the data we wanted to provide for every single metric that was mandatory and optional on the CAEP three-year plan. We pulled all of the information that was available to us in the Adult Education Pipeline LaunchBoard, which goes back to 2016-2017. So we provided data for what the state data looks like, what the data looks like for our consortium, and what the data looks like for the three adult education providers within our consortium, so NOCE, ROP, and Garden Grove. And we did that basically for every single metric.
And then once the new 2021 data was released in LaunchBoard, and as you may know, there was a change to the methodology on some of the metrics, we went back and we created an addendum. And we provided the same information, but now it included the 2021 data. And then we provided updated numbers wherever there was a change in the methodology.
So again, all of this information we provided, because we wanted our consortium members to have access to all of this information as they're working on the CAEP three-year plan. And in addition, we also provided internal tables for our mandatory metrics where we have to set targets. We pulled actual internal numbers using the AEP LaunchBoard methodology.
And we were able to do so, as we had mentioned previously, because we have access to backend data from our student information system. So we were able to mimic and basically replicate the Adult Education Pipeline LaunchBoard data. We first tried with 2021 data, because we wanted to see how close our numbers could be. And we were within half a percentage point.
And that's because the LaunchBoard data looks system-wide, and we only have access to NOCE and other colleges within our district. So we were still able to get very close. And once we felt comfortable with our replication, we were able to then provide internal numbers for '21-'22 for our consortium on the mandatory metrics. And we were even able to get it to a program level, because our program directors wanted to see what the data looks like at a program level, so they can better understand how the data looks, and how they can set targets that will overall become institution-level targets.
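As a rough illustration of the replication approach described here, a minimal sketch in Python with pandas might look like the following. The file name, column names, filters, and the sample LaunchBoard figure are all assumptions for illustration only, not the actual AEP LaunchBoard methodology, which involves more rules than are shown.

```python
import pandas as pd

# Hypothetical extract from the student information system; column names are assumptions.
# Expected columns: student_id, academic_year, caep_program_area
enrollments = pd.read_csv("noce_noncredit_enrollments.csv")

# Keep only enrollments coded to a CAEP program area (mirrors the newer
# "adults served" definition, which drops non-CAEP areas such as Emeritus).
caep = enrollments[enrollments["caep_program_area"].notna()]

# Adults served: unduplicated students with at least one CAEP-area enrollment per year.
internal = caep.groupby("academic_year")["student_id"].nunique().rename("internal")

# Published LaunchBoard figure, typed in by hand for comparison (illustrative value only).
launchboard = pd.Series({"2020-21": 12500}, name="launchboard")

comparison = pd.concat([internal, launchboard], axis=1).dropna()
comparison["pct_diff"] = (
    (comparison["internal"] - comparison["launchboard"]) / comparison["launchboard"] * 100
)
print(comparison)  # a gap within about half a percentage point suggests a close replication
```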
And additionally, we are also working on this document. Instead of giving a lot of information, we're trying to find avenues, or platforms, where we can present and synthesize this information in one-pagers, right? So we're starting from the Excel sheet that, as Dulce mentioned, is very nitty-gritty, really getting at a course level, and we're trying to identify information at a program level.
How are the courses that are in the program doing overall? And with these check marks, we show where the alignments of our internal programs are, how they align to the CAEP program areas, right? And then, if students are being served in a specific program area, where do they fall within the CAEP program areas, and then where do they fall within these different CAEP metrics, right?
So, what can the NOCE programs do to support students so that all of their achievements and successes can be aligned to what's being reflected on LaunchBoard? So that's kind of our intention. And again, tying it back to the course coding. And we keep doing that because it really matters. How courses are coded will determine where they fall within the CAEP program areas.
And where they fall within the CAEP program areas on LaunchBoard matters, because certain metrics are only calculated for specific programs. Under progress, for example, EFL gains are only calculated for adult basic education, adult secondary education, or ESL. Transition to postsecondary is also only calculated for specific programs, like ESL, adult basic education, and adult secondary education. So that's why, when we are presenting this information, we tie it back to how your course is coded. Then going back to our presentation.
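To make the coding-to-metrics dependency concrete, here is a small sketch. The course codes and the mapping below are hypothetical and only encode the two restrictions mentioned above, not the full LaunchBoard metric definitions.

```python
# Hypothetical local course codes mapped to CAEP program areas.
COURSE_TO_PROGRAM_AREA = {
    "ESL-101": "ESL",
    "ABE-040": "Adult Basic Education",
    "CTE-210": "Short-Term CTE",
}

# Only the two restrictions discussed above are encoded here; the real
# LaunchBoard definitions cover more metrics and more conditions.
METRIC_ELIGIBLE_AREAS = {
    "EFL gains": {"Adult Basic Education", "Adult Secondary Education", "ESL"},
    "Transition to postsecondary": {"Adult Basic Education", "Adult Secondary Education", "ESL"},
}

def metrics_for_course(course_code):
    """List the metrics an enrollment in this course could contribute to."""
    area = COURSE_TO_PROGRAM_AREA.get(course_code)
    if area is None:
        # A course not coded to a CAEP program area never shows up in these metrics.
        return []
    return [metric for metric, areas in METRIC_ELIGIBLE_AREAS.items() if area in areas]

print(metrics_for_course("ESL-101"))  # ['EFL gains', 'Transition to postsecondary']
print(metrics_for_course("CTE-210"))  # [] under this simplified mapping
```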
Dulce Delgadillo: Harpreet.
Harpreet Uppal: Yes.
Dulce Delgadillo: There's a question in the chat. And I think actually it might be good to show. I'll read the question. Are you able to speak to any challenges your consortium faced with their college numbers being high in Cal-PASS due to the AE202 notes?
With the switch to online delivery methods of instruction in spring 2020, colleges could not consistently code positive attendance hours for non-credit. Therefore, the threshold of 12 hours, which aligns to WIOA Title II reporting for adult education participants, has been adjusted for non-credit students enrolled in spring 2020, or any term in the 2020-2021 academic year, allowing inclusion of those students without meeting the threshold of contact or positive attendance hours.
So it is my understanding that some colleges had higher numbers, and some colleges had lower numbers, correct? Because it's not taking the hours into account. And so for us, that report that Harpreet had just shown, those were the internal numbers that were calculated by these definitions.
So the challenge, I don't know if there was-- because Harpreet, I can't remember, is it higher? I believe ours came out lower, right? But that was an implication of the older adults being removed, not necessarily because of the 12-hour threshold. So there's other things kind of playing into these numbers as well, right?
But for the 12-hour threshold, we didn't really run into any challenges, except that we would now need to very much distinguish-- which we had to do before anyways-- between students that just receive services and have no hours, because those students are still being counted, and students that received any type of instruction with hours. And that's really how we're looking at it for us. I'm not sure if you wanted to share anything else on this particular topic, Harpreet.
Harpreet Uppal: No, I think, if I remember-- and Jason, please chime in-- I don't believe we had that issue where our numbers were higher because of that change. As Dulce mentioned, we definitely saw a decrease in our adults served counts. That's because prior to the new methodology, the adults served, or reportable individuals, included all non-credit enrollments. So it did include our Emeritus program.
But with the change to enrollments only in CAEP program areas, we saw a decline in our numbers moving forward. And because we didn't collect hours during the pandemic time-- I believe, and please correct me if I'm wrong, we have since started collecting hours-- but the definition does not account for hours. It only looks at enrollments. So we didn't see a big change in terms of an increase or decrease.
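As a rough sketch of the hours rule quoted in the question above, the participant flag might be derived locally along these lines. The term labels and field names are assumptions; only the 12-hour threshold and the spring 2020 / 2020-21 waiver described above are encoded.

```python
# Terms for which the 12-hour rule was waived for non-credit students, per the
# data note quoted above; the exact term labels are assumptions.
WAIVED_TERMS = {"2020 Spring", "2020 Summer", "2020 Fall", "2021 Spring"}

def is_participant(instruction_hours, term):
    """Reportable individuals become participants at 12+ instructional hours,
    except in the waived pandemic terms, where enrollment alone is enough."""
    if term in WAIVED_TERMS:
        return True
    return instruction_hours >= 12

# Students with services only (no instructional hours) stay reportable
# individuals but do not count as participants outside the waived terms.
print(is_participant(0, "2019 Fall"))    # False: services only, threshold applies
print(is_participant(0, "2021 Spring"))  # True: waived term
print(is_participant(15, "2019 Fall"))   # True: met the 12-hour threshold
```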
Dulce Delgadillo: Yeah, I don't remember. But I could see it being difficult if you continue to pick up the hours, and you're not exactly sure what to do with them now that you have them. Jason?
Jason Makabali: Yeah, I mean, kind of the big thing was the participant counts are a little bit wonky. But in terms of how we actually collected attendance on this in terms of reporting for apportionment, you know, everyone's actually still enrolled as kind of the thing. So instead of looking at the participant counts per se, we kind of aligned it more towards, hey, who's actually enrolled? Who are we actually serving? Who are we actually collecting apportionment on?
And those are the kinds of numbers that were close-- that if you want to do more apples to apples comparison, that would kind of be the avenue we kind of went about it, right? Because you know, ultimately, they're still our students. Ultimately, they're still like within our enrollment file.
The difference is in terms of the label of participant versus the label of adult served, you know, we're just looking at it from an enrollment lens instead. And that's kind of our approach to how we actually really want to look at it. So that kind of gave us like an enrollment at this period of time versus enrollment that period in time, right?
And it is, because in terms of why we couldn't really collect apportionment, or we couldn't collect students' hours, it was because the way apportionment was being funded is through the alternative attendance accounting method, right? Which is based on census one, census two, which is more in line with how credit generally collects their attendance, right? They don't actually do positive attendance hours either.
It's just kind of a holdover, because we're adult ed, and we're held to a different standard versus looking at census dates. So I mean, when looking at it from a more enrollment type perspective, I really think it's pretty similar. And that's kind of the lens that really we thought was the best route to take.
Dulce Delgadillo: And one of the big questions that we have is, because-- so our enrollments are being captured through MIS. And we're not-- and Harpreet and Jason, I'm not exactly sure if that's kind of the base file. And if you have additional TOPSpro data, whether it would be merged onto that data, that MIS data.
And in a sense, the hierarchy would be the top would be the MIS data file, and then everything else kind of supplements it. But we know that that is probably not going to make sense for obviously, the institutions that just have TOPSpro, right? And I'm not sure if there's institutions that have just MIS.
So we're still trying to wrap our heads around how these data are being matched between both of these systems. Now, we know they're being matched by first name, last name, gender, and date of birth. But you know, we usually use a term called cooked data.
So we're getting cooked data, and we don't know what the recipe is on the back end of how they're getting to that point. And we're trying to replicate that at a local level, which Harpreet and Jason got very close to within a percentage point. So it's that replication that allows us to determine what exactly are these implications of things like that, like a definition change like we're going to throw hours out the window.
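Just to illustrate the kind of match being described, a local approximation might look like the sketch below. This is only a guess at the state's recipe; the file names, column names, and normalization steps are all assumptions.

```python
import pandas as pd

def match_key(df):
    """Build a crude match key from first name, last name, gender, and date of birth."""
    return (
        df["first_name"].str.strip().str.lower() + "|"
        + df["last_name"].str.strip().str.lower() + "|"
        + df["gender"].str.strip().str.upper() + "|"
        + pd.to_datetime(df["birth_date"]).dt.strftime("%Y-%m-%d")
    )

# Hypothetical extracts from the two systems.
mis = pd.read_csv("mis_students.csv")          # MIS treated as the base file
topspro = pd.read_csv("topspro_students.csv")  # TOPSpro data supplements it

mis["key"] = match_key(mis)
topspro["key"] = match_key(topspro)
topspro = topspro.drop_duplicates(subset="key")  # keep one TOPSpro row per key

# Left join keeps every MIS record (the hierarchy described above) and attaches
# TOPSpro data wherever the name/gender/birthdate key matches.
merged = mis.merge(topspro, on="key", how="left", suffixes=("_mis", "_tp"), indicator=True)
matched = (merged["_merge"] == "both").sum()
print(f"{len(mis)} MIS records, {matched} with a TOPSpro match")
```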
Harpreet Uppal: OK. Thank you, Jason. Thank you, Dulce. All right. And I'm going to quickly try to wrap up, because I still wanted to leave time for a discussion. So just we're very close to the end.
So now that we are very much close to finishing our CAEP three-year plan, we wanted to talk about, OK, how is research going to provide, again, the ongoing support? And how is data going to be utilized at our consortium for that continuous improvement and planning, right? So what we have been doing annually, as Dulce mentioned earlier too, is we have been creating these annual evaluation reports.
And initially, we did it in 2020 using our 2019-20 data, because we wanted to understand and evaluate our consortium efforts and activities. So we followed the activities or strategies that took place in 2019-20 using the CAEP funds. We wanted to determine, did the students end up achieving those outcomes?
You know, first we wanted to see, was the strategy implemented? Was it implemented as it was outlined on paper, right, through those logic models? And then, was there data collected? And if data was collected, we tried to get access to that data, whether it was enrollment data we pulled from our student information system, or service data. We worked with either our housing department or our disability support services department, and we tried to get access to that data, whether it was living in Excel files or in another software.
And then once we compiled all of that data, we cleaned it and we analyzed it. And then we tried to measure the effectiveness of each of those strategies. And then we published, or we created, this report in 2021 based on all of that work that was done in 2019-20. And we wanted to see where the areas of growth are.
So we really wanted to see if maybe there were issues with data collection. We wanted to see where those gaps are, and whether we are reporting all of the data that needs to be reported. If we're providing services to students, are we actually reporting all of that data through our different channels, and through MIS?
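For illustration, the compile-clean-measure step described above could look roughly like this sketch. The input files, column names, and the simple outcome-rate measure are assumptions; the actual evaluation reports weigh far more context than this.

```python
import pandas as pd

# Hypothetical inputs: who participated in which CAEP-funded strategy, and outcomes.
participation = pd.read_csv("strategy_participation.csv")  # student_id, strategy
outcomes = pd.read_csv("student_outcomes.csv")             # student_id, achieved_outcome (0/1)

# Clean: drop rows without an ID and remove duplicate student/strategy pairs.
participation = participation.dropna(subset=["student_id"]).drop_duplicates()

# Attach outcomes; students with no outcome record are treated as not achieving one.
merged = participation.merge(outcomes, on="student_id", how="left")
merged["achieved_outcome"] = merged["achieved_outcome"].fillna(0)

# A simple effectiveness measure: share of participants in each strategy with the outcome.
summary = (
    merged.groupby("strategy")["achieved_outcome"]
    .agg(participants="count", outcome_rate="mean")
    .sort_values("outcome_rate", ascending=False)
)
print(summary)
```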
So that was the purpose of our initial report. And then once we did that, we wrote that report, and we presented it; we were very excited. We presented all of the findings. Then we got feedback, and it was like, oh well, it was very narrow. One of the main responses that we got was that the scope of the report was very narrow.
We were focusing on a funding source. We were only looking at the activities that were being implemented using CAEP funds. And it wasn't looking at our programs as a whole. So the successes that we were presenting were very narrow, very small numbers of students, again, because we were trying to follow the money. And then we changed our scope.
And then in 2022, earlier this year, we did another evaluation report, of our 2021 activities. However, this one was more at a program level. So we focused not on the funding source but, very similar to how LaunchBoard looks at the data, we looked at a program level. And then we looked to see what the successes look like for the students who were being served under these CAEP program areas.
And moving forward, we don't know how the scope is going to change, because we have program directors that want to look through a larger lens, and we have our consortium director who wants to follow the money and do that return-on-investment kind of analysis.
So we are trying to find a middle ground where we can provide all of the information that's necessary for both consortium planning, and for the program planning. So that's where we are. And we will continue to do these type of reports annually.
So we are done with the presentation part, and now on to the discussion part. I apologize, we only have six minutes. But we kind of wanted to get a feel from you through this poll: do you have access? As you remember from our initial poll, where we asked you what your role is, or where you fit within the consortium, we wanted to see, do you have access to anyone who would examine consortium data at a more granular level, such as a research team, consortium analysts, or data managers?
Because a lot of the things that we have presented are coming from a research team. And again, we are at an advantage, because we have access to that level of data to do this type of granular analysis. But we were wanting to see where do you fit in, and kind of open it for discussion. Yeah. All right? Thank you to my four that have responded.
So we have about two that say no. I know given the time, we might not have a lot of time. And then, Dulce, do we have questions in the chat before I even open for discussion? So we kind of wanted to hear from you. But I apologize we don't have a lot of time.
But we just really wanted to take this time to discuss like, if you don't have access to a researcher or a data person, how can this type of work that we have been doing fit within your role? Like, what can you do in your role, in your capacity, and try to dive deeper into your data? So this was more just kind of like, we wanted to just have an open dialogue around this.
Because we are available to our consortium members, and being part of the research team, we have direct connections, and they have direct connections to us. And as Dulce mentioned earlier, I am a Senior Research Analyst funded through our CAEP programs. So I am in that role where I support all of their research data needs at a consortium level, and at our institution level, and also provide that support to our members that are outside of NOCE, but within our consortium. So making myself available for our stakeholders within the different other providers.
And because I'm part of a research team, I also have access to my director, and Jason, and other members of our teams. So we work very much collaboratively. And that's what is being reflected in all of the data tools and products we have shared, is because we have that accessibility to data. And we know that that might not be the case for everyone.
And if anyone wants to share in the last three minutes-- if not, I will pass it back to the TAP team. If there are no questions, or if no one-- if folks want to come off mute, you can. Otherwise, we are kind of done. Oh, and I wanted to provide our contact information if you have any questions regarding what we have shared. And we will share our PowerPoint once we remediate it and it's 508 compliant.
Holly Clark: Yeah, so if there's no other questions, anyone, if you would like to come off mute right now and ask a quick question, you can. I will begin closing out. Feel free to cut me off if you do come off and I don't notice.
I want to thank everyone for attending today. I really would like to take a moment and thank Harpreet, and Dulce, and Jason for the presentation. It was very informative. I think the attendees got a lot of very useful information out of it. And I know it took a lot of time for you guys to come up with your processes, and everything, and to share that was very beneficial to the group.
We did record today's session, and it is going to be sent to the third-party vendor for remediation. As soon as it's ready-- it usually takes two to three weeks-- we will email everyone who registered with the link to where you can find it on our website. And Harpreet, if the PowerPoint is remediated prior to that, you can email it to TAP, and we will include that in the email. Otherwise, as soon as we do get it, we will update the website; where you find the recording will also be where you find the link for the PowerPoint once it's ready.
So with that, I think that's all we have. I will go ahead and wish everyone a good rest of your day. And to our presenters, thank you again. We could not do this without your guys' help, and knowledge, and sharing of your experiences.
Harpreet Uppal: Thank you, TAP team, for your support. And thank you everyone who attended. I know it was really lengthy. So we appreciate you sticking around. Thank you.
Holly Clark: Yes, and take a minute. I dropped the link for the evaluation in the chat. Please take a moment to fill that out. We'd love to get your feedback. It helps us form ongoing webinars and presentations. All right, everyone, have a great rest of your day. Bye, bye.