OK, good afternoon, everyone. Welcome to this webinar-- developing tools to help your consortium fund data-driven strategies. We're going to get started in a few seconds here as folks enter the room.
Welcome to today's session. Looking forward to learning about these tools this afternoon. And we'll give it about 10, 15 seconds or so for folks to come to the room.
And again, hi, everyone. Welcome to our session-- developing tools to help your consortium fund data-driven strategies. OK, looks like things are leveling off a bit. Dulce, would you mind stopping sharing?
OK, so before we get started with today's presentation, I just wanted to run through a few webinar tips for our session today. So again, welcome to today's session. We're happy that you're here.
OK, so the session is being recorded. All the recorded webinars will appear on the VFairs platform as soon as they are available. They may take a little time, but the session is being recorded, and will be up there as soon as possible.
Actually, the presenters have already provided today's PowerPoint as a resource. So that resource is available for you if you want to go ahead and download it. This is a Zoom webinar, so everyone is going to be muted in today's session.
If you're having trouble hearing, you might just want to check your Audio Settings for the correct input and output. You can do that from your Zoom toolbar at the bottom of the screen, on the far left side. So you can check your Audio Settings, and you can also adjust the volume-- either turn your volume up or down.
So we have both the webinar chat and the webinar Q&A available today. So if you would like to put something into the chat, make sure you go ahead and open up the Chat window. And then you just want to make sure or decide whoever it is that you would like to chat to, so you have two choices-- you can either just chat to the panelists only or you can chat to all panelists and everyone in the room. So make sure that you select the correct audience first and then go ahead and enter your chat.
We're also going to use the Q&A window today for questions for the presenters. So again, the Q&A is also available from your Zoom toolbar at the bottom of your screen. Go ahead and open up the Q&A window, and then you can enter questions into the Q&A box right there. And the presenters, depending on the question, they're either going to just type in their answer back to you, or if they decide they want to answer some of the questions live, they'll go ahead and do so. But if you can get your questions for the presenters to the Q&A, that would be great.
In terms of viewing today's webinar, if you want to adjust the size of the Zoom window-- I'm not sure exactly which settings are available, but look at the top of your Zoom window and open up the View options dropdown. You should be able to adjust from full screen to a smaller size, so you can go ahead and adjust that setting there.
And then at the very end of today's session once we close the room, there is going to be a link to an evaluation form for the session, so please make sure that you complete that evaluation form. We'll send that feedback to the presenters for today's session.
Otherwise, have a great summit, and I hope you're having a great summit. If this is your first session, welcome to the summit. If it's your fifth or sixth session, welcome back. I'm going to stop sharing and turn it over to today's presenters to get started.
Thank you. Thank you so much. All right, let me adjust my screen. I hope you guys can see me or hear me OK.
And really, welcome everyone to Developing Tools to Help Your Consortia Fund Data-Driven Strategies, or CAEP activities. Over the duration of our PowerPoint presentation, we'll be going back and forth between the terms CAEP activities and CAEP strategies as we're adjusting to our state's language request.
So a little bit about the purpose of our presentation today: it's really to cover what has worked, what is currently working, and how to adjust to our consortium's needs for adult education given our current circumstances-- COVID, budget cuts, but also our not-so-new state funding formula with the state CAEP outcomes or metrics as we know them. So with that being said, we're going to cover the historical aspect and an overview, and then get into the nitty-gritty details with my colleagues as well.
So with that being said, we'll get started. I'd like to introduce myself. My name is Janeth Manjarrez. I'm the California Adult Education Program Director for the North Orange County Regional Consortium.
And next we have Dulce Delgadillo, who is our director of NOCE Institutional Research and Effectiveness. Did I say that all right, Dulce?
And then on our team is Harpreet Upal, who's our senior research analyst for NOCRC/NOCE. She helps connect the data with all of our CAEP activities for all of our consortium members. So Harpreet is here with us. She's also the expert.
And Jason Makabali, who is our go-to guru for MIS. He is also on the NOCE Institutional Research and Effectiveness team, and he's always there to give us a hand and support us to make sure that our CAEP strategies or activities are in alignment with MIS outcomes.
We will cover a bit of TOPSpro and CASAS as we get further along, since we have two funded members that are gathering their data with both TOPSpro and MIS. Next slide, please.
So here's a compilation of the agenda we're covering today. Again, it's a little overview-- the historical context of the purpose of the creation of our data-driven tools to best support our consortium activities or strategies for CAEP metrics. And then we'll go into the processes-- again, the template, the proposals, the evaluation plan, and any thoughts and closing remarks-- and we'll have an opportunity to answer your questions.
Next slide. And like I said before, what was the intent? So just a little bit about the North Orange County Regional Consortium-- it is comprised of the members listed here that you see. The way that we function is to come together, not only as adult providers, but as members in our North Orange County community, to help support and bring academic resources and student support services to the adults in our community.
Here's the list, again, of our regional partners. We only have two adult providers in our region-- one is North Orange Continuing Education, or NOCE, and the other is the North Orange County Regional Occupational Center, or North Orange County ROP-- those are both adult education providers.
And the rest are comprised of our K-12 districts, such as Anaheim Unified. We also have other partners: the community college district, Cypress College, Fullerton Joint, Garden Grove, Los Al, the Orange County Department of Education, and Placentia-Yorba Linda. These are our adult providers as well as the members that comprise our executive committee-- our voting members, in this case, for the consortium.
And we actually have six work groups or program areas, comprised of DSS, ESL, our parents/parenting K-12, basic skills, CTE, and-- am I missing one? I think I've said all six.
And we've also created a sub-workgroup that is comprised of student services. Because we wanted to maximize our current student funding formula, as well as navigate our current situation with COVID and budget cuts, we organized student services under a sub-workgroup that we call NOCRC Transition. And that is aligned and combined with all the other work groups in the CAEP program areas that I've already mentioned.
And so we want to make sure we maximize that transitional piece, or student support services, by leveraging each other's resources-- meaning our activities or strategies as well as each other's institutions-- to be able to support and maximize social and equitable support for our adult learners, and of course the academic piece, teaching our students the necessary technical skills to attain a living wage. So I want to make sure that I've covered all the pieces here. Next slide.
So as an operating consortium of adult ed, we're coming from what has worked in the past, which was very needed, and as we're evolving and transforming into a new student funding formula aligned with the CAEP metrics, we're making sure that activities are connected to the data through MIS or TOPSpro so that we can best maximize the funding and the funding cycle of CAEP in this case.
So we took a close look, collectively, at what our current processes are, ensuring they are now connected with the outcomes, and vetting the process to make sure we are serving our students' needs. We came together in collaboration with our NOCE research department to collectively come up with in-house internal and external processes, or a toolkit, that can maximize not only for NOCE as an institution, but for our other partners and our other adult provider-- in this case, ROP.
So we definitely partnered up, teamed up, went through a series of meetings and toolkits, and brainstormed ideas about what is currently working and what we need to evolve to make sure that we are meeting our state needs-- I'm sorry, I was distracted by one of the questions-- our state needs and our local needs, in this case.
So we're going to go over our current tools connecting the CAEP outcomes and our existing decision-making process with CAEP strategies and activities-- the way that we collect data, the way that we implement data in our work groups or CAEP program areas, and the way that we present it to our executive committee for a vote so that implementation becomes execution.
And then again, aligning the strategies, proposals, or activities to a scoring rubric-- which does not necessarily have to be created by a research department. These are ideas you can use even if you do not have a research team. I'm just very fortunate to have an awesome research department and research team that we collaborate with and that definitely looks at all angles to maximize our metrics and, again, the funding cycle, so we can continue to receive the necessary funding to meet student needs.
So with that being said, I'm going to hand over our presentation to my colleague, Dulce Delgadillo, who's going to go into depth and detail about the toolkit and the scoring rubric next. Dulce, you're muted.
Yeah, great, thank you, Janeth. All right, so the first thing-- oh, I hear you, I'll go on with this. OK, so the first thing I'm going to do is go ahead and do a poll. What I actually want to ask all of you on this presentation is: does your consortium have a resource allocation model or process? You can go ahead and join at slido.com with that number, or if you have a QR code reader, you can use that. We'll give it maybe about 30 seconds or a minute.
Dulce, we're still hearing echoes so I don't know.
Janeth, Dulce, try it again.
Hello, there we go. That's way better. OK, does anybody go-- OK, so there we go. OK, so a couple more-- see if we can get a couple more in here. Or maybe we could get it to double digits, to 10. All right, halfway there. OK, a couple more, maybe about 10 more seconds.
OK, so we're looking at about 40% yes, 40% no, 20% I don't know, that's a great question, out of five. Oh, there we go. We got one more. All right, can we get one more? Feel like I'm in the-- there we go. OK, all right, so you don't know-- so let's see.
We just wanted to get a sense of what resource allocation really looks like across consortiums. And so really what we wanted to do-- what the whole goal of developing these processes, establishing these templates, was really to look at data collection and look at data at the forefront in the planning phase. From a research department perspective, we were looking at data on the back end, once things were already implemented-- once services were already rendered to students.
And so in an effort to really improve those processes-- improve our data and ultimately improve the way that we look at data and how our services connect to the outcomes on the LaunchBoard, we really wanted to have data at the forefront of the conversation right at the planning phase.
So one of the first things that we did is develop a data collection manual. This was something that we developed within the research department to really look at: what is all the data collection that's happening across the consortium, and is it consistent?
So it's really important when you're collecting data to have it collected in the same manner. And particularly, this was really driven by the fact that we use CASAS TOPSpro Enterprise, and we wanted to make sure that when we were testing students and students were doing the pre- and post-tests and the entry and update forms, it was consistent.
In addition, we wanted to make sure that the individuals who were out collecting the data were giving the same message to the students about why we're collecting this data and why it's important. And it also served as a messaging tool to inform other consortium members. So within our data collection manual, we were able to include scripts-- so if there were questions or Q&As, the type of information that you should provide students on why this information and data are important to collect.
And we looked at each of our strategies and asked, OK, what does data collection look like at each of our activity or strategy levels? And how can we make sure that it's somewhat consistent across all of these? So that was the starting step-- laying the foundation of these data collection efforts across our consortium.
We also wanted to make sure that we have the CAEP outcomes in mind as we're planning these activities and strategies. So the tool on the right-hand side is an infographic that the research department put together for consortium members to really help facilitate conversation around data and how each of those activities really contributes to each of those outcomes.
So one of the first things we had to do as a research department was build that data literacy capacity within the consortium. So when we say "adults served," what does that mean? What does "one to 11" adults served versus "12 plus" participants really mean? And how do our activities and strategies fall within each of those two buckets? When we talk about progress, what are the metrics that we're particularly looking at, and how do our activities contribute to each of those progress metrics?
Completion, placement into jobs-- for all of these main goal bucket areas, we really wanted to build capacity, a greater understanding, and an opportunity and platform for individuals within the consortium to ask questions, and again, build that capacity within the consortium to be able to understand, to ultimately help with planning, and to ultimately improve our outcomes as a consortium as well.
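To make that headcount distinction concrete, here is a minimal sketch in Python that buckets students into the two groups described above. It interprets the "one to 11 versus 12 plus" framing as instructional contact hours, and the field names and records are illustrative assumptions rather than NOCRC's actual reporting code.

```python
# Minimal sketch (not NOCRC's actual reporting code): bucket students into the
# two CAEP headcount groups discussed above, based on instructional hours.
# The 12-hour cutoff follows the "1 to 11 adults served vs. 12+ participants"
# framing from the presentation; field names are hypothetical.

from typing import Dict, List

def classify_headcount(students: List[Dict]) -> Dict[str, int]:
    """Count students in each headcount bucket by hours of instruction."""
    counts = {"adults_served_1_to_11": 0, "participants_12_plus": 0}
    for student in students:
        hours = student.get("instructional_hours", 0)
        if hours >= 12:
            counts["participants_12_plus"] += 1
        elif hours >= 1:
            counts["adults_served_1_to_11"] += 1
    return counts

# Example usage with made-up records
sample = [
    {"student_id": "A1", "instructional_hours": 4},
    {"student_id": "A2", "instructional_hours": 30},
    {"student_id": "A3", "instructional_hours": 0},
]
print(classify_headcount(sample))
# {'adults_served_1_to_11': 1, 'participants_12_plus': 1}
```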
So I'm just going to address Greg Hill's question really quick. I'm assuming-- when do you determine that an internal evaluation is needed? Do you also include the cost of the evaluation into the allocation? That's a great question.
So we are going to talk a little bit about what our internal evaluation looks like. And just from the start, we were only looking at data collection at this point. Harpreet is actually going to talk about what our evaluation plan was.
And we decided as a research department that we wanted to take on the evaluation and really develop an evaluation report for the consortium. But we understand that sometimes those resources are limited and you may have to outsource that evaluation. So definitely, if evaluation is something that you're interested in, you would want to build a budget for it upfront-- and if you have the budget for it, then that could also be an option for you.
OK, all right, so I'm going to jump right into our strategy proposal template. This template was given to consortium members, and we said, "Hey, if you're interested in funding some type of student service or a proposal using CAEP funds, we're going to give you this template. And we want you to think about these components as you're doing your planning." So I'm going to go over a couple of these criteria and what our thought process really was as we were building these proposal templates for consortium members.
So the first part that we really wanted to dive into was background and significance. So we wanted consortium members to really think about why are you doing, or why are you proposing what you're proposing? Is there a need? Are you trying to close an achievement gap? Are you seeing a gap between completion and job placement?
So starting to think about those questions. And we gave them a little bit of a prompt. We even included questions to consider. So you see here we asked them, what is your target population?
These are all questions that really help us as a research department to determine on the back end, when we pull the data of the students that you were serving, whether you reached the target population that you intended at the planning phase.
Again, was there something that you saw being implemented in another consortium, or did you see in the literature that this type of strategy seems to be promising? We wanted those pieces to be embedded into the proposal process.
The other section we wanted participants to really take into consideration is: is your proposal regionally inclusive? We know that we're not working within one silo-- we're working within a region, and we're trying to leverage partnerships and leverage resources to maximize them.
And so we wanted participants to really take into consideration-- OK, what are the partnerships that currently exist, and how can you leverage them? What are the partnerships that need to be established, and how can we help facilitate that? Or what type of assistance do you need in order to build those bridges or to strengthen those partnerships as you're thinking about these proposals that are going to be implemented to serve students? So this was another piece that we wanted individuals to think about, not just at a local level, but at a regional level.
The other piece was goals, objectives, and milestones. And this really came about because as a research department, when we were going out and talking to strategy activity leaders and trying to understand, OK, what is the data that we need to collect, we need to know first, OK, what are you trying to achieve? What is your goal? What is the objective of what you are trying to implement?
Because if we don't know that, then we don't know what to measure. We don't know what to look for, and ultimately, we don't know what type of data needs to be collected throughout the implementation.
So we really wanted to make sure that consortium members knew exactly what their proposal's goal was, what the overall impact was, what the objectives were, and what things they could potentially measure-- students served, number of counseling appointments, how many students are transitioning from one program to another or from noncredit to for-credit. All of those things we wanted to have in the conversation and in the planning phase up front.
And so we also asked them, how do you intend to achieve these goals? What are the components of your proposal that would help to ultimately achieve them? So again, we asked these because we wanted them to know what those goals are, in order to help us ultimately measure how that progress was going from year to year in terms of the implementation of the proposals.
Now we know OK, why are they doing it, what are the goals, what are the objectives, how are they leveraging resources, and how is this becoming regionally inclusive. Now we want to know, how are you going to do it? And what do you need to do it? And so we wanted to ask them specifically OK, tell us your plan on implementation, what does that look like?
We wanted to ask about timeline because we know that there are a lot of things that impact timeline-- whether it be budget, resources, staffing, or partnerships that may take a little bit longer to build. So we wanted them to go through that thought process as they're thinking about these proposals. We asked them about a timeline for implementation, and we asked about possible barriers and challenges that could potentially impact that timeline and the implementation.
And we wanted them to think about all of the resources that need to be in place for full implementation to take place. This was also very helpful from an evaluation standpoint-- to see, OK, what was said and written down, what was your original intent. And we know that sometimes that doesn't happen in the real world-- other things get thrown at us. We find that once we go into the implementation phase, curveballs get thrown at us, like a COVID situation.
And so we really wanted them to think about those things. And again, going back to the evaluation standpoint, to really look at, OK, this was your intent-- now let's see how it was implemented. And looking at that implementation fidelity piece as part of the evaluation plan, as much as we could, with an understanding of what was planned, what ended up happening, and, when things didn't align, why they didn't align.
OK, now I'm going to hand it off to Jason. And he's going to cover the rest of the criterion for the proposals.
Thank you, Dulce. So the next thing that we ask about is: what is their data collection plan? What we noticed was that we had trouble trying to find out who was actually going to be collecting data, so we built into our proposal asking who would collect the data, where and how it would be stored, and how the reporting would be utilized for strategies and programs.
We also wanted the people who offered proposals to describe any methodology that they might want us to consider when evaluating their proposals, which would give us some insight into any additional data we may need to collect on top of what they stated should be collected.
What we wanted to get out of this was that data shouldn't be an afterthought, data should be built into the proposal itself. In order for proper evaluation, in order for proper outcome measures, in order to make sure that you're actually doing what you say you're doing, you need to have the data up front.
You can't just do all your stuff and then be like, wait a second-- what is it that we actually did?-- and then try to scrape everything up later and figure out what's going on. That just doesn't work, and it makes for bad evaluation practice. And you end up finding gaps and all that stuff in how you're reporting and in whether you can actually even measure anything.
So we wanted to make sure that anyone who was requesting funds would think about the data piece right from the top-- right at the very beginning so that we have something to build off of and move forward with. Next slide, please.
Next, we also wanted to ask those who are requesting funds: how exactly does their strategy or activity align with key metrics, and what outcomes would they achieve? What we saw was that people would just say, we align with everything, we're going to get every transition outcome, we're going to get every single outcome that is imaginable-- yet when you look at how things are actually defined and how things are actually measured, that's not really how it works.
So we wanted them to really consider, based on the data that they would be collecting, how exactly they would be aligning with the CAEP metrics. It should be pretty obvious why we're asking this, because these are your long-term outcomes. This is how our consortium is moving the needle forward. What we're doing is going to be reported to the legislature and will show that adult ed is actually working.
So we want to make sure that we're actually hitting those outcomes, that we're actually collecting all that data, and that everything ties back to us showing that we're doing what we say it is that we want to be doing.
So the next thing that we want to look at is: how scalable is this activity or strategy? If we find that something is working, that's great.
But what if it's only working with a handful of students? How do we expand on it? How do we make it bigger? How do we institutionalize this program? How do we make it long-term sustainable?
And that's what we want to get out of this. We want to make sure that the great work that we are doing can be expanded-- that it can go beyond the scope of your initial pilot phase to a larger audience, into something that everyone can get their hands on.
So those are all of the criteria that we use. But beyond asking for-- next slide, please. Beyond just the proposal, we also ask our constituency to create a logic model along with their proposals. So first things first, let's do a little poll-- do any of you actually use logic models when offering up your proposals, when doing suggestions and strategies, all that kind of stuff? Just go to the Slido window, fill in an answer, and let's see how it goes.
I think that's-- does anyone else want to throw something in there? All right, well, it's quite a tie, I guess. It's good to see that some people are asking, what is a logic model? We can see that the majority probably aren't using one right now-- or I guess about half-- but yeah.
For all of our strategies so far, we actually do go through this allocation process. Everyone has to build a logic model. All of this has been built in from the start. In the process of updating our rubric, it's a cyclical process-- as we find gaps in what we are seeing or examining, we may have to tweak things.
Yes, we are currently doing this annually. Every proposal is an annual proposal for each budget year. But yeah, so what is a logic model? Well, it's basically your flowchart. It's designed to show how you're actually going to implement and how your outcomes are actually going to be achieved.
We can tie this into our proposal. We had logic models beforehand, but the way in which they were done-- the guidance on them wasn't very clear, I guess. So as a result, as we saw, everyone was listing every single outcome, and everyone was listing a lot of activities that didn't really seem to flow.
So what we wanted to do was tie our proposal guidance, our proposal criteria, to more closely resemble the components of the logic model. Our former proposal criteria were looser, more free-form, more open-ended. But with these guiding questions, we can now tie where exactly the proposal criteria fit into the logic model.
In our logic model, our inputs are those questions like: what is the background and significance? How will it be regionally inclusive? Your activities are driven more by your implementation plan. And your outputs are more like what you hope to see out of it quickly-- immediately-- whereas your long-term, overarching goals are your overall goals and objectives.
So in building our proposal criteria, we hoped to mirror the logic model process, in a sense, so that there would be clearer direction on how to get from point A to your end goal-- how to get from the start to the finish. Another way of looking at it is that you can start from your end goals and build your way backwards to see how you can structure yourself to align with your goals at the end.
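To illustrate the mapping just described, here is a minimal sketch in Python of a logic model with those four components and the proposal criteria that feed them. The structure, field names, and example entries are illustrative assumptions, not the consortium's actual template.

```python
# Minimal sketch (illustrative only): representing a strategy proposal's logic
# model with the four components described above, and noting which proposal
# criterion feeds each component. Field names and entries are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    inputs: List[str] = field(default_factory=list)        # background/significance, regional inclusiveness
    activities: List[str] = field(default_factory=list)    # driven by the implementation plan
    outputs: List[str] = field(default_factory=list)       # immediate, countable results
    long_term_outcomes: List[str] = field(default_factory=list)  # overall goals and objectives

transition_support = LogicModel(
    inputs=["Documented gap in noncredit-to-credit transitions", "Partnership with regional colleges"],
    activities=["Dedicated transition counseling appointments", "Application workshops"],
    outputs=["Number of counseling appointments delivered", "Number of students served"],
    long_term_outcomes=["Increase in students transitioning to postsecondary"],
)

for component, items in vars(transition_support).items():
    print(component, "->", items)
```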
We also leave space for the assumptions that are made in proposing these-- like when we propose a logic model, sometimes we're going to assume that there's not going to be a COVID, for example-- and for external factors, like COVID. COVID is a great example in general, just because it throws a lot of wrenches into how your evaluation and your proposals actually end up shaping up.
Oh-- have we seen any challenges around asking this of very small schools and sites? Honestly, that's a great question. Our consortium is actually rather large-- NOCE serves roughly 30,000 adult students a year, and our ROP serves about 400 adult students. So we haven't really had that kind of an issue. But yeah, I could see the challenge with developing something at this scale with a staff of one, in which case--
I think one of the limitations with a staff of one is that you're not sure what's happening everywhere else-- you just know what's happening within your one department. I think that's one of the reasons we also included the regionally inclusive piece, for you to also think about how your piece fits in the bigger puzzle. We try to do that. But that is one of the limitations if we have a smaller partner trying to complete this.
And we do-- to answer some of our attendees' questions-- we've been very lucky in having ongoing, permanent support, such as Harpreet, an analyst, to support all of our members, in this case, to make sure that as a consortium we are walking together.
But we do have very small members, such as Garden Grove Unified, which is a very unique member that currently sits between three consortia in Orange County. And we also have ROP, which is an up-and-growing, up-and-coming sector in adult ed, because they have unlimited capacity.
So that's why we implemented this as a large-scale tool that can be dissected and adapted to whatever is needed, so they don't have to duplicate efforts. Part of why we're sharing this with you is to share our knowledge and our toolkit-- take whatever you think is going to work for your organization, and make sure it's helping you track the necessary knowledge and information for your outcomes.
All right, Jason is this slide yours as well?
Yeah, so lastly, now that we've asked everyone to do all this stuff for us, we have to score it, unfortunately, to make sure that everything is on the up and up and the proposals actually fit the criteria that we are asking of them. Our prior proposal scoring rubric was, I guess, a lot more robust, but what we found was we were scoring proposals on things that we never actually asked them about.
So in our attempts to make sure all of our stuff was aligning a lot more clearly, a lot better, we created this scoring rubric to more closely reflect the proposal criteria themselves. Everything is matched and we make sure everything has an equal point value.
We decided everything was of equal weight. We didn't want to weigh anything more-- we didn't want to weigh anything more heavily than anything else because we value everything. We think every piece of the puzzle is important and as such our executive committee scores it in this manner.
We also make this rubric available to everyone, so that everyone can see exactly how they are being scored. And we just assigned a simple point system. We made it a fillable PDF that would automatically calculate, just to make it as easy as possible for our executive committee, so they could see the points that they awarded plus-- next slide, please-- any comments that they may have regarding the proposals, and just to keep it nice and tidy.
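As a rough illustration of the kind of equal-weight point calculation the fillable PDF performs, here is a minimal sketch in Python. The criterion names follow the proposal sections described earlier in the session, but the five-point maximum per criterion is an assumption, not the consortium's actual scale.

```python
# Minimal sketch (illustrative only): an equal-weight scoring rubric where each
# proposal criterion carries the same maximum point value, mirroring the
# auto-calculating fillable PDF described above. The 5-point maximum is an
# assumption, not the consortium's actual scale.

CRITERIA = [
    "Background and significance",
    "Regional inclusiveness",
    "Goals, objectives, and milestones",
    "Implementation plan and timeline",
    "Data collection plan",
    "Alignment with CAEP metrics",
    "Scalability",
]
MAX_POINTS_PER_CRITERION = 5  # equal weight for every criterion

def score_proposal(scores: dict) -> int:
    """Sum one reviewer's scores, validating that each score is within range."""
    total = 0
    for criterion in CRITERIA:
        points = scores.get(criterion, 0)
        if not 0 <= points <= MAX_POINTS_PER_CRITERION:
            raise ValueError(f"Score for '{criterion}' must be 0-{MAX_POINTS_PER_CRITERION}")
        total += points
    return total

example_scores = {criterion: 4 for criterion in CRITERIA}
print(score_proposal(example_scores), "out of", len(CRITERIA) * MAX_POINTS_PER_CRITERION)
```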
And now Harpreet will go into more of our evaluation plan-- the actual evaluation report that we've created after doing all of our evaluation set up.
All right, thank you, Jason. So we have another poll. If you can take a few seconds to complete this, please, before I go into the next slides.
All right, thank you to those who participated. So as you can see from our findings here, we actually have a majority saying that they don't have an evaluation plan to assess their consortium activities, or they're not sure, and about 11% saying yes.
So that in itself is telling. And as Janeth was saying earlier, the fact that our consortium does have this collaboration with the research team and allocated funds to have an analyst on board-- I think that helps a lot with all the work that we have been able to produce, given those opportunities for collaboration between our consortium and the research team at NOCE.
So I just wanted to basically tie together everything that Janeth, Dulce, and Jason talked about, in terms of why we asked our consortium members to provide us all that information in detail and how it all leads to our evaluation plan. And so, why do we evaluate? We evaluate to understand how well our activities are achieving the goals and objectives outlined at the beginning, in the developmental phase, and also to help determine what works well and what could be improved.
So initially when we started working on this-- sorry, this is just a snapshot, and in the following slides I will go over all the different pieces of this evaluation plan. This was built off earlier work: before our internal research team came on board and took over all this work, we had external evaluators working with NOCRC and creating an evaluation framework for us. That was the foundation, and then we took that work and elaborated on it as we got new activities and new strategies that were being funded under NOCRC.
So this is a snapshot of all the strategies that were being funded under NOCRC using the CAEP funds. And we mapped out the evaluation framework: what are the strategies, what are the outputs, the outcomes, and the measure of change that we want to see depending on the type of services being delivered to our adult ed students, aligning our strategies to the CAEP metrics and outcomes, and then figuring out where the data is going to be collected.
And this was just for us internally to map out, so that we know what data we need to collect throughout the academic year and so that we can report out on the actual outcomes in the end-of-year report.
So why did we create this document? We created it for three reasons. One was really to get a better understanding of all the strategies being implemented at NOCRC, because every strategy provider was submitting proposals for each of their specific strategies under each of the program areas, and this was where we could put everything together into one comprehensive document.
And two, it was used as a planning tool for us to determine what data needs to be collected for evaluation purposes. And the third was really to help provide clarity to our consortium members about what CAEP outcomes they are trying to achieve and really align to.
So this is the point Jason was making-- to really showcase to our consortium members that if they're providing, for example, a transitional service, these are the CAEP metrics that they might be achieving, and not necessarily a completion. And I will talk further about this in the next slides. So we created this Excel workbook based on the information that was provided in the strategy proposals and the logic models that Jason mentioned earlier.
So after we initially mapped this out, we actually set meetings with our workgroup members and our program leads to get a better understanding of their goals and objectives for these strategies, and to really explain to them how close their strategies are coming to achieving those outcomes and the specific CAEP metrics. Next slide, please.
So here, this particular slide captures the first portion of that big mapping we just had in the previous slide. Here, we first mapped out the outputs, outcomes, and outcome indicators. As you know, outputs are really just mapping out what services were being delivered to the students.
Again, all of this information is being captured by strategy providers in the logic models that Jason mentioned-- when they're mapping out what their activities and resources are, what their outputs are, their immediate outcomes, intermediate outcomes, and long-term outcomes, and how they all align to CAEP metrics. So we took that information and incorporated it into this evaluation plan.
And then the second part is the outcomes. Here we wanted to indicate what the short-term outcomes are and what the intermediate and long-term outcomes are. So what are the changes that our strategy providers-- whether it's an instructional strategy or a service strategy-- are going to see as a result of the work that they have done, the activities and interventions? What is the result?
And if the service is being delivered to the students-- here, this is just an example of a transition service to help students transition into our high school diploma program-- the outcome shows the number of students that did meet that outcome.
And the third is an outcome indicator-- those are the specific, measurable pieces of information that you collect to keep track of whether you are meeting the goals and objectives that you outlined at the beginning. So if the outcome for us-- for example, this is just a sample strategy-- is to increase student transition to a high school diploma, do we have an indicator? Do we have data that shows the number of students who were actually provided this service and ended up actually transitioning? And so that's how we measure change.
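As a minimal sketch of the kind of outcome indicator described here-- the share of students who received the transition service and actually enrolled in the high school diploma program-- the following Python snippet uses hypothetical record sets; it is not the consortium's actual query.

```python
# Minimal sketch (illustrative only): an outcome indicator computed as the share
# of students who received a transition service and subsequently enrolled in the
# high school diploma program. Record layouts and IDs are hypothetical.

served = {"S001", "S002", "S003", "S004"}          # students who received the transition service
hs_diploma_enrollees = {"S002", "S004", "S900"}    # students enrolled in the HS diploma program

transitioned = served & hs_diploma_enrollees
indicator = len(transitioned) / len(served) if served else 0.0

print(f"{len(transitioned)} of {len(served)} served students transitioned "
      f"({indicator:.0%}) to the high school diploma program")
```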
The next slide, please, Dulce. Thank you. And this is the second section. So this is what we did in our 2019/20 academic year, where we really mapped out how our strategies are aligning to the CAEP outcomes. This is based on the calculations presented in the Adult Education LaunchBoard data dictionary.
So we actually printed the whole dictionary. We went through every single calculation as it related to these CAEP outcomes, and we used these checkboxes to really help our strategy providers or program leads identify which CAEP metrics they are meeting based on the strategies they're providing, given the calculations that are in the LaunchBoard. Some strategies are very specific-- for example, transition to postsecondary. Its calculation is based on the students who are in your basic skills and ESL programs.
So we wanted to make sure our CTE service providers know that-- not that they shouldn't-- but if a student from a CTE program is transitioning to credit college, that outcome will not be captured in the LaunchBoard, just based on how the calculations work in the LaunchBoard. So that's what this mapping out was for.
And then the third piece was really for internal purposes. This was where we were mapping out the data that needs to be collected for evaluation purposes: who will be collecting the data, who are we collecting the data from, is it enrollment data, does it live in our student information system, which is Banner, or is it being collected through TOPSpro-- like our pre- and post-testing for our ESL and basic skills students through CASAS TOPSpro.
And are we collecting satisfaction data from surveys and focus groups, and is that data coming from internal sources? We do have some activities within our consortium where data is being collected in internal documents such as Excel sheets. So where does that data live? We were mapping out all of that, along with the frequency of data collection-- does it make sense to collect it every term, or is it an annual data collection?
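A minimal sketch of that internal mapping, expressed in Python, might look like the following; the specific strategies, systems, collectors, and frequencies are illustrative stand-ins for the consortium's actual workbook.

```python
# Minimal sketch (illustrative only): mapping each strategy to where its data
# lives, who collects it, and how often, mirroring the internal evaluation-plan
# workbook described above. Entries are hypothetical examples.

data_collection_plan = [
    {"strategy": "ESL pre/post testing",      "source": "CASAS TOPSpro", "collector": "Instructional staff", "frequency": "Every term"},
    {"strategy": "Transition counseling",     "source": "Banner (SIS)",  "collector": "Counseling team",     "frequency": "Every term"},
    {"strategy": "Hands-on support services", "source": "Excel sheets",  "collector": "Strategy providers",  "frequency": "Annual"},
]

for row in data_collection_plan:
    print(f"{row['strategy']:28} | {row['source']:14} | {row['collector']:20} | {row['frequency']}")
```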
So we did that for all of our program areas-- the CAEP program areas that Janeth had mentioned at the beginning and all the strategies that are funded under each of the program area. Next slide, please.
And here-- I don't have the full report, but I wanted to talk about our evaluation report. All of this work led to the development of our 2019/20 evaluation report.
So once we had a better understanding of all the strategies and activities, their objectives, and the data collection process, we sat down at the end of the academic year with our program leads and strategy providers to identify any data that we might be missing, based on all the work that we had done initially in the evaluation plan and mapping. And then once we were able to figure out all the data pieces, we gathered all the data at the end of the academic year to develop this evaluation report.
And because this is our first time doing it, it really provides baseline data on what the outcomes are for our students being served under CAEP, resulting directly from the services that they're receiving through NOCRC. This was also for us to measure the extent to which NOCRC strategies and activities are achieving those outcomes-- those goals that were mapped initially in the strategy proposals.
And another internal purpose for us was really to understand: are there any gaps in the data? Based on the work that we have done so far, we have a better understanding of what data we collected in the year, and is it really answering the questions that we wanted to answer? Is it really measuring the outcomes as it's supposed to? Where are those gaps? And then going back and having those conversations-- should we have collected data in a certain way, or collected more data, or collected student voices?
So this was not only an end-of-the-year outcome report or a summary report-- this is really for us to ask, how can we improve our processes? All right, next slide, please.
And here, this is just a brief overview of our timeline, so how long this process took.
We actually started this conversation in January 2019, when we developed that data collection manual. That was really because the research office was going in and supporting our strategy providers in collecting data-- whether it was just getting our students enrolled into NOCE, or going into offsite locations and collecting that data.
So we understood the need in the field to have a manual with scripts that our data collectors could read to students, to show them why we are collecting certain data and why this data is important. And so we developed that, and it's a living document, again, that's being constantly updated based on changes in the implementation phases of the strategies.
And then June 2019 was when we started mapping out-- because at that time we knew which strategies were proposed for 2019/20-- what the data collection process would look like for those strategies and activities, and how it all relates to the CAEP outcomes and metrics.
And the bulk of our work was actually from October of last year to October of this year, because we had ongoing data meetings with our consortium work groups and program leads-- at the beginning of the implementation phase, so we could better understand the strategies, and then at the end of the academic year, to indicate what data we have, what data we need from them if we didn't capture it, and whether we're able to assess or measure certain things depending on the data we have access to.
And then right now, October 2020, we have just completed our CAEP NOCRC evaluation report. We think it's easy to read and very comprehensive. We did try to lay out every single strategy, the type of data we collected for each strategy or activity, or the lack of data, or the limitations to our data-- depending on when COVID hit and how it impacted our positive attendance data collection-- and then how it all impacted how we are evaluating our strategies. And then we mapped out recommendations based on all the lessons that we have learned so far.
And again, this is just a baseline so we are hoping in future years, once we collect more data and improve on our processes, we'll have a comprehensive report out that shows a measure of change from previous years moving forward. Next slide, please.
So these are really just additional things that we have done. Here I do want to speak again to this partnership, this collaboration, between research and the consortium, because NOCE did some presentations in collaboration with WestEd at the regional level and in the state. And we realized, when we were presenting on our evaluation plan, that not every consortium is fortunate enough to have this number of people working on data, or to have a designated researcher or team that looks at the data.
So I do understand the limitations. But we feel that these tools can still be incorporated. They may not be incorporated at a larger scale, but there are things that can be utilized from what we have used and what worked for us, even at a smaller-scale consortium.
So that data collection manual that Dulce was talking about, and that I shared-- that's actually on the caladulted.org website. Once we had created it, we did share it with the state. So it is on the website if you want to take a look and see what it has to offer, and whether you can use any information from it for your purposes. And again, that was really just a guiding tool for our consortium members on all the data collection efforts.
And then we created a service data collection template. As I mentioned earlier, some of our data collection happens on Excel sheets. It's not enrollment data, so it's not going into Banner. And it's not service data in the sense of being provided by our counselors, so it's not going through that source and it's not going into Banner. We have some services that provide hands-on support to students, but that data only lives on Excel sheets.
So we created this template that we thought would be useful to highlight or illustrate for our members some of the data pieces that they need to capture, to make sure that we in research can track our students and actually report outcomes for those students.
And then the presentations that we did for our consortium-- we hosted several presentations, and this was really to break things down for our consortium members on the processes: how data goes from where they provide services and collect it, to Banner, to MIS, and how it gets reported and presented on the LaunchBoard.
And then we also wanted to build some data literacy for our consortium members on how all of this is related-- how the single service that they're providing contributes to our bigger outcomes and how it relates to bigger things at the state level. And then lastly, Dulce, if you can go to the next slide, please.
So lessons learned. So what did we learn through this whole process? So why did we do this? This was really because we wanted to use evaluation as a strategic planning tool.
So as Dulce and Jason mentioned, we wanted our consortium members to start thinking of ways to measure the goals and objectives as they're setting them up at the beginning in the developmental phase. So knowing what you want to do is essential to knowing what you want to evaluate.
So we wanted them to be very clear: what are their objectives, how can you measure them, and what data will we need to measure effectiveness? And then, as Jason mentioned, data shouldn't be an afterthought-- so talking about the data at the forefront, in the developmental phases.
And then we understood the piece about technical assistance and building that data literacy, and that's why we hosted several presentations for our members so they get familiar with data and they understand data better. And they're able to use it and apply it in the work that they do in the field.
And that was all for me. And I'll see if there are any questions. But Janeth, if you would like to wrap this all up please.
Harpreet, I just want to say, well, that evaluation report is amazing-- and not just because it's coming from my colleagues, truly. In my role, I have to make sure that all of our members understand where we're going, where we're coming from, their needs, their requests. And truly, the report encompasses, like Harpreet mentioned, not just the needs that we have, but the work that we're doing.
I know they've even included success stories in the report for each program area. I think part of this came from just a compilation of our needs as a consortium. So we knew that what was working in the past was working. We know we have content experts in our different program areas and passionate members who are not only advocating, but implementing this work and these resources and services for students.
But as the CAEP outcomes came from the state down to our region, we realized that we didn't have a very descriptive or step-by-step process. And to make sure that we're maximizing those funds-- again, it's a funding cycle-- we had to strategize and learn how to best work our strategies so we can continue to receive the funding that is necessary, again, to continue to provide those services for our students.
So that's where the idea emerged from-- maximizing, aligning, and reshaping our toolkit to continue to provide services to our adult learners in our region. And as the team mentioned, they went ahead and used it as a sample for our colleagues at a statewide level. So again, our research team can email you these toolkits as well, and they are available on the caladulted.org website.
And we are all in this together. We know the challenges that have emerged, not only due to the student funding formula, but also our current situation with COVID and our budget cuts. So we're trying our best to help ourselves in our region, but also to share this knowledge-- use it as you see fit or as you need it. Or it may just spark an idea for something that you really need to create for your own consortium.
So with that being said, we're going to open up to questions and hope that we can answer them. And if we can't, we'll probably take down your information and address it at a later time.
Again, we want to thank everyone for joining us. And I want to thank my partner in this case, the NOCE research team-- good team-- for doing such great work, and for really hearing our consortium members, listening to their needs and their challenges, and putting it all together in a very comprehensive manner. So yeah, we're going to open up the chat room as the next step to be able to address your questions.
So we have a question and they're saying, who spearheaded this effort when it was first designed? So oh--
Go ahead, Dulce.
--so one of the things that we want to do is give a shout-out to Greg from the onset, because a couple of years ago there was an intent to do an evaluation, and we were just launching-- not launching, but we were still wrapping our heads around exactly what was being implemented, what these processes were, and what data was being collected. Data collection was not centralized, and the individuals who collected data were not necessarily part of the research team.
So a lot of things had to happen-- a lot of other puzzle pieces kind of had to fall into place for this to actually start to launch. So I definitely-- we want to recognize the work that Greg had done. And we really used it as a foundation to be able to see, OK, where do we need to look, how do we start having these conversations.
And then-- so I'll give a little bit of history-- there was a little bit of transition in terms of personnel with the consortium. At that point, data collectors moved into the research department and then we started having these conversations of OK, what does data collection look like? How can we make it consistent?
And I think it was an ongoing conversation between consortium members and the research department, and then in collaboration we were like, OK, we've got to develop these tools. OK, now we've got to ask about goals and objectives. Now we've got to lay out all of these things that are happening, because there are all these puzzle pieces. How can we display it in a visual manner so everybody can see what their role is in all of this?
So I don't know if I could give it like one particular individual that spearheaded this. But it was really a collaborative effort and ongoing conversations. As you could see just from our timeline, this was more than a year in the making and it's a continuous evolving effort that we try to embrace as much as we can.
Yeah, I agree. And just to let you know, we have a very collaborative consortium. We have great members who understand that we need to continue evolving, and who were really open-minded in absorbing the information about the CAEP outcomes and how we were moving to a student-based funding formula. They understood that data was going to be much more integrated and needed for us to continue to receive the funding that is necessary for student success.
And like I said, we definitely want to thank Greg and Valentino Patel for embedding this mission, but also all of our members who have been very open-minded and understand that we need to continue to evolve to meet our state needs-- or our new state legislative mandates that we get every day-- and our region's needs, because things do change based on the economy and our own local adult community's needs.
So we have another question, from Kristin, who says: curious to know how buy-in was achieved. Does anybody-- Harpreet, Jason, or Janeth-- do you want to take that?
So I think it goes back to-- and I want to make sure I answer that question-- the buy-in was to ensure that our consortium's activities or strategies, in this case, were aligned-- were now going to be in alignment with our CAEP metrics. So that was the pivotal topic of conversation for our change and our need for development and growth.
And it was presented to our executive committee-- our consortium partnered up with the research team, and with that support we shared all of our concerns and our next steps, and they were very involved. And we were able to provide updates on a monthly basis, or quarterly basis, or as needed, to ensure that everybody was on the same page and understood the need for this outcome-- for this toolkit, I should say.
So that's how we got the buy-in from the consortium, and got approval for a 100% CAEP research analyst designee to help us continue to support not only NOCE, obviously, as one of the providers, but ROP and all of our members that are invested in the success of our adult learners in our community. I hope that I answered your question, Kristin.
Yeah, and I think it also facilitates conversations and gets people thinking. Another thing that we talked a lot about was the duplication of efforts-- how can we leverage resources and make sure that students aren't receiving the same service multiple times, which can confuse them so that maybe they don't know where to go.
So all of this really led to meaningful conversation-- continues to lead to meaningful conversation, and continues to help facilitate these types of trainings that research can help with consortium members around understanding metrics, understanding data, understanding just all of these data pieces that kind of play a role in the planning and implementation of these activities.
And I do want to add, Kristin, that we did do some bribing. We had ideas such as "data and donuts," just because we have various members who understand data very well and others who don't-- it's not really in their professional world. And so we wanted to make sure that we collectively provided the necessary information to continue to get the buy-in. So definitely, food helped. In this case, donuts.
Any other questions?
So if I may, presenters-- yes if anybody does have any final questions, go ahead and put them in the chat. I'll just take a moment to remind everyone that at 2:30, we do have a series of networking sessions for all of the CAEP conference participants. These are actually going to be Zoom meetings. So you'll be able to actually speak with your colleagues and turn on your video.
So we have those coming up at 2:30. Right now there are six networking sessions that are set up, so some of you may actually want to continue this conversation in one of those networking sessions. We have managing and leading your consortium, determining agency effectiveness through accountability, balancing academics and personal transition, social emotional learning for all, student engagement, and equity.
So again, those all start at 2:30. They're going to be scheduled as Zoom meetings for about 90 minutes or so, so you will actually be able to see each other and talk with each other face-to-face in a virtual setting. Those are set up for 2:30. So again, if you have any final questions for the group, please go ahead and put them in the chat. And I'll turn off my mic for a second here.
Thank you. And here's our contact information. Again, if you think of questions later-- I mean, after this presentation-- please feel free to reach out to any of us with your questions, concerns, or ideas. We're here to learn from one another, and we're here to help each other as well.
So we want to thank you for your time and for listening to us. We hope that some of this information helped you.
OK, presenters I don't see any further questions. So we'll finish a little early. Thank you so much for your presentation. This is really great stuff.
And also just to remind our folks that the slides today are also available back on the VFairs site. They're available right now for download so just navigate back to the VFairs site, and you can get a copy of the slides from there.
We'll close up the session a little bit early. Have a great afternoon, everyone. And when the session closes, there is an evaluation form. So if you could, please fill that out so we can provide feedback for our friends. We just had a fantastic presentation. Thank you so much, everyone.
Thank you.
Thank you.
Thank you. Have a good one, everyone.