Dulce Delgadillo: You are-- oh, there we go. And we are recording. This is the Professional Learning Forum for CAEP Program Evaluation 101 provided to you by the CAEP TAP team at North Orange Continuing Education. We are very excited to get our morning kicked off with all of you today. And we will get started in the next one or two minutes while people are trickling in.
Lisa Mednick Takami: Welcome everyone. Good morning. So pleased that you could join us on this Friday. We are expecting quite a few people. We're so pleased at the response to this important topic for CAEP, and we are going to get ourselves started just as soon as we let in a couple more people.
We'll be going through our introductions, and then getting into our formal content, a wonderful panel, and then an opportunity for you all to ask questions of the panel, of the CC TAP team, and so on.
Dulce Delgadillo: Great. Thank you, Lisa. So let's go ahead and get started everyone. Like I mentioned earlier, this is the Professional Learning Forum for CAEP Program Evaluation 101 provided to you by your CAEP TAP team here at North Orange Continuing Education.
Thank you, again, for spending your Friday morning with us. We have a jam-packed, evaluation-framed agenda for you, hearing voices from the field and covering some evaluation 101. So let's go ahead and get started.
My name is Dulce Delgadillo. I am the Director of Institutional Research here at North Orange Continuing Education and part of the awesome team at CC TAP housed in North Orange Continuing Education.
For all of you in the chat, we just want to get to know who our audience is. So if you could introduce yourself in the chat, indicate how you define program evaluation.
We're really interested in how program evaluation is defined by individuals. I know from hearing it and getting familiar with it, specifically over the last couple of weeks, that people define it very differently.
So we encourage you to put in the chat your name and how you define program evaluation. And then we really want to find out how you would rate your level of program evaluation expertise, however you define program evaluation, right?
Do you see yourself as a novice at a 1, or do you see yourself as an expert at a 5? So if you see yourself as a novice, go ahead and put your name, how you define program evaluation, and then a 1.
If you find yourself really in that expert realm, you know you've been doing this work for quite some time and you feel very comfortable in the land of evaluation, go ahead and put a 5 on there. But we're really trying to get a good gauge as to who is in our audience and how we should frame our evaluation 101.
All right, great. I love seeing this-- we've got a great, diverse group in our audience here. I see some north, I see some south, I see some central. This is great. And I see we have some individuals in the novice realm, and we have some individuals at a 4. That's great. So they're bringing in some of their expertise.
I see a definition here of student success based on CAEP and WIOA data points, employment, earnings. Great. We're definitely going to touch on how all of this is framed, not just as program evaluation 101, but within the scope of CAEP.
For anybody who's just joining us right now, we're just getting started. If you can, just put in the chat your name and how you define program evaluation. And then on a scale from 1 to 5-- of course, we had to use a Likert scale because we are researchers. We had to make sure.
So on a scale from 1 to 5, how do you rate your level of program evaluation expertise, 1 being a novice, 5 being an expert? Great. Love seeing all the responses. All right, and I'm going to go ahead and hand it over back to Lisa.
Lisa Mednick Takami: Thank you, Dulce. Good morning, everyone. Welcome to CAEP 101 Evaluation. I'm Dr. Lisa Mednick Takami. I have the honor to serve as CC TAP's special project director.
I'd like to acknowledge the CC TAP team online this morning: Jaspinder Uppal, Chandni Ajanel, Pragyee Mool, Andy Pham, and Jason Makabali. And I would also like to welcome our colleague, Harpreet Uppal, who will be serving as a panelist this morning.
We would further like to welcome our leadership from the Chancellor's Office, Mayra Diaz, and our state leadership from the California Department of Education. I saw Dr. Carolyn Zachry. I'm not sure if Neil Kelly is on with us, and Diana Batista. But if they are here, welcome, and welcome to March, everyone.
Here is our agenda for this morning. We are going to briefly review the objectives with which we're framing this program evaluation session. Then we'll be presenting a program evaluation overview grounded in the CAEP requirements around this.
We will have a panel discussion, a Q&A session, and then some closing activities. If you have questions along the way, please do put them in the Q&A. Our team will be monitoring them as we move through this session. Next slide, please.
And our objectives for this morning are first to review the CAEP program evaluation context and deliverables, distinguish data-informed from data-driven decision-making, and demonstrate CAEP evaluation strategies and processes based on the CAEP statewide priority for evaluation.
And then, of course, we will be ending by responding to CAEP practitioner program evaluation questions. I'm now going to pass it to my colleague, Chandni Ajanel, who'll go over some brief housekeeping items.
Chandni Ajanel: Hello. We wanted to let you know that this meeting is being recorded. The recording and this PowerPoint are going to be released on the Cal Adult Ed website. Our previous recordings from past webinars are up already. If you have a question, please use the Q&A function so it doesn't get lost in the chat.
During our Q&A and discussion section, you will have a chance to unmute yourself to ask a question if you'd like. Please complete the CC TAP survey at the end. We really value your feedback, and we use it to improve our webinars.
Lisa Mednick Takami: Thank you, Chandni. So what is our state priority around program evaluation? We're really grounding our presentation this morning in this requirement. So we are required to conduct ongoing assessment of programs, which is at the core of building stronger and relevant consortia and agencies.
Programs aligned to this area could focus on using data to inform consortia annual and three-year planning, programming and instruction, evaluation design, or engaging stakeholders in the evaluation process, those two big points, the annual plan and the three-year plan.
But as well, as we're going to hear from our panel this morning, how these things inform ongoing planning at the consortium level. And now I'll turn it back to Dulce for the heart of our content, and then we'll get into our panel. Dulce?
Dulce Delgadillo: Great. Thank you, Lisa. All right, so let's go ahead and get started on, what is evaluation? We're going to cover just basics of evaluation 101. So really, when we look at evaluation, it's really a tool that enables organizations to define and measure results to see if they're creating impact, right?
So it's really a systematic process of collecting and analyzing data to determine: have you reached the goals and the objectives of the program that you are implementing, right?
And the intent, from an implementation or a programmatic sense, is: how can we use evaluation findings to make decisions about program refinement and adjustment to ensure that we are supporting our students in a successful manner?
So really, somebody who is a heavy hitter in the land of evaluation and education is Thomas Guskey, who's at the University of Kentucky. And he really couples evaluation with its impact on policy. And his views on evaluation are really on determining the merit, the worth, and value of things and their significance within the educational setting, right?
So this involves collecting and analyzing all of that information to answer questions, but also to have it impact policy and eventually yield tangible, actionable items that can be used in institutions, in our ecosystems and policies, and most importantly, in the classroom to help our students, right?
So Guskey really grounds his program evaluation in this holistic approach of understanding all of the context that surrounds the data you're collecting and in which these programs are functioning. And we're going to get a little bit into that in data-driven versus data-informed.
But ultimately, his biggest contribution is making sure that we are measuring what we intend to measure. He's very much known for assessments and for developing tools so that what you're measuring, and the data, are actually tied to the objectives and to what you intend to do with the programs you're executing on the ground.
Let me get to my next slide. So when we look at this as consortia, how are we supposed to do this? What are our guardrails as we're trying to evaluate our programs? We really anchor on Ed Code and on how the legislative priorities have informed how we should be doing this.
So first of all, it's really important to recognize that we need to do this in a systematic manner, and that we're evaluating all pieces of the consortium based on what the need is, what is effective, and what we're executing.
And we can do that in these platforms through our three-year plan and annual planning, by looking at our data and other resources. But what CAEP is specifically calling out to evaluate is those four areas.
So looking at program needs: are you meeting the needs of the community? So identifying some data points around that. Evaluating levels and types of services, so looking at enrollment and student outcomes. Then the effectiveness of funds provided to members, so cost-effectiveness metrics.
And then evaluation of member effectiveness. And so this is really just the overall effectiveness of the relationships that are happening. So you can see right off the bat, there are many layers of data that you should be looking at as you're developing your evaluation framework around your consortium.
All right, so a couple of things on overall messaging, on how you can take these evaluation tools back to your consortium: how can evaluation help you? So why am I going to participate in this? What am I going to get out of it?
And so really, it's around these points that you can leverage buy-in and try to gain buy-in across your consortia. So it's going to help you understand and improve your programs right off the bat. You don't know what the impact is if you don't see what that data looks like. So right off the bat, it's going to help you understand that.
It's also going to help you examine the theories behind your program. What are the objectives? What is the intent of your program? Is it to build soft skills? Is it for students to get a job? Is it for them to have a pathway? What is it? And so really narrowing it down to make sure that your evaluation metrics align to that.
It also helps motivate people to tell their story, right? We want to show the great work that's happening in the field. Help us tell your story. And it's a really good motivator to say, we're not here to judge you-- that's my next slide-- but we are here to help you, to help you use this data to strengthen your programs and tell your story.
It also ensures that the program is doing what it's proposed to do. So again, going back to, what is the intent and the objective? And the other piece is it supports financial efforts. So we know that a lot of our reporting is tied to these metrics at a variety of levels. We saw that in the previous slide.
We should be collecting data on program effectiveness, member effectiveness, and program execution. And so all of those things are really helping us identify: are we being good fiscal agents with the money that's being invested into these programs?
What is evaluation not? So again, these are tools that you can take back. It's not a test or a punishment. We're not coming in to audit-- it can feel very auditorish, but it is not. It's really a conversation, and it's going into it with an inquiry mindset.
It's not something you do just for show. We can develop all these reports and we can tell these great stories, but really go into it with intent: what is the purpose? Why are we doing this? Who is our audience? What is the end goal?
It's not a scientific research project, as much as it might look like one. And it's not a one-time, that's-the-only-thing-we're-going-to-do activity. Oops, sorry. Let me do this.
But it's about a continuous cycle that we are trying to implement to improve our programs. And you'll see that once I go into the framework of evaluation.
And tying it back again to that, it is not a one-off occasional activity that we're doing just once. This is continuous improvement that we are doing over and over again because our environments and our students and everything around us is continuously evolving.
And so it is not a one-off that we do once and learn from once. We need to figure out how we do it and then use that information to help improve our programs.
So in regard to the framework, we have guidelines from Ed Code and from state priorities, but how do we actually do this? We have guidelines on what we should be measuring, but how are we going to start this framework?
So we go to a basic framework of evaluation, and notice it's not linear. It's cyclical. And so it really fosters continuous improvement and accountability within your programs and your projects.
The intent of this is to ensure that your initiatives and your programs are effectively meeting their objectives and responding to the needs of your students, of your stakeholders, of your community, of the workforce, et cetera. So we're going to go through each of these pieces and we're going to start with using.
So really, using is where you're going to start that evaluation framework of, how are you going to use these findings, who will use these findings, how will they be used, and how are you going to tie these findings to informed decision-making? So it's really starting that conversation as to what is the purpose, why are we doing this to begin with?
Next, you're going into your planning phase. So this is setting your objectives and it's setting and discussing your research questions. Going back to grad school, we are talking about those critical research questions that we are trying to answer.
Not that we're trying to answer all of them, but it's going to really focus in on what we should be focusing on and what data we should be collecting to answer the questions that we have.
Whether it's student services, outcomes, or how our staff are feeling-- that's another component. What is the environment that they're working in? So again, what are those research questions?
Asking is about finding out what the answers to those questions are. So once you have those questions and a framework for what you're trying to answer, then you're at the asking phase.
And it's really determining, who are you asking these questions of? Are you trying to get answers from students, from faculty, from CAEP staff, from CAEP partners, from community partners, and really gathering all of that information?
So you can think of-- for my researchers out in the group, asking is really that data collection phase, whereas tracking is really your data analysis phase. So tracking is making sure that you are analyzing and identifying those trends across all of those stakeholders and all of the data that you've gathered, and it's trying to draw meaningful insights from this.
At this phase, you're really stressing the importance of how you're interpreting the data within the context of the program's objectives, within the context of the needs of the stakeholders, and within the external environment in which these programs are operating.
And so then we get to the learning phase. The learning phase is really where actionable items come in. It's taking all of this and translating your findings into actionable insights that can be applicable in the classroom, in policy, in our ecosystems, in our communities.
This involves identifying your lessons learned, your best practices, and your areas for improvement. And so really, this is the phase where you close the feedback loop, taking those insights and putting them back into the using phase of that cycle.
And that's where you're really closing the loop: we found out all of this from answering these questions, and now we have more questions, and now we're going to answer those questions in turn.
And again, you're continuously doing this as you're evolving your programs and the services that you're providing. And this is going to help you refine, improve, and scale your programs.
So really, at the planning and asking phase, you're going to want to clarify what you want to know. So you're going to be asking questions such as these. What do you want to learn about your program? What are you trying to achieve with this particular activity?
How do you know if your program is successful? Are you looking for an increase in the number of students you're serving? Are you looking for an increase in transitions? And then where are you going to go get your data?
One of the pieces that you're going to really be talking about in your planning phase and most likely in your asking phase is methodology. Are you going to be doing surveys? Are you going to be looking at just documentation of how the programs have been implemented? Are you going to talk to staff? Are you going to talk to employers?
So all of these pieces are going to be puzzle pieces to put together your evaluation as you look at your consortium as a whole or even just one program.
So all of this we're working with data, we're getting it, we're trying to get all of this data to help us make decisions and tell the story and all these pieces.
And I want to also end with making sure that we don't let data be your dictator. Don't let data be your dictator. Data is a really important component of it, but it shouldn't dictate everything because of the contextual pieces that I have outlined in the previous slides.
So a data-driven mindset can sometimes really lead to viewing the data as kind of that sole determinant in the decision-making process, and it's going to overshadow all of these other critical factors that may and should be included in this decision-making process.
Also, if you're data-driven, you really put yourself at risk of overreliance on data, and you're going to potentially ignore context, you're going to potentially ignore human experience, and you're going to potentially ignore just basic realities of the environment-- a.k.a. a pandemic, a.k.a. AI, a.k.a. the state budget. So all of those pieces.
When you switch that and really go into data-informed mindset, this really encourages more of a holistic approach to decision-making. It integrates data into the decision-making process, but it still has contextual understanding and there's ethical considerations taken into account as well.
This approach is really looking at the importance of combining the data with the broader insights of what you have learned, and also with the contextual, environmental pieces that you need to take into consideration in the decision-making that comes hand in hand with that.
It's also this idea that we are going to continuously gather data and we're going to continuously learn, and it feeds back into that kind of commitment of continual learning.
So before we take it to our amazing panelists, this is a call to action. A call to action for all of you as you engage in your evaluation work is to really look at integrating data with a broader insight and consideration of that context.
So it's really about letting us be guided by data, but not governed by it and really not letting data be the dictator, but being a very important voice and having us understand the greater context of it.
So without further ado, I'm going to hand it over to our amazing panelists that are going to show us how they're doing this in the field and the amazing work that they are doing in their areas.
Lisa Mednick Takami: Thank you, Dulce. So I'm going to introduce our panelists within the frame of-- Dulce has painted a very broad and important picture for us in terms of evaluation process as a whole. And now we're going to bring it into the specific context of CAEP consortia.
So with that, I'm so pleased to introduce to you Thatcher Weldon, who's the Director of Adult Education at Kern Community College District and Kern Adult Education Consortium. His contact information is there for you.
I'd also like to welcome Adele McClain. She's the Coordinator of Adult Education at Apple Valley Adult School and she's zooming in from a conference in Fremont, and we appreciate that.
And then finally, our esteemed colleague Dr. Harpreet Uppal, who serves as Senior Research Analyst here at NOCE and for the North Orange County Regional Consortium.
Welcome to our panelists. We have three questions that are here really designed to bring what Dulce was saying in the broader sense of program evaluation in general, the dip sticking, the ongoing lessons, the discussion, the informing of planning into the arena of CAEP specifically.
So Thatcher, I'm going to ask you to get us started with our first question, which is, can you please briefly describe your program evaluation process, and how often does your consortium engage in this process?
Thatcher Weldon: All right, yeah. Thank you. So yeah, my name is Thatcher Weldon. And if you don't know, the Kern Adult Education Consortium is the largest geographic adult Ed consortium in the state. We're just under 25,000 square miles.
So I'm going to give a very high-level kind of evaluation process that we engage in every month at our monthly meetings. So part of our consortium consists of Mount Whitney in Inyo County. So we have that highest peak in the lower 48 states.
And so I'm kind of sitting up there looking at all of our data within our consortium and trying to see where we are within the year. And I created these data tables about four years ago based off of the TOPSpro data tables that you get in PDF form, and I pulled those into Excel sheets.
And with this data, at our monthly meetings we can look at our progress toward our students-served targets. We can also look at our increases from the previous month every time we have a board meeting. We can pull that data.
We can look at the previous year-- so '22-'23 compared to-- these are the January data tables we had in our meeting, comparing January of '23 to January of '24.
We can look at all these things and analyze where we are as a consortium and as sites. I pulled the data for all of our sites as well, so they can look at these tables for their individual sites.
And this year, I added our goals and our targets from our annual plan. So we have our actuals of the number of students served up here, and I plug in our targets. And we can check every month how we're coming along toward that target. How many ELL students are we serving? We've already reached that target. So we're good there, but we're going to keep improving.
Our low literacy and our low income-- these are all our consortium targets for the year. We can look at enrolled students who receive one hour of instruction. So of all these students who come through our doors, how many of them are signing up for classes? 85%.
We can look at students with one or more hours of instruction who then come and plug in down here and get 12 hours of instruction. And of those students, we're getting 76%.
Then we can look at all the outcomes we've achieved as a consortium. So we have more outcomes than we have students in classes. So we're doing very well in some of these instances.
And then we also look at our CASAS numbers, our gains. So we look at our students who have been pre-tested. How many have a post-test? How many have a gain among those who have been pre-tested, and then how many have a gain among those who have been pre- and post-tested?
And then all the individual sites can look at this and go back to their teachers and staff and say, hey, this is where we are. We're missing something here. How can we improve on this?
And we can plug in our quarterly fiscal reports and see our cost per student served. I look at both all students served and students who get those 12 hours of instruction as well. So this gives us a high-level idea of where we are and where we'd like to go, and then individually where we can go. Sorry.
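(For readers who want to reproduce this kind of monthly table outside a spreadsheet, here is a minimal sketch of the arithmetic Thatcher describes. It assumes a hypothetical, hand-keyed summary of TOPSpro-style figures; the site names, numbers, and column names are invented for illustration, not Kern's actual workbook.)

```python
# Minimal sketch of the monthly consortium metrics described above.
# All numbers and column names are hypothetical stand-ins, not actual data.
import pandas as pd

# One row per member site, as it might be hand-keyed from TOPSpro summary tables.
sites = pd.DataFrame({
    "site": ["Site A", "Site B", "Site C"],
    "students_served": [1200, 800, 400],        # all adults through the door
    "enrolled_1plus_hr": [1020, 680, 340],      # received at least 1 hour of instruction
    "participants_12plus_hr": [780, 510, 260],  # reached 12+ hours of instruction
    "pre_tested": [700, 450, 230],
    "pre_and_post_tested": [520, 330, 170],
    "gains": [410, 260, 130],                   # measurable skill gains
    "caep_expenditures": [850_000, 560_000, 290_000],  # from quarterly fiscal reports
})

# Rates checked each month, rolled up to the consortium level.
totals = sites.drop(columns="site").sum()
metrics = {
    "pct_enrolled_of_served": totals.enrolled_1plus_hr / totals.students_served,
    "pct_12hr_of_enrolled": totals.participants_12plus_hr / totals.enrolled_1plus_hr,
    "post_test_rate": totals.pre_and_post_tested / totals.pre_tested,
    "gain_rate_of_pre_tested": totals.gains / totals.pre_tested,
    "gain_rate_of_pre_post": totals.gains / totals.pre_and_post_tested,
    "cost_per_student_served": totals.caep_expenditures / totals.students_served,
    "cost_per_12hr_participant": totals.caep_expenditures / totals.participants_12plus_hr,
}
for name, value in metrics.items():
    print(f"{name}: {value:,.2f}")
```

The same computation can be repeated per site by grouping on the site column, which is how the individual-site tables Thatcher mentions could be produced.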
Lisa Mednick Takami: Thatcher, excellent. Thank you for sharing. That's a very concrete example of the outcomes and the work that happens in your board meetings each month.
Dulce is going to switch back to her screen and now we'll pose the same question to Adele. Please briefly describe your program evaluation process and--
Adele McClain: Hello.
Lisa Mednick Takami: I think if we could have someone on mute. And how often your consortium engages in that process.
Adele McClain: So our leader-- can you hear me?
Lisa Mednick Takami: We can.
Adele McClain: OK. Our administration meets together monthly. It is usually on a Zoom meeting. But we meet in work groups usually quarterly. And in work groups, we have the people from the different offices who actually put the data in. So it's a different way of doing it.
I should say we only have four adult schools and one college in the whole area. Our consortium is rather small. We have a little less than $3 million to operate all of those programs on.
I absolutely love the solid data Thatcher provided. We, of course, use the TOPSpro tables. I was really kind of happy because our cost per student is about the same.
Our need in the area is we're always trying to build capacity. So I would say we're a little more data-informed than data-driven because we have a limited amount of funding. We have roughly 20,000 students that have need. And honestly with the funding that we're currently given in a year with all of us working full steam, we serve maybe 4,000.
But of that, we also have almost 400 graduates. And that's on top of our ESL if you include our HSE. So monthly is the admin team meeting. Quarterly is where we meet as a whole group to try to make sure we're meeting our objectives that we set forth in the annual and the three-year plan.
We always remind people about the annual and three-year plan in every process. I'm trying to keep it short. I did not provide a visual. But I would love to have questions.
Lisa Mednick Takami: Sure. Yes, so the point of this is really to have a variety of consortium experiences. We appreciated the visual that Thatcher had.
Adele, we appreciate you're talking about the funding and the number of students that you serve and the number of graduates that you have as a result of the particular meeting structure you have. So I'll now pass the same question to Harpreet Uppal, who can talk a little bit about NOCRC. Harpreet?
Harpreet Uppal: Thank you, Lisa. So I'm just going to cover what we do at North Orange County Regional Consortium. So we do conduct CAEP program evaluation annually.
And what we do is we evaluate the impact of our CAEP funded activities by assessing student outcomes that were achieved through these funded activities that were implemented within a given program year.
So just to put in context, NOCE for our consortium is the largest provider of adult education. We do have two other funded members, Garden Grove Adult Education and North Orange County Regional Occupational Program.
So because I am kind of situated within NOCE as a researcher, I have access to all of our student level data. So I can look at a granular level. However, for our ROP and for Garden Grove, I usually get aggregate data that I look at and incorporate in our annual report that goes to our executive committee for our consortium.
Our evaluation process is both summative and formative in its nature. It's summative in that at the end, we produce a report, a deliverable that highlights all of the outcomes that the students have achieved.
And we look at the activity level, at what is using CAEP funds or leveraged funds, and then we take it higher up to a whole program level, like our ESL program. They might have a specific ESL activity. We look at that. Then we look at our ESL program at large, and then we take it to an institutional level just to showcase.
A lot of the time, CAEP funding doesn't just get used by itself. You leverage all your other funding sources to provide services and instruction to students. So we try to look at multiple levels.
And the process that usually our research office takes is we gather all this information from our strategy proposals to understand what the intended goals are for these activities.
And then because I have access, I can query the backend data to see our students, their enrollments, their services, their outcomes. So we analyze that data, then we produce a report, and then we disseminate the data to the consortium.
Lisa Mednick Takami: Fantastic. Thank you very much, Harpreet. So I think everyone you can see that the end goals are the same, to be able to measure our student outcomes, to measure the effectiveness of our CAEP funds.
And the methodologies have some commonalities and some distinct features. And that's exactly what we wanted to be able to demonstrate this morning. So thank you very much to all three for the first question.
And now we'll go into our second question, which is, and this will go to Thatcher first, can you highlight any impactful approaches or promising practices that have served your consortium well to inform your consortium planning to support those student outcomes that you each highlighted in the first question? Thatcher?
Thatcher Weldon: Yeah, so thank you. I have four things I want to mention. So at the board meeting, we look at our data and then we have discussions about things.
So we try to build a strong teamwork and support system. And I think we have that within our consortium, and where one site may be doing something really well and they can help support another site that's struggling with something.
We emphasize the student-centered focus. So we're all focusing on our students. It's not a competition. We're trying to do our best as a consortium.
Focusing on data-- so what we look at, take it back to your sites, take it back to your teachers and staff and testers and students, and look at the data more closely. Because I'm looking at the high level, they can drill down into that data and find the students who might need support in different areas in order to get their gains and outcomes.
Orientations-- letting the students know how we're funded, I think, is powerful, because then they can understand that these tests are important and that reporting our outcomes is important. And they value our programs and they'll want to contribute.
And then another thing we're looking at is student ambassadors. At a lot of our adult Ed sites, students get comfortable in their levels or at the school, and they're afraid of going over to the college.
Well, having those student ambassadors, those student leads, come back and talk about that process and help guide other students through that process. Those are some of the impactful approaches I wanted to share. That's it.
Lisa Mednick Takami: Thank you, Thatcher. I really, really enjoyed hearing about the student ambassadors, not to mention the other particulars that you mentioned. So thank you so much. And let's pass it over to Adele for the same question, highlighting impactful approaches to support student outcomes.
Adele McClain: I think that probably our most impactful one that's known throughout the state is the fact that we have a regional graduation and celebration in June of each year put on by the community college to welcome our students into credit-bearing classes. Those transitions are a bit hard to track.
But when you have 2,800 students in an auditorium and you have local politicians-- we've had a senator and several assembly members as our guest speakers-- it makes everyone aware of what we're doing and why.
We celebrate not just high school diplomas and high school equivalency, but also all of our students who have gained citizenship while in our programs.
And we also ask them to stand and be recognized if they have gotten a CTE and if they have gotten a job while in our programs. And that's a pretty powerful thing. That was also recognized nationally by the COA journal as well.
And then secondly, we hired a transitional counselor that's full-time and dedicated to transitioning students into the community college because we all believe that that is going to be their way to a sustainable job and out of poverty.
Three, during the pandemic, we came up with tutoring on-demand, which my school started first, but then we all realized we could flex those paraprofessional hours.
Because one of the reasons that we have the greatest attrition of students is that they don't feel like they have adequate support at the time when it could be most supportive. And virtually, many more options are available.
We do also have student ambassadors at the college that come back to our program that are graduates, and we have a formula for employment and earning surveys that have netted us really high results. And we even share those results with the AJCC and employers in our region.
Lisa Mednick Takami: Excellent. Really appreciate your highlighting tutoring on-demand, the importance of transition counseling, and then the sense of belonging in something like a graduation in that really important process and transition from noncredit to credit. And I'll now pass it over to Harpreet for that second question.
Harpreet Uppal: Thank you, Lisa. So at North Orange County Regional Consortium, in order for members to receive CAEP funding annually, we ask them to submit what you could call a strategy proposal form.
This is a document that we as a research team kind of created. And Jason is actually going to drop that in the chat so you have a visual in front of you of why we think this is a promising practice at our consortium.
So the form itself asks our members to think through and describe certain elements, such as: what is the problem or the need that they're trying to meet through this strategy, this activity that they want funding for? We want them to think about how it is regionally inclusive, because again, we are serving within a consortium.
It asks them to describe their goals and the overall impact of their program and think through, how are they going to implement this given strategy? What is the timeline? What are potential barriers that they might encounter in their implementation process? So kind of thinking this process through before they actually get it running on the ground.
And then we also ask them to consider how they're going to measure its intended impact, what data will be collected, where will this data live, how will they use this data to inform planning.
And lastly, how does this all align with CAEP outcomes? And we ask them to specify how they plan to scale this: if this is a smaller activity, are there intentions to institutionalize it down the line once it's successful?
So the proposal template was really designed with evaluation in mind, and it serves more as a logic model. So the questions were intended for our members to articulate those goals for their program and activities, identify their desired goals, and align this whole implementation to intended outcomes.
And then the impetus behind this was also that we wanted to use this for our three-year planning, for our annual planning, and then for evaluation: again, what are the needs in the region? What are the gaps that you're trying to close? And it also helps us measure our member and consortium effectiveness.
Lisa Mednick Takami: Excellent, Harpreet. Thank you for providing a concrete example of the data-informed approach that Dulce was speaking of earlier and really taking all of us through the process at NOCE for aligning with the three-year plan and annual plan and how the data role or, excuse me, the strategy proposal template can really serve that process. So thank you so much for that.
We'll now move into our third question and return to Thatcher. Thatcher, based on what you've learned from your own program evaluation, what is one piece of advice you could pass on to other consortia that could help improve their program evaluation process?
Thatcher Weldon: OK, yeah. I have a couple. But I always tell our members and anybody who will listen: we all know the great things that we do in adult Ed, but the public and the legislators may not. And so we have to be able to show that.
And one of the things I would recommend is to read the Legislative Analyst's Office recommendations for a funding model, because that might be what's coming in the next couple of years. So prepare for that and start planning how to show those things.
Pay attention to updates throughout the state. Think outside the box. What do you do that's really great, whether you're rural or urban? We're all different in the way we do things. Think about what it is that you do great.
And then also we have to think about return on investment. It's something that always comes up. I think most of us believe there's more value in education than just money and a job, but that is something we're very focused on in adult Ed. So we need to look at those career outcomes. Make sure you're tracking those.
One of the things I'm starting is a dissertation this summer and I'm looking at adult Ed's effects on K-12 student success. Because if we can show how adult Ed improves K-12 GPAs or test scores or attendance, all of a sudden, we're going to have more support from boards and members of the public.
And then community building, think about the great communities we have in our ESL programs. Talk about something we need to really improve in our society, that development of community. So think outside the box, think about how you want to showcase what it is that you do great.
Lisa Mednick Takami: Wonderful. Thank you, Thatcher, for highlighting the importance of appreciative inquiry. What are we doing well, and how can we demonstrate return on investment to the California legislature and as stewards of public funds in general? So thank you for that.
And now we'll pass it over to Adele for the response to the same question. What's something you've learned from program evaluation, and a piece of advice you can share?
Adele McClain: I would say meet as often as you can and with as many stakeholders as you can. If you can't get your stakeholders in the room quarterly, do a survey.
As a principal, I do a principal's letter every single month, and all my students know they're going to get three or four questions. And those questions are all designed to promote program development and improvement. And when the students, as well as your staff, see that you use their information for program development, they have a lot more buy-in.
I also wanted to say-- let's see. Because I was inspired by Thatcher, I wanted to say to him, take a look at your dashboard. I bet the work you're doing will show up there, not just on the LaunchBoard.
We learned that the EL learners in our district, since we started having an adult school, are performing at rates 10% higher than the regular rates, especially in high school. And that is really because of parental engagement-- the parents are the students coming to our EL classes. And that's something that's very measurable, and your boards will notice. And so will your K-12.
Lisa Mednick Takami: Wonderful. Thank you, again, for talking about the importance of building buy-in, the importance of frequency, and looking at community engagement and improvement. And Harpreet, if you could now answer that third question please.
Harpreet Uppal: Yes. Thank you, Lisa. So my one piece of advice would be to have clarity on how you plan to use the findings of your evaluation. So knowing the timeline of your evaluation is really important.
For example, when we think of evaluation: if the goal of your evaluation is to inform your planning for upcoming activities, for example for the new year, then it's really important that you conduct your evaluation of the previous year's activities beforehand, so you have some data and contextual information to look at before you start your planning conversations.
And that would allow your consortium members to have time to reflect on the data, and then make some data-informed decisions, as Dulce was talking about, for planning purposes.
And then there are some questions to keep in mind as we think about evaluation. Again, I know I'm using technical terms, calling it formative evaluation and summative evaluation.
But when I was talking about formative evaluation, I meant: are you conducting it during the development or the implementation of the activities of your program, to understand whether this is a good fit for your needs in the region? Or are you conducting a summative evaluation that measures the impact of the program that you have just implemented?
And then you look at that data to really understand, as Dulce mentioned earlier, are you getting the intended outcomes? And if you're not getting the intended outcomes from your program, then you want to look at the actual implementation process-- then you want to do what is called a process evaluation.
Was it because the program wasn't implemented the way it was intended? Maybe you hit some roadblocks. So it's an ongoing process. It's not, as Dulce said, a one-time, done thing.
So that's something that, when we are evaluating at our end, at our consortium, we have to look at: the report we just produced-- is it useful? Is it timely? How is it going to be used? So you have to keep all of those questions in mind when you're going through this process.
Lisa Mednick Takami: Yes, and you can tell, everyone in the field, that at NOCE we are a research department. And so we're very tied to the fundamental pieces of the evaluation process. And I appreciate you sharing some specific examples, Harpreet, of how we approach that here at NOCE and NOCRC.
So with that, I want to thank our panelists for coming up with those responses to questions with just a few days' time. And we're going to transition now into our Q&A discussion, and I'll pass it back over to Dulce.
Dulce Delgadillo: Thank you so much, Lisa. And thank you so much to our panelists. I always love hearing just the stories and the great work that is happening on the ground and taking these theories, and then what do they look like in practice? And always just identifying what works, what doesn't work, and being able to share that on these types of platforms.
So now we're going to go ahead and open it up to our audience because I know you have questions. I see a couple of questions on here, but we want to make sure that we open up the platform to our audience to ask our panelists some questions or to have a discussion.
So please either unmute yourself or we can go ahead and put it in the Q&A if you have any questions. So do we have any questions from the audience? All right, thank you, Will. So you are our first one. When pulling data to analyze, are you taking from TOPS and MIS? So I'm not sure who wants to take that question.
Thatcher Weldon: What was the question? Sorry.
Harpreet Uppal: So the question was, when you're pulling data to analyze, are you looking at TOPSpro data or MIS or both?
Thatcher Weldon: So I'm working with our IR department at KCCD to make sure we're pulling MIS data for our noncredit that aligns with the TOPS tables that we look at. So we're working on incorporating that into our monthly meetings as well. But right now I'm pulling TOPS data into those tables.
Harpreet Uppal: And then for NOCRC, same. We're looking at MIS data because NOCE, as part of a community college district, submits through MIS.
And as a researcher, I have access to pulling our backend data from our student information system. But we're also WIOA Title II-funded members. So we also submit data through TOPSpro. So for NOCE, we also look at our TOPSpro data.
And our two other funded members, Garden Grove and ROP, they submit data through TOPSpro. So again, I don't have access to student level data, but I do get aggregate data from their CAEP summary report from TOPSpro that we include in our evaluation reporting.
Dulce Delgadillo: Great. Adele, do you have anything to add?
Adele McClain: We also use the TOPSpro data. We primarily look at tables 4A and B, as well as table 5. And then our counselors and our resource navigators also do a pretty good job of just tracking in an Excel sheet who's going to college and who is going to the workforce.
So we take a look at that data as well, because even though our workforce partners capture some data, in the absence, obviously, of a Social Security number, we sometimes have to track that on a bit more of a manual basis to see where our successes really lie.
Dulce Delgadillo: Great. Thank you. OK, so we have one other comment-- I see a couple of comments going back on that parental engagement data. So I love that sharing of data.
And then we have another question in the chat. I must have missed this at the beginning, but is there a CAEP evaluation template and a place to submit program evaluations?
So it is my understanding that there is no formal place to submit evaluations. It's a local process that is determined at a consortium level. And typically, what that process or implementation of program evaluation looks like is determined at a local level.
We've seen here that it looks different in a variety of ways. Some are much more quantitative; some use different types of data. But it's not my understanding that there is a formal program evaluation submission process for CAEP. Does anybody else want to say anything on that process, or do you have any other feedback on that?
Lisa Mednick Takami: Dulce, I'm also seeing that we had a question to Adele. And I know Kelly has a question as well. The question to Adele was if she could restate how she correlates the parental engagement resulting in 10% ELL learner performance.
Adele McClain: So the 10% ELL learner performance, if you look at Apple Valley Adult School, you can see that our ELL students are actually outperforming general population.
And we said this is largely due to the fact that we have so many of their parents engaged in our programs. If not directly taking programs, then coming to us for a computer course or coming to us for a variety of services we offer on Fridays towards getting them jobs or family resources.
So the engagement of their parents is showing a direct correlation to student improvement in graduation rates and going to college. So this is huge for my district.
Lisa Mednick Takami: Right. Great. Thank you. And then, yeah, Kelly, I know you had a question, too.
Audience: OK, and I'll put it in the chat because it's kind of multipronged. This is really great information. So I'm from the San Diego Regional Education Consortium, which is a two-member consortium.
So we've been doing an effectiveness survey that's really focused on consortium level operational effectiveness with other pieces, but we've been doing that for the last-- I think this will be the fifth year.
But one of my questions is when we start talking about CAEP programs-- so for us basically, all of our instructional programs are part of CAEP, with the exception of our older adult program.
So when we talk about evaluating the effectiveness of CAEP programs, how would you distinguish between CAEP-funded projects, CAEP-funded personnel, and other institutional program review initiatives, because everything we do is pretty much CAEP within the instructional realm?
So it's kind of like we're-- and we're very big, and we're unique in that we're our own standalone noncredit college, and we're a small consortium with just two members.
Dulce Delgadillo: I'm going to go ahead-- so I think Harpreet and I actually made eye contact digitally, because this is exactly what we struggle with, since we are also a standalone. So I'm just going to tell you that we also struggle with this, but I'm going to let Harpreet tell you a little bit about our story with this.
Audience: Thank you.
Harpreet Uppal: Yeah, so initially when we started this program evaluation project in 2019 as a research team for our CAEP office, we really just wanted to know return on investment. So we were looking specifically at those activities that were just using CAEP funds.
And we actually received a lot of pushback once we released that report, because as you said, everything is in the CAEP realm. In the instructional world of your noncredit, you might be leveraging funds, especially for personnel positions. Or to provide instruction or services, you might be leveraging your general funds and using CAEP funds for X, Y, and Z.
So then that's why we had to scale our evaluation and actually look overall at your CAEP programming in general, regardless of the funding. So that's why we do two types of program evaluation within this whole evaluation process.
We try to find-- because again we have access to our backend student information system, we can look to see which particular instructional courses are being taught by an instructor that might be funded through CAEP.
And then we only look at the outcomes of those students who are specifically taught by the instructor that is CAEP funded. So that's what we mean by a CAEP-funded program. It's a very granular, very small activity-level analysis.
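(As an illustration of that granular, activity-level cut, here is a hedged sketch of the kind of join Harpreet describes: keep only the course sections taught by a CAEP-funded instructor, then summarize outcomes for students enrolled in those sections. The table and column names are invented stand-ins, not NOCE's actual student information system schema.)

```python
# Hypothetical sketch of the activity-level filter described above:
# restrict outcomes to students taught in sections whose instructor is CAEP funded.
import pandas as pd

# Invented stand-ins for student-information-system extracts.
sections = pd.DataFrame({
    "section_id": [101, 102, 103, 104],
    "instructor_id": [1, 2, 3, 1],
    "program_area": ["ESL", "ESL", "ABE", "CTE"],
})
instructor_funding = pd.DataFrame({
    "instructor_id": [1, 2, 3],
    "caep_funded": [True, False, True],
})
enrollments = pd.DataFrame({
    "student_id": [11, 12, 13, 14, 15, 11],
    "section_id": [101, 101, 102, 103, 104, 104],
    "achieved_outcome": [True, False, True, True, False, True],
})

# 1. Keep only sections taught by CAEP-funded instructors.
caep_sections = (
    sections.merge(instructor_funding, on="instructor_id")
            .query("caep_funded")
)

# 2. Keep only enrollments in those sections, then summarize outcomes by program area.
caep_enrollments = enrollments.merge(caep_sections, on="section_id")
summary = (
    caep_enrollments.groupby("program_area")["achieved_outcome"]
                    .agg(students="count", outcomes="sum")
)
print(summary)
```

The same frame could then be rolled up to the program or institutional level, which mirrors the multi-level reporting Harpreet describes.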
And it doesn't give the whole picture of CAEP, because as you know, the Adult Education Pipeline dashboard doesn't look at the funding structure. It just looks at the CAEP program holistically.
So then that's what we started to do in 2020. And moving forward, we now look at all of our CAEP program areas. We look at a program level and then at an institutional level and a consortium level.
I'm not sure if that answers it, but we struggle with it and we continue to. We have to do a lot of back and forth with our program leads to get to that-- like, who's being funded by CAEP, per se, using consortium-level funding-- just to get to that activity-level analysis.
Dulce Delgadillo: Great. Thank you so much, Harpreet. So just to be respectful because we still got two more slides, we're going to go ahead and wrap it up.
And I just want to let you all know that if you did-- I don't see any outstanding questions, but if by any chance you did have a question or we don't get to it, we typically take those questions and we try to create an FAQ out of it and release it out.
So thank you so much for your participation and extra thank you to our panelists. And I'm going to go ahead and hand it over to Jaspinder for some final comments.
Jaspinder Uppal: Thank you, Dulce. So I just wanted to remind everyone here that we have the CC TAP Listserv that you can subscribe to. And that way you don't miss any notifications about all of our PLFs and webinars that we have and other announcements. That is our main form of communicating to you in the field.
And there's a link here to subscribe to the listserv. And also right below that is the TAP email address and that is where you can go to for any technical assistance that you need. And the next slide, please.
This is our Voices from the Field Interest Form. You are more than welcome to scan it. And if you would like to be a part of our PLFs to provide any insight into what you are doing at your consortium, if you have any expertise in any of these areas and you would like to share that knowledge, please feel free to do so. We would love to have you on. Thank you.
Lisa Mednick Takami: Wonderful. Thank you, Jaspinder. And I know that during our housekeeping, Chandni was going to let everyone know of all of the hard work she's done.
So that the materials and webinars for our previous programs, several of those have been uploaded to the CAEP website. So for those of you who have been asking, we were working through accessibility issues. You can find a number of those materials on our CAEP website.
And of course, if you have questions about anything presented today, anything presented in any of our other programs, or if you have those technical assistance requests-- we are seeing an increase in the number, complexity, and interest level of requests from the field for how CC TAP, as part of the integrated CAEP TAP office, can serve you. By all means, please reach out to us.
Thank you, again, for joining us this morning. We hope that you found this productive and impactful, and we look forward to hearing from you in terms of taking the survey.
Dulce Delgadillo: Great. Thank you, Lisa. Yup, so Chandni has dropped in the survey link. You can take out your cell phones and scan the QR code. And again, just a final thank you to all of you for joining us this Friday morning and for continuing to do this work.
Thank you so much for continuing to build community in this very important work, and I hope that you enjoy the rest of your day and that you have a great weekend. We'll go ahead and keep this slide up for a couple more minutes in case you need to jot down anything.
Our website for Office of Institutional Research and Planning is also right there if you just want to check out some dashboards or noncredit research as well. So thank you everyone, and I hope you have a great day.
Lisa Mednick Takami: Thanks, everyone.