Lisa Mednick Takami: We do have a large number of registrants, so we will ask you to be patient. We have allowed plenty of time for questions in our time together this morning. However, I'll just put out right at the beginning that if we run short on time, please do make sure to put your question into the chat, and if we need to do an FAQ following this webinar, we will certainly do that.
So I want to welcome all of our CAEP practitioners. I'm Dr. Lisa Mednick Takami. I have the honor to serve as director of the community college technical assistance provider, one arm of the integrated CAEP TAP office. I would like to acknowledge our immediate colleagues at the Sacramento County Office of Education: Mandilee Gonzalez, Renee Collins, and Holly Clark.
I would also like to recognize leadership at the State Chancellor's Office, Mayra Diaz and any other members of the Chancellor's Office team who may be here, along with state leadership from the California Department of Education, Diana Bautista and Dr. Carolyn Zachry, if she is in the room. And just a reminder that we are here on behalf of more than 600,000 students who are enrolled in the California Adult Education Program, and we take pride in working on behalf of the largest adult education program in the nation. And with that, I will ask if we could advance the slide, please.
So go ahead and introduce yourself and your affiliation in the chat. It's not uncommon that we have at least several colleagues who are completely new to CAEP, and we welcome you, as well as those in our audience today who may be veterans. This particular webinar on data collection for ELL healthcare pathways is intended for practitioners and leaders who may be at any point in the learning curve around CAEP, so please know that we have taken that into account. Next slide, please. I will hand it over to Chandni for some housekeeping items. Chandni.
Chandni Ajanel: Hello, everyone. Just a few items. This meeting will be recorded. The recording and the PowerPoint will be released on the Cal Adult Ed website following its remediation. And we ask that you please fill out the survey at the end. We really value your feedback, and we use it to improve our webinars. Next slide. And then just a gentle reminder that we want to cultivate a gracious space for lively, respectful and professional discussions. Thank you.
Lisa Mednick Takami: Thank you, Chandni. Here is our agenda for today. We've covered our welcome and housekeeping. We're going to start with an overview of the ELL healthcare pathway metrics. Part of the reason for that is that we anticipate that among our attendees today there are those who are grantees for round one and round two, and that we may also have a number of colleagues who are prospective grantees for round three. So we're going to do an overview of those metrics and the reporting deadlines associated with them.
And then Dulce will focus on potential data sources for those metrics and methodologies for collection. I would also like to acknowledge at this point our esteemed colleagues, Karima Feldhus from South Orange and Chrissy Gascón from Santiago Canyon College, who we spoke with regarding their data collection efforts. We welcome hearing from our K-12 adult school colleagues as well, in terms of both best practices and challenges. As I mentioned, we have left plenty of time for questions and discussion, and then we have a few closing activities. Next slide, please.
Here are our objectives. So as I mentioned, we want to review the ELL healthcare pathway metrics and the reporting deadlines. We will be providing an overview of data reporting issues observed by the Chancellor's Office, and provide grantees data collection strategies to improve reporting completeness. Next slide, please.
So the next two slides are the metrics themselves, and I have bolded a key word in each of them. The first one: the number of ELL student participants enrolled in the healthcare pathway program. Number two: the number of ELL participants who complete the healthcare vocational pathway program. Number three: the number of ELL participants who complete a healthcare vocational pathway credential.
So what's the distinction there, for those who may be prospective grantees? You may have a student who is in a program and even completes that program; and you may have a student who completes the program and then also completes a credential. That could be, for example, a license. Number four: the number of ELL participants who transitioned into a non-developmental credit college course, a credential program, or a higher education degree program. Next slide, please.
Number five: the number of ELL student participants who entered employment in their associated field of study after completing training in that field of study. Number six: the number of ELL participants who increased earnings after completing training in their field of study. Number seven: the number of ELL participants who complete a health vocational pathway credential. I think that's a repeat. Sorry about that. Number eight: average ELL participant salary upon entering the program and average ELL participant salary upon exiting the program. Next slide, please.
While we are not talking specifically about NOVA submission issues, I just want to put out for everyone that all of these metrics are reported in NOVA, and the Chancellor's Office has noted some issues that they are seeing. Some of them are incomplete reports. Some are reporting negative numbers or overspending. I'm going to focus here on our deadlines. So for round one, on the left-hand side, you have the reporting schedule for round one. This is listed on the Chancellor's Office website.
On the right of your slide, you have some new dates in red, because extensions were granted to a number of grantees for fulfilling those three reporting deadlines. So for the March 31, 2025, and September 30, 2025, final reports and the fourth biannual expenditure progress report, you will see new dates on the right: March 31, 2026, which is coming up, and September 30, 2026, for those last three dates. And then you'll see the link at the bottom of that slide to where this information is housed on the Chancellor's Office website. Next slide, please.
And here we have our reporting deadlines for round two. Again, this can be found on the Chancellor's Office site. And we have here our next reporting cycle, March 31, 2026. So you will probably have guessed that this particular webinar is timed to that March 31 reporting. I do want to give Mayra Diaz, who's here from the Chancellor's Office, the opportunity to add anything in particular.
Mayra, I don't mean to put you on the spot, but I just want to make sure that I didn't miss anything or if there's anything related to the reporting deadlines or anything, patterns that the Chancellor's Office has been noticing in terms of issues with reporting that you may want to add. Since I do not hear from Mayra, we will move on. Next slide, please.
OK, so I'm now going to pass this over to Dulce, and we're going to talk about the temperature check and survey results: which metric outcomes are posing the greatest challenge, and what strategies your consortium has identified to address those challenges. Again, we spoke to members from two consortia yesterday, but we're very interested, so if you've got best practices and/or specific challenges, please put those in the chat. We did a very brief two-question survey, in addition to the questions that we posed during the registration process for today's webinar. So with that, I will pass it over to Dulce. Thank you.
Dulce Delgadillo: Thank you, Lisa. Good morning, everybody. My name is Dulce Delgadillo, and I'm the director of Institutional Research and Planning at North Orange Continuing Education, where we house the CC TAP team, so I'm very proud to be a part of that team. What we're really talking about today is data, our area of expertise. We're housed in a research department, so we know a lot about what it takes to capture data, track data, and gather valuable data from our student populations for reporting purposes.
So really, today we wanted to begin a conversation around what the metrics are and some common methodologies by which individual schools and colleges are gathering this information; some potential data sources and resources for all of you potential and existing grantees; and also to get feedback from all of you as to what additional support you need. CC TAP, CAEP TAP, SCOE TAP is here to support your CAEP needs. And so as ELL healthcare providers and grantees, we're really in the business of being thought partners with you in what makes sense for your scenario to be able to capture this information.
And when I say your scenario, I mean what your structure is made up of, what your capacity is in terms of personnel, and what your structure is for onboarding and offboarding students. All of those pieces are really going to be the puzzle pieces that you're going to need in order to determine what makes sense for you. So just to have a reality check -- we always like to establish a baseline in research. And what that means is that we're really starting with where the field is at right now: two years in for some, one year in for some, potentially jumping into this for others.
And so I just wanted to share some of the findings verbally. Overall, we had about 30 participants respond to these questions. In terms of level of comfort in capturing the overall metrics, we really see everybody across the spectrum. Our highest count, about four respondents, was at the comfortable level, and everybody else was scattered around not comfortable or slightly comfortable; we had only one survey respondent who said that they were actually very comfortable. So again, do not feel like you are alone in this. We are definitely all learning through this process. What are best practices to capture students, potentially at the end or at the beginning, and what does that look like?
We also asked about some of the best practices that you may have related to ELL healthcare pathways, so I did want to share some of those findings as well. We had a couple of individuals who said, I'm brand new to this role; I'm not exactly sure what I'm supposed to be looking at or who I'm supposed to be talking to in order to capture this. So we definitely hear you, and we hope to give you some resources to support your advocacy efforts within your local ecosystems.
We also saw a couple of data system pieces. So we had some responses around, we're coordinating with the School of Health and Wellness to actually collect outcomes data. So there's the partnership of making sure we know where our students are going, and then how do we build those bridges with existing partnerships to potentially bridge that data as well, not just our students. We also saw, we transferred data from ASAP to TOPSpro. So they're mixing those data systems, and I actually have a table that's going to showcase a little bit about what can be captured in both your SIS, your student information system, and potential third-party systems like TOPSpro Enterprise.
And we also saw some scenarios where they're saying that counselors are very critical in capturing the student information. Why? Because that's where that high touch is going to be. That is that personal connection that the student may have. In some scenarios in the CAEP world, we've seen it among faculty as well. Why? Because they are in contact with students. And so how do we leverage that rapport that we are building with our students, and identify the individuals at our institutions who may be best fit to capture student-level data after our students leave our institutions?
And then on data collection, we saw a lot of: I'm emailing students, I'm calling students, I'm texting students. I'm trying to catch them right after they graduate. I'm trying to catch them between the three-month and the six-month mark. Some time frames are in there. So really just a mixture of methodologies that we are seeing across the field. So what are we seeing? We are seeing a wide range in level of comfort in capturing this data. We are seeing counselors being utilized, data systems talking to each other, and existing partnerships being leveraged -- building relationships with the entities where our students may be going to get a job or their licensure, to be able to capture data.
Lisa Mednick Takami: Also, may I add one thing to what you just summarized so nicely from the survey results, which I learned in speaking with our colleagues. So I found it interesting that some of our grantees are using both paper and online surveys. Certainly a strategy we use for registration here at NOCE. I also found it interesting that braided funding was being used around the data collection piece.
So one of our campuses has hired a job developer under a different funding source to really cement relationships with students while they're in the program, so that by the time they exit the program -- when you're getting those wage and earnings differentials and the employment questions, which can be the most difficult metrics to get -- that relationship is built. Along with the notion that the data collected for ELL may also be used for other grants or other categorical programs. Thank you.
Dulce Delgadillo: Thank you, Lisa. That's exactly correct. That's what we mean when we're talking about leveraging data. If we're going to be asking students to complete a survey, that might be an opportunity to also capture data for CAEP, for your Perkins, and potentially for other initiatives like Strong Workforce, which also lives in the K through 12 and in the community colleges. So definitely have a conversation with, as I like to call them, your data geeks on campus, as to how you can leverage what is happening. And I'll show you a little bit about what data mapping could potentially look like.
But that's really -- wearing my director of research hat -- when I am brought into a grant at the beginning, and it is very critical to bring in your data people at the inception of the grant, you're really working with those individuals to ask: How am I capturing this data? Where is it being entered? How is it being updated and maintained? How am I extracting it so that I can then report it? You're really trying to understand the life cycle of those data points. And that involves a variety of people. For some, it could involve your A&R, that is, admissions and records. For some, it would involve your counselors. For some, it could involve your faculty. So understanding how you're capturing those data points along the student journey will be very helpful.
All right. So let's dive into a couple of strategies. These are very broad strategies that we have seen in existence out in the field. They are recommendations. And before we dive in, I really encourage you to think about who the key players in the data collection process are going to be. Is it going to be some individuals at my adult school? Are they going to be professional experts or contract employees? Are they counselors or faculty? What does this look like?
So the first one is really a data collection effort that we've seen across a mixture, which is structured onboarding and exit counseling model. So this is high-touch. We are talking to our students as soon as they're coming into our program. We may be keeping some check-ins with them throughout the program. And at the exit they are going through an exit counseling, exit model, exit survey, exit something. And so really you're looking at this is high-touch across the student journey. This could potentially look like I'm interested in doing pharmacy tech at NOCE.
And in fact, we actually do this at North Orange Continuing Education for our high school diploma students, because we really want to track when they complete that GED or that HiSET. So this is a model we use in our high school diploma program. But we have seen it used in short-term vocational CTE programs that are tracking healthcare or employment outcomes. So you're taking the intake onboarding appointment. If you're in CASAS TOPSpro, that's your intake form where you're capturing that data. If you're an SIS school, a Banner school, a PeopleSoft school, or a community college, this is potentially where you're capturing that Banner ID and your student information. You're onboarding them. If you're an adult school, it could be your Orbund where you're capturing that enrollment information.
So this is where you're capturing that they're starting the program and where they're at. If you're moving toward a cohort model, this would also fall into somewhat of those cohort models, so that you're moving those students together through the program. Students complete the program, and they complete a formal exit interview. Sometimes this is in person, sometimes it's over the phone. What we have seen is that it typically is the counselors doing the updating, because that is who the students have a relationship with.
If it's over the phone it tends to be somebody that the students are familiar with, not the researcher who's capturing the data that they've never heard the name of. But it's someone that they are very familiar with and are comfortable in sharing potentially income and employment data. We have to understand that this could be very sensitive information for individuals, and for some of our students, this could be the very first time they've ever been asked about employment or wages. And so we really have to be cautious as to how our students are interpreting these questions so that we get accurate, valuable data to report back.
So a couple of pros and cons that I want to capture, if you choose to go with this high-touch model. Your pros are going to be high data accuracy, because you're continuously talking to that student, and a strong student relationship throughout this process -- again, because it's going to be high-touch.
And there are going to be really clear pre and post comparisons, because you're going to have that intake and you're going to have that exit. And most likely you're going to have parameters within your program to say, this is when we're going to do the intake and this is when we're going to do the exit form. So there's consistency; there's a streamlining of this methodology and this process across your cohort of students. Some considerations you want to take into account: it requires a lot of personnel capacity. Whether it's your counselors, your faculty, or contractors, however you decide to put this model into practice, it's definitely going to be personnel-heavy. Staff-heavy.
It's also going to require consistent documentation in these practices: consistency in the questions that you are asking students in that intake, and consistent tools. If you're going to be using an exit survey, then you're asking students the same questions. When we look at methodology, we want it to be replicable, and what that means is that you are using a consistent tool across all of your students to capture this data. So how could this model align with the data that you're needing to report for the ELL metrics? Remember enrollment and completion?
If they complete the program, you're going to know about it. You're going to know their enrollment patterns, because you have been tracking them through the process. Most likely you're going to be looking at their academic history, potentially even their ed plans if it's a counseling model. You're also going to have an opportunity to say, hey, were you able to get that credential, or are you in the process of signing up for it -- potentially that pharm tech or phlebotomy certificate that I signed up for?
You're going to get a good sense of pre and post wages potentially. Definitely your pre wages. So you're getting that data point right at that entry. And if they have secured a job by the formal exit interview or potentially maybe three months after completion, then you have that window of gathering that post wage to understand what that gain was. You're going to look at employment transitions and then potentially also post-secondary transitions. And I think employment transitions and post-secondary transitions really come in.
You're going in, and most likely as part of this intake you're asking students, what is your goal? What is it that you want to achieve? Do you want to transfer? Do you want to be placed in employment? And so again, if you have that roadmap for both the student and the counselor, data collection is going to be facilitated, and you're going to really look at that roadmap of that, OK, the student enters here. So we're capturing this data point here. Then we're doing an exit point, we're completing, and we're getting completion data and credential attainment data.
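To make the pre and post wage piece concrete, here is a minimal sketch of how paired intake and exit records could yield per-student wage gains. The record layout and field names ("student_id", "hourly_wage") are illustrative assumptions, not a prescribed CAEP or TOPSpro schema.

```python
# Sketch: pair intake wages with exit wages to compute per-student gains.
# Students who decline to report an exit wage are simply left out.

def wage_gains(intake_records, exit_records):
    """Return {student_id: exit_wage - intake_wage} for students
    present in both lists with a reported exit wage."""
    intake = {r["student_id"]: r["hourly_wage"] for r in intake_records}
    gains = {}
    for r in exit_records:
        sid = r["student_id"]
        if sid in intake and r["hourly_wage"] is not None:
            gains[sid] = round(r["hourly_wage"] - intake[sid], 2)
    return gains

intake = [
    {"student_id": "A1", "hourly_wage": 16.50},
    {"student_id": "A2", "hourly_wage": 17.00},
]
exits = [
    {"student_id": "A1", "hourly_wage": 21.25},
    {"student_id": "A2", "hourly_wage": None},  # declined to answer
]
print(wage_gains(intake, exits))  # {'A1': 4.75}
```

Because the intake and exit forms are administered at fixed points, the same calculation works for an entire cohort, which is what makes the consistent-tool requirement above so valuable.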
I want to pause and see if there are any questions or major pieces around this methodology. All right. We're going to go ahead and jump to the next one. So what does a medium-to-high lift look like? Maybe you're not in a place where you have five counselors fully dedicated just to ELL healthcare pathways. So what does a follow-up potentially look like?
So the second strategy that we have seen is a follow-up exit survey. So they start the program. They go through the program. Again, it may take them one year; for some CTE programs, we have students who take one year, and sometimes it takes them three years. But when they are done, or when they are almost done -- in fact, for our ESL program, we have our researcher pull all of the students that are one class away. And we say, you're one class away? We're going to send you an exit survey as soon as you're done. Or we set that expectation and say, hey, at three to six months, you are going to receive an exit survey from us.
What we have seen as a promising practice in this methodology is, again, having the individuals who have a rapport with the students contact those students; that goes a long way. We have attempted -- in fact, I have personal experience attempting -- to have our researchers contact students. They don't know us. They don't know who we are. They know their counselors. They know their teachers. They know the people who have helped them, whether face to face or online. And that's who they have a relationship with. So understanding that context will go a long way in capturing data, especially after students exit the program.
So what does this potentially look like? We've seen a model of standardized exit and follow-up surveys, in a variety of ways. So I'm going to give you one straight up, which is the employment and earnings survey. It is a standardized tool from TOPSpro, a follow-up survey given to all students within the system to ask: Where are you? Are you employed, whether it's first, second, third, or fourth quarter? So again, you're standardizing your metrics and your methodology. Has your wage increased? It's not necessarily that you're asking students, my wage was this, and then it went to this. It could potentially just be, did you see a wage increase? And that is the data point.
Typically, what we have seen is that this outreach is done right as the program is being completed. Some programs have job placement at completion, so this is where they're doing that exit interview or that exit survey right at program completion, knowing that the student will graduate. Or they do it three to six months out, if they know that job placement may take a little while, or maybe they're still trying to determine what that placement looks like, or the student decided to take a different route. But typically three to six months, which aligns with your second quarter out of the program. So make sure that you're aligning that.
Typically, what we see here is not necessarily that we are tracking this through a full, sophisticated system, like our Banner system or PeopleSoft. What we see is typically spreadsheets, or flags within our Banner system or TOPSpro or Orbund, saying this student is about to finish -- contact them. Or the student just finished -- contact them. And so it's much more about tracking it locally.
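The local spreadsheet-or-flag approach described above can be sketched very simply: scan a small roster for students whose completion date falls inside the follow-up window and produce a contact list. The roster columns and the 90-day window are illustrative assumptions, not a required setup.

```python
# Sketch: flag students due for follow-up contact from a local roster.

from datetime import date, timedelta

def follow_up_list(roster, today, window_days=90):
    """Return names of students who completed within the last
    `window_days` -- i.e., who are due for a follow-up contact."""
    due = []
    for student in roster:
        completed = student.get("completion_date")
        if completed and today - completed <= timedelta(days=window_days):
            due.append(student["name"])
    return due

roster = [
    {"name": "Ana", "completion_date": date(2025, 5, 30)},
    {"name": "Luis", "completion_date": date(2024, 12, 1)},
    {"name": "Mei", "completion_date": None},  # still enrolled
]
print(follow_up_list(roster, today=date(2025, 7, 1)))  # ['Ana']
```

Whoever has the rapport with the students -- a counselor or instructor -- would then work that contact list by phone, text, or email.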
So a couple of examples is text-based surveys. This is where we've seen the phone calls, the phonebanks, email surveys or just a hybrid approach on each of those. And so what we see that you're typically capturing in these processes is employment outcomes. That's typically the big one. And earnings. Did your wages increase from when you started in the program to now, within a time frame. We've also seen that they use the survey for, are you working in the field that you got your credential in? So really understanding what has happened to those students right after they received the credential.
Did they get the job within the field that they just completed their pathways of study in? And with that, did their wages increase? We can also see short-term transitions. Especially if you're looking at three months and six months out, you may have an opportunity to say, hey, the student just at three months was placed at this level and they were able to get to the next level or to the next step in their career or to the next wage six months later.
And now you have a narrative and a story: our students were able to not just get a wage increase, but were able to actually take the next step in their careers within their jobs. Again, that kind of outcome is much more feasible with this exit survey. With that first methodology, where you're focusing on your enrolled students, you may instead say, our students are getting really high success rates, really high retention rates, really high credential attainment rates. That model lends itself a little bit more to those types of outcomes.
So a couple of things in terms of pros and cons of this. Pros: very flexible. You can develop it in-house, or you can use existing tools such as the E&E survey or, for community colleges, the CTEOS, the CTE employment outcomes survey. So that's another tool that you could potentially leverage. It doesn't really require a full counseling model. We have seen phonebanks of contract employees do this to capture data. So you don't have to go through a whole counseling model.
You do have to think about these considerations. It is labor-intensive in short amounts of time. As soon as they graduate, you are contacting them: May and June graduations mean that come July, we are on the phones. Three months after, we're on the phones. You are really tracking students down, and again, the best methodology I have seen for that is to have the individuals who know the students, and whom the students know, contact them. The response rate can vary; I can tell you that right now. It depends on your student population. If they do well with text, you may get a better response. If they do well with phone, you may get a better response. It's really going to depend on your student population, your demographics, and their level of comfort in giving you that information through those methods, whether it's email, phone, or electronic survey.
And it may require multilingual outreach. You're talking to individuals, and you're asking questions that may be perceived as personal. And so it may be to your advantage to translate that survey into Spanish or into Korean or into Farsi -- into whatever language that student population may be comfortable in providing that information to you. All right. We have one more.
Let's talk about a lower touch. So let's say that maybe it's just a team of one. You. And you need to figure out how to capture this. And I have no idea. I cannot call my cohort of 70 students. It's just not physically feasible. I hear you, I definitely do. So one of the strategies that we have seen minimally, but we have seen in other venues and with other data collection efforts, is third party or system-based data sources. What does that mean? That means that there's already a process in place that is capturing this information.
Think of your employment and earnings survey from TOPSpro Enterprise. Think of your update forms in TOPSpro Enterprise that may be reporting employment outcomes, wage increases, and certificate completion rates. You may be looking at existing data collection efforts at a system level, such as the CTEOS survey, where you receive that information at a student level. It would require you taking the students from your cohorts in your ELL healthcare pathways, administering and submitting that information to CTEOS, which is housed at Santa Rosa Junior College, and having your students participate in CTEOS to be able to capture that data.
You can also do local employer follow-ups. So if you have a partnership, our DSS program, has a very strong relationship with Ralphs and with Kaiser, and we have liaisons from our institution to partner with Kaiser. And so potentially building a data collection method, data sharing methodology to say, hey, these are our students that we gave to you. We're going to follow up with you in three months and see how they're doing. Are they still employed?
And potentially doing a mixture of getting employment information and earnings information. And then this is also an opportunity for you to see whether there is industry alignment where we were supposed to place our students. It could be as broad as, is my student who just completed this program employed in a healthcare field? There are also data matching tools. So I want to share a couple of resources where I've seen this done, through two vendors. The first is Lightcast, which used to be known as Emsi -- a data service that is actually used to match. So I'll tell you how it's used.
For our CTE program, we have done this for our entire institution -- for our entire CTE program across all of our district. So that is CTE programs at Fullerton College, CTE programs at Cypress College, and CTE programs at North Orange Continuing Education. What we have done is say, here is our students' information. And we give them their name, first name, last name, their date of birth, their gender. And I think one other identifier, but I'm not exactly sure what it is. They'll give you the parameters.
And you pay -- it's a fee-based service. But you pay for them to scrape LinkedIn, the internet, anything, to match them. And what we get in return, from a research perspective, is that I see at a student level how many of them got a job, how many of them got a job in the industry in which they said they received their credential, and any other potential employment and earnings information that could be useful. So this is, again, low-touch. I will add: high cost. If you have the money and the ability to outsource it, this is a viable option for getting student-level data to complete that reporting.
Another way that I have seen this done is actually through credit bureaus. So Equifax. Equifax has this service. They are relying on Social Security numbers specifically. Lightcast is a fuzzy match, where they're pulling and matching against gender, name, location, those types of things. Equifax, by contrast, is going to match on Social Security number against the most likely employment data. So again, depending on your pool, if you have a great capture of Social Security numbers among your ELL healthcare students, that might be a viable source for you.
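To make the distinction between the two matching styles concrete, here is a minimal sketch. This is not vendor code; the field names, weights, and threshold are hypothetical, and real services use far more sophisticated matching. It just illustrates fuzzy matching on soft identifiers versus exact matching on a single hard identifier.

```python
from difflib import SequenceMatcher

# Hypothetical student record and employment record; real vendors
# define their own required fields and formats.
student = {"first": "Maria", "last": "Lopez", "dob": "1990-04-12", "gender": "F"}
employment_record = {"first": "Marla", "last": "Lopez", "dob": "1990-04-12", "gender": "F"}

def fuzzy_match(a, b, threshold=0.85):
    """Lightcast-style matching (illustrative): score several soft
    identifiers and accept the pair if the blend clears a threshold."""
    name_a = f"{a['first']} {a['last']}".lower()
    name_b = f"{b['first']} {b['last']}".lower()
    name_score = SequenceMatcher(None, name_a, name_b).ratio()
    exact_fields = sum(a[k] == b[k] for k in ("dob", "gender")) / 2
    return (0.6 * name_score + 0.4 * exact_fields) >= threshold

def ssn_match(ssn_a, ssn_b):
    """Equifax-style matching (illustrative): one hard identifier,
    all or nothing. No SSN means no match."""
    return ssn_a is not None and ssn_a == ssn_b

print(fuzzy_match(student, employment_record))  # True: tolerates the typo in "Marla"
print(ssn_match("123-45-6789", "123-45-6789"))  # True
```

The practical upshot is the one Dulce describes: the fuzzy approach works when you have names and demographics but no SSN, at the cost of occasional mismatches; the exact approach is only as good as your SSN capture rate.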
But I did want to share that there are services out there that let you give third parties protected information and have them go and match it, at least for industry alignment, earnings, and employment. So what does this potentially look like? Data collected through existing systems or external partners. You're really inferring the outcomes here, based off of what the third party returns or potentially what is being self-reported in your E&E and your CTEOS. There's also minimal direct follow-up by program staff. So again, if you are a staff of one, this could potentially be an option for you to verify, validate, and get an idea of what employment outcomes look like for your students.
A little bit of pros and cons here. Pros: very scalable. If you go from a 77-student ELL healthcare cohort to a 700-student ELL healthcare cohort, you just add those student IDs to your file. It's useful for employment and wage validation, and the third party does the matching; I'm sure they have many more resources than what is available to us. But a couple of things to think about: there tends to be a time lag. You may not get this data until six months later, depending on what their methodology is. So if you do go down this route, I encourage you to look very closely at the methodology they're offering you for the match.
It may not be as program-specific, and it may not necessarily tell you the full story. Tulsa received a phlebotomy noncredit certificate and worked in a hospital as a phlebotomist. What you may get back is that Tulsa received a healthcare certificate, and now she works in health or she does not work in health. So that is the level of data that you may be getting in terms of industry alignment. And the last one is that you may need some level of data literacy to interpret these files. You're going to be getting large, large Excel files that you're going to have to decipher and determine: what does this mean for my program? So that may require some processing and digesting of the data.
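As a sketch of what that deciphering step can look like, here is a minimal example of summarizing a vendor return file. The column names and values are hypothetical; every provider formats its files differently, so treat this as the shape of the task, not a template.

```python
import csv
import io

# Hypothetical vendor return file; real column names and codes vary
# by provider, which is why reading their methodology matters.
vendor_csv = """student_id,employed,industry_sector
1001,Y,Health Care
1002,Y,Retail
1003,N,
1004,Y,Health Care
"""

rows = list(csv.DictReader(io.StringIO(vendor_csv)))

# Employment rate: any job at all.
employed = [r for r in rows if r["employed"] == "Y"]

# Industry alignment: of those employed, how many landed in healthcare?
aligned = [r for r in employed if r["industry_sector"] == "Health Care"]

print(f"Employment rate: {len(employed) / len(rows):.0%}")               # 75%
print(f"Industry alignment (of employed): {len(aligned) / len(employed):.0%}")  # 67%
```

Note the limitation Dulce calls out: this tells you "works in health" versus "does not," not "works as a phlebotomist after a phlebotomy certificate."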
Mapping out your data. So you may have a mixture of systems, and we do too. As an institution we have Banner, which is our SIS. We are a WIOA school, so we capture TOPSpro Enterprise data, but not for all our students, just for our WIOA students, which is basic skills and ESL. And we also have several homegrown tools, including our own Excel files that we just maintain. So really, what it comes down to, and I encourage you to do this at a local level, and again, CC TAP and CAEP TAP are here to support you in trying to understand where this data potentially lives within your local systems, is saying, OK, for each of these metrics, where am I potentially capturing this data? So we're looking at enrollment and completion. I didn't pull all of the metrics, but I pulled the main ones: enrollment and completion, did they attain a credential, and the two important ones, did they get a job, did they stay in that job, did they increase their earnings, and did they transition?
And so you can see here where it could be a mixture. You may be getting the postsecondary transition from both your Banner and your exit follow-up surveys. You may be getting your credential information from both your SIS and your program records internally. You may be getting your earnings from your TOPSpro Enterprise and potentially other data matching. So going through these methods and aligning those ELL metrics on your left-hand side and asking, where do I potentially have the data, could be a good starting point to begin saying, OK, where does this data live? Who do I need to get involved?
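That metric-to-source mapping can be captured in something as simple as a small inventory. The sketch below uses system names mentioned in the talk (Banner, TOPSpro Enterprise, exit surveys, vendor matching), but the groupings and the one single-source metric are illustrative; build yours from your own systems.

```python
# A local data inventory: for each ELL healthcare metric, list every
# system where the data might live. Groupings here are illustrative.
data_inventory = {
    "enrollment_and_completion": ["Banner (SIS)", "program records"],
    "credential_attainment": ["Banner (SIS)", "program records"],
    "employment": ["exit/follow-up survey", "Lightcast or Equifax match"],
    "job_retention": ["employer follow-up"],  # illustrative single source
    "increased_earnings": ["TOPSpro Enterprise", "data matching"],
    "postsecondary_transition": ["Banner (SIS)", "exit/follow-up survey"],
}

# Flag metrics that depend on a single source: a single point of failure
# worth backstopping with a second collection method.
flagged = [m for m, sources in data_inventory.items() if len(sources) < 2]
for m in flagged:
    print(f"{m}: only one source ({data_inventory[m][0]}), consider a backup")
```

Even as a spreadsheet rather than code, the exercise is the same: one row per metric, one column per system, and any metric with a single check mark is where to focus next.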
I want to highlight two of our colleagues that really stepped it up, so thank you for showcasing this. This is Saddleback College and how they're communicating their data collection process. What I like about this is that it is sequenced and streamlined. They are going from the program application all the way through ensuring licensure and demonstrating program value at the end.
So you can almost envision the student journey through this, as well as what they're doing. Once the student gets that certificate, there's really that process of, OK, how are we capturing it, compiling the data, and analyzing it. And then how are we using it, not just for grant reporting to check off that box, but to improve our program and our outcomes. So really that's about how we are demonstrating value, how we are saying that this is a good route, a valuable route, one that increases our students' wages and increases their likelihood of being placed in a job.
All right. The second one that we really liked was Santiago Canyon, so thank you. They've really followed kind of a counselor model, where they have a job developer position, and that position builds and forms a relationship with students, which helps their data collection efforts. So they're doing, again, that mixture of in-person and online surveys along with some telephone efforts, making sure a dedicated individual is building that relationship with students.
All right. Before we wrap it up here and open it up for Q&A, I did want to showcase a couple of resources. For survey and question design, I've placed the links to the CTEOS survey. So if you are a community college, your data is already in there; you may want to check it out, and you can find your reports right in there. I also put in the E&E survey, and a great presentation from summer 2021 that I thought would be really helpful. And I also want to share that at CC TAP, we're really hoping to centralize all of the survey tools and data collection tools that the grantees have been showcasing, and place them in a repository that is centralized and accessible to all the grantees to use. I placed the two tools around systems, which are CASAS TOPSpro for our adult schools, and then Orbund as well for some of our adult schools. I also placed Ellucian and PeopleSoft, the two main ones for our community colleges. Ellucian is going to be your Banner, and PeopleSoft is going to be your Oracle product.
And then the third one: I wanted to share with you the links to the two service providers that you may potentially use for employment and wage matching. Lightcast, which is at the bottom, is very frequently used by community colleges and other sectors. And then Equifax, which I have heard is used by several sectors, including the private sector. All right. I know I went through that really quickly; I wanted to make sure to get it all in. Let's open it up for some questions and answers here. Hi, Rick. Go for it.
Rick Abare: Hi, good morning. This is great. If you found yourself in a situation where you wanted to go beyond what--
Lisa Mednick Takami: Rick, we can't hear you. Oh, I can. OK. Sorry.
Rick Abare: OK. Where was I? If you wanted to go beyond what we do now: our schools have been individually tracking their own metrics and doing their best on follow-up, since typically the employer is engaged enough for us to at least follow up on whether the student got a job. But we haven't been surveying across the board. If you had to do it forensically, you're like, look, we're halfway through round two; round one is over, ostensibly. So let's do some forensics on this. Do you think you'd take a mix of approaches, maybe get some kind of survey instrumentation that could come out from a trusted contact, but then also try something like method three, as in, look, we know our survey results are probably not going to be super great, so let's try to attack it from the side?
Dulce Delgadillo: I would say that what we have seen out in the field is a mixture of all of it, because it depends on your capacity and your student population. And I think that's a great approach, Rick. I think it would give you some sense of which method may yield more and better results, and when I say better, I mean accurate results. And sometimes we find this with our ESL students: we tend to be able to capture them really quickly, but for our high school diploma students, we had to use a completely different methodology. We had to do the counseling model.
So that's why I encourage you to really look at your student population. Do they tend to respond to emails? Do they tend to do better on phone calls? And then potentially look at a mixture. If I had all of these methodologies at my fingertips with unlimited money, I would probably look at local tools first, then use one year's worth of money to do a Lightcast match, and then compare the two. Because what I want to know is how much I am potentially missing on either end of the methodology spectrum. And when you run those two methodologies, compare them, and say, hey, my data over here is way different from this data, then it brings up the question of, are we using the right methodology? So I think that is a good approach. Thank you. Francisco.
Francisco: Hi Dulce, this is Francisco from LARAEC. I have a question, and it's following up on Michelle's question in the chat regarding outcomes for students. There are certain reporting periods in which we need to report these outcomes, but then there's a final report as well. So my question is: in a reporting period, we capture some of our students' outcomes, but not necessarily all of them. For example, our students might finish a class but not get a job yet; they'll get a job later. And so we'll try to capture that forensically, like Rick said; we're going to try to research those students and follow up with them. But will the state ultimately look at the final report, since the final report will have, I think, the most accurate picture of all the students within that round, round one or round two?
Dulce Delgadillo: Yeah. So I'm assuming the way the reporting works is that in the first round of reporting, you see just a subset of your cohort, and then by the time you get to the final report, you would be reporting on your entire student population. Is that accurate, Francisco?
Francisco: Yes. But then let's say in round one, in the first reporting period, we identify enrollments and completers. That's probably more accurate, because students might not get a job until maybe six or even eight months later. And so we'll try to capture that. But that problem is not limited to the first reporting period; it applies to the second and third reporting periods too. There are six reporting periods for each of the rounds. So we'll try to capture that in the last report as best as we can, because that's going to be a summative of all of the students across the years of the round. So my question is: to what extent are the leading-up reports more or less important than that final report?
Dulce Delgadillo: Well, I would venture to guess that the shorter reports are pieces of the larger report, and the final report is the big picture, the final picture. That's how I'm interpreting it. I'm going to let Mayra jump in; I see her hand is up. Go for it, Mayra.
Mayra Diaz: Thank you, Dulce. Good morning, everyone. Yes, Dulce, you're spot on. The reports are intentionally set up with Q2 reports, a Q4 report, and a final report. Under the Q2 reports, there are two of those, depending on whether you have an extension. But under the Q2 reports, we're essentially capturing fiscal progress. The narrative details tell us what pathways you're enrolled in, what programs you're funding, and what students are being served. How many students are you serving? It's all anticipatory; you're setting your base for that period. So always take a look at the corresponding reporting period.
And then in Q4, usually at the end of each year, you're capturing a little bit more information that actually gets at the outcome data for that particular period. And then of course, we have the final comprehensive report that looks at the entire project. You are able to report comprehensively on how many students you served, and also report on the overall investment and what was learned. Those are all various touch points that we reflect on, from which we're able to extract some data and make some analysis. I know that we've actually done that under round one.
So this breakup of data really helped us share the picture of what this investment was doing, as we work with our interagency partners who are interested in what is going on with these funds. And as I've worked with some of the grantees, I've always said, especially under round one, funding came out a bit late. Round two, similar scenario. Report what you have. There are narrative sections. Tell us your story. Tell us your challenges, your hurdles, what issues you're coming across. Tell us information that we can report on, and share that story of why we're seeing fewer students enrolled or whatever the issue may be.
Because we know, especially for those that are interested in going into round three, that round one and round two grantees are going to be seasoned in understanding the reporting. But it takes time to build some of these programs if you don't already have them. And so the way the reports are set up allows you to tell the story as you get your program set up and start to get students in the door. We've set it up so that we understand how many students you're planning to serve and how many students you actually served, and then you'll obviously start to report the outcome data as you see it within the corresponding period. And the comprehensive report will allow us to really assess what you've done with the entire investment. So I hope that helps, and I think the response Dulce provided was also very spot on.
Lisa Mednick Takami: Thank you Mayra.
Dulce Delgadillo: Thank you, Mayra. One of the things that got me thinking is the employment data. Francisco, was it you who mentioned that if the report is due September 30, people from that reporting period may not get a job until after that period? Where would they report that outcome, Mayra? Or would there be an opportunity to report that outcome?
Mayra Diaz: I'm sorry, Dulce, I was reading that question in the chat. Can you repeat it?
Dulce Delgadillo: So let's say round one. On September 30, the final report is done. But there are students from that round who may not get a job until six months after that final report is due. Is there ever an opportunity for those outcomes to be reported? Or is there a narrative section where grantees can say, on average, our students typically get jobs this long after completing? Because I think that's the question: for the students who do get jobs, or a wage increase, after that final report is done, where could that be reported? Or is there a narrative where grantees could include that information or context within the final report?
Mayra Diaz: So it just depends, because there are a couple of different scenarios. For round one and round two grantees, if they have students being served within round one, and then they're continuing to build their pathways with round two funding, we've said, obviously you're going to report your outcomes based upon the corresponding investment associated with that particular round of funding.
So each of the rounds has its own designated period, and you'll report accordingly. Round two funds have their own distinct reporting periods. Round three grantees are in a whole new bucket: you've got new money, and you were never part of round one or round two. Same thing; you're going to report according to the reporting periods that we have set out. So I'm assuming, similar to other grants, it's a short-term grant. You capture what is reflected; we're going to take a snapshot of what was captured within that project period. Report what you can with the information that you have.
I will also say that these students, for the most part, you're also reporting into your MIS or into your TE. So they will still be captured in your other data collection platforms and reflected in other dashboards. But for this purpose, this project, we're taking a snapshot of what was done within the period that we've called out. And so at this time, your outcome data will be captured within that reporting period.
We know that there may be additional data that you'll have to report. It's not lost. It will still be captured and may show up in your TE report, if you're already reporting those students in there, and it'll show up in your MIS. We just don't have a way to disaggregate the cohort of students being served under the ELL healthcare pathway, but they're still being counted, and those outcomes are still being reflected. And then capture the entire investment in your narratives when you finish up your final report.
Lisa Mednick Takami: Thank you, Mayra. I wanted to get to one question in the chat regarding bridge programs. The question is, if there are bridge programs that feed into multiple pathways, can those students be counted? This question came from Nancy Miller as well as another participant, Beth Cutter.
Mayra Diaz: I'm sorry Lisa, I need to stop reading the chat. Can you repeat the question?
Lisa Mednick Takami: Sure. Beth Cutter and Nancy Miller are asking about students in bridge classes that are really critical for students being pathway-ready. Can those students be counted? The bridge classes may not be the pathway classes per se, but in the absence of taking those classes, the students wouldn't be adequately prepared to go into the pathway classes themselves.
Mayra Diaz: Yeah. So we actually saw this in round one and round two. It all depends on your proposal. We approved your work plan proposal, where you told us how you are setting up your pathway programs. If you called out that bringing your ELL students into these bridge programs is your accelerated learning model that aligns to, say, a CNA course, then obviously we know that as part of that model, you are going to get your students into this entry-level component to try to eventually get them to the vocational training.
And so if it was captured in your work plan that this is the model you brought forward, and it was approved, then yes, you're going to want to report them as students served. That would align to how many students you have enrolled in your pathway. Count those students if it was in your work plan, if your work plan was approved, and if you are funding the course out of these ELL healthcare pathway funds. And then when the student completes the training, you're going to capture the student in your outcome data.
Lisa Mednick Takami: All right. Thank you, Mayra. We are already over time, so I'm going to ask if we could advance the slide so we can wrap up in respect of everyone's time. Here's our survey. If you would like to ask additional questions, or if there are data concerns that we were not able to address, we are of course willing to do a part two of this type of webinar, especially after the round three RFA submissions have gone in and the awards come out, if not before. So please do fill out this end-of-webinar survey so that we can hear from you whether this was adequate for current needs and whether there are other areas regarding data collection systems that we can address.
And then if we could advance the slide, we have another QR code. Many of you are already part of our CC TAP listserv, which, along with the CAEP newsletter, is one of the two communication vehicles the integrated CAEP TAP office uses to get information out to you. So please do subscribe; especially if you're new to CAEP, we really encourage you to join the listserv. And then finally, if we can go to our last slide, we have one more QR code. We often call upon you, our practitioners, to participate formally, informally, or both in these webinars. If there is an area of CAEP in which you have expertise, and you would like to be part of a webinar or another integrated CAEP TAP activity, please fill in this form and join our voices from the field. Next slide, please.
Thank you for joining us. This was a robust discussion with some very good questions. We know the data collection process is on the minds of round one and round two grantees, and it really ought to be on the minds of round three prospective grantees. I think everyone can tell from the discussion this morning: the resources that Dulce discussed, the level of reporting expected by the Chancellor's Office, the extensions that have been needed to get this data, and some of the time frames that Dulce mentioned. If you are looking at external sources for round three, folks, please do plan ahead and spell out in your work plan how your consortium, or the consortia you're partnering with, plan to accomplish this data collection function, along with the associated budget for it.
With that, I will wish you all a wonderful rest of your day. Thank you very much for joining us. And do look for the remediated PowerPoint and the webinar recording on the Cal Adult website; it takes us a few weeks to get those posted. I know that our colleagues at WestEd and IRC are also working toward remediation of webinars they have done on ELL health pathways and other areas in the month of January, so please be on the lookout for those as well. Thank you, and have a great day. Bye bye.