Blaire Willson Toso: Hi. Welcome, everyone. We're delighted for those of you who are joining us for this second session. And if you weren't here last time, welcome; we're delighted that you are going to be with us today. This is a continuation of Using Adult Education Pipeline Data, CAEP Fact Sheets, and Other Data Resources for Three-Year Planning.

And last time we focused primarily on qualitative data; this time we're going to focus more on quantitative data. Next slide, please. So I'm delighted to be joined by my colleagues Jessica and Ayanna, and you'll hear our voices throughout. I'm delighted that they are with me, and with all of us, today. We each bring a very special experience to this work, and are thrilled to be sharing it with you. Next slide.

And we are also thrilled that we have two guest presenters, and they will have much more time to introduce themselves. So I'm just going to mention that we have both Harpreet and Dulce from the North Orange County Regional Consortium joining us. They'll be showing us, in practice, what this work looks like in their consortium, and hopefully you'll be able to garner some ideas and learn from their experience. Next slide, please.

And the last of our lineup: we're delighted that our colleagues from the Chancellor's Office, who oversee and work on the CAEP program, are able to join us. So I just want to pass it over to Lindsay really quickly to say hello, and share any messages that she might have for you all.

Lindsay Williams: Thank you, Blaire. Hi, everyone. I'm Lindsay Williams. I'm the program manager for California Adult Education. And I'm delighted to be here today. I do not have any announcements, so I will kick it back over to WestEd to get us into this webinar.

Ayanna Smith: OK. And hello, everyone. My name is Ayanna. I'm going to go over the agenda for you all. So we're going to get started with a welcoming check in. And then we're going to jump into using stakeholder data, and specifically data stories which is something that we didn't get into during our last session, so we're going to continue that.

Then we're going to jump into our guest presentation by North Orange County, on the topic of explaining data and setting targets for three year planning. They're going to share with you how they have utilized data for their three year planning.

And then we're going to come back and go over some key changes to AEP 5.0, as well as data exploration. And then we're going to have a discussion and close out. Blaire, do you want to discuss the check in?

Blaire Willson Toso: Yes, thank you. In the last part of our session we were not able to really dig into a lot of pieces. But we did leave you with the possibility of doing an exploration activity, and I think we spent about 10 seconds talking about it.

But I did want to check in to find out: did anybody do the activity? Did you find anything? Did you interview a stakeholder, or identify a stakeholder that you would like to interview, in order to find out more about what's going on in your consortium?

All right, my wait time is expended so we'll just move on. And thank you. And we encourage you to use that as a thinking activity at some point in your planning. It isn't simply about three year planning, it's about any kind of continuous improvement.

Or if you're thinking about centering learners in your work, or engaging employers and having them follow your work, it's really an exercise that is very, very useful for planning at whatever point in time. Or if you're thinking about launching a new program; that is a particular activity that adds nuance to your work. Next slide, please. So our objectives-- yeah, I'm sorry, Ayanna.

Ayanna Smith: Yeah, no problem. So our objectives today are for you all to identify data sources and uses for three year planning. So that's including the adult education pipeline dashboard, as well as the CAEP fact sheets. We also want you to identify key data points relevant to three year planning, and then also how to use student data stories to identify opportunities and strategies.

Blaire Willson Toso: Thanks. Now I'll stop interrupting people. Thank you, Ayanna. So just as a reset for people who were in our last session, and for those of you new to the session, here's a reminder of what we discussed. We were focusing on the adult education pipeline dashboard.

The dashboard, as you all should well know, highlights key features such as the scorecard metrics, the drill-downs for different views, comparison views, metrics across the learner journey, as well as the longitudinal view. And we always remind people that the AEP dashboard is not real-time data; it is a static data upload that represents a particular moment in time.

But you can look back across those moments in time for planning and understanding your trends. Next slide. And then we also reviewed the CAEP fact sheets, which use a combination of student outcomes data, community data, and labor market data to help provide consortium views that will guide program planning and three year planning.

And, again, all of these are not just for three year planning. I know that everyone's in the throes of making those refined details and changes to their plans, but these are also tools to use across the years in different instances.

Again, like I said on the last slide, that includes program planning, continuous improvement, revisiting your goals, and using these as a foil to understand what might be going on and whether refinements need to be made. Next slide.

I did want to say, apologies, about the CAEP fact sheets: anything that pulls data in from the adult education pipeline has not, in the past, been updated on a continual basis. This year we will be updating it, so you can revisit it and see your 2021 data in there.

In the last session we also briefly touched, as Ayanna mentioned, on the newest feature on the CAEP fact sheets, which is the data stories. They use a combination of student outcomes data, community data, and student voices to understand how we can center and serve adult learners better. And we're really excited that this is on the dashboard; it's now the last tab on the right-hand side.

And it's based on student stories. We interviewed a number of students across the state of California, and then created data stories that represented these learners and overlaid them with particular data points that help to elaborate and elucidate how you can use student voices and data points together to really nuance your data. Next slide.

So what we are going to do is take a moment, I'm going to pop this link into the chat, should you want to open it. Because we would like you to take a look at it just so you can see what we're talking about. And as I said, once you open that up on your computer, if you have that opportunity, you can then go ahead and read the story along with us.

And as I said, this is based off of a student interview. The person you'll hear about is not actually named Cynthia; we use pseudonyms for all of our students. We've also collapsed in a few points from other learners that we thought were interesting or told a similar story. So this is a representative story, mainly about the person we're calling Cynthia.

And you can see that it's layered with different data points: community data points, some of our adult education pipeline data, and other relevant statistics that speak to this piece. Cynthia really talks about wanting to get further education to obtain employment. She had a very set goal for what she was using her education for.

One thing that isn't in here (we trimmed it for brevity and accessibility on the dashboard) is that one of her barriers to accessing education was that she couldn't pass the entrance exam when she wanted to move on to phlebotomy. And when COVID occurred, programs waived those entrance exams, which opened up enrollment for her.

And so she was able then to transition into that class. She passed the class with flying colors, and was able both to think about going on for further education and to find employment. She noted in her interview with us the supportive role that different people played.

In particular she called out the advisor, which some of us call a transition specialist, and others call a mid-career navigator. They really helped expand her view of the opportunities beyond her goal of learning English; that's where she came to understand that she could move into the employment track at the same time as learning English.

And so they also supported her through her journey of taking classes, helped her be able to navigate that space, alerted her to when the waiver of the enrollment exam was instituted. And then when we look at this compared to some of the data points, we can see how Cynthia's story evidences a strong possibility of transition and success.

And if you look at it, it is that 80% statistic: 80% of women who completed non-credit CTE courses went on to earn a GPA of 2.0 or higher in community college.

This demonstrates that investing resources in adult learners like Cynthia really expands your opportunity to meet some of those transition outcomes that are set for us, as well as showing that these learners, when they get this kind of support, can be successful in community college.

Any questions about that? Perfect. Now, let's go ahead and go on to the next slide. So I'd like you to go ahead and open up that tab and then think about what opportunities exist to support learners like Cynthia. We have the one question that we have posted on the student story online, on the CAEP fact sheet.

But we also would like you to think about what can you learn from Cynthia's story? And then think about, because we're thinking about this as far as analyzing data and presenting data to your consortium and to your members, or to use it to inform people in your program, how can those data points and the narrative help to create a more informed picture of the learner? What does it do to center the learner? And what kind of nuance does it bring?

And then what opportunities exist to set targets or to identify strategies based on that? And sort of writ large how can this inform planning? I'm going to exercise a full minute of silence so that people can go out, look at the story, think about it, and then come back. And maybe we can have a little bit of a discussion or just pop them in the chat, please.

OK, we're a little under a minute of silence, but I can take it no longer. I'm just wondering, did anybody have the opportunity to go out look at that story? What did you take from it? What do you think-- how do you think you could use a data story that centers a learner in your planning? What do you think it would bring to the process?

OK, I know my colleague Jessica is posting in the chat. Thank you, Jenna: dual enrollment for adults, virtual, and a CPT class, and they finish in six months. We have tons-- yes, thank you. Right: pulling those points out and being able to represent your data in a different way that makes sense to other people, that really can drive me to go out and get knee-deep. That's a fabulous example, thanks, Jenna.

Thank you, Doreen. Data doesn't lie and some people-- [laughing] I think that's a wonderful point, right? It's like being able to push those stories out. It's very hard to get people who are data averse to look at data sometimes.

Putting it in a story and then really overlaying the data really allows that access; it provides a different layer and a different way to look at it, and additional information to draw people in.

And Jenna: we hit these options hard in a robust orientation. Our transition specialists and academic coordinator do a great job at this, and students leave pumped. That's really interesting, Jenna. Does that mean you're using this kind of information also for your learners? And I see you on camera. Oh yes, all students. Oh, that is fabulous, I would love to hear--

Jenna Cestone: Can you hear me?

Blaire Willson Toso: Yes, I would love to hear more about that.

Jenna Cestone: Yeah, I don't want to take up too much time, because I loved how you guys are presenting this. But this really moves our board, the student story. It really helps them to contextualize the actual student, and the programs come alive for what we're trying to get them to invest in and change, to kind of revamp adult education and the options.

So NEDP, for anyone who doesn't know these acronyms, like, alphabet soup, right, is the National External Diploma Program through CASAS. And if you go to the CASAS website, you can easily see the button on the side. And I went through a training with a couple of my colleagues here.

And I'm telling you, any high school adult who needs 60, 70, 80 units to complete that 150-unit requirement to get a high school diploma, they just kind of languish for like three or four years in adult Ed. And they get heartbroken, because it's just a long, hard slog. So NEDP gets them through in six months, six to nine months. And it's all virtual.

So we don't have an independent study program at my site. And since my students got their computer chops up to snuff during the COVID times, I was struggling to find something for them. They were like, we just want something through Zoom. Can you just help us out, because we're doing the grind work of these CTE classes. We need something because our hours are just stretched, you know: kids, family, transportation, no childcare. All these things.

So the NEDP program, and the IET combination, came from really listening to them and hearing, I still want to be a medical assistant, I still want to go on to be a nurse, but I just don't have enough time to dedicate 15 hours in person in class. I kept hearing that, over and over. And so these stories drove me to look for something that would fill that void, and NEDP was a rock star.

Yeah, the IET program, Integrated Education and Training: our site has those CTE classes on site. And then, of course, students come and do in-office checks. I could go on for days, guys, but I'm just saying, I could go on for days. But, yeah.

Blaire Willson Toso: Thanks, Jenna. I really appreciate that. And I really appreciate how having those stories helped you identify a gap and think about solutions, which is really the best way you can use this. And you can also use it to push out your message to stakeholders, board members, and things along those lines. Thank you. Really appreciate that. Next slide.

So I just want to briefly say, I know we've whirled through this again, but I hope you'll take a look at it and think about how you can put this into action. And we do have a resource and a guide that will help you with planning how you can get through the process of creating and using student data stories. And next slide.

Now I am really excited that we are going to transition over to Harpreet and Dulce as they are going to walk us through some of their processes for the North Orange County Regional Consortium.

Dulce Delgadillo: Great, thank you, Blaire. It's on? Perfect. Thank you, Harpreet. Hi, everyone. My name is Dulce Delgadillo. I'm the Director of Institutional Research and Planning at North Orange Continuing Education. And today I am co-presenting with an amazing colleague, Dr. Harpreet Uppal, who is our Senior Research and Planning Analyst.

And, really, today's session, we're going to talk about the planning and the analysis of the demographics, and the educational and kind of community data sources that we've utilized in order to help facilitate a lot of the conversations and three year planning within our consortium. So next slide.

Before we jump in, I just want to provide a little bit of context about our institution. So we're North Orange Continuing Education, part of the North Orange County Regional Consortium. And NOCE is the largest adult education provider in our consortium.

We are a standalone non-credit institution, alongside our sister colleges, Fullerton College and Cypress College. So we are a small but mighty team of researchers fully dedicated to non-credit, and we serve our institution the same way a research department would serve Fullerton College or Cypress College.

And we are also looking beyond, obviously, the adult education program. It's been very helpful to look at where CAEP is going, and then how it integrates with and impacts those other adult Ed initiatives as well, through a larger scope.

But for the purposes of CAEP three year planning, for our agenda, we're really going to talk about what the role of the research department has been: how we set the foundation to begin having those data conversations for the three year planning, and, moving forward, some of the tools that have been created by our research department to help facilitate those conversations.

And then, really getting at the crux of all of this, is building a culture of data utilization. How do we get people and stakeholders within our consortium to really take into account the data points and the data stories, as Blaire just shared, and really facilitate a conversation that data isn't just numbers, it's those stories, and how can we use all of the data as part of the planning process. Next slide.

So I'm just going to jump really quickly into what research's role in the CAEP three year planning for a consortium really looks like. So next slide. Over the last couple of years we have really come to understand what non-credit data looks like within our system and within the California Community College system, through the lens of these initiatives that are specific to adult Ed.

And as an institution and as a department, we've really gone out with the intention to, again, build that data literacy and knowledge around why data matters. And, in all of these presentations that we have done to a diverse set of stakeholders: what is your role in this data process that ultimately feeds data into these dashboards and these metrics that the state is looking at?

I also want to provide context that a lot of the data trainings we've had within our institution are nuanced for non-credit. Historically we really haven't had a set of metrics, so now we usually frame it in the sense of: just the way credit has the student success metrics, these are the metrics that the state is looking at.

And so what that looks like is doing data literacy around curriculum and course coding: connecting the course coding to the MIS process, and connecting the internal MIS process to, ultimately, the AEP launch board. And then we have this other sphere, because as a WIOA Title II school, we are also using TOPSpro.

And so how does TOPSpro add to the conversation? And what are those additional steps and processes we have to put in place to ensure that our data is accurate in both of those systems? It's also, again, building that knowledge base. So why does course coding matter?

All those CB21 conversations, educational functioning levels. All that terminology, really familiarizing our stakeholders with what that means, and even diving into the nitty gritty of it. And Harpreet is going to show some awesome tools that have been developed to really get into the nitty gritty of these metrics and these calculations.

And then lastly, an evaluation of consortium efforts. For the last two years, the research department has annually released a NOCRC CAEP evaluation report, which really looks at some metrics and numbers in terms of how many students have been served by activities. Sometimes that's through a lens of just CAEP-funded activities versus activities that have been leveraged.

So having those conversations of what is the most useful type of data and lens to look at these activities through. And then also just, again, having people familiarize themselves with these metrics and how our students are kind of benefiting or what are the needs that we need to address that our students have as part of these activities that we're implementing as part of CAEP.

And you can see on the right hand side, this is kind of what our timeline has looked like. So we've done our evaluation plan, we have done Data & Donut sessions that are kind of that technical assistance component ongoing. Kind of just we're embedding ourselves into these conversations so that we can make sure that data is part of those conversations. And then ultimately just be there for support for stakeholders.

And what we've noticed is that it is a wide range. So we have some people that feel very comfortable with data, just want to dive in. And then we have others that are not. And so how do we tailor that for a variety of our audiences. Next slide. So jumping into practice, what are some of the tools that we've utilized?

So I spoke a little bit about our annual evaluation report; Harpreet is going to jump into what its main components are and how we've adjusted it to serve different needs. And when I say that, I mean we have conversations with our president and our vice presidents, and they want to look at the data and the activities through one lens, whereas our activity or program leads want to look at it through another lens. So it's about balancing that.

And then Harpreet will also dive into our three year planning guide, which is kind of a data one-stop shop that we really envision for our users. And, again, a variety of users: we wanted everybody, from our executive team members and consortium voting members to faculty, to be able to use this to facilitate data conversations for planning.

We'll dive into the course coding Excel file. That's where we went into the nitty gritty: we showed people what the calculations were, what codes mattered, and why they mattered. And that has really led to other conversations around the curriculum committee, around technical assistance, around entering courses: how do we build our courses, how do we determine what is going to be offered in the future, and what are some of those implications.

And lastly, also our institutional effectiveness indicators. This is how we look at our data at a local level, and so we really wanted to connect the dots. We finalized our institutional effectiveness indicators probably around three or four years ago, and we intentionally mirrored them to what some of the CAEP metrics or other adult education metrics look like.

And really, again, connecting those dots for a variety of stakeholders: this is what our internal data looks like, this is how it connects to what is presented in the AEP dashboards, this is the community data and the census data, and anything else, like LMI data. And then how do we put all of that together to plan for the three year? So Harpreet is going to go ahead and jump into that one.

Harpreet Uppal: Great, thank you, Dulce. Right, so as Dulce mentioned, we have created several data tools for our consortium members. And as she mentioned, NOCE is our largest adult provider, and a lot of the focus around data has been on NOCE data, because we are situated within NOCE, so we have access to the student information system.

Being in the research team, we have access to the back-end Banner data (we use Banner), and so we are able to create in-house a lot of the data that we have used in these reports. However, we do provide support to our other adult Ed providers in the consortium who use TOPSpro data. So we have individualized meetings with them, and we'll go over the type of support we have provided.

So we look at data both through the MIS lens and through the TOPSpro lens. In terms of the evaluation report-- so I'm going to cover all the slides in terms of tools, and then I'm going to actually show you the tools, so there's not that much back and forth between the PowerPoint and the documents.

So the evaluation reports really came about because our North Orange Continuing Education president was interested in knowing why the data looks different on the AEP launch board than our internal data. So we wanted to look at that aspect of, when we were examining our data, what AEP does and then what we do internally.

And, sorry, I should backtrack. That was not the focus for the evaluation report. The evaluation report actually came about because we were interested in doing a kind of cost-benefit analysis. Initially, when we were looking at our 2019-20 data for NOCRC, our consortium, we wanted to see how the funds were being utilized.

And what kinds of activities and strategies were taking place to support students, whether in instructional or student support services. And really wanting to see what the outcomes looked like. When our work group leaders were requesting funds, we wanted to see how they were using those funds.

And when they were outlining their outcomes, objectives, and goals through logic models, we really wanted to see whether what had been intended or proposed was actually happening in the field, and if it was, what that looked like. So that's where the evaluation piece comes from. When we did the report for the 2019-20 data, it was very narrow.

It focused only on CAEP funds, and not on all of the leveraged funds or everything that North Orange Adult does, or the other adult Ed providers do, in terms of services to adult Ed students.

However, we shifted gears in 2021 based on the feedback we received on our 2019-20 report. Because what we heard from our members, our stakeholders at the consortium, was: that was too narrow, that's not capturing all of my program, we do more than what's being presented in this report.

So then we took a broader scope and looked at the data holistically, and examined outcomes for all students, regardless of the funding source or where the instructors were being paid from.

So that's where we just wanted to see, very similar to the lens that the adult education pipeline launch board uses, what the outcomes are of the adult students being served by our adult Ed providers.

And then one thing we did in these reports: we were really interested in cross-enrollments of our students in different programs, because at North Orange Adult we do offer five main programs: ESL, disability support services, CTE, high school diploma and basic skills, and our emeritus and parenting programs.

So we really wanted to see whether students were cross-enrolling in other programs as well, because that was some of the data our stakeholders were interested in. And I will go over the reports so you can see what was included in them. But in terms of slides, I just wanted to cover what was in each of these reports.

So as Dulce mentioned, this year we also created a planning guide, and this was really to assist with the three year planning for our consortium. So this was driven by the CAEP three-year plan guidance document. So we kind of use-- and I'll show you in the next slides how we use the guidance document to provide the necessary data.

So it's a comprehensive document for our consortium that provides all the pertinent information that is necessary to start the conversation around data. So around identifying what are the needs of the adults in our consortium. And this is where we use like the data fact sheets. We wanted to see who are we currently serving. So we looked at the adult education pipeline data. What are the outcomes of our students?

So we provided all of that data pulled from the adult education pipeline launch board. We also examined how we compare to the state averages. This was also modeled after a Word document with tables that was presented by the CAEP office and WestEd team, so we modeled it on what was already being provided.

And then, instead of having consortium stakeholders each going into the different tools and trying to figure things out, we put everything in one big document for them to go through, read, process, and draw meaning from.

And then we also used various data tools. So we didn't limit the planning guide just to the adult education pipeline launch board or the fact sheets; we actually went into the regional plans. Because we are located in Orange County, we wanted to look at other data that might be available to us.

So we looked at, for example, in order to understand the labor market data more, we looked at the Orange County regional plan. What were some of the priorities for Orange County. We looked at the Orange County strong workforce regional plans, the Orange County community indicators report.

Again, just trying to understand, from a broader perspective, what the workforce needs in our region are, and what the needs of the adults within our region are. And then we also examined, and I'm not sure how many of you are familiar with EMSI, but our district had partnered with them, and they provided us with labor market data: employment and wage data for our graduates.

And we analyzed that data to understand where our students are going, at least for those they were able to find by scraping the web. The way they analyze the data, they look at the public professional or social profiles of individuals.

And if they can match that information to what we provided, then they're able to pull employment data or wage data for those individuals. We analyzed that data as well and provided it in the planning guide.

And as I mentioned, we were really trying to align the planning guide to the CAEP guidance document. This is a screenshot of how we did that. In the CAEP guidance document, under Section 2, Assessment, there's a subsection that lays out what is expected of consortiums for the three year planning, with a lot of guiding questions and resources.

So we utilized this document and this information, and we dug into these questions. In the next slide, for example, you'll see that one of the main questions being asked was: what characteristics define the regional community? And that information was available on the fact sheets.

So this is how we went into the fact sheets: we looked for our consortium and pulled the data for our consortium, and then we placed this information in the planning guide so that it's all available in one document. And instead of just having that one screenshot, we went through every single tab and pulled information for our overall population.

We pulled information for each of the key demographic populations, like adults with disabilities and the foreign born, and then laid out a massive table in the planning guide, which we'll show, so you can look at the data and see, OK, what stands out to you when you're looking at the data on adults within the region. And then we used all of that information in our presentations and in our one-on-one individual meetings and/or workshops.

And next I want to cover the Excel file we were talking about. This has been a source for driving conversation. This was done in-house, where we pulled all of the active courses that were in our Banner system for the academic year-- I believe this is an example from the '21-'22 academic year. So everything that was built in our system to be offered to students.

And we pulled all of the courses, and we pulled every single CB element, because from our Banner system the data is then submitted through MIS. So we have all of the course data element information available that we can pull from Banner. And then we created this massive Excel file.

And then in this massive Excel file, we utilized the AEP methodology-- the calculations from the metric definition dictionary-- to point out to our program directors, managers, and staff how their courses are coded and whether those courses fall within one or more of the key program areas, based on that methodology.

So, for example, right here we listed the program and the course-- we took the numbers out for presentation purposes-- and whether that course, based on its coding, would be identified as a CAEP CTE course, or whether it would be identified as a CTE workforce course. And you can see here, this is how we built the calculations.

So based on this information-- how the AEP methodology identifies participants in career technical education based on the CB22 code, the CB03 code, or the CB09 code-- we utilized that information. And this is based on where that information lives in our columns. We identified, is it a J or a 1 in terms of a vocational course for CB03? Is it an A, B, or C in terms of the CB09 element?

And that's how we were able to identify whether that course would be identified by the Launch Board as CTE. So this was a way to identify data discrepancies. We know course coding is sometimes based on how the curriculum is built, so we just wanted to map out from beginning to end how courses are coded, and then how the data is reflected on the Launch Board-- what that maps out to-- and bring it to our constituency groups.
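The coding check described here can be sketched in a few lines. This is only an illustrative simplification, not the Launch Board's actual metric definition dictionary logic; the field names and the specific code values (CB22 'I'/'J' for vocational noncredit, CB09 SAM codes A-C for occupational courses) are assumptions for the example.

```python
# Hypothetical sketch: flag whether a course's CB (course basic) elements
# suggest it would be counted as CAEP CTE. The real rules live in the AEP
# metric definition dictionary; the code values below are assumptions.

def is_caep_cte(course: dict) -> bool:
    """Return True if the course's coding suggests a CAEP CTE course."""
    # CB22 (noncredit category): 'I' = short-term vocational, 'J' = workforce prep
    if course.get("CB22") in {"I", "J"}:
        return True
    # CB09 (SAM priority code): A/B/C = occupational courses
    if course.get("CB09") in {"A", "B", "C"}:
        return True
    return False

# Invented example rows standing in for the Excel file's course list
courses = [
    {"course": "CTED 101", "CB22": "I", "CB09": "C"},
    {"course": "ESLX 050", "CB22": "A", "CB09": "E"},  # 'E' = non-occupational
]
flags = {c["course"]: is_caep_cte(c) for c in courses}
print(flags)  # {'CTED 101': True, 'ESLX 050': False}
```

A spreadsheet formula doing the same membership checks over the CB columns would produce the True/False flags the presenters describe.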

And then lastly, as Dulce also mentioned, we have been building internal Tableau dashboards, and all of these are public and live on our website. So I'm going to quickly go over all of the data tools. But I just wanted to show you where everything was. So let me-- please let me know if you can see my screen. It should be on a PDF right now. OK, all right, awesome.

So as I mentioned, initially we started doing evaluation reports. And this is kind of ongoing-- we have been doing this annually. I believe we presented one of these at the CAEP Summit, so some of you may have seen it. So in 2020, we used the 2019-'20 data to really see, again, what student outcomes look like for our consortium based on the strategies and activities that were being implemented.

What this report really did is-- we mapped out why we are doing this and what the CAEP program looks like, just to give background information to our consortium stakeholders. And then we, again, provided what the data looks like in the AEP Launch Board. And we created this one-page infographic because at the beginning-- these are several-year-old documents.

This was when we were also starting to map out what CAEP outcomes look like, because everyone was saying, outcomes, outcomes, outcomes. How do we reach outcomes? We were like, OK, we first need to know what these CAEP outcomes are and what's being identified under these big buckets.

So we wanted to present the information in a digestible manner to our stakeholders. All of that information was embedded in the introduction of the report. And then we indicated the purpose of the report, and why we were doing this evaluation, where we wanted to see what we're-- sorry, I should make this bigger.

And then what were some of the instructional and supportive services being provided by our consortium, what data elements is our consortium capturing that align to the CAEP metrics, and how are the strategies contributing to the Launch Board outcomes.

So, going over this now-- we provide a methodology and we provide metric definitions. We were really trying to align with the Adult Education Pipeline Launch Board here, and to be transparent in terms of, again, what is being calculated and how we are calculating things. And the reason I wanted to show this is to connect back to what Blaire was presenting earlier in terms of data stories.

That was also our intention with this report. We were presenting all of this data and all of these numbers, and we tried to embed student success stories within our report, just to highlight all the work that's happening behind the numbers-- all these students where we're saying, 153, this many achieved success.

We wanted to include those narratives from the students-- student voices that highlight the adult education services and resources that have been provided to them, that have been helpful on their academic journey, on their workforce journey. So we tried to include that in our reports as well.

So I know, given the time-- these reports are all on our website as well. That was kind of one approach: we went very nitty-gritty into every single activity and looked at the data at the activity level. And shifting gears in 2021, we were told that was a very narrow approach-- it didn't highlight everything that was being done within our institutions, within our programs.

We then utilized a different approach, and this was much broader. Again, providing the same context so that if someone is coming in and looking at this data, they know what's being calculated, how we are defining enrollment, what our data sources were-- just for transparency's sake.

And going to the data-- initially I had mentioned we were really interested in looking at our cross-enrollments, so this is how we presented the data. We presented the numbers of students being served in our programs, and then we looked to see their cross-enrollments.

For example, here, if you were to look at this column, there were 1,500 students being served within this academic year in our basic skills program, and 12% of those students were also taking CTE courses. So this type of data was used to inform where cross-collaboration can happen-- in what ways can basic skills provide support to those students who are also taking CTE courses. Things like that.
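The cross-enrollment figure is just the overlap between two program rosters. A minimal sketch, with invented student IDs sized so the arithmetic reproduces the 1,500 / 12% example:

```python
# Sketch of the cross-enrollment check: what share of one program's students
# are also enrolled in another program? All IDs and counts are made up.

basic_skills = {f"S{i:04d}" for i in range(1500)}            # 1,500 students
cte = {f"S{i:04d}" for i in range(1320, 1500)} | {"S9001"}   # 180 overlap + 1 CTE-only

overlap = basic_skills & cte                  # students enrolled in both
pct = round(100 * len(overlap) / len(basic_skills))
print(f"{pct}% of basic skills students also took CTE")  # 12%
```

Running the same intersection for every pair of program areas yields the full cross-enrollment matrix the report presented.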

And then data in terms of what some of the students' outcomes were. These were aligned very closely to how data is looked at on the AEP Launch Board. And this was done because at that time, the AEP Launch Board data for 2021 was not yet available, so our program directors and our higher-ups were really interested in seeing that data, at least internally.

So we pulled that data, giving an overview of what the 2021 data looks like for our consortium and our adult ed. And then lastly-- I hope I'm not running out of time, but I will talk really, really fast. Please let me know if I'm taking too long, because I think the part that you probably all want to focus on is how we built the data literacy.

So I'm going to go over the planning guide. And this is available, again, on our website, and I have included it in the resources as well. The intention behind this was, again, to align with the CAEP guidance document in providing that data.

So as I had shown on the slides, for example, we-- oh sorry, this is the guidance document itself. This is not the planning guide. Let me open the actual planning guide. I apologize, I opened the wrong document.

Actually, I will open it from our website so you can see where it is. All right, hopefully you can see the planning guide now. So right here is the planning guide that we did for our consortium. And, again, we looked at the data and we utilized those questions that were in the guidance document, like, who are our current customers?

And this is where we used the Adult Education Pipeline Launch Board data to provide the numbers and context around the data. For each of the snapshots provided in the planning guide, we provided some narrative and we provided links to every data source. So if someone wants to go in and dig into this data, they can.

And then we also created this map for our consortium, just so they can see-- we are in North Orange County. So where are our students coming from, in terms of how many we are serving within our region and how many students we are serving from outside of our region? Because we are located in North Orange County, we are kind of what's considered a commuter school. So we do have a lot of students who are commuting to our region.

And you might all be very familiar with these screenshots, these tables and charts, because they were pulled directly from the AEP Launch Board-- again, just so everything lives in one centralized place instead of having our stakeholders go to different places to pull this data. And then, again, we really used the guiding questions to provide the data so that it helps with the writing of the three-year plan.

And we also mapped out all the places, because we offer a lot of courses offsite. North Orange adult education is located at three main centers, in Anaheim, Fullerton, and Cypress, but we also offer classes all over the community. So through this map we mapped out all the locations where we are serving students.

And then this is all the fact sheet data that was used to, again, map out what the adult needs are and what the data looks like for adults, and to provide context in terms of what our audience was saying. And this is the massive table I was talking about. It's difficult to look at the data holistically when it was on a different tab for each population.

So this was one way we presented the information, through a table format. And then we pointed out the big things that stood out to us, in terms of where there might be a greater proportion of students in one area compared to another. And that was all broken out and then explained in each of these subsections.

And I'm going to talk really, really, really fast now because we still have the other half of the presentation to go. So, again, using the planning guide, I wanted to point out the other data sources we were using, as we had mentioned earlier. And then, when we were providing the fact sheet data, we wanted to guide our audience in other ways they can disaggregate the data.

So instead of putting in multiple pictures, we still wanted them to go interact with the data. And this I will share during our last slide, when we're talking about the open sessions where we had the audience understand this data better-- how to use these different filters in terms of what they're offering or the students they're serving.

When we were doing these things from the research perspective, we don't have all the details, or we don't know our student population the way the individuals who are actually working with the students do-- those who are in the direct line of contact with students. So having them utilize this data from that perspective and share that with us was really helpful.

So I felt like when we were presenting this information, we had to put in that disclaimer. Some of this information is coming from a research lens, right? How you are going to utilize it really depends on who you are contacting in terms of student populations, your daily interactions with students, and what their educational trajectory looks like.

And then this is where we used other data sources, like the Orange County regional planning unit's regional plan. So we were looking at, again, trying to answer these questions, like, what kind of skills are employers looking for?

So we were looking at data around skill gaps-- what were the top skills that employers in Orange County were looking for, in terms of common soft skills and hard skills, and what were some of the industries the region was prioritizing-- again, providing that context within the planning guide. So when we are writing that three-year plan, we have that broader perspective.

And then this is where we were analyzing our labor market data for our graduates. And then lastly-- I skipped that page, but we also provided links to every type of data, like the student surveys, so the qualitative piece, right? We did focus on that too, because our research office does a lot of surveys, and we try to collect data through interviews or focus groups.

But when we were in the remote setting, it was a lot of surveys just asking students about their needs and their barriers to education-- especially in the remote setting, what does this education look like for them? So we have individual reports for all of our surveys, and we embedded that information within this document as well, so that it's easily available to those who want to dig deeper into the data.

And lastly, we focused on the metrics section. And this is all pulled from the Launch Board. So we provided information very similar to how it was presented in that one Word document: what the data for California looks like, and what it looks like for NOCRC as a consortium. And then it was broken down to our three main adult ed providers: NOCE, ROP, and Garden Grove.

And then for every single metric, we presented the information in this format. Because we only had data available up to 2019-'20, we only presented those four years and did the averages across the four years. And it was really interesting to present the data this way, because typically when we go on the Launch Board, we're only looking at our consortium.

So having that statewide comparison sheds some light for our institutions and for our consortium. Like, OK, where are we lagging? How is the state doing? How are we doing? Where can we do better?

So it did at least generate that conversation. And the rest of the document is literally that, for every single metric-- again, all pulled from the Launch Board. And lastly, I'm going to go very quickly, because we did talk about our dashboards.

So on this website are all of our Tableau dashboards. I really just wanted to show what we did. This is all pulled internally, starting with our 2019-'20 data. So we built dashboards around our enrollments and breakdown of demographics, around our student services data, around course retention and course success data, and term-to-term retention.

We wanted to see, again, that transition within NOCE-- our students going from ESL to high school diploma, you know, again mirroring what the Adult Education Pipeline Launch Board data looks like. And then we look at our graduates and completers data, so high school graduates and our certificate completers.

And then, because we are part of a community college district, we do have access to Fullerton College data and Cypress College data, since we're part of one district. So we were also able to see some transition from noncredit to credit. I'm just going to show you one dashboard in terms of what was included. This is our enrollment and demographics dashboard.

So, again, providing information like, what is the headcount, what are the enrollments? We broke it down by our academic program areas-- basic skills, CTE, disability support services, ESL, and LEAP, which includes our emeritus, parenting, and community ed programs-- and then further broke it down at a subprogram level.

If the programs were interested-- if CTE was interested in knowing how many students they were serving in our pharmacy tech program-- they're able to filter it down, and all of the information below will be filtered down. And then we provide additional information in terms of demographics. So we have gender and age, we looked at what the students' educational goals were when they applied to NOCE, and what their educational level was.

And then we also did a two-level demographic breakdown-- what the breakdown of ethnicity and gender looks like. And then we looked at enrollment data by term. And this information can, again, be further broken down by program-- like, if CTE wanted to know what enrollment looks like for them in a specific term or for a specific program.
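The filter-then-count behavior of such a dashboard can be sketched outside Tableau too. A minimal sketch with invented records and program names, just to show the shape of the breakdown:

```python
# Sketch of an enrollment-by-term breakdown, optionally filtered to one
# program area, mimicking the dashboard filter. All records are invented.
from collections import Counter

records = [
    {"student": "S1", "program": "CTE", "term": "Fall 2021"},
    {"student": "S2", "program": "CTE", "term": "Spring 2022"},
    {"student": "S3", "program": "ESL", "term": "Fall 2021"},
    {"student": "S1", "program": "CTE", "term": "Spring 2022"},
]

def enrollment_by_term(records, program=None):
    """Count enrollments per term; if program is given, filter to that area."""
    rows = (r for r in records if program is None or r["program"] == program)
    return Counter(r["term"] for r in rows)

print(enrollment_by_term(records, program="CTE"))
# Counter({'Spring 2022': 2, 'Fall 2021': 1})
```

Dropping the `program` filter gives the consortium-wide view; adding more keys (gender, age, goal) gives the demographic breakdowns described above.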

And, again, just like with all of the tools, we provide the methodology in terms of how we define things and how we calculated things. And now I'm going to go back to the PowerPoint. Next slide. All right, Dulce.

Dulce Delgadillo: So it's been a journey. It's definitely been a journey to build this capacity. And I would say that we are starting to see some movement. Data is starting to be a part of those conversations, people are starting to ask about data-- so really facilitating or fostering more of that culture of inquiry. So how do you do that? Again, building that data culture, or the culture of data utilization, we did one-on-one data coaching with some of these departments.

We tailored the trainings for them so that we looked at their specific program data, or even got into the subprogram level, because we had these tools and they could access these tools. So we were really training them on how to utilize these tools to help them facilitate the conversations and take it back.

I would almost call it a train-the-trainer model, where you train them and then they can go back. And that's really our hope with some of these tools. Data & Donuts workshops-- so, again, we dove in. We just said, we're going to go into the nitty-gritty of these things.

And so having these types of opportunities and platforms for the entire institution has also been an effort from our office. And we have some lessons learned from that as well, because, again, we are trying to build capacity across a really wide range of data literacy skills across our institution and our consortium.

And then our NOCRC open sessions. So really, again, honing in on our consortium, having those conversations about how we can utilize this data for three-year planning. Next slide. So this is pretty much what I covered.

And, really, there were some times where we had to say that. We had to be the data coach and ask, you know, what are some of the implications, or how are you interpreting this data? And it was very helpful for us to understand how they're looking at the data as well. Again, our goal was to help facilitate conversation, but also to break down silos.

And to really shed light on the fact that we are serving an entire student population, not just narrowing it down to your program or to the one activity that you are using CAEP funds for. We have done presentations at a variety of stakeholder groups, including our directors meeting and the academic senate.

And that's really where we're at in terms of next steps: how do we incorporate the committee work and the task force work that is happening across our entire institution, and leverage those opportunities and platforms to move forward and improve our processes, ultimately improving our data-- and, again, showing the entire picture and the entire story of the students that we're serving. Next slide.

Harpreet Uppal: All right, I'll just briefly cover this here because I know we are kind of tight on time. I mentioned the Excel file, but I realize I didn't get a chance to actually show you what the Excel file looked like. So give me one second. This was our Excel file that was used to start those data conversations around course coding.

So as you will see right here in this Excel file, we pulled all of the calculations. And, again, this was for NOCE purposes, so the data was submitted through COMIS. We looked at the calculations from the AEP Launch Board, just to put it in one place where our program directors and our other NOCE constituents can go in and look at how we utilize this information.

And then we pulled all of our courses. So this is kind of a massive list of all of the courses that existed in our Banner system for all of our programs. And then we pulled information, again, from our student information system around what the status of the course was-- whether it was active, which division or department it belonged to, which program it belonged to.

We pulled information like when the course started and whether it was offered recently-- and it's formatted in the way we write our terms-- and then additional information about the course. And then we provided every single CB element's information.

Not all of them get used in the AEP methodology calculations, so it was very overwhelming when we first shared this Excel file-- for those who were not familiar with this language about CB elements and MIS, it was very overwhelming. So we tried to give them an abbreviated version, and we also narrowed it to their specific programs when we were doing those one-on-ones.

So what I really just wanted to show you is, based on all of this information that was pulled, how we calculated things. I wasn't sure if you could tell from the screen, but this is what I was talking about: based on all of the different CB elements, we provided these calculations to identify whether that course fell within a specific CAEP program area.

And then we also looked to see other initiatives, because we were interested in whether that course fits within other initiatives, such as the Strong Workforce Program or Perkins. False means no, that course is not part of that program area; True means yes, it is. But the way we then presented it-- because that was kind of our internal data hub.

The way we presented it to our stakeholders was more like, does this course fall under the CAEP ESL program area? So they really know what type of data they're looking at. Obviously, this one is from the CTE program, and the way it's coded, no, it is not part of the ESL program. Again, trying to identify those data discrepancies based on the course coding.

And so this was used for opening that data dialogue. And now I will go back to the PowerPoint and show a quick [audio out] also how we use [audio out] in presentation mode. So this was taking what the MIS element for CB03, which is the TOP code, looks like-- what the actual language is.

And then-- I hope you can hear me, because my screen is saying that my internet is unstable, so let me take my video off so that I don't lose connection. Please let me know if at any point you stop hearing me.

And then we pulled where that information lives in the Excel file, and we provided them with that information. And during our presentation, we went over, what does this element mean? Why is it important? What are the implications of it? And we did that for all of the important CB elements that were being used in the AEP metric calculations.

And then, again, tying it back-- the way we define our programs might not align exactly one-to-one with how things are calculated on the Launch Board, given that there are certain ways the CB elements are looked at. So we tried to go over all of these important CB elements, and then map it back to the Excel file to see, again, is your course really being identified in that specific program area?

And the reason we wanted to emphasize it was that the question we were being asked was, why does the Launch Board data look different from our internal data? And this was to map out, OK, internally we might be coding something like this, but that's not how the Launch Board looks at it. So let's really identify where those data discrepancies lie and how we can fix them.

So those were the conversations we were having. But in this whole process we learned that it was a lot of information, a lot of data dumped in one go. So as we learn from this experience, we're tailoring how we present this information to our audience and making the data digestible.

And as Dulce mentioned, our focus has largely been on MIS. So, shifting gears to how we utilized similar information for our other stakeholders, other adult providers that use TOPSpro data-- for one of our consortium members, we did this mapping out, where we color-coded.

Again, we used how Jay Wright from CASAS presented this information, and we did our internal color coding to show how each metric aligns with the elements that are being captured on the CASAS entry and update forms. And then tying it back to, OK, how does CASAS look at this? How does the AEP Launch Board look at this TOPSpro data? What are those calculations, and what does that mean from here to here to here?

So, again, mapping out how we collect information and how we code information-- it all has implications in terms of how the data is then presented for our consortium on the Launch Board. And then, Dulce, I don't know if you wanted to cover this slide.

Dulce Delgadillo: Yeah, I can cover it really quick. Again, going back to data literacy, doing it in a variety of different ways, shapes, and forms. So we've done it through Data & Donuts, which is one of the platforms that we've utilized. This is open to everybody. It's through Zoom, it's virtual, it's recorded, and then it's posted on our website.

So if people want to do it on their own, they can. If they want to participate-- this is really a hands-on opportunity for anybody who wants to dive into the data. So that's really how we utilize these.

And we focus in-- you can see one was just course coding, one was just dashboards, one was just looking at community data. So, again, that lesson learned was break it up into chunks, because it's very overwhelming for those who are not used to being in the trenches-- in the data trenches, I guess.

Harpreet Uppal: Next slide. All right, thank you. And then I'm going to quickly go over putting it all together. So we have all these data tools, we did all of these workshops, and we did these open sessions.

And these open sessions were actually led by our consortium director, and we, quote, "facilitated" by providing the data tools and going over the data. So we were asked to, quote, "facilitate" these conversations with our consortium stakeholders.

So the way we used these open sessions was to gather feedback for CAEP planning purposes. To work on the CAEP plan, we wanted to get information once all the data was presented. So the intent was: provide all these data tools-- the planning guide, the evaluation reports, and all the different data sources, the fact sheets, the Launch Board-- so individuals could dig into that data on their own and come prepared to discuss questions like these.

So one of the questions that was posed in the open session was, what are the implications of the community profile findings, right, for NOCRC? So once they looked at the fact sheet data, what did they think? What are the needs of the adults within our region? So we posed that question. And before we posed it, we provided them with another way of looking at the data, in another context.

So as you remember, from the planning guide, we did that massive table. But we thought another way to present the information would be to take snippets from the fact sheets and present it this way in the planning guide.

And this was not the order it went in, but I just wanted to emphasize what came out of these open sessions-- having them look at the data. And this is just one example: for our overall population, what does English language ability look like? What proportion said they speak English less than well? And then how does that differ for those who indicated that they have less than a high school diploma education?

So that they can really see those intersections. That was the intention. When we presented this data, before giving them what stood out to us, we asked them those questions.

Like, what stands out to you? Do you see any differences? And based on the selected subpopulations, where do you see some opportunities for collaboration across programs? And who's not being served? Based on the adult needs, what are some of the populations that we think we can still serve and provide support services to?

So we started the dialogue in these open sessions based on this data. And then we provided them with where we thought those intersections were. Based on this slide, we noted that 52% of those who speak English less than well indicated they do not have a high school diploma or equivalency, and 46% of those who do not have a high school diploma indicated that they speak English less than well.
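The two percentages are the same overlap divided by two different denominators. A minimal sketch with invented counts chosen so the arithmetic reproduces the 52% / 46% figures (the real numbers come from the fact sheets):

```python
# Sketch of the two conditional percentages for the English-ability /
# education-level intersection. All counts below are invented.

speaks_english_less_than_well = 10_000   # adults in the first group
no_hs_diploma = 11_300                   # adults in the second group
both = 5_200                             # adults counted in both groups

# Same overlap, two denominators:
pct_of_ell_without_diploma = round(100 * both / speaks_english_less_than_well)
pct_of_no_diploma_ell = round(100 * both / no_hs_diploma)
print(pct_of_ell_without_diploma, pct_of_no_diploma_ell)  # 52 46
```

Keeping both denominators in view is what lets the audience see the intersection from each program's side, as described above.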

So what does that mean? That might be a potential population that we want to target. We might want to provide programs. And we gave examples of an I-BEST model that has been utilized in our institution to provide that assistance to students who may also need basic skills or ESL. So where can cross-collaboration happen, and what strategies can be mapped out?

And I believe that's it in terms of all that has been done, and now it's really about what's next. I'm sure Blaire and the WestEd team are going to go over the upcoming changes. But we're also trying to understand all of the new changes to the AEP Launch Board methodology and how that impacts our data.

Because when we see the shift in numbers, we now have to explain to our stakeholders and communicate to them: what do those changes to the methodology do? What does that mean for us?

Because we are setting our targets using the AEP Launch Board data, we need to have a better understanding of what goes in there and how that impacts the numbers that we will be providing in our three-year plan. And then, Dulce, did you want to touch on the last two pieces?

Dulce Delgadillo: Yeah, I think it's really been kind of-- again, I do say it was a journey, because it continues to be a journey of really understanding what our data processes look like. And I think because we're an office of institutional research, we're really looking at it through that top-level lens: how do CAEP, Strong Workforce, MIS, Perkins all function together.

And so we've tried to build our tools to look at it through that lens, but also be able to look at it through just a CAEP lens. And I think that has really helped us facilitate conversations to understand our data at an institutional level, and how to improve those processes-- how to improve the data that is being collected in order to report.

Because a lot of times we hear faculty-- we hear anybody-- say, well, our students are being counted. And we said, well, no, technically not, because the coding does not follow that. And so trying to explain those connections has really been key-- and to have them see what their role is in this process of being able to feed into the data for those metrics.

Harpreet Uppal: All right, thank you. We went way over time but this is all we have.

Dulce Delgadillo: There's never enough time, it's so much.

Jessica Keach: Thank you both so much for such helpful information. And the resources you provided are so incredible. I wanted to open it up to the group, honor the time that you've taken to share all of this important work, and see if there are any questions that anyone has for Harpreet or Dulce.

No questions? I have a question, so I can ask first. For consortia or offices that maybe don't have a dedicated researcher or those resources, what is the top thing that you would recommend to help move this culture of data engagement forward? What would be the most impactful thing that folks could do, in your experience?

Dulce Delgadillo: You want to start, Harpreet. Or you want me to go?

Harpreet Uppal: You can go, Dulce, I'm still trying to process this, because this is the most challenging question of all. Like, I don't know how to look through the other lens.

Jessica Keach: So sorry.

Dulce Delgadillo: I'm going to be honest, I'm going to go ahead and just throw it out there: money. Money connected to the funding. Once we connected it to the funding, the volume went up. And then we were able to connect it to the funding for other initiatives too-- to Strong Workforce, to Perkins, to the Student Success Metrics. So that's my first recommendation.

My second recommendation is to really look at it through the perspective of the audience that you're presenting this data to: how do you see their role, and how do you want them to see their role in this process? So for faculty, it's very much-- and, again, I'll be honest in this group, there have been times where we're just like, this is getting into the curriculum realm, and that is faculty purview.

And so how do you facilitate those conversations? It's connecting it back to, well, the data is connected to the curriculum, the coding is connected to the curriculum. And so this is where your role comes in, this is how you play a part in all of this. And I think that's been helpful as well.

Jessica Keach: That's really great, thank you. I think Blaire put a question in the chat. Have you had to manage pushback on being engaged in the data?

Dulce Delgadillo: You want to go, Harpreet? I think, no, well-- OK, yes, yes. And it has varied, which is interesting. And, again, I think it goes back to that literacy piece, because it just varies so much across the stakeholders. But, Harpreet-- she touched a little bit on it-- with our first evaluation report, people were like, you're just looking at one lens, this is just CAEP-funded, we serve way more students.

And we're still wrapping our heads around how we want to present the data, and what is useful. Because the way our president and our VPs want to look at it is different from what our directors want, which is different from how our executive committee wants to look at it. So really being able, again, to serve a variety of needs. So, Harpreet, do you want to add to that?

Harpreet Uppal: Yeah, no, I definitely agree, Dulce. Initially, when we worked on the evaluation report the first time, we were coming through the lens of research. We were really interested in, you told us on paper this is what we're going to do, these are the students we're going to serve, this is how we're going to serve them. And we wanted to see: did that take place?

And then when we were actually collecting the information-- because we really were just trying to get at that cost-benefit analysis. I was coming from the private sector, so I was really interested in analyzing the actual effectiveness of the strategies that we were implementing.

Coming, again, from a grad school perspective: is our treatment working? Are we improving what we set out to do?

And then we were going in with that lens, and it was very narrow. We wanted to follow the money. We wanted to see, OK, you say you're using CAEP funds for this, this is what you did, this is what the data says. And when we took that approach, there was a lot of pushback, because it was such a narrow scope and we were only looking at a small group of students.

And the pushback was, this doesn't reflect everything that our program does, this doesn't show all the successes of our students. And we were like, this is our trial year, this was the first time we were trying to assess.

And we were learning with them too. And based on that experience we were like, OK, nobody really wants to go into that level of detail. So we had to take a broader perspective and look at all of our students within the programs, and their outcomes-- very similar to the LaunchBoard. Because we heard, well, the LaunchBoard doesn't look at it that way, so why are you looking at it that way, kind of thing.

So we're like, OK, we will adapt to how the data gets looked at. That was then. This year we looked at it from a broader perspective, and we're waiting on feedback to see how we can improve it so that it really does fit that need.

One of the things I've noticed across my time here, working with different initiatives, is that we do a lot of work. We do all these reports, we want to be transparent, we provide all this methodology.

But at the end of the day, with everything everyone has on their plate, they just want to look at that chart, that table. And then things get lost in translation when you're just looking at that number, because that number is very specific to the methodology and the calculation. So we're trying to put that in perspective: you need to understand how that data was calculated.

And that's very important, because it's not enough to just say 6% of my students transitioned, or 10%. You need to know how transition is defined and how that's calculated. So we're trying to approach it from that perspective.

Jessica Keach: That's great. Yeah, I definitely hear what you're saying about folks needing to really understand where the data is coming from. And I know we've had conversations around the LaunchBoard, and we're really trying to make sure that we're as transparent as possible.

And so we're excited to release the new dashboard. It's live right now. We're not going to be able to walk through a full list of the changes today, but we really encourage everyone to join us for our webinar tomorrow.

I think that information is in the chat, or it can be put in the chat. We really encourage you to register and attend that webinar. We'll be walking through all of the changes to the Adult Education Pipeline, and then we will also be providing time for questions and links to different resources for understanding that data. So we are so excited to share that with you.

Let's see-- oh, here we go, some upcoming webinars. So tomorrow the AEP 2022 release goes live; that's the webinar I was just referring to. We also have a webinar on May 10 around continuous improvement in three-year planning, and then a June 9 webinar. And then we have a two-session professional development series, similar to how this session was two parts.

We have part one of Exploring Equity in CAEP Programming Using AEP Dashboard Data and Other Tools next Thursday, I believe-- May 5. And then the second part will be May 17. And then there will be another two-part series.

So all of this is listed on the CAEP website. And you can go and register. I know that there have been several links in the chat. We encourage you to visit those as well. I don't know if there's anything else.

I just want to say thank you so much, Harpreet and Dulce, for joining us and for sharing all of your expertise with the group. Everything you've learned-- the lessons learned with data-- has been so useful and really helpful to the field.

So I just want to say thank you for joining us. Is there anything else that-- any other final questions that anyone has or any other final comments?

Harpreet Uppal: I just want to give a shout out to the West Ed team, truly. A lot of our work really is based on all of the great work that you do. We use that information to build that knowledge base for our institution and for our consortium. And you always answer our questions, so we love that. We're like, we connected with West Ed-- we're talking to the state.

Jessica Keach: Thank you so much, Harpreet. We really mean it when we say we listen to your feedback. The questions and suggestions that Harpreet has provided have informed the live release of the data that's on the dashboard right now.

We were able to make a few last minute changes because of some of the questions that she's asked. So we really mean it when we say keep it coming. And if we don't have the answer right away, we definitely will be able to look into it and get back to you, and follow up. So thank you so much.

Harpreet Uppal: Thank you. And I just want to give a shout out to Jason Macca Bailey, because he likes to stay behind the scenes, but he's also the brains behind all those emails. And when we are critically looking through the methodology, he is right there with me, sitting next to me in my cubicle. So, hey there, Jason.

Jessica Keach: Great. Well, thank you, Jason, the man behind the curtain. I think there is another link manually put in the chat-- a link to an evaluation. So we really encourage you to fill out that evaluation for this session. It helps make everything we're able to provide to the field better. So please complete that for us.

And then we really hope to see you all tomorrow at the release webinar, where we'll share more details on the new features of the dashboard. Blaire, do you have anything else you wanted to say?

Blaire Willson Toso: Thank you to everyone for informing this work. And also a big thank you to Scobie Tapp for continually hosting us, making the evaluation available and passing it along to us, and making sure that PowerPoints and recordings get out. I think they're the unsung hero here for hosting us and getting the word out. So thank you very much as well for your work and your support. I appreciate it.

And thanks to everyone for joining us. I know this is a lot of information between the last session and this session. So we hope that it is of value. You have contact information for everyone who presented, whether it's us, the Chancellor's Office, or Harpreet and Dulce's team. Please feel free to reach out.

Mandilee Gonzals: Thank you so much, Blaire, thank you West Ed team, Jessica and Ayanna, and then, of course, your special guests, Harpreet and Dulce Delgadillo. As mentioned earlier, we will be remediating this recording.

And once it is fully remediated, everybody who has registered or attended will get a copy. And then once the PowerPoint is ready to share out, again, with remediation, we'll share that out. And we'll also post it to our website.

Like Jessica mentioned, those evaluations are important. They not only inform our presenters, but they help us bring relevant professional development to the field. So please take a few moments and fill that out for us. And we will see you all back here tomorrow. Thank you, and have a great day.

Jessica Keach: Bye, everyone.