Blaire Toso: Hi. Thanks, Mandilee and Holly, for accommodating us and supporting this webinar. I always forget towards the end when we get busy, so I wanted to give a shout-out to them. They're always the most gracious of hosts for our webinars.

So thank you very much. My name is Blaire Toso and I am from WestEd and we are looking at Data for Continuous Improvement and 3-Year Planning today. Next slide, please. So I am joined here by my colleagues Jessica Keach and Ayanna Smith, and you'll hear from each of them later today. We are the mighty CAEP team over at WestEd and then we're also supported by many other people at WestEd on building the dashboard and working on data and professional development.

So we are here today to support you and hopefully give you some information, and really position this work so that you can hear from some of your colleagues in the field. Next slide. So, as Mandilee mentioned, we are joined by our guest presenters, Ute Maschke, Ilse Pollet, Rick Abare, and Jenna Cestone. And we're really delighted that they're here. I'll start us out with some background information on continuous improvement, but they'll really bring us home by giving us concrete examples of how they're using data and investigating data to prompt questions, think about their programming, and engage in this continuous improvement cycle.

Next slide, please. We'd also like to welcome Mayra Diaz and Lindsay Williams from the Chancellor's Office. As usual, they're here supporting our work, which we really appreciate.

They support the dashboard and make sense of the work. And we, in turn, help support the CAEP work that goes on across the state. But I'd like to turn it over to Mayra now to say a few words.

Mayra Diaz: Thank you, Blaire. Just want to say welcome everyone. Thank you for participating in today's webinar presentation. We are excited that you can join us and excited to hear from today's group of presenters. So thank you. Thank you for having us here.

Blaire Toso: So, as I said, we're going to go-- excuse me. We're going to go quickly over the continuous improvement process. We'll have a couple of activities. And we'll move pretty quickly from one to the other because we want to make sure that we stay on schedule and have a chance to learn from the field, as I mentioned. And then we'll have a little bit of time for discussion and questions and answers.

Next slide, please. Our goals are really to be able to learn about the four continuous improvement phases, use the Five Whys to identify and refine an issue, and think about using data to explore and identify solutions. Next slide. So continuous improvement is really about-- it's an ongoing commitment to quality improvement efforts that are evidence based, integrated into the daily work of your program, and contextualized as well as iterative, which is why you see those arrows that go around in a circle. Just because you complete one cycle does not mean that you're finished.

It's continually looking at your learners, your programming, your stakeholders, and seeing what works best for them. Continuous improvement has been shown to increase the likelihood of reaching target outcomes. It helps to decrease inefficiencies. It also helps to really identify those solutions that work because it allows for realistic planning and testing. And, again, going back and forth and not thinking that you found the solution the very first time.

It also allows for incremental fixes over time. As I said, you can bite off a small piece and begin to work with it, and then increasingly explore an issue over time as you would like. So you can do a small quick fix, but it isn't intended as a quick fix to your full system. Continuous improvement implies that you take little pieces of it, examine them, and come back, and we'll hear about those different efforts from your colleagues later on.

It does help you focus on a single issue and allows you to go deep so that you can identify what's going on and then possibly explore and come to a solution. Also, I did want to say-- there are many different ways of naming the phases of the continuous improvement process. The one that I have chosen to use is identify, plan, execute, and review. But many of you are probably also familiar with plan, do, study, and act. And I use this one because I like to be really intentional about identifying what you think is going on and possible solutions.

But you can choose whichever one best fits you and your context, using language that resonates with you and indicates a process that you're willing to take up. Next slide, please. So really, when you look at identify, you are doing this by thinking about how you're going to use observation, data, and inquiry. You're asking: what is the problem? What's really going on, how might we change it, and how might we address the issue?

So in identify, you're not just talking about the problem, but you're also thinking about, oh, what do we need in order to solve this? Is there an intervention or a solution or a strategy that we might be able to use? And I just want to say, for anybody who joined us last week when we were looking at the equity walk, it's very similar to how one starts the equity walk, which is by using data to spur deep thinking about an issue. Next slide, please.

Jessica Keach: All right. Thanks, Blaire. Now we're going to put some of this thinking into practice by looking at some data and walking through a reflection activity. You're looking at data from the most recent release of the Adult Education Pipeline. On the left, you see the number of participants by race and ethnicity that earned a diploma, GED, or high school equivalency in a pre-pandemic year, 2018-19, compared to 2020-21.

And on the right, you're looking at the percent change, or really the percentage decline, in students meeting the same outcome by race/ethnicity during that time frame. So on the left, you have the counts. On the right, you're really looking at the percentage decline by race/ethnicity, compared to the state decline, which was about 43%. So the line represents the statewide change, and each bar represents the change among that particular demographic group. So, for example, between 2018-19 and 2020-21, the number of Asian students that attained a high school diploma, GED, or high school equivalency declined by 52%, compared to a 43% decline statewide.
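[A minimal sketch of the percent-change calculation Jessica is describing, for anyone who wants to reproduce it from their own counts; the numbers below are hypothetical placeholders chosen to mirror the 52% example.]

```python
# Percent change in completers between two years, the metric on the right-hand chart:
# (later - earlier) / earlier * 100.
def percent_change(earlier: int, later: int) -> float:
    return (later - earlier) / earlier * 100

# Hypothetical counts: a group going from 1,000 completers in 2018-19 to 480 in 2020-21.
print(percent_change(1000, 480))  # -52.0, i.e., a 52% decline
```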

I'm going to pause just for a minute to allow you to look at this data and take it in. All right. Now I'm going to turn it over to Ayanna, and she's going to kick off an activity where you are going to be able to reflect on what you're seeing in this data.

Ayanna Smith: Thanks, Jessica. So we have a link to a Jamboard. And, Jessica or Blaire, if either of you could paste that in the chat for me, please. And on the Jamboard we want you to reflect on that data that Jessica just went over with you. So go ahead and click on the link. And then the data is there on the first slide in the Jamboard.

So you don't have to worry about remembering what was there. You can definitely continue looking over that data and finding what stands out to you. And our discussion question is just to define the problem. What problems do you see when you look at those numbers? And so you can go ahead and create a post-it note, a sticky note, to add your comments there onto the Jamboard.

OK, thank you for those of you who are already adding your comments.

Blaire Toso: Ayanna, can you maybe share the screen so people can see what's on the Jamboard?

Ayanna Smith: Yes, I wasn't sure what happened. Let's see. Can you see the Jamboard?

Blaire Toso: Yeah, thank you.

Ayanna Smith: OK.

Blaire Toso: This is great. I also see a couple of comments on the previous slide as well where it talks about disproportionate declines for some groups. And then yeah, was there a shift in delivery online that impacted some groups more than others?

And yeah, as I'm sorting through these, it looks like people are identifying challenges: a problem that maybe some groups experienced greater challenges than others, and then what impacted them. Definitely about-- somebody is asking, is the problem declining enrollment or attrition?

Are folks leaving the state or are they leaving adult ed? Are we not able to serve the students, or do we not have enough teachers to serve a greater need not represented by this data? Someone notes that the number of participants declined by 17% but the number of completers declined by 43%. Interesting, yeah. So what's the problem? That goes back to somebody asking about enrollment or attrition.
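[An aside on why that sticky note is interesting: if participants fell 17% while completers fell 43%, the completion rate itself must have dropped, which points at attrition rather than enrollment alone. A quick sketch of the arithmetic:]

```python
# Completers per participant changed by a factor of (1 - 0.43) / (1 - 0.17).
participants_change = -0.17
completers_change = -0.43
rate_ratio = (1 + completers_change) / (1 + participants_change)
print(f"completion rate changed by {rate_ratio - 1:.1%}")  # about -31%
```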

And then the number of students completing this outcome was low even before the pandemic. And then, however, the pandemic impacted some race/ethnicity groups more than others for this.

Were Asians fearful to come to campus due to the racial threats they were experiencing during COVID? Exactly. So that's great. Let's go ahead and go back to the discussion slide. And keep that bit.ly link handy, or if you're in the board, keep it open, because we'll go back into it.

So we wanted-- once you identify that problem, one of the things that we talk about in the continuous improvement process is to dig into those issues, right? And there's a process for doing this called the Five Whys. I'm just curious, has anybody used this in trying to find solutions to problems they've identified, to really dig into them?

Can't see if any hands are going up. I saw a couple of nods, for those of you who have your cameras on. That's great. Thank you. Thanks, Jenna. So the Five Whys really helps to get you to what's really at the heart of the issue you're discussing. So for example, sometimes what we think is the issue masks the real issue.

We might say students aren't enrolling in IELCE courses because the IET is not of interest to them. They don't want-- they're not interested in that occupation. However, if we look closer and we ask why they aren't interested in that occupation, it might be that we begin to think about, well, there are a lot of women in our classes. Maybe that is a nontraditional occupation for women. Or it might be that they are not aware of what the job is.

Speaker 1: You have entered the waiting room. You are muted.

Blaire Toso: Thank you. Or it might be that they feel that they won't be successful. We've even heard from students who say, well, we were interested, but we didn't feel like we had enough English skills to be successful, or we didn't even know who to ask or where to go for it. And each time you ask a successive why, like, why isn't a student interested?

Oh, well, because we think it's primarily women. Well, why do you think women aren't interested? Well, it's a nontraditional job. Why do you think it's a nontraditional job? Well, there are societal perceptions about what this job is. Keep asking those whys. It helps you dig into it.

And the other piece, when you're digging into these whys, is to take it off of being inherent in the student. We talked about this in the equity walk as well-- a lot of times it's about the system or our programming. So when we're asking those questions, we're digging into, so why aren't they?

And then the next question, as opposed to being about an attribute of the student, might be, like, what is it about our society, or the way our program is structured, that creates that? Why don't they feel comfortable coming and talking to us? As opposed to, well, they just don't have enough English to understand it.

So this is also a technique that's used to get at root causes, which is an important process for us as well. Even if we're not looking for the root cause, even if we're just trying to dig into what we want a solution around, it will help you refine your question, target the problem, and identify solutions that address the issue.

So what we're going to do now-- next slide, please-- is a quick breakout group to practice those whys, because sometimes those whys feel a little funny after you get past the second, when you're like, hmm, what is that next why going to be? But if you practice it, you'll begin to see how it digs in and helps you refine not just the problem, but then the strategy that you might want to employ or test out.

So we're going to go into random breakout groups. We're going to go back to the Jamboard. And then you'll look at your group number and go in and post on one of the Jamboards. They're labeled by group: one, two, three, four, five. Go ahead and use them. And then post your whys there. And Ayanna, can you just quickly share the screen of those slides so people can see what it looks like?

So there, Ayanna will show you. If you click on the top caret to the right, you will see that you'll each have a Jamboard that has the different whys. And just use your post-its: put in your first question, dig into your second question, your third, until the fifth. And see how you are able to refine that question. Are there any questions before we go into the breakout rooms? Excellent. So we'll see you in a breakout room.

Looks like we're all back. And we have some great questions that popped up. And it is interesting to see where people got stuck and where they were able to populate further. So I just wanted to quickly touch base. If you flip through them, does anybody want to share what kinds of questions you came up with? And what new ideas or clarifications did you experience through this process? How did it refine your thinking?

Does anybody from group three-- we can see you're on the screen. You've got a lot of questions.

Speaker 2: Yeah, hi, everyone. This is Dulce.

Blaire Toso: Hi.

Speaker 2: I think for us-- I was part of group three. And one of the things-- the first thing that came to mind, at least for some of us, was pandemic, pandemic, pandemic. Hence our little post-it here, pandemic. Because that's been kind of on the tip of our tongues for all of these enrollments, right?

And so it was hard to say, OK, how do we go from pandemic to another why, right? So that was kind of hard. But then we talked about other whys, including this-- we had already seen that there was an existing drop in K through 12 enrollments historically, even prior to the pandemic.

We talked about external factors that were impacting our students to access their courses, right? So lack of internet, a lot of job loss, food and housing insecurities, all of those pieces. So that's kind of where we ended up going.

But I have to say, it was hard once you hit-- when you throw out a why and answer it with the pandemic, which is the first time we've had to deal with something like this in at least the last 100 years. So it is hard to understand it in the context of a very unique outlier circumstance.

Blaire Toso: Yeah, thanks for pushing on that one a little bit, Dulce. I think that's a really valid point. But also to the point of saying, OK, so this is the pandemic-- it also makes us look at other factors, right? Why has the pandemic hit us so hard? Part of it is where you get to in that third why, about being affected by factors such as digital access or taking care of family.

So you're digging into not just that it's the pandemic, but why. The pandemic didn't just send us home while people had so many other things going on in their lives. What within our system, had it been in place, might have been able to better support adult learners and keep them engaged?

So I think that that is a really interesting example of, yes, this is where we get stuck, and we know the answer, but digging into it helps us identify some places to go with it. Right, we're going to go ahead and move along so that we can make sure we have time. But that's exactly where it's supposed to push. Those kinds of discussions don't necessarily answer the question, but they do push the conversation and make us think a bit more deeply about the issues.

So then we have the continuous improvement process, and we move into the plan. And this is one that you all, I'm sure, are well familiar with: identifying what data you need to collect, what activities you need to do to collect the data, what's your timeline, who needs to be involved, right? And as you do this, you're going to want to continue to explore what intervention you would want to test, if you look at Dulce's example.

And many of you went this route: once you identified digital literacy and technology, people were building up boot camps in response and then figuring out if that was working, getting technology into learners' hands, providing access to broadband. So all of those pieces were things that you all identified. And it would be very interesting to see how you all landed on what you thought the effectiveness of those strategies was.

So next slide, please. We were going to do a chat, but I think we're going to go ahead and continue on. It would have just been that, essential to that planning process is identifying the data that you would like to collect. And then there's the execute, which is where you're really implementing your plan.

And throughout this, you want to make sure you're taking really concrete and copious notes, because part of your data collection is understanding the process that you're going through. You're going to want to stay true to your plan unless, when you're revisiting it midstream, you as a collective are already able to say, this strategy is not working.

And so we need to go back to that identify and plan section. And then you would swerve back. But you would have that mini review in there. And you can do that pretty quickly if you're executing and something isn't working.

You want to make sure that you're engaging your stakeholders. And then even before you get to the review process, you want to begin to see if any patterns or trends are emerging from that process. Next slide.

And then the review is really about, what are you learning? You'll analyze your data and determine whether your intervention or strategy has worked. And that can be simply, yes, let's go forward and continue looking at it and implementing it and maybe turn our eyes to a different issue. Or it might also be a question of, OK, it's working, but does it need to be tweaked? And you'll see that within your data.

And you'll be asking questions about what action steps you want to take and why, and then identifying those next steps. And just because you've hit the review space, that doesn't mean it's over. You may want to test the strategy or intervention again, beginning the cycle there, or you may want to abandon your strategy and start with something anew, or you may just want to say, this is great.

Let's move on to something else and begin that identify, plan, execute, and review process again. Regardless, you really don't want to abandon your work. You'll want to take a look back at what you've done, even if it doesn't work because that's a historical look at what you have been doing and what you may not want to consider again.

And the other piece that you can learn from this is establishing a process for going through these continuous improvement cycles as you identify issues in your work. You can use the same process and procedures that you have already identified if they have worked for you and your team.

And then you'll always want to be continually monitoring your solution, even if you've identified that it works and you're moving on to a new issue. Keep those pieces in place so you can monitor and then make adjustments as something in your consortium or your program changes, whether that's learners, new labor market information, or just new stakeholders.

Does anybody have any questions before we go on to some concrete examples of how this gets applied? Super. We're going to hear from South Bay Consortium for Adult Education first. Then we'll move to East Region.

Can you go back one more sec? Thanks. To the East Region Adult Education Consortium, and then we'll move to Silicon Valley. They've each been looking at a different issue, and they'll share how they worked through it.

If you can, either post your questions in the chat or hold them until the end so that we can make sure everybody has a chance to participate. And then we'll have a discussion time at the end. Thank you. I'm going to hand it over to Ilse.

Ilse Pollet: Thank you. Good afternoon, everyone. And thank you to the WestEd team and CAEP TAP team for inviting us from the South Bay Consortium for Adult Ed to share a little bit about our experience with continuous improvement and using data for continuous improvement. We wanted to touch on two broad areas where this applies in our consortium.

And one is in our day-to-day consortium work as a consortium as a whole. How do we approach data? And how do we commit to using it as a continuous improvement opportunity? And then of course, we're in a three-year planning year. So we also want to share a little bit more about how we have used data in our three-year planning process and how that is really informing and shaping our strategic plan for the next three years.

So I'll start by talking a bit more about, as a consortium overall, how do we look at data? And then my colleague, Rick Abare, will talk about how we approach the three-year planning process and using data in that one. So as a consortium overall, we made a commitment three years ago in our active three-year plan to be a data-informed consortium.

And I was happy to see that what was shared today was a cycle and was not a linear process. So this whole cycle of review, identify, plan, execute, it's an iterative process. And you go back and forth. And you can make small changes or bigger changes and then just always reassess and go back to your-- back to the drawing board. And that's really how it has been for us as a consortium.

And when I look at that circle, that cycle of continuous improvements, I would say as a consortium, we are somewhere in between the review and identify stages of continuous improvement. And that may be surprising because we've been at this for about three years or even more now. But it really was important for us, in order to be able to review our data, to be able to trust our data.

So we actually spend a long time and a good amount of effort in really making sure that we trust our data, that everybody in our consortium understands where the data is coming from, how it flows from the student information system into TE, into NOVA, into LaunchBoard. It's a complex landscape that we're operating in.

We are the only consortium in the state that has two community college districts. So it makes it a little more complicated. And we have five adult schools who don't all use the same student information system. So it was really a puzzle to understand how everything flows and how everything comes together.

And then the other piece of that is making sure that there is uniformity in how we collect and report data across our members so that when we look at data at the consortium level, we are comparing apples to apples. And we're talking about the same things. So we really built our capacity to understand our data, to talk about our data, to develop that data literacy among our members, to even have the conversations.

So we do that in Quarterly Data Study sessions where we look at our enrollment data overall and enrollment data broken out by program area. And that has just been a very helpful process and really engaging to be looking at our data at a consortium level. And then the questions, the whys, come up almost automatically because you'll look at numbers.

You see numbers coming down. And someone will raise their hand and say, oh, why is that? I see this ESL go up here but ESL go down there with another member. So we do have those why questions. In the next three years, I think we are going to move as a consortium into that planning and executing phase and really responding to what we're seeing in the data.

I feel like we have a strong foundation now, as to being able to talk about our data, to trust the data that we're looking at. We have the space in our Quarterly Data Study sessions to take the time to really dig into the data with all of our members.

And so the next step will be for us to not only identify the problems and talk about the whys, but also ask that question as to what are we going to do? Which strategies are we going to develop and employ to address some of these areas?

And the area in which we think we'll see the most activity is in the transitions piece. So I think that's where it really comes down to consortium-level work: to look at, globally, across our nine members, how our students are moving between our schools and our colleges. And the pathways that we developed, are they really working? Are people transitioning where we would expect them to transition?

And if they're not, what kind of support do they need? What kind of programmatic changes can we make to increase those transitions? So that's going to be a big focus in our next three-year plan. So I think that's globally as a consortium where we are. And then I'm going to turn it over to Rick to talk more specifically about how we use data in our three-year planning process.

Rick Abare: Cool. Thank you, Ilse. And echoing my thanks to everyone from the WestEd and CAEP TAP teams for having us. This is a lot of fun, to get to hear from you and from other people about how they're using data for these types of really important initiatives. Ilse mentioned that we do our Quarterly Data Study sessions.

That, in conjunction with using the CAEP fact sheets and LaunchBoard and several other tools, which I'll get into a little bit more in a second, allowed us to almost get meta with our own process and decide how we were going to improve the way that we approach the sharing of data internally and aligning ourselves to the three-year plan.

It also created a really great plan for us to start-- I think it was back in August-- by essentially following the playbook of the requirements of the three-year plan, which was to identify our gaps to start with. So we looked at the fact sheets to make sure we were on top of our regional need, which helps inform those three-year plan activities.

We built-- we call it a CTE matrix. But it's functionally a big list of all the CTE programs that we offer so that we could help identify gaps in conjunction with folks from our local work2future folding in labor market data to see, hey, what do we offer? What's in demand that we don't offer? What's in demand that we do offer but we may need to create a pathway for?

So that type of activity was basically step one. Step two was to leverage the Data Study session space to engage the stakeholders from the rest of our members, because it's one thing if we as a consortium team just sit around and think about all this.

But we need to hear from everybody. We need to get them bought in on why we're thinking what we're thinking, especially when we want them to sign up for activities that they say they're going to do for the next three years.

So we took that space. And we did a few interesting activities that I thought were really valuable to hear back from. So we looked at-- there's a great online mapping interface called the Healthy Places Index that's active in California. It's a zip code by zip code breakdown: an index that they create at the zip code level from eight to ten education and health-type measures, where they rank each of those measures in percentiles.

And then they create an index out of that. That is a great way for us to stay in touch with, hey, we all think we know what the need is like around us. But does that actually match up? How does that match up with actual data?

Following that, we basically just created a map of our region and loaded all of the student zip codes into a density map to see, are we serving the students who we need to be serving? And we tried to spend some time understanding that in the context of the CAEP fact sheets' view on regional need, which is very top-down.
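[A minimal sketch of the kind of zip-code density map Rick describes, assuming a student roster CSV with a "zip" column and a lookup of ZIP centroids; the file names, column names, and map center are hypothetical, and the folium library is just one of several that could do this.]

```python
# Count students per ZIP code and plot markers sized by count.
import pandas as pd
import folium

students = pd.read_csv("students.csv", dtype={"zip": str})        # hypothetical roster
centroids = pd.read_csv("zip_centroids.csv", dtype={"zip": str})  # zip, lat, lon

counts = students.groupby("zip").size().reset_index(name="students")
counts = counts.merge(centroids, on="zip", how="inner")

m = folium.Map(location=[37.3, -121.9], zoom_start=10)  # roughly the South Bay
for _, row in counts.iterrows():
    folium.CircleMarker(
        location=[row["lat"], row["lon"]],
        radius=max(3, row["students"] ** 0.5),  # scale marker size to the count
        popup=f"{row['zip']}: {row['students']} students",
    ).add_to(m)
m.save("student_density.html")
```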

Related to the exercise we sort of just did, our biggest change here in our region has been our ESL enrollment. And we wanted to take a deeper look about halfway through last year as we were-- I'm sorry. I think it was in September, maybe November. We were trying to figure out, well, who's coming back? And how is that different?

And one of the great exercises that we did was to look at-- race/ethnicity is a really tough proxy. We have a lot of folks who are identified by these tools as Asian, but they come from all over the world and have to get lumped into that category. So to try to ameliorate that, we actually looked at language spoken at home, which gave us a better indication, because we have a large Vietnamese population here.

It gave us a better indication of who's returning to school in 2021 and what that even looks like. So that kind of helped us dig a little bit deeper into some of the ways that different groups of people felt more or less comfortable coming back and what we could potentially do about that.
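[A minimal sketch of the home-language breakdown Rick mentions, comparing who returned year over year; the file name, column names, and the two years compared are hypothetical.]

```python
# Year-over-year enrollment by language spoken at home.
import pandas as pd

df = pd.read_csv("enrollment.csv")  # hypothetical columns: student_id, year, home_language
by_lang = (
    df.groupby(["home_language", "year"])["student_id"]
    .nunique()
    .unstack("year")
)
# Percent change between a pre-pandemic year and the return year.
by_lang["pct_change"] = (by_lang[2021] - by_lang[2018]) / by_lang[2018] * 100
print(by_lang.sort_values("pct_change"))
```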

We followed that up by diving deep into LaunchBoard to make sure we were as clear as we can be. And it's still a work in progress on our data definitions: to ensure that, as Ilse talked about, we even know what we're talking about, and to make sure that when we look at data, it's aligned to the way that we're defining these adult ed metrics in LaunchBoard, so that we can populate goal-setting trackers in preparation for this three-year plan.

That's been the most recent work we've done. We've gone around and done one-on-ones with all of our schools, looked at what we have for their last three years, and had individual conversations about what strategies they were comfortable taking on and what goals they wanted to set around the optional metrics that aligned to our plan.

So it has been a journey over the course of the year of taking our Data Study sessions, which were often just-- it was a place for us to look at our enrollment. It was a place for us to discuss data-related issues that we needed to have a consortium-level understanding on and actually use it to do the three-year plan.

But then the next step from that, to improve on this, is to align the Quarterly Data Study sessions to these metrics and monitor progress on this stuff so that we're fully aligned as a consortium with the way that we're reporting, or the way that we're goal-setting, based on our three-year plan. So I think I wanted to do that faster. If I do have one more minute, I did want to just give a shout-out to that LaunchBoard for Equity Analysis session.

That was a really great session. I think there are some really valuable tools that are immediately turnkey to working at a member level. Actually, as part of the Regional Need Analysis for the annual plan two years ago, we went into LaunchBoard, examined outcomes, divided those out by demographics, and looked to see if we could find any equity issues that were very obvious between percentages of groups achieving outcomes.

So I think the tool is really good. And there's a lot of value to that. I think that's pretty much-- did I forget anything, Ilse?

Ilse Pollet: No, I think you covered everything. Thank you, Rick. And yeah, we're looking forward to any questions or comments. And please feel free to contact us if you have any questions about this.

Blaire Toso: Thanks, Ilse and Rick. We should have time at the end for some questions; I know I have questions. So hopefully we'll get to that point. But at this point, can we go ahead and switch over to Ute? And Ute, talk a little bit about your process.

Ute Maschke: Yes, good afternoon. Thank you, WestEd and CAEP, for having me here. I want to share the screen in a moment to share a little bit, I think along the same lines that Ilse and Rick brought up: consortium-wide work, but then also trust building internally for next steps within that concept of continuous improvement.

So first, more consortium-wide: our consortium only has three members, one of which accounts for about 95% of all the programming. And when we moved into the strategic planning process for the next three years, we ran into this issue: how can we ever compare our work across partners when one partner has 95% of the students and the other ones have a handful?

So we needed to find a way to compare, as Ilse already said, apples to apples. And per request, we wanted to simplify it, make it easier to work with without dumbing it down. So we decided together-- and that's what you see on my screen now-- we decided together on the key points we want to measure. And similar to South Bay, transitions is what matters.

How do we measure successful transitions across very different partners? So we created together this spreadsheet as a preliminary first draft of the metrics that we could use when one member has 20 students and the other one has, say, 8,000. We can still measure transitions. We started looking at the data in LaunchBoard. And then compared it to the data we had internally.

And we discovered, just by identifying those descriptors, some discrepancies that allowed us to start a conversation. As you can see in column A, the categories will look familiar to most of you. It's outcomes reports, right? And sometimes even summary reports. We discovered discrepancies that allowed us to have a really insightful conversation, one that also led to that trust building that Ilse was talking about.

So our welding class is one of the most successful ones. We routinely, every single term, have 24 completers. When I look in LaunchBoard, I have five. We then did sort of a backward design. If it's only five in LaunchBoard, what's happening in TOPSpro? What's happening in our internal data tracking system? Where's the hiccup?
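[A minimal sketch of the backward reconciliation Ute describes, checking which internally recorded completers never made it into the export that feeds LaunchBoard; the file and column names are hypothetical stand-ins.]

```python
# Find completers recorded internally but missing from the TOPSpro export.
import pandas as pd

internal = pd.read_csv("internal_completers.csv")  # e.g., the 24 welding completers
exported = pd.read_csv("topspro_export.csv")       # what flows on toward LaunchBoard

missing = set(internal["student_id"]) - set(exported["student_id"])
print(f"{len(missing)} completers never made it into the export")
print(internal[internal["student_id"].isin(missing)])
```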

And we included not just program directors in our data conversations but then also the teachers. And I think, after all these years, it was the first time that a teacher had this aha moment and understood what data entry means and the consequences that it, or the lack thereof, has. So it was a refreshing conversation, because I don't think the teacher had ever connected the dots that way.

This measuring of transitions also, I think for the very first time, allows us to include our transition services in the metrics. There is not really a space yet in our reporting to the state where we can report on our work in transition and student support services. On the colleges' side, there are different mechanisms to do so. But at the K-12 adult schools, we usually don't have a space other than in the update record, those three boxes.

But we don't really have a space where we can report our work in student support services. So we created these descriptors for our transition services, for example, leveraging transition services for co-enrollment with colleges, to finally have a tool to report our work. That is so crucial with the adult learners-- the support system around classes, around training programs.

So we now also have internally a tool that we can use with teachers, with our governing board, to make a case for solid transition services. And I think we also finally have the beginnings of a metric that allows us to make a case to the state that we need to invest in these transition and support services way more than we do. The program outcomes we now measure usually don't include these support services outside of the classroom.

And we have also been working on a better, more insightful, more trusting data dialogue. When we talked about improvements internally, we wanted to measure what happens between orientation and initial pre-testing and start of classes. And we had quite a high number of students dropping out between-- they took the pre-test. But they wouldn't show up for classes. And we wanted to figure out what's going on there.

So what we decided on was to look at scores and placement data, just following the NRS guidelines. And this is color-coded here. And I wanted to share this one for two reasons. Number one, not too surprising, we actually discovered that we had students in our previous classes who didn't belong there.

We also decided, actually, there is no pre-literacy; these are beginning levels. We want to modify and revise our curriculum to better address the needs of those students who test at the so-called lower scores. And just looking at these scores and these data, we decided together to revise our pre-testing and welcoming sessions for students.

And we actually introduced a more extensive welcoming session for our students to help them understand how assessments and pre-testing work. That already has an impact we are measuring now: the attrition rates between testing, starting classes, and participating in classes. And I think we can already see a correlation between a solid introduction to how assessments work and why we do them, and students staying with us, starting classes, staying in classes.
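[A minimal sketch of the attrition measure Ute describes, the share of pre-tested students who never start classes; the file and column names are hypothetical.]

```python
# Attrition between pre-testing and starting classes.
import pandas as pd

pretested = set(pd.read_csv("pretests.csv")["student_id"])
started = set(pd.read_csv("enrollments.csv")["student_id"])

never_started = pretested - started
rate = len(never_started) / len(pretested)
print(f"{rate:.1%} of pre-tested students never started classes")
```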

So these two tools, or conversations about data, connect, because transitions are only possible when the student starts classes. So by aligning a larger conversation across consortium partners with the almost one-on-one conversation with teachers and our proctors, that adds up to improvements for both: for the students, but also curriculum review and the creation of a new welcome session for our students.

This whole conversation came up because we're using LaunchBoard a lot more nowadays than we ever did before. And of course, these discrepancies we saw didn't really sit well with us. So we dug a little bit deeper in our own internal data. So that's the broadest overview. A lot of it mirrors what Ilse and Rick described.

Blaire Toso: Thanks, Ute. Yeah, it's really interesting to begin to see the trends come out. And maybe we can talk a little bit about that afterwards, because it's interesting how this continuous improvement process lends itself to particular pieces, and the pieces that you've pulled out are really important. But before we go to that, it would be great if we can go on to Jenna. Welcome, Jenna. You're on mute.

Jenna Cestone: Hi, everyone. I'm Jenna Cestone from Silicon Valley. I'm going to go ahead and share my screen. Let's see. OK. Here we go. So we did a similar data dive and a cycle of inquiry when we saw similar things to what everybody is talking about, all the speakers who have talked so far, where we saw our high school enrollment dropping significantly.

And this is above and beyond what can be attributed to the pandemic. We just didn't see our students returning. And it wasn't necessarily attributable to any kind of specific demographic group. And when we did our Five Whys and kept asking the teachers and students what they were hearing so we could complete our TOPS report, we would hear similar things to what you all hear.

We've spoken about this at our consortium meeting ad nauseam: no transportation, lack of accessibility to technology, could not participate in the Zoom, did not understand the curriculum, could not get-- the biggest sticking point, and we've talked about this, is the math. The math was too rigorous or it was just not accessible.

And so we kept trying to think, what could we do? Where could we go? How could we figure out another plan? Well, we queried our consortium. And it was varied. Schools were doing different things. One solution was independent study. And that seems to work well for many of our adult schools. Our school in particular does not have independent study. So that wasn't even an option for us.

We also were given a directive by our superintendent to be in-person. We had to be in-person. So that was very troublesome for us, because many of our students kept saying, well, you have a program that's in-person five days a week for three hours a day with mandatory attendance, and I just can't do that. I'm working. I'm living in the gig economy. My schedule is fluctuating.

So from week to week, some folks would be on a night schedule, some would be on a day schedule. So we were constantly changing students around. It just wasn't working for us. So through that, we reached out through CASAS to find an answer to the problem: how could we find a program that would fit our students and our students' needs?

OK, so we polled our students, and we polled our teachers about what they were seeing. And that data suggested, yes, they don't have child care. But we have CalWORKs on site; some students didn't qualify for that. No transportation? We got them passes through our transition program. But even though we're on the VTA line, the transit takes two hours from their location to get to us. It's very laborious, so they miss class anyway.

We don't have independent study. We don't have a laptop rental program anymore. And we really didn't meet the DEI equity requirements. The students who were coming in, we were having a really hard time serving them.

The consortium was very gracious to us, and they actually contracted with an outside person qualified to test. And I understand that Dr. Williams goes around to different adult ed programs. So we're lucky to have him one day a week.

But we really need a lot more for the AWD students, the adults with disabilities, that we're seeing coming through. And in addition, our falling enrollment needed another solution. So what did the students want? We queried them: what do you want? They all want online, hybrid classes. They all want it. But they want to come to class, too. They want access to a real human being. They want access to teachers. They want flexible schedules.

So we came up-- this is the data. It's very clear that all of our ESL folks want five days in-person. ESL was drastically different from our high school diploma folks-- drastically different. And with all of these, you can see: hey, I want a flexible schedule. I want one day a week with my teacher. I want two evenings, on, off. I want to be able to come in the morning and come at night, depending on my work schedule.

So high school diploma was all over the place. The data bore it out. And we said, what can we do? So we queried our students: would you take more classes if the schedules changed? 42% said yes. That's real data from our high school. And so I took that to my superintendent. And we chose the NEDP, the National External Diploma Program, mostly because it's well vetted.

I'm a New York resident-- former New York resident. I know this program well. I said, hey, I'll take this on. It's been around since 1975. It's not that hard. It's very successful, well vetted. Most of us are using CASAS pre- and post-testing anyway. So I started this part of the program and got it through our board here.

It's very good for folks who fit all of the criteria that kept bubbling up organically. Too much anxiety around timed tests, from the pandemic or otherwise. They have work or other obligations. They struggle with higher math. Some of them are non-native English speakers. Some of them just wanted to work independently. Sorry about that.

OK, so we had student testimonials. And teachers could see the relevance of the curriculum. So how do we identify students? What do we actually do? We pre-test them. They need to meet that threshold of 230 in math and 236 in reading as a gatekeeper. In orientation, we have this conversation with them. People are coming to us from all over the world, OK? So in our hour-long orientation, we have this conversation.

How do I finish my high school diploma in California? What do I do? Take the traditional 150-unit diploma in adult ed: maybe you have 30 units or under left, and you just missed a couple of things, like government or economics or English 4 your senior year. Great, the traditional high school diploma is for you. We wouldn't recommend NEDP.

The GED/HiSET class, that's four or five tests. That's the class that teaches and prepares you to take those tests. It's a class that is test-driven. It's very specific. And you could be in that class for years if your one stumbling block is that HiSET math test. We're seeing it. We're seeing people try their best and really stumble in that area.

So they're frustrated. That's another niche piece of data: we asked the HiSET/GED teacher, why are these students languishing? What do they need to graduate? Why have they been in your class for so long? Jenna, they only need the math. They only need the math. I'm like, ugh. If only we had had another program, we would have put them in there first, and they would have been done.

So the NED program, it's virtual. It's all virtual. Students work independently. They come in on-site one day a week, and they do in-office checks with the teacher. The great thing is that if you have 70 units, 80 units, 30-plus units, it doesn't matter. You can go in that program. And you're done in a finite amount of time-- 6 to 12 months. It's finite.

And it's all depending on how much time and effort you put into it. The curriculum-- this is very nitty-gritty-- works very well with IET-ready, transitional, and community college pathways. So their schedule is not locked to this five-days-a-week, three-hours-a-day high school diploma program. They're now available to take a welding class two days a week while they're working on this high school diploma.

They can take medical assisting at night two days a week. And they can work on those Pathways which before, they're like, oh, I can't do it because I have to juggle my schedule and do my high school diploma first. And then I'm going to start my Pathway class. And I'm like, no, you should be in this NED program. It's going to help you. And you'll be able to take this GED. You'll be able to take this IET Trades class that you're interested in for your career.

So that was really our driving force. Well, I said that I would take the lead for all of our consortium. So any school in our consortium that has this kind of student, we would take them on. Because at a lot of the schools, you need to be someone who's not afraid to learn a new technology-- it is a standalone technology. So I think it makes a lot of sense as a consortium to have one school pilot this kind of a program.

That makes a lot of sense because you can see what it would take. And then the teachers would eventually find one or two other teachers in other adult eds. And then slowly it would grow like that. Well, immediately we had 12 or 15 students sign up for this right away. They were like, this is for me. I want this right away. We have a very successful student who is in that class right now.

And I would love to tell you more. I know I'm very limited on time today. But I would love to tell you more. And I can take you actually to the inside of the portal. I can show you through CASAS how to get in. And I can show you the exact work. Because I think lots of times in different consortiums that have heard of NEDP before, the teachers always wanted to see, show me the work. Show me the curriculum. Show me what that looks like.

And now that you have one person in that, I can see exactly what they're doing. And it is rigorous. It's very rigorous. It's very relevant to adult ed. The math is consumer math. And I know some colleges have consumer math. But it makes sense to them. They're learning things like APR and credit cards, how to get a mortgage, how to pay for prescriptions, how to budget, how to do a household budget. So the math is something that's adult-relevant, as opposed to algebraic equations and the Pythagorean theorem and word problems that are very mind-bending for some of our folks.

So I would love to go on. I'm not sure about my time. I could go deep, or we can save it for the end. But that's just it in a nutshell. This is kind of the most critical piece: all of our data kept pointing to something that we just kept missing. And organically, the same things kept coming up. We needed to satisfy a virtual, flexible program for the high school diploma.

We needed something that was well vetted-- this has been around for decades, and it works very well for working people. We needed something that could play well with our IET and the Pathways that we have planned, because we have a lot of trades and a lot of classes on our site in particular. And we needed to satisfy all the other issues that kept coming up.

If some folks want to go slower, great. If you want to go fast, great. And we're there along the way. But any time people want to go on a deep dive with me, I can do that with you. And I can also point you in the direction of Janita and Kay at CASAS. And I'll be happy to give you their emails as well. OK.

Blaire Toso: Super. Thanks so much, Jenna. And to Ilse and Rick and Ute, I really appreciate the offer that you all have extended to your colleagues. Part of the reason why we bring your programs in to present on the work that you're doing is so that other people in the field know both some of the solutions and the things that people wrestle with, but also so they can make those connections.

If you're looking at transitions, reach out to Ilse, Rick, and Ute about some of the work that they're doing, or to Jenna when you're looking at those gaps in high school equivalency and high school diploma attainment, and think about the solutions that they've come to.

So I'd like to go ahead and open it up, if we can move forward to the discussion slide. This is a time for people who have questions for any of our presenters. It's a very quiet group out there today. So I am not naturally quiet. So I'm going to go ahead and ask my question.

It's really interesting to me to hear that this process that you all took, particularly Ilse and Rick, I know you were talking about a three-year process-- that you've really gone on this journey to begin to explore this. And I was curious, how do you find the patience to stick to it for three years and engage your stakeholders in this journey with you?

Ilse Pollet: That's a great question. And I think part of it is us having Rick in his role as data analyst who, I think, challenges us to step on the brakes when we need to and say, OK, we can look at these numbers. But I'm not sure if I can trust this. So let's go back. And let's make sure that we know where these numbers are coming from and that we're talking about the same thing.

So I think that's in part due to Rick's insistence as a data person, making sure that we're looking at things that make sense. And I think our members really want to understand their numbers, too. I mean, we see that desire across the board of members wanting to know what is going on and whether or not the data reflects accurately what is going on at their schools.

So when we look at data, people will point out and say things like, I see-- like Ute was saying, we see five people in LaunchBoard. But we think we have more. So let's go back and see what's going on, right?

So we get those questions a lot from our members that point out discrepancies that we're not able to see at a consortium-level. But our members on the ground see what's going on in the classes and can tell us if what we're looking at makes sense or not.

So I think it just kind of happens organically. And having our Data Study sessions is just a great place to ask these questions and keep coming back to building our data literacy and understanding what's going on.

Blaire Toso: Thank you. Rick, did you want to add to that? I see you're off mute.

Rick Abare: Yeah, I would just say, to add an example, there have been so many times where we've said-- one of the issues initially, for me, was that we have data coming from TOPSpro, but our colleges were not WIOA after 2019-20. And the process for pushing data from their systems of record to TOPSpro was rough, shall we say.

So we couldn't look at that data all next to each other. We couldn't just export summary data and say, well, we actually have more students than that. Well, how many? I don't know. So we had to figure out how to talk about how many students we served in a way that was operationally as consistent as it could be between adult school and college, because the way that those systems function is so different.

So we decided to just start. We wanted to build people's capacity to look at a lot of information at once. You were looking at a graph that has three variables or more on it, where you've broken out enrollment by institution, you're looking at year over year, and you're looking at quarter by quarter. And watching cumulative enrollment and comparing it year over year-- well, then you have a pandemic, and now you can't do that anymore.

But that was a great place to be able to build people's capacity, to look at relatively complex information and get used to seeing it because after we had done that for a few quarters, people started saying, hey, wait, I should have 50 people in Workforce Prep. And you don't have any for me. Why not? And we would say, well, go into your TOPSpro and check how your courses are coded.

Or, I see I have way too many people in ASE. Why? I don't know, but let's find out. And that was where you found out, oh, we realized one of our classes was actually miscoded. We went back and fixed it. So even stuff like that is very, very valuable, because when you hit Submit on that data on July 15, that's what the state thinks you had. Anything you fix after that, it's good for you to know what actually happened on the ground.

But in the eyes of CDE, it doesn't matter. So at least just having those-- that was just kind of a small example of the ways in which people have been able to build their own capacity to look at that stuff and say, hey, this is a good touch-point for me to say, I think something's wrong. So yeah, snaps to, at the very least, getting that done.

Blaire Toso: Thank you. I think that kind of rolls into the next question which is, what has been the most successful strategy you've used in helping faculty, teachers, stakeholders, partners, build trust in the data? And all four of you talked about needing to get people to buy in and trust the data before you could really move forward.

Ute Maschke: I think we are at the level of buy-in. Trust comes and goes. And with all data, right, we are, in ways, suspicious of it. I think for us, there were two observations. Teachers themselves voiced unease with having too many of our students in what they wanted to call pre-literate classes. And we were wondering, why is that the case?

Why, all of a sudden, do we have so many who stay in that class for sometimes half a year? How can that be, when the understanding of foundational literacy is that you prepare students to start their journey as English language learners? So that unease, that discomfort, led to: let's look at scores. Let's look at data with the students you have right now and check.

And the scores are not all there is. But it gave us a baseline to review whom we are working with and how we are working with them. We all, I think, favor a strengths-based, asset-based approach. But sometimes that gets lost in the classroom. So this was a good reminder to review how we're working with students, and that our goal is really transition, right?

And then, if our outcomes will be transitions, let's backward-design what we need to change. And I think it started with the teacher saying, I have too many students. So addressing her need, her unease, helped a little bit.

The other one for us, we started-- I'm sure like many of us, we are looking at EFL gains because that gives us a rather objective, clean look at data. And we can compare. So it's not so much the raw numbers but the percentage. What is our trajectory across the years? And I think even during the pandemic, percentage-wise, it's still a good comparison.
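[A minimal sketch of the percentage view Ute describes, computing an EFL gain rate per year so that years compare on rate rather than raw counts; the file and column names are hypothetical.]

```python
# EFL (Educational Functioning Level) gain rate per program year.
import pandas as pd

df = pd.read_csv("efl_outcomes.csv")  # hypothetical columns: student_id, year, made_efl_gain (0/1)
gains = df.groupby("year")["made_efl_gain"].mean().mul(100).round(1)
print(gains)  # one row per year: percent of students with an EFL level gain
```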

Lastly, we're all looking at the same data at the same time. I think we started to get a little bit of buy-in that this process might just work if we do it continuously. So in a way, the power is in presenting EFL gains on one spreadsheet across a couple of years, so that the comparison is already made for the ones who look at the data. There was a question in the chat about our welcome session.

When we looked at the scores in this pre-literate foundational literacy class, some of this came from some teachers suspecting that, quote unquote, "students don't take the assessment seriously." So we wanted to change that mindset. In those welcome sessions we have, we first introduce, like many of you, I'm sure, what is adult education about? Why are we here? How do we want to support an adult learner in transitioning?

And one of the goals of transition is for us to accelerate their experience in each class so that they start at the right point and move to the next point as effectively and efficiently as possible. So in these welcome sessions, after explaining what adult ed has to offer, we also talk about test-taking culture in the United States, about how assessments work for us, and why we use the tests or assessments we use. And then we show the group how it works, how to go through it.

So we set the stage. We do a little bit of a preview of what the test is like, why we're doing it, and that the point is really to get them out of our class as fast as possible. Taking the time to explain that-- and we do that with interpreters-- makes a huge difference in the willingness of students to stay with us for two or three hours.

Blaire Toso: Thank you. Ilse, Jenna, Rick, did you have anything you wanted to add to the discussion on building trust and data?

Ilse Pollet: Yeah, I do. I think that consistency is important. And I heard Ute talk about that, too. Just keep showing the same visuals quarter after quarter, month after month. And that creates familiarity and comfort in talking about these numbers. So that has really helped in our experience.

And then the other thing is-- and I believe Ute mentioned it as well, for teachers and other staff to see how what they're doing in their day-to-day practice in their classroom ends up in what we report to the state or what we look at as a consortium. I think people generally want to understand, why do I have to check this box in my student information system? And what is the outcome for that? What does that lead to?

And we've had those conversations with faculty-- that happens more at the school level than at the consortium level. But with our team of transition specialists, we absolutely have that conversation regularly: what you do as a transition specialist, how you enter data related to your meetings with students.

That matters. That shows us something. We can learn something from that. And then people generally want to understand how the data flows in the consortium and what their unique role and contribution is in that.

Jenna Cestone: Yeah, I definitely echo that. I do think we live it on a micro-level every day. And our teachers live it on a micro-level. They're dealing with the data of their class every day. But something that the consortium did that was really useful is that when we had our workday, they brought in that guest speaker who talked about global trends.

And I think it's really important to look at those global trends, because they explain a lot about how all industries are affected by migration, or by sectors that live and die and have peaks and valleys. And all of that affects what we do. So that was very useful, to keep looking at that data as well. It's all useful.

Blaire Toso: That's an interesting tactic, personalizing the data to bring people in. The other piece I found interesting, which you've all talked about, is that while you're going through this continuous improvement process and identifying issues, one of the first things you do is address someone else's question, knowing that if you can't answer it, they're going to be stuck on it.

Each of you has described that: somebody raises a question, and you talk about it, even though it may not be exactly the topic at hand. It's a really interesting, fine balance, listening to you all juggle these moving pieces of making data trustworthy through personalizing, globalizing, responding, and demonstrating where each person touches the data. That then rises to the top to explain, or at least inform, the way different issues get positioned.

Jenna, I think you did a great job of answering people's questions in the chat about the NEDP. And I just wanted to ask about one that was posed: is the program designed for out-of-state students?

Jenna Cestone: I'm not sure what they mean. If you're out of state, then you can't be tied to an adult ed program and do in-office checks, because in-office checks are face-to-face. If they're traveling for work, they could do a remote in-office check, but they still have to be tied to a school. And the issuing school for the high school diploma will be Silicon Valley Adult Ed.

So for example, the National External Diploma Program, they will give you that. But then we are the teachers who are tied to that program. So I'm not sure I understand the question.

Blaire Toso: So I think what you're saying is that the NEDP is a national program; however, the administration of it is localized, right?

Jenna Cestone: Yes.

Blaire Toso: So it would not be something that you would do with someone who was not enrolled into the program.

Jenna Cestone: Right.

Blaire Toso: If your program allows out-of-state students, that's great. But there's also the piece of having to come in for a personalized check-in. That's one of the strengths of the program: there are a lot of different ways it keeps people enrolled, engaged, and moving along that very fast timeline.

Jenna Cestone: We had a student from Iran who asked that question: can I go back to Iran and stay enrolled? So it's not quite a standalone, university-style program that you can do entirely on your own. It doesn't function that way. So that's a really good qualifying question.

Blaire Toso: Super. Thank you. We have just a few more minutes. If you have more questions, put them in the chat. But I think we're going to move forward and bring this presentation to a close. So can we go to the next slide, Ayanna?

So here is one of the ways we reflect on learnings. It's also a great prompt when you've presented new information on the continuous improvement process and want people to reflect on it. And again, it's always that interplay of how people interact with the information you're presenting to them.

The way we do this is to ask: What is something we discussed that squared with your experience? What resonates with you? What are three points you want to remember? And then, what's something that's still lingering in your mind? Just because we've presented information doesn't mean it has answered everything; maybe it's prompted new questions.

So normally I would ask people to reflect on their learnings at this point, and then post something in the chat-- either the points you want to remember or the question that's still lingering in your mind about the continuous improvement process or about what people are doing in their consortium and how it relates to you. So let's go ahead and move forward, please.

I just want to point out that we still have some upcoming webinars. We have the second part of our Exploring Equity series. And in June we're rolling out the Adult Education to Workforce Dashboard Tool and talking about how it can help identify gaps and prompt you to explore how your educational offerings align with labor market information-- similar to what Rick and Ilse are already doing, looking at the gaps between what they're offering and what the pathways might be.

I'm going to take this moment to confess that I did not put people's email addresses on the PowerPoint. We will fix that before we send it out. But I wanted to give Ilse, Rick, Ute, and Jenna not just a huge thank-you for presenting and sharing your work-- I think it's always courageous to share your work, because we know, through the continuous improvement process, there's always a place where we can take another look at it.

But your work is so strong and innovative that it's very exciting that you take the time to share it with your colleagues. So please, pop your emails in the chat. And I'll just say, thank you. And thanks again to SCOE TAP and to everyone for joining us today.

Mandilee Gonzalez: All right, well, while they do that, I'll go ahead and close this out. We really want to say thank you to the WestEd team-- Blaire, Jessica, and Ayanna-- for this presentation, along with the guest presenters, Ute, Jenna, Ilse, and Rick. And again, thank you to Mayra and Lindsay for joining us from the Chancellor's Office. We really appreciate everyone who stayed on and participated with us today.

And we'll keep it open for just another minute or so while you grab those email addresses. Also, please take note that we have popped an evaluation link into the chat. It helps inform how we continue to take deeper dives, identify areas where we can improve, and it helps our presenters as well.

So please take a few moments and provide us with that very valuable feedback. We'll also follow up with a direct email giving you that link as well. All right, thank you all, and have a wonderful afternoon.

Blaire Toso: Bye, everyone.