Rick Abare: Great. Thank you very much. Welcome, everyone. Today, we're going to talk about how we do what we call data study sessions at the South Bay Consortium for Adult Education. So bear with me while I make this presentable. A couple of minor housekeeping notes from us. We're happy to take questions on the fly, so raise your hand or go ahead and send questions in the chat.

We do have two spots built in to catch up on any questions that we don't have time to answer on the fly or that we want to take at those points. And hopefully, we'll have a couple of minutes at the end for some overall questions and anything we don't get to. I do want to mention that it wouldn't be education if we weren't making a couple of last-minute tweaks to our presentation right up until game time.

So for anyone who has downloaded the slide deck to follow along, you may see a couple of things here that you don't see there. But after the session, we will update the slide deck in the Google folder. So by the time tomorrow comes around, you should have an updated version. Today, what is this going to look like? So first, we're going to welcome everybody and we're going to do a quick poll.

We've got four questions that are going to give us a better idea of who we're talking to today. Then, we're going to cover the intent of the data study session, why we do this, and then cover some highlights of topics that we address during those meetings that we felt like we wanted to share.

We'll move to talking about where we feel like we've been successful, then where we feel like we have room to grow, and we'll end with how you could do this at your own consortium if it sounds like something you're interested in. So let us introduce ourselves here.

Brenda Flores: Hello, everyone. So my name is Brenda Flores, and I'm a data accountability specialist for South Bay Consortium.

Rick Abare: And I am Rick Abare, a research analyst for the South Bay Consortium.

Ilse Pollet: Good afternoon, everyone. Thank you for joining our session. My name is Ilse Pollet, and I am the consortium director for the South Bay Consortium for Adult Ed.

Brenda Flores: Hello, everyone. So like Rick mentioned to you earlier, we would like to do a quick poll. It's also here just in case it doesn't work, but let's hope that the poll works fine. We'd like to figure out who our audience is and how to help you out. So if you can start by responding to some of these polls, we'd really appreciate it.

I don't see that anybody answered. So if you guys can please help us out.

Mandlee: So it looks like we are at 77% participation.

Rick Abare: Whoo-hoo! All right.

Brenda Flores: OK. It doesn't show it right, OK.

Rick Abare: Yeah, it'll show at the end I think. Yeah, so when you get about a minute in, feel free to kick it over into the next couple and we'll move through this part quickly, Mandlee. Thank you.

Mandlee: OK. Yep. No problem. OK. So we are at--

Rick Abare: I'd say--

Mandlee: Still going 82%. So we're going to go ahead and end--

Rick Abare: Great. Let's go ahead. Yeah, 80%.

Mandlee: I'll share the results with everyone.

Rick Abare: Solid B minus.

[laughter]

So we have a-- great. We're talking to administrators and data and ops folks. That's great.

Brenda Flores: That is [inaudible].

Rick Abare: All right. Let's keep going.

Brenda Flores: OK. What we expected, right? All right. So do we have the second poll or did you put them all together?

Mandlee: It just launched this second.

Brenda Flores: And we [inaudible] are doing [inaudible].

Mandlee: We're moving along quickly. We're upwards of just under 80%, so I'll give it two more seconds.

Rick Abare: Awesome.

Brenda Flores: Thank you, everyone.

Rick Abare: Yeah. Everybody came back from lunch, ready to rock. This is great.

Brenda Flores: Got their energy.

Mandlee: OK. And I'm going to go ahead and end it now.

Brenda Flores: Thank you.

Mandlee: And share.

Rick Abare: Nice distribution there. Lot of 4's and 3's. A couple of 5's. We love it. We will be quizzing you later, so--

Brenda Flores: [laughs] We won't let them do that to you. Just kidding.

[laughter]

Rick Abare: It's a nice distribution that we had there. I like to see that.

Mandlee: And now, we're launching the third.

Rick Abare: And on this one, feel free to select the one that best describes you. If you look at data both quarterly and yearly, go ahead and select quarterly; if you look at it every month and also quarterly, choose monthly. We're trying to get a sense for how often people look at their consortium-level data.

Brenda Flores: And how are we doing?

Mandlee: And-- OK. We're at 80% so I-- oh, looks like someone-- I'm just going to give it two more seconds.

Rick Abare: Sure.

Mandlee: All right. Here we go.

Rick Abare: This is also a good-- it gives everybody in the chat a chance to introduce each other. Quarterly.

Brenda Flores: OK. Looks great.

Rick Abare: All right. We have a winner. And a nice representation for monthly. That's awesome.

Mandlee: OK. And the last poll. There you go.

Brenda Flores: Are we moving along pretty quickly too?

Mandlee: Mm-hmm.

Rick Abare: Yeah.

Brenda Flores: Nice. Thank you, everyone. We appreciate your help with this.

Mandlee: We are at 81%.

Brenda Flores: Yeah, I think we're good.

Mandlee: All right. Let me go ahead and end and launch, or end and share.

Rick Abare: Great. Enrollment outcomes. All right. Well, we should have put more choices. Everybody's looking at good stuff here. So this is very helpful. Thanks, everybody, so much for participating in this polling. It gives us a chance to get to know our crowd a little bit. We often find that when we do our data study sessions, we have a wide representation across faculty, support staff, transition specialists, administrators, and data and tech specialists.

So it's great to know who we're working with today. I'll go ahead and move to the next slide.

Brenda Flores: OK, that's-- you were right.

Rick Abare: Yeah, so off the top, what exactly do we mean by a data study session? Well, that's basically the content of our talk today. But a couple of quick parameters we wanted to give you right off the jump: these are quarterly meetings. They take place after that month's steering committee meeting, they usually run for about two hours, and we host them about a month after that quarterly reporting window closes.

We give ourselves a little more time in November or in September because we want to make sure that we get a chance to look at the whole year. That's our year-end wrap-up. But typically we will look at end-of-quarter data in the month following the reporting close of that quarter. And as I mentioned briefly before, who is invited? Everyone from our consortium is theoretically invited to come.

We have good representation from our steering committee members. We also have good representation from agency data and operations personnel, and as I mentioned, transition specialists and faculty. But really, it's an open session where anybody can come and see what we're doing and ask questions, and as everybody knows with these kinds of things, the more participation you get, the richer the experience everybody has.

So with that, I'm going to kick it to our director Ilse to talk a little bit about where did this all come from and why we do it.

Ilse Pollet: Absolutely. So a bit of the history of how we got here, how did we arrive at quarterly data study sessions in our consortium. And you can move to the next slide, Rick. So as you all know, our consortium work can get complex pretty quickly. In our consortium, we have five adult schools and four community colleges, and we're the only consortium that has two community college districts within its membership.

So you can imagine that data and data systems get pretty complex pretty quickly. And so, when we did our 18-19 three-year planning process, we really noticed a big gap in data collection and data entry and data reporting. Sometimes even the simple foundational questions were unclear, like how many students do we serve. The running joke was, we serve 14,000, 15,000, or 18,000 students depending on who you ask, right?

So we knew that we had to get more consistent and do a better job at collecting our data and reporting it. Same with some of the outcomes or transitions data that was collected. It was sometimes understood differently by different members and thus reported differently. So we knew we had to do that backtracking work of making sure that everybody has a common understanding.

The same with what we call data literacy--understanding what a data point means, how to report it, and how to report it consistently. And then, looking at that data collectively as a consortium, so that we could truly become a data-informed, decision-making body. So as part of the active three-year plan, we decided to invest in our data capacity as a consortium. So we invested in our data team. That's Rick and Brenda.

We also started a data community of practice for data leads at each of our agencies that meets monthly to discuss data-related items. And then the quarterly study sessions are really a place to tie it all together, where we come together as a consortium to look at our data, have conversations about it, try to make meaning out of those numbers, and use that to inform our planning and our program development choices.

So multiple benefits to that, probably more than are listed on this slide, but it has been helpful for member accountability, member effectiveness conversations, planning, annual planning, three-year planning, programmatic decisions, budget allocations when that time comes around, and it has also been beneficial to our members when they go through their own accreditation processes, their WASC visits.

They can refer back to some of the data that we looked at in our consortium data study sessions. So Rick and Brenda are going to take a deeper dive into those benefits in today's session. So I'm going to turn it back over to Rick to tell you a little bit more about what that looks like.

Rick Abare: Thank you, Ilse. And yeah, we're going to hop right into the weeds here pretty quickly. So, again, if anybody does have any questions on the fly, feel free to use your emojis, raise your hand in Zoom, or go ahead and type your questions into chat, and Brenda and Ilse will be kind enough to go Rick, Rick, Rick, stop. One of the things that I like to say is, I tend to talk pretty fast, so please don't hesitate. Don't be shy.

So the first gap that I wanted to cover in terms of what we want these sessions to ameliorate is basically to provide us with a forum to better support data-informed decision-making. It's so hard for this many busy schools to all be on the same page about these things that are consortium-level issues. So we wanted to make space to engage members in their understanding of data, both at their agencies, and also in the bigger picture.

That space, again, creates a place to discuss the methodology behind how that data is collected, and to use consistent language when we talk about certain metrics and other processes when it comes to data collection. And as Ilse mentioned, it's obviously crucial to have space to discuss the most relevant planning and goal-setting ideas. We're jumping into a new three-year plan, and we spent a bunch of time at our last data study session talking about the new required metrics and optional metrics, looking at the CAEP fact sheets, and talking about other goal-setting things.

So having this space, intentionally setting it aside, is a big theme. And it wouldn't be a data study session or a talk about a data study session if we didn't start looking at some graphs. So as Ilse mentioned, we often talked about how much enrollment we even have. So one of the things we wanted to do was to create a consistent visualization that we could look at every quarter that would enable us to communicate in one space what our enrollment is.

Now, a couple of things off the top. This is a complicated graph to look at, and I'm super-proud from a data-literacy standpoint of the group for wrestling with this every quarter and really starting to get their heads around it. Essentially, what we're looking at here is cumulative participant enrollment, so students who made it to 12 hours, and we have that cut in the columns by the agencies. And then, the line represents the cumulative total.

And what this allows us to do is to look across years and across quarters to look for trends, see how we're doing compared to prior years, whose enrollment's going up, whose enrollment's staying flat, and so on and so forth. So the consistency aspect is something I want to highlight here, making sure that we can look at this every quarter. So in November, when we look at quarter 1 for the new year, we will bump FY 2019 up off the list. And then FY 2022, quarter 1 will show up.
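
For anyone who wants to rebuild a similar view, here is a minimal sketch of how the table behind that kind of chart could be assembled with pandas. The column names, the sample numbers, and the assumption that each row holds the participants added per agency per quarter are all illustrative, not our actual export format.

```python
import pandas as pd

# Illustrative quarterly participant counts (students who reached 12 hours),
# one row per agency per quarter. Column names and values are assumptions.
data = pd.DataFrame({
    "fiscal_year":  ["FY2021", "FY2021", "FY2021", "FY2021"],
    "quarter":      ["Q1", "Q2", "Q1", "Q2"],
    "agency":       ["Adult School A", "Adult School A", "College B", "College B"],
    "participants": [350, 120, 410, 150],
})

# One column of bars per agency: participants added in each quarter.
by_quarter = data.pivot_table(
    index=["fiscal_year", "quarter"],
    columns="agency",
    values="participants",
    aggfunc="sum",
    fill_value=0,
).sort_index()

# The line on the chart: running total across each fiscal year.
by_quarter["cumulative_total"] = (
    by_quarter.sum(axis=1).groupby(level="fiscal_year").cumsum()
)

print(by_quarter)
```

Keeping the same table shape every quarter is what makes the "bump the oldest year off, add the newest quarter" comparison easy to repeat.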

So immediately, we'll be able to start to compare that across the previous two years, see how we think we're looking, and then we can get all excited to see what the line does as we go. Some more notes underscoring what we talked about and why this is important. This common touchpoint on changing data and accountability considerations within our own organization is a way that we can ameliorate the data collection gap and make our collection more consistent.

So just having a common understanding of concepts and terminology. I know everybody here has experienced turnover when it comes to consortium participation, both at the agency level and at the steering committee level. So making sure that there's a set of vocabulary that exists, hopefully in perpetuity or until it gets better, means that people can come in--and a lot of people come into CAEP and the adult ed world fresh.

They may not know the landscape, and they have five years of CAEP concepts and terminology to catch up on. It also allows us to solicit questions from the group about their systems for further clarification. What do certain outcomes mean? Why does this number show up here? Hey, they added an I-3 column in the TE CAEP Manager Summary Report--what does that mean?

These are all kinds of things that we give ourselves space to talk about. More specifically, on the changes that we've undergone as a consortium: our colleges stopped participating in WIOA Title II after the 19-20 school year. So that had a huge impact on how we collected enrollment data. We couldn't just look at our TE reports anymore. And so we had to change how we were doing what we were doing.

And in order to do that, we have to have consistent business rules, and we have to have space to communicate those business rules. So that's part of what having this data study session gives us--a space to do all these things. We're not the only ones changing what we do every year. CAEP also changes how they do what they do very frequently. This gives us space to talk about beginning-of-year letter updates.

We know we all look forward to getting those in August or September--hey, what are we changing this year? Program area changes, for example. In the November 2019 data study session, we spent a little while talking about how the CTE Program areas had all been collapsed under one CTE Program area--there's workforce prep in there, there's short-term CTE, and there's pre-apprenticeship.

Though in Nova and other systems, we'd consider those CTE, we're still able to tag our classes with those different program areas. So making sure we understand the implications of those changes is super-important. I'm sure everybody who is here remembers the attempt to collect service hours, and the fact that if we hadn't known we didn't need to do that, we might have had a lot of people wasting a lot of time.

So these are the kinds of things that we make sure that we make space to talk about. Ilse mentioned data capacity. Having us is great, but we also need to address the capacity gap with the systems. Otherwise, we're just sitting around, looking for data to analyze. So mapping our data systems is huge--understanding that the patterns we see are sometimes there because our systems are lined up in a certain way.

And I'll visualize this in a minute, but why we might see certain enrollment patterns is something that we want to be able to talk about. We have to spend time isolating whether an enrollment pattern changed because of the way we're collecting data or because of what's actually happening on the ground. If you don't know that the change is coming from a collection process issue, then you may think you have a narrative about what's happening at your agency or within the region that isn't actually true.

So making sure that we have a common understanding of our whole systems model is super-helpful. We also try to use this to make space to discuss what our priority metrics and research questions are. There's so many fascinating things we want to know about our students. But making sure the steering committee and the consortium staff have a voice in helping us choose which items to tackle year over year is extremely helpful.

This also gives us space to make sure that we're feeding back into the group when it comes to planning and stuff like that. My favorite one is, are we doing a good job collecting blank? This one came up a few times as we looked at the next three-year plan and we tried to decide, hey, is there anywhere in MIS for employment barriers to go? So those are the kinds of questions that we are able to try to tackle in a group setting.

And then, the data team can go back and help try to figure out the solutions. I did mention earlier the ability to give everyone the same picture and make it clear that if we're missing students from a roster for whatever reason, they're going to be missing from our downstream systems. So we always try to, where needed, make visual models to help us understand how the systems connect to each other, and how they talk to each other, like Ilse mentioned.

We do have two community college districts and they both use different management information systems. So we don't even get economy of scale there. And among our five adult schools, we're also using two different student information systems. So being able to give everyone an understanding of why something might be missing here and what implications it might have downstream is critically important.

I think one of my favorite examples of this is, there is often a narrative, if you see adult school enrollment go down and college enrollment go up, that all 100 of those students who weren't in my adult school last year must have gone to the college--and that isn't necessarily true. And the way that you understand whether that's true or not is by having an idea about where those enrollments would show up and actually comparing to see what happens.

So these are the kinds of narratives that matter. It helps create cohesion for people to understand when that kind of thing actually isn't happening. So I bring that example up here. We'll look at some more graphs later that will isolate where people might see that in the data. So we've talked about a bunch of stuff there in terms of the intent of doing this. We did want to give you a little bit of a look into what this actually looks like in practice.

Essentially, we start every meeting with updates from Brenda and me--ongoing project updates, recent community of practice meetings. We look at our quarterly data, so we look at the first quarterly enrollment graph that I showed you, and then we look at another graph that I'm going to show you in a bit which breaks out data by program area. We will then generally talk about one or more of three different topics: any CAEP data or accountability updates, or project updates from the stuff that we're working on this year.

Last year into this year, we're trying to make a lot of headway on making sure that when schools are entering outcomes, they're doing it with a relatively similar theory of action. Otherwise, if we're looking at outcomes and comparing them between agencies, we have numbers we could look at, but they might not really be comparable, and that's extremely critical as we go to try to define effectiveness and all these other ideas.

And often, we'll also do a topic-centered deep dive. We had a great conversation about a year ago about what exactly are bridge courses, and it was fun to have participants give us three completely different definitions on what they felt like a bridge course was, all of them good. So with that, I will-- this is our first built-in, little stretch break. So if anybody does need to get up after lunch, it'll give us a chance to catch up to any questions that we might have.

I see Sherida had a question for us. How many agencies in our consortium and are you direct-funded? Ilse, that may be a good question for you in terms of the funding. I'm not exactly sure.

Brenda Flores: If you can put that in the chat as well.

Rick Abare: I believe we are. OK. Oh, thank you. Ilse did get it. Thank you very much. I was looking for question marks, so I just--

Brenda Flores: Though we could still say it out loud--just in case somebody is not looking at the chat. So it is five adult schools and four community colleges. Those four community colleges are in two college districts, and we are direct-funded.

Rick Abare: Yeah. Thank you for the question, Sherida. And if anybody else has any other questions that they want to hit in the chat, we can take a few minutes here to try to talk about it.

Brenda Flores: All right. I think we should--

Rick Abare: Sure. Yeah.

Brenda Flores: Oh, I saw something come up. Is that you? Let's see.

Rick Abare: Yeah, I see a familiar face, Bonnie, in the chat. Bonnie had a great question about why workforce prep numbers seem to disappear, and part of that has to do with the way that the CTE Program areas were collapsed. The other part of it has to do with the way those courses were tagged. And getting into the minutiae, I could not get a clear answer from CASAS as to why some courses that were in high school equivalency were showing up actually tagged as workforce prep.

The long story short that I got from them was it sounded like we needed to go through every single one of our classes and make sure they were properly tagged. And I think that's something that we could follow up and discuss. But this is a great example. Bonnie just provided us with a great example of feedback that we get back and forth from these data study sessions.

The degree to which we should care, I think that's probably an agency-level question--whether we feel like the courses that we have tagged in certain program areas are being misrepresented as being in other non-related program areas, and we have an example of that that we'll talk about later.

As long as we are still able to accrue outcomes for students in those courses that are related to those program areas, functionally it doesn't necessarily matter, because there isn't differentiated funding for CTE versus workforce prep, at least as far as I can tell. So does that answer your question, Bonnie? Give me a second to reply to the chat. And then I think what I'll do--

Bonnie: It could. It's just, I don't know if-- I mean, we looked-- we compared the data and it was set up the same in 2019 as it was in 2020 and didn't find why it showed-- the numbers showed up under workforce in 2019 and not in 2020. So to answer your question, does it answer my question? I'm still not positive and--

[laughter]

--should I care? That depends on whether it goes under CTE, because a lot of our workforce prep now, we have it as a required elective that the students have to take. So it shows up under high school diploma.

Rick Abare: Mm-hmm. And it--

Bonnie: Anyway.

Rick Abare: Yeah, no, that's a great question. I think we couldn't have asked for a better example of the kinds of stuff that these data study sessions come up with. It also gives us an example of how sometimes we'll try to go answer-hunting and we do not succeed, at least not in a way that's super-duper satisfying. But, Bonnie, I promise that we will at least try to get a better answer than we currently have, which is that they don't necessarily think it matters.

They're not 100% sure why it was doing one thing one year and not another thing another year. Everybody who's worked with CASAS and TOPSpro for a while will probably commiserate with our inability to understand the change management, if I'm being generous. But on the degree to which we care, why don't we circle up with Tracy and maybe have a conversation about that, because I couldn't get a clear answer on that either.

It didn't seem like it was going to affect us in a negative way, as far as I know. And then we had a question from Jodie: how do we approach the variations between TOPS and the adult ed pipeline? We are focused mostly right now on the data that we see coming out of TOPS and aggregating that with the enrollment data we see from our community colleges. The LaunchBoard differentiation is actually the project that we're trying to sew up this year.

As CAEP produced those fact sheets and they had a different enrollment number than we did, it's a great opportunity for us to say, why don't we go close that gap? If you found 700 more students than I did, then they've got to be somewhere. And I think you're asking a greater question for how do we approach the variations.

Some of them are agency-level, and the data study session does provide a great place for that--one of our agency leaders shared with us that a bunch of her high school-equivalency students were not getting counted, and was able to give other agencies a heads-up that they need to double-check those numbers. So we're currently still in the damage-control phase in terms of how different they are. I know everyone will commiserate with me that the 18-month-or-more lag between LaunchBoard and what we're actually doing on the ground is very challenging.

So that's why we're focused mostly on what are the numbers that we see. Those are the numbers we have to plan off. If we trust our systems and we trust our ability to get-- to be relatively close, then those are the numbers we feel like we need to work with. Let's see. Does that make sense, Jodie? Feel free to reply back if you need more clarification. And then Karen asked, when you mentioned your community college numbers, are you specifically counting students who are in CAEP-supported programs? Is that all non-credit classes?

Basically, yes. There are some skills lab courses and continuing ed, like adult learning courses, that we don't count, that are technically non-credit. Skills lab courses are specifically not allowed to be counted. But we comb through the catalog and look for enrollment in any of our non-credit classes that are in the CAEP Program areas. And we make sure we count those students. It's a bit of a manual process because we take the CAEP Manager Reports, and we take the enrollment reports from the community colleges, and we mush them together and make them into that graph that we looked at earlier.

That's also why we choose to use 12 hours or more in terms of looking at our enrollment. Those are the students that we're able to count as participants under CAEP. It's a much more stable number than just the total head count in aggregate, because a lot of students come for just a service or two, and they don't have nearly the organizational impact that higher-hour students do, nor are they being counted on the CAEP side for us.
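
To make the "mush them together" step a little more concrete, here is a rough sketch under assumed file layouts: an adult school participant summary (12+ hours, in the spirit of the TE CAEP Manager report) and a college export of non-credit enrollments with hours. The file names, column names, and the way the 12-hour rule is applied here are assumptions for illustration, not our exact process.

```python
import pandas as pd

# Assumed adult school side: participant counts (12+ hours) already
# summarized per agency, e.g. keyed off the TE CAEP Manager report.
adult_schools = pd.read_csv("adult_school_participants_q4.csv")   # columns: agency, participants

# Assumed college side: one row per student per non-credit course in a
# CAEP program area, with accumulated hours.
college = pd.read_csv("college_noncredit_enrollment_q4.csv")      # columns: agency, student_id, program_area, hours

# Keep only students who reached 12 hours at a given college,
# then count each student once per agency.
college_participants = (
    college.groupby(["agency", "student_id"])["hours"].sum()
    .reset_index()
    .query("hours >= 12")
    .groupby("agency")["student_id"].nunique()
    .rename("participants")
    .reset_index()
)

# Stack both sources into one consortium-level table for the chart.
combined = pd.concat([adult_schools, college_participants], ignore_index=True)
print(combined.sort_values("participants", ascending=False))
```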

So to keep everything as consistent as possible across the different agency levels, we chose to use those business rules. But that's a whole conversation in and of itself. And if you are interested in talking about that further or picking our brains, please feel free to send us an email. Yeah, Jodie, I'm commiserating with you on this challenge.

Brenda Flores: And also, one thing--not everything, because we can't answer all of it. But one thing that we did find has to do with when LaunchBoard takes our data, because once they take it, those are the numbers. So if you go fix anything at your site after the fact, your numbers will be different. So I would only be concerned if it's a difference of hundreds of people and not 10 or 15 or even 20. Again, that's not all the answers, but that is one area that we did find, if that helps at all.

Rick Abare: Thanks, Brenda. So, yeah, and just to complete the answer to Jennae--and I'm sorry if I'm mispronouncing your name--we get our college enrollment directly from our student information systems. So quarterly, colleges submit MIS data automatically. That's where the CAEP office is pulling that stuff for LaunchBoard. So we basically intercede and pull directly. I have a colleague at West Valley Mission, and my position is held by San Jose Evergreen, so I have access there.

So between the two of us, we can get a pretty solid enrollment number, which we're always trying to make sure is as good as possible. For now, I'm going to keep going because we do have some chunk to get through here before the end. So, highlights of what we've covered--and hopefully these will spawn more questions. Those were great questions, everyone. Thank you so much. I really appreciate that.

And again, if you want to talk about this more, I may be able to give you some clues about who to talk to at your college and how to get that data. And I'm very happy to help. We're all a big team here trying to teach-- trying to improve adult education. So please feel free to reach out to us. First, we're going to look at, in this next section, a couple of models that we've used, and then we're going to look at a couple of other graphs.

So this first idea is the data lifecycle. And when I first joined, I needed a way to get my head around adult education, the California Adult Education Program, and all these other ideas. And it occurred to us that we needed a solid underpinning, in words, of the outputs that we're supposed to have. Do we understand the concepts? Do we know what the CAEP Programs are? Do we know what the outcomes mean?

Making sure that we understood, conceptually, how those outcomes appear in each of our data systems: what do those look like in TOPSpro? What do those look like in the community college student information systems? Do we know where to find them? And then, upstream from that, we have ASAP and Aeries, our adult schools' front-end student information systems.

Once we had a conceptual understanding, the idea was that we could do a more technical trace-back and say, OK, well, enrollment data starts here, attendance data starts here, and so on and so forth, and make sure we can follow that along on a conceptual map. This allows us to define terms, identify them in the data systems, and follow them throughout the process from input at the front end all the way to the output at the other end.

So we felt like this has given us a good theoretical underpinning. This was something that we shared early on. We don't necessarily share this at every data study session, but we wanted to give you an idea of our overall theory of action. We also wanted to make sure we are all speaking the same language on transitions, and to give everybody a top-level model of what our transitions even mean.

So this was another resource that we shared and that we've used multiple times to model what transitions mean. Students go from adult basic or ESL at either of the agency levels and move to adult secondary--that's a transition. They may move from there into adult ed or non-credit CTE or for-credit community college. They may also go directly from basic skills at either of those two areas into those same disciplines.

So it's something just to make sure that if anybody said, hey, what's a transition? What am I supposed to count? It's a nice thing to be able to send people a model of. We're going to switch gears and look at a couple more graphs. And then, after we look at those graphs, I'm going to highlight some points from those graphs. So the first one, this is another chunky one to look at so pardon the crunch.

This accompanies the graph we looked at earlier with the bars and the lines. It's great to have total enrollment in our agencies, accumulating quarter to quarter, to see what we end up with at the end of the year. But we also need to look at how program area enrollment is changing over time. So what this graph does is look at a single quarter--we do this every quarter, but we only look at that one quarter. This particular graph is looking at quarter 4 enrollment at our different agencies, year over year, cut by program area.
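
For anyone who wants to reproduce a quarter 4, year-over-year, program-area cut like this, a sketch along these lines could work; the tidy input layout and column names are assumptions rather than our actual export.

```python
import pandas as pd

# Assumed tidy export: one row per agency, fiscal year, quarter, and
# program area, with a participant count. Column names are illustrative.
enrollment = pd.read_csv("participant_enrollment.csv")
# columns: agency, fiscal_year, quarter, program_area, participants

# Keep only quarter 4, then cut year over year by program area.
q4 = enrollment[enrollment["quarter"] == "Q4"]

q4_by_program = q4.pivot_table(
    index=["agency", "program_area"],
    columns="fiscal_year",
    values="participants",
    aggfunc="sum",
    fill_value=0,
)

# Year-over-year change for the two most recent years on file.
years = sorted(q4_by_program.columns)
q4_by_program["change"] = q4_by_program[years[-1]] - q4_by_program[years[-2]]
print(q4_by_program)
```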

And this has been really helpful for multiple reasons. It helps us find out things like, hey, we actually saw an increase in adult basic and adult secondary at one or two of our institutions during COVID--actually, I think just that top one.

It was great for everyone in the data study session to hear that agency respond back and say, we found out that for a lot of our adult basic and secondary learners, it's more flexible to take these classes remotely, whereas we saw big drops in our ESL enrollment because a lot of those communities were disengaging--they either had a harder time crossing the digital literacy gap or they were benefiting more from in-person classes than the adult secondary students were.

So being able to talk about this kind of stuff, both cut by program area and over time, we find to be really helpful. Again, these two visualizations are chunky. They're a lot to look at. So we always try to make sure we give people a few minutes to look at this stuff so that we can have more of a robust conversation, and also that we're using the same format every single time. That's super-critical because the more people look at this, the more at-bats they have, the easier it is for them to jump in and really figure out what's going on.

The next visualization is an example of a one-off topic that we would do. So what we looked at was adult school ethnicity demographic makeup for the last three years. This prompted a great breakout conversation when we looked at the changes in the percentages of our students who fell into certain ethnic categories--like I said, it prompted a great breakout session about why we were seeing this and what was happening on the ground.

And then, after we came back from the breakout session, there was really valuable sharing between agency members. So I went through those pretty fast because I don't want to chew up too much more time. I think we're doing pretty well on time, but I do tend to go over, so we'll see how we do. So some highlights from this idea: we get a chance to talk about those business rules.

So with that program area enrollment slide I showed you, being able to do that in a data study session allows an agency leader to email us afterward and say, hey, I don't see workforce prep on there--close to Bonnie's question--and I think I should. And what we were able to do with him was go back, look at the courses, audit the program area tags on those courses, and fix some stuff that wasn't actually working correctly.

So it's a great feedback loop for improving our practice, both as data sharers and also as agencies. So that was a good one: are we tagging the correct program areas? Are we doing a good job collecting consistent ethnicity data? This is challenging to do at the colleges because, especially for a lot of non-credit, a lot of those questions are not required. And CCCApply can be very challenging. On paper forms, those questions are generally not required.

WIOA schools generally have to collect that kind of data. So to keep things stable, for the sake of the conversation we wanted to have, we chose to just use the adult school ethnicity data. But that is part of the conversation: are we doing a good job collecting that stuff? Do we have 96% of that data? Do we have 75% of that data? What are we looking at? And how does that compare to what we're observing on the ground? That's what enables us to have these informative breakout sessions.
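
As one way to put a number on the "do we have 96% or 75% of that data" question, here is a tiny completeness check; the field names and the particular values treated as missing are assumptions.

```python
import pandas as pd

# Assumed student-level extract with an ethnicity field that may be
# blank or recorded as "Unknown"/"Declined to state".
students = pd.read_csv("student_demographics.csv")   # columns: agency, student_id, ethnicity

missing_like = students["ethnicity"].isna() | students["ethnicity"].isin(
    ["Unknown", "Declined to state"]
)

# Percent of records per agency with a usable ethnicity value.
completeness = (
    (~missing_like)
    .groupby(students["agency"])
    .mean()
    .mul(100)
    .round(1)
    .rename("pct_ethnicity_reported")
)
print(completeness)
```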

Data study sessions also give us, as a data team, the chance to be a resource. In data study sessions, we've been able to uncover projects that the consortium would like us to focus on. We have an in-process, consortium-level data repository where we're trying to aggregate webinars, previous data study sessions, and all these kinds of resources.

We also provide fact sheets for all of our agencies that look at all students, not just at the participant level, so that they can take that to legislation day and say, this is how many students I have on the front, and on the back, this is how many students our consortium has. And so far, it doesn't look like the CAEP ones have put us out of business on that yet, Brenda, but they are pretty good. They're very good.

As Ilse mentioned much earlier, we have also been able to be what I feel is a great resource to our agencies who are going through their WASC visits and their accreditation. Being able to access the data that we've shared in our data study sessions, as well as knowing that we're there to support them if they have ad hoc requests for how they're drawing up their narrative for their CIP plans and all these other things, is very important.

And Ilse just, in the chat, put an example of one of our fact sheets up. That is something that we can throw in our resource folder that should be available at some point. I'm going to spring that on Mandlee. I apologize, Mandlee. A good example was, we-- after one of our data study sessions, we had a conversation with one of our agencies because they ran CAEP Summary Reports with the option to disaggregate HSD and HSE selected.

So when they saw their ASE numbers, they didn't match. And because we were able to have this conversation with them, one, we were able to share this nuance of reporting with everybody else--because if I'm running a report one way and Brenda is running it the other way, not only can we not logically compare them, but we're never going to know what we're talking about. So again, it comes back to having the space to have these conversations.

It also enabled that agency to dig deep and understand that it had accidentally tagged some of its adult basic courses as adult secondary and to go in and fix it. So these are the types of productive conversations that come out of these kinds of things, and I think with that, I'm going to kick it back to Ilse one more time for a little bit more about planning.

Ilse Pollet: All right. So this is where I get to hijack our data study session meetings and take time on the agenda--using that prescheduled, pre-calendared, two-hour meeting every quarter with our entire steering committee and our other stakeholders--to talk about three-year planning and engage members in that three-year planning process. And as you all know, this year the three-year planning process is rooted in data.

So in this first phase that we're in currently, the assessment phase, we have been using our data study sessions to look at demographic data, labor market information, and regional needs data; to look at our program capacity, specifically which CTE programs we already offer, where the bridge programs are, and where the gaps are; and we've also been spending some time on student journey mapping--how students navigate through our systems and onward. We've been able to look at some initial transitions data between our agencies as well.

So that has been very helpful, very informative. For demographic data, we looked at a tool called the Healthy Places Index which is really super-helpful for us to look at, where you can drill down by zip code in each of our cities to look at who lives there, what the educational attainment is, what the barriers are, et cetera, et cetera.

So I'll post the link to that in the chat as well, because especially in a region like Silicon Valley, it may look like everything is high income and high educational attainment, but once you start zooming in, that's where you identify the pockets of poverty, the pockets of inequity and unequal opportunity. So that was really helpful. And then going forward through this planning year, we'll be using our data study sessions to talk about the metrics in the next three-year plan--as you know, some mandatory metrics, some optional metrics.

So we'll have to have some conversations around that with our steering committee, make decisions on which metrics we're going to include in our three-year plan, and then set goals and targets related to those metrics. So this is a natural forum for us to engage in those types of conversations.

After the planning year, once we get to the implementation of the new three-year plan, we will check back in periodically to measure our progress against our goals, and see how we are doing against the targets that we have set out to achieve in our three-year plan. So it's just been a very helpful forum to have those conversations, and it allows us to not add multiple meetings on top of everybody's busy calendars.

So, Rick and Brenda, I'll continue to hijack some of your data study session meetings to talk about three-year planning. We've been at this for about two years now with our quarterly data study sessions, and we've learned things. So I'm going to turn it over to Brenda to talk a little bit about the successes and the challenges that we've experienced.

Brenda Flores: So like Ilse says, we're going to go ahead and summarize some of our successes. One of them is, as you see, consistent, visible consortium-level data sharing, which is what Rick was showing. I see the link there. Thank you, Ilse, for that link. Very helpful. Again, consistent, visible consortium-level data sharing--the fact that agencies can look at each other's data and compare. Like he says, a forum for data ops conversations and best practices.

So again, the questions would come up: why am I different? What am I doing wrong? All of that is a success, because even though it sounds like a problem, we found where some of the issues are, even though we still owe Bonnie an answer on hers. But for the most part, we do get a lot of answers, and then we have a lot of things that we need to keep digging on, which is not a bad thing.

It has improved our agency quarterly submissions because people know that we're watching. We didn't do it for that purpose, and we didn't do it to shame anyone, but they're like, oh, they're going to look at my data. So all of a sudden, everybody starts moving on their data, which I'm sure makes some administrators very happy, that they don't leave it until the end. So that has been a great success.

And again, understanding why our numbers are different, right? Do we need to fix something? Is it something that has to stay different because our site or our agency follows certain rules? Even if we don't get to be consistent with each other, at least we know why we're not consistent, or why they are different, or whether we can make them the same. So it has brought a lot of clarification on that end. There's also sharing of resources between agencies.

Just last week, one of the sites had a new clerical person, and there was nobody there to help her. So they called here and one of our clerks helped. Why? Because of these data study sessions, where everybody sees the numbers and everybody kind of figures out where the numbers go and where they come from. So now, everybody wants to make sure everybody's matching, so they help each other. It does bring a sense of community to the group. The same goes for engagement from agency members.

And the last one is follow-up from agencies and a variety of process improvement initiatives. So, again, what do we fix? What do we do together? If you want to answer--or, Rick, did you see those questions? I'm not sure if you're following them.

Rick Abare: I am. I just saw Annabelle's question come in. Are your adult schools and colleges collecting the same barriers for employment? How did you accomplish this? That is exactly what--

Brenda Flores: A work in progress. It's a work in progress.

Rick Abare: That is a work in progress. They're very clearly defined through the enrollment process at the adult schools. And depending on how students enroll at the colleges, they're less clearly defined, especially the barriers that need to be checked in on every year. So barriers at the colleges--given that those are metrics the state wants us to look at in this new three-year plan--that's a big project for us this year.

I hope to be able to give you a super-awesome answer in less than a year about how we managed to accomplish that. For now, we're at a, hey, we think this is not going to be super-duper easy, we need to start tackling this. And as you know, the community college admissions and records processes can be shenanigans, especially when it comes to the number of different people who have their hands on supporting students getting enrolled.

So understanding that whole process when it comes to our non-credit students, with a lot more clarity, is a huge project for us this year going into our new three-year plan. John mentioned, imagine if we could map our data to census tracts. Part of what we're excited about the CAEP fact sheets doing for us relates to this: before they existed--and we didn't share this today--part of the data team's work was the regional needs analysis, and census data does allow you to get some interesting regional information by census tract.

So if you ever want to talk shop about how we were doing that, please feel free to shoot us an email. Yeah, Annabelle, essentially, yes. Any barriers that are not inferred by program area enrollment are not going to be complete at the college level. We think the adult schools are doing a pretty good job. But again, that's self-reported data, so it's only as good as how good you are at getting students to be honest with that metric. And right now, the college data is probably very incomplete.

Basically, the very first step in this chain is that bad question where you go, OK, how bad is it? How much are we actually missing? Knowing that some of the ELL and low-literacy barriers are inferred based on course enrollment.
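
Here is a small sketch of what inferring those barriers from course enrollment could look like; the program-area-to-barrier mapping is purely illustrative, and the real rules would come from the CAEP and WIOA definitions.

```python
import pandas as pd

# Assumed course-level enrollment with a CAEP program area per record.
enrollment = pd.read_csv("college_noncredit_enrollment.csv")  # columns: student_id, program_area

# Illustrative mapping from program area to an inferred barrier;
# actual mappings are an assumption, not an official crosswalk.
inferred = {
    "ESL": "English language learner",
    "Adult Basic Education": "Low literacy",
}

# Flag each student once per inferred barrier.
barriers = (
    enrollment.assign(barrier=enrollment["program_area"].map(inferred))
    .dropna(subset=["barrier"])
    .drop_duplicates(["student_id", "barrier"])
)
print(barriers["barrier"].value_counts())
```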

Brenda Flores: And I think that brings us down to our room to grow, right?

[interposing voices]

Rick Abare: Yes, this is great. This is a great-- that's a great segue.

Brenda Flores: So a lot of room to grow, right? A lot of your questions are about room to grow. Those are things that we're tackling. But obviously, COVID had a big impact and made us question how we're going to do this data study session. Actually, some people liked it because they could see everything on their screen, right? So I guess even though it's a challenge, some people might have seen that as a success.

Then there's the turnover in administration and clerical staff, just in general, right? We just had turnover everywhere--students, staff, just everywhere. So that is room to grow, and we're hoping that that gets better. Same for the bandwidth for multi-agency projects; because of turnover, it's hard just to get everybody to help, and now that everybody's back, maybe that'll get better. And the same for deep analysis. Let's hope that COVID goes away soon and we can get moving on this.

And just understanding the general practices and how they differ from agency to agency--that is a challenge, but it's also a kind of success, because we are moving forward with it and we are understanding each other better. And we can compare apples to apples instead of apples to oranges. So [inaudible] between challenges and success. And everyone knowing what they enter, who enters it, and how it comes out at the end. The data study sessions do show the end result.

So people now know that, oh, if I enter it here, it comes out there. But it's a challenge because now they have to rework how they work in their office, if that's at all possible. And I know we only have a few minutes left, but I'm handing it off to Rick.

Rick Abare: Thank you, Brenda. Yeah, I love that last bullet point in the context of Annabelle's question [laughs] because it's like, yeah, does everybody know what they need to do? It's one thing if we know, but getting that out to the actual people for whom it's their role is always going to be a challenge. So how could you do something like this? I left this slide here just in case we didn't have any questions and had complete crickets at the end. This is a good opportunity for us to pause for just a minute or two to make sure that we caught everything, and I'll do a little bit of scrolling back through the chat.

We saw Annabelle's question, I saw John's optimistic musing. And, yeah, I think if anybody else has any questions, feel free to unmute. We can try to tackle them quickly in terms of what are we doing with these data study sessions. If not, we'll just sort of move into-- I see Annabelle's hand up again.

Annabelle: I do have a question. How are you measuring enrollment differences between the college and the adult school? That has been a question that is boggling our consortium. Yeah, so--

Rick Abare: That's a great question. I actually was in an earlier session about data, and a lot of it has to do with the fact that enrollment doesn't really mean the same thing at the different levels, right?

Annabelle: Exactly.

Rick Abare: Like, students come in, they're there, they take two tests, they gain an EFL, and they move into another class, whereas enrollment at the college is much more strict. So that's why we chose to look at cumulative enrollment from quarter to quarter. Now, the business rules for how we pick quarters at the colleges are complete shenanigans, which I'm happy to discuss with you offline.

But we also say, we're not even going to look at it until they hit 12 hours, because what that allows you to do is to say, up to this time point, we know we have this many 12-hour enrollments in these program areas. So it's really the best we could do in terms of getting ourselves lined up as well as possible across those two types of enrollment. TE set the tone for that because we have to submit that data quarterly.

And the CAEP Summary Report is functionally cumulative: you keep starting at the beginning of the year and count your enrollment through whatever quarter you're at. And you can basically do the same thing if you have direct access to the college data and they're taking attendance, which they don't always do. So we do the best we can, and we feel pretty good about it. Though, as I mentioned much earlier, when the colleges moved out of WIOA Title II, one of the things we realized was that the college student information systems were not doing a good job exporting their enrollment into TE.

And so the low enrollment that we think we saw at the colleges in 17-18, 18-19, and even in 19-20 is actually an illusion. So it shaped the narrative for a while that, oh, college enrollment went like this and adult school enrollment has been going like this, when, in fact, that's probably not actually what happened. So we always have to be ready to reassess, and one of the things that we're trying to do before our November data study session is to retroactively calculate that enrollment for the colleges directly from the student information systems, with the methodology that we know we trust.
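
For that retroactive recalculation, a sketch like this could apply the same 12-hour participant rule to historical SIS attendance data; the export layout, column names, and quarter cutoff dates are assumptions, and the file is assumed to cover a single fiscal year.

```python
import pandas as pd

# Assumed SIS attendance export for one fiscal year: one row per student
# per class meeting, with a date and hours attended. Layout is illustrative.
attendance = pd.read_csv("college_attendance_history.csv",
                         parse_dates=["meeting_date"])
# columns: student_id, agency, program_area, meeting_date, hours

def participants_through(cutoff: str) -> pd.Series:
    """Count students with 12+ cumulative hours on or before the cutoff date."""
    ytd = attendance[attendance["meeting_date"] <= cutoff]
    hours = ytd.groupby(["agency", "student_id"])["hours"].sum()
    return (hours >= 12).groupby(level="agency").sum()

# Rebuild quarter-end counts for a prior year with the same business rules
# used for the current year (quarter cutoffs here are illustrative).
for label, cutoff in [("Q1", "2019-09-30"), ("Q2", "2019-12-31"),
                      ("Q3", "2020-03-31"), ("Q4", "2020-06-30")]:
    print(label, participants_through(cutoff).to_dict())
```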

Does that make sense?

Annabelle: Yeah, it does, and there's so many different variations, especially at the community college with this proliferation of non-credit courses. When the ESL courses were moving from credit to dual-listed, and some are not credited, it looked like there was a huge jump in non-credit ESL enrollment, when it was just, like you started off the presentation saying, it depends on what you're looking at.

Rick Abare: Right. When what you're allowed to count changes, the numbers are going to change. Yeah. I'm sorry for cutting you off there.

Annabelle: So I think we just ended up, for the adult schools, looking at 12 hours of instruction and trying to do something similar with the college as far as enrollment. But again, it depends on how you're looking at your MIS data. We talk about trying to compare oranges to oranges and apples to apples, and I think that's just not a reality. These are not the same things, and they don't mean the same things in the different systems.

Rick Abare: Yeah, exactly.

Annabelle: That's a hard conversation to have.

Rick Abare: It is, and we're happy to continue having it. So yeah, definitely reach out--if anything we said might give you some clues to help you out, we're happy to help. I see John with his hand up. I think they're going to cut us here in a minute, John, but let's see if we can get you in under the wire.

John: Yeah, my question was, do the fact sheets map student outcome data to census tracts?

Rick Abare: They don't. I mean, that's a great idea--I would love to figure out how to include that in there. They're mostly a tool for the schools to say, hey, here's how many students I'm serving, the challenges they face, and their makeup. So it's a very top-level demographics and enrollment type of look. Mapping to census tracts would be amazing. I would love to think through how to do that.

I would also think you might have a hard time, given there's a massive disproportion in enrollment across different census areas. It would be really rough to do that at the zip code level and feel good about the underlying assumptions when comparing percentages of outcomes achieved. I would have a rough time as a statistician feeling super-good about that, but it'd be fun to map and see what it looked like.

So I think they're going to cut us there. Does that make sense, John?

John: It does, yeah, and that's what I thought they worked as. My thought was, when we're looking at the impact of work in specific subsets of zip codes, it might be interesting to look at how we're pushing into those subregions--

Rick Abare: Absolutely.

John: --and the impact we're having with those communities.

Rick Abare: Absolutely. I think that a lot of that starts with enrollment, and then you could look at how quickly a student is hitting a different benchmark in their pathway. And then, if you notice any differentials there when you're looking at different zip codes, you could start looking at how to ameliorate that on the ground and then let the outcomes take care of themselves. That's probably how I would do it first, but I'm a minimum viable product type of guy.
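
As a sketch of that minimum viable approach--time to a first benchmark, compared across zip codes--something like this could work; the table layout, column names, and the choice of median days are assumptions for illustration.

```python
import pandas as pd

# Assumed student-level table with enrollment date, the date a first
# benchmark (e.g. an EFL gain) was reached, and a home zip code.
students = pd.read_csv("student_progress.csv",
                       parse_dates=["enroll_date", "first_benchmark_date"])
# columns: student_id, zip_code, enroll_date, first_benchmark_date

students["days_to_benchmark"] = (
    students["first_benchmark_date"] - students["enroll_date"]
).dt.days

# Median time to the first benchmark by zip code; students with no
# benchmark date drop out of the count automatically.
by_zip = (
    students.groupby("zip_code")["days_to_benchmark"]
    .agg(students="count", median_days="median")
    .sort_values("median_days")
)

# Zip codes with markedly longer median times would be candidates for
# targeted support on the ground.
print(by_zip)
```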

Brenda Flores: We have a minute. But I think, again, like we said, our emails are on this slide, and Rick will update this slide deck in the folder, so you'll be able to grab those emails and reach out to us.

Mandlee: All right. Well, Rick, did you have a closing statement that you would like to--

Rick Abare: Just to thank everyone for their great questions and for spending some time with us this afternoon. And best of luck to all of you. And enjoy the rest of the convention.

Mandlee: Thank you. Thank you, Rick and Brenda and Ilse, for today's session. I can tell that everyone was thoroughly engaged, and thank you, all, for participating in today's second day of the CAEP Summit. Please take a moment to give feedback at the evaluation link posted in the chat previously. Here, I'll pop it in one more time.

And then, if you are available, join us now in one of our other Zoom rooms for one of our jam sessions or visit the exhibitor booths, chat with somebody in the lounge, and connect with us on your preferred method of social media. So thank you and thank you, all.

Rick Abare: Thanks, Mandlee.

Mandlee: No problem. Bye-bye.