Margaret Patterson: Great. Thank you, Veronica. It's great to be back. This is the third webinar in a series of three that I've been asked to do. And so welcome back to those of you that may have joined for one of the other two. Today, we're going to focus on logic modeling. And logic modeling is not strictly required in strategic planning, but it's definitely beneficial, and I hope that I'll be able to present some evidence to you today to that effect as we go along through our hour today.

So as we get started-- let's see. Here we go. So as we get started today, I have a poll question for you that's going to pop up. And the question is really, how familiar are you with logic models? And we have some options, anywhere from not at all to I would be all set to develop a logic model tomorrow. There's really only four options on here, so you can just ignore number five.

Are you all able to see the poll OK? It's OK.

Veronica Parker: So it looks like we have about 90% participation.

Margaret Patterson: OK. Great. All right. So we have most folks that have had a chance to-- oh, here it comes. OK. All right. I wasn't seeing it. So it's a pretty even spread here, actually. That's good to know. With most folks somewhere in the 2 range, maybe know a little bit, but haven't tried it yet.

OK, and a few folks have tried it. And then others are ready to go. Good to know. So even if you aren't very familiar with logic models, just based on the phrase having the words logic and model in it, in the chat, would you indicate what you would expect a logic model to do? Just based on those two words. Even if you're not real clear about what it is yet.

OK. Systematic approach. Oh, I like that, Emma. That's a good-- yeah. Yeah. An evaluation tool for what worked or did not work. Yes, connecting activities and strategies logically to goals and outcomes. OK. These are really good ideas. Yes.

You're all being very creative. I guess I was-- yeah, roadmap. Yeah. I was leaning even more towards that we would have a model and that it would have some logic to it. But that's the piece about being systematic. That makes total sense. OK. Well, thank you, everybody, for sharing in the chat there. We'll continue to share in the chat as we go along because I hope that keeps everybody engaged in what we're talking about.

So our definit-- by way of overview today, I would like us to look at a definition and some types of logic models. We're not going to do a lot of them because we really want to be able to focus on a few things and do them well in our time together. I will take a look at reviewing the components of logic models as a precursor to the evaluation process.

And we'll talk about a straightforward way to use one in planning. I'm going to share two logic model examples, and we'll also review SMART objectives, which can be useful in developing the outcomes for the logic model. We'll consider how the logic model connects with program evaluation.

And I will offer a few next steps that your consortia or you, as the consortia leader, for those of you who are leaders on the call today, can consider after the webinar. And then I have some additional resources, two URLs that may help you answer questions that you have. So that's our overview for today.

So let's start with the definition idea. So the Kellogg Foundation published a guide to logic models back in 2004, and they described a logic model as a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, in this case, the consortium, the activities you plan, and the changes or results you hope to achieve. I think that's a pretty clear definition of what logic modeling involves.

And I want to call your attention there to the word systematic, which we've talked about before in our previous webinars, and a visual way. Because ultimately, this is a diagram that we're creating, and it's sort of an at-a-glance thing. There are, of course, details in it, but it's basically a diagram that gives the big picture for where you're headed, and in this case, where you're headed with the strategic plan overall. So you'll have a diagram, and from there, you can fill in the details.

So a logic model is also a simplified picture of a program, or in this case, the consortium, that is responsive to a given situation. And I'm getting this from the University of Wisconsin Extension logic modeling guide from 2003. And both of these definitions that I'm giving you are very classic ones. They've been around for some time. But they're something that the evaluation field really looks to in considering logic modeling.

So they also talk about how logical relationships occur among the resources that are invested, the activities that we just referred to earlier, and the benefits or changes that result. And really, the logic model is the core of planning. And so it allows you to represent that as we go along.

Now, you may have heard of something called a theory of change. And the only reason that I'm bringing that up is that some people use those interchangeably, and they are not the same. So a logic model is not quite the same thing, even though people tend to use the terms that way.

A theory of change is a depiction of how and why a desired change is expected to happen in a particular context, whereas the logic model explains the logic behind the program and its results. So they have different emphases in what they're trying to have happen.

So today-- as I said earlier, we've got many different types of logic models. But today, I would like us to look at three of them as examples. We're going to start with Innovation Network, Incorporated as a very simple, straightforward example just to get the thought process out there. And then we're going to look at two adult education examples as we go along.

But first, I want to call your attention to the diagram in the graphic here. And in the chat, if you would, as you look at this diagram, you probably notice that there's some arrows in there. So what do you notice about the direction of the arrows in the graphic?

OK. Thank you, Mitch. OK. So you need outcomes. Yes. They're all pointing toward the outcomes. They do follow in a logical sequence. OK. Yes, it's counterclockwise. It's kind of going in a circle, right? Yeah. Yes. Cyclical. OK. Yes. So they're circling back to the goal. Exactly. Yes. These are all spot on answers. They're all very different, but they're heading in the right direction definitely. OK.

So here's the example from Innovation Network, Incorporated. And I will have a link for that at the end for you if you're curious to look at it more. But this is an example related to home buying. And it starts at the top, up here in the problem statement in the aqua color. And it's basically saying that the person doesn't own a home, and so they're not getting the benefits of that. So that's the problem.

And the goal, in the orange, would be to increase financial independence and security through owning a home. OK, great. So we kind of know where we're headed with that. That's helpful. From there, we continue down to the yellow boxes. And here, they're talking about the rationales and assumptions related to home ownership and what that person expects might happen as a result.

And from there, we go to some resources that the person is aware of. For example, they already know that they've got a steady income. They know about the neighborhoods where they might want to look for a house. They've identified a real estate agent and so forth. And they've actually dedicated 12 Wednesday evenings to go out and do some looking. So that's all the information under resources in the pink boxes there.

Moving along further to the right and the activity groups, we see that the person has done some research. They've gone ahead and done some financial preparation to get ready, and they're looking for houses. And they're getting ready to make a purchase.

As far as outputs go, they're getting everything organized. They're looking at those neighborhoods. They've actually done the 12 weekly Wednesday night sessions to go out and look at houses. And they've got the realtor ready to go. And they've now got an offer accepted. So those are some of the short-term outputs that are coming out of this whole process.

When we move further to the right, now we're starting at the bottom and kind of working our way up to the top. So the short-term outcomes that are there, they found out a whole lot more about the neighborhoods. They've got a lot more information on the home buying process.

And those of you that have been through it know it definitely is a process. And they know a whole lot more about outcomes-- or excuse me, options for housing that are out there. So a lot of these, as you can see, have to do with knowledge that the person has gained in the short term.

Moving up to intermediate-term outcomes, some changes have happened in behavior. So the person now has increased their savings, their credit rating's better, and they actually are a home owner. And then moving up even further, we end up at the long-term outcomes where the person has greater financial security and wealth and so forth.

And all of these things ultimately feed back with a dashed arrow leading off to the left there, indicating that yes, I've increased my financial independence and home ownership. So it is kind of cyclical, in that sense, in this particular logic model.

So I do have a question for you in the chat, and that is, how do the long-term outcomes at the end feed back into the home buyer's original goal? What do you think?

Right. OK. So they've increased their financial independence because they saved some money. Right. Yes. Good point, Emma, the vision and change as a result of their actions. Right. So things have happened towards their goal.

And Guillermo, it looks like you might have been trying to start something, but it didn't quite come through. Yes. Increased wealth and net worth. Right. And that could presumably continue if the person decides to continue the home buying process later on or learning more about it. Definitely. Great. Good ideas.

All right. So in that example, we looked at how the individual benefited, the person that was buying a house. But if we extend that to the consortium-- oh, thank you, Guillermo, for trying it again. OK. So yes. The person accomplished their goal in a logic-- yeah. They did. They went about it in a logical, systematic fashion and were able to accomplish their goal. Definitely.

So if we look at it from the benefits that the consortium may experience, it's very likely that as a result of having a logic model, some folks who can use it will be not only the consortium members that are listed here, but program staff and administrators. Students may also benefit from this even. They may not see it. It may not be presented to them, although it could be.

But they can definitely benefit from that systematic approach. And then also, staff, strategic planners, and even evaluators can benefit from just knowing how it's organized and structured in planning ahead. So it's all going to continue to be helpful.

All right. So now, let's go ahead and look at the University of Wisconsin Extension model, and you can see this in the graphic on the right half of the slide. This is a very classic model, and this is the one that we'll be spending the most time with today because it's so commonly used. The typical components that it has in it are some assumptions and external factors. And you'll see those at the bottom of the graphic in blue.

And they basically lay the foundation for the logic model itself, some things that the logic model rests on, if you will. And then from there, we go to the left, and we look at the blue and aqua arrows for the situation and priorities. And from there, we look at inputs.

We have outputs in the green columns, which involve activities and participation. And then we have three types of outcomes, which are short-term, intermediate, and long-term impacts that we'll be taking a look at. So let's look at that in a little bit more detail.

So we'll start with assumptions and external factors. So thinking in terms of strategic planning for the consortium, we would include here the assumptions about your consortium and any external factors that affect its work. So assumptions are ideas and beliefs that you hold, as a consortium, to be true, things that are happening internally. I think that's an important distinction to make between the two in defining them.

So assumptions are internal. External factors are external. Both will ultimately affect the outcomes in some way. So an example might be the starting condition of participants is considered to be x, whatever that looks like, whatever there is about the participants that we need to know. Or the program will have access to certain resources. That's important to know too.

External factors are events, conditions, activities, or situations that are really beyond your control as a consortium, but that might support or impede the outcomes. So for example, the program might be a first effort in a community to do something, and it might be looking at a particular situation.

Or there might be a policy that is really outside the consortium's control, but that policy can make a really big difference in ultimately what happens to the outcomes. And we'll be looking at some examples of that in just a little bit. So I have a chat question for you. What are some assumptions that you might make within your consortium?

So that's this internal piece. With respect to your strategic planning. So some things that you know about the consortium that could ultimately affect outcomes.

Yes. OK. So you may not have a lot of financial resources. Great point. Yeah. And it may not be consistent, or it might be consistent. That would be good too. OK. Great point. So by looking at geography, you've said, it's rural, it's isolated. We may not have all the partnerships we need.

Yes. The pandemic is still with us, at least for the short term. Yeah. These are great examples of assumptions that you need to be aware of. Super.

So once we've established the assumptions and the external factors, we identify the situation and the priorities. In short, this is our big picture of where we're headed. So the situation includes the problem statement, and we refer to that in our example about the home buying situation. But it's the needs of the situation that we're planning to improve in the strategic plan and who's involved in that.

Along with describing the situation, we also focus on the priorities of it, what outcomes we can actually realize in the context of the mission, the requirements and so forth. So we're beginning with the end in mind.

Then we go on to inputs and outputs. And inputs are what we have to put into the consortium to help it meet its goals and realize its outcomes. So if you look at the gold column there on the far left, it's what you're investing. So it's things like staff, time, so human resources, time resources, financial resources, whatever you have that can be invested to make this plan work.

And the activities are what we do with those investments to support the situation and address the need. Participation is who we reach with the activities. And that could be staff or schools or partners in the community and then whatever occurs immediately. And those are in the green boxes in your graphic.

And then moving on to outcomes, we have three columns here because we've got short-term, intermediate-term, or medium-term if you want to call it that, either way, and long-term. So in the short term, we might consider what our participants have learned or gained. So if you recall the home buying example, that person gained some knowledge as a short-term outcome.

For intermediate outcomes, we look at actions that result from activities and participation, such as a change in behavior or a practice or some type of shift in policy or decision within the consortium. So over the long term, we look at changes in conditions, whether social, economic, or otherwise. So there's some time differences, but there's also some other differences.

So I have a chat question for you that's coming up here. So short-term, intermediate, long-term outcomes sound like they simply change across time, as I just pointed to. But what other potential change do you see occurring in these descriptors of outcomes besides time?

Yes. They're all part of the impact. That's true. This is a tough question. I know. Yes. Flexibility for unforeseen push-pull factors. Yes.

Right. And Kathy, I think you're making a really valid point here. The short term can lead to changes in the medium and then ultimately, the long-term impact. Yes. And so as we get at those social, environmental, and economic conditions changes there, it's a bit broader, isn't it?

And it's also-- in some cases, it's also in more depth. There's a more intense level of change occurring here. So it's not just more of the same, like we'll do this the first year, we'll do this the second year, we'll do that the third year. So it's not just time-bound. But the first year may involve changes in knowledge, and then medium-term, we're going to make changes in action or policies, and then ultimately those conditions. Right. Good.

So then we put it all together. And then this is the big picture of what the logic model looks like. So we took all of those parts and pieces, and then this is how it will come together. And it not only informs the strategic plan itself, but any evaluation that we might plan to do later.
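
For readers who find it helpful to see the assembled pieces spelled out in one place, here is a minimal sketch of the logic model captured as plain data, mirroring the University of Wisconsin Extension components described above. Every entry in it is a hypothetical placeholder for illustration, not taken from any actual consortium plan.

```python
# Minimal sketch of a logic model captured as plain data, mirroring the
# University of Wisconsin Extension components discussed in the webinar.
# Every entry below is a hypothetical placeholder, not a real consortium's plan.

from dataclasses import dataclass, field


@dataclass
class LogicModel:
    situation: str                                                   # problem statement and priorities
    assumptions: list[str] = field(default_factory=list)             # internal beliefs held to be true
    external_factors: list[str] = field(default_factory=list)        # conditions outside the consortium's control
    inputs: list[str] = field(default_factory=list)                  # what is invested (staff, time, money)
    activities: list[str] = field(default_factory=list)              # what is done with the inputs
    participation: list[str] = field(default_factory=list)           # who is reached
    short_term_outcomes: list[str] = field(default_factory=list)     # changes in knowledge or learning
    intermediate_outcomes: list[str] = field(default_factory=list)   # changes in behavior, practice, or policy
    long_term_outcomes: list[str] = field(default_factory=list)      # changes in conditions


# Hypothetical example, loosely patterned on the CTE discussion in the webinar.
model = LogicModel(
    situation="Short-term CTE enrollment is below target across the consortium",
    assumptions=["Teachers are trained to deliver the instruction"],
    external_factors=["Local unemployment rate", "State policy changes"],
    inputs=["Consortium members", "Funding", "Current course offerings"],
    activities=["Review current offerings and identify gaps"],
    participation=["Students", "Program staff", "Employers"],
    short_term_outcomes=["Enrollment increases 5% over baseline, per TOPS Pro"],
    intermediate_outcomes=["More participants earn industry certifications"],
    long_term_outcomes=["Improved employment and earnings in the region"],
)
```

Keeping the model in a structured form like this is optional; a table in Word or a PowerPoint diagram, as discussed later in the session, works just as well. The point is only that each component is a short list you can fill in and revisit.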

All right. So I promised we would look at some adult ed examples, and it's time to do that. All right. I realize the print might be a little bit small here, but we'll make our way through it. So this particular one is an example from the Riverside About Students Consortium for adult education. And it's from their 2019 strategic plan, which I imagine they're still working on at this point.

And they developed a logic model for a goal that they have to strengthen industry and community partnerships and short-term CTE programs. And so the focus here is on strengthening things within the context of career and technical education. So on the left, we have our inputs.

And notice that below each of these descriptors, these column headers, they've talked specifically about what they're doing in each part. So that might be helpful to think through. So they've got their inputs. Obviously, their consortium members are an important resource. And they've got some information on what they're offering and projecting to offer in terms of courses. And then the information about the current industries as well.

And then under their activities, they list their strategy. And they have three asterisks here that get at what they're proposing to do in the short term related to reviewing things and identifying gaps and so forth in order to start to meet this need. So that's what they've decided to accomplish.

And the outputs, what they're expecting to get out of it immediately, is they'll end up with lists of the courses and certificates that they can do that they currently have and some things coming up as well. So they've thought through all of that. And then they have a short-term outcome, and it's looking at increasing short-term CTE and contextualized course enrollment consortium-wide by 5%, basically from 2019 to 2020, as measured by TOPS Pro data.

So they've been able to be very specific in terms of what they think they'll gain in the short term. So that's an example that we can look at. I do have another chat question for you here. What do you notice about how the outputs in this section relate back to the activities? So we've got three outputs, and how does that relate back to the activities that they talked about?

OK. So you're seeing a direct correlation. If they had wanted to, they could even put arrows from, say, this asterisk here to one of the outputs. A lot of times, people do that, and that's perfectly fine if it makes sense that way to do so. I think here, they were basically using a table to keep it simplified, and that's fine. OK. Right. Directly related intangible output. Good point.

OK. So those are some good insights as to what's going on. So part of the continuous improvement planning process, as many of you experienced very recently, is keeping objectives SMART. And by that, we're referring to Specific, Measurable, Achievable, Relevant, and Timely, within a particular time frame.

So I thought it might be a good idea to just refresh our memory here, as the SMART acronym can apply to strategic planning as well. So for example, when looking at our logic model outcomes, we want them to be SMART. We want them to follow this acronym.

And in the Riverside example, we read earlier that their short-term outcome was to increase the enrollment consortium-wide by 5% by 2020, as compared to the 2019 baseline and measured in TOPS Pro. So we can ask ourselves, well, is that Specific, Measurable, Achievable, Relevant, and Timely?

And we can use the SMART objectives to address the needs we've identified in the strategic planning process. And then we'll have some evidence of how we plan to improve integration of services and transitions in the consortium and some smart ways of determining whether we're boosting effectiveness of our services.
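
To make the "measurable" piece concrete, here is a small sketch of how a percentage-increase objective like Riverside's could be checked against a baseline. The enrollment numbers are made up for illustration; they are not the consortium's actual TOPS Pro figures.

```python
# Small sketch of checking a percentage-increase objective like the one in the
# Riverside example. The baseline and actual enrollments below are hypothetical,
# not real TOPS Pro data.

def enrollment_target(baseline: int, percent_increase: float) -> int:
    """Enrollment needed to meet a percentage-increase objective."""
    return round(baseline * (1 + percent_increase / 100))


def objective_met(baseline: int, actual: int, percent_increase: float) -> bool:
    """True if the measured enrollment meets or exceeds the target."""
    return actual >= enrollment_target(baseline, percent_increase)


baseline_2019 = 400   # hypothetical consortium-wide short-term CTE enrollment
actual_2020 = 425     # hypothetical enrollment measured one year later

target = enrollment_target(baseline_2019, 5)   # 420 with these numbers
print(f"Target: {target}, actual: {actual_2020}, met: {objective_met(baseline_2019, actual_2020, 5)}")
```

Writing the objective this way forces the Specific, Measurable, and Timely pieces out into the open: a named measure, a baseline, a percentage, and a date.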

So I have another example here, and we're going to be looking at the full version of this in just a moment. That will be our second example. But imagine a situation where there's an intermediate outcome for an IET program with respect to employment. And this particular outcome says that more participants advance in jobs.

So if we look at that, that's not really particularly SMART. It's kind of vague, right? It's a great outcome, but it's a little bit tough to measure in terms of what's going on. So my question for you is, how SMART is this revision that I've suggested here?

And you can see it there in red and in quotes. But how SMART is this revision to the intermediate outcome? And would you revise anything? Would you add anything? Try to make it a little bit more specific and measurable and so forth. So what do you all think?

OK. So it's more specific. It's more measurable. It adds a time frame. Does it seem to be relevant to the ultimate goal of increasing employment? OK. So maybe specify-- yeah. OK, so maybe not all 100 participants. We might say 80 out of 100 or something like that. OK.

OK. Right. And we may not have direct control over who gets the job. I mean, that's something important to consider when you're coming up with this wording because we're not the employer. We can't hire people. But we could say at least that they've got the certification that the employers require.

OK, Kathy would like to know more about how many companies actually need the welders? OK. Yeah, and again, that's outside of our control as well. All right. So these are good points. Good things to think through. Right. Which specific certificate they need. That's great too.

So we can tweak this. As we think this through, we can figure out how to make it as specific and as ready to go as possible. Great examples.

So I referred to this example in what we just looked at, but this is the second adult education example. And it's from the NRS website. And I'll have a link to that at the end here. The identified situation is that employment outcomes in IET programming are below an expected target.

So here, they've listed the problem statement, and they also have a goal. And the goal, in this case, is what the solution would be to the problem. And they're saying that they need to increase the number of hours participants attend so that they will obtain employment when they leave the program. So that's giving us a little bit more information than we had in the chat question that we just looked at.

All right. So we've got some assumptions and external factors here. And again, the distinction is what's happening sort of internally versus externally. We've got things that are outside of our control, such as the unemployment rate in the community or the availability of those employers, the things that we were just listing in the chat.

So another chat question for us is, knowing this situation, are there any assumptions or external factors that we could add to this list beyond what's here?

Thank you, Veronica, for putting that up. Yes, cost. Absolutely.

Persistence in the program, and I'm guessing that you mean the students being able to persist. And I don't know, is that a challenge? Or what would be the assumption about persistence that you would be making?

OK. Yeah. Are they jobs that the participants actually want? OK. Yes. Oh, OK. So the amount of participants that actually finish the program. OK. Yeah. There might be definitely some assumptions there, and you've probably got a history of that that you have a sense of how that works. OK.

All right. Then we can move on to inputs. And you can see the inputs that are indicated there in this particular example. So we've got the funding. A great example of a resource is that the teachers are trained in IET instruction. You've got the curriculum all set to go.

You've got somebody to provide the IET instruction. And I believe by that, they probably mean the occupational piece of it rather than the basic skills piece or English piece. But all of it matters. You've got to have all of those providers. And then the employers. And this particular program is saying, as far as activities, that they want to do some outreach for IET to the students.

They're going to be looking at the classes that are offered and so forth. Got a lot of different things here that they've indicated they're going to be doing as far as activities. And then the participants are students, employers, the teachers, and the program staff, as we would expect. So that's helpful information to have.

And then moving on from here, we see the outcomes. And we already talked about this first bullet under the intermediate outcomes, that more participants would advance in jobs. And we tried to make that SMARTer as we went along. But let's take a look at one of the short-term outcomes.

I have a chat question again for you. The short-term outcome on higher skill gains among students. That's our very first bullet point there. How could we change that to make it SMARTer, to get it more Specific, Measurable, Achievable, Relevant, and Timely? What do you think? It's kind of vague.

OK. All right. So we could say that we're going to put a percentage in there for course completion. That would get us started towards demonstrating higher skill gains. OK. We could look at a specific type, like the EL Civics and the connection with IET. That's a good example.

Yes, and I totally agree, John. We want to get some numbers in there. And yes, Jasmine, add a timeline. How long is this going to take us? Those are all very important. You're all headed in the right direction here. That's great.

OK. Great. Yes. So completed by a certain time. So if we have that time in there-- and definitely define what the skill gain will be. Those are important as well. I think we've got some good ideas here. I'm trying to get you to just think through the process as we go along. That's great.

So then we put it all together, and this is what it looks like. I think you can see where this could be done pretty easily in Word or perhaps a PowerPoint slide if you wanted to so that it would be easy to express as a diagram, as a single diagram.

All right. So connecting the logic model with program evaluation. I think this is a point that I want to make before we get to our wrap-up. The logic model is really the first step in evaluation. It helps determine when and what to evaluate so that evaluation resources are used effectively and efficiently.

Through evaluation, we test and verify the reality of the program's theory, how we believe the program will work. A logic model helps us focus on appropriate process and outcome measures. And that's coming from the University of Wisconsin Extension guide from 2003.

So if we went back to our example on slide 20, we could see where the logic model is identifying the needs that we've come up with and kind of framing the evaluation. But it's also informing questions. So I'm going to take just a real quick look back to here, I believe. Yeah.

So for example, under activities, we said that we were going to do some outreach to students. And of course, we have students as participants and most likely staff participating in that outreach effort. And so one of our evaluation questions then could be, to what extent did the outreach to students occur as planned, and who was involved in that implementation?

In other words, how did it go? How did it work? What did it look like? Those are some questions that you could ask. You can ask at the end, which would be summative evaluation, or you could even say-- this is a three-year process. So if this happened in the beginning, how did things go formatively, as we were going along, and how did that look?

Because it might inform what you do with the rest of the plan and say, oh, OK. We didn't quite get where we wanted to go. I saw in the chat earlier that somebody asked-- or somebody said, we were on target, and then the pandemic hit us. So we didn't quite get where we wanted to be.

How could we regroup and then maybe do that from there? Emma, it looks like you have a question. I'm not sure if it's about this or if it's about where we were previously, but I'll pause there so that we can--

Emma Diaz: It's actually about the slide where you were after this. I think it was that first bullet that I thought actually triggered something in my head. The next one, actually.

Margaret Patterson: This one?

Emma Diaz: Yes. When you read that very top bullet where it says a logic model is the first step in evaluation. So in my head right now, looking at it through a director's lens, I'm a little confused, and I thought this is a good time to ask my question, which is: you've used the term evaluation, and evaluation is kind of looking at the past, at what worked, what didn't, and so forth, a past event. And we're asked to write a three-year plan for the next three years, which is a future forecast.

So looking at the example that you gave about the About Students consortium, is it that we create a logic model for something like they did with CTE and that you hold onto it, in a sense, for the three-year period? And so now that they're going into a new three-year phase, that you look back at it to see if those things were done? Is it that you're using it both as looking towards the past of evaluation, but using it as a future forecasting model as well?

Margaret Patterson: It could be both. I think I was more forward-looking, more at the future piece of your question so that if we said, well, this is what we're proposing to do, and so if we plan an evaluation either part of the way through the strategic plan or at the end to say, well, gee, did it work? Because presumably, there'll be a next three-year cycle. I don't want to get anybody freaked out.

But looking ahead to the next three-year cycle, what do we still need to do, and did what we did last time work? You could even go back to your 2019 to '22 strategic plan and say, if these things are related to what we want to work on coming up, did it work? Were we able to figure out how it worked, and was it successful, and did we get the outcomes that we expected? And now, how does that inform what we're planning for '23, '24, and so on?

Emma Diaz: So then would you say that we're taking the evaluation component of it, but also using it as a forecast model, and the gap in the middle is still kind of the areas that need attention?

Margaret Patterson: Yes. Yes. I think you could do it that way too. Yeah.

Emma Diaz: OK.

[interposing voices]

Emma Diaz: I just wanted clarity on some of the words.

Margaret Patterson: Yeah. Yeah. It does not have to be at the end. It can be, and it often is, but you can evaluate as you go along, and that's actually a wise thing to do.

Emma Diaz: OK, thank you.

Margaret Patterson: Right. Yes. So Wendy's saying, and isn't evaluation something we do all the time as part of continuous quality improvement? And ideally, yes. Yes. But yeah. And I realize that depends on resources as well. So I'm really glad you brought this up because that might have been something that was rolling around in other people's minds.

Great. All right. So let's go to our wrap-up here. All right. So some next steps that you can consider. So first is selecting a template for the logic model and the elements that will work for your consortium. And I know some of the wording has been slightly different, but the elements are basically there.

So some advice from the NRS website is identify a topic or problem you want to address and the goal you have for solving it. Think through your assumptions and also those external factors to the extent that you can. And then determine the short-term, intermediate, and long-term outcomes that you want to achieve.

And again, we want to keep that as measurable as possible. We definitely want to take a little bit of time to just think about those inputs and outputs that are needed to get to the outcomes and to reach the goal. And again, we want to keep those SMART.

And then review the logic model that you think might work in terms of this evaluation process that we were just talking about to ensure that the relationships look clear to you. There might be some gaps in resources, or you might come up with some additional activities that you hadn't thought of before. So I just wanted to pass those along.
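
One lightweight way to do that review is to pair each activity in the logic model with a formative question (asked along the way) and a summative question (asked at the end), along the lines of the outreach example earlier. A minimal sketch follows; the activity names and question wording are hypothetical examples, not items from any consortium's actual plan.

```python
# Minimal sketch pairing logic model activities with evaluation questions.
# The activity names and question wording below are hypothetical examples.

evaluation_plan = {
    "Outreach to students about IET": {
        "formative": "Is the outreach happening as planned, and who is involved?",
        "summative": "To what extent did the outreach reach the intended students?",
    },
    "Review current course offerings": {
        "formative": "Are gaps being identified on schedule?",
        "summative": "Did the review produce the expected lists of courses and certificates?",
    },
}

for activity, questions in evaluation_plan.items():
    print(activity)
    for stage, wording in questions.items():
        print(f"  {stage}: {wording}")
```

Walking through the model activity by activity like this tends to surface the gaps in resources or missing activities mentioned above before the evaluation ever begins.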

Oh, thank you. Thanks for attending. That's great to know. All right. So those are some things that you can do in the short term, some next steps back in the office. And before we go, I know we're running tight on time, I just wanted to mention some logic modeling resources that might be helpful.

There is a resource from the Pacific Regional Education Laboratory. I've got a link to that. So that's West Coast-specific. That might be helpful. We've talked about the Innovation Network today. That was the home buying example. And the NRS website example, I have a link to that. And then the University of Wisconsin Extension Logic Model from 2003.

The Illinois Community College Board last year put out an intro to logic models. That is adult education-specific for Illinois. So that might be useful to take a look at.

And then two other resources that you might be interested in are the Logic Models to Support Programs from the Northeast Regional Educational Laboratory that came out a couple of years ago. And another extension logic model guide for planning, this one is from the University of Idaho Extension. So lots of resources to take a look at, and I hope that you'll take the time to do that.

I know we're at time, so I want to thank everybody for participating today. And in the webinars, some of you have been with me for all three of them, and I really appreciate that. If you have questions, I can certainly stay on for a few moments. And my contact information is here, so feel free to reach out, and I'll be happy to email you back if you have any questions. Thank you, everyone.

Veronica Parker: Thank you, Margaret. We have some thank yous coming through, and we do have time for questions. So if anyone does have any questions, please either unmute yourself, or you can type them in the chat. Not seeing any as of yet, but if anyone has any, definitely feel free to type them in the chat or unmute yourself.

And while we're waiting for that, again, thank you all very much for participating in today's webinar. And thank you, Margaret, for joining us for this webinar series. Thank you for answering the call for three webinars. We definitely appreciate your time and your expertise and knowledge in putting these webinars together. So definitely appreciate you in that manner.

And thank you all for attending. We have been hosting these webinars for I believe it's a little over a month now. And then we have one next week on the three-year plan. And that's just a demo of what it will look like, although it will not become live until early next year, early 2022, because it's going to coincide with the CFAD. But more to come on that.

So if you have not registered for that particular webinar, definitely take the opportunity to register now. That will be next Wednesday, October 6, at 12:00 PM. We have also posted a link to the evaluation in the chat. Definitely take an opportunity to complete that evaluation to let Margaret know what you thought about this session today, as well as indicate if there are any additional areas that we should be exploring when it comes to PD around the three-year plan.

Mandilee has also posted a link to register for the summit. The summit is approaching. It's in a little under a month, and registration is coming in. We are close to 500 registered, so we are super excited about that. Of course, all are welcome.

It's free this year. We have 60 breakout sessions with a plenary address and opening session. The CAEP update is independent of any other session, so the CAEP update will not be competing with other sessions. So there's a lot of greatness to come from the summit. So if you haven't registered, definitely take the opportunity to register.

I'm not seeing any questions, so we will go ahead and close out. Be on the lookout for an email regarding all of the resources that we have been putting out for the three-year plan. It will be in a Google folder, so everyone will have access to those resources. And again, I'm not seeing any questions. So without any questions or anything, we'll go ahead and close. Margaret, do you have something to say?

Margaret Patterson: I was just going to say I'm looking forward to seeing how these strategic plans come out next year. I know it's going to be a little while, but I'll be really excited to see what happens and what people put together. It's going to be an adventure, right? So thanks for letting me be a little part of it.

Veronica Parker: Great, great. Well, thank you all very much for your time and your participation, and we hope everyone has a great afternoon and rest of the week. Bye, everyone.

Margaret Patterson: Bye.