--for joining today's webinar. It is "Assessment to Instruction: Using CASAS Test Results to Inform Teaching." And this webinar will be presented by Margaret Teske, Program Specialist with CASAS. My name is Veronica Parker, and I'm the Coordinator with the CAP Technical Assistance Project.
Before we get started, I'd like to go over a few items with you all. First, if you have any questions throughout the course of this webinar, please be sure to post them in the general chat. Throughout today's webinar, Margaret will be available to answer any questions you may have. She will stop when there is a natural pause. So if she does not get to your question immediately, please just be patient and she will get to your question as soon as she is able to.
The webinar will be recorded and available on the California Adult Education website. And I will be sure to post the URL of exactly where you'll be able to find the webinar recording, as well as a link to the PowerPoint just in case you need it for future reference, or if you would like to share this information with some of your colleagues who were not able to attend today's webinar. It will be available to you.
The webinar PowerPoint is available at this time in a PDF format under Resources, Handouts, and PowerPoints. You can select the file and select Download. The file will download to the webinar-- excuse me, to the browser that your computer downloads files to. So please be sure to download the PowerPoint and use it as a guide as we go through today's webinar.
If you experience any technical difficulties throughout the course of today's webinar, please be sure to let us know via the general chat pod. Melinda or I will be available to assist you via a private chat. What that looks like is, at the bottom of the chat pod, a tab highlighted in yellow will appear with either Melinda's name or my name. Please be sure to click on that tab; that is our way of communicating with you privately to address any technical issues you may be experiencing at that time. So throughout the course of the webinar, if you experience any technical difficulties, please be sure to let us know via the private chat.
We will be taking attendance during today's webinar. So if you have logged on and you are using your consortium name, your agency name, an acronym, or your initials, please be sure to let us know who you are in the chat pod, so that we can account for you via our attendance. In addition, if you are participating in today's webinar and you are with colleagues but only one person has signed in, please be sure to let us know who you are participating in today's webinar with, so that we can account for everyone during our attendance.
So I'm not seeing any issues at this time, so I will turn it over to Margaret, who will get us started with today's webinar. Margaret, are you there?
Yes, I am. Thanks, Veronica, appreciate the introduction. So I'm Margaret Teske, as she said, from CASAS. I'm Program Specialist, but I recently retired from Mt. San Antonio College, so I have experience as an administrator and a teacher. So this is coming to you from that perspective to try to help you help your teachers get a better feel for what to do with CASAS test results and how to help them do a better job with teaching their students.
So an overview of what we're going to do today includes understanding some basic information that TE or TOPSpro reports provide. We're going to look at potential areas of success or concern. So I want to draw your attention to some reports that you can look at and create possibly an action plan to address those concerns. That's what you would do at your agency after this. And look at what reports are the most helpful for students and for teachers to analyze areas for improvement, as well as for administrators to look at areas for improvement. And then at the end, kind of reflect on how this presentation could maybe affect your agency, or, if you're a teacher, your class.
CASAS is an integrated systems approach. So we do look at curriculum assessment, instruction, and accountability. We have basic skills, content standards, and CASAS competencies. We have assessment, of course, in reading, listening, and math. And we have a QuickSearch Online, which is a free resource really intended for instructors to find appropriate instructional materials that would match with any areas of concern through the content standards and competencies.
And then we have the TE data, or TOPSpro Enterprise data, and that is an accountability software that helps us track student test scores and generate reports. I'm assuming you all know pretty much all of this. If you don't, please let me know if I'm going too fast, or if I need to speed up. Let's put it that way.
The foundation of the CASAS system. Now we usually say content standards, competencies, and task areas, but now we include the college and career readiness standards. And that is part and parcel of our new goals series.
When we look at the College and Career Readiness Standards, this should be part of your curriculum at your schools as well. A key goal in the CCRS is that whatever we're teaching-- the overall content demands-- should include those things that are relevant to preparing adult students for success in higher education and training. And we're looking most at adult learners. These are based on the Common Core, which is K12. But now the College and Career Readiness Standards are for adults.
This is a sample of an alignment between the CASAS reading standards and the CCRS. So you see that they work together. For instance, vocabulary is one of our content goal areas, and that matches up with the reading anchor, R4. Reading comprehension skills, such as locating details or the purpose, are in the CCRS at the beginning, R1 and R2, also R6. So those are major areas that are covered in the A and B and into C tests. And then the C and D tests generally cover the higher order reading skills, as you see listed there.
Here's an example of a CCR reading standard if you're not familiar with them. This is one of the popular ones that is used quite often in most tests: interpreting words and phrases that are used in text, including technical, connotative, and figurative meanings. And that's more on the higher end. So we start with level A, that's beginning, low beginning, asking and answering questions to determine or clarify the meaning, and then we get into more academic and domain-specific words. And we don't get into figurative, connotative, or technical until level D.
Let's look at a sample item-- sample test item. This is from our practice test items, so it's fair to show you this one. You can see there that it's asking a question about what? You can go ahead and chat. Feel free to get your fingers used to chatting because I will ask questions throughout. What CCR standard is being tested here?
OK, you guys aren't participating, but that's OK. I will answer the question for you. The standard is interpret words and phrases in a text. So it's R4 at the B and C level. So they're looking at the word liberal. How is the word liberal-- what is the meaning of the word liberal in this text?
Along with the CCRS, we have content standards, competencies, and task areas. Basic skills content standards for CASAS include academic skills measured, like locating the detail in this example. A competency is more of a measurable learning objective, and that is measured in the context of a life skill. And task area has to do with the formatting. So we see here, it's a chart of a daily activity. So we're reading an activity schedule and we're locating the detail in order to answer the question.
There are several content standard categories for reading, for listening, for math, for writing, and speaking. Most of you know the reading, listening, and math categories more than anything else. These are additional tests that we do but are not as well used in California. Reading standards include these five levels. Generally, level 5 is not included, as it's literary text only. So we're looking at foundational literacy, language and vocabulary, comprehension skills and strategies, and then higher order reading skills.
Here's an example from reading standard 2, which is language and vocabulary. And the little boxes that you see there-- I don't know, should be dots, but it didn't come through that way. But that's OK. The reading standard 2.1 has to do with punctuation and capitalization. That's actually used in all of our test levels through ESL and through ABE.
One that is not is the reading 2.3, which is more academic, technical, domain specific. And you see, it starts level 4 in ESL, level 2 in ABE, and goes up through all of ABE. So that's a little difference in the different standard. There's some background for you.
OK, again, we're looking at the same example. We have, again, the reading standard 2.8. This is interpret multiple-meaning words. So it all goes together. This is an example of a math blueprint that is also available on the CASAS website. And this shows you which content standards are there, at what levels, and how much of each level. As you see as you move up, some things like statistics are in both, but number sense is more in the lower levels, whereas algebra is more in the upper levels. These math blueprints, reading blueprints, and listening blueprints are all online for everyone.
Competencies are another way that we code items. And we want to cover different areas of functional life skills, from basic communication to learning to learn. Independent living skills would be in the power tests, so those are tests that you may not be familiar with. Those are for adults with disabilities.
Competency coding starts with a content area. It gets to a competency area, 3.4, for instance. So we got health, and then we understand basic health and safety procedures. And then there are certain competency statements that are very specific. And those specific statements really help teachers know what students do not understand, or, vice versa, what they do understand. So every test item has a specific competency that it's testing.
Here's another example of a competency. We've got employment as the content. We've got the communicate effectively in the workplace as the area. And then we've got the statement. For instance, interpret and write work-related correspondence, including notes, memos, letters, and emails.
So that's how we categorize items. Let's practice again. What is the competency area that's being tested in this item? What competency area? Again, use your chat box if you'd like to and answer the question. Don't feel shy, nobody's giving you points or no points for getting this. What is a competency area being tested?
And I see someone's trying to answer, yay. Thank you, Will. It is employment. It's coming from an employment handbook, and that would be the competency employment, particularly interpret employee handbooks. Very good. Thanks, Will, for participating. I love it.
OK, task areas have to do with the format of the test item. Sometimes there's forms, like an application form. Sometimes there's charts or tables, graphs. Sometimes there's diagrams. Sorry, that didn't come through there. I covered it with that picture. In this case, for instance, a math item is asking for angles. Sometimes it's a sign, like fire exit, keep the door closed. So there's lots of different format or task areas.
And what is the task area of this question-- or the task format, I'll say. This answer is very easy. Text, exactly. So text is a major one, especially for reading tests. But again, in math, you might have diagrams, charts, things like that. Thank you very much, Jasmine. You got it.
OK, so in summary, we have in one test item, we're looking at CCR standards of, in this case, interpreting words or phrases. We've got a content standard, a reading content standard-- interpreting multiple meaning words. We've got competencies-- interpret employee handbooks. And we've got the task area of text. So every item has lots of portions and parts to it. There are lots of sample test items available to any of you if you want to go to the CASAS.org website. You simply go to Product Overviews, Curriculum Management Instruction, to Sample Test Items.
Students are particularly interested in the CASAS eTest Sampler that includes reading goals, math goals, life and work reading, and listening, like life and work listening. It's very helpful for teachers to use sample test items to familiarize students with the CASAS tests before they take it. They'll know better what it will look like. It really reduces their anxiety, and it helps our teachers to know more what the test items are that they're giving. Since they did not design the test themselves, it's good for them to know what a test might look like.
OK, now we're going to start looking at reports. We want to review big picture reports, think through what's working, and see what might need to be tweaked. So why we do this is to identify potential problems. There may be some persistence or performance areas that we need to address. Of course, in the black area, we've got high performance, high persistence. Great job, everyone's doing a great job.
With high performance and low persistence, not a normal situation. High performance means that students are doing very well on their testing, but they're not coming often enough. So that's not an area that's very normal. Let's put it that way. However, low performance is one of the areas that is more normal, unfortunately. Sometimes we have low performance because of persistence. The students are not coming enough, so their persistence has gone down, and their performance goes down as well.
In this category up here on the left, top left, we have low performance and high persistence. Students are coming, but their performance is not improving. So that's in the area where you would probably need to talk with your teachers and address that from the teacher perspective, instruction perspective.
Let's look at the big picture by looking at a CAEP summary. When you look at a CAEP summary, you've got literacy gains on the left side there in that group of columns B, C, and D. And that's for ESL, and for ABE as well. We've got CAEP outcomes particularly for the state, and those include-- I don't know if you can read those-- but HSE achieved, post-secondary achieved, entering employment, increasing wages, and transitions.
And then in the last portion, we've got services. So those support services, they're not required for WIOA II reporting, but it's very helpful for your agency to know how many students are receiving support services, transition training services, and career services. And as you mark those in your system, it's great to have this summary to look at.
By looking at the summary, you can determine how we are doing in terms of pre and post-testing. Well, we've got 2,151 students enrolled and only 1,669 that have taken the pre and post-test. That may be an area to look at, to see who's missing from the pre and post-testing. And then another area is how many gains you've gotten.
Here you'll see, under career, CTE does a pretty good job with enrolling. I don't know if you can see this, but you've got a very good number here for enrollees with pre and post in CTE. The gains are about 100 less, so about 66%. But their percentages are pretty good, whereas ESL looks like they need to do a little bit of adjusting. And you can analyze it class by class.
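The kind of check Margaret is describing here-- comparing enrollees against matched pre/post pairs, and gains against pairs-- can be sketched as a quick calculation. The enrollment figures are the ones read from the example summary; the CTE paired-test count is a hypothetical number for illustration only.

```python
# Summary-level check: what share of enrollees have a matched pre/post
# pair, and what share of paired students showed a gain.
enrolled = 2151        # students enrolled (from the example CAEP summary)
paired = 1669          # students with both a pre-test and a post-test

pair_rate = paired / enrolled
print(f"Pre/post completion: {pair_rate:.0%}")  # roughly 78%

# Hypothetical CTE figures: gains run about 100 below the paired count.
cte_paired = 300
cte_gains = cte_paired - 100
print(f"CTE gain rate: {cte_gains / cte_paired:.0%}")  # roughly 67%
```

The same two ratios can then be recomputed program by program, or class by class, to find where the adjusting is needed.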
What are causes of problems? Possibly rapid growth of a program, so it's really hard to keep up with everything. A lot of new teachers come on board. Maybe there are many multilevel classes. If it's a small agency, that's usually the case. Sometimes it's a student persistence problem, or sometimes it's a student motivation problem. Why do we have to take this test? I don't want to take this test. OK, so it's up to the teacher to really come into play with the motivation and persistence and help with the multilevel class. The program as a whole needs to deal with the top two-- the rapid growth and new teachers.
Once you have an idea of your problem area that you want to work on, you might want to put it together into the PD plan. I know that was due recently. So it's probably fresh in your mind if you're doing WIOA II. And you can do a PD plan whether you're doing WIOA II or not. PD plan is a Professional Development plan.
Here's an example from an agency down in San Diego area. They wanted their instructors to gain the ability to effectively use assessment to inform their instruction, to select or create materials that would help them address issues that they saw in the student reports, and to support post-secondary transition.
So their PD plan went into more depth. They were talking about the ESL program. Half of the teachers-- sorry. Half the teachers were either new, or they didn't really have a lot of adult ed teaching experience. So they needed a little bit more orientation. The adult high school group was fairly new to CASAS testing. So even though nearly all the teachers had taken an online CASAS implementation training, the teachers were not regularly provided with class performance reports.
So providing teachers with these reports would help the faculty target areas in which the students were not performing to the standards. But in order to do that, they needed to get their teachers prepared by understanding the different skills that are tested and how to interpret the test results.
OK, so what were the steps that they took? They decided to have a group meeting with the data manager for TE, faculty coordinators, principal. They collaboratively developed a self-assessment instrument for their teachers. They administered that as a pre-assessment. They compiled the results.
And as a result, they scheduled one professional development workshop, and then actually a follow-up, in which the teachers received training on assessment data. Then at the end, they re-administered the developed instrument. So they surveyed, did a workshop, and surveyed again. This provided a lot of good feedback to them.
The first workshop helped them to identify and practice reading useful CASAS TOPSpro reports. So they learned all the different types of reports and what might be useful for them. And that's what we're going to do. We're going to look at different types of instructional reports. We're going to look at individual types of reports for students, and administrator types of reports to look at learning gains. We're going to look at skill reports that help the teacher to view the whole class at once. So let's take a look at the reports.
Let's start with the personal score report. This is a report that is given to the students, and it's given after they take the eTest if you're doing eTest, or it can be given after they take a pre-test. The score report looks like this. So the student's name is at the top. It tells them what agency, what site, what class they're in. And maybe there's no teacher assigned yet because they're not in a class. This gives them their actual score and their program area.
So this student was tested using the goals 905 reading test. The test level is C. And their scaled score is 220. So they're put into ABE level 3. What does that mean? This helps the student understand where they are on the full scale, and it helps the teacher to see-- if they bring it to their teacher-- where they are in terms of their scaled score and their NRS level.
If you're not familiar with CASAS test scores, generally there are raw scores and scaled scores. I did refer to scaled scores, so why do we do that? We don't refer to the raw score because that is just the number of questions answered correctly. A scaled score interprets or converts that score to a comparable score between different forms at different levels.
So a score of 220 would put that person in a level C. And you can compare different forms, so that you know where they're at. If you want to know more exactly about how to read scaled scores and what levels they are, I'll refer you to the CASAS portal or the CASAS website, either one.
We do have skill level descriptors for ELL and for ABE. These also help provide information to teachers about what a given level means. They're a little bit difficult to understand-- the ABE ones not as much for ABE students, but the ESL one is very difficult for students, especially at low levels, to understand. So the teachers would have to interpret those, but it's not bad to put them in classrooms and to provide them to instructors. It's very important to do that to help the instructors understand the levels.
OK, what is a student test summary report? This would list all their tests, all their scores, and test hours of instruction. So that is found in TE under Test Results, Test History, and then Student Test Summary. What does it look like? It looks like this. So we've got your students in a class. And this class is ESL low intermediate.
Melinda is an active student. This is her ID number. These are the dates she has taken these tests, what test she has taken. So she's taken the listening test and a reading test, a listening test and a reading test. And you can see, Leo has just taken two listening tests, not a second reading test. Any questions about this report? Very helpful to teachers. Very helpful if you're trying to determine which class may need a little assistance.
OK, the learning gains report is also a great report for administrators and for teachers to look at their whole class. This learning gains report comes from the same area under Test Reports-- Test Results on TE. It's another way to look at the same report as we just saw, but it narrows it down to the first test and the highest test. So Melinda had 82RX, a score of 220. She had 81RX a few months later and a score of 224, with a 4-point gain.
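A minimal sketch of what this report summarizes-- pairing each student's first test with their highest later score and computing the scaled-score gain-- might look like the following. The student records are hypothetical, modeled on the Melinda example (220 to 224, a 4-point gain).

```python
# Compute each student's learning gain: first scaled score vs. the
# highest scaled score from any later test.
tests = [
    # (student, test date, form, scaled score) -- hypothetical records
    ("Melinda", "2023-01-10", "82RX", 220),
    ("Melinda", "2023-04-15", "81RX", 224),
    ("Leo",     "2023-01-10", "82RX", 215),
    ("Leo",     "2023-04-15", "81RX", 219),
]

gains = {}
for name, date, form, score in sorted(tests, key=lambda t: t[1]):
    if name not in gains:
        gains[name] = {"first": score, "highest": score}
    else:
        gains[name]["highest"] = max(gains[name]["highest"], score)

for name, g in gains.items():
    print(f"{name}: first {g['first']}, highest {g['highest']}, "
          f"gain {g['highest'] - g['first']}")
```

For Melinda this prints a first score of 220, a highest of 224, and a gain of 4, matching the report.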
Why are the hours blank? Probably that option was not clicked. Sorry-- I did not click that to get it on this report. It can be on there. Good question, Debbie. And about the diamonds-- so you see the diamonds here; that means that test may not be as accurate. Let me look back here at-- oh, too far.
Down here, when you get to those diamonds, that indicates the score may not be as accurate as possible. And TE recommends that they retest if possible, or that you take that into account. If your students get a lot of the diamonds, that may not be as accurate a score as you could get by going up a level. Does that help? Right, Janice is adding, the student scored higher than expected.
The ranges for-- I just threw this one in here to remind you that the reading goals-- the scale score ranges have changed. So I just wanted to alert you to that fact if you already didn't know that. They used to be a little lower, so they've been adjusted to the new goal series.
Christie asks, when the student with a diamond post-tests, will the eTest automatically go up a test level? The next suggested test level, if they're taking another post-test, will probably put them in the next test level. You can adjust that if you like. The diamond is still counted. It doesn't mean that you have them do a retest right away.
A class profile report is also very helpful for teachers. A class profile report gives a competency for each test item. And it tells you, for this class, for instance, there was a mean raw score of 15, and the scaled score was 193, which is kind of low. But it was a level A test, an 81R.
This tells what competencies the students are doing well at, the pluses, and what they're not doing well with, which is the minuses. So later in the test, it gets harder, it looks like, for the students. The pluses are items answered correctly and the minuses are items answered incorrectly.
Skill reports I find very helpful. You have student content standards and you've got a class level content standard performance summary report. So you've got test items and content standards. Let's take a look. It's generated under test results again, content standards. You can choose student content or content standard performance.
So for student performance, it's got the student's name at the top with their ID number. These are just Maria's results: how many test items covered each reading content standard, and then what percentage she got correct. On this one, she has a weakness here for reading 2.8, which is interpret unknown and multiple-meaning words in text, things like root words, affixes. But she does very well, very, very well, on 3.14 and 4.3, identifying the author's purpose and determining what a text says implicitly, making inferences and drawing conclusions, which is interesting.
There is one test item she did not get about analyzing author's purpose. I don't know why. Well, 4.6 is just slightly different from 3.14. I have to look at the standards to see what that would look like. But it shows you where she could work a little bit harder and what she's doing well. So it's very important to note their successes, as well as the areas where they can improve, especially when talking to students.
It's hard to get these to come out exactly on this PowerPoint. But you'll see here, this is a class that Instructor Goldberg did. It's a reading goals level C. There were 13 tests, and this is how the class came out. So again, it shows you areas. This class is in the middle on almost all items, except this one, which could easily be taught using text features. What does boldface print mean? Symbols? How do you locate details? Things like that may help the teacher know exactly what to focus on.
Student competency performance and competency performance summaries. These are two different types of reports-- again, student-focused and class level-focused. Student competency performance, again, under Test Results. Competency performance, this would be student competency performance.
Here it is. Again, Maria, the position on the test-- that is the item number-- and whether she got that item correct or not. And the competencies-- generally the major competency is listed on the right, the competency description. The task column shows you the task type, or the format of the task. So signs, charts, diagrams-- those are going to be ones. And then when you have a two, that's text. And three is more about directions, labels, things like that.
This tells you what area under competency, and this summarizes it nicely. So you can show the student how many she got correct in a certain competency area. And understanding concepts and materials related to job performance was a big area. There were 13 items on this. She missed about half of those. So if you just focused on that one area, she may be able to improve her score quite a bit in focusing on materials related to job performance.
Communicating effectively in the workplace was great. She did 80% correctly. That was great. I would focus most on the most number of items and then the lowest percentages. Questions, let's see. Total test over what time period? I'm not sure what you mean-- total tests.
Thanks, Janice, for putting that in. On previous report, there were 13 total tests represented. What time period the report takes into account? The time period is-- let me go back. I think you're talking about this report. 13 total tests, let's see if I can go back and find that one.
13 total students took 13 tests. One day, they took this test. It's not over a period of time. It's just that one day they took that test, and then it gave the results of that test. It was the 905R-- or 906, excuse me, the 906 reading goals test. Does that answer your question, Shawn? OK, good. Keep asking questions, that's why we're here.
All right, let's look at the class competency performance. Again, this would help the teacher. There were 13 tests. How many items? There were 20 items. And it gives you the percentage correct. So if you wanted to look-- 92% of the students are interpreting these job-related signs well, but 46% and 53% for the other two. So they may still need a little work on that area.
Other areas, again, it really depends on the items. You see, there's a variety of percentages here. So this item in particular, 14, was difficult for them. But 15 was not so difficult. I would look again at the areas that are lowest and see if there's a good impact that could be made if there is.
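The prioritization Margaret describes-- focus first where there are the most items and the lowest percentages-- is simple to sketch. These competency rows are hypothetical, loosely echoing the examples in this walkthrough.

```python
# Rank competency areas from a class performance summary so that
# high-impact weak areas (many items, low percent correct) come first.
rows = [
    # (competency description, items on test, percent correct) -- hypothetical
    ("Interpret job-related signs",           3, 92),
    ("Understand job performance materials", 13, 54),
    ("Communicate in the workplace",          5, 80),
    ("Interpret employee handbooks",          2, 46),
]

# Sort by item count descending, then percent correct ascending.
priorities = sorted(rows, key=lambda r: (-r[1], r[2]))
for desc, items, pct in priorities:
    print(f"{items:2d} items, {pct:3d}% correct  {desc}")
```

With these numbers, the 13-item area at 54% correct comes out on top: working on that one area offers the biggest potential score improvement.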
It's a snapshot in time. The task column tells us what the format of the question is. And that helps us to know if they're having trouble with diagrams, charts, that kind of thing-- that's number one-- or text, or reading labels, or directions, that kind of thing, which is three. Does that answer your question, Christy? OK, good.
Yeah, I did skim over task area kind of fast. Individual skills profile and then, again, a summary. So you've got the student level report and then the class level report. So when we go into TE again, you've got the option-- test results skill profile, individual, or the summary. So the individual one is good for students and the summary's great for teachers. So let's look at those.
Individual skills profile, very interesting, especially in goals now. I like them. Math and reading, she took math and reading tests. She had two different scaled scores. She ended up at two different levels. Clearly, her reading skills are better than her math skills, which is good information for her teacher, since she's in the HSC course.
It tells of reading competencies that she's great at and some that she's not good at-- government and law. Over here on the right side, we've got reading content standards. Again, it shows what she's really great at-- reference materials and reading and thinking skills. So in order to find out more about that, I would look under reading content standards. On the CASAS website, you'll find out exactly what that means.
Computation-wise, in math, she's not as good as she should be, probably. Number sense, she's OK but not great, since that's more of a level A. And measurements, she needs a little bit of work on measurements. So again, you can see what she's good at-- reading forms-- not so great at reading charts, maps, billings, graphs, things like that.
It also tells us, since she's a goals student, she has a 79% likelihood of passing this area of the GED test. But it recommends more study for mathematical reasoning. Lots of good information there. OK, Connie's adding that at the bottom of each report, there are definitions of what the task areas are. Yeah, these are clipped reports.
Individual skills profile again-- here's another one to give you a different idea. Ana Zin, how she did. Again, a little better in reading than math. And what are the areas here that need a little work? Would you like to answer that, anybody? What could Ana work on, if you can see it? I know it's really tiny.
Under reading competencies, what could she work on? Learning and thinking skills, very good. Consumer economics, good. And under math, looks like she's doing pretty well on anything related to health or employment but maybe could work a little bit more on government and law. Thank you, Will. Guys, great, thanks for your participation on that. I can see that you can read these reports, that's great.
OK, individual skills profile summary, again, that's for a class. This class had 26 students in it, 26 tests. This is the dates that they took the tests. In reading, the mean score was 225. So they're definitely level C's, good. This shows over the class what they're good at, what they need a little work on.
They're pretty much middle of the road. They're pretty good at community resources. They do need a little more work on language and vocabulary, doing better at comprehension and higher order reading. That's great. The reading tasks that they're best at are reading signs, advertisements, and product labels. This is, again, information for the teacher to know where to focus more of the attention.
Once they know that there is an area that they want to work on, they can go to quick search. Quick search is a-- it's an easy access database through the CASAS website. And it points out print, audio, video, software materials that are correlated to competencies and content standards. It's very helpful.
Again, you want to communicate the information with the students about the data. These are a couple questions to ask, especially of your teachers and of your program. Do you share with students about the purpose of the test, the overall class results, and individual results? And why is it important to communicate with students about the test and data in general? These are important things to consider for your program and to ask your teachers.
All right, this got lost again too. It's good to have the skill level descriptors posted in every classroom and to have teachers use them to show students where they are in the program.
Communicating with students about CASAS tests. Here's an example of a goal sheet that you may want to have teachers give to students to encourage them to focus on their test scores. It's really important to involve students in their own goals and tell them the what and why of CASAS. Maybe they want to write down in a little text box what issues they need to work on from the standards.
It's really good to provide assessment results privately, if possible, on individual skills, to have the students chart their results, and to allow students to complete the assessment again later if they feel that their score doesn't accurately reflect their ability. If you have the retest done right away-- maybe a student was sick and did not do well, and they know that-- they could be retested.
Some people are typing questions. Is this-- no, no, personal record sheet, no, I don't have that available. But I think it's a great idea. And if you take it from the slide, it's not really that hard to recreate. Thanks, Stella, for that question.
It's really important to follow up with teachers. So there was a follow-up workshop recommended, where you report the results. You've given them a workshop already to talk about reports. They've used the reports. They've created activities and lessons to help the students gain the skills.
So find out, how did that go? Maybe they could share their activities and lessons with each other, share any challenges and successes-- like, I couldn't find materials to go with this, oh, I found materials from this source, that kind of thing-- and that really helps students become more aware. And they can discuss common scenarios with test results and hopefully set some new goals.
So let me give you some scenarios that were used. Here's an example scenario. The competency my students have trouble with is at the end of the textbook, so there's never enough time to get to that chapter. I'm not sure that I can jump to that chapter without doing the other chapters first because they need the scaffolding. So I have teachers discuss with each other what they would do in that situation. In other words, turn and talk.
Another scenario to propose to the teachers would be a competency related to employment. Maybe you've seen in the results, talking with students and talking with the teachers, that that's an area many students are dealing with. In the evening, they're doing very well, but in the morning not so well, because there are more stay-at-home parents, or maybe they're retired, or they don't work and their spouse works. So that's a scenario to talk about.
OK, Sharine notes, having students take up to five different levels of tests makes it hard to share information with them in a way that's meaningful. OK, so you get class reports for each form. And you'll have to then analyze those to see what areas-- if you're a teacher, what areas the students are having the most difficulty with.
And I know with five different levels of tests, that means that you have five different levels of students, basically. And so you have multilevels and it's difficult. If you can find one or two areas to focus on, you can't focus on everything. But also helping the students know what their area of difficulty is helps them to focus by themselves because they're adults. They can focus on their-- take responsibility and focus on their weak areas as well.
It's great to have group discussions, teacher-to-teacher, how to address issues. And there may be a similar issue in your program that you can see from the reports, or that you're aware of. And that would be a great time to bring it to the teachers to talk about and discuss what's a strategy for dealing with it.
When we're talking about qualitative results, it's great to have teachers better understand the importance of CASAS test results. They can be better equipped to plan lessons, select materials, and connect needs to the class content and skills practice. They learn to access selected TE reports, if they're given access to those through the data manager. And maybe there are unintended consequences-- good consequences, like reflective teaching-- and really a good chance for teachers to talk about what they're doing in their class, how they understand the needs of their students, and how they address those.
Quantitative results from workshops could take time. You need time to see the cycle from the pre-test, to the reports, to the instruction, to the post-tests. So give it time. Anticipate learning gains at the end of, maybe, the next term or the next semester, the next year even.
Report back to the instructors how things went. What did you see as an administrator? What did you see teacher-to-teacher? And from the data, identify additional content standards to address in the future. Maybe you addressed government and law, but now you need to address consumer economics.
Teacher benefits-- increased knowledge and understanding. They have greater professionalization, enhanced motivation. They improve success with standards-based instructional planning and delivery. They've developed a team culture and bonding, so that they can share lessons with each other that will help the students.
Students will have a better understanding by looking at their CASAS test results. They have improved perceptions of the links between what they're learning and what's being tested. And we can meet their needs, hopefully, in a better way. Happy teachers, happy students, give results. So here's a class at MiraCosta College with their certificates. And that's the teachers there at the workshop.
Going forward, make use of reports to reflect on instruction. Use CASAS test results systematically. So you want to return to reports often on a regular basis. Hold regular group meetings and check-ins with your teachers. Provide procedure and process for training new teachers, so that they understand the reports. They know what they're doing to address those concerns. And continue to share and compile lesson plans and activities related to the competencies. So teachers can really help each other. Are there more questions?
We're wrapping up. I'll give you another minute or so to get questions out there. Feel free to email me as well. I'll put my email address down there. You can also email Janice. She's been chiming in, so that's good. This will be recorded-- this has been recorded and will be posted. So you can get both the PDF of the PowerPoint as well as the recording.
Oh, you think assessments are easier than tests. Yeah, beginning level ESL students don't understand the word assessment. And I come from ESL, so I always say test. But you're probably right, it doesn't scare them as much.
OK, I think we've reached the end. So if there are other questions, please feel free to email us or stay on the line for a minute. I can answer some questions.
All right, Margaret, I see a few people are still typing in the chat. But while you all are doing so, I would like to first thank you very much for participating in today's webinar, as well as thank you to Margaret and Janice for being our CASAS experts this afternoon and answering all of the participants' questions.
In the chat pod, I have posted a couple of URLs. The first is where-- the web page where the webinar recording, as well as the PowerPoint presentation, will be located this afternoon. So please be sure to check back on this webpage later on this afternoon for the posting, and also please be sure to share the content with colleagues who were unable to attend today's webinar.
In addition, I have posted the URL for our upcoming webinars and workshops. We do have a webinar scheduled for Monday, November 18 from 11:00 AM to 12:00 PM. And that will be a CAEP data and accountability road show Q&A session. So if any of you have attended any of the workshops that Jay and Neil have facilitated this fall regarding program changes for CAEP, if you have any questions that are outstanding that you really would like answered, that will be an opportunity for you to ask Jay directly.
The entire hour is dedicated to answering any of your program related questions. So please be sure to join us on Monday during that time for that webinar. In addition, we have other webinars coming up, so please be sure to register and/or attend those webinars as well. Janice or, excuse me, Margaret, do you have anything else?
No, thank you very much for coming today. And you can refer to the CASAS website. And Janice has posted for the CASAS tech support number to call. Feel free to call them if you're having any trouble generating reports. They're the best people. They'll just walk you right through it. OK, thanks so much for coming.
OK, great. And thank you all again for joining us today. I will now close the webinar room. And when I do, an evaluation will appear. Please be sure to complete this evaluation and let Margaret know what you thought about today's topic, as well as if there are additional technical assistance or professional development opportunities that we can engage you all in. Thank you all very much for your time and have a great afternoon.