[audio logo]

Speaker 1: OTAN, Outreach and Technical Assistance Network.

Rachel Riggs: So I'm going to put this link in the chat for those of you who are with me on Zoom. Anybody who is in the room, you can scan the QR code. I think you know the drill. And we will start off with-- let me make sure I keep this QR code up for you. Oh, well, there it is again. So if you missed it the first time, there it is again.

I see people are in here, so I'm going to go ahead and go to the first question that I asked the OTAN CampGPT participants, which was to rank their knowledge of generative artificial intelligence. So at first glance, we can see how they responded, and then you guys can go ahead and add your responses, and we'll have an even better collective representation.

And what we saw in CampGPT is that a lot of teachers joined already having used or, at least, having heard of generative artificial intelligence. And really, these days, it is almost getting hard to avoid the topic, so I think that's why we see lots have heard of it and lots have used it. And we're still developing our knowledge of strategies to use it well because it is an emerging technology without a lot of research and use under its belt. So I think it's natural that we are all developing strategies to integrate it into our practice.

So I also asked them this one. You can see some of their responses here. If you could ask a robot to do anything for you this week to make your job easier, what would it be? And you can see what some of the Campers said.

Do my monthly budget. Really clean my house well. It always cracks me up because what I usually find with this one is that there's a lot of "cook for me," "do these other tasks for me." I've got my work pretty well covered, but I need you to clean the kitty litter box for me. So I love seeing those examples of the real-life stuff we actually want the robots to do for us. Assist with cleaning.

And then, of course, we would love some assistance with coming up with materials, designing a logo, organizing our files, updating all the EL Civics materials. So these were the past responses, and you guys can feel free to chime in here with what you would like a robot to do for you. I think I'm seeing some of them down here. Do my taxes. Yes. And we're getting closer to that one. I think it's so funny, that joke people make: why does anybody trust me to file my taxes with zero training?

OK. So digital literacy, this was another question we asked the CampGPT participants: how often do you integrate or teach digital literacy? So we saw that trend pretty close to all the time, and we would love to hear from this group as well. How often-- yeah, OK, great.

We see that moving along about at a 3.8. I think it was at a 3.9 with the CampGPT participants. So we're all pretty much hanging out there where we're integrating and teaching digital literacy somewhat often, maybe not all the time, but maybe not never either, which is good. That's a great starting point for us to be integrating AI literacy, and we'll talk a little bit about that today.

OK. So hi and welcome. Thanks for that. Like I said, I just wanted to show you guys a glimpse of the different questions I was asking the participants in CampGPT and give you guys an opportunity to chime in on those as well. But now I'm going to shift over to our presentation for today. And, well, let me share my screen so you guys can see what I'm shifting over to.

Yeah, and Catherine is right. Catherine says in the chat, tax software has been using AI for some time now. That is absolutely true. And I think we're just not quite at the point where from start to finish it can take over for us, and we might not, actually, want that ever to happen. But good point, Catherine.

OK. So welcome to-- not really CampGPT, but I am, in a sense, welcoming you to get a glimpse into CampGPT, which was a professional development experience that I did with OTAN throughout the month of February, so just last month.

And my name is Rachel Riggs. I'm a Technical Advisor for World Education. You may be familiar with our EdTech Center. Our EdTech Center and OTAN work very closely together. We are great partners and have shared a lot of work. I also work for our CrowdED Learning Initiative. That's an open educational resource initiative of World Education.

And finally, I've been leading our AI for Learning and Work Initiative. Really, that's been focused on building partnerships and collaboration: what do we need in order to work together collectively and figure out what the path ahead of us looks like when it comes to integrating AI more into learning, and when it comes to AI and the future of work, knowing that our work in adult education is closely related to workforce development, continuing education, et cetera?

So that's a little bit about my role. My background is as a TESOL educator. So I taught English to speakers of other languages for maybe about eight years before I entered into an adult basic education program and got much better-- more familiarized-- I can't say that word right now.

Became more familiar with what adult basic education is, the unique challenges and opportunities in that realm. And that led me toward digital skills, digital literacy, educational technology, and to my path here at World Education. But my background is very much rooted in teaching English to speakers of other languages.

So let's start out-- and I know you guys have had so many sessions on AI at TDLS, which is really cool. So this may be repetitive, I hope not, but I think it does help to just level set a little bit. Like Catherine said in the chat, AI has been in our tax systems, AI has been really embedded in a lot of the technologies that we use for a long time. It's a really broad field, and it intersects with philosophy, math, statistics. It's very far-reaching, artificial intelligence is.

And so it helps, I think, to narrow our focus to generative AI because that's the new thing that we're all trying to explore and get to know better. So we can think of artificial intelligence as being a big field that's been around for a long time, cross-cutting, and all of that. We can think about machine learning as being a significant innovation in AI, and a lot of the AI that we think and talk about today is machine learning AI.

Then we can think even further about deep learning pushing us even further along in AI. That is where AI engineers have developed neural networks that mimic the way the brain works, making different connections and synapses between different nodes. And so deep learning brought us even closer to what we now know as generative AI, which generates new types of media: images, video, et cetera.

And then within generative AI, we can think of these well-known examples like ChatGPT, a chatbot that generates text primarily, but now can also generate images. We can think about tools like Midjourney that generate images from a text prompt and lots of other tools that are able to do that now.

There's a tool called Synthesia. This is a video generator that creates a realistic voice-over from a script, and a lot of companies use it for training purposes. You give it a script to talk about, say, the ethical values of our company, and then that avatar will tell your employees what those values are.

We see now some more specialized tools, which is really exciting. So there's a tool called Magic School AI that teachers love because it is designed for teachers and gives you a lot of different tools for generating teaching materials. There is a tool called Pace AI. I know that there at TDLS, this-- Yeah, Catherine. Sorry, guys. Catherine put in the chat, do you have PowerPoint slides so we can use the links you have in them?

I had some trouble uploading my slides to my presenter dashboard, so I will be happy to work with Audrey or the OTAN staff and make sure that these slides get into the TDLS site. That's my bad. I should have maybe had more time ahead of this to do that and troubleshoot for you. The other thing is I'll pop a link in the chat to them in just a minute for you, Catherine, since you're on Zoom.

So Pace AI, that's right. So Pace AI is even more specialized. If we think of Magic School being for education at all different levels, Pace AI is a tool that leverages generative AI to transform any reading content into personalized lessons for adult English language learners. And I'm pointing that out because we started with lots of foundational systems like ChatGPT, and Bard or what's now Gemini, and Midjourney, and these were, really, just general purpose. And as we fine-tune those down to specialized tasks, they actually become a lot more useful for the specialized tasks that we're working on.

So that's what CampGPT is largely focused on: supporting teachers in learning how to use generative AI tools. CampGPT was, like I said, a professional development experience that we held with OTAN, and I've actually held it now three or four times with different groups.

But it is a couple of sessions learning about generative AI and tinkering with generative AI tools. And really, it's focused on collaboration. So bringing all of the different experimentation that we're all doing together so that we can talk about it together and share our tools and strategies and what we're uncovering as we explore generative AI.

So CampGPT was held in February. We had two live sessions. We had a worksheet in between the sessions where Campers were logging what they were doing with generative AI. And so what I wanted to do at TDLS today is share with you some of the things that we learned and some examples of what your colleagues in California did in CampGPT. And, again, all in the spirit of pushing us further along in our collaboration and collective knowledge around generative AI. So I appreciate you guys joining me today to talk about OTAN's CampGPT and think a little bit more about how we can move forward as a field.

So we have modeled CampGPT largely around these three steps that came out of the European Union's European Digital Education Hub's AI Squad, so it's three levels there, on artificial intelligence in education. So they developed a report called Use Scenarios and Practical Examples of AI Use in Education. Oh, thanks, Laura. Laura's sharing some links to Magic School and Pace AI. Thank you.

So they developed a report, and it outlines these three steps for teachers. The first is teaching for AI, and this is the knowledge, skills, and attitudes to navigate AI from a user perspective. I like to think of teach for AI a lot as digital literacy. So that's why I asked how much you're teaching or integrating digital literacy, because the two are very--

AI literacy and digital literacy are very closely connected. And so we think about, what skills do learners need to navigate AI and technology in general from a user perspective? So how can we develop in them the digital literacy that they need to navigate an increasingly technological landscape and an increasingly AI-infused technological landscape. So teach for AI is closely related to digital literacy.

And then teach about AI. This is the idea of training students about AI from a developer's perspective, which is a little more technical. And then teaching with AI. And this is the idea of, what do teachers need in order to use AI effectively in instruction?

So in CampGPT, we actually focus on-- we push teach with AI up a little bit further in priority because we think teachers really need to feel comfortable using the tools themselves. So we focus first on teaching with AI, and then we talk about teaching for AI, and what that would mean specifically in an adult education setting.

We don't do as much around teaching about AI, acknowledging that that is something more technical, and it would really depend, when you're working with adults, whether or not you want to teach about AI and give them that developer perspective. Probably it's dependent on if it's relevant to them and their needs and goals. So we don't really address that as much in CampGPT. We keep our focus on teaching with and teaching for AI.

In CampGPT, we outline four rules for teaching with AI, our camp rules, and they spell out this acronym here, GEAR: goals before tools, explore and have fun, avoid bugs, and remember to buddy up. Now, these are all connected to different ethical AI principles.

So when we think about goals before tools, this is our EdTech Center mantra. This is how we have approached educational technology at World Education for a long time, AI or not, and we think it's really about being user-centered and having a strategy and a purpose for using technology in the first place.

So here we're talking about, when you're using generative AI, knowing what your goal is, what you really want to accomplish, what's the challenge you're trying to overcome, what is the need you're trying to meet, what's the problem that you're trying to solve, and then you're much better equipped to choose the tools that will actually support you in getting closer to that goal.

And then secondly, explore and have fun. This is all about building our confidence, agility, and something we call digital resilience, right? So if we have an opportunity to explore and experiment and have fun with it and do it in a low-stakes environment, then we can build confidence to bring AI into the classroom and do that ethically and effectively.

And then we talk about avoiding bugs, which are the pitfalls and drawbacks that we know are inherent with many generative AI systems. And finally, remembering to buddy up. And this is about human-centeredness, and really relying on one another as humans, trusting ourselves as the human experts and the teaching experts, and really thinking about the human component of using generative AI.

So like I said, I wanted to give you guys a skim of the surface of what we cover in CampGPT, but really what I want to focus on is sharing the work that was done. So I'm going to start with this example of Gunhye Oh's work in CampGPT-- and I'm maybe mispronouncing that, and I apologize if I am. Since we're looking at these different rules, I wanted to connect them to their work.

So if we think about goals before tools, the goal here in Gunhye's work was to help create math questions that could be used in Schoology assessments and autograded. So specifically looking for something that would generate some multiple choice questions so that learners can get that immediate feedback. That's the goal.

So then we move on to experimenting. And they experimented with a new tool, Magic School. And we in CampGPT talked about different prompting strategies. And so the idea was to experiment with some new tools, and also explore prompting strategies that would help you reach your goal. Again, A is avoid bugs.

And so we also wanted to make sure that there was time for reflection and sharing what was uncovered in our experimentation. And what they shared is that they ran into this bug of inaccuracy. In Magic School, there actually is this little disclaimer when you go to generate multiple choice answers-- it's the little note right at the Answer Key title.

So it says Answer Key, and then in parentheses it says, always review AI-generated answers for accuracy. Math is more likely to be inaccurate, and that is exactly what Gunhye found: the questions and the supposedly correct answers in the Answer Key that Magic School provided were not actually correct.

So in that case, Gunhye did encounter that bug that Magic School, thankfully, is being very transparent about. But it was a good learning experience to experiment a little bit with Magic School to reach this goal, and then also be able to reflect and see that using it for math-related multiple choice questions may not be the best use, or, at least, not now until it improves and works out some of those bugs.

And then, of course, we come to this concept of remembering to buddy up and that human-centeredness and collaboration. And we facilitated that by sharing our worksheets with each other. And so Gunhye actually posted the work that they did in a Padlet so that others could see the example and learn from it.

Another example is by Hillary Estes. And these are your colleagues. I mean, it's possible that you may know these people, which is awesome, but these are your colleagues in California. So Hillary did their assignment for CampGPT. And so starting with the goals before tools, Hillary's goal was to generate HiSET-style essay prompts to use in student writing practice.

And so Hillary decided to try this with two tools. Again, I just love that Hillary tried out two different tools. That's part of our experimentation: we should be trying these tasks in different tools and comparing and contrasting. It's a great strategy for finding a tool that's really going to work for your goal. So Hillary tried it in ChatGPT and Claude, two different chatbots built on generative AI foundation models.

So Claude-- I'm sure you've heard of ChatGPT. Claude is very similar to ChatGPT in interface and capabilities and all of that. Not the same, but similar. So Hillary experimented with those two tools and used this strategy called prompt chaining, or having just a really detailed dialogue and back and forth with a chatbot until you get the desired result.
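To make prompt chaining concrete for anyone who wants to script it rather than work in the chat window, here is a minimal sketch using the OpenAI Python client. The model name, the system message, and the example follow-up instructions are illustrative assumptions, not the actual prompts Hillary used; the point is simply that each new request carries the earlier turns, which is what makes it a chain.

```python
# A minimal sketch of prompt chaining with the OpenAI Python client (pip install openai).
# The model name and the example prompts below are illustrative, not from CampGPT.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The running message list is what makes this a "chain": every new request
# includes the earlier turns, so the model can refine its previous answer.
messages = [
    {"role": "system", "content": "You help adult educators draft writing prompts."},
    {"role": "user", "content": "Draft three persuasive essay prompts suitable for "
                                "adult learners preparing for a high school equivalency exam."},
]

follow_ups = [
    "Rewrite prompt 2 so the topic is relevant to working adults.",
    "Now shorten each prompt to two sentences at roughly a CEFR B1 reading level.",
]

for follow_up in follow_ups:
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    answer = reply.choices[0].message.content
    print(answer, "\n---")
    # Keep the dialogue going: append the model's answer, then the next instruction.
    messages.append({"role": "assistant", "content": answer})
    messages.append({"role": "user", "content": follow_up})

# One last call to get the refined version after the final follow-up.
final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(final.choices[0].message.content)
```

The same back-and-forth happens when a teacher does this in the chat interface; the script just shows why each follow-up can build on what came before.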

And what Hillary uncovered was a lack of precision. And this goes back to what I was saying earlier about how we start with these really general-purpose tools like ChatGPT and Claude. They are not fine-tuned for a very specific use, and thus, ChatGPT wasn't familiar with HiSET-style essay prompts. We all know, maybe, that it's challenging to even explain to our friends and family members what we do as a profession.

It may be challenging, if you onboard new teachers, to break down for them all of the different acronyms we use in our field. And so we can imagine how the developers of these tools maybe didn't have our specific use case at the top of the list when designing them to be useful for something like a HiSET-style essay prompt. So that was a bug that Hillary ran into: it can produce great language that sounds good, but it may not be familiar with these more technical or specialized aspects of your work.

And then finally, Hillary shared this, not just through the Padlet and sharing a worksheet, but also by sharing a link to the prompt and the conversation that she had with ChatGPT. That's what we're calling open prompts. And what we have done at the EdTech Center is we have actually compiled what we call the Open Prompt Book. We think of an open prompt as a prompt that is shared openly so that others can see it, and also a conversation or a prompt chain that others can actually pick up and use.

So we developed the Open Prompt Book, which has links to various prompts for teachers who are maybe just getting started with these tools, or who want to try someone else's idea. In the Open Prompt Book, you'll find a lot of different use cases. These were all created by adult educators in a CampGPT that we ran over the summer.

And on each page, you have a different prompt that was created. So this on the right side is the prompt. It's color-coded by a prompt framework. And it gives you a real example of what teachers in adult education are using these generative tools for. And it might also expand your thinking: oh, I want to use it for a multiple choice assessment, but I also could use it for this, right? Just to give people new ideas and give them an environment where they could really experiment.

So this one, this example here is from a classroom observation checklist. We saw use in professional development, teacher evaluation, which is what this is. We saw use in math, science, ELA, ESL, and then even administrative tasks. So the Open Prompt Book is linked here on my slides, and I'll put a link to it in the chat. And even if you Google Open Prompt Book EdTech Center, it'll probably come up for you. But it's got just a lot of great examples. I encourage you guys to dive into that and see what other teachers are doing.

And throughout the Open Prompt Book, there are also some explanations and considerations around what ethical AI use means, some things to be cautious of, and some ways that it can help you. So some of those potential and pitfalls are sprinkled throughout the Open Prompt Book. This is an example of bias, which is very plain to see in images that are generated by AI and may be more subtle in text that is generated by AI. So these kinds of examples are sprinkled throughout.

So I wanted to-- and I'm sorry, I know I used Mentimeter at the start, and now I'm using Slido, but this is a completely new one. This isn't from CampGPT. Thank you. Catherine says, yes, lots of bias in AI, particularly in areas of lending within business. Laura says the prompt book is all kinds of awesome. Well done. Thanks, Laura. We really hope that it's useful. We thought it would be great, too, just for teachers to see what other teachers are doing.

So join slido.com. Enter the code. Scan this QR code. I want to hear from you guys a little bit about how these different rules of goals before tools and experimenting and avoiding bugs, how are these playing out for you in your use of generative AI? So please join the Slido. I know we're switching, Mentimeter now Slido. And I'm going to give a minute here and you guys can-- you'll always have the instructions over here on the left. So you can still scan the QR code or enter this code at slido.com. Oops, that's not what I wanted to do.

OK. So first question, which of these generative AI tools or features have you used? We have ChatGPT-- and you can select multiple-- Bard, now Gemini, Claude, Magic School, Pace AI, Khanmigo, Quizlet's Q-Chat, Diffit, Perplexity, and I know there are others. If you're using something else and you're online, you can put it in the chat. If you're in the room, you should turn to the person next to you and tell them what it is that you're using.

So let's see. We've got lots of people have experimented with ChatGPT. And second up is Canva's Gen AI features. I'm right there with you. I use those a lot because I already used Canva a lot. So it's great when these are built into the things that we already use. And equal with Bard and Gemini.

We have some Padlet's "I can't draw." So maybe you've used Padlet. There's an I Can't Draw feature that actually has been around before ChatGPT, before all of this craze happened. They have had this AI image generator in Padlet. Maybe they weren't getting enough credit for it.

OK. Laura shared, in the chat, twee.com, T-W-E-E.com. I have not checked that out yet, but I certainly will. Diffit is a tool for differentiation. Khanmigo is primarily a K-12 tool; it's a chatbot that teachers and learners can use in Khan Academy. Pace AI, that's our friends who are supporting English language learners with generative AI.

Perplexity is a search engine-type tool that is supposed to be better at giving you search results along with citations for those results. Quizlet has their Q-Chat, which helps you-- goodness, I keep clicking on this slide and it's advancing me. OK. Quizlet has Q-Chat. That's a chatbot that helps you practice vocabulary. Magic School has all those different content generators. OK, cool. ChatGPT remains the winner of that.

So in a word, what has been your goal for using generative AI? So if we think back to that goals before tools, what has been your goal? So keep it to a word. It's going to make a word cloud. So maybe we could say like assessment or creativity. What are you really looking for it to help you with? OK. Laura says Microsoft Copilot. That's right. I didn't include that one. Microsoft Copilot is really, really embedded into the Microsoft suite of tools. Thanks, Laura.

OK. So we have saving time, creativity, create new ideas. Catherine Knox is saying, I will share that employers are now training employees to create bots for lower-level job duties, and now they are starting to expect students to know how to do this. That's interesting, Catherine. I have created some GPTs.

I don't know if others have created those, but in ChatGPT, you can create your own specialized chatbot. And, Catherine, we'll talk in a little bit about this expansion into digital literacy and AI literacy, which is critical when we think about the future of work and exactly what you just said in the chat: that this may now become a demand on students, to be able to develop and also interact well with these AI systems.

Save time is big. And I was on a panel recently and somebody brought up a great point, which is that we want AI to support us with this, and that is great as a teacher-- anything that will save us some time. But, man, we should also look at the why. Why do we need to save so much time, and what other supports do teachers need because they're so pressed for time? But absolutely, I'm glad that teachers are finding ways to save time with these tools. And grammar lessons, time saver, review, new ideas. OK, cool.

Let's go to our next question. What bugs or drawbacks have you discovered? So this is open-ended. What are some things that you have uncovered that you're not liking? Bias is one that gets talked about a lot. Inaccuracy is another one like we saw in that. Yeah, not all information is accurate. Sorry, jumped ahead there.

Yeah. Yeah. Asking the right prompts. I mean, there's the term prompt engineering, which bothers me because I'm like, if the technology works really well, we shouldn't-- I mean, prompt engineering is also for developers of these tools, so I recognize they maybe need that term. As teachers, we shouldn't need to really engineer prompts. We should be able to interact with it in a really great user experience that is very intuitive. So I'm looking forward to them further developing and specializing so that it is a little easier in terms of prompting.

Lack of understanding of ESL literacy. Absolutely. Not able to answer. Too general. Too general is so big for me. I mean, it-- oh gosh, I keep doing that. They are so useful, but they're not that-- in terms of ChatGPT, it's not that creative. So it'll give a good starting point, but it's not going to give you anything groundbreaking. Not everyone is open to AI. That's a challenge.

Old data. Bias. Incorrect. Can't understand. You need to vet it. That will always be important no matter how far we advance. I think it'll always be important to have a human intervention. And then students' access to devices. OK.

Understanding what it has access to. Yeah, like knowing where did this information come from. So that's a lack of transparency, I would say. All right. The art and accuracy is in the prompt. Well, I think the art is in the prompt. I think still, it will still generate inaccurate results, and I think then the accuracy is in the human verification component. Absolutely. Agreed.

All right. So we learned a lot about teaching with AI in CampGPT and explored those topics. We also talked about teaching for AI, which is the knowledge, skills, and attitudes associated with how artificial intelligence works, including its principles, concepts, and applications, as well as how to use artificial intelligence, such as its limitations, implications, and ethical considerations. So that's a big word salad, but so many important words here like applications, right?

So learners really need to know how they can use it in their lives. I am a parent, and so I always think about, OK, if I have an adult learner in my class, what are some ways that I could even integrate some family literacy concepts? Ways that this could be used to support a parent by making a meal plan, or helping with some of those tasks, to write a birthday invitation or whatever it may be.

So knowing some of the different applications, knowing, of course, its limitations, ethical uses. Catherine says ethical use is very important and Catherine's working on building out a course on this. Excellent. Good to know, Catherine.

Implications are important. What are the societal implications? What are the implications for my work, my future work, and opportunities, ethical considerations? So this is a hefty definition, but I think it's a good one. And it came out of TeachAI. I encourage you to look-- check out the resources that they're developing. They will have an AI literacy framework coming out.

They're focused very much, again, on K-12 education, but we've been drawing on a lot of their resources and adapting them for adult education. So they've come out with a guidance for schools toolkit. If you are in a leadership position and it's part of your responsibility to develop guiding principles or policy around AI, that's a great toolkit to check out. Thanks, Catherine. Catherine put the link to teachai.org in the chat.

So we think about knowledge, skills, and attitudes that learners need. We think about, OK, what do they know? I know what about AI? What do they know how to do, or what can they do? That's the skills component: I can what? And then the attitudes, or the responsibilities, as I like to even think about them: I must what? What am I called to do in this AI world?

So in DigComp-- that's the European Union's digital competence, or digital literacy, framework-- they have developed some knowledge, skills, and attitudes relevant to AI. So when we think about knowledge, maybe it's just an awareness that the sensors used in digital technologies generate large amounts of data, including personal data that can be used to train AI. So if I know that my Apple smartwatch is tracking my steps and location, I know that data could be used to train an AI system. That's a knowledge and awareness that learners need.

Skills. So knowing how to modify user configurations to enable, prevent, or moderate an AI system tracking, collecting, or analyzing data. So here, it's a skill. It's a competency. It's an "I know how to change my settings in an app to protect my own data." And you'll notice a lot of these have to do with data.

And then finally, an attitude. So my attitude or consideration: I must consider ethics as one of the core pillars when developing or deploying AI systems. So knowing about AI ethics is another important part of that AI literacy.

So that's the what. What is AI literacy? And the question then is, how do we execute this with adult learners? And I think that what's really key is to integrate it, just as we do with digital literacy, into other skill areas and into what's relevant for adult learners. So this trio we have here is the academic skills that learners need, the cognitive skills that learners need, and then how those get applied in a digital environment.

So some examples of that integration might be integrating data privacy. OK, yeah. And Catherine just shared a link to Day of AI. There's lots of curricula on that site. Thanks, Catherine. Integrating data privacy concepts into a social studies lesson on human rights-- that's a way to integrate AI literacy into social studies, or maybe you teach ELA and cover social studies within ELA. I know we're always layering and layering and layering. How many skills can we address in a one-hour lesson?

And then integrating computational thinking into a language lesson in which learners are describing the steps of a process. That's another way you could integrate AI literacy. So once you begin to explore, those connections, I think, become a lot clearer. We at the EdTech Center and at World Education, we have a framework called BRIDGES, a digital skills framework.

And what we've been thinking about is really just, what is the extension of the existing digital literacy that we have mapped out and are supporting teachers in implementing? How do we extend that into AI literacy? So these are the different domains of BRIDGES.

We have the foundational domains of gateway skills, device ownership, privacy and security, and mobile, then we have independent learning domains, and then we have productivity domains, which would be a lot around communication, creation, workplace, PowerPoint, spreadsheets, et cetera.

And so we're thinking about, where does AI fit in? And with the BRIDGES framework, we've started with just writing out some I can statements that connect these digital skills domains to competencies or skills that are related to AI literacy. So an example is, I can use a website's chatbot to find information I need to troubleshoot. So that's part of that application of AI within AI literacy.

I can check and report when software I use is acting on harmful stereotypes. That's part of that awareness. Yes. Thank you, Catherine. Catherine put a link to BRIDGES in the chat. And for those of you who are in person-- well, first, thank you, Catherine, for putting the links in the chat-- this is at digitalskillslibrary.org/bridges.

So yes, reporting on harmful stereotypes. I can make an informed decision about purchasing an in-home virtual assistant-- that goes back to data privacy, personal data, and that awareness. I can use store apps to shop for breakroom supplies, which allows the store to learn about my shopping habits and preferences. So there, it's just knowing that AI can learn and recommend to us according to our interactions with it.

These are more in the independent learning portion. I can provide feedback on movies to get similar recommendations in the future. Again, recommender systems are AI-driven. And so knowing that the actions you take within an application will impact those recommendations that you get, that's part of AI literacy. I can determine whether a help chatbot is human- or AI-based. Part of AI literacy.

I can use strategies to verify whether a shocking video of a known person might be a deep fake. This will be so important for the elections this year. It's something we really need to talk about with learners. And that one connects-- and many of these AI literacy skills, knowledge, and awareness pieces connect very much to information skills.

And I can talk to my colleagues about emerging educational technologies and see how they apply to our work. So that's a teacher I can statement. And it's related to that lifelong learning, and digital resilience, and being able to collaborate and talk about new technologies and fit them into our practice.

And then finally, in this bucket of productivity, we have one about adjusting social media content preferences. We have another about writing a prompt to generate original images and music clips for a class project. That prompt writing and generation of content is very closely connected to various creation and workplace digital skills.

And I can avoid echo chambers and look for differing opinions on a topic, understanding that my actions impact what I see. So, again, how I interact with applications will then lead me to content that has been informed by those interactions. So that goes with the communication and social media digital skills.

These are some examples that we looked at in CampGPT of how AI literacy is integrated in education. So this is an example from dayofai.org. It was the one I just mentioned about talking about human rights and then seeing how AI connects to that, and what an AI bill of rights would or should look like.

These are AI snapshots. These are from aiedu.org, and they're really fun. They're like short, little, exactly what they sound like, snapshots. Quick little ways, one slide to put into a lesson to talk about AI and talk about AI ethics and benefits and harms. So I really love those snapshots.

This one is also from aiEDU. They have longer lessons in addition to the snapshots. A lot of them are K-12-focused, but they are-- all of these resources are free and they're openly licensed, which means that you can go and take the lessons and adapt them for your adult learners. So this one is an interview with ChatGPT. It goes back to some of the examples that we've seen from educators around having learners interact directly with these systems, and then maybe evaluate how that went for them.

So when we were in CampGPT, we actually wrote some integrated objectives trying to think about, OK, what is the, maybe, academic or the main objective that I'm working toward in my class, depending on what type of class I teach? And so we started with that main objective, and then we put that main objective into a digital environment, and then we connected that to an AI literacy skill to develop these integrated objectives.

So this is an example from CampGPT. Students will be able to-- now, this is the main core objective, create a functional resume. And then the digital environment is in Google Docs, and the AI literacy is describing its strengths and weaknesses for withstanding an AI-driven applicant tracking system. So if we're talking about resumes, then we also have this opportunity to talk about how those resumes get analyzed by AI and filtered by AI in some applicant systems.

So we were just trying to, again, make the connections between what we teach and how we could integrate AI literacy into that. So in this one, again generated by one of the Campers at CampGPT, students will be able to conduct a job search. That's your main objective. The digital environment is the Indeed job search website, in order to identify one job that meets their needs in terms of work hours and salary.

And this person said, now, how can I incorporate awareness of algorithmic bias? And what I loved about this and why I left it in this form is that we can all-- I just wanted to emphasize that we're all still learning, and we should all keep talking. But even just being able to start with an objective and have an idea of what the AI literacy connection might be like algorithmic bias, that's a great starting point. And from there, we can begin to talk to our colleagues, search online, and get to know that better, and then think about how we build it into a lesson.

Oh, thanks, Laura. Sorry, guys. There was a typo on one of the previous slides. So it's aiEDU. I guess there was an extra D in one of those, aiedu.org.

So we were working on integrated objectives, and then I loved this one that someone contributed which I felt like was a big goal that we all could work toward. By the end of the year, students will proficiently navigate digital platforms, critically evaluate online content, and demonstrate a foundational understanding of AI concepts and societal impact through project-based learning activities.

I loved this inclusion of through project-based learning activities. Another thing we talked about in CampGPT was that when we think about digital transformation, it's not just about us having the skills and the tools and using those tools. It's also sometimes about changing the way that we teach.

And we can think about the COVID-19 pandemic and how it wasn't-- during that time, we adopted a lot of new tools, and we used a lot more technology in teaching. And we also, in some ways, very much changed the way that we teach and the way that we design learning experiences.

Hi, Owl Camera, you're a great example of how we can teach HyFlex. And it's not just about getting an Owl Camera; we then have to actually think about the activities that we have to facilitate when we are supporting learners both in person and online. So I love this inclusion of through project-based learning activities and thinking about, OK, what are the evidence-based teaching strategies that will work well when we're integrating AI literacy?

And as students begin to use generative AI tools, what are assessment strategies that we need to start looking toward? We know a lot of people are talking about essays and writing and how to assess writing now that generative AI is so capable and so widely used. So really thinking ahead to the future of how this will shape the way that we deliver learning and the way that we assess learning.

So we're going to get into a Q&A. We'll do Q&A maybe for five minutes, and then I want to hear from-- actually, let's start with Babs. Babs, can you take five minutes? Babs was one of the participants in CampGPT, and I thought it'd be great to hear from someone directly, maybe about some of the things they've learned or tried. So, Babs, I see you. I'm going to stop sharing so we can all see you and the Owl will follow you.

Speaker 2: So I was very fortunate to be in Rachel's CampGPT in February. And so I've been working in ChatGPT since it came out. I've been using it. I was an early adopter because I saw the power in it right away, but I didn't have any idea what I was doing, and I was afraid of it because I didn't want to open a Pandora's box or something. It was fear. It was just based on fear.

So I haphazardly stumbled through creating lessons without any guidance in the first part of 2023. And then after a year, I talked to a few people like Susan Gaer, who's part of OTAN and CATESOL. She did a workshop at CATESOL last summer that taught about prompt engineering. So then I started dabbling in that.

And then doing CampGPT with Rachel, she pointed me to the Open Prompt Book, which is a guide, which is really a great thing, and then the prompt framework. Because before, I was just going into it haphazardly, not really considering making a solid plan, the GEAR aspect of this, and then looking out for biases. The prompt engineering-- the Open Prompt Book really does cover that.

So what I do is I use ChatGPT for discussing concepts and how to approach them. I teach ESL level 3, and I use it to figure out how to approach a lesson, to generate text, and to write step-by-step instructions for student groups to work together. I used mostly ChatGPT, and then someone at World Education, Destiny, told us about Twee for generating text worksheets, gap-fill activities, et cetera. So I started using that, dabbling in that.

And most recently, I wanted to teach my students how-- I wanted to address both digital literacy issues, and I wanted them to be able to practice English and really get the English in their mouths and in their heads. So I thought, well, how about having them do a-- teaching them how to do a Google survey in-- or a survey in Google Forms.

So my goal was for beginner level 3 learners to learn how to create a survey in Google Forms, to learn the types of questions to put into their design, and to implement the survey with students personally from their Chromebooks, as well as to be able to send the survey that their group created to the other classmates. And each group had their own topic related to digital literacy or online communication: texting, email, social media.

And then the other part of the goal was to develop greater digital literacy, to practice formulating three types of survey questions in English, and for them to be able to implement it, which would touch on each aspect of reading, writing, listening, and speaking. And so the prompt-- what I started out with is that I just asked ChatGPT, "What do adult English language learners need to know at the A1 CEFR level?" That's the European standard, but it feels like it covers my level 3s.

And so what do they need to learn by developing and conducting simple surveys? And we had this conversation-- I had this conversation with my new friend, ChatGPT, and then that led to-- at the end of my discussion with ChatGPT, I asked the chatbot to generate a lesson plan, step-by-step instructions for the groups to follow, and three types of survey questions, which are frequency, yes/no, and open-ended questions.

And then, also, I worked on design options to ease the implementation of the survey. And we did that last week, and it was just an awesome class. All week long, the students were just interacting with each other, creating questions with one another, correcting each other-- oh, no, no, no, not that one. We're going to do it this way.

Then there was a design-- you know how-- I don't know if you're familiar with Google Forms, but you can write a multiple choice question, or you can just write it out as a multiple choice and you put it in the options. Or you can hit the multiple choice grid, and then it gives circles for each student that you survey.

So it's a grid where you've got five students that they survey, and then a frequency question, how often do you text, always, often, sometimes, rarely, never. And I had them do just the option part first, and then I realized it'd be easier for them to implement the survey if they had the grid. So I went in-- and this is part of my human intelligence, right? I mean, my human intelligence has been augmented with AI, versus the other way around. You augment your own-- yeah, yeah.

AI's not using my-- I guess it's symbiotic. So rather than going back and changing all of those questions, all they had to do was go into the grid. They didn't have to take out any of the always, sometimes, never options. They just had to click on the grid, and it would automatically populate that grid.

I didn't see it, but one of the other students in the class who's done Google Forms in Ukrainian knew it right away. And so she was able to go around the whole classroom and say, look, this is what's cool about this. And so I just found-- I'm finding that using ChatGPT and AI, it has totally lifted the weight-- it's--

Speaker 3: Elevated.

Speaker 2: Elevated. Yes, it's elevated the way that I teach, and I think I am teaching much more effectively now. But there are some-- sometimes it generates language that's way too high for them, even though I'm requesting a certain level. So I'm finding that I really have to check it. I can't just go with what AI gives me right off the bat. And next week or this coming week, we're going to learn how to analyze the survey data results. And I'm just so-- I'm looking so forward to it. So that's--

Rachel Riggs: Thanks, Babs. Thank you.

Speaker 2: Thank you, Rachel, for all of your guidance and support. You're my new guru.

Rachel Riggs: I'm so honored. Thank you for sharing, Babs. I know we're coming up on time. I wanted to mention that Babs mentioned a prompt framework that's in the Open Prompt Book. We have this RACEF framework that teachers have found to be really useful.

Another thing I wanted to mention, which, Babs, I'm so impressed by what you're doing, the Office of Educational Technology just put out the National Educational Technology Plan for 2024. And it talks about the digital use divide, and it talks about how some learners only have the opportunity to be passive consumers of technology and respondents to surveys, and we're not giving them enough opportunities to be creators and disseminators of surveys, and analyzers of data and stuff. So that's awesome, Babs. I think that's such a great step in the right direction.

Thank you, Babs. Thanks, everyone, for joining. I'm happy to hang out for questions, but I know you guys have some amazing sessions today. So I also just wanted to mention that we have an EdTech Center newsletter, and we're active on LinkedIn. So if you want to stay connected to the work that we're doing, I hope that you will stay connected to us. Thank you for your time.

Speaker 2: Thank you, Rachel.

Rachel Riggs: And have a great day.