Today’s post is the latest in a two-year series on artificial intelligence in the classroom.
‘These Tools Aren’t Magic’
Jane Rosenzweig is the director of the Harvard College Writing Center, writes frequently about AI and education in her Writing Hacks newsletter and elsewhere, and publishes The Important Work, a newsletter written by high school and college writing instructors about teaching writing in the age of AI:
There’s a lot of discussion—and disagreement—about whether and how students should be using generative AI tools in the classroom. But as these tools become more and more widely available, what we do know is that our students are going to be using them and that we need to be able to talk to them about the role of generative AI in their education.
I’m wary of making grand pronouncements about how to talk to students about using generative AI tools when things are changing so quickly. But I do think there are some guidelines we can follow when deciding whether to use these tools inside or outside the classroom.
1. Teachers should understand the basics of how generative AI tools like ChatGPT actually work.
I’m not suggesting that every teacher has to become a tech expert—I’m certainly not one. But if you’re going to use these tools or encourage your students to use them, it’s helpful to understand how these tools are trained and how they generate output.
Here’s an example from my own class: On the first day of class in the fall, one of my students mentioned that she really liked using ChatGPT because it’s more objective than humans. If you believed that, it would definitely shape how you use ChatGPT. But it’s not actually true: AI tools like ChatGPT can only answer questions based on what’s in their training data, and that data is drawn largely from what’s available online—not from some objective or all-knowing source.
AI tools also “hallucinate”—meaning they sometimes give you plausible-sounding but inaccurate information. Students find it interesting to learn how these tools generate output, and you can explain this in ways that are grade-appropriate. Here are some resources that I’ve found helpful for learning how generative AI tools work.
This explainer from the Financial Times walks through how large language models work, with helpful examples.
If you want to take a deeper dive, try this article, “Large language models explained, with a minimum of math and jargon.”
2. Talk to your students about what you want them to learn, not just about what tools like ChatGPT can do or whether they are allowed to use them.
I think it’s helpful to look at the use of generative AI tools in terms of what problems you’re trying to solve in the classroom. (In fact, I teach a writing course called “To What Problem Is ChatGPT the Solution?”)
I’ve found this framework to be helpful for myself—but also for my students. I talk to them about what problems they’re solving when they use AI: Is it the problem of not having time to do the work? Is it the problem of not having an idea? Or is it an interesting, knotty problem that generative AI might help them solve in a cool way?
I also tell them that I’m not asking them to write papers because the world needs more papers; I’m asking them to write papers because it’s one way of thinking through a problem—and then we talk about how using AI at different points in the writing process may or may not get in the way of that thinking.
There’s a big difference between telling students to use or not to use generative AI and telling them why what you want them to do matters in the first place. Framing things this way may not always stop students from using these tools in ways you think are counterproductive—but it will help students understand where you’re coming from.
3. Be aware of the difference between useful and unhelpful ways of using these tools.
We’ve heard a lot about how AI tools like Khanmigo can provide personalized tutoring. But some teachers are finding that students using these tools are not engaging with them or learning from them, and that the way Khanmigo helps students is sometimes different from what you’d do in your own classroom.
If you’re asking your students to use AI tools, it’s going to be helpful to be aware of how the same tool you’ve set up to enhance learning could get in the way of that learning. Dan Meyer offers a useful example of this over at his newsletter, Mathworlds.
4. Don’t remove the friction from the learning process.
Tools like ChatGPT are being marketed as efficiency tools—tools that will save us time so that, as OpenAI says, we can focus on other things. But learning requires time, and it requires friction.
If you’re going to use AI tools with your students, it’s useful to consider how you’re setting up assignments to allow for that productive friction.
When I made a chatbot to help my students practice counterargument, some of them were surprised that the chatbot didn’t enable them to do the work more quickly. But I wasn’t trying to help them be efficient; I was trying to help them learn something complicated.
I’ve written more about friction and learning here. This piece on friction and time-saving is a great overview of the conversation about friction and AI, with a focus on Magic School.
5. Beware of the hype.
It seems like new tools are being released every day, and I’m the first to note that tools like Google’s NotebookLM, which turns any text into a podcast, are pretty cool! But they were not designed to solve problems that we’re trying to solve in the classroom. They were designed to get people to use them.
I’ve found over the past few years that when I question the role of these tools in the classroom or express concerns about the hype, some people tell me that I must be anti-technology. But that’s not true at all—I was an early experimenter with GPT and I’m very interested in all of these tools.
However: It’s not our job as educators to adopt technology because it’s cool; it’s our job to ask hard questions and think about what will help our students learn. Which brings me back to my earlier question: When thinking about how to teach your students about AI, it’s useful to start by asking what problems you’re trying to solve in your classroom and how AI can help solve those (or whether it will create new ones).
We’ve entered an era in which new generative AI tools will arrive regularly, each promising to magically solve all the challenges we face as teachers. But it’s worth keeping in mind that these tools aren’t magic—and that the way you choose to use them, or not, should always be based on what you’re trying to do in your classroom.
Thanks to Jane for contributing her thoughts!
Today’s post answered this question:
What are guidelines teachers should follow when teaching students to use or not use artificial intelligence?
Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected, or if you’d prefer to remain anonymous and have a pseudonym in mind.
You can also contact me on Twitter at @Larryferlazzo.
Just a reminder: You can subscribe and receive updates from this blog via email. And if you missed any of the highlights from the first 12 years of this blog, you can see a categorized list here.