TRAVERSE CITY — Artificial intelligence is in the classroom – and it requires thoughtful application by educators if those tools are to be used wisely and well. That was the focus of a far-ranging discussion on AI during a Traverse City Area Public Schools Board of Education study session this week.
A growing number of educators are receiving training on using artificial intelligence in the classroom, according to an EdWeek Research Center survey conducted last October.
About 43 percent of teachers had received training on the technology, roughly 50 percent more than in the spring 2024 survey, as AI becomes increasingly present both inside and outside the classroom.
The TCAPS school board met Monday for its study session to discuss the implications and possibilities of using the technology as a tool for learning and teaching.
This was the board’s first study session on the topic, and President Scott Newman-Bale said he was glad to see that there seemed to be agreement among board members about the use of AI in the classroom – Vice President Erica Moon Mohr and Treasurer Andrew Raymond were not present – and that it should be taken seriously.
“I’m glad we’re all roughly on the same understanding that we’re going to have to be very vigilant of its progress,” Newman-Bale said.
“We’ve seen some school districts that are really going fast in a lot of different directions and we’re being really thoughtful and purposeful about how we do this, how do we do this right for our staff, for our kids and for our parents,” Chief Academic Officer for Secondary Education Jessie Houghton said.
But district officials don’t want to move too slowly, either.
“It’s already having massive impacts on career choices and things,” Newman-Bale said, adding that a young person he knew was getting out of art and told him “everyone is getting out of it.”
Newman-Bale said this was sad, but that the district could help in “retraining people on how to make these selections.”
These types of concerns surrounding AI were not the focus of the study session discussion, but one public commenter, a TCAPS teacher, touched on them.
David Richardson, a teacher at Traverse City West Senior High School, mentioned the “morally indefensible energy demands” of AI during a time of human-caused climate change. He said he worries that “tech oligarchs” are driving the social decision to push AI.
“I don’t see a democratic conversation happening about what we want the future of AI to look like,” Richardson said.
Most of Richardson’s time was used to provide a teacher’s perspective on the use of AI as a learning tool.
“I don’t think it’s super-useful in my job, in what I do personally. I’m just one teacher, this is just me. I see teaching as primarily a human connections job. I don’t think artificial intelligence is good at doing that now and I kind of hope it never is,” Richardson said.
“I kind of suspect it’s a way to continue to ask more of public education without giving us any more money – and it’s not free,” he said.
In the classroom, the goal, Houghton said, is “teaching students how to use it to get the learning out of it and not to escape the learning.”
AI can be used as additional support for kids who have a disability or may be struggling, Houghton said, giving an example of math support and tutoring that can help parents get a child “unstuck.”
“I do think there’s some room there for AI to help with that,” Houghton said.
The ability to customize learning plans to a student’s specific interest or learning style was also discussed. Trustee Holly Bird said members of the Indigenous community are “kinesthetic oral and audio learners” who could benefit from that kind of real-time help.
“My hope is that this will help us get into more real time and maybe get off the internet for those learners who learn better that way,” Bird said.
The board considered various concerns and limitations of the software, including privacy, learning outcomes, and cheating.
Houghton said the technology is best applied as a learning tool for older students.
“The Gemini tool for 13 and up is available for our students. We have it actually linked in our cloud so they can access it there. So, when the teachers are comfortable, we’re definitely encouraging them to allow it,” Executive Director of Technology Evan OBranovic said.
TCAPS uses Goblic Tools and Gemini, which puts “a wall” around the AI, OBranovic said.
“If you’re signed in through a TCAPS account and you’re using Gemini, that data is not included in Gemini’s learning language model. It’s only used to process whatever that request is and then it’s forgotten,” OBranovic said.
Teacher assistance
Teachers in elementary schools primarily use AI tools for their work outside of direct instruction; the district is encouraging them to follow the 80/20 rule.
This means using AI to “automate and handle the majority (80 percent) of repetitive tasks,” such as creating materials, simple grading, or providing initial feedback, according to board materials. This will ideally allow teachers to spend more time on the 20 percent of their work that includes personalized instruction, addressing complex student needs, and fostering deeper learning interactions.
“We do have a few teachers who are using it to provide feedback to students and they’re using it to provide feedback faster or to look for patterns,” Houghton said.
Houghton mentioned an AP teacher who likes to use it for providing feedback for class-assigned essays. The teacher gives the essays to AI and then prompts it to “give these kids some feedback,” Houghton said. She said the teacher believes most of the feedback is “spot on” and provides it right away.
The same teacher will use AI to find patterns within a class or across different classes to see where certain groups are stronger or struggling, and think about what may be causing those trends.
But Richardson said he hasn’t had that experience, and that students might be disappointed with AI feedback from their educators.
“I will say … I think overall (tech tools) have not made my job easier or more efficient — they either just reformatted the same problems in a new way or often added a layer of hindrance,” Richardson said. “I think the prospect of using AI for human learning and connection to students is really concerning and I think the idea of AI tutoring or AI feedback is dystopia.
“From my point of view, I would be pretty upset if I found out, as a student, I was getting AI feedback.”
Critical thinking
“The challenge here is the critical thinking, the ability to take information and put it into a form that gives voice. So how do we move students in a way to use this so that it becomes part of their critical thinking?” Bird asked.
This is where being aware of the capabilities of AI can help teachers plan their coursework, Houghton said. Teachers need to be mindful of the purpose of the lesson and either work with AI or remove the possibility of its use by relying on discussion, teamwork, or old-fashioned memorization, Houghton said.
“It’s not that we’re removing all these rote skills, we still need them for the purpose of then having these next level thinking and discussions. They’re still really important, there’s stuff that we do need kids to practice,” Houghton said.
Part of thinking critically is making sure students consider the potential for built-in biases in AI tools when using them, OBranovic said.
OBranovic said making students aware of AI “hallucinations,” or made-up results, and of the emerging issue of large language models training on their own AI-generated responses (after incorporating all available human-created content) are all opportunities for deep discussions with students.
“AI will have its own kind of built-in biases based on all the information it’s gleaning, just like we all create our own biases … and that’s really important that they have that concept,” he said.
AI cheating
The urge to use AI for assignments is related to the discomfort students feel while learning, Houghton said.
“Learning is actually uncomfortable because ‘I didn’t know it and I have to make lots of mistakes,’” Houghton said. “You actually have to have a lot of those conversations, especially with teenagers, about it.”
School board members were concerned about the use of AI to avoid the learning process altogether and that students would use the technology to cheat.
“I would think that the biggest concern is kids going and having an essay written by AI or something and it not being their own thoughts, not being their own work, and the same level AI can actually detect that,” Bird said.
But it turns out that AI detection software isn’t very effective. OBranovic said that, when large language models like ChatGPT were released, his inbox was flooded with pitches for software that claimed to detect the use of AI.
“I don’t think those do a very good job of doing that. I think, for the teachers, it’s usually just more obvious knowing their student,” OBranovic said. “The hit rate is pretty awful and then you’re going to end up accusing some poor kid who maybe just happened to hit the nail on the head and didn’t do anything with it.”
He said again that the student-teacher relationship needs to be the foundation of learning.
“If the student’s just generating an essay and submitting it, and the teacher’s just taking the essay and putting it in the checker, and then that’s all the interaction, we’ve kind of failed in this use of AI,” OBranovic said.
A good way to avoid misuse of the technology is through assignment instructions that explicitly state which technology students can use for each part of the assignment, Houghton said.
“I’ve seen the best success in the classrooms with that because kids just want to know … most kids are like, ‘Cool, OK,’” said Houghton.
Houghton said that teachers are currently using a wide range of AI guidelines, with some assignments allowing free use of the tool, some with limited access, and others that allow no use of AI at all.
For learners who struggle with certain tasks, AI can “raise the floor,” Houghton said.
They can use it to create a first draft of their ideas that they then “do the work of making it better and polishing it,” Houghton said.