AI Cheating Software in Classrooms

There’s a difficult reality students and teachers are contending with in classrooms: there is no foolproof tool to detect whether a student used artificial intelligence (AI) to cheat on an assignment. Some students are using AI to cheat. But many insist they are not, even when software flags their assignments as AI-generated. While detection software is marketed as the solution for singling out cheaters, a chorus of educators and students insists that it doesn’t work. Michelle Gierman, an AI strategist at the Avondale School District in Oakland County, even tells students about an experiment she ran: she plugged a 30-page paper she wrote while working on her master’s degree, written before AI chatbots were available, into detection software. It flagged the paper as AI-generated.

“There’s no such thing as a good detector,” Gierman said. Research backs this up, finding that AI detection can lead to false accusations against students. Instead, AI specialists in public schools, like Gierman, say that navigating conversations with students about cheating and AI takes more work than simply plugging an assignment into detection software. While cheating has always been a problem in education, AI has added thornier layers to it, especially because the technology is still relatively new (ChatGPT, one of the most popular chatbots, was released in November 2022).

When Karle Delo, AI strategist for Michigan Virtual, talks to teachers about AI, cheating is usually their top concern. And that’s valid, she said. Some students are using AI to cheat, but others are using it in more novel ways, such as leveraging it as a learning coach. Students feel shut down when teachers immediately accuse them of using AI dishonestly, which hurts the classroom dynamic. “If students actually did the work themselves, it brings up a lot of emotion,” she said. “It can create more distrust and create more rupture between that teacher-student relationship, especially if it’s not accurate.”

AI detection software programs “are not the solution to the problem” of students cheating, according to Delo. And if schools ramp up efforts to police students’ AI use, they risk creating trust issues between teachers and students, particularly if they rely on software that isn’t accurate. Instead, teachers should consistently share their expectations about which uses of AI are and aren’t allowed in their classes. For instance, can students use AI grammar-check software?

“When teachers actually set expectations of how you can use AI or not and why in a specific lesson, that’s really helpful because it gives students a guideline to follow,” she said. If teachers suspect a student has used AI on an assignment, Delo suggests approaching the topic with curiosity and conversation rather than flat-out accusations, asking how the student found information or raising specific concerns about the work.

Allison Green
Boston Tutoring Services