
Artificial Intelligence

Why Teachers Should Talk to Students Before Accusing Them of Using AI to Cheat

By Alyson Klein — February 27, 2025

When schools first became aware that new versions of generative artificial intelligence tools could churn out surprisingly sophisticated essays or lab reports, their first and biggest fear was obvious: cheating.

Initially, some educators even responded by going back to doing things the old-fashioned way, asking students to complete assignments with pencil and paper.

But Michael Rubin, the principal of Uxbridge High School in Massachusetts, doesn’t think that approach will prepare his students to function in a world where the use of AI is expanding in nearly all sectors of the economy.


“We’ve been trying to teach students how to operate knowing that the technology is there,” Rubin said during a recent Education Week K-12 Essentials Forum about big AI questions for schools. “You might be given a car that has the capacity of going 150 miles an hour, but you don’t really drive 150 miles an hour. It’s not about the risk of getting caught, it’s about knowing how to use the technology appropriately.”

While students shouldn’t use writing crafted by AI tools like ChatGPT or Gemini and pass it off as their own, generative AI can act as a brainstorming partner or tutor for students, particularly those who don’t have other help in completing their assignments, he said.

Rubin recalled that his daughter recently needed his assistance with a history assignment. “She has me to go to, and some kids don’t,” he said. “We do believe that the AI chatbots can sometimes be that great equalizer in terms of academic equity.”

But he added, “I did not do the work for my kid. So I want to make sure the AI chatbot isn’t doing the work for anybody else’s either.”

Rubin’s school uses a tool that helps teachers get a sense of how students composed a document they later turned in for an assignment. It allows teachers to see, for example, if a student did a lot of cutting and pasting—which could indicate that they took chunks of AI writing wholesale and passed it off as their own work.

If a teacher at Rubin’s school suspects one of their students plagiarized content from an AI tool, the teacher doesn’t launch into an accusatory diatribe, he said.

Instead, they’ll use it as a “learning opportunity” to talk about appropriate uses of AI, and perhaps allow the student to redo the assignment.

“It’s not just about giving a zero and moving on,” Rubin said.

Never assume AI-detection tools are right about plagiarism


Those conversations are important, particularly when a teacher suspects a student of cheating because an AI detection tool has flagged work as potentially plagiarized, said Amelia Vance, the president of the Public Interest Privacy Center, a nonprofit organization that aims to help educators safeguard student privacy. Vance was also speaking during the Education Week K-12 Essentials Forum on AI.

Most AI detection tools are wildly inaccurate, she noted. Studies have found that commercially available detection tools tend to erroneously identify the work of students of color and those whose first language is not English as AI-crafted.

Programs that look at whether a student copied and pasted huge swaths of text—like the one Rubin’s school uses—offer a more nuanced picture for educators seeking to detect AI-assisted cheating, Vance said. But even they shouldn’t be taken as the final word on whether a student plagiarized.

“Unfortunately, at this point, there isn’t an AI tool that sufficiently, accurately detects when writing is crafted by generative AI,” Vance said. “We know that there have been several examples of companies that say, ‘We do this!’ or even experts in education who have said, ‘This is available as an option to deal with this cheating thing.’ And it doesn’t work.”

The kind of technology that Uxbridge High School relies on gives educators “a better narrative” to work with than other types of detection tools, Vance added. “It’s not just, ‘Is this student cheating or not?’ It’s, ‘How is this student interacting with the document?’”

That’s why Uxbridge’s practice of talking to students directly when AI cheating is suspected is an important first step.

If a student admits in those conversations to using AI to cheat, “you need to make it clear to the student that is not acceptable,” Vance said. But teachers should never take the word of an AI detector—or even the type of product Rubin described—as gospel.

“Avoid ever assuming the machine is right,” Vance said.
