Education Week

Artificial Intelligence Opinion

AI Has Done Far More Harm Than Good in My Classroom

Maybe AI can be conducive to deep learning—but not in my experience so far
By Lauren Boulanger — August 07, 2025 4 min read

When I joined my district’s artificial intelligence committee earlier this year, we began by developing a shared philosophy that would preface our new AI policy.

“How can we say that we welcome AI?” asked a high-level district administrator. “I want to be clear that we aren’t afraid. We are embracing it.”

I winced. While administrators are eager to prove their innovative spirit, my experiences have led me to believe that integrating AI in classrooms will do more harm than good.

Since 2022, I’ve seen upward of 100 AI-generated responses that students have submitted as “original” work in my English/language arts classes. To anyone familiar with student writing, the difference between a chatbot’s response and a high schooler’s work is easy to spot.

However, it is difficult to definitively prove that a piece of writing is AI-generated. Detectors have varying levels of reliability, and even the most reliable detectors sometimes generate a false positive.

Instead, I rely on Google Docs’ history with plugins like Draftback or Revision History to watch students’ drafting process in real time. Students using generative AI typically just paste in the bot’s output as a large block of text. My course syllabi are clear that I need access to students’ editing history to verify academic integrity.

But last spring, students caught on to my strategy and began typing out AI-generated responses by hand. This creates an artificial “drafting history,” one that usually shows the response was written in a single sitting of 15-30 minutes, without any significant revisions. Of course, this is nothing like human writing.

But is this enough to ethically hold students accountable for cheating? Not quite. Even if generative AI is allowed in classrooms, how can educators draw the line between ethical and unethical use—and hold students accountable for crossing it?

Those in favor of AI integration argue that students have always cheated: If a student wants to avoid work, they can find something to copy. But careful practitioners could once craft assignments in which plagiarism was easy to detect.

With AI, however, students can avoid any intellectual labor in an unprecedented manner. My students have even used it for purely opinion-based questions like, “Which character in Gatsby is most insufferable and why?” or personal reflections like, “Describe a time you knew you were learning.” AI can answer most prompts, regardless of how personal or creative, with varying levels of accuracy.

Some would argue that this means we need to rethink our questions. This may be true to some extent—but aren’t these prompts still worth thinking about?

Education is about the process of learning, not the product. I ask my students to write short stories because I want them to engage in the difficult work of developing style, character, plot, and setting that all work together to create a thematic statement—not because I am in desperate need of 55 short stories.

Writing is thinking; it is a generative and metacognitive process. Writing is also relational, as writers have to look within themselves to connect with others. AI may make writing more efficient, but efficiency is not the goal. Intellectual challenge is what produces the learning.

Lately, I have to brace myself whenever I read student work. I want to believe the best about my kids, but AI has complicated this. Distrust creates a barrier between me and my students that feels foreign; it reminds me of the crabby old teachers I was warned about in graduate school. Every new teacher is cautioned to stay away from colleagues who believe that young people will lie, cheat, and steal whenever given the opportunity. Adolescents don’t want to learn from people who antagonize them, and I don’t want to be one of those people. Any false accusations signal to students that we doubt their ability, which can be emotionally crushing, even if the student is cleared of wrongdoing.

To be clear, I am not advocating that AI should never be used in the classroom. Using it sparingly and with purpose can have a positive impact. AI has theoretical benefits for personalized learning; it can also generate model work for critique, engage students in dialogue about a text, suggest organization strategies, and more. I’ve attended professional development workshops and read compelling case studies from classroom teachers and ed-tech companies where these strategies are presented as tools that can boost learning.

The operative word here, though, is can. AI can be used in supportive ways that are conducive to deep learning, but that is not how most students in my classroom are using it.

Teaching students to use AI ethically does not mean they will stop using it to avoid cognitive labor, no matter what we’d like to believe. And even if students do use AI in these more ethical, supportive ways, it does not necessarily provide better assistance than a capable peer. Offloading the feedback process to a machine deprives students of the opportunity to collaborate. Rather than using a chatbot as a sounding board, I want my students to use one another. That way, both the givers and recipients benefit from the exchange and develop essential collaborative skills in the process.

When I expressed these hesitations in our AI committee, I was told that “the train is leaving the station whether we are on it or not, so we might as well climb aboard.” But where exactly is the train headed? And are we sure that’s somewhere we want to go?

For my own classroom, I will largely be going back to pencil and paper next year, and most writing will be done in class. I don’t want to waste time or squander relationships in trying to determine whether a student’s writing is their own. I want them to practice and grow in their skill and confidence. I may integrate AI periodically if I feel it can meet a need in my classroom. But I want to make this choice myself and not let the current zeitgeist make it for me.
