
‘We Need to Reimagine What We Can Do’: How Teachers Are Adapting to AI

By Jennifer Igbonoba — August 20, 2025

The role of artificial intelligence in classrooms is rapidly evolving, and Ķvlog are divided over how to handle it.

Teachers are using AI to draft lesson plans, brainstorm solutions, and identify where students are struggling. But students’ use of AI is raising new ethical and instructional questions.

Many teachers say students commonly use generative AI platforms to cheat on assignments, and some critics warn of potential cognitive effects for students who rely on AI instead of developing their own skills.


Some states and school districts have adopted AI policies, but others have not—creating a patchwork approach that leaves room for misuse. Educators’ comfort with AI varies widely. However, as Tanisca Wilson, a member of the National Council of Teachers of English, said, “AI is our friend and not our enemy.”

“We need to reimagine what we can do with such a powerful tool,” she said.

To do that, Wilson said Ķvlog must shift their mindset from seeing AI as a tool that suppresses critical thinking to viewing students as contributors who can shape how AI is used.

She added that for students to develop their creative voice, it’s important for Ķvlog to demonstrate both the benefits of generative AI—such as revising a paragraph—and its limitations.

“You can’t be equally proud of something that a computer has generated,” Wilson said.

As more generative AI technologies become available, some teachers employ AI-detection tools, whose effectiveness at accurately detecting plagiarism has been questioned. Wilson said some Ķvlog are shifting to more in-class assessments instead of traditional homework assignments because of the rise of generative AI.

In a May and June 2023 EdWeek Research Center survey about how math instruction should—if at all—change to address the existence of AI platforms that can solve math problems for students, 43% of teachers, principals, and district leaders said students should solve problems in class using pencil and paper. Thirty-seven percent said students should explain their solutions orally, while 34% said students should be taught to incorporate AI into math assignments.

Classroom strategies in action

Ana Sepulveda, a 6th grade math and dual-language teacher in the Dallas Independent school district, said she allows her students to use generative AI platforms for specific assignments, such as translanguaging their curriculum. She added that although she assigns online learning modules, students are required to show their work in a journal, which counts as a project grade.

“That journal [is] the story of how much work they’re putting in, day in and day out,” Sepulveda said.

Lisa Apau, a high school science teacher in Massachusetts’ Worcester public schools, said she encourages her students to use AI platforms, but only in situations where they don’t understand the course material.

“I’m not going to fight that battle. What I want to do is ... encourage kids to use it with ethics,” she said.

To discourage students from leaning on generative AI inappropriately, Apau recommends that Ķvlog design assignments creatively. In her anatomy classes, for example, she asks students to draw parts of the body. Although the approach was not a direct response to the spread of AI platforms, Apau said it makes it harder for students to cheat.

Keeping students engaged while defining responsible use

At High Tech Los Angeles, a charter school in California, teachers use a project-based learning model to limit reliance on AI and assess students’ learning skills.

“If they get involved in a project, and are super passionate about it, and there’s a lot of buy-in, they don’t need to or want to use AI,” said Bianca Batti, an English teacher at the school. “A lot of it is incumbent on me, as the educator, to make sure I am engaging in the kind of praxis and curriculum development and pedagogy that speaks to students.”

Ellese Jaddou, a chemistry teacher at the school, has seen students use generative AI platforms as a “supplement to Google”—sometimes copying homework answers directly from ChatGPT. But she said a smaller portion of her students use it for project inspiration.

In Yakima, Wash., Beth Dallman, a chemistry teacher and International Baccalaureate coordinator at Davis High School, said AI is included in her school’s academic honesty policy, which allows for its use if it is cited as a source.

Before the debate over AI, Ķvlog wrestled with similar concerns about the effect of calculators on students’ math performance. Dallman believes that, like calculators, generative AI should be reserved for students age 13 and older, the minimum age requirement for some platforms, such as Google Gemini and ChatGPT. She said Ķvlog should teach students to use AI as a resource, like a search engine or calculator, because it is “going to be part of everyone’s life.”

Regardless of one’s personal stance, Colleen Molina, the principal at High Tech Los Angeles, said Ķvlog must teach students how to use AI responsibly.

“I always tell myself, no matter what, we need to educate our students on it. We as a staff need to educate them on the risks of using AI, the [benefits] of using AI, the moral background of using AI, and what you’re putting out there,” she said.
