
Artificial Intelligence

Brain Activity Is Lower for Writers Who Use AI. What That Means for Students

By Sarah Schwartz — June 26, 2025 | 7 min read
Illustration: a glowing blue hand holding a pen emerges from a laptop to write on digitized paper, set against a dark blue background.

It’s a common sentiment from English teachers facing an onslaught of AI-generated writing in their classrooms: When students rely on ChatGPT to write their essays, they aren’t thinking as deeply about the topic as they would if they did all the writing themselves.

Now, it’s also a finding backed by neuroscience research.

A new study from researchers at the Massachusetts Institute of Technology, Wellesley College, and the Massachusetts College of Art and Design found that giving writers free rein to use AI as much as they wanted led to some bleak outcomes.

Participants, mostly undergraduate and graduate students, who constructed essays with the assistance of ChatGPT exhibited less brain activity during the task than those who were asked to write on their own. The AI users were far less likely to recall what they had written and felt less ownership over their work. Independent evaluators who reviewed the essays found the AI-supported ones to be lacking in individuality and creativity.

But if participants wrote essays on their own first, and then used AI to write on the same topics, the results changed. This group of writers showed an increase in brain activity.

“What it could potentially tell us is that timing could be very important for when you integrate these tools,” said Nataliya Kosmyna, a research scientist at the MIT Media Lab and the lead author of the paper.

If writers spend time thinking about their topic and collecting their thoughts before turning to generative AI, it’s possible that they could benefit more from using the tool, she said. “Maybe now you can ask questions, go back and forth. You have your opinions on the topic, you can prompt in different directions.”

The paper is a preprint, meaning it hasn’t yet been peer reviewed and published in an academic journal. And the researchers only worked with a small sample of participants, all of whom were undergraduate students, graduate students, or university employees.

Still, the findings could offer important clues about when generative AI use might short-circuit the learning process—and when it might actually deepen students’ thinking.

“It’s the kind of study that we need to get a handle on what’s happening when you use ChatGPT,” said Steve Graham, a professor who studies writing instruction at Arizona State University.

Study found less brain activity in writers using ChatGPT—except in one scenario

In the study, researchers asked 54 participants to write three essays each, across three separate sessions, responding to prompts drawn from the SAT college-entrance exam. They sorted the participants into different groups.

In the first group, participants could use OpenAI’s GPT-4o however they wanted to assist in the writing process. In the second, participants could search the internet and use any website to inform their writing—except ChatGPT or any other large language model. Participants in the third group couldn’t use any research tools; the researchers called this group “brain-only.”

The researchers monitored participants’ brain connectivity during the tasks with electroencephalography, or EEG, which measures electrical activity in the brain. They also asked participants a series of questions after the tasks and had independent raters—two English teachers and one AI—score their essays.

EEG results demonstrated that the group of writers who didn’t rely on any outside support had the strongest, widest-ranging neural activity. The group that used ChatGPT had the least activity, and the group that searched the web fell somewhere in between.

The brains of the writers who were working without research aids lit up in areas related to coming up with creative ideas, integrating multiple pieces of information, and self-monitoring—a pattern that “underscored the high internal demand for content generation, planning, and revision in the absence of external aids,” the researchers wrote. Organizing thoughts, outlining, and revising are core to the writing process.

Participants in the ChatGPT group also had a harder time remembering what they wrote. When asked to quote from their own essay, 83% of the group couldn’t do so. Only 11% of the participants in the search engine group and the brain-only group failed at the same task.

Is it your own essay if AI helped you write it?

And while almost all the brain-only group said they felt that their essays were their own, the ChatGPT group wasn’t as sure. This group “presented a fragmented and conflicted sense of authorship,” the researchers wrote. “Some participants claimed full ownership, others explicitly denied it, and many assigned partial credit to themselves.”

In evaluations of the essays themselves, the two English teacher judges wrote that AI-crafted writing stood out, due to “a close to perfect use of language and structure while simultaneously failing to give personal insights or clear statements.” The AI judge, by comparison, couldn’t tell the difference between essays written with and without AI’s help.

After participants had written the three essays, the researchers asked everyone back for a fourth session—with one twist. Those assigned to the brain-only group had access to ChatGPT, while those who initially could have used ChatGPT were asked to write unaided. Eighteen of the original participants completed this fourth session, in which they picked one of the three essay topics they had already written on, so they were familiar with the subject matter.

When the ChatGPT group switched to writing without support, they didn’t reach the same level of brain activity as the group that never had access to AI.

But when the brain-only group switched to using ChatGPT, they saw an increase in brain connectivity.

“From an educational standpoint, these results suggest that strategic timing of AI tool introduction following initial self-driven effort may enhance engagement and neural integration,” the researchers wrote.

Students need time to develop writing skills and form ideas without AI, Ķvlog say

For Brett Vogelsinger, a high school English teacher in Bucks County, Pa., the findings offer a warning: “Outsourcing too early is the biggest risk here,” he said.

Vogelsinger is the author of Artful AI in Writing Instruction and has incorporated the tool into some of his lessons. Still, he said, “you need to have some writing time that’s you, and the paper, and your thoughts.”

This could mean incorporating AI only for revision, said Kristina Peterson, an English teacher at Exeter High School in Exeter, N.H., and the co-author of AI in the Writing Workshop with her colleague, English teacher Dennis Magliozzi.

If students are using AI as a writing partner, Magliozzi said, “you have to show up to the table having written first.”

Beyond giving students the space to develop original ideas, the study also underscores the importance of teachers helping students learn writing skills and an understanding of the writing process, said Graham, the ASU professor.

Even if students come up with their own ideas, they lose out on exercising important writing muscles if they ask ChatGPT to arrange those ideas into an essay for them, he said. “We’re not going to learn a lot about sentence-construction skills,” Graham said, as an example. Writing tends to use more sophisticated syntax and vocabulary than oral language—syntax that students need to practice.

Targeted writing instruction also helps students learn how to structure their work in a logical, clear way.

Lacking these skills hampers students’ ability to write strong essays—and it also makes it harder for them to determine whether AI-generated content is high quality. If students are using ChatGPT as a writing aid, Graham said, “we need to make sure to have the skills we need to determine, did it meet our intentions?”

It’s not clear just how much writing expertise students do need before they can make good judgments about the quality of an AI’s output, said Kosmyna, the study’s lead author. She stressed that this study’s sample was quite small: 54 participants completed the first three sessions, and only 18 came back for the fourth.

For now, it’s important to carefully consider how teachers say AI could be best used—and to heed their warnings, she said.

“We do need to support them, and listen to them, and not just push randomly all the tools under the sun on them without understanding.”
