
Artificial Intelligence

Another AI Issue for Schools to Know About: Bias Against Non-Native English Speakers

By Ileana Najarro — August 10, 2023 4 min read

As Ķvlog continue to explore what role artificial intelligence tools should or could play in the classroom, some researchers are cautioning teachers that AI detectors are biased against non-native English speakers.

In a recent article, Stanford University researchers set out to evaluate how accurate detectors of GPT, or generative pre-trained transformer, text are at determining whether writing by non-native English speakers was AI-generated or written by a human.

To test this, the researchers ran essays written by Chinese students for the Test of English as a Foreign Language, or TOEFL, through seven widely used detectors. They did the same with a sample of essays written by U.S. 8th graders who were native English speakers. The tools incorrectly labeled more than half of the TOEFL essays as AI-generated, while accurately classifying the 8th grade essays.

“We found this substantial bias whereby many of [non-native English speakers’] writings are mistakenly flagged as generated by GPT, when they were really written by humans,” said James Zou, one of the co-authors of the article and an assistant professor of biomedical data science at Stanford University.

GPT detectors measure something called the perplexity of a text, Zou said. Low-perplexity text uses more common, generic words that aren't very surprising, and such text is more likely to be flagged as AI-generated even when it isn't. The researchers analyzed the perplexity of the students' writing samples for this reason and found that text with low perplexity was indeed more likely to be flagged as AI-generated.

Students in the U.S. who are English learners may be more likely to use more common words in their writing as they work to expand their vocabulary, making them more likely to be erroneously flagged as having used AI, Zou added.
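The perplexity heuristic the researchers describe can be illustrated with a minimal sketch. Everything here is hypothetical for illustration: a toy unigram word-probability table stands in for a real language model, and the threshold is arbitrary. Perplexity is the exponential of the average negative log-probability per word, so text built from common words scores low and gets flagged.

```python
import math

# Hypothetical unigram "language model": word -> probability.
# A real detector would use a full neural language model instead.
UNIGRAM_PROBS = {
    "the": 0.06, "a": 0.04, "is": 0.03, "good": 0.02, "very": 0.02,
    "important": 0.01, "serendipity": 0.0001,
}
UNKNOWN_PROB = 0.0005  # fallback probability for out-of-vocabulary words

def perplexity(text: str) -> float:
    """Perplexity = exp(mean negative log-probability per word)."""
    words = text.lower().split()
    nll = sum(-math.log(UNIGRAM_PROBS.get(w, UNKNOWN_PROB)) for w in words)
    return math.exp(nll / len(words))

def flag_as_ai(text: str, threshold: float = 60.0) -> bool:
    # Low perplexity (common, predictable wording) => flagged as AI-generated.
    return perplexity(text) < threshold
```

Under this sketch, a sentence made entirely of high-frequency words scores low perplexity and gets flagged, while one containing rarer vocabulary does not. This is exactly the failure mode the researchers identify: writers with a smaller working vocabulary lean on common words, so their genuinely human text looks "too predictable" to the detector.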

How to address bias in AI detectors

Bias in AI tools isn’t a new phenomenon, and GPT detectors have never been 100 percent foolproof, especially as AI technology continues to advance, said Christopher Doss, a policy researcher at the RAND Corporation.

“AI is trained on data. Societal biases are baked into data,” Doss said.

For Doss and others, the key takeaway is that teachers must be cautious when relying on GPT detectors to determine whether a student cheated with AI assistance. More important, Ķvlog need to think about different ways to use AI tools in the classroom.

“In the beginning of the class, how do you figure out how to use ChatGPT to help learning and teach children how to use these tools for good?” Doss said. “But then also, how do you make sure that your children don’t use it as a crutch?”

Peter Gault, the executive director and co-founder of Quill, a nonprofit that provides open-source literacy materials to teachers, said that the group's AI detector, AI Writing Check, was among the tools examined by the Stanford researchers. That tool is no longer available to teachers as of this week.

“When we launched this tool in January 2023, the only Generative AI tool available was ChatGPT. There are now a series of different tools available, and each of these tools is being upgraded weekly. As these tools make their AI more complex, the AI text output becomes more varied, and it becomes more difficult for algorithms to detect whether a piece of writing was generated by AI,” read a statement on the AI Writing Check website.

As a more reliable way to check whether students used AI in their work, one that avoids the risk of biases in AI tools, Gault recommends teachers review students' version histories. In other words, teachers can go into Google Docs or Microsoft Word and see the edits and revisions students made as they wrote. Those iterations let teachers tell whether a student used AI assistance, and they also reveal the student's writing process, helping teachers figure out how best to grow each student's writing skills.

Looking ahead, Gault said advancements in technology could lead to developing AI assistants to help students as they’re writing and check their work as they go rather than wait until they complete an assignment.

Zou, the Stanford researcher, also recommends this more proactive approach to checking students’ writing in progress as opposed to relying on an evaluation tool at the end.

With English learners in particular, he added, AI tools have the potential to help track students’ grammatical mistakes for more personalized assistance and could even aid students in need of translation services.

The biases against English learners in general

English learners are one of the fastest growing student demographics in the United States.

When it comes to working specifically with English learners, Ķvlog must be cognizant of broader biases these students face in K-12 schools, said Xilonin Cruz-Gonzalez, the deputy director of Californians Together, a research and advocacy organization for English learners and their families.

For instance, while there are clear federal guidelines and even some state guidelines on the rights English learners have in schools—whether it’s access to translation services or the ability to enroll in schools regardless of immigration status—some school districts at the local level may not always understand their legal obligations, Cruz-Gonzalez said.

It’s partly why in June the U.S. departments of Justice and Education published fact sheets reminding Ķvlog of immigrant students’ legal rights in K-12 schools.

Cruz-Gonzalez said English learners often face unconscious biases among Ķvlog when it comes to the linguistic and academic assets they bring to the classroom.

As new technologies emerge, she and other advocates hope developers address biases in AI tools and create opportunities for English learners to creatively and positively engage with AI technology.


A version of this article appeared in the August 23, 2023 edition of Education Week as Another AI Issue for Schools to Know About: Bias Against Non-Native English Speakers
