
Artificial Intelligence Opinion

What Makes Students (and the Rest of Us) Fall for AI Misinformation?

Studies show that students can become more savvy at evaluating online information
By Sam Wineburg & Nadav Ziv — October 25, 2024 4 min read

Four years ago, during the 2020 election, we argued in the Los Angeles Times that young people were struggling to spot disinformation because of outdated lessons on navigating the internet. Today, Ķvlog risk making the same mistakes with artificial intelligence. With the election at our doorstep, the stakes couldn’t be higher.

Research by our team (formerly the Stanford History Education Group) showed that young people are easily deceived because they judge online content by how it looks and sounds. That’s an even bigger problem with AI, which makes information feel persuasive even when it fabricates content and ignores context. Educators must show students the limits of AI and teach them the basic skills of internet search so they can fact-check what they see.

When it comes to AI, leaders preach “great excitement and appropriate caution,” as Washington state Superintendent Chris Reykdal put it in a recent teachers’ guide. He writes of a “full embrace of AI” that will put that state’s public education system “at the forefront of innovation.” New York City schools’ former chancellor, David C. Banks, who stepped down amid a federal investigation, said in September that AI can change schools for the better. The “appropriate caution,” however, remains a misty disclaimer.

Washington state’s guidelines, like those issued by several other states, rightly warn that AI may be biased and inaccurate. Washington state stresses that students shouldn’t automatically trust the responses of large language models and should “critically evaluate” responses for bias. But this is like urging students in driver’s education to be cautious without teaching them that they need to signal and check blind spots before passing the car ahead of them.

This pattern repeats the mistakes we saw with instruction on spotting unreliable information online: Ķvlog wrongly assuming that students can recognize danger and locate content that’s reliable.

Massachusetts Institute of Technology professor Hal Abelson advises students that if they come across “something that sounds fishy,” they should say, “Well, maybe it’s not true.” But students are in school precisely because they don’t know a lot. They are in the worst position to know if something sounds fishy.

Imagine a history student consulting an AI chatbot to probe the Battle of Lexington, as one of us recently tested. The large language model says this conflagration, which launched the American Revolution, was initiated “by an unknown British soldier.” In truth, no one actually knows who fired first. The chatbot also reports that “two or three” British soldiers were killed during the skirmish. Wrong again. None was. Unless you’re a history buff, this information doesn’t sound “fishy.”

A second danger is that AI mimics the tone and cadence of human speech, tapping into an aesthetic of authority. Presenting information with confidence is a trap, but an effective one: Our 2021 study of 3,446 high school students reveals the extraordinary trust students place in information based on a website’s superficial features.

When students conflate style with substance and lack background knowledge, the last thing they should do is try to figure out if something “sounds fishy.” Instead, the detection of unreliable information and responsible use of AI rests on internet search skills that enable them to fact-check.

Here’s the good news: Studies by our team and others show that students can become more savvy at evaluating online information. Without delay, Ķvlog should focus on AI literacy that emphasizes why content can’t be judged just by looking at it, along with search literacy that gives students the tools to verify information.

On the AI literacy front, Ķvlog need to help students understand that large language models can generate misleading information that looks good and pull scientific citations out of thin air. Next, they should explain to students how the chatbots work and how their training data are liable to perpetuate bias. When Purdue University researchers showed people how large language models struggled to recognize the faces of brown and Black people, participants not only grasped this point, they also became more skeptical of other AI responses.

Second, teachers need to make sure their students possess basic online search skills. Expert fact-checkers don’t rely on how something “looks.” Students, likewise, need to leave an unfamiliar website and see what other sources say about it. The same advice applies to AI: Students need to go beyond the seemingly credible tone of a chatbot and seek context by searching the broader web.

Once there, they should take advantage of, yes, Wikipedia, which is a remarkably accurate resource with safeguards to weed out errors. Having students compare AI responses to Wikipedia entries highlights the difference between artificial and human intelligence. Whereas AI issues a murky smoothie of ambiguously sourced information, Wikipedia requires that claims be anchored to verifiable sources. The site’s “Talk” page provides a record of debates by real people—not algorithms—over the evidence that supports a claim.

Our studies have shown the danger of taking information at face value. This threat only increases as AI churns out flawed content with encyclopedic authority. And yet, some Ķvlog are telling students to vibe-check AI-produced information, or to evaluate it without first making sure they know how.

Let’s pair genuine caution about AI with proven search strategies so that students can avoid falling for misinformation and locate trustworthy sources online.

Resources for Teaching Search Literacy

  • (book)
  • (website)
  • from teachers and students in Civic Online Reasoning classrooms (video, 3 minutes)
  • what a lesson looks like in real classrooms (video, 4 minutes)
  • Take advantage of (website, sign up)
