
Artificial Intelligence Q&A

AI Makes Stuff Up. So How Can Teachers Use It in Instruction?

By Arianna Prothero — May 13, 2025 3 min read
Photo of a young white boy viewing AI Chatbot responses on his mobile device.

Artificial intelligence is the future, and Ķvlog must incorporate the swiftly evolving technology into their instruction to both stay current and prepare their students for the jobs of the future. Or so say many experts in education and technology.

At the same time, experts caution that generative AI tools can be biased and “hallucinate” made-up answers to queries. Some recent research shows that the most sophisticated new versions of popular AI chatbots are hallucinating more than before, even as they get better at performing some tasks such as math problems.

It’s small wonder that many teachers remain doubtful about using the technology in their classrooms, especially for anything beyond composing emails to parents or creating grading guidelines.

So, then, how exactly are teachers supposed to responsibly incorporate this technology into instruction? Education Week put this question to Rachel Dzombak, a professor of design and innovation in Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.

This conversation has been edited for length and clarity.

Why does AI ‘hallucinate’?

Rachel Dzombak

I think of hallucinations just as, simply, when a tool generates an inaccurate response. Recently, I said [to an AI tool], “Pull direct quotes from this article,” and it looked like they were real quotes, but then I tried to find the quotes in the article, and they did not exist.
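Dzombak’s check here, searching the article for each claimed quote, can be automated. A minimal sketch in Python (the function name and the sample strings are illustrative, not from the interview):

```python
# Sketch: verify that quotes an AI tool claims to have "extracted"
# actually appear verbatim in the source article.

def verify_quotes(article_text: str, quotes: list[str]) -> dict[str, bool]:
    """Return, for each claimed quote, whether it appears word for word."""
    def normalize(s: str) -> str:
        # Collapse whitespace and straighten curly quotes so minor
        # formatting differences don't cause false negatives.
        return " ".join(
            s.replace("\u201c", '"').replace("\u201d", '"').split()
        )

    haystack = normalize(article_text)
    return {q: (normalize(q) in haystack) for q in quotes}

article = "AI systems are trying to find patterns across data."
claimed = [
    "find patterns across data",     # present verbatim in the article
    "give a single correct answer",  # plausible-sounding but fabricated
]
print(verify_quotes(article, claimed))
```

A check like this only catches quotes that are not verbatim; it cannot tell whether an accurate quote is being used misleadingly, which still requires a human reader.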

There are other types of systems, software systems, that are designed to give the one right answer. That’s not how AI systems are set up. AI systems are complex systems. They’re not trained to give a single right answer. AI systems are trying to find patterns across data, and what’s going to come out of it is what we call emergent effects. I think of it as 2+2=5: that there’s always going to be these unintended aspects. It’s what makes AI and large language models, specifically, really great at helping us brainstorm and do creativity tasks. But the flip side of that is it gets things wrong.

The flip side of the hallucination is this massive capacity to pull information together in new ways that was previously not possible with traditional software systems. There are just trade-offs.

What I can say is that the systems are going to continue to evolve in good and bad ways over time. So, we could see, because it is a maturing tool, these swings where [AI] gets worse at some things and it gets better at some things.

How does this then affect teachers who are using AI to teach?

They need to have a continued sense of curiosity of, what are these tools? What are they today, how are they changing? There are no hard and fast rules.

A big challenge people have today in the education space is not thinking about how to use [AI tools] in a way that really fits with their intention. [Use them] where you have more wiggle room where you’re not looking for that one exact answer, where variability and randomness is a good thing, a feature rather than a bug of the system. I think about this in my own classes: How do I encourage students to use generative AI tools in places where it makes sense, where it encourages them to be creative?

When should students use generative AI?


I see Ķvlog saying, “You’re not allowed to use a large language model to write a term paper in my class.” I personally think that students are going to find ways around that anyway. I think [teachers] should say, “If you’re going to, here are ways that you should think about using [AI], and here are the trade-offs.” In many ways, it’s forcing creativity on the part of Ķvlog to rethink, how are we really hitting the learning outcomes that we want to hit?

Education has needed change for a long time. People are at times blaming generative AI tools for shifting education in the current moment: It’s enabling cheating, it’s enabling all of these things. But maybe it’s just shining a spotlight on behaviors that were already there. I’m an engineer by training, and a colleague of mine did a study of engineering undergraduates, and the average engineering undergrad does 3,000 problem sets where there’s one right answer. But if you look at the skills that are needed most in the workforce, it’s comfort with ambiguity and creativity and complex problem-solving.

That’s not what we’ve been assigning students.

