Education Week

Opinion Blog


Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute offers straight talk on matters of policy, politics, research, and reform.


Schools Are Urged to Embrace AI—and Ban Phones. Can We Resolve the Tension?

The role ed tech should play in the classroom is full of contradictions
By Rick Hess — April 14, 2026 8 min read

Educators have a lot of questions about AI. Well, when I want practical insight on ed tech, I frequently turn to the ever-thoughtful Michael Horn—lecturer at the Harvard Graduate School of Education, co-founder of the Christensen Institute, co-host of the Future U podcast, and author of many books, including Blended: Using Disruptive Innovation to Improve Schools. As we navigate big talk and dire warnings about what AI means for schools, I reached out to Michael for a reality check. Here’s what he had to say.

Rick: Michael, evangelists are touting AI’s transformative promise in schooling. At the exact same time, schools are being urged to ban phones, limit social media, and reduce screen time. These strike me as contradictory takes on the role tech should play in classrooms—one fueled by optimism, the other by a growing backlash. You’ve thought about these tensions more deeply than almost anyone. How should we make sense of them? Is this situation less contradictory than it appears, or are the schools racing to embrace AI simply repeating the mistakes of the recent past?

Michael: I agree that this seems like quite the paradox. But here’s what can explain the disconnect. In my first book with Clay Christensen and Curtis Johnson, Disrupting Class, back in 2008, we argued that simply layering ed tech onto the existing classroom model would have little to no transformational impact. We agreed with the hypothesis about computers in classrooms that Larry Cuban laid out in his book Oversold and Underused. What we didn’t anticipate was how ed tech in traditional classrooms could also accelerate the downsides of existing schooling models—incoherence, distractions, bad pedagogy, lack of focus and effort, and the like. It seems clear that in certain schools, that’s been the impact—particularly for schools that adopted ed tech without a design process on the front end and without paying attention to the coherence of their choices.

To your point, the conversation around reducing ed tech seems to be happening in one silo, and new “native AI” school models are happening in another. There’s a certain irony there. But if Ķvlog build a clear and coherent model in which the technology is optimized for the students and teachers, and those choices have been well thought out and tested, then we will likely get a very different reaction. A lot of families in these models—like Alpha schools—swear by them. And it’s also clear that they aren’t asking for their kids to be on screens all the time, sucked into a vortex of mindless clickbait. They want adults in the room with clear guidelines, purpose to the tech, and sensible restrictions. But they want to make those choices on the ground, not from on high as blanket bans.

We should be very wary about promoting blanket bans. Not only is that out of tune with the reality that some parents are intentionally opting for schools that leverage AI, but it will also limit its use in cases where ed tech makes sense in traditional schooling models.

Rick: You mentioned Alpha School. We’ve seen strong claims about its success and the promise of agentic tutors and other AI-enabled advances. The allure of such approaches is obvious: personalized instruction, additional practice, real-time feedback, and more time for discussion and mentoring. Of course, similar claims have been made for previous tech-infused school models over the past two decades. As the guy who penned the book Blended, do you think today’s efforts are more likely to deliver? What, if anything, do large language models and agentic AI change about this equation?

Michael: The learning model will always matter more than the technologies being used. Education technology in service of a model that doesn’t prioritize rigor—and allows students to skate by an incoherent curriculum with minimal effort and understanding—will never magically produce great outcomes. It will just accentuate what the existing school and classroom models are prioritizing.

AI increases the variability—the highs and the lows—of ed tech. The upside possibilities are big—your colleague John Bailey has pointed to some really promising randomized controlled trials. But I have a feeling the flops may be big as well—from cognitive offloading to hallucinations, even less instructional coherence, distractions, unhealthy AI companionship, and more.

So, will AI deliver all the things you referenced? Perhaps, but only when school models that prioritize mastery and coherence use AI in ways that serve those goals. It’s interesting to note that we have seen a lot of great outcomes where ed tech is used in schools that have very different underlying models. For example, just look at the results from the big portfolio of schools that Silicon Schools Fund has supported in California over the years. Many were blended and used relatively primitive tools compared to what AI can enable. How much better can those be with AI-powered tools now?

AI-powered ed tech can theoretically do a lot of the things you just mentioned much better than the previous versions of ed tech. As John Danner, who founded Rocketship, recently observed, AI-enabled ed tech represents a big change from multiple-choice, “point-and-click” tools to highly conversational ones. These are much better at diagnosing misunderstandings, asking more targeted questions in the moment, and providing actual instruction. AI-powered tools are significantly richer, in other words, and promise to be far more personalized.

Another example: Dacia Toll, who founded Achievement First and is no stranger to rigor in schools, is using AI to build a curriculum-aligned teaching assistant for English/language arts. Most importantly, she’s doing so with clear instructional design, guardrails, clarity around the importance of knowledge, and human teachers setting the rules around what good instruction, good questions, and actual student misconceptions look like. I think we’d be foolish to dismiss the opportunities that someone like Dacia sees in these tools.

Rick: Even those dubious about student-facing AI often say that AI might help by allowing teachers to automate mundane tasks like grading, communications, report filing, and individualized education programs. The hope is that this will give teachers more time and energy for coaching and mentoring. The problem is that we have a long history of teachers lamenting that tech that made similar promises wound up feeling like an intrusive burden. Will AI be different? And, if it is, how concerned should we be that teachers who rely on AI to send emails, grade essays, or craft lesson plans may find themselves less connected to students, parents, or their own work?

Michael: There are at least two parts to this.

First, we should pay attention to the response from teachers. Policymakers and school leaders should be wary of pushing a new tool on teachers if it’s met with resistance. That can be a signal that the tool feels like “one more thing,” rather than something that’s actually helping them get a job accomplished.

That said, one thing we’ve seen is that because existing school models struggle with coherence, rigor, and setting clear expectations for every student, teachers may pull in tools that help them accomplish a task—but these tools don’t result in the impact we’d like to see for students. The average school district now uses thousands of distinct ed-tech tools, and students are accessing on average 48 tools over the course of a year. That seems like a recipe for incoherence.

So, am I persuaded that certain tools that allow teachers to spend more time on coaching and instruction are a good thing? Yes, but with a big caveat: While these tools may be making teachers’ lives easier, I’m not always certain these tools are actually doing what we want them to accomplish for students.

Rick: Given all this, what’s your advice for school and system leaders who are excited about AI but leery after a quarter-century of ed-tech disappointment?

Michael: Do not lead with the technology. Do not implement AI just because “that’s where the world is moving.” For the first time, we’re actually in a position where those who say Ķvlog are “tech backward” have it wrong. Educators are leading right now in implementing AI in schools—and that may be a problem because they aren’t asking more fundamental questions about their learning models first.

Start with what you are trying to accomplish for students. What are your school’s goals? What are the nice-to-haves for students, and what are you not going to do? Then, with the priorities identified, design your model—what are the processes and resources you’re going to utilize? AI-powered tools may be some of those resources. Test them out before committing. My books Blended and From Reopen to Reinvent offer design processes to do this work—it’s hard work but worth it.

Schools today do a lot of what they call pilots—but they don’t have clear measurable results that will tell school leaders if it’s working. Moreover, they often don’t have plans for how to sunset things that aren’t working and scale with fidelity those that are. Build that muscle before just letting people implement AI.

Also, why not put some outcomes-based contracts in place with the providers you work with? Make that standard fare. That would help schools prune tools and budgets when things aren’t working. That would also help prune the market of ed-tech companies not actually bringing useful tools to bear.

In short, start at the end and design backward—as proponents of backward design have long advised.

This conversation has been edited for length and clarity.


The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.

