
Opinion Blog


Rick Hess Straight Up

Education policy maven Rick Hess offers straight talk on matters of policy, politics, research, and reform.

Artificial Intelligence Opinion

Schools Are Urged to Embrace AI—and Ban Phones. Can We Resolve the Tension?

The role ed tech should play in the classroom is full of contradictions
By Rick Hess — April 14, 2026 8 min read

Educators have a lot of questions about AI. Well, when I want practical insight on ed tech, I frequently turn to the ever-thoughtful Michael Horn—lecturer at the Harvard Graduate School of Education, co-founder of the Christensen Institute, co-host of the Future U podcast, and author of many books, including Blended: Using Disruptive Innovation to Improve Schools. As we navigate big talk and dire warnings about what AI means for schools, I reached out to Michael for a reality check. Here’s what he had to say.

Rick: Michael, evangelists are touting AI’s transformative promise in schooling. At the exact same time, schools are being urged to ban phones, limit social media, and reduce screen time. These strike me as contradictory takes on the role tech should play in classrooms—one fueled by optimism, the other by a growing backlash. You’ve thought about these tensions more deeply than almost anyone. How should we make sense of them? Is this situation less contradictory than it appears, or are the schools racing to embrace AI simply repeating the mistakes of the recent past?

Michael: I agree that this seems like quite the paradox. But here’s what can explain the disconnect. In my first book with Clay Christensen and Curtis Johnson, Disrupting Class, back in 2008, we argued that simply layering ed tech onto the existing classroom model would have little to no transformational impact. We agreed with Larry Cuban’s hypothesis about computers in classrooms. What we didn’t anticipate was how ed tech in traditional classrooms could also accelerate the downsides of existing schooling models—incoherence, distractions, bad pedagogy, lack of focus and effort, and the like. It seems clear that in certain schools, that’s been the impact—particularly for schools that adopted ed tech without a design process on the front end and without paying attention to the coherence of their choices.

To your point, the conversation around reducing ed tech seems to be happening in one silo, and new “native AI” school models are happening in another. There’s a certain irony there. But if educators build a clear and coherent model in which the technology is optimized for the students and teachers, and those choices have been well thought out and tested, then we will likely get a very different reaction. A lot of families in these models—like Alpha schools—swear by them. And it’s also clear that they aren’t asking for their kids to be on screens all the time, sucked into a vortex of mindless clickbait. They want adults in the room with clear guidelines, purpose to the tech, and sensible restrictions. But they want to make those choices on the ground, not from on high as blanket bans.

We should be very wary about promoting blanket bans. Not only is it out of tune with the reality that some parents are intentionally opting for schools that leverage AI, but it will also limit the use in cases where ed tech makes sense in traditional schooling models.

Rick: You mentioned Alpha School. We’ve seen strong claims about its success and the promise of agentic tutors and other AI-enabled advances. The allure of such approaches is obvious: personalized instruction, additional practice, real-time feedback, and more time for discussion and mentoring. Of course, similar claims have been made for previous tech-infused school models over the past two decades. As the guy who penned the book Blended, do you think today’s efforts are more likely to deliver? What, if anything, do large language models and agentic AI change about this equation?

Michael: The learning model will always matter more than the technologies being used. Education technology in service of a model that doesn’t prioritize rigor—and allows students to skate by an incoherent curriculum with minimal effort and understanding—will never magically produce great outcomes. It will just accentuate what the existing school and classroom models are prioritizing.

AI increases the variability—the highs and the lows—of ed tech. The upside possibilities are big—your colleague John Bailey has pointed to some really promising randomized controlled trials. But I have a feeling the flops may be big as well—from cognitive offloading to hallucinations, even less instructional coherence, distractions, unhealthy AI companionship, and more.

So, will AI deliver all the things you referenced? Perhaps, but only when school models that prioritize mastery and coherence use AI in ways that serve those goals. It’s interesting to note that we have seen a lot of great outcomes where ed tech is used in schools that have very different underlying models. For example, just look at the results from the big portfolio of schools that Silicon Schools Fund has supported in California over the years. Many were blended and used relatively primitive tools compared to what AI can enable. How much better can those be with AI-powered tools now?

AI-powered ed tech can theoretically do a lot of the things you just mentioned much better than the previous versions of ed tech. As John Danner—who founded Rocketship—recently observed, AI-enabled ed tech represents a big change from multiple-choice, “point-and-click” tools to highly conversational ones. These are much better at diagnosing misunderstandings, asking more targeted questions in the moment, and providing actual instruction. AI-powered tools are significantly richer, in other words, and promise to be far more personalized.

Another example: Dacia Toll, who founded Achievement First and is no stranger to rigor in schools, is using AI to build a curriculum-aligned teaching assistant for English/language arts. Most importantly, she’s doing so with clear instructional design, guardrails, clarity around the importance of knowledge, and human teachers setting the rules around what good instruction, good questions, and actual student misconceptions look like. I think we’d be foolish to dismiss the opportunities that someone like Dacia sees in these tools.

Rick: Even those dubious about student-facing AI often say that AI might help by allowing teachers to automate mundane tasks like grading, communications, report filing, and individualized education programs. The hope is this will give teachers more time and energy for coaching and mentoring. The problem is that we have a history of teachers lamenting that tech that made similar promises wound up feeling like an intrusive burden. Will AI be different? And, if it is, how concerned should we be that teachers who rely on AI to send emails, grade essays, or craft lesson plans may find themselves less connected to students, parents, or their own work?

Michael: There are at least two parts to this.

First, we should pay attention to the response from teachers. Policymakers and school leaders should be wary of pushing a new tool on teachers if it’s met with resistance. That can be a signal that the tool feels like “one more thing,” rather than something that’s actually helping them get a job accomplished.

That said, one thing we’ve seen is that because existing school models struggle with coherence, rigor, and setting clear expectations for every student, teachers may pull in tools that help them accomplish a task—but these tools don’t result in the impact we’d like to see for students. The average school district is now using a staggering number of ed-tech tools, and students are accessing, on average, 48 tools over the course of a year. That seems like a recipe for incoherence.

So, am I persuaded that certain tools that allow teachers to spend more time on coaching and instruction are a good thing? Yes, but with a big caveat: While these tools may be making teachers’ lives easier, I’m not always certain these tools are actually doing what we want them to accomplish for students.

Rick: Given all this, what’s your advice for school and system leaders who are excited about AI but leery after a quarter-century of ed-tech disappointment?

Michael: Do not lead with the technology. Do not implement AI just because “that’s where the world is moving.” For the first time, we’re actually in a position where those who say educators are “tech backward” have it wrong. Educators are leading right now in implementing AI in schools—and that may be a problem because they aren’t asking more fundamental questions about their learning models first.

Start with what you are trying to accomplish for students. What are your school’s goals? What are the nice-to-haves for students, and what are you not going to do? Then, with the priorities identified, design your model—what are the processes and resources you’re going to utilize? AI-powered tools may be some of those resources. Test them out before committing. My books Blended and From Reopen to Reinvent offer design processes to do this work—it’s hard work but worth it.

Schools today do a lot of what they call pilots—but they don’t have clear measurable results that will tell school leaders if it’s working. Moreover, they often don’t have plans for how to sunset things that aren’t working and scale with fidelity those that are. Build that muscle before just letting people implement AI.

Also, why not put some outcomes-based contracts in place with the providers you work with? Make that standard fare. That would help schools prune tools and budgets when things aren’t working. That would also help prune the market of ed-tech companies not actually bringing useful tools to bear.

In short, start at the end and design backward—as proponents of backward design have long advised.

This conversation has been edited for length and clarity.


The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
