
Artificial Intelligence

Real-Time Data Shows Exactly How Students Use AI on School Technology

By Alyson Klein — March 09, 2026 4 min read

Roughly one in five student interactions with generative artificial intelligence on school technology involved cheating, self-harm, bullying, and other problematic behaviors, according to data collected and analyzed by Securly, a company offering internet filtering and other safety services.

What’s more, Securly identified roughly 1 in 50 student-AI interactions as red flags that students might be involved in violence, cyberbullying, or self-harm.

The analysis looked at nearly 1.2 million interactions in more than 1,300 districts from Dec. 1, 2025, to Feb. 20, 2026.

Educators should take heart that most of the time, students use AI appropriately, said Tammy Wincup, the CEO of Securly, whose competitors include GoGuardian and Lightspeed Systems.

“When a district actually sets some guardrails and policies around their AI usage in schools, 80% of the conversations happening are within the district’s policies,” Wincup said. “That’s the good news on the learning side of the house.”

Why the usage data is so ‘fascinating’

The analysis offers an early window into how students actually use generative AI tools. Most other research on student usage of AI comes from surveys, which rely on student self-reporting.

Securly’s data shows “what are students really doing when they’re writing text into generative AI,” said Jeremy Roschelle, the co-executive director of learning science research for Digital Promise, a nonprofit organization that works on equity and technology issues in schools.

“That’s why it’s fascinating,” he said.

In November, Securly allowed district officials to set parameters around students’ AI use, similar to the way they ask the company to filter out particular types of websites.

If districts opt to use this feature, large language models will “deflect” a student query that is out of bounds with district policy.

For instance, if a student tries to use AI to complete an assignment, large language models may instead point to information on the general topic but won’t supply an exact answer. Or if a student asks about dosing for a particular medication, the tool will tell them to ask a trusted adult for help.

Nearly all the deflected student queries—95%—were from students trying to get AI tools to complete their schoolwork for them.

That percentage didn’t surprise Wincup. She expects that when districts allow students to use large language models on school networks and devices, kids will “experiment with understanding the guardrails” placed around the tools and try to get around those guardrails.

Another 2% of the interactions identified as inappropriate related to games. A little less than 1% dealt with sexual content, and a similar percentage concerned firearms or hunting. Gambling, drugs, and hate (such as racism and antisemitism) each accounted for roughly 0.5% of flagged interactions.

Though only 2% of interactions were identified as potentially unsafe, that represents more than 24,000 queries overall. And some of the questions students asked AI were troubling.

For instance, one student directed a large language model to help draft an email to their mother explaining they had suicidal thoughts.

Another student conducted a quick series of internet searches on questions, including “What’s the main nerve in the forearm?” and “What nerve near the wrist carries blood?” Then the student switched to an AI tool, asking it how to commit suicide. (In both of these cases, the identity of the student was ‘unmasked’ by Securly and district officials were made aware of the safety issues.)

Students used ChatGPT more often than large language models created for K-12 schools

Overall, Securly detected a higher percentage of potentially unsafe AI interactions—2%—than potentially unsafe student internet searches, 0.4%.

It’s too early to pinpoint an exact explanation for that discrepancy, Wincup said. She noted that Securly has had many years to hone its system for recognizing when a student’s internet searches may be a sign of danger, while its work with AI interactions is brand new.

Roschelle, meanwhile, is curious about what, exactly, students asked AI in the 80% of interactions that were deemed appropriate for school.

How did their prompts and AI’s responses help—or hinder—their understanding of an assignment, an issue, or the world around them, he wondered.

“What we want to do is make sure [AI] is not just appropriate, but is actually valuable for student learning,” Roschelle said.

The analysis also revealed which large language models students use most often.

ChatGPT is by far the most popular, accounting for 42% of interactions. Securly’s AI Chat made up 28%. Google’s Gemini comprised 21%. And other ed-tech tools that embed AI features—including MagicSchool, SchoolAI, and BriskTeaching—comprised 9%. (That data isn’t nationally representative because only districts that use Securly have access to Securly AI. But Wincup believes “big tech” large language models are probably most popular in all districts.)

AI puts education technology leaders in a new position, Wincup said.

“They’re no longer just buying things and setting things up like this,” she said. This is a moment “where they have to have visibility in order to help their district make not just great tech decisions but make great teaching and learning decisions.”
