For districts hiring and teachers job-hunting this spring, chances are artificial intelligence is helping shape the path from application to classroom.
Even more likely, teachers don’t know it.
That’s a problem as the technology becomes ubiquitous in districts. Advocates say the systems can cut hiring time and improve teachers’ fit in schools, but they may still pose privacy risks and introduce new forms of bias into the hiring process for job-seekers.
Fifty-three percent of district recruiters now use AI tools, according to a nationally representative survey of 270 recruiters fielded this fall by the EdWeek Research Center.
But only 2% of teachers in a connected survey of more than 700 job-seekers told EdWeek that they have applied in the last 12 months to a district that uses AI in hiring. Taken together, the data points suggest teachers may not realize the technology is being used or have a clear understanding of how its underlying algorithms work.
The National Education Association and the American Federation of Teachers, the two national teachers’ unions, both have guidance on how AI should be used in education, but those resources don’t specifically cover how the technology shapes teachers’ entry into the workplace. Separately, the AFT has created a national academy to train teachers on other aspects of AI in education.
Some recently negotiated teachers’ contracts include language seeking to limit the ways AI could be used to replace or evaluate teachers, but they don’t cover its use for hiring them in the first place.
“AI is a time-saving device, but it’s still an early technology, and there are some problems with it,” said Kathryn Cernera, president of the Ithaca teachers’ union, “which is where the human decisionmaking piece has to come in.”
Kyra Wilson, who studies the effects of AI on human decisionmaking at the University of Washington, said it’s more likely than not that the systems will simply reproduce the bias that humans already possess.
“AI [tools] are not a miracle solution for hiring,” she said.
What the landscape of AI-assisted teacher hiring looks like
There are no federal data on how many districts are using AI for hiring. Only a fraction of school districts have set policies on AI use, according to a recent RAND Corp. study, and virtually none of those policies cover how the technology can be used in hiring.
But signs point to education following the larger workforce trend of AI-based hiring. Across industries, nearly a third of recruiters now use the technology, according to research from the human resources firm Criteria.
Some of the most commonly used hiring software, including products from PowerSchool and HireVue, uses AI to match or rank candidates, among other tasks. HireVue’s chief science officer, Mike Hudy, said in a statement that the technology has “moved from an ‘interesting experiment’ to ‘essential infrastructure’” in school districts the company serves.
Filling deeper district needs
In the last seven years, Golf Middle School Principal David Norman has grappled with longer hiring windows and a shrinking pool of candidates. His 600-student district in Morton Grove, Ill., has no separate human resources department, leaving it up to Norman and other administrators to recruit, manage applications and interviews, and secure new teachers and staff.
“The hard-to-fill positions have become harder to fill,” Norman said. “The candidate pool is changing, and so the way we approach hiring has to change.”
This year, Norman developed a homegrown AI hiring agent based on Google’s Gemini model that he hopes can help find teachers and support staff who are a good fit for his tiny two-school district 15 miles north of Chicago.
Golf uses an AI agent—a program that can do complex tasks autonomously—to screen candidate resumes and help administrators prepare for interviews. Norman uploads applications stripped of identifiable information, and the hiring agent provides analyses of each candidate’s strengths and challenges as well as how closely they match district job postings.
The agent’s ability to limit bias in the hiring process has been “eye-opening,” Norman said.
The AI agent looks at the compatibility between a candidate’s redacted resume and skills prioritized in the job description, which “removes those internal questions of ‘should we?’ or ‘shouldn’t we bring them in?’ and allows us to quantify why we would bring this person in to interview,” Norman said.
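The district’s actual Gemini-based agent isn’t public, but the general idea it describes — redacting identifying details, then scoring how well a resume covers the skills a job posting prioritizes — can be sketched in a few lines. This is a simplified illustration with hypothetical helper names and sample text, not the Golf district’s system, and a real agent would use a language model rather than keyword overlap:

```python
import re

def redact(resume: str, identifiers: list[str]) -> str:
    """Strip names, alma maters, and other identifying strings
    before the resume is scored (a simplified stand-in for the
    de-identification step the district describes)."""
    for term in identifiers:
        resume = re.sub(re.escape(term), "[REDACTED]", resume, flags=re.IGNORECASE)
    return resume

def match_score(resume: str, required_skills: list[str]) -> float:
    """Fraction of the posting's prioritized skills that appear
    in the redacted resume text."""
    text = resume.lower()
    hits = [s for s in required_skills if s.lower() in text]
    return len(hits) / len(required_skills)

# Hypothetical sample data for illustration only.
resume = ("Jane Doe, Northwestern alum. Experienced in differentiation, "
          "science curriculum design, and co-teaching.")
clean = redact(resume, ["Jane Doe", "Northwestern"])
score = match_score(clean, ["differentiation", "curriculum design", "co-teaching"])
print(round(score, 2))  # prints 1.0 -- all three posted skills are present
```

The point of the sketch is the design choice Norman describes: because the score is computed against the redacted text, the comparison is anchored to the job description’s skills rather than to details like where a candidate went to school.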
“When I first got to administration, it was like ‘oh, you went [to my alma mater], I’m definitely going to interview you,’” he said. “Those are conversations that I’m no longer having because of this process, which just makes me a more objective hiring manager.”
Training and caution needed
However, studies suggest it can be difficult to eliminate AI privacy and bias risks fully in the hiring process.
AI tools don’t operate in a vacuum—they learn from existing data, which can carry the biases of past hiring decisions.
“An AI-enabled teacher hiring system might be assumed to be more objective than human-based résumé scoring,” warned the U.S. Education Department’s now-defunct office of educational technology in a report. “Yet, if the AI system relies on poor quality historical data, it might de-prioritize candidates who could bring both diversity and talent to a school’s teaching workforce.”
Several states, as well as cities like New York, have passed laws or regulations requiring employers—including school districts—to ensure equity and privacy protections in their AI-based tools.
In 2023, New York City mandated strict oversight of AI hiring tools, requiring all employers (including school districts) to conduct annual independent bias audits of the tools, post the results, and notify job-seekers at least 10 business days before using the tools, with fines of up to $1,500 for violations. The law has proved difficult to enforce, mostly because it can be hard to tell whether a particular piece of software uses AI, or how an underlying AI algorithm makes decisions about job candidates.
Similarly, the algorithms that undergird AI analyze such wide-ranging data that they may be able to identify candidates even when their applications have been stripped of common identifiers. An algorithm might categorize a job-seeker as an older woman of color by patterns of colleges or membership in professional groups, gaps in job history associated with caregiving, and graduation or certification dates.
There have already been high-profile audits and lawsuits alleging that, among other things, AI systems undervalued job-seekers who attended historically Black or women’s colleges, or deaf applicants whose voices didn’t match typical speech patterns. Other lawsuits allege the tools violated applicants’ privacy by culling personal data from beyond the application without notice.
And while the Golf district and others leave the final decision to administrators, not algorithms, about whom to interview, emerging research suggests people may be more influenced by AI recommendations than they realize.
In their research, Wilson and her colleagues at the University of Washington found that when AI hiring tools favor a specific group, human evaluators follow that preference 90% of the time—even when the people considered the AI’s recommendations irrelevant or low-quality.
“Right now AI is still in its infancy in workplace integration, so a lot of people don’t have the knowledge and don’t necessarily expect to see these [algorithmic] biases,” Wilson said. District hiring staff need more training on how to evaluate and use AI recommendations, she said.
How to inform human decisions without scaring people off
Teach Away, a for-profit international teacher-recruitment firm that places teachers across 100 countries including the United States, launched its own AI-based system six months ago. AI has helped speed up teacher-school matching on the platform and allows administrators to review information that may get missed on a typical application, such as teachers’ experience with particular curricula or student groups, according to CEO David Frey.
But Frey said conspicuously automated hiring systems, like the customer-service chatbots most people already sigh over, can also “scare off” teachers if they make the hiring process seem cold or disconnected from education.
In a survey of international job-hunting teachers on the Teach Away platform, 42% said they had spoken to an automated recruiter or AI chatbot as part of the hiring process, and 30% reported withdrawing from a job application because they found the process “too impersonal.”
That hasn’t been an issue yet for the Golf district’s hiring agent. Norman, the principal, said it has helped his staff have deeper and more substantive conversations with the teachers they look to hire.
Candidates for a science teacher post, for example, might focus heavily on their experience coaching science instruction or developing curriculum, but few specifically mention differentiation in their resumes or application forms. The hiring agent developed an interview question asking teachers to describe a specific science unit where they successfully differentiated instruction for English learners and students with disabilities.
“AI will not replace the human element in this,” Norman said, “but I think people have to recognize the potential of AI to analyze data and give us ideas.”
And no AI system can make up for the power of recruiters helping potential teachers understand the culture of the school, the types of students, and the challenges new teachers there will face.
“Teachers often want to know they are going to a strong community,” said Frey, “and that’s harder for teachers to figure out if they’re just talking to chatbots.”