As schools deal with the complexities of using AI in classrooms, here are four trends to look for in 2025, according to an article in K-12 Dive.
1) Districts and states will crank up AI guidance
It’s likely schools will need to rely more on national organizations and nonprofits for high-level guidance in 2025 — and for several years to follow, says Kris Hagel, chief information officer at Peninsula School District in Gig Harbor, Washington.
Over the past two years, the U.S. Department of Education’s Office of Educational Technology developed useful AI resources and guidance, says Pat Yongpradit, chief academic officer of Code.org and lead for TeachAI. But Yongpradit doesn’t expect similar federal assistance in the near future.
The Education Department’s AI resources “really set the tone for state education agencies,” Yongpradit says. “Regardless of what happens with the Ed Department, state education agencies are going to take it from there,” he says. Local districts will also take more control of guidance.
As of November, 24 states had released guidance for AI in education, according to TeachAI, a national coalition that aims to guide schools on safe and ethical AI use.
2) Special education students and English learners will get more customized AI tools
Special education teachers are increasingly expressing interest in AI tools, and Yongpradit hopes “more tailored experiences” will be on the horizon for this sector.
At Peninsula School District, leaders are exploring how to securely analyze students’ Individualized Education Program data through the district’s own AI enterprise system, says Hagel. The goal is to use AI to help improve IEPs by comparing students’ testing data to their IEP goals, he says.
There are ways to do this “safely and securely,” Hagel says. “I think people haven’t wrapped their heads around the underlying technology to understand.”
Hagel strongly advises against using free, publicly available AI tools like ChatGPT for special education needs. Districts could explore special education solutions with AI enterprise systems where “you have environments built-out or safe, where you know that the large language model is not saving that data, you know it’s not taking it anywhere, and nobody else is storing it,” Hagel says.
Robin Lake, director of the Center on Reinventing Public Education at Arizona State University, expects more AI tools to roll out quickly this year to support multilingual learners as well.
For English learners, Lake expects real-time translation tools to be more integrated in classrooms.
3) Reliance on AI detectors will continue to grow
More teachers will opt for AI detection tools in 2025 to spot text generated or paraphrased by AI, predicts Yongpradit. And he says more public pushback is likely against using this software to address cheating and plagiarism.
Yongpradit often dissuades teachers from using AI detectors. “Even if these tools were perfect — no false positives, no bias,” the detectors are designed for particular generative AI models, which often change and ultimately make the detectors less effective, he says.
It’s best to figure out why you’re teaching what you’re teaching, and why the kids would be cheating in the first place, Yongpradit says. “Is what you’re doing just basically in need of a change itself?”
Lake believes more teachers will go beyond detection tools. Teachers may seek live feedback from an AI coach listening in on their instruction, or more teachers might start using AI for targeted professional development.
Personalized instruction tools, such as AI tutors, also could see growing popularity among educators, Lake says.
4) Integrating AI will still be a struggle for some districts
A sizable number of school districts have yet to start implementing AI technology or continue to block its use altogether, Hagel says.
Yongpradit expects “huge swaths of the education community” still won’t do much with AI. That’s “simply because they have bigger fish to fry,” Yongpradit says.
Lake has heard both rural and urban districts say they don’t have the capacity, money and time to seriously invest in AI — even though the interest is there. But that’s where federal and state officials should step in to provide support and guidance, she says.
Yet few states are providing funding for schools looking to innovate with AI tools, she says.
Some districts continue to struggle with AI implementation because “there’s a lack of understanding fundamentally on how AI works” and they’re fearful of it, Hagel says. This challenge illustrates a need to rethink how to explain AI to school leaders.
“Something’s going to have to happen to get people to understand the underlying technology behind AI so that they can feel more comfortable with moving forward with it,” Hagel says.
Challenges such as lawsuits over schools' plagiarism policies or concerns about student data privacy protections can have a "chilling effect" on districts moving forward with the technology, Lake says. Schools shouldn't take unnecessary risks with AI, but they should feel comfortable experimenting with these tools in controlled, evidence-based settings to find solutions for students and teachers, she says.