AI’s Risks in Education: Four Takeaways from a New Report

A new report from the Brookings Institution concludes that AI poses risks to students’ educational and social-emotional development, as well as to teacher-student trust, according to an article in The 74.

Researchers interviewed K-12 students, parents and teachers in 50 countries and concluded that AI undermines young people’s foundational development in a way that can’t be offset by its productivity advantages.

“The risks we found are things like shortcutting learning so that you have less cognitive development,” says Rebecca Winthrop, who heads Brookings’ Center for Universal Education and is an author of the report.

Researchers found that young people who spend a lot of time with AI companions are “de-skilling” when it comes to basic human interactions, she says.

Researchers acknowledge that AI’s rapid evolution puts educators in a bind. They’re operating with little rigorous, longitudinal evidence on how AI affects student learning and well-being. As a result, they say, “None of us, not even AI’s creators, can predict its potential dangers or benefits with complete accuracy.”

Four key findings from the report:

1) AI poses risks that undermine foundational development

Researchers note that AI at its core is a set of powerful productivity tools being harnessed most effectively by “professional adults with fully matured brains. They have already developed sophisticated metacognitive and critical thinking skills that undergird their approach to their work.” They also have deep expertise in their fields and the cognitive flexibility that comes with that expertise, allowing them to use AI as a “cognitive partner.” 

Young people aren’t “mini-professionals.” Their brains are still developing, and school should help them practice critical thinking and “sustained engagement with challenging material.”

For most young people, AI isn’t a “cognitive partner” but a surrogate. Rather than accelerating their development, it diminishes it through cognitive offloading. The result: declining skills across the board.

A teacher tells researchers, “If students can just replace their actual learning and their ability to communicate what they know with something that’s produced outside of them and get credit for it, what purpose do they have to actually learn?”

A student puts it a bit more bluntly: “It’s easy. You don’t need to (use) your brain.”

2) Social and emotional development can be impeded

Using AI can undermine young people’s ability to form relationships, recover from setbacks and stay mentally healthy, observers tell researchers.

Young people use AI chatbots for everything from homework to emotional support, therapy and companionship. This has adults worried, researchers report. Nearly one in five teachers worry about AI’s influence on student well-being, even though just 7% of students mentioned emotional harm from chatbots.

It’s possible kids simply aren’t experiencing emotional dependence; it’s equally possible they lack “the self-reflective capacity” to recognize unhealthy emotional dependence and how it affects their well-being.

3) Trust between students and teachers is diminishing

Teachers tell researchers they increasingly doubt that students are producing authentic work, while students harbor the same doubts about their teachers.

Teachers trust students less when they suspect them of using AI to complete homework. In interviews, 16% of teachers said this erosion of trust is “a significant concern.” 

Students also trust teachers less when teachers use AI to create lesson plans and assignments but aren’t open about it.

This mutual distrust could undermine students’ trust in educational institutions. “One of AI’s greatest casualties may be the trust that ensures young people have what they need in school to meet their needs and prepare them for the future,” researchers write.

4) It’s not too late to reduce AI risks

Researchers say that although AI is doing damage, the wounds are “fixable,” and adults “should neither capitulate to these harms nor focus solely on limiting their repercussions.”

The report offers 12 recommendations, including:

  • Shift education away from “transactional task completion” that AI can most easily help students with. 
  • Co-create AI tools with educators, students, parents and communities. Schools can create “student AI councils” to embed student voice into AI tool design “to ensure their relevance, inclusivity, and pedagogical soundness” before adoption. 
  • Use AI tools that “teach, not tell.” Winthrop suggests using AI to interface with a difficult digital text. “I’ve read this paragraph twice,” she says. “I don’t get it. Can you explain it to me in a different way?” Used in such a fashion, with vetted content, she says, “it can be really effective.” 
  • Offer AI literacy that helps students, educators, and families understand its capabilities, limitations and broader implications. Robust professional development is needed to equip teachers with deep knowledge to teach students about AI.

 

The 74
