In training teachers to use AI, we must not outsource the foundational work of teaching, writes Timothy Cook, M.Ed., in a Chalkbeat essay. Cook teaches third grade and researches AI’s impact on education.
Last year, all of my students chose a topic they wanted to explore and pursued a personal learning project about it. One student had discovered the relationship between lever arm length and projectile distance entirely through his own experiments, which involved mathematics, physics, history, and data visualization.
Soon a cluster of 8-year-olds were debating trajectory angles and comparing medieval siege engines to ancient Chinese catapults. They were learning because they wanted to know, not because they had to perform. This is what I dream of as a teacher.
But my heart sank when I recently read about the American Federation of Teachers’ new $23 million partnership with Microsoft, OpenAI, and Anthropic to train educators to “microwave” routine communications with artificial intelligence.
Yes, I use AI, but only for administrative work like drafting parent newsletters, organizing student data, and filling out required curriculum planning documents. It saves me hours on tasks that have nothing to do with teaching.
I fear the $23 million initiative isn’t about administrative efficiency. According to a press release, they’re training teachers to use AI for “instructional planning” and as a “thought partner” for teaching decisions.
This sounds more like outsourcing the foundational work of teaching.
Most teachers I talk to have similar concerns about AI: cheating and plagiarism, students outsourcing their thinking, and how to assess learning when they can’t tell whether students actually understand anything. But students have always found ways to avoid genuine thinking when we value products over process. I used SparkNotes. Others used Google. Now, students use ChatGPT.
The problem is not technology; it’s that we continue prioritizing finished products over messy learning processes. If education rewards predetermined answers over curiosity, students will find shortcuts.
Teachers need professional development that moves in the opposite direction — PD that helps them facilitate genuine inquiry and human connection; create classrooms where confusion is valued as a precursor to understanding; and develop students’ intrinsic motivation.
Children don’t need teachers who can generate lesson plans faster or give AI-generated feedback. They need educators who inspire questions, model intellectual courage, and create environments that emphasize curiosity and real-world problem-solving.
Combining computational tools with human wisdom, ethics, and creativity requires us to maintain the cognitive independence to guide AI systems rather than becoming dependent on them. We can’t microwave that. And we shouldn’t try.