Take the Human Element out of AI Teaching and “Something Is Lost”

As artificial intelligence tools proliferate and their capabilities keep improving, few observers believe education will remain AI-free, according to an article in The Hechinger Report. But few say teaching is best left to the expanding cast of AI tutors and chatbots, either. Today's debate is about which roles AI can most effectively play in helping students learn, and which aspects of teaching should remain definitively human.

Debate over AI's place in the classroom often centers on students using the technology to cheat, or on AI's tendency to hallucinate (make stuff up) in its eagerness to answer every query. Those problems can be mitigated; bots can be programmed to base responses on vetted curricular materials, among other safeguards. But how well can AI engage and motivate students?

There is agreement that teachers can quickly spot and address things like confusion and flagging interest in real time. Numerous tech and education experts believe AI is best used to augment and extend the reach of human teachers. This is a vision that takes different forms. For example, the goal of Merlyn Mind’s voice assistant is to make it easier for teachers to engage with students while also navigating apps and other digital teaching materials. Instead of being stationed by the computer, teachers can move around the class and interact with students.

Other aspects of this vision: 1) Using AI to help train human tutors to have more productive student interactions; and 2) multiplying the number of students a human instructor can engage with by delegating specific tasks to AI. The end goal is a partnership where AI is not called on to be a teacher but to supercharge the power of humans already doing the job.

A strong connection between teachers and students is especially important when learners feel challenged or discouraged. AI has many strengths, but experts say it is not very good at motivating students to persist at something they are not very interested in doing.

In trial runs, some students just ignored AI attempts to probe their understanding of a topic, and engagement with the bot dropped off dramatically. Despite AI’s knowledge and facility with natural language, students just weren’t interested in chatting with it.

Many tutoring experts stress the importance of building a strong relationship between tutors and students to achieve significant learning boosts. “If a student is not motivated, or if they don’t see themselves as a math person, then they’re not going to have a deep conversation with an AI bot,” says Brent Milne, the vice president of product research and development at Saga Education, a nonprofit provider of in-person tutoring.

Saga has been experimenting with AI feedback to help tutors better engage and motivate students. Working with researchers from the University of Memphis and the University of Colorado, the Saga team fed transcripts of their math tutoring sessions into an AI model trained to recognize when the tutor was prompting students to explain their reasoning, refine their answers or initiate a deeper discussion. The AI analyzed how often each tutor took these steps.  

Saga piloted this AI tool in 2023 and provided the feedback to their tutor coaches, who worked with four to eight tutors each. Tracking some 2,300 tutoring sessions over several weeks, they found that tutors whose coaches used the AI feedback filled their sessions with significantly more of these prompts to encourage student engagement.

Saga is moving cautiously in having AI deliver some feedback directly to tutors because, according to Milne, “having a human coach in the loop is really valuable to us.”

Experts expect AI's role in education to grow, and its interactions to seem ever more human. Earlier this year, OpenAI and the startup Hume AI separately launched "emotionally intelligent" AI that analyzes tone of voice and facial expressions to infer a user's mood and respond with calibrated "empathy." Still, even emotionally intelligent AI will likely fall short in terms of student engagement, according to Brown University computer science professor Michael Littman.

No matter how human-like the conversation, he says, students understand fundamentally that AI doesn't really care about them, about what they have to say in their writing, or about whether they pass or fail algebra. So students will never really care about the bot and what it thinks. A June study in the journal Learning and Instruction found that AI can already provide decent feedback on student essays. What remains unclear is whether student writers will invest care and effort in their work, rather than offloading the task to a bot, if AI becomes its primary audience.

“There’s incredible value in the human relationship component of learning,” Littman says, “and when you just take humans out of the equation, something is lost.”

The Hechinger Report
