Embedding the Human Element of Teaching


Since the November 2022 launch of OpenAI’s ChatGPT, an expanding cast of AI tutors and helpers has entered the learning landscape, according to The Hechinger Report. Most of these tools are chatbots that can generate quizzes, summarize key points in a complex reading, offer step-by-step graphing of algebraic equations, or provide feedback on the first draft of an essay, among other tasks.

Even the staunchest techno-optimists hesitate to say that teaching is best left to the bots. The debate is about the best mix: what are AI’s most effective roles in helping students learn, and what aspects of teaching should remain indelibly human no matter how powerful AI becomes?

Experts in technology and education believe AI’s best use is to augment and extend the reach of human teachers. For example, the goal of Merlyn Mind’s voice assistant is to make it easier for teachers to engage with students while also navigating apps and other digital teaching materials. No longer tethered to a computer, teachers can move around the classroom and interact with students, even the ones hoping to disappear in the back.

Others use AI to help train human tutors to have more productive student interactions, or to multiply the number of students a human instructor can engage with by delegating specific tasks that play to AI’s technological strengths. Experts envision a partnership in which AI is not called on to be a teacher but to supercharge the power of humans already doing the job.

Study after study shows the importance of student engagement for academic success. A strong connection between teachers and students is especially important when learners feel challenged or discouraged. AI has many strengths, but motivating students to keep working at something they aren’t interested in is not one of them.

To be sure, AI has become more engaging. One of the breakthroughs of generative AI powered by large language models is its ability to give unscripted, human-like responses to user prompts.

Many tutoring experts stress the importance of building a strong relationship between tutors and students to achieve significant learning boosts. “If a student is not motivated, or if they don’t see themselves as a math person, then they’re not going to have a deep conversation with an AI bot,” said Brent Milne, the vice president of product research and development at Saga Education, a nonprofit provider of in-person tutoring.

While Saga is looking into having AI deliver some feedback directly to tutors, it’s doing so cautiously, because, according to Milne, “having a human coach in the loop is really valuable to us.”

In addition to using AI to help train tutors, the Saga team wondered if they could offload certain tutor tasks to a machine without compromising the strong relationship between tutors and students. Tutoring sessions are typically a mix of teaching concepts and practicing them, according to Milne. A tutor might spend some time explaining the why and how of factoring algebraic equations, for example, and then guide a student through practice problems. But what if the tutor could delegate the latter task to AI, which excels at providing precisely targeted adaptive practice problems and hints?

The Saga team tested the idea in their algebra tutoring sessions during the 2023-24 school year. Students who were tutored daily in a group of two had about the same gains in math scores as students who were tutored in a group of four with assistance from ALEKS, an AI-powered learning software by McGraw Hill. In the group of four, two students worked directly with the tutor and two with the AI, switching each day. The AI assistance effectively doubled the reach of the tutor.

Earlier this year, OpenAI and the startup Hume AI separately launched “emotionally intelligent” AI that analyzes tone of voice and facial expressions to infer a user’s mood and respond with calibrated “empathy.” Still, even emotionally intelligent AI will likely fall short on the student engagement front, according to Brown University computer science professor Michael Littman.

No matter how human-like the conversation, he says, students understand at a fundamental level that AI doesn’t really care about them, what they have to say in their writing, or whether they pass or fail algebra. In turn, students will never really care about the bot and what it thinks. A June study in the journal Learning and Instruction found that AI can already provide decent feedback on student essays. What is not clear is whether student writers will put in care and effort — rather than offloading the task to a bot — if AI becomes the primary audience for their work.

“There’s incredible value in the human relationship component of learning,” Littman says, “and when you just take humans out of the equation, something is lost.”

The Hechinger Report
