AI Guidelines for K-12 Aim to Bring Order to the ‘Wild West’


How much work would it take to develop guidelines to help educators manage the challenges of using generative AI tools for their work? In Michigan, it was a team effort, according to an article in EdSurge.

A coalition of 14 education organizations, helmed by the nonprofit Michigan Virtual Learning Research Institute, recently released sample guidelines that walk teachers and administrators through potential pitfalls to consider before using an AI tool in the classroom or for other tasks. This includes checking the accuracy of AI-generated content, citing AI-generated content, and judging which types of data are safe to enter into an AI program.

Ken Dirkin, senior director of the Michigan Virtual Learning Research Institute, says the group wanted to create a document that was digestible, but “there’s probably 40,000 important things that could have been included.”

Dirkin says the group wanted the document to help school districts and educators think through the use of generative AI without defaulting to either extreme of banning it or allowing unrestricted use.

The speed at which generative AI is evolving makes this a critical time for educators and districts to have guidelines about when and how they use it.

“AI is everywhere. It’s doing everything for everyone and anyone that’s interested,” says Mark Smith, executive director of the Michigan Association for Computer Users in Learning. “By the time we get a handle on the one-, three-, five-year plan, it’s changing right underneath our noses. We must get in front of this now with a nimble, flexible, guideline policy or strategy as a collective whole. AI is only going to continue to change.”

School principals want to know how AI can be used in the classroom beyond having students copy and paste from it, Paul Liabenow says, and they are, of course, concerned about students using it to cheat.

But many of the questions he gets as executive director of the Michigan Elementary and Middle School Principals Association focus on whether AI programs comply with student privacy laws such as FERPA and the Individuals with Disabilities Education Act, Liabenow explains.

The AI guidance document urges educators to always assume that, unless the company that owns a generative AI tool has an agreement with their school district, the data they’re inputting is going to be made available to the public.

Liabenow says one of his confidentiality concerns is over any teacher, counselor or administrator who might want to use an AI program to manage student data about mental health or discipline — something that has the potential to end with a lawsuit.

Smith, of the Michigan Association for Computer Users in Learning, says the privacy pitfalls aren’t in the everyday use of generative AI but in the growing number of apps that may have weak data protection policies — the kind of agreement that virtually no one reads when signing up for an online service. It may become easier to run afoul of privacy laws, he adds, given proposed changes to strengthen the Children’s Online Privacy Protection Act.

“How many of us have downloaded the updated agreement for our iPhone without reading it?” Smith says. “If you magnify that to 10,000 students in a district, you can imagine how many end user agreements you’d have to read.”

Teachers use generative AI to create lesson plans, and any school district employee could use it to help write a work document. “The more we disclose the use of AI and the purpose, the more we uplift everybody in the conversation,” Dirkin says. “I don’t think in two or three years people will be disclosing the use of AI — it’ll be in our workflows — but it’s important to learn from each other and tie it back to human involvement in the process.”

Generative AI is increasingly being integrated into software that is already widely used. That growing ubiquity will make AI-powered education tools easier to access but more complicated to use safely, Dirkin says.

“A lot of times, it’s the Wild West in terms of access to tools. Everybody has a Google account, and people can use their Google account to log into a ton of free services,” Dirkin says. “We wanted to make sure people had a tool to reflect on whether they’re using it in a legal or ethical [way], or if they’re violating some sort of policy before they do that. So just stop and think.”

A section of the new guidelines asks educators to consider how something generated by AI might be inaccurate or contain bias. Even as generative AI gets better, Dirkin says, “there are risks and limitations to all AI, no matter how good it is.”

“Sometimes the best data set for an educator is the teacher down the hall with 10 years more experience, and not an AI tool,” Smith says. “There is still a human element to this.”

EdSurge

