The question of what makes a person AI literate is evolving, according to an Education Week article. AI literacy is something that every student needs exposure to—not just those who are planning on a career in computer science, experts argue.
Experts and educators weigh in on how to begin developing AI literacy:
1) Understand the basics of how AI works
Knowing the technical aspects of AI—how the technology perceives the world, how it collects and processes data, and how those data can inform decisions and recommendations—can help temper the misperception that AI is an all-knowing, infallible force, experts say.
Students want to know more about AI. More than nine in 10 teens say they would be interested in learning in high school about how to work with artificial intelligence, according to a survey by the nonprofit Junior Achievement with the marketing and data firm Big Village.
All students must grasp that the decisions AI makes—whether recommending a particular pair of boots to an Amazon customer or flagging a job applicant as a promising prospect for an employer—aren’t driven by the same kind of reasoning a human can perform.
Students should have some idea of how machines perceive the world. Discuss technologies such as speech recognition, sensors, and machine vision, and how each of them works.
Children in early-elementary school, for instance, could start with a simple lesson in which they identify the human organs—ears, eyes—involved in hearing and seeing and then find their technological counterparts—microphone, camera—on a digital device.
Students must also know how biases in the data that’s used to train AI can perpetuate discriminatory policies unless humans recognize the problem and do something about it.
2) Get hands-on to understand how the technology works
One hands-on lesson for more advanced students: Give them a flawed historical dataset to train an AI system. Students could create a program that gives suggested salary ranges for a company’s employees. If that program uses data in which women are paid less than men for doing the same job, the technology will probably propose lower salaries for female employees than for male workers. If women are at salary parity with men in the dataset, the results will be more equitable.
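The flawed-dataset lesson above can be sketched in a few lines. This is a minimal, hypothetical illustration—the roles, salaries, and the naive per-group-average "model" are all made up for the exercise, not a real AI system—but it shows how a pay gap in the training data becomes a pay gap in the recommendations:

```python
# Minimal sketch (hypothetical data): a naive "salary model" that learns the
# average salary per (role, gender) group from historical records. Because the
# records pay women less for the same role, the model reproduces that gap.
from collections import defaultdict

historical = [  # (role, gender, salary) -- illustrative numbers only
    ("engineer", "M", 95000), ("engineer", "M", 98000),
    ("engineer", "F", 82000), ("engineer", "F", 85000),
    ("analyst",  "M", 70000), ("analyst",  "F", 61000),
]

def train(records):
    # Accumulate (total salary, count) per group, then average.
    sums = defaultdict(lambda: [0, 0])
    for role, gender, salary in records:
        sums[(role, gender)][0] += salary
        sums[(role, gender)][1] += 1
    return {group: total / n for group, (total, n) in sums.items()}

model = train(historical)
print(model[("engineer", "M")])  # 96500.0
print(model[("engineer", "F")])  # 83500.0 -- the pay gap is learned, not corrected
```

Students can then edit the dataset to equal pay and retrain, seeing directly that the "suggestions" are only as fair as the history they were trained on.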
Educators can show how human subjectivity can creep into AI’s decision-making. In a middle school AI course that Georgia is piloting, for example, students play a classroom game that begins by asking them to decide whether something is edible. Depending on their answers, students choose where to stand in the room.
Some answers are obvious: A metal stop sign is not edible, for instance. But there can be disagreement on a word like “chicken,” which vegetarians in the class may claim is not edible.
The game adds more and more categories and subcategories to the list of options, mimicking how AI algorithms can work.
Students are shown that the computer makes those same determinations based on the viewpoints of most people it interacts with—and those determinations are often faulty ones.
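The majority-vote dynamic the game illustrates can be sketched as a toy classifier. The items and vote counts here are invented for illustration; the point is that the minority view (the vegetarians) simply gets outvoted:

```python
# Sketch (hypothetical labels): like the classroom game, this "classifier"
# adopts the majority view of the people it interacts with, so a minority
# perspective -- e.g. vegetarians labeling chicken as not edible -- is overruled.
from collections import Counter

votes = {  # item -> labels collected from different people (made-up counts)
    "stop sign": ["not edible"] * 10,
    "chicken": ["edible"] * 7 + ["not edible"] * 3,  # vegetarians disagree
}

def predict(item):
    # Return the most common label for the item.
    return Counter(votes[item]).most_common(1)[0][0]

print(predict("stop sign"))  # not edible
print(predict("chicken"))    # edible -- the majority viewpoint wins
```

Changing the vote counts changes the "right answer," which is exactly the lesson: the determination reflects whoever supplied the data, not an objective truth.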
3) Address ethical questions about the technology
Once students know that humans—and not some sentient robot—are behind how these tools analyze and communicate, they can think about them in a broader context.
This raises important and interesting ethical issues. In Georgia’s middle school AI course, for instance, students might consider a case of passengers going to sleep in the backseat of a self-driving car while it continues along the road.
Questions follow without easy answers: “What are the legal implications of that? Do we need to stop them from doing that? Or is AI to the point where you can sleep in the backseat of a car and let it drive itself?”
Students can discuss legislation in states that have banned facial-recognition software, which has falsely flagged some people as having a criminal record. Should all states ban facial-recognition software until it becomes more accurate? Or could it be used for some purposes but not others?
They can also discuss data-privacy concerns, including the implications of having devices such as smart speakers—Amazon’s Alexa, Google Assistant, and Apple’s Siri, among others—in homes. Teachers can pose questions: “How is that impacting you? Is it listening to you all the time? Is it sending information back to Google or Amazon all the time?” This helps students guard against the unintended consequences of AI.
4) Learn how to engage effectively with AI
Students need to practice using AI tools to get information, the same way previous generations learned the card-catalog system to navigate the library.
For instance, a student could tell ChatGPT to “write about the American Revolution.” A very textbook response will follow. The student could then say, “write about 15 women who shaped the American Revolution,” or “draw connections between 15 women today and the American Revolution.” The way you prompt the tool completely changes the output, and with it your thinking and learning.
The stakes in the real world can be a lot higher than a classroom assignment. Getting the prompt right can turn into a matter of life and death for a doctor using AI to pinpoint a diagnosis.
That’s why it’s not a good idea to ban ChatGPT or other AI tools, as New York City and some other school districts have done, some experts argue. While the nation’s largest public school district essentially banned the technology except for use in limited circumstances, at least one Manhattan private school is offering instruction in “prompt engineering” — an AI skill that could lead to a lucrative job.
5) AI skills are not just for computer science experts
Students need to be exposed to how AI is being used in the workforce today and how they might use the technology in their future careers, even if they don’t go into a computer science field.
AI affects a wide range of roles across many industries. Students should know that when they enter the workforce, even if they are not building AI products, they will probably be using them to help get their jobs done.
AI will work best when it is designed by people who are part of the community that the tool is aimed at serving. Having people from a variety of backgrounds—racial, socioeconomic, age, gender, profession—can help root out some of the biases embedded in society and AI.
Students from communities that are underrepresented in the AI field need to understand that by getting in on the ground floor of this technology, they can help ensure that it works better.
Give students the education, the curriculum, the tools and the community so they can make a difference.
Education Week