By Jennifer Kiilerich
As artificial intelligence is rapidly deployed across every sector of public life, including education, it is becoming increasingly important to understand how younger children interact with AI algorithms. Future programmers will need to consider the social and ethical impacts of technology, contends Vanderbilt Peabody College of education and human development professor Golnaz Arastoopour Irgens.
"There's a myth that computing, programming and technology are objective. But people make computers, people program computers," said Arastoopour Irgens, assistant professor of human-centered technologies. "Algorithms are essentially baked-in opinions and world views held by the people making them."
Arastoopour Irgens wants to find out whether teaching children to build their own computing systems at a young age will help them think more critically about the role of humans in designing technology. She is currently heading up the second year of the five-year, $1.5 million National Science Foundation-funded study, "A research-practice partnership for co-designing and implementing critical computing elementary education curricula."
Inspiring curiosity with games
As a former middle school computer science and high school math teacher, Arastoopour Irgens understands that kids want to have fun. After spending years co-designing activities, learning apps and robots with elementary students in informal, after-school settings, her takeaway was that kids are eager to engage with machine learning and AI through a social and critical lens, but they want it to be playful, and they like to learn through stories.
The result is S.P.O.T. (Solving Problems of Tomorrow), a game that Arastoopour Irgens brainstormed alongside school children and is now collaborating on with teachers. She plans to launch and study the game in a few Middle Tennessee upper-elementary classrooms this fall.
The game takes learners on a role-playing adventure as secret agents who time travel to the future. They are tasked with uncovering how and why technology has harmed future populations. In one activity, a self-driving car has misclassified children, and kids are unable to get to their soccer practices and birthday parties, something easily relatable for young learners.
Game players then return to the present and apply what they've discovered with hands-on activities such as programming their own machine learning algorithms, including with physical robots and interactive props like badges and stickers. "Through our experience," said Arastoopour Irgens, "physical computing is something that's really engaging for younger kids. Just classifying something and saying that it's a rabbit is not as exciting as seeing a robot do a dance when it sees a rabbit."
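To make the "classify, then react" pattern concrete, here is a minimal Python sketch of the kind of activity described above. It is not code from the S.P.O.T. curriculum; the Robot class, the feature values and the classify function are all hypothetical stand-ins.

```python
# Illustrative sketch only -- not the actual S.P.O.T. curriculum code.
# A toy nearest-centroid classifier labels an input, and a pretend robot
# dances when the label is "rabbit". All names and numbers are made up.

import math

# Toy "training" summary: average (ear_length_cm, tail_length_cm) per animal.
CENTROIDS = {
    "rabbit": (9.0, 5.0),
    "squirrel": (2.5, 20.0),
    "chipmunk": (1.5, 9.0),
}

def classify(features):
    """Return the label whose centroid is closest to the input features."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

class Robot:
    """Stand-in for a physical classroom robot."""
    def dance(self):
        print("Robot wiggles: it saw a rabbit!")
    def sit_still(self):
        print("Robot stays put.")

if __name__ == "__main__":
    robot = Robot()
    observation = (8.5, 6.0)   # measurements of the animal the robot "sees"
    label = classify(observation)
    print("Classified as:", label)
    if label == "rabbit":
        robot.dance()          # the physical feedback that makes it fun for kids
    else:
        robot.sit_still()
```

The design point is the one Arastoopour Irgens makes: the classification itself is abstract, but wiring its output to a visible action is what makes the computing physical and engaging.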
Cultivating critical thinking in AI
Because programming is a human endeavor, AI doesn't always accurately reflect individuals or the complex social nature of our world. Consider one common artificial intelligence tool: facial-recognition software. Facial recognition is being used in more and more settings, from scanning sports fans into stadiums to screening travelers at airports and identifying criminal suspects, making it increasingly important that the software identifies people correctly.
Though children in the S.P.O.T. game may choose to build models that classify cute animals like bunnies, squirrels and chipmunks, the concept is universal. If their robots misidentify an animal, students must grapple with questions like, "Did you leave out any bunnies? Did you think about different colored bunnies?" "We would talk about the consequences of excluding animals from training data sets," said Arastoopour Irgens. "But the children also learn that the stakes are much higher with humans."
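The consequence of excluding examples from a training set can be illustrated with a toy sketch. The example below is hypothetical, not part of the study: a tiny nearest-neighbor classifier trained only on white rabbits mislabels a brown rabbit as a chipmunk, and adding brown rabbits to the training data corrects it. All feature names and values are invented for illustration.

```python
# Hypothetical sketch of the "missing bunnies" discussion.
# Features are (fur_brightness 0-1, ear_length relative to head 0-1).

import math

def nearest_label(sample, training_data):
    """1-nearest-neighbor: return the label of the closest training example."""
    features, label = min(training_data, key=lambda ex: math.dist(sample, ex[0]))
    return label

biased_training = [
    ((0.95, 0.90), "rabbit"),    # only white rabbits in the training set...
    ((0.92, 0.90), "rabbit"),
    ((0.40, 0.30), "chipmunk"),  # ...while the dark-furred animals are chipmunks
    ((0.35, 0.25), "chipmunk"),
]

brown_rabbit = (0.35, 0.85)      # dark fur, long ears

# Misclassified: no dark-furred rabbits were in the training data.
print(nearest_label(brown_rabbit, biased_training))    # -> "chipmunk"

# Adding brown rabbits to the training set repairs the gap.
balanced_training = biased_training + [
    ((0.35, 0.90), "rabbit"),
    ((0.40, 0.85), "rabbit"),
]
print(nearest_label(brown_rabbit, balanced_training))  # -> "rabbit"
```

With animals, the cost of that gap is a robot that fails to dance; with people, as the facial-recognition example suggests, the stakes are far higher.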
Arastoopour Irgens directs Vanderbilt's IDEA Lab, which focuses on creating welcoming digital learning environments in engineering, computer science and computational statistics, and she is a board member for the LIVE Learning Innovation Incubator, which develops tools to support learning and training across disciplines. She subscribes to "slow research," in which she spends time building relationships and carefully developing studies and tools. With this project, Arastoopour Irgens hopes to leave behind a sustainable, and fun, curriculum that teaches kids not only how technology works, but also how to think seriously about its ethics and consequences.