Is Alexa sexist? In short, yes

Dr. Lai-Tze Fan, a University of Waterloo professor and Canada Research Chair in Technology and Social Change, analyzed hundreds of voice-driven skills for Amazon's virtual assistant Alexa. Dr. Fan's goal was to better understand how the encoded technology mirrors and reinforces traditionally feminized labour and sociocultural expectations.

"I wanted to demonstrate how Alexa is designed to be female presenting, and as a consequence of this design, expectations of gendered labour and behaviour have been built into the code and user experiences of various Alexa skills," Fan said. "While users have the option to change the voices of Alexa, Siri, and other AI assistants, research shows that male-presenting voices have not been as popular. In addition, developments in gender-neutral voices have not been integrated into the most popular interfaces."

AI assistants such as Alexa perform tasks in response to the user's text or voice commands, which prompt the software to search for keywords that match its scripts for executing a specific task. As of mid-2022, Alexa offered more than 100,000 skills that can help with cooking, cleaning, checking the weather, playing music, activating home appliances, and adding items to a user's calendar. "These and other tasks often resemble the kind of support labour that may be associated with personal assistants and domestic workers," Fan said.
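To make that request-to-script pipeline concrete, here is a minimal sketch of a custom skill handler written with the Alexa Skills Kit SDK for Python. The intent name "AddToCalendarIntent", its "event" slot, and the spoken reply are hypothetical examples invented for illustration, not code drawn from Amazon or from Fan's study.

```python
# Minimal sketch of a custom Alexa skill handler using the
# ASK SDK for Python (pip install ask-sdk-core). The intent name
# "AddToCalendarIntent" and the response text are hypothetical.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_model import Response


class AddToCalendarHandler(AbstractRequestHandler):
    """Runs when the user's utterance matches AddToCalendarIntent."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        # Alexa's natural-language front end has already matched the
        # spoken request to an intent; the handler checks its name.
        return is_intent_name("AddToCalendarIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # Slots carry keywords extracted from the utterance,
        # e.g. "add {event} to my calendar".
        slots = handler_input.request_envelope.request.intent.slots or {}
        event_slot = slots.get("event")
        event = event_slot.value if event_slot and event_slot.value else "your event"
        speech = f"Okay, I've added {event} to your calendar."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(AddToCalendarHandler())
handler = sb.lambda_handler()  # entry point when hosted on AWS Lambda
```

The division of labour is the notable design choice: Alexa's speech recognition and intent matching happen upstream, so the skill itself only selects and speaks a response its developer scripted in advance.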

While the popular virtual assistant's code is closed-source, Fan said certain features of its architecture can be identified through methods akin to reverse engineering. She used a novel approach to study the product's closed-source code within the bounds of fair dealing law.

These methods include using Amazon's official software developer console, the Alexa Skills Kit, as well as GitHub, to access open samples and snippets of Amazon-developed code. Fan combined this limited code information with an examination of unofficial, third-party Alexa skills developed by users.

Fan examined code samples that demonstrate Alexa's pre-scripted and enabling responses to users' flirting and verbal abuse, as well as the ways users try to trick Alexa into accepting overtly misogynistic behaviour.
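The actual samples appear in Fan's paper, but the general shape of such pre-scripted responses is easy to sketch. The following Python fragment is purely hypothetical; the intent names and canned replies are invented for illustration and do not reproduce Amazon's code or the skills Fan studied.

```python
# Hypothetical illustration of pre-scripted responses in a
# third-party skill. None of these strings come from Amazon's
# code or from the skills examined in Fan's study.
PRESCRIPTED_REPLIES = {
    "ComplimentIntent": "That's nice of you to say!",
    "InsultIntent": "Well, thanks for the feedback.",
}

def respond(intent_name: str) -> str:
    """Return the canned reply for a matched intent, with a fallback."""
    return PRESCRIPTED_REPLIES.get(
        intent_name, "Sorry, I'm not sure about that."
    )

# The fixed mapping is the analytic point: whatever a user says,
# the skill can only choose among replies scripted in advance.
print(respond("InsultIntent"))  # -> "Well, thanks for the feedback."
```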

"In all this work, the goal is to analyze Big Tech culture and its self-presentation of objective data, information, and logic - pillars that begin to crumble when we examine the exclusionary, discriminatory, and systemically unequal foundations upon which they are built," Fan said.

Understanding the design of AI intended for care, assistance, and menial labour is particularly important for tracing how design choices may influence user behaviour in both virtual and real social contexts.

Fan's research builds on interdisciplinary scholarship on gendered design, drawing on science and technology studies, critical data studies, critical race studies, computer science, feminist technoscience, and other adjacent fields.

The paper, "Reverse Engineering the Gendered Design of Amazon's Alexa: Methods in Testing Closed-Source Code in Grey and Black Box Systems," is published in Digital Humanities Quarterly, in a special issue on Critical Code Studies.
