Communication Currents


Alexa and the White Femininity of Artificial Intelligence

March 4, 2021
Critical and Cultural Studies, Digital Communication & Gaming, Feminist Studies

In the 2013 film Her, a man falls in love with an artificially intelligent virtual assistant (AI VA) named Samantha, voiced by Scarlett Johansson, a white woman. Real AI VAs, such as Siri and Alexa, are also coded as feminine through their names and voices. In a new article published in NCA’s Communication and Critical/Cultural Studies, Taylor C. Moran argues that a commercial for Amazon’s Alexa reveals the ways in which the device is coded as white and feminine, and considers the implications of that coding.

Technology, Voice, and Gender 

Moran argues that white male voices have been normalized as authoritative through education, including speech classes. Historically, the “nonaccented” masculine voice was considered ideal in speech classrooms, particularly in the 1930s. Although this may no longer be true in many classrooms, the stereotypes that associate white masculine voices with authority, power, and intelligence linger. Likewise, there are stereotypes associated with other types of voices. White feminine voices may be perceived as weak or uneducated, while Black feminine voices are stereotyped as “loud.” 

In the United States, many AI VAs have feminized voices. Because these voices may be perceived as “white,” as in the case of Samantha in Her, they associate whiteness with authority and knowledge. While women’s voices may be stereotyped as “weak,” Moran argues that these feminine voices are “not whiney… or cutesy.” Furthermore, voice technology is intended to create an intimate relationship between the user and the AI VA. The use of a feminine voice in this regard is intentional and builds on stereotypes that link femininity with domestic labor.

Feminized Labor 

Moran writes that in the United States, AI VAs’ labor is feminized because the home is traditionally the realm of women’s labor, such as homemaking and caretaking. The role of AI VAs is enhanced by “smart” appliances that incorporate internet access, which, among other features, allows users to control those appliances with voice commands. These AI VAs perpetuate long-standing ideas about domestic robots, such as Rosie from The Jetsons, that are designed to serve.

AI VAs and White Femininity 

In the article, Moran recounts a 2017 Alexa commercial that featured “a menagerie of celebrities: rapper Cardi B, actress Rebel Wilson, actor Anthony Hopkins, and Chef Gordon Ramsay.” In the commercial, each celebrity attempts to fill in for Alexa, who is sick. For example, Ramsay berates a home chef who asks for a recipe, as is typical of Ramsay’s behavior in Hell’s Kitchen. The commercial closes by implying that none of the celebrities would be an acceptable substitute for Alexa. 

According to Moran, the contrasts among the celebrity voices also serve to show “what the AI VA voice is by showing what it is not.” Ramsay’s and Hopkins’s masculine voices not only fail to accomplish the task but also strike the wrong tone for the job. Their aggressive masculine voices contrast with Alexa’s authoritative, yet servile, feminine voice.

Moran argues that Cardi B’s and Wilson’s voices also offer distinct contrasts that help define the Alexa voice. When AI VAs are visualized, they are typically depicted as thin, white women with a servile demeanor. In the commercial, Wilson is asked to “set the mood” during a party. In response, from a bubbly bathtub, Wilson says, “You’re in the bush, and you’re just so dirty and so sweaty, because it’s hot in that bush…” Moran argues that Wilson’s performance contrasts with the expectations for Alexa in two ways. First, although Wilson is white, she does not have the “American” voice associated with Alexa; instead, she speaks brashly and performs an Australian stereotype by using Australian slang. Second, Wilson’s curvy physique and overt sexuality do not fit the thin stereotype associated with servile white femininity.

AI VAs are also presented as possessing limitless knowledge; users can ask Alexa anything and receive a response. In the commercial, the all-knowing Alexa is contrasted with Cardi B, who responds to a question about the distance to Mars by saying, “How far is Mars? Well, how am I supposed to know? I never been there. This guy wants to go to Mars. For what? There’s not even oxygen there.” Cardi B’s New York City accent and professed lack of knowledge perpetuate stereotypes that portray Black women as unintelligent.

Conclusion 

AI VAs are coded as white, feminine, and servile. Moran argues that these characteristics make the devices feel less threatening to many users; thus, despite the privacy concerns of an always-on microphone, users may bring multiple Amazon, Apple, or Google devices into their homes because of the convenience of using voice commands to accomplish daily tasks. These devices can then collect data to be used by advertisers, by marketers, or in other ways users did not envision. Furthermore, Moran argues that “as the relative costs of these devices decline, every citizen may gain access to their own personal assistant, freeing them up for labor, but increasing the manual labor of the exploited populations that construct the machines.” Thus, these devices aim to relieve upper-middle-class users of certain obligations while simultaneously furthering the global exploitation of the people who produce them.

This essay was translated by Mary Grace Hébert from the scholarly journal article: Taylor C. Moran (2020). “Racial technological bias and the white, feminine voice of AI VAs.” Communication and Critical/Cultural Studies. DOI: 10.1080/14791420.2020.1820059

About the author

Taylor C. Moran

Louisiana State University

Doctoral Candidate, Department of Communication Studies
