The world may soon have more voice assistants than people—yet another indicator of the rapid, large-scale adoption of artificial intelligence (AI) across many fields. The benefits of AI are significant: it can drive efficiency, innovation, and cost savings in the workforce and in daily life. Nonetheless, AI raises concerns about bias, automation, and human safety that could compound historical social and economic inequalities.

One area deserving greater attention is how AI bots and voice assistants promote unfair gender stereotypes. Around the world, customer-facing service robots, such as automated hotel staff, waiters, bartenders, security guards, and child care providers, feature gendered names, voices, or appearances. In the United States, Siri, Alexa, Cortana, and Google Assistant have traditionally featured female-sounding voices.

As artificial bots and voice assistants become more prevalent, it is crucial to evaluate how they depict and reinforce existing gender-job stereotypes and how the composition of their development teams affects these portrayals. AI ethicist Josie Young recently said that “when we add a human name, face, or voice [to technology] … it reflects the biases in the viewpoints of the teams that built it,” echoing growing academic and civil society commentary on this topic. Going forward, the need for clearer social and ethical standards for the depiction of gender in artificial bots will only grow as these systems become more numerous and technologically advanced.

Given their early adoption in the mass consumer market, U.S. voice assistants offer a practical example of how AI bots raise fundamental criticisms about gender representation and how tech companies have addressed these challenges. In this report, we review the history of voice assistants, gender bias, the diversity of the tech workforce, and recent developments regarding gender portrayals in voice assistants.

Read the full article about voice command bots by Caitlin Chin and Mishaela Robison at Brookings.