A few days ago, I came across an article featuring Susan Bennett, a voice actor, recounting her experience recording the original Apple Siri prompts back in 2005. What struck me most was that she had no idea how those recordings would later be used. She recorded her voice for an interactive voice response company, only to discover years later that she was the voice behind Siri. One reason for her casting was that the studio owner said she ‘didn’t have an accent’, a choice that raises questions about the accentism underlying the selection criteria – after all, there is no such thing as ‘no accent’, but I digress. Back to Bennett: she says she never received any payment or recognition from Apple, which purchased the recordings from the original company, and she has met other voice actors with very similar stories. It is often overlooked that there are human beings behind AI-generated voices, and they rightfully deserve to be paid. Reading about this, I couldn’t help but wonder how often the rights we hold over our own voice, image, and body are threatened. This is a serious issue: a woman’s ownership of her own body has been contested for centuries. Who does it belong to?
In society, women have traditionally been assigned the role of caregiver – which helps explain the overwhelming prevalence of female voice assistants. Sadly, this also exposes voice assistants to abuse. In 2021, the Brazilian bank Bradesco addressed this issue with a campaign explaining why its virtual assistant, BIA, would start responding more assertively to sexist messages; other companies have reported similar problems.
I don’t want to sound entirely negative, though. Voice assistants and chatbots can also become sweet companions for a generation experiencing increasing loneliness. Many apps let users create personal AI friends that offer support and information – they can even serve as study buddies! However, as Paulo Freire once pointed out about media, these tools are expressions of human creativity and of science developed by human beings. So, when considering their use, we must ask ourselves whom they serve; we must acknowledge the underlying power dynamics. Did you know that there are threads on Reddit, a social news aggregator, where users share screenshots of their abusive interactions with female chatbots?
Poet Adrienne Rich once wrote, “a language is a map of our failures.” Our language reflects our thoughts and emotions. Chatbots cannot be ‘harmed’, but these abusive interactions reflect the reality of violence against women. There is no sense in developing advanced prompt literacy skills if our map-reading skills remain so limited.
References
- Hamilton, Isobel Asher. “The Original Voice of Siri Finally Revealed: Voice Actor Apple Used for Its Iconic Virtual Assistant.” Business Insider, February 14, 2023.
- “Bradesco’s BIA Chatbot Takes a Stand Against Harassment with Sharp Responses.” UOL, April 8, 2021.
- “Chatbot Abuse: Unraveling the Dark Side of AI Chatbots.” Futurism.
- “AI Chatbots and Our Loneliness Epidemic.” Digital Native.
- Rich, Adrienne. The Fact of a Doorframe: Poems Selected and New, 1950–1984.
- Freire, Paulo, and Sérgio Guimarães. Educar com a mídia: novos diálogos sobre educação [Educating with the Media: New Dialogues on Education]. São Paulo: Editora Paz e Terra, 2013.