
ChatGPT's use of a Scarlett Johansson-like voice reflects a troubling history of gender stereotypes in technology

Actress Scarlett Johansson published a statement this week expressing anger and concern that OpenAI had used a voice “eerily similar” to her own as the default voice for ChatGPT.

The voice in question, called Sky, has been available to users since September 2023, but its similarity to Johansson's voice became more apparent last week when OpenAI demonstrated an updated model called GPT-4o. Johansson claims that Sam Altman, the CEO of OpenAI, had previously asked her to supply her voice for ChatGPT and that she declined the invitation.

The warm and playful tone of Sky's voice bears a striking resemblance to that of the digital companion named Samantha in the film “Her” (2013), voiced by Johansson. Although Altman has since claimed that Sky's voice was never meant to resemble Johansson's, he appeared to allude to the connection by tweeting the single word “her” on May 13, 2024 – the day GPT-4o was launched.

OpenAI has now published an account of its process for creating Sky's voice in a blog entry, which states that the voice was provided by “a different professional actress using her own natural speaking voice.” As ever smaller audio samples can be used to generate synthetic voices, cloning a person's voice without their consent is easier than ever.

As a sound studies scholar, I am interested in the ways in which AI technology raises new questions and concerns about voice and identity. My research situates recent developments, fears and hopes regarding AI within a longer history of voice and technology.

Stolen voices

This is not the first time an artist has objected to an unauthorized simulation of their voice.

In 1988, Bette Midler sued the Ford Motor Company for using a sound-alike of her voice in a series of advertisements. The U.S. Court of Appeals for the Ninth Circuit ultimately ruled in her favor, with Judge John T. Noonan writing in his decision that imitating her voice was “tantamount to stealing her identity.”

Tom Waits filed a similar and successful lawsuit against Frito-Lay after hearing what sounded like his own gravelly voice in a radio commercial for Doritos. As musicologist Mark C. Samples describes, in this case, from a legal perspective, “a person’s vocal timbre was elevated to the level of his or her visual representation.”

Legislators have only just begun to address the challenges and risks associated with the increasing use of artificial intelligence.

For example, a recent decision of the Federal Communications Commission banned robocalls that use AI-generated voices. In the absence of a more specific policy and legal framework, these voice-imitation cases continue to serve as essential precedents.

Chatbots and gender

OpenAI’s apparent reference to the film in the design of Sky’s voice also places ChatGPT within a long tradition of assigning female voices and personalities to computers.

The trailer for Spike Jonze’s 2013 film “Her”.

The first chatbot was developed in 1966 by MIT professor Joseph Weizenbaum. Called ELIZA, it was designed to converse with its users in the manner of a psychotherapist. ELIZA served as a model and reference point for today's digital assistants, which often have feminized voices as the default setting. When it first came on the market in 2011, Siri talked about ELIZA as if it were a friend.

Many scholars of technology, including Thao Phan and Heather Woods, have criticized how tech companies rely on gender stereotypes when designing their voice assistants.

Communication scholars Jessa Lingel and Kate Crawford suggest that voice assistants take on the historically feminized role of the secretary, performing both administrative and emotional labor. By invoking this subservient stereotype, they argue, technology companies attempt to distract users from the surveillance and data extraction that voice assistants carry out.

OpenAI says that when casting ChatGPT's voices, it was looking for “an approachable voice that inspires trust.” It is telling that the voice the company chose to make users feel comfortable with the rapid advances in AI technology sounds like a woman. Even as the conversational capabilities of voice assistants continue to advance, Sky's voice shows that the tech industry has not yet moved beyond these regressive stereotypes.

Protecting our voices

Johansson's statement ends with a call for “transparency and the adoption of appropriate laws” to protect voice likeness and identity. It will indeed be interesting to see what legal and political consequences arise from this high-profile case of unauthorized voice simulation.

But celebrities aren’t the only ones who should be concerned about how their voices are used by AI systems. Our voices are already recorded by platforms such as Zoom and Otter.ai and used to train AI, including virtual assistants like Alexa.

The unauthorized artificial imitation of Johansson's voice may seem like a story from a dystopian future, but it is best understood in the context of ongoing debates about voice, gender and privacy. It is not a sign of what is to come, but of what already exists.
