The Controversy of AI Voices and Personal Rights: The Case of "Sky"

The removal of "Sky" from ChatGPT highlights the complexity of AI-generated voices and the need to balance innovation with the protection of personal rights and likeness.

May 21, 2024 - 09:00

We used Midjourney to create this image based on the image of Joaquin Phoenix in a poster for the movie “Her” - is that ethical?

In the rapidly evolving world of AI, OpenAI recently faced a significant challenge: the removal of a voice named "Sky" from ChatGPT due to its striking resemblance to Scarlett Johansson's voice in the movie "Her." Despite being recorded by a voice actor, "Sky" sounded remarkably similar to Johansson, raising questions about personal rights and the ethical use of AI-generated voices.

The Rise of AI Voices

AI voice technology has made substantial strides, allowing for the creation of highly realistic and versatile voices. These voices are used in various applications, from virtual assistants and customer service to entertainment and education. However, as AI voices become more lifelike, the line between human and synthetic voices blurs, leading to complex legal and ethical issues.
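To give a concrete sense of how accessible this technology has become, here is a minimal sketch of zero-shot voice cloning using the open-source Coqui TTS package and its XTTS v2 model. This is purely illustrative and has nothing to do with how OpenAI built "Sky"; the reference clip and output path are hypothetical placeholders.

```python
# A minimal sketch of zero-shot voice cloning with the open-source Coqui TTS
# package (pip install TTS). The file names below are hypothetical placeholders.
from TTS.api import TTS

# Load the multilingual XTTS v2 model, which can clone a voice from a short
# reference recording.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize new speech in the style of the reference speaker.
tts.tts_to_file(
    text="Hello, this voice was generated from a few seconds of audio.",
    speaker_wav="reference_voice.wav",  # short sample of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```

The point is not the specific library but that a few lines of code and a short audio sample are now enough to produce speech that many listeners struggle to distinguish from the original speaker.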

The Issue of Personal Rights

The case of "Sky" highlights the potential infringement on personal rights when AI-generated voices closely mimic real individuals. Scarlett Johansson's voice, as portrayed in "Her," is distinctive and recognizable. The use of a similar-sounding AI voice can be seen as a form of impersonation, which may violate the original voice owner's rights to privacy and control over their voice.

Voice Actors and Similar-Sounding Voices

By contrast, voice actors have long used their ability to mimic or emulate the voices of famous personalities. This practice is generally accepted in the entertainment industry, provided that the imitation is not intended to deceive or exploit the original voice owner. Voice actors bring their unique talent to create voices that evoke certain emotions or characters, often adding their personal flair to the imitation.

Ethical and Legal Considerations

The use of AI to replicate voices adds a layer of complexity to this dynamic. AI-generated voices can be fine-tuned to sound nearly identical to a target voice, raising concerns about consent, attribution, and compensation. There are several key considerations:

  • Consent: Did the original voice owner consent to their voice being replicated?

  • Attribution: Is the AI-generated voice being attributed correctly, or is it misleading users about the origin?

  • Compensation: Is the original voice owner being compensated for the use of a voice that closely resembles theirs?
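Whether a synthetic voice "closely resembles" a real one need not be a purely subjective judgment. As a rough illustration (not how OpenAI evaluated "Sky"), the sketch below compares two recordings using speaker embeddings from the open-source Resemblyzer library; the file names are hypothetical placeholders.

```python
# A rough sketch of quantifying how similar two voices sound, using speaker
# embeddings from the open-source Resemblyzer library (pip install resemblyzer).
# The file names below are hypothetical placeholders.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# Load and normalize the two recordings to compare.
original_wav = preprocess_wav("original_speaker.wav")
synthetic_wav = preprocess_wav("synthetic_voice.wav")

# Each utterance is mapped to a fixed-length, L2-normalized embedding vector.
original_embed = encoder.embed_utterance(original_wav)
synthetic_embed = encoder.embed_utterance(synthetic_wav)

# Cosine similarity of normalized embeddings: values near 1.0 suggest the
# two recordings sound like the same speaker.
similarity = float(np.dot(original_embed, synthetic_embed))
print(f"Speaker similarity: {similarity:.3f}")
```

Speaker-verification systems use thresholds on scores like this; such a number does not settle questions of consent, attribution, or compensation, but it shows that resemblance can be measured rather than merely asserted.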

Balancing Innovation and Rights

While AI voice technology offers incredible opportunities, it also requires a careful balance between innovation and respect for personal rights. Companies like OpenAI must navigate these waters thoughtfully, ensuring that their use of AI voices does not infringe on individual rights or mislead consumers.

The removal of "Sky" from ChatGPT is a step towards addressing these concerns, but it also underscores the need for clear guidelines and regulations in the AI industry. As AI continues to evolve, ongoing dialogue between technology developers, legal experts, and the public will be essential to create a framework that supports innovation while protecting personal rights.

The controversy surrounding the "Sky" voice in ChatGPT serves as a reminder of the complex intersection between AI technology and personal rights. As AI voices become more advanced, it is crucial to address the ethical and legal implications to ensure that these technologies are used responsibly and fairly. The case of "Sky" highlights the importance of consent, attribution, and compensation in the use of AI-generated voices, setting a precedent for the future development and deployment of AI in voice applications.

To celebrate this moment in AI history, the AI Store has created a T-shirt saying bye-bye to Sky.

Bye Bye Sky T-Shirt (unisex)
https://www.artificial-intelligence.store/products/bye-bye-sky-t-shirt-unisex

But surely, since the voice was recorded by a voice actor, Scarlett Johansson should have no claim to privacy here? Or is it more complex?

The Complexity of AI Voices and Personal Rights: A Closer Look

The removal of "Sky" from ChatGPT brings into focus the nuanced interplay between AI technology and personal rights. Although "Sky" was recorded by a voice actor, its resemblance to Scarlett Johansson's voice from the movie "Her" raises important questions about privacy and likeness rights.

The Voice Actor's Role

Voice actors often imitate or emulate the voices of famous personalities, a practice generally accepted in the entertainment industry. These imitations are typically regarded as creative expressions rather than direct impersonations, especially when they add unique characteristics or context to the voice. However, when an AI-generated voice mimics a specific individual's voice too closely, it can create ethical and legal challenges, even if the voice was recorded by another person.

Privacy and Likeness Rights

Scarlett Johansson, like many public figures, has certain rights over her likeness, which includes her voice. These rights allow her to control and protect how her voice is used, especially in commercial contexts. If an AI-generated voice closely replicates her distinctive voice, it may infringe upon these rights, regardless of the original recording process.

Legal and Ethical Considerations

  • Impersonation vs. Emulation: While voice actors emulate, AI can replicate voices to a degree that may cross into impersonation, potentially misleading users or exploiting the original voice owner's identity.

  • Consent and Attribution: Ensuring that the original voice owner consents to the use of their likeness and that proper attribution is given are crucial steps in respecting personal rights.

  • Commercial Use and Compensation: Using a voice that closely resembles a celebrity's for commercial purposes without proper compensation or licensing can lead to legal disputes.

Balancing Innovation and Rights

As AI voice technology advances, it is essential to establish clear guidelines to navigate these complex issues. This involves protecting the rights of individuals while fostering innovation. OpenAI's decision to remove "Sky" highlights the importance of respecting personal rights in AI development.

The case of "Sky" underscores the complexities involved in using AI-generated voices. Even when recorded by a voice actor, the close resemblance to a well-known voice can raise significant ethical and legal concerns. Moving forward, it is vital for AI developers to address these issues thoughtfully, balancing technological advancements with the protection of individual rights. This approach will help ensure that AI innovations are both responsible and respectful of personal privacy.

Here is what Scarlett Johansson wrote about the matter …

Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.

After much consideration and for personal reasons, I declined the offer. Nine months later, my friends, family and the general public all noted how much the newest system named "Sky" sounded like me.

When I heard the released demo, I was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference. Mr. Altman even insinuated that the similarity was intentional, tweeting a single word "her" - a reference to the film in which I voiced a chat system, Samantha, who forms an intimate relationship with a human.

Two days before the ChatGPT 4.0 demo was released, Mr. Altman contacted my agent, asking me to reconsider. Before we could connect, the system was out there.

As a result of their actions, I was forced to hire legal counsel, who wrote two letters to Mr. Altman and OpenAI, setting out what they had done and asking them to detail the exact process by which they created the "Sky" voice. Consequently, OpenAI reluctantly agreed to take down the "Sky" voice.

In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity. I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.