OpenAI’s recent controversy surrounding the voice model “Sky” has sparked widespread discussion. Initially, I hesitated to delve into the topic, but upon reading OpenAI’s blog, I found myself compelled to share my thoughts.


For those who don’t know what I’m talking about, here is a rough overview.

In September 2023, OpenAI approached Scarlett Johansson, known for her iconic portrayal of an emotional companion AI in the movie “Her,” about using her voice for their system. Johansson declined the offer. Despite this, the voice model “Sky” was released that same September, and it gained prominence during OpenAI’s demo of ChatGPT and GPT-4 updates this month. Many listeners mistook it for Johansson’s voice, leading to confusion and speculation. After she asked OpenAI for an explanation, the company paused the use of Sky. In her statement, she also revealed that Sam Altman, OpenAI’s CEO, had asked her to reconsider just two days before the demo.

I think the way they handled this case was wrong on so many levels.

While OpenAI maintains that they did not directly copy Johansson’s voice, the similarities are hard to ignore. Johansson’s voice is distinctive, and when you listen to “Sky,” you can detect the resemblance. Reinforcing that impression, on the demo day Altman tweeted a single word, “Her,” referring to the movie.

Given the way they communicated with her and the CEO’s tweet, it’s entirely understandable that she lost trust and confidence in their actions. As we all know, AI-generated products are rarely truly original; they often draw from existing sources, whether a photograph, artwork, or written content. If I were in her position, I too would have harbored doubts. Did they not only select someone with a voice similar to Johansson’s, but also match the pitch and pace to hers? Did they use fragments of her voice data? And what about the actress who actually provided the voice? It must have been disappointing for her to see her work pulled after only a few days in the spotlight.

As we navigate the uncharted waters of AI, let’s remember that authenticity isn’t just about replicating voices or faces. It’s about understanding the human experience, acknowledging our fears, and building bridges of trust. Technology always needs to be considerate of humanity. After all, we’re not just ones and zeros; we’re storytellers too. The recent SAG-AFTRA strike remains fresh in our collective memory — didn’t the industry learn that transparency is the key to building trust? The more concerns grow, the louder the call for honesty becomes. Perhaps the tech industry should take a leaf out of the performers’ playbook. Hey, OpenAI, maybe ask ChatGPT for advice on effective communication strategies next time?
