Scarlett Johansson Calls Out OpenAI for Mimicking Her Voice in ChatGPT
OpenAI’s new voice assistant, Sky, has sparked a fresh wave of controversy — and it’s not just about tech. Scarlett Johansson says the voice sounds too familiar. In fact, a little too much like her.
In a candid and firm rebuke, Johansson has accused OpenAI of using a voice eerily close to hers in its ChatGPT voice assistant, despite previously declining to participate in the project. The actress, widely known for voicing an AI in the 2013 film Her, isn’t amused by what she sees as a serious overstep.
A Voice That Felt Too Close
Johansson, 39, says she was stunned when she first heard ChatGPT’s Sky voice.
She didn’t mince words either. The resemblance was, in her view, unmistakable. And it wasn’t accidental, she believes. Months ago, the actress declined OpenAI’s request to lend her voice to the assistant. Then suddenly, here was a voice that, to her and many others, sounded nearly identical.
Adding to the storm, OpenAI CEO Sam Altman shared a cryptic one-word tweet around Sky’s release: “her”. That single post struck many as a clear nod to Johansson’s role in Her, where she portrayed an emotionally intelligent AI.
And yes, that did not sit well.
Johansson’s Fight for Control Over Her Likeness
Scarlett Johansson isn’t new to fighting for boundaries in the digital world.
Earlier this year, she pushed back hard after an AI-generated deepfake video of her began circulating online. That incident added fuel to her argument: AI is advancing too fast, and it’s leaving humans — and human rights — behind.
Speaking with People at the time, she said it bluntly: “It is terrifying that the US government is paralysed when it comes to passing legislation that protects all of its citizens against the imminent dangers of AI.”
This wasn’t just about personal violation. For Johansson, this issue touches the core of what it means to be human — and what it means to perform.
“I don’t believe the work I do can be done by AI,” she told The Sunday Times. “I don’t believe the soulfulness of a performance can be replicated.”
That emotional nuance, she says, just can’t be copied by machines.
OpenAI Pulls the Plug—But Denies Intent
After Johansson’s legal team got involved, OpenAI acted quickly: it pulled the Sky voice and issued a statement.
The company insisted it never intended to mimic her voice, claiming Sky was the result of work with a professional voice actor — not a synthetic copy of Johansson’s.
Still, the timing, the similarities, and that suspicious tweet stirred doubt.
Here’s what OpenAI said in a statement:
“Sky’s voice is not Scarlett Johansson’s, and it was never intended to resemble hers.”
But the backlash had already snowballed.
Why It Matters More Than Just One Voice
This isn’t only about Scarlett Johansson.
It’s about how AI companies treat human voices, faces, and likenesses. Where’s the line between inspiration and imitation? Between a tribute and theft?
Voice actors and public figures are increasingly concerned that AI might use their work, style, or even their identity — without permission. Johansson’s case just put a celebrity spotlight on it.
Let’s break it down with a quick table of what’s at stake:
| Issue | Human Impact |
|---|---|
| AI-generated deepfakes | Damages trust, spreads false information |
| Voice imitation | Threatens the careers of voice actors and performers |
| Lack of legislation | No real protection against AI misuse |
| Consent and compensation | Largely unregulated in AI training models |
It’s messy. And the laws just haven’t caught up yet.
The Real-World Fallout Is Already Happening
It’s not science fiction anymore.
Johansson’s clash with OpenAI comes as tech companies rush to roll out voice assistants, AI influencers, and synthetic media. Many users are excited. Some are wary. But almost everyone agrees — this tech is moving faster than lawmakers can write policies.
A single viral AI video or voice clone can reshape someone’s image overnight. Careers can be derailed and reputations tainted.
And the public? They’re left guessing what’s real and what’s not.
The “Her” Connection: Art Imitating Life a Little Too Well?
This story took a turn for the strange thanks to one ironic detail: Johansson already played the voice of a sentient AI.
Her role in Her was meant to be fiction — a meditation on intimacy and technology. But more than a decade later, here we are. A voice that feels intimate, eerily familiar, now lives in millions of devices.
And the film’s themes? They’ve jumped off the screen.
• Altman’s tweet — “her” — seemed to blur the line even further between homage and infringement.
• Tech fans celebrated it, critics raised eyebrows, and Johansson saw it as a red flag.
Legal Action: What Happens Next?
Johansson isn’t backing down.
Her team has already taken legal steps, pushing OpenAI to pull Sky and disclose how the voice was created. She is also pressing for broader accountability, not just for herself, but for all artists and public figures who might be at risk.
In the meantime, lawmakers are under pressure to finally step up. AI legislation is crawling in the U.S., while Europe and China move faster on regulation.
The U.S. Congress has held hearings on AI misuse — but no concrete national laws exist yet that would cover cases like Johansson’s.
That vacuum leaves plenty of room for more controversy.