Policy & Regulation

AI Videos Fuel Influencer Drama, Raising Legal Threats Over Facial Likeness


When I saw a clip of a celebrity dancing in a kitchen that never happened, I realized AI-generated videos are already slipping into the fights that dominate influencer culture. Kat Tenbarge wrote for Spitfire News earlier this month that the tech can slap a famous face onto almost any scene, and that seems to create an “almost constant potential threat of legal action” whenever the footage pops up without permission. The danger isn’t just theory.

Scarlett Johansson has already "lawyered up" after her likeness was used without consent, showing how quickly a short clip can turn into a legal fight. The hype around AI video grabs headlines, but the deeper risk is that a recognizable face may soon need the kind of legal protection we currently reserve for trademarks. The next legal battle isn't about a new platform or algorithm; it's about the very features that make us recognizable.

Influencers and their followers are now walking a tightrope where a single misused frame could bring real-world fallout.

As Kat Tenbarge chronicled in Spitfire News, unauthorized AI videos carry a near-constant potential for legal action, and celebrities like Scarlett Johansson have lawyered up over use of their likeness. But unlike AI copyright infringement allegations, which have generated numerous high-profile lawsuits and nearly constant deliberation inside regulatory agencies, few likeness incidents have escalated to that level -- perhaps in part because the legal landscape is still in flux.

What happens next

When SAG-AFTRA thanked OpenAI for changing Sora's guardrails, it used the opportunity to promote the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, a years-old attempt to codify protections against "unauthorized digital replicas." The NO FAKES Act, which has also garnered support from YouTube, would introduce nationwide rights to control the use of a "computer-generated, highly realistic electronic representation" of a living or dead person's voice or visual likeness.


Synthetic voices raise parallel questions, and we may be entering a period when an AI-generated voice can trigger a genuine copyright fight. The track "Heart on My Sleeve," which mimics Drake's vocals, shows just how easy convincing impersonation has become, and listeners are already asking who owns that sound. Here too, every unauthorized clip seems to carry the whiff of a possible lawsuit.

Scarlett Johansson's recent legal threats over the misuse of her likeness are a reminder that the entertainment world is moving fast on this front. The Stepback newsletter calls it the next legal frontier, pointing out that facial and vocal clones raise questions current law can't quite answer. Still, it's unclear whether courts will settle on a uniform rule or whether the industry will simply police itself.

What's clear is that creators, brands, and platforms now have to walk a finer line, balancing new tools against respect for image rights. Until courts or Congress offer solid guidance, the push-and-pull between AI capabilities and legal safeguards will continue.

Common Questions Answered

What legal risks do AI‑generated videos pose for celebrities like Scarlett Johansson according to the Spitfire News article?

The article warns of an "almost constant potential threat of legal action" when a celebrity's facial likeness is used without permission. Scarlett Johansson has already "lawyered up" to challenge unauthorized AI videos, highlighting the risk of lawsuits and possible damages for infringers.

How does Kat Tenbarge describe the role of AI videos in influencer drama?

Kat Tenbarge reports that AI‑generated videos are being wielded as "ammunition" in influencer feuds, turning deepfakes into a new kind of weapon. The technology makes it easy to stitch a celebrity's face onto any scene, escalating conflicts and prompting legal threats.

Which synthetic‑voice example does the article cite as sparking a potential copyright dispute, and which artist is involved?

The article references the AI‑generated track "Heart on My Sleeve," which mimics the vocal style of Drake. This example illustrates how easily a synthetic voice can be produced, leading listeners and rights holders to question ownership and possible copyright infringement.

Why are likeness‑related lawsuits described as less common than AI copyright infringement cases in the article?

While AI copyright infringement has generated numerous high-profile lawsuits and ongoing regulatory debate, the article notes that likeness claims remain relatively few. It attributes this disparity to a legal landscape around facial deepfakes that is still in flux, with the statutes governing personal image rights still evolving.