Taylor Swift Taking Legal Action Against Graphic AI Pics Could Be ‘Impetus’ for Legislative Change (Exclusive)

Taylor Swift could have a legal case regarding the sexually explicit images of her likeness that spread on the internet.

“This is [an] area of the law where really technology has outpaced the law and it’s really state-specific,” attorney Neama Rahmani, who does not represent Swift, exclusively told Us Weekly on Tuesday, January 30. “Unfortunately, there is no federal law that prohibits these types of images, and there’s only a small handful of states that do prohibit it.”

There are only 10 states — including California, Florida, New York and Illinois — that have current or proposed state laws prohibiting deepfake or AI-generated images. Tennessee, where the 34-year-old pop star is a resident, does not currently have such legislation in place.

Rahmani, who lives and practices law in California, further explained that Swift could potentially file a lawsuit in the West Coast state.

“You can sue if it’s malicious and get up to $150,000 for each violation,” Rahmani said. “It really depends on the state and hopefully other states follow suit. There’s really only a minority of them right now that are taking up this issue, but we know that Swifties have a lot of power, just like Britney Spears helped change some conservatorship laws.”

He continued, “Hopefully Taylor Swift, even though this is something that is terrible, [can] be an impetus for some legislative change because it really needs to happen.”

Fake explicit images of Swift, created using artificial intelligence without her consent, spread via the social media site X in January. X took the photos down roughly 17 hours after they first appeared.

“The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning,” SAG-AFTRA said in a statement at the time. “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late.”

The statement concluded: “We support Taylor and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”

SAG-AFTRA, whose push to protect performers from AI was one of the reasons actors went on strike in 2023, also voiced support for the Preventing Deepfakes of Intimate Images Act, a bill introduced by New York Congressman Joe Morelle to combat the spread of nonconsensual, sexually explicit deepfake images.

Within days, X blocked searches for Swift’s name on the platform. A representative for X confirmed on Monday, January 29, that her name was searchable again after the explicit images were removed.

“[X will] continue to be vigilant for attempts to spread this content and will remove it wherever we find it,” a statement read.

With reporting by Christina Garibaldi
