
Taylor Swift Is No Longer Searchable on X After Fake AI-Generated Explicit Images

Jeff Kravitz/Getty Images for MTV/Paramount Global

UPDATE: 1/30/24, 11 a.m. ET — X rep Joe Benarroch confirmed that the explicit images had been removed and Taylor Swift was searchable again as of Monday, January 29. Benarroch told TMZ that X will “continue to be vigilant for attempts to spread this content and will remove it wherever we find it.”

Original story:

Social media users on X cannot presently search Taylor Swift’s name, Us Weekly can confirm.

Anyone typing the 34-year-old pop star’s name into the X search bar and pressing enter will receive an error message reading, “Something went wrong. Try reloading.”

Lower on the page, a second note adds, “Don’t fret — it’s not your fault.”

While neither X, owner Elon Musk, nor Swift has addressed the social media incident, it comes shortly after an AI-generated photo scandal swept the site.


Fake explicit images of Swift, which were created using artificial intelligence without her consent, spread on X on Wednesday, January 24. After 17 hours, X took down the images amid reports circulating that Swift was considering legal action.

“The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning,” SAG-AFTRA said in a statement released on Friday, January 26. “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late.”


The statement concluded: “We support Taylor and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”

SAG-AFTRA — whose push to protect performers from AI was one of the reasons actors went on strike in 2023 — also voiced support for the Preventing Deepfakes of Intimate Images Act. New York Congressman Joe Morelle proposed the bill to curb the nonconsensual spread of sexually explicit, digitally fabricated images.


The White House, under President Joe Biden’s administration, is also looking into the issue.

“Of course, Congress should take legislative action,” press secretary Karine Jean-Pierre said in a press conference on Friday. “That’s how you deal with some of these issues.”

Jean-Pierre, 49, added: “We know that lax enforcement disproportionately impacts women and also girls, sadly, who are the overwhelming targets of online harassment and also abuse. There should be legislation, obviously, to deal with this issue.”

The White House had previously launched a task force in 2022 to address online harassment, which Jean-Pierre referred to on Friday as a “patchwork approach.”

“The Task Force is an interagency effort to address online harassment and abuse, specifically focused on technology-facilitated gender-based violence,” a statement read at the time. “In consultation with survivors, advocates, educators, experts from diverse fields, and the private sector, the Task Force will develop specific recommendations to improve prevention, response, and protection efforts through programs and policies in the United States and globally.”

