Taylor Swift’s Name No Longer Searchable on X After AI-Generated Explicit Photos Go Viral

Taylor Swift’s name is no longer searchable on the X platform (formerly known as Twitter) after nonconsensual sexually explicit deepfake images of the singer began circulating online earlier this week. When users type Swift’s name into the search box on X, a message appears reading, “Something went wrong. Try reloading.”

The deepfake images began circulating on X and other social media platforms on Wednesday. The fake photos were created by artificial intelligence tools that can “undress” a regular, clothed photo of someone.

The explicit images have spread after X significantly scaled back its content moderation policies and tools following the company’s acquisition by Elon Musk. The source of the images is not clear.

Citing a source close to Swift, the Daily Mail reported that the pop star is considering legal action.

“The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with,” the Daily Mail’s source said. “These images must be removed from everywhere they exist and should not be promoted by anyone. Taylor’s circle of family and friends are furious, as are her fans obviously.”

SAG-AFTRA stepped in to advocate for Swift on Friday, condemning the images and calling for new protective legislation.

“The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning,” the actors’ union said in a statement. “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal.”

“SAG-AFTRA continues to support legislation by Congressman Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again,” the statement continued. “We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”

Swift’s fans rallied en masse to defend the singer, flooding the hashtags #ProtectTaylorSwift and #TaylorSwiftAI with messages encouraging others to report the images if they see them.

One fan tweeted, “What has happened to Taylor is actually horrifying. I cannot put into words how wrong this is. Love her or hate her, there is no excuse to exploit her like that. I hope that whoever has either shared or made those images gets the most karma possible.”

Another wrote, “Claiming Taylor Swift is a billionaire doesn’t excuse sharing inappropriate AI images of her. She’s still a human being with feelings. Show respect.”

A third fan added, “Supporting artists like Taylor Swift is important because they use their platform to advocate for creative ownership and fair treatment, inspiring a new generation of artists to stand up for their rights.”

On Friday, White House press secretary Karine Jean-Pierre addressed the problem. She said, “We are alarmed by the reports of circulation of images that you just laid out — false images, to be more exact — and it is alarming. So while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and nonconsensual, intimate imagery.”

Jean-Pierre continued, “This problem is not new, and it’s one that the Biden-Harris administration has been prioritizing since day one. We have taken this very seriously.”

“Again, this is alarming to us. As you know, he launched a task force to address online harassment and abuse, and he did that just this fall. The Department of Justice launched the first national 24/7 helpline for survivors of image-based sexual abuse,” she concluded.

The post Taylor Swift’s Name No Longer Searchable on X After AI-Generated Explicit Photos Go Viral appeared first on TheWrap.