'Protect Taylor Swift': Fans help get explicit AI fakes taken down

Swift's fans are busy reporting the explicit content and posting actual pictures of the pop star in hopes of making the fake images tougher to find.

Taylor Swift's loyal fans are swarming social media to get fake pornographic pictures of the singer taken down. 

Shortly after the images, which were apparently created with artificial intelligence, went viral, fans joined the online campaign to "protect Taylor Swift."

They have been mass-reporting the explicit posts and countering them with authentic photos of the singer to make the fakes harder to find.

Without naming Swift, X appeared to respond to the campaign to get the images taken down. 

"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content," a statement from the X Safety account says. "Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed."

The AI-generated photos, known as "deepfakes," show Swift in various explicit positions at a Kansas City Chiefs game, a reference to her relationship with the team's tight end Travis Kelce. 

The nonconsensual photos garnered 27 million views and over 26,000 comments in 19 hours before the account that posted them was suspended, according to reports. However, the images continue to circulate on other accounts.

It is not immediately clear who created the images, but a watermark suggests they came from a years-old website known for publishing fake nude photos of celebrities, according to reports. Part of the site carries the label "AI deepfake," NBC reported.

The spread of the images underscores the dangers of AI and its potential to create convincing and damaging fake material.

President Joe Biden signed an executive order in October to regulate AI and manage its risks, including protecting against the use of nonconsensual intimate imagery of real individuals.

Last week, Reps. Joseph Morelle, D-N.Y., and Tom Kean, R-N.J., reintroduced the Preventing Deepfakes of Intimate Images Act, a bill that would make sharing such fake explicit images a federal crime. The bill has not yet come up for a vote.

SEE MORE: Biden issues executive order to enhance government AI risk monitoring

