Pornographic deepfake images of Taylor Swift are circulating online, drawing global attention back to an infamous scourge that tech platforms have long tried, and so far failed, to solve. Last week, sexually explicit and abusive fake images of Swift began circulating on X (formerly Twitter).
Her fanbase, known as "Swifties," quickly mobilized and launched a counteroffensive on the platform under the hashtag #ProtectTaylorSwift, flooding it with positive images of the pop star. Some fans also said they were reporting accounts that shared the deepfakes.
On Friday, the Screen Actors Guild released a statement calling the images of Swift "upsetting, harmful, and deeply concerning," adding that "the development and dissemination of fake images — especially those of a lewd nature — without someone's consent must be made illegal." Reality Defender, a deepfake-detection group, said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X. Some images also made their way to Meta-owned Facebook and other social media platforms.
“Unfortunately, they spread to millions and millions of users by the time that some of them were taken down,” said Mason Allen, Reality Defender’s head of growth.
The most widely shared AI-generated images were football-related, showing a painted or bloodied Swift in ways that objectified her and, in some cases, depicted violence against her deepfake persona. Brittany Spanos, a senior writer at Rolling Stone who teaches a course on Swift at New York University, said, "This could be a huge deal if she really does pursue it to court."
"Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them," X wrote in a post early Friday morning. "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed."
Meanwhile, Rep. Joe Morelle, a New York Democrat, has been pushing a bill that would criminalize sharing deepfake porn online.
“The images may be fake, but their impacts are very real,” Morelle said in a statement. “Deepfakes are happening every day to women everywhere in our increasingly digital world, and it’s time to put a stop to them.”