- Taylor Swift is the most searched-for celebrity in the world
- This has led to her likeness being used for artificial intelligence deepfakes
- X is attempting to crack down on inappropriate posts with her image
The internet was ablaze with controversy this week as social media behemoth X (formerly known as Twitter) took the drastic measure of blocking all searches for pop star Taylor Swift after explicit AI-generated images of the singer-songwriter went viral, causing uproar among fans and US officials alike.
That's definitely not Taylor!
In a bold move to "prioritize safety," X's head of business operations, Joe Benarroch, announced the "temporary action" following the alarming spread of the fake graphic content. Swifties worldwide were greeted with the cryptic message, "Something went wrong. Try reloading," when they searched for their idol, only to find a digital ghost town where her presence once electrified the platform.
Last week, AI-generated nude images of pop superstar Taylor Swift were produced and distributed without her consent. They circulated throughout the web, with a single post on X garnering 45 million views before the site took it down.
Imagine that!
Deepfakes, as they’ve come to be called in recent years, often target female celebrities, but with the rise of AI, it’s easier than ever for everyday people (almost always women) to be targeted.
As Slate reports, more than 143,000 deepfake porn videos were created last year, according to one estimate from independent researcher Genevieve Oh, more than in all previous years combined. That number will, in all likelihood, only continue to rise.
The scandal erupted earlier this week when the counterfeit images, viewed by millions, set off alarm bells, prompting a legion of loyal fans to launch a digital counterattack. With the battle cry "protect Taylor Swift," her supporters flooded X with authentic images and videos of the beloved artist, standing as a human firewall against the onslaught of fakes.
The issue is bigger than any one celebrity
Back in 2017, when activists and the first people affected by AI-assisted deepfakes, such as famous actors and singers, began raising the alarm about this issue, they gave us a roadmap for what would happen:
They said this technology would be used to create child sexual abuse imagery, and that it would have damaging physical and psychological effects not just on famous women, but on everyday women and girls. All of those predictions have come true in disturbing ways, especially over the last year.
There was a tenfold increase in AI-generated nude images online within a seven-month period in 2023. We also see preteen and adolescent girls, who already disproportionately experience what some people call revenge porn, now facing the circulation of fake nude images in their school environments as well.
So there has been an exponential increase in victims, and an increase in the breadth of victimization.
X, stepping up to the plate, declared an unequivocal stance against non-consensual nudity, stating, "We have a zero-tolerance policy towards such content." The platform's teams sprang into action, scrubbing the site clean of the offending material and cracking down on the rogue accounts responsible.
The ripple effect of the scandal reached the White House, with press secretary Karine Jean-Pierre labeling the incident "alarming" and underscoring the disproportionate impact on women and girls.
She called for a legislative hammer to smash the misuse of AI technology on social media, while also urging platforms to enforce their own rules against the spread of misinformation and non-consensual imagery. As the dust settles, exactly when X initiated the search blackout for Swift remains shrouded in mystery.
However, one thing is crystal clear: the battle lines have been drawn in the digital sand, with a clarion call for new laws to criminalize the creation of deepfake images echoing through the halls of power.