The number of deepfake images has surged in recent years as the technology used to produce them has become more accessible and easier to use, researchers said. A 2019 report by AI company DeepTrace Labs found that such images were overwhelmingly weaponized against women, with Hollywood actors and South Korean pop singers among the most frequent targets.
Swift's fans are quick to mobilize in support of their artist, especially in cases of wrongdoing, says Brittany Spanos, a senior writer at Rolling Stone who teaches a course on Swift at New York University.
“This could be a huge deal if you actually take it to court,” she said.
Spanos says the deepfake pornography case echoes others Swift has faced in the past, pointing to her 2017 lawsuit against a radio DJ who allegedly groped her. Jurors awarded Swift $1 in damages, a sum her attorney, Douglas Baldridge, described as "a single symbolic dollar, the value of which is immeasurable to all women in this situation," in the midst of the #MeToo movement. (The $1 award later became a trend, echoed in Gwyneth Paltrow's 2023 countersuit against a skier.)
When reached for comment on the fake images of Swift, X pointed to a post from its safety account saying the company strictly prohibits the sharing of non-consensual nude images on its platform. The company has sharply cut back its content-moderation teams since Elon Musk took over the platform in 2022.
"Our teams are actively removing all identified images and taking appropriate action against the accounts responsible for spreading them," the company wrote in the post. "We are closely monitoring the situation to ensure that any further violations are immediately addressed and the content is removed."
Meanwhile, Meta said in a statement that it strongly condemns “the content that appeared across various Internet services” and worked to remove it.
“We continue to monitor our platforms for this violating content and will take appropriate action as needed,” the company said.
A representative for Swift did not immediately respond to a request for comment.
Allen said the researchers are 90 percent confident that the images were created by diffusion models, a type of generative AI model that can produce new, photorealistic images from written prompts. Among the most popular are Stable Diffusion, Midjourney and OpenAI's DALL-E. Allen's group did not attempt to identify the source.
Microsoft, which offers an image creation tool based in part on DALL-E, said it was investigating whether its tool had been abused. Like many other commercial AI services, it said it does not allow “adult or non-consensual intimate content, and any repeated attempts to produce content that violates our policies may result in loss of access to the service.”
Asked about the Swift deepfakes in an interview with NBC Nightly News, Microsoft CEO Satya Nadella told host Lester Holt that much remains to be done in putting AI safeguards in place and that "we have to move quickly on this."
“This is certainly alarming and terrible, so yes, we have to act,” Nadella said.
Midjourney, OpenAI, and Stable Diffusion-maker Stability AI did not immediately respond to requests for comment.
Federal lawmakers who have introduced bills to further restrict or criminalize deepfake pornography said the incident shows why the United States needs better protections.
"For years, women have been victims of non-consensual deepfakes, so what happened to Taylor Swift is more common than most people realize," said Yvette Clarke, a New York Democrat who has introduced legislation that would require creators to place a digital watermark on deepfake content.
"Generative AI helps create better deepfakes at a fraction of the cost," Clarke said.
Joe Morelle, another New York Democrat who is pushing a bill that would criminalize sharing deepfake pornography online, said what happened to Swift was troubling and has become increasingly widespread online.
"The images may be fake, but their effects are very real," Morelle said in a statement. "Deepfakes happen every day to women everywhere in our increasingly digital world, and it's time to put a stop to them."
AP