1 in 6 Congresswomen Targeted by AI-Generated Sexually Explicit Deepfakes
More than two dozen members of Congress have been the victims of sexually explicit deepfakes – and the overwhelming majority of those affected are women, according to a new study that highlights the stark gender disparity in this technology and the growing risks to women’s participation in politics and other forms of civic engagement.
The American Sunlight Project (ASP), a think tank that researches disinformation and advocates for policies that promote democracy, released findings on Wednesday identifying more than 35,000 mentions of nonconsensual intimate imagery (NCII) depicting 26 members of Congress – 25 women and one man – that were recently found on deepfake websites. Most of the imagery was quickly removed as researchers shared their findings with the affected members of Congress.
“We need to reckon with this new environment and the fact that the internet has opened up so many of these harms that disproportionately target women and marginalized communities,” said Nina Jankowicz, an online disinformation and harassment expert who founded the American Sunlight Project and is an author of the study.
Nonconsensual intimate imagery, known colloquially as deepfake porn (though advocates prefer the former term), can be created using generative AI or by overlaying headshots onto media of adult performers. There is currently limited policy restricting its creation and spread.
ASP shared the first-of-its-kind findings exclusively with The 19th. The group collected the data in part by building a custom search engine to find members of the 118th Congress by first and last name, abbreviations, or nicknames on 11 of the most popular deepfake sites. Neither party affiliation nor geographic location had an effect on the likelihood of being targeted for abuse, though younger members were more likely to be victimized. The largest factor was gender, with women members of Congress 70 times more likely than men to be targeted.
ASP did not release the names of the lawmakers depicted in the imagery, in order to avoid encouraging searches. The group did contact the offices of everyone affected to alert them and offer resources on online harms and mental health support. The authors of the study note that in the immediate aftermath, imagery targeting most members was entirely or almost entirely removed from the sites – a fact they are unable to explain. The researchers noted that such removals do not prevent the material from being shared or uploaded again. In some cases involving lawmakers, search result pages remained indexed on Google despite the content being largely or entirely removed.
“Such removals may be coincidental. Regardless of what exactly led to removal of this content – whether ‘cease and desist’ letters, claims of copyright infringement, or other contact with the sites hosting deepfake abuse – it highlights a large disparity of privilege,” according to the study. “People, particularly women, who lack the resources afforded to members of Congress would be highly unlikely to achieve this rapid response from the creators and distributors of AI-generated NCII if they initiated a takedown request themselves.”
According to the study’s initial findings, nearly 16 percent of all women currently serving in Congress – or about 1 in 6 congresswomen – are victims of AI-generated nonconsensual intimate imagery.
Jankowicz has been the target of online harassment and threats for her domestic and international work dismantling disinformation. She has also spoken publicly about being a victim of deepfake abuse – a fact she discovered through a Google Alert in 2023.
“You can be made to appear in these compromising, intimate situations without your consent, and those videos – even if you were to, say, pursue a copyright claim against the original poster, as in my case – proliferate around the internet without your control and without any sort of consequence for the people who are amplifying or creating deepfake porn,” she said. “That continues to be a risk for anyone who is in the public eye, who participates in public discourse, but in particular for women and for women of color.”
Image-based sexual abuse can have devastating mental health consequences for victims, who include everyday people not involved in politics – including children. In the past year, there have been reports of high school girls being targeted with image-based sexual abuse in states like California, New Jersey and Pennsylvania. School officials have had varying degrees of response, and the FBI has issued a new warning that sharing such imagery of minors is illegal.
The full impact of deepfakes on society is still coming into focus, but research already shows that 41 percent of women between the ages of 18 and 29 self-censor to avoid online harassment.
“That is a hugely powerful threat to democracy and free speech, if we have almost half of the population silencing themselves because they’re scared of the harassment they could experience,” said Sophie Maddocks, research director at the Center for Media at Risk at the University of Pennsylvania.
There is no federal law that establishes criminal or civil penalties for someone who generates and distributes AI-generated nonconsensual intimate imagery. About a dozen states have enacted laws in recent years, though most include civil penalties rather than criminal ones.
AI-generated nonconsensual intimate imagery also opens up threats to national security by creating conditions for blackmail and geopolitical concessions. That could have ripple effects on policymakers regardless of whether they are the direct target of the imagery.
“My hope here is that members are pushed into action when they recognize that it’s not just affecting American women, it’s affecting them,” Jankowicz said. “It’s affecting their own colleagues. And this is happening simply because they are in the public eye.”
Image-based sexual abuse is a unique risk for women running for office. Susanna Gibson narrowly lost her competitive legislative race after a Republican operative shared nonconsensual recordings of sexually explicit livestreams featuring the Virginia Democrat and her husband with The Washington Post. In the months after her loss, Gibson told The 19th that she heard from young women who were discouraged from running for office out of fear that intimate images would be used to harass them. Gibson has since started a nonprofit dedicated to fighting image-based sexual abuse and an accompanying political action committee to support women candidates against violations of intimate privacy.
Maddocks has studied how women who speak out publicly are more likely to experience digital sexual violence.
“We have this much longer ‘women should be seen and not heard’ pattern that makes me think of Mary Beard’s writing and research on the idea that womanhood is antithetical to public speech. So when women speak publicly, it’s almost like, ‘OK. Time to shame them. Time to strip them. Time to get them back in the house. Time to shame them into silence.’ And that silencing and that shaming motivation … we have to understand that in order to understand how this harm manifests as it relates to congresswomen.”
ASP is urging Congress to pass federal legislation. The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, also known as the DEFIANCE Act, would allow people to sue anyone who creates, shares or receives such imagery. The Take It Down Act would include criminal liability for such activity and require tech companies to take down deepfakes. Both bills have passed the Senate with bipartisan support, but must navigate concerns around free speech and harm definitions – typical hurdles for tech policy – in the House.
“It would be a dereliction of duty for Congress to let this session close without passing at least one of these bills,” Jankowicz said. “It is one of the ways that the harm of artificial intelligence is actually being felt by real Americans right now. It’s not a future harm. It’s not something we have to imagine.”
In the absence of congressional action, the White House has worked with the private sector to devise creative solutions to curb image-based sexual abuse. But critics are not optimistic about Big Tech’s capacity to regulate itself, given the history of harm caused by its platforms.
“It is so easy for perpetrators to create this content, and the signal is not just to the individual woman being targeted,” Jankowicz said. “It’s to women everywhere, saying, ‘If you take this step, if you raise your voice, this is a consequence that you might have to deal with.’”
If you have been a victim of image-based sexual abuse, the Cyber Civil Rights Initiative maintains a list of legal resources.
This article was originally published on The Markup and is republished under a Creative Commons Attribution-NonCommercial-NoDerivatives license.