July 30, 2024

RELEASE: AUCHINCLOSS INTRODUCES BIPARTISAN BILL TO TACKLE RISE IN NON-CONSENSUAL DEEPFAKES ON SOCIAL MEDIA PLATFORMS

WASHINGTON, D.C. – Today, U.S. Representatives Jake Auchincloss (D-MA-04) and Ashley Hinson (R-IA-02) are introducing the Intimate Privacy Protection Act to tackle the steep rise in non-consensual deepfake pornography and the misinformation spread through AI-generated content on online platforms. 98% of online deepfake imagery is deepfake pornography, and 99% of that content targets women.

The New York Times has documented the “epidemic” of “widely available ‘nudification’ apps” used to create convincing deepfake pornography of female teenagers. The Intimate Privacy Protection Act would require online platforms to take good-faith action to address deepfakes that cause significant harm to both individual victims and our democracy in order to maintain their protections under Section 230.

Section 230’s liability shield has left online platforms with little incentive to ensure users’ safety or to address harmful content on their platforms. The Intimate Privacy Protection Act conditions the applicability of Section 230’s liability shield on a platform implementing a duty of care: a set of basic steps that requires companies to be responsive to complaints about cyberstalking, intimate privacy violations, and digital forgeries.

The bill requires that this duty of care include implementation of: 

  • A process to prevent, to the extent practicable, cyberstalking, intimate privacy violations, and digital forgeries.
  • A clear and accessible process to report instances of these harms, as well as a process for investigating them.
  • A process to remove, within 24 hours, information that the provider knows or has reason to know constitutes one of these harms.
  • Data logging requirements to ensure victims’ access to data for legal proceedings.
  • A process for removal or blocking of content determined unlawful by a court.

“The rise in deepfake pornography targeting women and teenage girls is deeply disturbing and threatens long-term psychological damage. Trillion-dollar social media corporations should have a duty of care to prevent intimate image violations and swiftly remove such content within 24 hours,” said Rep. Auchincloss. “Congress must prevent these corporations from evading responsibility over the sickening spread of malicious deepfakes and digital forgeries on their platforms.”

“It has become way too easy for bad actors to create and circulate inappropriate deepfake images online with damaging consequences for the victims. Big Tech companies shouldn’t be able to hide behind Section 230 if they aren’t protecting users from deepfakes and other intimate privacy violations,” said Rep. Hinson. “As parents, Rep. Auchincloss and I found common ground on this important issue and I look forward to working with him to hold Big Tech accountable and protect users, especially minors, from these gross violations.” 

The Intimate Privacy Protection Act is endorsed by the Cyber Civil Rights Initiative (CCRI), National Organization for Women (NOW), National Network to End Domestic Violence (NNEDV), National Alliance to End Sexual Violence (NAESV), Sexual Violence Prevention Association (SVPA), and the Electronic Privacy Information Center (EPIC).

“Victims across the country have had their lives deeply impacted by deepfake pornography. Tech companies need to take action to prevent this traumatic and devastating harm. The Intimate Privacy Protection Act is vital for women, minorities, and children to be safe online,” said Omny Miranda Martone, Founder & CEO, Sexual Violence Prevention Association. “As a victim, I know firsthand how important this bill would have been when I was targeted with deepfake pornography. The Sexual Violence Prevention Association (SVPA) is proud to endorse this legislation that will ignite tech companies to prevent non-consensual deepfake pornography.”

“Deepfakes and other AI content steal our bodily autonomy, with new ways to exploit and harass women that can spread like wildfire online and cause lasting damage to victims,” said Christian F. Nunes, National President of the National Organization for Women. “We insist on a duty of care—a responsibility to protect—from our schools, and we should expect nothing less from companies whose platforms have such influence and impact on our lives.”

“The time is now to reform Section 230. For too long, online platforms have been shielded from liability for online abuse that we know silences victims and ruins lives. Nearly every industry owes basic duties to prevent foreseeable harm; with this bill, so will the tech industry. This bill imposes a well-defined duty of care on online platforms to prevent, investigate, and remove cyberstalking, nonconsensual intimate images, and digital forgeries,” said Danielle Keats Citron, Vice President of the Cyber Civil Rights Initiative. “The bill also corrects an overbroad judicial interpretation of Section 230 that lets platforms solicit or encourage online activity without accountability. With this bill, online intermediaries will be responsible not only for online speech activity they helped create or develop but also for online speech activity that they solicit or encourage. This is the bill that we need to protect civil rights and liberties online.”

“Cyberstalking, deep fakes, and the non-consensual distribution of intimate images are some of the most dangerous threats to privacy that individuals face online. These violations can cause irreparable damage, and internet platforms should not be immune if they knowingly enable such harms,” said Alan Butler, Executive Director, Electronic Privacy Information Center. “That is why EPIC supports the Intimate Privacy Protection Act. This bill would make important changes to Section 230 to ensure that platforms establish clear processes to address these threats.”