As technology continues to advance, deepfake pornography presents growing concerns for women and girls. In 2023, the total number of deepfake videos online was 95,820, up 550% from 2019, according to a report by Home Security Heroes, a group that researches best practices for online security. Pornography made up 98% of them.
What are Deepfakes?
Deepfakes are hyper-realistic digital manipulations of audio, video, or images using Artificial Intelligence, which can deceive audiences by presenting fabricated or altered content, often blurring the line between reality and fiction.
Deepfakes have a particularly harmful impact on women and girls, exacerbating existing problems of harassment and online abuse. Indeed, 99% of the individuals targeted in deepfake pornography are women. Deepfakes have become an additional tool for cyber harassment, making it easier for perpetrators to create and distribute manipulated content that is sexually explicit or designed to harm a woman's reputation. This adds a new dimension to "revenge porn," where deepfake technology is used to fabricate compromising material that never actually existed but can still have damaging effects.
Women who are targeted may face relentless harassment, threats of violence, and widespread sharing of the fake content across platforms, often feeling powerless to stop the viral spread. Deepfakes, like other forms of cyber sexual abuse, can have detrimental consequences for women’s safety, privacy, and well-being. Women feel violated, even if the video or image is fake, because it is often indistinguishable from reality.
Is Deepfake Pornography Legal?
The legal landscape around deepfake pornography is still developing, with new laws and court rulings emerging to address the unique problems deepfakes pose. Currently, no comprehensive federal legislation in the United States bans or even regulates deepfakes. However, several recently introduced bills would address deepfake pornography:
- The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (TAKE IT DOWN Act), introduced 06/18/2024, passed the Senate and is currently pending in the House.
- The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act of 2024), introduced 01/30/2024, passed the Senate and is currently pending in the House.
- The Preventing Rampant Online Technological Exploitation and Criminal Trafficking Act of 2024 (PROTECT Act of 2024) was introduced on 01/31/2024.
Many states are in the process of enacting laws that criminalize, or establish a civil right of action against, the dissemination of "intimate deepfakes" depicting adults. According to the consumer advocacy organization Public Citizen, which recently launched a legislation tracker, 21 states have now enacted at least one such law covering nonconsensual intimate deepfakes of adults (as opposed to minors). Still, the legal system often struggles to keep pace with technological innovation, leaving victims with few avenues for recourse.
Contact Us Today
The rise of this technology amplifies the existing vulnerabilities women face online, and few legal or social protections are currently in place to effectively combat the harm. Addressing these issues will require not only stronger laws and policies but also cultural shifts to hold perpetrators accountable and protect women from the abuses made possible by AI-generated media. Even with limited legal protections, legal action can still be pursued to remove harmful content and hold those responsible accountable.
If you or someone you know has been the victim of deepfake pornography, don’t wait to take action. This malicious form of harassment can have devastating personal and professional consequences, but you don’t have to face it alone. Contact an experienced attorney who specializes in cases of digital harassment and deepfake content. Lindsay Lieberman, Esq. (licensed in NY, NJ and RI) and Sydney Rendel, Esq. (licensed in FL) can help you understand your legal options, take steps to have harmful content removed, and hold those responsible accountable.