For Immediate Release
Contact:
Mary Anne Franks, JD, DPhil
President, Cyber Civil Rights Initiative
info@cybercivilrights.org
Please also see the CCRI Media Guide
CCRI Statement on the Passage of the TAKE IT DOWN Act (S. 146)
April 28, 2025, Washington, DC:
Today, Congress passed S. 146, the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act. President Trump has previously indicated that he will sign the bill. S. 146 has two main provisions. The first criminalizes the nonconsensual distribution of intimate images (NDII), whether authentic (sometimes referred to as “revenge porn”) or digitally manipulated (often referred to as “deepfake porn”). The second requires covered platforms to remove nonconsensual intimate visual depictions within 48 hours. While we welcome the long-overdue federal criminalization of NDII, we regret that it is combined with a takedown provision that is highly susceptible to misuse and will likely be counterproductive for victims.
As the leading US-based organization dedicated to combating image-based sexual abuse and the drafters of the first model criminal statute addressing the nonconsensual distribution of intimate images, we have long called for a federal criminal law against NDII. We are gratified that the TAKE IT DOWN Act incorporates the language of the Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act, a bipartisan bill based on CCRI’s model federal statute that has been introduced in some form in almost every session of Congress since 2016, as well as much of CCRI’s recommended language on sexually explicit digital forgeries. We have appreciated the opportunity to provide legislative analysis and feedback to Senators Cruz and Klobuchar, along with many other members of Congress, in the drafting of TAKE IT DOWN.
For too long, image-based sexual abuse has undermined the privacy, free expression, and other civil rights and liberties of vulnerable groups, particularly women, girls, and sexual minorities. While Congress enacted a federal civil remedy for NDII in 2022 (patterned after a uniform law that CCRI leadership helped draft), it has until now failed to criminalize the unauthorized distribution of intimate images. Federal criminal prohibition of this abuse is essential to supporting current survivors and deterring future perpetrators.
CCRI must, however, note its objection to the exception provided for “a person who possesses or publishes an intimate visual depiction of himself or herself,” which creates a dangerous loophole that would seemingly allow a person to disclose intimate images without consent so long as that person also appears in the image.
In addition, as noted in our March 7, 2025 press statement, CCRI has serious concerns about the constitutionality, efficacy, and potential misuse of the TAKE IT DOWN Act’s notice and removal provision. While we wholeheartedly support the expeditious removal of nonconsensual intimate content and have long called for increased legal accountability for tech platforms that choose to distribute unlawful content, CCRI objects to the notice and removal provision because it is (1) unlikely to accomplish these goals and (2) likely to be selectively and improperly used for political or ideological purposes that endanger the very communities most affected by image-based sexual abuse.
CCRI repeatedly raised its concerns about the notice and removal provision with federal lawmakers in the hope that significant revisions would be made to the bill prior to passage. While some of our suggested revisions were made, the takedown provision as passed by Congress today remains unconstitutionally vague and overbroad and lacks adequate safeguards against misuse. Below, we offer a detailed explanation of our concerns about the notice and removal provision to inform the general public and potentially affected parties.
Section 3 of the TAKE IT DOWN Act, “Notice and removal of nonconsensual intimate visual depictions,” provides that a “covered platform” that receives a “valid removal request from an identifiable individual or an authorized person acting on behalf of such individual” must remove “intimate visual depictions” within 48 hours of receipt. A valid request includes a statement that the depicted individual has a good faith belief that the depiction was published without consent; information for the covered platform to locate the depiction and to contact the depicted individual; and the signature of the depicted individual or a person authorized to act on behalf of that individual. S. 146 delegates the power to enforce these obligations to the Federal Trade Commission.
- No Safeguards Against False Complaints. While S. 146’s notice and removal provision is roughly modeled on the notice-and-takedown process of the Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512, it contains none of that law’s safeguards against false or malicious reports. The DMCA requires complainants to attest under penalty of perjury that they are authorized to act on behalf of the allegedly injured party, provides a process for counter-notice, and imposes liability on individuals who make knowing and material misrepresentations about the unlawful or lawful nature of the content.
The lack of any similar safeguards makes S. 146’s reporting process highly susceptible to abuse. Individuals and organizations could flood platforms with reports about explicit content simply because they morally disapprove of it, rather than because it is in fact NDII. Indeed, it would be entirely possible for a platform to be overwhelmed with reports about content that is not an intimate visual depiction at all. The notice and removal process can far too easily be misappropriated by disgruntled users, competitors, or other entities with agendas that have nothing to do with concerns about image-based sexual abuse.
- Overbreadth. While the criminal provisions of S. 146 are appropriately narrow and comport with constitutional requirements, the notice and removal provision is extremely broad. The criminal provisions apply only to authentic intimate visual depictions that are disclosed in violation of the individual’s reasonable expectation of privacy; inauthentic intimate visual depictions disclosed without consent; or intimate visual depictions of minors disclosed with the intent to harm or to arouse sexual desire. The criminal provision also places several additional restrictions on the prohibition of intimate visual depictions of adults: what is depicted must not have been voluntarily exposed in a public or commercial setting or be a matter of public concern, and the disclosure must either cause or be intended to cause harm. Disclosures are not prohibited if they have a legitimate medical, scientific, or educational purpose; are made in the reporting of unlawful content or unsolicited or unwelcome conduct; are made in pursuance of a legal, professional, or other lawful obligation; are part of an effort to seek support or help with respect to the receipt of an unsolicited intimate visual depiction; or are reasonably intended to assist the identifiable individual.
By contrast, S. 146’s notice and removal provision applies to any removal notice about an “intimate visual depiction” submitted by any person claiming to be the depicted individual or that individual’s authorized representative, and which claims that the depiction was published without consent. This creates a legal obligation for platforms to locate and remove, within 48 hours, an incredibly broad range of content that may be perfectly legal and constitutionally protected.
Here are some examples of depictions that platforms could be obligated to remove:
- A journalist’s photographs of a topless protest on a public street.
- Photos of a subway flasher distributed by law enforcement in an effort to locate the perpetrator.
- Any commercially produced sexually explicit content.
- Sexually explicit material of a depicted individual voluntarily distributed by that individual but falsely reported as being nonconsensually distributed.
The statute also explicitly prohibits claims against covered platforms for “good faith disabling of access to, or removal of, material claimed to be a nonconsensual intimate visual depiction based on facts or circumstances from which the unlawful publishing of an intimate visual depiction is apparent, regardless of whether the intimate visual depiction is ultimately determined to be unlawful or not,” which means that individuals or entities who may be harmed by the removal of lawful content will have no recourse against the platforms.
- Arbitrary Definition of “Covered Platforms.” S. 146’s notice and removal provision applies to websites and other online services that “primarily provide a forum for user-generated content, including messages, videos, images, games, and audio files,” but explicitly does not apply to those “that consist[] primarily of content that is not user generated but is preselected by the provider of such online service, application, or website.” It is unclear why platforms dedicated to “audio files” should be included in the definition of covered platforms for a provision aimed at nonconsensual intimate visual depictions, but it is even more troubling that platforms that “preselect” content are not included. Many websites and other online services that feature sexually explicit content are curated by providers; curation does not make the distribution of NDII any less harmful. While the provision does apply to online services whose “regular course of trade or business” is “to publish, curate, host, or make available content of nonconsensual intimate visual depictions,” the exemption for curated content is arbitrary and insupportable.
- Expansive Scope of Federal Trade Commission (FTC) Enforcement. S. 146 provides that “failure to reasonably comply with the notice and takedown obligations” will be treated as an “unfair or deceptive act or practice” subject to enforcement by the FTC. The removal provision’s lack of any safeguards against abuse and its arbitrary definition of covered platforms invite broad discretion in enforcement. This is troubling on its face, but it is particularly so at a moment when the chair of the FTC has taken unprecedented steps to politicize the agency and has explicitly promised to use the power of the agency to punish platforms and services on an ideological, as opposed to principled, basis. What is more, S. 146 states that the FTC’s jurisdiction will not be limited, as it generally is when regulating unfair or deceptive practices, to commercial entities. This is an alarming expansion of the FTC’s enforcement authority, especially under an administration that has openly expressed hostility to nonprofit organizations that do not serve its political interests.
- Giving Victims False Hope. The TAKE IT DOWN Act’s removal provision has been presented as a virtual guarantee to victims that nonconsensual intimate visual depictions of them will be removed from websites and online services within 48 hours. But given the lack of any safeguards against false reports, the arbitrarily selective definition of covered platforms, and the broad enforcement discretion given to the FTC with no avenue for individual redress and vindication, this is an unrealistic promise. Platforms confident that they are unlikely to be targeted by the FTC (for example, platforms closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII. Platforms attempting to identify authentic complaints may encounter a sea of false reports that could overwhelm their efforts and jeopardize their ability to operate at all. This would in turn benefit unscrupulous platforms that can take advantage of neutralized competitors.
Additionally, the notice and removal provision uses the term “intimate visual depiction,” which is defined by cross-reference to 15 U.S.C. § 6851, the existing federal civil “revenge porn” provision. As that provision defines the term, it would appear to apply only to authentic depictions, that is, not to “deepfakes,” leaving an apparent gap between the removal provision’s reach and the digital forgeries the Act’s title invokes.
Again, while CCRI welcomes S. 146’s long-overdue criminalization of image-based sexual abuse, we greatly regret that it includes a notice and removal provision that lacks adequate safeguards against false reports, is over- and under-inclusive, chills protected expression, invites arbitrary and selective enforcement, and gives false hope to victims.
About CCRI: CCRI’s mission is to combat online abuses that threaten civil rights and civil liberties. CCRI is the nation’s leading nonprofit working to protect vulnerable groups from image-based sexual abuse (IBSA). CCRI provides model legislation, policy guidance, legal analysis, and amicus briefs on issues relating to online privacy and free expression and has provided support to over 32,000 victims and survivors of IBSA through the CCRI Image Abuse Helpline. If you or someone you know is a victim of image-based sexual abuse, please visit the CCRI Safety Center, which offers step-by-step guidance on what to do next.
###