6 June 2025

The Take It Down Act Becomes Law: Combatting Revenge Porn And Deepfakes In The Digital Age

Pryor Cashman LLP

Contributor

A premier, midsized law firm headquartered in New York City, Pryor Cashman boasts nearly 180 attorneys and offices in both Los Angeles and Miami. From every office, we are known for getting the job done right, and doing it with integrity, efficiency and élan.

Public debates about the costs and benefits of open, online public forums and, more recently, artificial intelligence, are commonplace. But for anyone confronted with the unauthorized posting of intimate images, online harassment, or the targeting of minors through sexually explicit images, these technological developments have led to concrete harm and distress. Unfortunately, while a patchwork of state laws attempted to provide some semblance of a remedy, in reality, victims had few avenues to stop the proliferation of harmful images. On May 19, 2025, President Trump signed the Take It Down Act (47 U.S.C. § 223(h) et seq.) into law in an attempt to combat these problems. Enjoying rare bipartisan support, the Take It Down Act (the "Act"), introduced by Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN), is designed to provide a federal legislative solution to the increasingly prevalent online posting and sharing of so-called "revenge porn" and "deepfake" (artificial intelligence-generated) pornography by imposing criminal penalties and creating obligations for online platforms to remove offending content.

The passage of the Act comes after a surge in the creation of nonconsensual pornographic images, including images of high-profile celebrities such as Taylor Swift and political figures such as Representative Alexandria Ocasio-Cortez, as well as of private individuals, including in incidents of school bullying. It also follows legislation passed by several states, including Florida and California, providing both civil and criminal remedies for victims of these offenses.

The Act criminalizes the publication of nonconsensual intimate images — whether real or created using technology such as generative artificial intelligence — and requires sites where those images may appear to respond to requests for their removal. While much remains to be seen in how the Act will be enforced, it will no doubt meaningfully change the landscape for victims and online providers alike.

Criminal Provisions

Section 2 of the Take It Down Act imposes criminal penalties on those who engage in "intentional disclosure of nonconsensual intimate visual depictions[.]" Specifically, the Act criminalizes two categories of conduct: the publication of real intimate images without the subjects' consent and the publication of "digital forgeries," i.e., technologically created intimate images that appear to be, but are not, depictions of real people.

First, with regard to adult victims, the Act makes it a crime, with limited exceptions, to publish authentic intimate images without the depicted individuals' consent or to publish technologically created images — i.e., those "created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means" — that are "indistinguishable from an authentic visual depiction of the individual[.]"

These new crimes appear to have six elements: (1) knowing publication of "an intimate visual depiction of an identifiable individual" or of a "digital forgery of an identifiable individual"; (2) publication "in interstate or foreign commerce" "us[ing] an interactive computer service"; (3) that an authentic image was "obtained or created" under circumstances in which the offending person knew or should have known that the "individual had a reasonable expectation of privacy" or, in the case of a digital forgery, that the image was "published without the consent of the identifiable individual"; (4) that the individual depicted in either case did "not voluntarily expose[]" themselves "in a public or commercial setting"; (5) that the depiction is "not a matter of public concern"; and (6) that the depiction is "intended to cause harm" or does, in fact, cause harm, including "psychological, financial, or reputational" harm. A criminal violation of these provisions of the Act may be punished by up to two years' imprisonment.

The Act also alters the law regarding the publication of images of minors, providing that it is a crime for "any person, in interstate or foreign commerce, to use an interactive computer service to knowingly publish" either an "intimate visual depiction" of a minor or a "digital forgery of an identifiable individual who is a minor" with intent to "(i) abuse, humiliate, harass, or degrade the minor; or (ii) arouse or gratify the sexual desire of any person." A criminal violation of these provisions of the Act may be punished by up to three years' imprisonment.

Notably, the Act also criminalizes threats to publish either nonconsensual intimate images or digital forgeries, of both adults and minors, "for the purpose of intimidation, coercion, extortion, or to create mental distress."

Requirements for Online Platforms

The Act also mandates that, within one year of its enactment, by May 2026, certain online sites and platforms must establish a process by which an individual depicted in either a nonconsensual intimate image or in a digital forgery may notify the platform that such an image has been published without his or her consent and may submit a request for its removal. Whatever notice and removal processes are developed by online providers, those sites are required to provide "clear and conspicuous notice" of their processes, in "easy to read...plain language[.]"

Where an individual submits a valid removal request — which must include identification of the depiction at issue and a statement of the basis for treating it as nonconsensual — online platforms are required to remove the images and to "make reasonable efforts to identify and remove any known identical copies" within 48 hours.
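
For covered platforms, these obligations amount to a deadline-driven intake workflow: validate the request, remove the identified depiction, and make reasonable efforts to find known identical copies, all within 48 hours. The minimal sketch below is purely illustrative and not drawn from the Act's text; the names (RemovalRequest, process_request, find_identical_copies) and the validation fields are assumptions about how a platform might structure such a system, not requirements the statute prescribes.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Illustrative only: neither these names nor this structure come from the
# Act itself; they are assumptions about how a platform might comply.

@dataclass
class RemovalRequest:
    requester: str           # the identifiable individual (or an authorized agent)
    content_url: str         # identification of the depiction at issue
    statement_of_basis: str  # statement that the depiction is nonconsensual
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def is_valid(self) -> bool:
        # A valid request identifies the depiction and states a basis
        # for treating it as nonconsensual.
        return bool(self.content_url and self.statement_of_basis)

    def removal_deadline(self) -> datetime:
        # The Act gives covered platforms 48 hours from a valid request.
        return self.received_at + timedelta(hours=48)

def remove_content(url: str) -> None:
    print(f"removed {url}")  # stand-in for the platform's real removal API

def find_identical_copies(url: str) -> list[str]:
    # Stand-in for "reasonable efforts to identify ... known identical
    # copies", e.g., matching stored media by cryptographic or perceptual hash.
    return []

def process_request(req: RemovalRequest) -> None:
    if not req.is_valid():
        return  # an invalid request does not trigger the 48-hour obligation
    remove_content(req.content_url)
    for copy_url in find_identical_copies(req.content_url):
        remove_content(copy_url)

if __name__ == "__main__":
    req = RemovalRequest(
        requester="A. Victim",
        content_url="https://example.com/media/123",
        statement_of_basis="I am depicted and did not consent to publication.",
    )
    print("removal deadline:", req.removal_deadline())
    process_request(req)
```

In practice, the "identical copies" step is likely the most demanding part of compliance, since it implies some form of media matching across the platform's stored content; the stub above simply marks where that logic would sit.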

The Act's provisions are targeted at online sites that host primarily user-generated content, including, for example, social media platforms. Specifically, the platforms covered by the Act are online sites or services that (i) "serve[] the public" and (ii) "primarily provide[] a forum for user-generated content, including messages, videos, images, games, and audio files; or for which it is in the regular course of trade or business of the website, online service, online application, or mobile application to publish, curate, host, or make available content of nonconsensual intimate visual depictions." The Act expressly excludes email and internet service providers, as well as online services that "consist[] primarily of content that is not user generated but is preselected by the provider of such online service, application, or website; and for which any chat, comment, or interactive functionality is incidental to, directly related to, or dependent on the provision of the content[.]"

The Act dictates that a failure by an online platform to comply with the notice and takedown obligations it imposes will be treated as a violation of the Federal Trade Commission Act, with enforcement authority delegated to the FTC.

The Act also offers immunity from liability to covered online platforms that engage in "good faith disabling of access to, or removal of, material claimed to be a nonconsensual intimate visual depiction," even if the depiction is not ultimately determined to be illegal. Thus, a social media site or other website that removes content in good faith in response to a removal request under the Act can defend itself against, for example, a claim for damages by the person or company that uploaded the content, even if the content is later determined to have been legally uploaded.

Implications of the Act

For victims of nonconsensual pornographic images (both real and AI-generated) and "revenge porn," the Take It Down Act is likely to be a powerful tool to seek accountability for offenders and stem the tide of unlawful images online. Victims can now report these images — or threats to publish them — to federal law enforcement authorities, including United States Attorney's Offices and the FBI, as violations of federal criminal statutes, worthy of investigation and prosecution. Prosecutors, too, now have a new and valuable tool to address these problems, which had previously been, in many cases, outside the reach of federal law. The first criminal cases brought under the Act are likely to receive a great deal of attention and may have a meaningful deterrent effect if significant criminal penalties are imposed.

Notably, the Act does not explicitly provide victims with civil remedies, either against those who create unlawful images or against online providers who publish and host them. While the FTC can bring an action against online platforms that may result in injunctive relief and/or monetary penalties, the Act does not, on its face, provide a private right of action for victims seeking relief from those platforms. What the Act does do, though, is create meaningful incentives for platforms to act quickly in removing potentially offending content. By imposing a 48-hour window for platforms to resolve takedown requests — and providing a safe harbor to platforms that remove content in good faith, even if that content is ultimately not deemed unlawful — the Act is likely to lead platforms to err on the side of removing potentially offending content. Online providers subject to the Act will also need to craft removal protocols — including, potentially, tools to identify offending content — and to update terms of service and other user-facing materials to address these new compliance obligations.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
