
The TAKE IT DOWN Act: Deepfakes, 48-Hour Takedowns, and What the Adult Content Industry Needs to Know

Alice McKinley

Alice McKinley is a Content Protection Specialist at CopyrightShark (since 2023), helping creators on OnlyFans, Fansly, Patreon, and other platforms with DMCA enforcement, platform reporting, and long-term protection strategies.


In two years, the number of deepfake files online grew from 500,000 to 8 million. Ninety-eight percent of them are pornographic. The TAKE IT DOWN Act, signed into law on May 19, 2025, is the federal government’s first direct response to this problem. It does two things: makes publishing nonconsensual intimate imagery (including AI-generated deepfakes) a federal crime, and forces platforms to remove that content within 48 hours of a valid takedown request.

The platform compliance deadline is May 19, 2026. If you create adult content on OnlyFans, Fansly, or similar platforms, this law gives you new legal protections against deepfakes and leaked intimate content. If you operate a platform that hosts user-generated content, you have roughly 80 days to build the required takedown infrastructure or face FTC enforcement.

This article breaks down the actual text of the law, the penalties, and the compliance requirements for both creators and platform owners.

What is the TAKE IT DOWN Act?

The TAKE IT DOWN Act (full name: Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act) is S.146 from the 119th Congress. Senator Ted Cruz (R-TX) introduced it on January 16, 2025, with Senator Amy Klobuchar (D-MN) co-sponsoring alongside 20+ other senators from both parties.

The bill passed the Senate by unanimous consent on February 13, 2025. The House followed on April 28 with a 409-2 vote. President Trump signed it into law on May 19, 2025, at a Rose Garden ceremony where First Lady Melania Trump also appeared (she had been vocal about the deepfake issue throughout the legislative process).

[Screenshot: S.146 TAKE IT DOWN Act on Congress.gov, showing the bill's progression from introduction to becoming law]

The law has two pillars:

Pillar 1: Criminal prohibition. It is now a federal crime to knowingly publish nonconsensual intimate visual depictions, whether those are real leaked images or AI-generated “digital forgeries.” This took effect immediately on May 19, 2025.

Pillar 2: Platform takedown mandate. Covered platforms must establish a notice-and-removal process and remove flagged NCII within 48 hours. Platforms have until May 19, 2026 to comply.

Before this law, no federal criminal statute specifically targeted nonconsensual intimate imagery. State laws covered it unevenly (48 states plus DC have some form of NCII law, but the protections vary wildly). The TAKE IT DOWN Act is the first federal attempt to standardize that.

Are deepfakes illegal under federal law now?

Yes. Since May 19, 2025, publishing AI-generated intimate imagery of a real person without their consent is a federal crime.

The Act defines a “digital forgery” as any intimate visual depiction of an identifiable individual created through software, machine learning, AI, or other technological means that, when viewed as a whole by a reasonable person, is “indistinguishable from an authentic visual depiction” of that individual. That includes AI face-swaps, generative imagery from tools like Stable Diffusion, and manipulated real photos.

The threshold: the output must look real enough to fool a reasonable person. Obviously synthetic content, artistic expression, and clearly artificial imagery don’t qualify. For minors, the threshold is lower, and different penalties apply.

The numbers explain why Congress moved on this. According to SecurityHero’s 2023 research, 99% of deepfake pornography targets women. The file count went from roughly 500,000 in 2023 to over 8 million by 2025. DeepStrike reported that humans can only correctly identify high-quality deepfake video 24.5% of the time. Detection is getting harder, not easier.

Criminal penalties under the TAKE IT DOWN Act

The penalties depend on two things: whether the victim is an adult or a minor, and whether the offense involves actual publication or threats.

Publishing NCII of an adult carries up to 2 years in prison plus fines. For minors, that goes up to 3 years. Threatening to publish is its own offense: up to 18 months for threats involving adults, 30 months for minors. That last part matters. Threats alone are now a federal crime even without actual distribution. This is the sextortion provision.

The court must also order forfeiture of all materials distributed and any proceeds from the violation, plus mandatory restitution to victims.

One provision that will come up constantly in the adult content industry: consent to creation is not consent to publication. The law says this explicitly. The fact that someone agreed to create intimate content does not mean they agreed to have it published by someone else. And the fact that someone shared an intimate image with one person does not mean they consented to that person publishing it.

For collaborative content creators on platforms like OnlyFans or Fansly, this distinction is critical. If you create content with another performer and they later distribute it without your authorization, the TAKE IT DOWN Act’s criminal provisions can apply.

The 48-hour takedown rule and platform obligations

This is where the law shifts from criminal law to platform regulation. If you run a platform, this is the part that requires actual engineering work before May 19, 2026.

Who counts as a “covered platform”?

A “covered platform” is a website or online service that either (a) primarily provides a forum for user-generated content, or (b) regularly publishes or hosts NCII in the course of business. That second category is broad by design. It catches tube sites and other platforms where leaked content frequently appears.

Excluded: email services, internet service providers, and sites that primarily serve curated (non-user-generated) content where comment or chat features are incidental.

What platforms must do

By May 19, 2026, every covered platform must:

  1. Establish a clear notice-and-removal process. The process must accept requests that include: a physical or electronic signature, identification of the content with enough detail to locate it, a good-faith statement that the content is nonconsensual, and contact information for the requesting individual.

  2. Remove content within 48 hours of receiving a valid request. Not 48 business hours. Forty-eight actual hours, which means weekends and holidays count.

  3. Remove known identical copies. The platform must make “reasonable efforts” to identify and remove duplicates of the flagged content. This effectively requires some form of content fingerprinting or perceptual hashing technology.

  4. Post clear notice of the removal process so users know how to submit requests.

FTC enforcement

The Federal Trade Commission enforces the takedown mandate. If a platform fails to comply, the FTC treats it the same way it treats unfair or deceptive business practices. That brings the full range of FTC enforcement powers into play, including civil penalties.

[Screenshot: FTC legal library page for the TAKE IT DOWN Act, showing the statute classification and enforcement authority]

The safe harbor (and its implications)

Platforms get liability protection for good-faith removal of content, even if the content turns out to be lawful. This sounds reasonable on its face, but the one-sidedness is worth noting: there is safe harbor for removing content, but no safe harbor for keeping it up. Combined with the 48-hour deadline, this creates a strong incentive to remove first and ask questions never. (More on this in the criticisms section.)

What adult content creators should know

If you make money from adult content, the TAKE IT DOWN Act gives you something the DMCA never did: legal standing based on being the person depicted in the content, not on owning the copyright to it.

DMCA vs. TAKE IT DOWN Act

For years, creators have relied on DMCA takedowns to remove leaked content. The TAKE IT DOWN Act adds a second, parallel path. Here’s how they compare:

| | DMCA Takedown | TAKE IT DOWN Act |
|---|---|---|
| What it protects | Copyrighted works | Intimate visual depictions (real or AI-generated) |
| Who can file | Copyright owner or authorized agent | The depicted individual or authorized person |
| Platform response time | "Expeditiously" (no fixed deadline) | 48 hours |
| Counter-notice | Yes (DMCA 512(g)) | No mechanism required |
| Duplicates | Not required | Must remove known identical copies |
| Enforcement | Civil lawsuit | FTC enforcement + criminal prosecution |
| AI deepfakes | Only if using copyrighted source material | Covered regardless of source material |

DMCA is still your primary tool for most leaked content because copyright ownership is straightforward to prove. But for deepfakes (where you might not own the copyright to an AI-generated image of your face) and for intimate leaks where copyright ownership is murky, the TAKE IT DOWN Act gives you a second path.

For help crafting DMCA takedown requests, see our DMCA template generator.

StopNCII.org: proactive protection

StopNCII.org lets you create a “digital fingerprint” (hash) of your intimate images directly on your device, without uploading the actual images. Participating platforms then use these hashes to detect and block matching uploads.

[Screenshot: StopNCII.org homepage showing the case creation process for victims of nonconsensual intimate image abuse]

Several major adult platforms already participate: OnlyFans, F2F (Fansly’s parent company), Aylo (Pornhub’s parent), RedGIFs, and Playhouse. Over 2 million images are protected through the system as of early 2026. The Free Speech Coalition specifically recommended StopNCII.org integration as a practical path to TAKE IT DOWN Act compliance.

Since consent to creation does not equal consent to publication, collaborative content creators need to document consent agreements in writing. If a dispute arises, clear records of who authorized what protect both parties. Without documentation, it becomes one person’s word against another’s.

What platform owners need to build before May 2026

If you run a platform that hosts user-generated content, you have until May 19, 2026. Here’s what you need to build.

Intake system

Build a dedicated intake form that collects all four required elements: signature (electronic is fine), content identification information, a good-faith nonconsensuality statement, and contact information. This needs to be clearly accessible from your platform, not buried in a help center.
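
For illustration, intake validation can be as simple as rejecting any submission that is missing one of the four elements. A minimal sketch; the field names and schema here are our own, since the Act lists the elements but doesn't prescribe a data format:

```python
# Hypothetical schema: the statute names four required elements for a valid
# request, but field names and structure below are illustrative assumptions.
REQUIRED_FIELDS = {
    "signature": "physical or electronic signature",
    "content_location": "information sufficient to locate the content",
    "good_faith_statement": "good-faith statement that the content is nonconsensual",
    "contact_info": "contact information for the requester",
}

def missing_elements(form: dict) -> list[str]:
    """Return human-readable labels for any required element that is absent or blank."""
    return [label for field, label in REQUIRED_FIELDS.items()
            if not str(form.get(field, "")).strip()]
```

Returning the specific missing elements lets you tell a requester exactly how to cure a defective submission instead of silently dropping it, which matters if the goal is not discouraging legitimate requests.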

Verification workflow

You need a process to determine whether incoming requests are valid. The law doesn’t define “valid” with extreme precision, which means you’ll need reasonable verification steps without creating barriers that discourage legitimate requests.

48-hour SLA pipeline

The clock runs continuously once a valid request arrives. That means you need staffing or automation that covers nights, weekends, and holidays. If your content moderation team works business hours only, you have a problem.
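
One way to make the continuous clock concrete: compute the deadline in UTC at intake and alert well before it expires. A sketch, assuming timezone-aware UTC timestamps; the 36-hour escalation threshold is our own operational choice, not anything in the statute:

```python
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=48)  # statutory: 48 wall-clock hours, weekends included
ESCALATE_AFTER = timedelta(hours=36)   # internal alert threshold (our choice, not the law's)

def removal_deadline(received_at: datetime) -> datetime:
    """The clock starts at receipt of a valid request and never pauses."""
    return received_at + TAKEDOWN_WINDOW

def needs_escalation(received_at: datetime, now: datetime | None = None) -> bool:
    """Flag requests nearing the deadline so an on-call human can act."""
    now = now or datetime.now(timezone.utc)
    return now >= received_at + ESCALATE_AFTER
```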

Duplicate detection

The law requires “reasonable efforts” to find and remove known identical copies. In practice, this means implementing perceptual hashing or content fingerprinting. Platforms that already use PhotoDNA, StopNCII.org hashes, or similar systems have a head start.
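
As one illustration of the technique, the open-source Python imagehash library computes perceptual hashes that survive re-encoding and resizing; a small Hamming distance between two hashes flags a likely copy. The match threshold below is an assumption you would tune, and production systems more often rely on purpose-built tooling like PhotoDNA or PDQ:

```python
from PIL import Image   # pip install pillow
import imagehash        # pip install imagehash

MATCH_THRESHOLD = 5  # max Hamming distance to treat as the same image (tunable assumption)

def fingerprint(path: str) -> imagehash.ImageHash:
    """64-bit perceptual hash; stable across re-encoding, resizing, and minor edits."""
    return imagehash.phash(Image.open(path))

def is_known_copy(upload_hash: imagehash.ImageHash,
                  removed_hashes: list[imagehash.ImageHash]) -> bool:
    """Compare a new upload against hashes of content already removed under a request."""
    return any(upload_hash - known <= MATCH_THRESHOLD for known in removed_hashes)
```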

Record-keeping

Log every request received, what verification you did, what you removed, and when. This paper trail is your safe harbor defense if the FTC comes looking.
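
An append-only JSON Lines log is a minimal way to get that paper trail; the event fields below are illustrative, not a required format:

```python
import json
from datetime import datetime, timezone

AUDIT_LOG = "takedown_audit.jsonl"  # append-only; never edit or rewrite past entries

def log_event(request_id: str, action: str, detail: str = "") -> None:
    """Append one event: received, verified, removed, duplicate_removed, rejected, etc."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "request_id": request_id,
        "action": action,
        "detail": detail,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
```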

Adapt existing DMCA workflows

If your platform already handles DMCA takedowns (and it should), you can build the TAKE IT DOWN Act process on top of that existing infrastructure. The intake requirements are similar. The main difference is the 48-hour clock, which is much tighter than the DMCA’s vague “expeditiously” standard.

Counter-notification gap

The TAKE IT DOWN Act, unlike the DMCA, does not require a counter-notification process. Build one anyway. Without it, you have a one-way system that removes content with no recourse. That will generate user complaints and, eventually, lawsuits.
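
Since the Act doesn't specify what a voluntary counter-notification looks like, one possible shape is a dispute record tied to the original removal, loosely borrowing the DMCA 512(g) pattern. The fields and statuses here are our own design:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Voluntary process: nothing in the Act requires or defines this. Modeled
# loosely on DMCA 512(g); fields and statuses are illustrative assumptions.
@dataclass
class CounterNotice:
    removal_id: str                # links back to the original takedown record
    filer_contact: str
    statement: str                 # filer's explanation of why the removal was mistaken
    status: str = "under_review"   # under_review -> upheld | reinstated
    filed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
```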

Encryption complication

If your platform offers end-to-end encrypted messaging, you have a problem. You can’t scan E2E content for NCII without breaking the encryption. The law doesn’t carve out an exception for encrypted communications. Nobody knows how the FTC will handle this. Expect it to end up in court.


Criticisms and open questions

The law has real critics, and they’re not fringe voices. These concerns come from organizations with track records on digital rights.

The EFF’s position

The Electronic Frontier Foundation opposed the bill, arguing it “gives the powerful a dangerous new route to manipulate platforms into removing lawful content.” Their concern: the combination of a 48-hour deadline, no counter-notification requirement, and safe harbor only for removal creates a system where platforms will reflexively take down content rather than risk FTC enforcement. That’s not hypothetical. It’s the rational economic response to the law’s incentive structure.

No counter-notification

This is the biggest structural difference from the DMCA. Under DMCA Section 512(g), the person whose content was removed can file a counter-notification, and the platform must restore the content if the original complainant doesn’t file suit. The TAKE IT DOWN Act has no equivalent. Content goes down and stays down, with no built-in mechanism for the content creator to dispute the removal.

Weaponization risk

Bad-faith actors could use the takedown process against legitimate adult content by falsely claiming nonconsensuality. The 48-hour deadline and lack of counter-notification make this easy to abuse. A competitor could weaponize the system to get a rival’s content pulled. So could a disgruntled ex or a troll with a grudge.

Encryption threats

Signal and WhatsApp have both raised alarms. End-to-end encryption means platforms literally cannot inspect content for NCII. The law creates pressure to weaken encryption or drop E2E messaging entirely, which would make everyone less safe, not just NCII victims.

FTC enforcement concerns

Who actually runs the FTC matters here. The commission’s composition and priorities will shape how aggressively (and how selectively) the takedown mandate gets enforced. BillTrack50’s February 2026 analysis framed this bluntly: “deepfake deterrence or fast-track censorship.” For platforms building compliance programs right now, that’s not an abstract question.

The CDT’s middle ground

The Center for Democracy and Technology landed somewhere between the EFF and the bill’s supporters. CDT backs the criminal provisions but has serious reservations about the takedown system. Their August 2025 piece in Tech Policy Press argued that “platforms must do more than the bare minimum,” pushing for voluntary counter-notification processes and transparency reporting even though the law doesn’t require either.

NCII and revenge porn federal law in 2026

The TAKE IT DOWN Act isn’t the only thing happening in NCII law right now. It landed in the middle of a broader legislative push, both in the US and internationally.

On the domestic side: 48 states plus DC already have some form of NCII or revenge porn law, and 30 states specifically address deepfake NCII. The DEFIANCE Act, which would let victims sue for civil damages, passed the Senate in January 2026 and is waiting on the House.

Internationally, the direction is the same. The UK’s Online Safety Act 2023 created platform obligations similar to the TAKE IT DOWN Act. The EU’s Digital Services Act requires notice-and-action procedures for illegal content broadly. Australia’s eSafety Commissioner can order content removal with civil penalties. Canada updated its Criminal Code to cover intimate image distribution. And in February 2026, privacy regulators from 61 countries issued a joint statement backing enforcement against AI deepfakes.

The pattern across all of these: mandatory platform removal of NCII, backed by real penalties. The US got there later than some countries, but the TAKE IT DOWN Act is now the most specific federal framework in the world for this problem.

What the law doesn’t cover

The Act only covers intimate imagery. It doesn’t touch election deepfakes (no coverage for synthetic political media), non-intimate synthetic impersonation (your AI clone giving a product endorsement), or financial fraud deepfakes (the CEO impersonation calls that have cost companies millions). Those are separate problems that Congress hasn’t addressed yet.

For a broader look at how content protection works across platforms, check our content protection tools.

Frequently asked questions

What is the TAKE IT DOWN Act?
The TAKE IT DOWN Act (Public Law 119-12) is a federal law signed on May 19, 2025 that makes publishing nonconsensual intimate imagery, including AI deepfakes, a federal crime. It also requires covered platforms to remove such content within 48 hours of receiving a valid takedown request.
When does the TAKE IT DOWN Act take effect?
The criminal provisions took effect immediately when signed on May 19, 2025. The platform takedown mandate has a one-year compliance period, meaning covered platforms must have their removal processes operational by May 19, 2026.
Is deepfake porn illegal in the United States?
Yes. Since May 19, 2025, publishing AI-generated intimate imagery of a real identifiable person without their consent is a federal crime under the TAKE IT DOWN Act. Penalties include up to 2 years in prison for adult victims and 3 years for minors.
Does the TAKE IT DOWN Act apply to OnlyFans and Fansly?
Yes. Both OnlyFans and Fansly are covered platforms under the Act because they host user-generated content. Both already participate in StopNCII.org's hash-based detection system.
What happens if a platform doesn't comply with the 48-hour takedown rule?
The FTC enforces compliance. Failure to comply is treated as an unfair or deceptive act under the FTC Act, which can result in civil penalties, injunctions, and ongoing FTC oversight.
Can someone abuse the takedown process to remove legitimate content?
This is a real concern. Unlike the DMCA, the TAKE IT DOWN Act has no counter-notification mechanism. Bad-faith requests could be used to remove lawful content, and platforms have strong incentives to remove rather than risk FTC action.
How is consent defined under the TAKE IT DOWN Act?
Consent means an affirmative, conscious, and voluntary authorization free from force, fraud, duress, misrepresentation, or coercion. Consenting to create intimate content does not equal consenting to its publication.