The Federal Trade Commission sent letters to 17 major tech companies this week, warning them to comply with the Take It Down Act by May 19 or face fines of $53,088 per violation.
Amazon, Alphabet, Apple, Meta, Microsoft, TikTok, X, Reddit, Discord, Snapchat, Pinterest, Bumble, Match Group, Automattic, and SmugMug were among the recipients of the same message from Chairman Andrew Ferguson.
We obtained a copy of the letter for you here.
“We stand ready to monitor compliance, investigate violations, and enforce the Take It Down Act,” Ferguson wrote.
“Protecting the vulnerable, especially children, from this harmful abuse is a top priority for this agency and this administration.”
The law, signed by President Trump in May 2025 with strong backing from First Lady Melania Trump, requires platforms to delete non-consensual intimate imagery (NCII), including AI-generated deepfakes, within 48 hours of receiving a removal request.
Platforms must also find and remove identical copies, provide clear notice about the removal process, and let people track their requests. The FTC published a business guidance page alongside the letter spelling all of this out. The definition of “covered platform” is broad enough to capture social media, messaging apps, video sharing, gaming platforms, and essentially any site hosting user-generated content.
Nobody wants revenge porn circulating online. But the law Congress passed is far broader than the problem it claims to solve.
The TAKE IT DOWN Act borrows its structure from the DMCA’s already-controversial notice-and-takedown system, then strips out the safeguards.
Under the DMCA, a takedown request must include a statement under penalty of perjury. False claims can result in liability. There’s a counter-notice process so the person whose content was deleted can push back. The Take It Down Act has none of this: no penalty for false claims, no counter-notice, no requirement that the filer prove anything before content disappears. A platform gets a complaint, has 48 hours, and deletes. That’s the entire process, and it’s exactly why the Take It Down Act amounts to a new censorship mechanism.
The law defines a violation as involving an “identifiable individual” engaged in “sexually explicit conduct,” but it never defines that conduct narrowly.