Trump Signs Take It Down Act Criminalizing Nonconsensual Intimate Images Including Deepfakes

President Donald Trump signed the Take It Down Act into law, criminalizing the distribution of nonconsensual intimate images (NCII), including AI-generated deepfakes. The law requires social media platforms to remove such content within 48 hours of notification. Though the measure is widely supported by advocates and tech companies, critics warn that its enforcement could harm survivors and chill free expression. The FTC will enforce the law, and companies have a year to comply.

Published May 19, 2025 at 04:11 PM EDT in Cybersecurity

On May 19, 2025, President Donald Trump signed the Take It Down Act into law, marking a significant step in combating the distribution of nonconsensual intimate images (NCII), including AI-generated deepfakes. The legislation criminalizes the sharing of such images and requires social media platforms to remove them promptly upon notification.

The law imposes penalties of up to three years in prison and fines for those who distribute NCII, whether the images are authentic or artificially generated. Social media companies are required to establish processes to remove such content within 48 hours of notification and to make reasonable efforts to eliminate any copies circulating on their platforms.

The Federal Trade Commission (FTC) is tasked with enforcing the law, and companies have a one-year grace period to comply. The bill received bipartisan support, with backing from tech companies, youth advocates, and even First Lady Melania Trump, highlighting the growing concern over digital privacy and online abuse.

Despite widespread support, the law has drawn criticism from privacy and digital rights organizations such as the Electronic Frontier Foundation (EFF) and the Center for Democracy & Technology (CDT). These groups caution that the takedown provisions could be misused to suppress legitimate content, and that compliance pressure could undermine privacy technologies such as end-to-end encryption, which limit platforms' ability to monitor user communications.

Critics also worry that enforcement could be selective: platforms aligned with the current administration might ignore legitimate reports of NCII, while bad-faith actors could overwhelm platforms with false complaints. The Cyber Civil Rights Initiative (CCRI), a leading advocate against image-based abuse, has warned that these enforcement challenges mean the law may ultimately offer survivors false hope.

President Trump, during the signing ceremony, acknowledged criticism but emphasized the bill's passage despite concerns over constitutional rights. He notably stated his intention to use the law personally to address online mistreatment.

Legal experts anticipate that the law's ambiguous provisions will face challenges over time, especially once enforcement begins. Courts will likely play a crucial role in interpreting the law's scope and balancing it against free speech protections.

Broader Implications for Digital Privacy and Platform Governance

The Take It Down Act reflects the increasing legislative focus on regulating online content to protect individuals from digital abuse, particularly as AI technologies enable the creation of realistic deepfakes. It underscores the complex balance between protecting victims, preserving free expression, and maintaining privacy through encryption.

For social media platforms, the law introduces operational and legal challenges in content moderation, requiring robust systems to identify and remove NCII swiftly while managing potential false reports and legal risks. The FTC’s enforcement role signals increased regulatory scrutiny on platform governance.

For survivors of image-based abuse, the law aims to provide stronger legal recourse and faster content removal, though its effectiveness will depend on enforcement and platform compliance. Advocacy groups continue to push for clearer, survivor-centered policies that minimize unintended harms.

Conclusion

The Take It Down Act represents a landmark legislative effort to address the harms of nonconsensual intimate image distribution in the digital age, including the challenges posed by AI deepfakes. While it establishes important legal frameworks and enforcement mechanisms, ongoing dialogue and refinement will be essential to ensure it protects victims without compromising constitutional rights or technological privacy safeguards.
