President Trump used his first joint address to Congress on March 4 to celebrate the Take It Down Act, which had recently passed the Senate. The legislation would make it a federal crime to publish intimate imagery without consent, covering both real and AI-generated content. Trump said he was eager to sign the bill into law, emphasizing its potential to combat revenge porn and explicit deepfakes.
The Take It Down Act imposes strict obligations on online platforms: they must remove nonconsensual intimate imagery (NCII) within 48 hours of a valid request from a victim. Noncompliance is treated as an unfair or deceptive practice, subject to enforcement by the Federal Trade Commission. Senator Marsha Blackburn, who co-sponsored the legislation, has previously backed protections for victims of online exploitation.
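To make the compliance window concrete, here is a minimal sketch of how a platform's trust-and-safety queue might track the statutory 48-hour clock. The TakedownRequest structure, its field names, and the example report are illustrative assumptions, not anything the law itself specifies.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# The Act's compliance window: reported NCII must be removed
# within 48 hours of a valid victim request.
REMOVAL_DEADLINE = timedelta(hours=48)

@dataclass
class TakedownRequest:
    content_id: str
    reported_at: datetime
    removed_at: datetime | None = None  # None until the platform acts

    def due_by(self) -> datetime:
        # The clock starts when the victim's request is received.
        return self.reported_at + REMOVAL_DEADLINE

    def is_overdue(self, now: datetime) -> bool:
        return self.removed_at is None and now > self.due_by()

# Example: a report filed 50 hours ago and still unresolved
# would be out of compliance.
report = TakedownRequest(
    content_id="post-8841",  # hypothetical identifier
    reported_at=datetime.now(timezone.utc) - timedelta(hours=50),
)
print(report.is_overdue(datetime.now(timezone.utc)))  # True
```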
Implications of the Take It Down Act
The Take It Down Act takes effect immediately upon enactment, though platforms have up to one year to establish a process for removing NCII. That timeline has free speech advocates and experts alike worried about overreach and unintended consequences. Critics argue the law will push platforms toward excessive content moderation, with a particular chilling effect on decentralized platforms like Mastodon, Bluesky, and Pixelfed.
India McKinney, Director of Federal Affairs at the Electronic Frontier Foundation, worries about how far the law's effects will reach. “Content moderation at scale is widely problematic and always ends up with important and necessary speech being censored,” she said. The law's broad scope, McKinney cautioned, could push platforms to remove content preemptively, before it ever reaches a user, raising serious free expression concerns.
Additionally, McKinney argued that the law's enforcement mechanisms will likely chill protected speech. “The default is going to be that they just take it down without doing any investigation to see if this actually is NCII or if it’s another type of protected speech,” she explained. That default, critics fear, leaves the law open to misuse in an increasingly polarized political environment.
The Role of Technology in Compliance
Platforms now face a scramble to comply with the Take It Down Act, and detection technology will be central to meeting its demands. Hive, a fast-growing AI content detection startup, has positioned itself as a key player in the space, working with online platforms to proactively detect deepfakes and child sexual abuse material (CSAM). Platforms can integrate Hive's API directly at the upload stage, pre-screening content before it is ever published, as the sketch below illustrates.
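The following sketch shows what an upload-stage pre-screening hook might look like in practice. The endpoint URL, credential handling, response schema, and 0.5 threshold are all placeholder assumptions for illustration; they are not Hive's documented API.

```python
import requests

# Hypothetical moderation endpoint and credential; placeholders only,
# not Hive's actual API.
MODERATION_ENDPOINT = "https://api.example-moderation.com/v1/classify"
API_KEY = "YOUR_API_KEY"

def prescreen_upload(image_bytes: bytes) -> bool:
    """Return True if the content is safe to publish, False to block it."""
    response = requests.post(
        MODERATION_ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"media": ("upload.jpg", image_bytes, "image/jpeg")},
        timeout=10,
    )
    response.raise_for_status()

    # Assumed response shape: per-class abuse scores in [0, 1],
    # e.g. {"deepfake": 0.02, "csam": 0.0}.
    scores = response.json()

    # Block the upload if any abuse class crosses a platform-chosen threshold.
    return all(score < 0.5 for score in scores.values())

# In an upload handler, the file is rejected before it is ever published:
# if not prescreen_upload(uploaded_file.read()):
#     reject_upload("Content failed pre-screening")
```

The key design point is placement: by running the check at upload time rather than after publication, the 48-hour takedown clock never starts for content the classifier catches.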
Kevin Guo, CEO and co-founder of Hive, is optimistic about the Take It Down Act. “It’ll help solve some pretty important problems and compel these platforms to adopt solutions more proactively,” he noted. Yet this expanded reliance on automated monitoring carries privacy implications serious enough to deserve close scrutiny.
McKinney also raised concerns that these content moderation requirements could eventually extend to encrypted DMs and SMS. “I really want to be wrong about this, but I think there are going to be more requests to take down images depicting queer and trans people in relationships,” she stated. Her concerns reflect broader fears that the law may disproportionately impact marginalized communities and their ability to share personal narratives online.
A Broader Context of Content Moderation
The Take It Down Act is a natural companion to the Kids Online Safety Act; Senator Marsha Blackburn is a leading cosponsor of both bills. Lawmakers on both sides of the aisle are clearly concerned about online safety and content moderation. At the same time, free speech advocates are alarmed by this dramatic shift toward regulation.
Political groups across the spectrum are pushing for tighter controls on what can be said online, on topics ranging from critical race theory to reproductive rights. McKinney was blunt about what that trend implies. “It is deeply uncomfortable for us with our past work on content moderation to see members of both parties openly advocating for content moderation at this scale,” she remarked.
Online platforms and the civil society groups that rely on them both feel the tension between complex technology and fast-moving legislation. As they work toward compliance with the Take It Down Act, they must balance legal obligations against the need to protect free speech and foster open dialogue.