Take It Down Act heads to Trump’s desk

  • The Take It Down Act has passed the House with a vote of 409-2, requiring social media companies to remove nonconsensual (including AI-generated) sexual images within 48 hours of being flagged.
  • Critics fear that the bill’s approach could be exploited to inflict harm in other ways, and may lead to selective enforcement by the Federal Trade Commission, potentially harming victims of image-based abuse.
  • The Electronic Frontier Foundation warns that smaller platforms won’t be able to verify claims within the quick turnaround for removals and will likely rely on flawed filters, while encrypted services may abandon encryption in order to monitor content, compromising user privacy.
  • Despite those concerns, the bill has garnered significant support, including from First Lady Melania Trump, parent and youth advocates, and some tech industry leaders, who see it as a step toward protecting individuals from nonconsensual explicit imagery.
  • Rep. Thomas Massie (R-KY), one of the two House members who voted against the bill, called it a “slippery slope, ripe for abuse, with unintended consequences.”

The Take It Down Act is heading to President Donald Trump’s desk after the House voted 409-2 to pass the bill, which will require social media companies to take down content flagged as nonconsensual (including AI-generated) sexual images. Trump has pledged to sign it.

The bill is one of the few pieces of online safety legislation to pass both chambers amid years of furor over deepfakes, child safety, and other issues — but it’s one that critics fear will be used as a weapon against content the administration or its allies dislike. It criminalizes the publication of nonconsensual intimate images (NCII), whether real or computer-generated, and requires social media platforms to have a system to remove those images within 48 hours of being flagged. In his address to Congress this year, Trump quipped that once he signed it, “I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.”

The proliferation of AI tools that make it easier than ever to generate realistic-looking images has supercharged concerns about deepfaked, damaging content spreading through schools and creating a new vector of bullying and abuse. But while critics say that’s an important issue to deal with, they worry that the Take It Down Act’s approach could be exploited to inflict harm in other ways.

The Cyber Civil Rights Initiative (CCRI), which was created to combat image-based sexual abuse, said that it can’t cheer the Take It Down Act’s passage. “While we welcome the long-overdue federal criminalization of NDII [the nonconsensual distribution of intimate images], we regret that it is combined with a takedown provision that is highly susceptible to misuse and will likely be counter-productive for victims,” the group writes. It fears that the bill, which hands enforcement to the Federal Trade Commission (an agency whose Democratic minority commissioners Trump fired in a break with decades of Supreme Court precedent), will be applied selectively in a way that ultimately only props up “unscrupulous platforms.”

“Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII,” they write. “Platforms attempting to identify authentic complaints may encounter a sea of false reports that could overwhelm their efforts and jeopardize their ability to operate at all.”

Because of the quick turnaround for platforms to remove content flagged as nonconsensual intimate imagery, the Electronic Frontier Foundation (EFF) warns that smaller platforms in particular “will have to comply so quickly to avoid legal risk that they won’t be able to verify claims.” Instead, the group writes, platforms will likely turn to flawed filters to crack down on duplicates. The EFF also cautions that end-to-end encrypted services, including private messaging systems and cloud storage, are not exempted from the bill, putting the privacy technology at risk. Since encrypted services can’t monitor what their users send to one another, the EFF asks, “How could such services comply with the takedown requests mandated in this bill? Platforms may respond by abandoning encryption entirely in order to be able to monitor content—turning private conversations into surveilled spaces,” including ones that abuse survivors commonly turn to.

Even so, the Take It Down Act quickly garnered a wide base of support. First Lady Melania Trump has become a leading champion of the bill, but it has also seen backing from parent and youth advocates, as well as some in the tech industry. Google’s president of global affairs Kent Walker called the passage “a big step toward protecting individuals from nonconsensual explicit imagery,” and Snap similarly applauded the vote. Internet Works, a group whose members include medium-sized companies like Discord, Etsy, Reddit, and Roblox, praised the House vote, with executive director Peter Chandler saying the bill “would empower victims to remove NCII materials from the Internet and end the cycle of victimization by those who publish this heinous content.”

Rep. Thomas Massie (R-KY), one of two members (both Republican) who voted against the bill, wrote on X that he couldn’t support it because “I feel this is a slippery slope, ripe for abuse, with unintended consequences.”

Q. What is the Take It Down Act?
A. The Take It Down Act is a bill that requires social media companies to take down content flagged as nonconsensual (including AI-generated) sexual images.

Q. Who voted in favor of passing the Take It Down Act?
A. The House voted 409-2 in favor of the bill; outside Congress, it has also drawn support from First Lady Melania Trump, parent and youth advocates, and some in the tech industry.

Q. What is the main concern about the Take It Down Act among critics?
A. Critics fear that the takedown system could be used as a weapon against content the administration or its allies dislike, that the FTC will enforce it selectively, and that floods of false reports and the abandonment of encryption could ultimately harm the victims the bill is meant to protect.

Q. What is the Cyber Civil Rights Initiative (CCRI)?
A. The CCRI is a group created to combat image-based sexual abuse. It welcomes the bill’s federal criminalization of the nonconsensual distribution of intimate images but says it can’t cheer the bill’s passage, because the takedown provision is highly susceptible to misuse and likely to be counter-productive for victims.

Q. What is the Electronic Frontier Foundation (EFF) warning about?
A. The EFF warns that smaller platforms will have to comply so quickly that they won’t be able to verify claims and will likely rely on flawed filters, and that end-to-end encrypted services, which are not exempted from the bill, may abandon encryption entirely in order to monitor content, turning private conversations into surveilled spaces.

Q. Who has expressed support for the Take It Down Act?
A. First Lady Melania Trump, parent and youth advocates, some tech industry representatives (such as Google’s president of global affairs Kent Walker), and Internet Works have all expressed support for the bill.

Q. What is the Cyber Civil Rights Initiative (CCRI) warning about the potential misuse of the bill?
A. The CCRI warns that the FTC may enforce the takedown provision selectively, emboldening platforms aligned with the administration to simply ignore reports of NDII, while a sea of false reports could overwhelm other platforms’ efforts to identify authentic complaints.

Q. Who voted against the Take It Down Act?
A. Rep. Thomas Massie (R-KY) was one of two Republican members who voted against the bill, citing concerns about its potential for abuse and unintended consequences.

Q. What is the main goal of the Take It Down Act?
A. The Take It Down Act criminalizes the publication of nonconsensual intimate images, whether real or AI-generated, and requires social media companies to remove such content within 48 hours of it being flagged.