The TAKE IT DOWN Act Is Poised To Become Law. But Will It Provide Justice To Victims?
Jasmine Mithani / Apr 15, 2025

Jasmine Mithani is a fellow at Tech Policy Press.

Image: Reihaneh Golpayegani / Better Images of AI / CC-BY 4.0
Last week, the House Energy and Commerce Committee voted to approve the TAKE IT DOWN Act, a bill that would create criminal penalties for distributing nonconsensual intimate images (NCII) and require tech companies to remove such content. It is now eligible to move to the full House for a vote.
The bill has, relatively speaking, sailed through Congress. A bipartisan, bicameral measure, it was introduced in the Senate by Sens. Ted Cruz (R-TX) and Amy Klobuchar (D-MN). First introduced in June 2024, it passed the Senate unanimously both last year and this year. President Trump has also vowed to sign it, and First Lady Melania Trump held a roundtable in support of the bill this March.
But even if the bill passes, a larger question looms: Can it be enforced? By whom?
Enforcement of the act falls squarely to the Federal Trade Commission (FTC), the independent agency that has shouldered most of the responsibility for holding tech platforms accountable.
But the FTC, like much of the federal bureaucracy, has been caught in the crosshairs of the Elon Musk-led effort to streamline the government. The agency already saw layoffs as part of the purge of probationary federal employees shortly after Trump took office. Several outlets have reported that as of last week, DOGE officials have begun work at the FTC, usually a harbinger of further staff reductions.
Moreover, Trump removed the two Democratic commissioners last month. Both are now suing the administration, arguing that terminating their positions without cause violates Supreme Court precedent.
These moves have left the agency hobbled, raising questions about whether it can uphold the law’s core provisions or whether the administration’s actions will reduce the law to a symbolic win for victims.
The trouble with enforcement in the age of Trump
Rep. Brett Guthrie, the Republican chair of the E&C Committee, opened the meeting by praising the cross-aisle teamwork on the 26 bills on the agenda. But the Democrats present had a very different perspective on events.
House Democrats repeatedly reminded their fellow committee members during the meeting that most of the bipartisan bills scheduled for discussion were struck from the December continuing resolution package after Elon Musk, the world’s richest person and a major campaign donor to President Donald Trump, led an online campaign against it on his social media platform X.
Now Musk is the leader of the effort to slash federal spending and the workforce via the so-called Department of Government Efficiency, a controversial effort that has spawned multiple court cases about privacy rights and public records. DOGE’s mass layoffs and associated “cost-cutting” actions were one of the reasons for massive nationwide protests earlier this month.
“I know Republicans want to pretend like it’s business as usual around here – but the daily chaos and illegal activity that we are seeing from the Trump Administration is not business as usual,” said ranking member Frank Pallone, a Democrat from New Jersey. He repeatedly called out Musk’s outsized influence in the Trump administration, blaming the billionaire for sidelining most of the bills on the agenda that were slated to be passed in December.
Under the TAKE IT DOWN Act, covered websites would have to remove NCII within 48 hours of a request. A hamstrung FTC would likely struggle to hold platforms accountable to that critical timeline, as several House Democrats pointed out during the meeting.
To that end, Rep. Darren Soto, a Democrat from Florida, introduced an amendment during the committee markup that would have conditioned the TAKE IT DOWN Act on the reinstatement of the two Democratic commissioners Trump is attempting to fire.
It was voted down along party lines.
All the changes at the FTC will make it harder to enforce the report-and-remove mechanism that gives the TAKE IT DOWN Act its name, said Omny Miranda Martone, the founder and CEO of the nonprofit Sexual Violence Prevention Association. Layoffs, partisanship, and a general weakening of the agency will make it more difficult for the FTC to hold companies accountable to the takedown provisions — and for many survivors and advocates, that’s the most valuable part of the bill.
“From working with a lot of the young victims in particular, what we are often told is that the most important thing to them is the ability to get their images removed from the internet,” said Adam Billen, vice president of public policy at youth-led AI policy nonprofit Encode.
“Often, what happens with victims is that once an image is up on the internet, it quickly spreads to other social media platforms, for example, and so getting it removed from that initial platform is incredibly important,” said Billen. “Once it starts spreading to other platforms, often you're playing whack-a-mole at that point, and it becomes incredibly, incredibly difficult to get your images off of every possible website and application.”
In addition to the changes at the FTC, there have been dramatic shifts in priorities and staffing at the Department of Justice, which would be responsible for prosecuting the criminal provisions of the TAKE IT DOWN Act.
As a result, Susanna Gibson, the founder of My Own Image, a newly minted nonprofit pushing for comprehensive NCII policy, worries that federal prosecutors won’t have the bandwidth to press criminal charges against perpetrators of NCII. She knows from personal experience — despite a law banning NCII in her home state of Virginia, she hasn’t been able to take action against the person who leaked nonconsensual recordings of her to a major newspaper. She said federal recourse wasn’t an option either.
The hearing also lightly touched on concerns raised by civil liberties groups about how the TAKE IT DOWN Act could be abused, particularly in a political environment that is hostile to vast swathes of the population.
Digital rights groups, including the Center for Democracy and Technology, urged the E&C Committee to revise the definition of covered platforms to explicitly exclude encrypted storage and communications. While no amendment was introduced, Rep. Lori Trahan (D-MA), a strong supporter of privacy rights, addressed those concerns during the markup, saying that the bill only applies to “forums,” which, in her reading, excludes private channels like messaging.
An amendment proposed by Rep. Debbie Dingell (D-MI) would have created protections for material considered relevant to the public interest. That tension surfaced earlier this year when Bluesky took down digital forgeries of Trump and Musk, then reinstated the posts once it was clear the media was part of a newsworthy event.
The amendment was also voted down along party lines, meaning the Act passed out of committee as originally written.
Can states fill the gap?
In many cases, state laws are the easiest way survivors can receive justice through the courts, said Gibson.
Forty-nine states and DC have laws banning the nonconsensual distribution of real intimate images (also known as “revenge porn”), and about half as many have laws addressing synthetic NCII. The laws vary widely, and some of those differences impede legal action.
The internet has no boundaries, Martone pointed out, and many instances of NCII occur across state lines.
“That leaves the victim with very little options to move forward when the two states have different laws,” Martone said. “We need a federal law to make sure that all victims aren't going to fall through the cracks.”
Still, Gibson is focusing her energy on strengthening laws at the state level, including in South Carolina, the only state without any sort of NCII ban.
Gibson is also concerned about how the wording of some laws can prevent justice for survivors; the TAKE IT DOWN Act is narrowly tailored in what qualifies as explicit content and includes an intent clause. In other words, there has to be proof that someone “intended to cause harm” or that the sharing of NCII caused “harm, including psychological, financial, or reputational harm, to the identifiable individual.”
In her advocacy work, Gibson has met with numerous women who were unable to pursue cases because of exceptions in the law around intent or other minutiae. For these reasons, the model state policy that My Own Image crafted addresses cases where there is no intent to harm. The organization focuses on state policy because cases at the lower level can progress more quickly.
While local courts might provide relief in the form of monetary damages or restraining orders, only federal authorities can apply the kind of pressure needed to get images removed from websites or apps. In the absence of uniform federal rules, state-level tech regulation can produce vastly different experiences for residents across the country.
Consider the difference in data privacy laws, or the difference in experience when trying to access pornography in Louisiana, Texas, or Vermont. Louisiana has the infrastructure to support digital driver’s licenses, and some adult entertainment companies have integrated those age verification systems into their websites. In other states, companies like PornHub have entirely blocked access to their sites. And in others, there is no age verification at all.
Federal takedown requirements are a compelling solution, as they are often the only reliable way to get media removed from websites. Research published last October found that the social media network X only honored NCII removal requests that invoked copyright law.
Advocates are continuing to push for additional laws that would provide avenues to justice for survivors of NCII. The DEFIANCE Act, introduced by Rep. Alexandria Ocasio-Cortez (D-NY) last year, would create a civil cause of action allowing survivors to sue creators of NCII.
“Our goal is really just to make sure that as many survivors as possible have as many options as possible to seek justice. And to prevent this before it happens by giving these options and making sure that people know there will be consequences and accountability if they do perpetrate harm,” Martone said.