Why Congress Is On Sound Legal Footing To Pass The TAKE IT DOWN Act
Slade Bond / Apr 28, 2025
Slade Bond is the Chair of Public Policy and Legislative Affairs at Cuneo, Gilbert, & LaDuca, LLP.

Something rare is about to happen on Capitol Hill. This spring, Congress has an opportunity to pass bipartisan legislation to address some of the harms children face online and require tech companies to respond to abusive content on their platforms.
The proposed bill, the TAKE IT DOWN Act, promises to help stop the horrific spread of non-consensual intimate images (NCII) online, which has skyrocketed in recent years. The bill has significant momentum. The Senate passed it unanimously in February, House leadership has signaled plans to take it up, and President Donald Trump has pledged to sign it if passed.
But as the bill nears the finish line, critics have raised concerns about the TAKE IT DOWN Act’s constitutionality, and in particular, its approach to the removal of NCII — something powerful tech companies have not done nearly enough to address.
As a final vote approaches, it’s important to set the record straight on what the TAKE IT DOWN Act does to empower survivors while also safeguarding speech online.
The consequences of inaction by Congress are far-reaching and terrifying
NCII is a form of sexual abuse that involves sharing sexual images, both real and AI-generated deepfakes, without a person’s consent. The harms of NCII distribution are well-documented: it causes significant emotional, reputational, psychological, and financial harm. In many cases, it also leads to self-harm and suicide. The advent of generative artificial intelligence (AI) and other image-manipulation tools has increased the realism of NCII while accelerating its production and distribution.
NCII is also being wielded by malicious actors as a tool for the financial exploitation of minors. The Federal Bureau of Investigation (FBI) has released multiple alerts about the “horrific increase” in the extortion of children using threats to disseminate sexually explicit content online. The FBI has also documented a sharp rise in AI-enabled sexual abuse, with malicious actors exploiting photos posted on social media.
At the same time, tech giants like Meta and Google have failed to implement policies that adequately facilitate the swift removal of deepfake NCII. A review by Meta’s Oversight Board last year found that the company’s policies were “not sufficiently clear” in some cases and “too narrow” in others to address “manipulation techniques available today, especially generative AI.” Months later, an investigation by CBS still “found a high prevalence of AI-manipulated deepfake images on the company’s Facebook platform.” Some of these companies, including Meta and Google, have since endorsed the TAKE IT DOWN Act.
The TAKE IT DOWN Act addresses the NCII crisis in several key ways
The bill criminalizes the publication of NCII, including AI-generated NCII, and closes gaps in state law. It also requires social media platforms and other covered sites to establish a process for victims to make good-faith requests to remove NCII on an expedited basis. This good-faith removal provision is essential to the goals of the legislation: facilitating the end of NCII distribution and ensuring its swift removal online. Finally, the TAKE IT DOWN Act empowers the Federal Trade Commission to hold companies accountable when they fail to act.
Congress is on firm footing to restrict NCII and require its removal
The American legal tradition of free speech protects what Justice Louis Brandeis referred to as the “discovery and spread of political truth.” But the Supreme Court has made clear that the First Amendment does not protect certain forms of harmful content, such as child sexual abuse material (CSAM), obscenity, and speech integral to criminal conduct.
Congress also has significant leeway when crafting laws to protect children’s safety. The Supreme Court has long upheld the constitutionality of legislation “aimed at protecting the physical and emotional well-being of youth even when the laws have operated in the sensitive area of constitutionally protected rights,” as it did in New York v. Ferber in 1982. It has routinely reaffirmed these principles in the years since, including in 2002 in Ashcroft v. Free Speech Coalition.
The TAKE IT DOWN Act fits well within this constitutional framework
The speech regulated by the TAKE IT DOWN Act is not protected by the First Amendment. In many cases, the most heinous forms of NCII include CSAM — which is never protected by the First Amendment — and materials used to commit sexual extortion, which is illegal. In other cases, the speech regulated by the bill is integral to violating other state and federal laws, such as those criminalizing the nonconsensual distribution of intimate images. And should the regulated speech somehow not fit neatly within those categories, NCII would fall under the umbrella of obscenity, which is not shielded by the First Amendment.
But to the extent the bill is closely scrutinized under the First Amendment, the TAKE IT DOWN Act is narrowly tailored to serve a compelling government interest. Beyond protecting privacy, which courts have long recognized as such an interest, the bill also seeks to protect individuals from sexual harassment, abuse, humiliation, and other harms documented by Congress through numerous hearings.
The bill is also narrowly tailored to advance these goals. In all cases, it requires the government to prove that a person “knowingly” published the NCII. In cases involving minors, it requires proof of intent to harm the minor or to satisfy a person’s “sexual desire.” In cases involving adults, it requires proof of multiple elements that significantly narrow the bill’s reach: that the person knowingly published the NCII and “knew or reasonably should have known” about the victim’s expectation of privacy; that what the image depicts was not voluntarily disclosed and is not a matter of public concern; and that the publication was intended to cause harm or did cause harm.
The bill is also not constitutionally overbroad. Its definition of “intimate visual depiction” draws from current law, which has been construed by courts in dozens of cases. Importantly, it excludes matters of “public interest or concern” or “commercial pornography,” and contains several exceptions for legitimate purposes, such as seeking medical help, alerting law enforcement, and protecting national security.
The speech the bill regulates has little, if any, expressive value. The TAKE IT DOWN Act does not favor viewpoints, speakers, or messages. It does not threaten art, including sexual art, education, or public debate. It does not allow government officials to insist on their own truth or shield themselves from uncomfortable public debate on matters of public interest. Instead, it simply restricts the publication of NCII, a well-defined term, and facilitates its removal online.
But to the extent there is a balancing of First Amendment rights, it is well-documented that the publication of NCII chills the speech of victims, who often engage in self-censorship in response to the horror of NCII abuse. The scales of justice should tip in their favor.
Claims that the TAKE IT DOWN Act will stifle commercial speech mischaracterize the bill
Some critics of the legislation, which is substantively identical in both chambers, argue that it could have a chilling effect on speech such as commercial pornography. In support of this argument, they claim that the bill does not define “intimate visual depiction” for the NCII removal process in Section 3.
Others argue that this section “applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the bill.” (These concerns were repeated more recently.)
But these claims do not withstand scrutiny.
Section 4 of the TAKE IT DOWN Act establishes definitions for the entire Act, including Section 3’s removal process. Its definition of “intimate visual depiction” is exactly the same as in Section 2 of the bill, which defines the term by incorporating current law, 15 U.S.C. § 6851, into the Act. In 2022, Congress enacted this law for “individuals whose intimate visual images are disclosed without their consent” as part of reauthorizing the Violence Against Women Act (VAWA). Similar to other parts of the TAKE IT DOWN Act, the definition expressly excludes “matters of public concern or public interest” and “commercial pornographic content.” Simply put, this line of criticism appears to be based on a fundamental misunderstanding of the legislation. It is not supported by its actual text or a review of relevant case law.
There are certainly times when Congress seeks to regulate content on a good-faith basis that may nevertheless fail to satisfy First Amendment scrutiny. This is not one of those times. While Congress may amend the law as necessary, there is a strong bipartisan consensus that Congress must act now, and any additional delays are unacceptable.
Concerns that the bill will be misused are misplaced
Others have raised concerns that the TAKE IT DOWN Act could be weaponized by the Trump administration.
As an initial matter, this type of argument has been deployed for years by tech-funded opposition to weaken support for even modest reforms.
Substantively, the TAKE IT DOWN Act includes a series of safeguards to prevent misuse. As noted, the bill’s definitions and high evidentiary burdens on the government significantly narrow its scope. These limitations restrict enforcement discretion and give courts broad authority to reject dubious or politically motivated claims that fall outside the legislation’s intent. For criminal enforcement, the bill’s burden of proof is the highest in the legal system: beyond a reasonable doubt.
Finally, the authority granted to the FTC to ensure the success of the bill’s removal process is the same authority it already wields under Section 5 of the FTC Act to police unfair or deceptive practices. If the FTC were inclined to target tech platforms in the manner that opponents of the bill fear, it could already do so under this existing law. At the same time, that authority is narrowly constrained within the statute, and those same guardrails would apply under the TAKE IT DOWN Act, making it a poor tool for politicized enforcement.