In Russmedia Ruling, the GDPR Displaces Europe's Rules for Online Speech

Daphne Keller / Apr 14, 2026

Luxembourg City, Luxembourg—Sept. 17, 2024: Towers (left to right: Rocca, Montesquieu, Comenius) of the Court of Justice of the European Union (CJEU).

The European Union’s data protection rules and its intermediary liability rules have been on a collision course for years. They finally intersected in the December 2025 Russmedia ruling, in which the Court of Justice of the EU (CJEU) firmly prioritized the General Data Protection Regulation (GDPR). As this post will explain, Russmedia suggests that GDPR rules may eclipse key provisions of Europe’s intermediary regime under the Digital Services Act (DSA), and incentivize platforms to remove far more lawful expression than either law actually requires. A second post will consider — and express skepticism about — potential interpretations of the ruling that might limit this damaging impact.

Broadly speaking, intermediary liability law defines platforms’ legal responsibilities for users’ speech, while data protection law defines platforms’ responsibilities for users’ data. But, of course, content hosted by platforms can fall in both categories at once. Often, one platform user’s online speech is another person’s personal data.

Suppose I post false information about the French politician Marine Le Pen on Bluesky, for example. If she asks Bluesky to remove my post for defaming her, EU law provides a well-defined notice and takedown process under the DSA. But if she says that the same post processes her personal data in violation of the GDPR, she can credibly argue that entirely different rules apply: that Bluesky has no immunity for unwittingly hosting false content, that it has a duty to proactively monitor and filter uploads, and that it should not notify me or allow me to appeal the removal. Russmedia will make EU member state courts substantially more likely to accept such arguments.

I wrote about this brewing doctrinal conflict and the GDPR’s “Right to Be Forgotten” rules in a 2018 article, “The Right Tools.” As the title suggests, I think European law does provide appropriate tools for dealing with my hypothetical Le Pen post. In 2018, those tools came from the DSA’s predecessor law, the eCommerce Directive (ECD) — the law at issue in Russmedia. Today, the carefully negotiated legal framework of the DSA offers even better tools. Laws like the DSA were designed for the multi-party conflicts that arise when platforms must adjudicate competing legal rights, including private individuals’ data protection rights and platform users’ expression and information rights. The DSA was also, importantly, crafted to account for the real-world mechanics of platform content moderation. Russmedia shows us the serious problems that can arise when courts instead rely solely on the GDPR.

How we got here and where we are going

In 2014, the CJEU ruled that search engines like Google were “data controllers” of content indexed from public websites, with resulting Right to Be Forgotten obligations to de-list some search results. (I was a lawyer for Google at the time.) Since then, data protection claimants have asked Google to de-list nearly 7 million web pages. According to Google, 3.2 million of those requests were legally invalid under European law. Three more Right to Be Forgotten cases about web search have reached the CJEU in those years, leading to a nuanced body of law with increasingly strong protections for expression and information rights.

The CJEU has never been asked to resolve the obvious corollary question: how does the Right to Be Forgotten apply to content hosted by social media platforms like Instagram or TikTok? I think this is because, as I predicted in 2018, major platforms have had powerful reasons to avoid litigation about the obligations they might face as data controllers of hosted content. When the inevitable case about content hosts and the Right to Be Forgotten reached the Court, the defendant wasn’t a well-lawyered foreign giant, but a small European company that did not even manage to file a brief before the CJEU.

The Russmedia ruling is, as the Max Planck Institute’s Erik Tuchtfeld put it, a bombshell. Many European experts will likely seek to defuse it, using the kinds of arguments I will explore in the second post. They will look for doctrinal reasons to distinguish Russmedia from, for example, Germany’s ongoing and politically loaded Renate Künast case. But many practitioners — including lawyers who advise platforms — will just try to get their clients outside the blast radius. They will likely recommend swiftly removing content, proactively monitoring uploads, and readily settling cases to avoid Russmedia’s fallout. Michael FitzGerald predicted this split in responses to the ruling in an early post, noting that it is the platforms’ choices that will shape users’ real-world experiences.

Ultimately, incumbents like TikTok, Instagram, YouTube, Meta, and X can almost certainly afford to weather the legal uncertainty created by Russmedia. Their existing proactive systems can be repurposed as Russmedia compliance measures. The same will not be true for many smaller rivals — platforms that recently retooled for DSA compliance, only to learn now that this may not be enough. That may include Publi24, the Romanian advertising platform at issue in Russmedia itself. Publi24 appears to be a normal free advertising platform. Its site design and operations, as described in Russmedia, seem similar to those of hundreds or thousands of other platforms — probably including comparable local ad listing sites in the Netherlands, Spain, Poland, or Ireland. If, five years from now, Facebook Marketplace has replaced local advertising sites like these, Russmedia may be a significant reason why.

Case overview

Publi24 is an advertising forum that functions something like Craigslist or the classified ads section in a newspaper. The CJEU does not provide much detail about Publi24, but a review with the help of Google Translate shows a seemingly thriving platform with ads for used furniture, cars, baby equipment, and electronics, as well as quite a few offers to sell and deliver farm animals. The site also includes a “Matrimonial” ad section, which appears to feature solicitations from sex workers. (I’m grateful to Michael Veale for pointing this section out, since neither the CJEU ruling nor the earlier Advocate General opinion mentions it. Interestingly, Publi24’s iPhone and Android apps have no Matrimonial section — perhaps owing to app stores’ policies, or perhaps also owing to a US law that eliminated platform immunities for prostitution claims.)

Most ads, the site’s Help pages say, run for free, though a paid tier does exist. The Court refers to Publi24 as a “marketplace,” which seems correct in a colloquial sense. But the site doesn’t seem to facilitate transactions, meaning that it likely would not count as a regulated “marketplace,” like Shein or Amazon, under the DSA.

Publi24’s troubles began when a user posted a woman’s photographs and her phone number in an ad appearing to offer sexual services. It seems likely, but is not confirmed in the ruling, that this would have appeared in the site’s Matrimonial section. The woman notified Publi24, which took down the ad within an hour. But in the meantime, the ad had already been replicated on third party sites. It was disputed in the Russmedia hearing whether Publi24 itself syndicated the ads as part of its service, or whether third parties unilaterally “scraped” or copied them. (AG Par. 82)

The CJEU ruled that Publi24 counted as a controller of the woman’s personal data under the GDPR, because of the role it took in determining the purpose and means of processing uploaded content. As I will detail in the next post, the Court reached this conclusion based on factors that are hardly unique to Publi24. These included its role in “organis[ing] the classification” of ads and “set[ting] the parameters” for their dissemination, along with fairly generic content usage authorizations in the Terms of Service.

The resulting GDPR obligations, the CJEU held, superseded the intermediary liability rules of the eCommerce Directive (ECD). (Par. 131) Publi24 could not claim the ECD immunities that would have applied in a case about defamation, copyright, or almost any other law governing expression and information. Under the GDPR, the Court determined, Publi24’s swift response to a takedown notice did not protect it from liability. Instead, the platform’s legal duty was to proactively screen uploaded ads for content containing sensitive personal data; ascertain the identities of advertisers who post such content; and take measures to prevent posts from being distributed to third party sites. For Publi24, these obligations effectively eliminated major components of the legal framework established in the ECD, and later expanded in the DSA.

Changed content moderation rules under Russmedia

The Russmedia ruling removes two kinds of legal barriers for claimants who can formulate their content removal demands as GDPR claims. First, claimants who would have encountered substantive legal barriers to suppressing publicly available information under longstanding doctrines like privacy or defamation may nonetheless prevail using data protection. As legal scholar Joris van Hoboken has explained, those older laws offer “intricate doctrines” to “balance the interests in society in the publicity of and about others and the interests of privacy and dignity of natural persons.” Courts have not had the time or reason to develop a comparably rich body balancing those interests with data protection law — particularly not for issues, like social media hosts’ obligations, that have largely been kept away from courts.

Second, Russmedia gives claimants an avenue to avoid the platform immunity rules embedded in the DSA, including provisions expressly designed to protect online expression and information. The remainder of this post will assess likely resulting differences for platforms’ proactive monitoring practices and ability to distribute content, as well as for users’ ability to post anonymously and to be notified when their online expression is removed. It is possible that courts in future cases may decline to extend Russmedia’s holding to other platforms, for example by reasoning that it applies only to advertising platforms or platforms that take too much control over uploaded content. Such limitations are appealing as a matter of public policy. However, as the second post in this series will discuss, they are difficult to reconcile with the CJEU’s language and legal reasoning. In any case, courts may have few opportunities to limit Russmedia’s impact if platforms choose not to litigate these questions.

Proactive monitoring

Russmedia holds that a platform like Publi24, which “knows, or ought to know” that users may upload sensitive personal data, must “implement appropriate technical and organisational measures in order to identify such advertisements before their publication.” (Par. 97, emphasis added) That pre-publication monitoring obligation stands in striking contrast with the DSA and the ECD before it. One commentator described it as taking “an axe to the very root of the safe-harbour principle”.

Both the DSA and ECD expressly prohibit courts or lawmakers from imposing “general monitoring” obligations on platforms. European scholars and the CJEU itself have long identified this rule as a protection for platform users’ fundamental rights, including their rights to data protection and to freedom of expression and information. International human rights officials and civil society organizations have weighed in strongly against legal monitoring obligations on similar grounds.

The Court has upheld some “specific” monitoring mandates, including against challenges based on free expression rights. In Poland v. Parliament, it held that a specially legislated copyright monitoring obligation adequately respected users’ rights because of the multiple safeguards for expression spelled out in the law. These included substantive protections for parody and other forms of lawful expression, as well as procedural requirements for an “expeditious complaint and redress mechanism” for erroneous removals. No such protections are mentioned in Russmedia.

Monitoring mandates have cascading consequences for platforms and their users. Platforms reviewing uploads risked losing immunity under the ECD by gaining knowledge of “facts or circumstances” from which illegality should have been apparent. That risk persists under the DSA, despite a legislative attempt to ensure that platforms do not lose liability protection “solely” based on voluntary monitoring or legal compliance efforts. Such exposure to legal risks unrelated to the GDPR, such as claims about trademark or hate speech, gives risk-averse platforms that review potentially illegal content reason to simply remove it.

The CJEU asserts that the obligation created by Russmedia “cannot… be classified” as general monitoring of the sort prohibited by the ECD. (Par. 132) That’s an odd thing to say, given that the Court also says that the ECD’s rules do not apply to the case. If the ruling requires Publi24 to review every post, as seems likely, that would count as “general monitoring” under older CJEU cases like L’Oréal v. eBay. And Russmedia certainly requires Publi24 to independently assess the legality of new uploads — which would make it “general monitoring” under the Court’s more recent Glawischnig-Piesczek standard. (My article on this evolving definition of “general monitoring” is here.)

Perhaps the Court’s assertion was intended to avoid creating complications with the DSA and ECD. If so, that seems likely to backfire. Future plaintiffs will almost certainly argue that the Court has actually redefined “general monitoring,” making Russmedia-like obligations permissible even under the DSA.

Collecting users’ identity

If Publi24 allows advertisers to post sensitive personal data, the Court rules, it must ascertain the advertisers’ identities. (Somewhat circularly, “allowing advertisements to be placed anonymously” is one reason the site is treated as a data controller in the first place, yet being a controller means it must collect identities. (Par. 69)) One purpose of this check is to limit the “feeling of impunity” that advertisers might otherwise have. (Par. 104) More fundamentally, the platform must know each advertiser’s identity because, per Russmedia, it is entering a joint controller relationship with them. (Par. 101) The identity check is also a first step in determining, as the controller must, that posted data is actually the advertiser’s own. (Par. 102) The ruling also says the uploader “must be able to demonstrate that the personal data concerned are accurate” — a difficult and arguably cruel standard for platforms moderating online dating profiles or sexual solicitations. (Par. 93, emphasis added)

A legal mandate to de-anonymize users may, as one law firm notes, create tensions with national laws protecting anonymous communications. Requiring ID from users is also likely to have disparate impact based on factors including immigration status, gender identity, or membership in Roma communities. For a service like Publi24’s Matrimonial section, a “know your customer” obligation would also seemingly turn private platforms into de facto registries of sex workers. That’s a change that might drive commercial sex workers off of Internet platforms, ultimately putting them in more danger. The CJEU does not appear to have considered any of these broader consequences.

Scraping and syndication

The Court takes particular issue with the dissemination of ad content to third party sites. It holds that one of Publi24’s responsibilities as a controller is to “implement security measures such as to prevent advertisements published there and containing sensitive data… from being copied and unlawfully published on other websites.” (Par. 115) In other words, it should not actively seek to syndicate ads, and should take measures to prevent third parties from copying them.

Avoiding further distribution makes sense for the uniquely harmful content in this case. But making this the default rule for ordinary, lawful online content would create real problems. Disseminating information is a basic function of platforms. Many are built with tools such as APIs to actively spread posts to third party developers, sites, or apps. Content creators who hope to grow their audience — including advertisers, artists, and activists — often want and expect this. Syndicating ads to third party sites is a core part of Google’s business, for example. YouTube has long offered video syndication as a feature for content creators, who build business strategies around it. Platforms like Twitter and Reddit grew in scale and popularity in part because of their developer APIs, which allowed third parties to build new features or applications. When Reddit dismantled its API, users and volunteer moderators — the creative engine of the platform — protested, and many left the service entirely.

Regulators focused on competition or content issues often want platforms to share user content, too. EU Commission authorities enforcing the DSA fined X €40 million for denying API access and trying to prevent scraping by researchers, for example. They rejected X’s contention that sharing this already-public data violated the GDPR. Content-sharing through interoperability is also a key tool for competition regulators. US antitrust authorities want broader syndication of ads as a remedy in their antitrust suit against Google. As Cory Doctorow has explained, platforms that begin with more open models often later switch to locking users in. If the GDPR requires such lockdown, it provides a boon to incumbents and a major setback for both smaller rivals and distributed systems like Mastodon or Bluesky.

Implications for content moderation

The Russmedia ruling doesn’t examine other aspects of platform content moderation. But substituting GDPR rules for the detailed notice, action, and appeal rules of the DSA would have significant consequences. Notably, the DSA requires platforms to notify users whose content has been removed, and to allow them to appeal. But data protection authorities and courts have said that the GDPR generally prohibits such notifications, at least in the context of search engines and the Right to Be Forgotten. The DSA’s outside dispute resolution under Article 21 also seems imperiled. Indeed, Russmedia lends credence to platforms’ concerns that showing ODS bodies certain kinds of moderated content could violate the GDPR. Beyond these major concerns, a literal application of the GDPR could alter notice and takedown procedures in a host of odd and counterproductive ways, as I detail at pages 327-341 of my article.

Conclusion

This post has enumerated an array of unintended consequences that appear to flow from the CJEU’s Russmedia decision. The next post will examine arguments for keeping those consequences in check for other platforms. In particular, it will review theories that the ruling

  • Only affects particularly harmful content or sensitive personal data
  • Only affects high-risk services
  • Only affects advertising platforms or marketplaces
  • Only affects platforms that exercise unusual degrees of control over data, or that claim excessive rights in their Terms of Service
  • Only displaces the platform immunity rules of the old eCommerce Directive, but not those of the DSA
  • Will not govern cases in which fundamental rights considerations are more clearly at issue

As I will explain, I am somewhat skeptical of each of these theories. But the public policy reasons to limit Russmedia’s impact are compelling. I hope that Member State courts interpreting the case will adopt these or other approaches to do so.

Authors

Daphne Keller
Daphne Keller is the Director of Platform Regulation at Stanford Law School's Program in Law, Science & Technology. Her academic, policy, and popular press writing focuses on platform regulation and Internet users' rights in the US, EU, and around the world. Her recent work has focused on platform t...
