
The Digital Services Act Is Fully In Effect, But Many Questions Remain

Gabby Miller / Feb 20, 2024

Flags of the European Union outside a European Commission building.

On Saturday, Feb. 17, 2024, the Digital Services Act (DSA), the European Union’s “rulebook” for making the internet safer, fairer, and more transparent, entered into full effect and now applies to all online intermediaries in the EU. In the final days leading up to implementation, a vast network of researchers, civil society members, and even a number of European Union officials came together at the University of Amsterdam Law School for the first annual “DSA and Platform Regulation Conference.” Organized primarily by the University’s DSA Observatory and the Institute for Information Law (IViR), the two-day conference spanned issues ranging from the DSA’s impact on recommender systems to its global reach via the “Brussels effect.”

Rita Wezenbeek, a director at the European Commission’s Directorate-General for Communications Networks, Content and Technology, delivered the opening keynote address, providing conference-goers with a DSA “state of play.” While Wezenbeek expressed some surprise to find herself in a room full of DSA skeptics, she repeatedly emphasized that the Commission welcomes critical feedback, finding it imperative for both the Act’s implementation and the Commission’s investigations.

So far, the Commission has launched two formal investigative proceedings against Very Large Online Platforms (VLOPs). In December, it announced “formal infringement proceedings” against X over suspected breaches of its transparency obligations, failure to counter illegal content and disinformation, and deceptive design practices. And on Monday, just two days after the DSA took full effect, it announced similar proceedings against TikTok over potential breaches involving the protection of minors, advertising transparency, and data access for researchers.

“In the area of enforcement, we [the EU Commission] are not doing investigations for the sake of it,” Wezenbeek told the audience on Friday. “We cannot do cases just because we find them interesting to us, or interesting from a political perspective, we need to look at cases that have a real risk of harm to society.” She also noted that children’s protections, hate speech, and election integrity will be prioritized in upcoming investigations. According to Wezenbeek, the Commission generally views these investigations as a series of confrontations in which it can be seen as “tough” on DSA violators, but the regulatory body ultimately wants to build cooperative, not coercive, relationships with VLOPs.

Other DSA officials in the room included Digital Services Coordinators (DSCs) from Ireland, France, and the Netherlands, who spoke on a panel titled “National regulators and the DSA.” While the EU Commission oversees DSA compliance for VLOPs and Very Large Online Search Engines (VLOSEs), the DSCs are tasked with application and enforcement for all other digital services in their respective member states. And though the DSCs have been talking among themselves since last June, the first formal meeting of the European Board for Digital Services convened on Monday. Digital Services Coordinators have not yet been appointed in all 27 EU member states.

The conference, packed with nearly forty paper presentations, two keynote addresses, several featured panel discussions, and a round of lightning talks over just two days, left the audience with as many questions as answers. What follows is a series of takeaways from both categories:

1. It’s not yet entirely clear what the DSA actually is, and we shouldn’t be quick to judge.

“The DSA is not revolutionary,” according to Martin Husovec, associate professor of law at The London School of Economics and Political Science. In his keynote speech on “How (not) to enforce the DSA,” Husovec argued that the DSA is about incremental change, and that effective implementation will take time. Husovec, along with others like Access Now Senior Policy Analyst Eliška Pírková, hopes civil society will have a meaningful voice throughout this process. In a panel on implementation and enforcement of the DSA, Pírková advocated for civil society’s deep involvement in developing guidelines with the Commission, rather than merely being consulted on them. (Earlier this month, the Commission announced a month-long public consultation period for the first-ever draft guidelines on mitigating election-specific systemic risks.) Nor can the DSA be looked at in isolation. “The DSA is only going to be useful if we correctly implement the Digital Markets Act, when the GDPR is considered, when the EU Charter on Fundamental Rights is factored in, and when all the other legislation is properly implemented and enforced,” said Barbora Bukovská, senior director for law and policy at ARTICLE 19, during a panel on the DSA in times of crisis. “Only then the DSA will play a positive role and will have a positive effect on the information space we are creating.”

2. The DSA’s vagueness isn’t necessarily a bad thing.

France’s Digital Services Coordinator, Benoît Loutrel, argued that being too precise is a recipe for merely addressing yesterday’s problems. “No one was discussing generative AI when the DSA was being drafted,” said Loutrel, who is also a board member at ARCOM, the French media regulator, during the panel on “National regulators and the DSA.” In a paper presentation on the DSA’s “red lines,” Martin Husovec lauded certain lawsuits against the Commission, like the one filed in June 2023 by Zalando, an e-commerce site and designated VLOP, over the Act’s online content rules. Such challenges, according to Husovec, will help encourage participation and establish the contours of the DSA.

3. The DSA will rely heavily on auditors for platform accountability. But the role (and reliability) of auditors remains unclear.

It’s widely agreed that auditors will play a large role in DSA compliance and so-called accountability for VLOPs. But how will the public get relevant information, rather than “gloss,” when auditors are paid by the platforms and must make do with whatever information those platforms present to them? This new market for auditing firms, and the economic incentives that come with it, worries some. In a paper presentation on “Platform Power as Prediction Power,” researchers Hannah Ruschemeier of the FernUniversität in Hagen and Rainer Mühlhoff of the Universität Osnabrück argued that self-regulation through mandated transparency reports poses a risk of audit capture, where regulators become captured by the regulatee. The Commission, however, is leaning heavily on the reputational liability of auditors, according to Wezenbeek. If something goes wrong, it may be traced back to the auditors, drawing further scrutiny in determining whether an auditor was “captured” by its clients. In the meantime, Wezenbeek urged patience: the auditing sector is still figuring out what compliance looks like, and the learning curve is steep.

4. Evaluating secrecy is just as important as transparency.

This was a key point made on a panel titled “EU-level implementation and enforcement of the DSA.” Deirdre Curtin, professor of European law at the European University Institute in Florence, said that “trade secrecy” around algorithms will be a significant barrier to effectively enforcing the DSA. While one audience member pointed out that it may not be necessary to see inside the algorithmic “black box” in order to effectively mitigate risks to society, Curtin said it will largely come down to how the Commission configures this process.

Another panelist, Daphne Keller, director of the Program on Platform Regulation at Stanford's Cyber Policy Center, warned that any perception of “backroom deals between regulators and platforms quietly making content go away” will damage the Commission’s transparency efforts. Keller also argued that secretive systemic risk assessments, which won’t be made publicly available until 15 months after being submitted to the Commission, are incompatible with the transparency promised by the DSA.

5. How the European Commission responds to moments of crisis will define the DSA.

The DSA’s Crisis Response Mechanism (CRM), set out in Article 36, allows the Commission to require VLOPs or VLOSEs to assess how their services contribute to a serious threat and to apply measures to mitigate it. The mechanism was hastily added during the third trilogue in March 2022, seemingly in response to the Russian invasion of Ukraine, and has been the subject of much concern, including at the convening in Amsterdam last week. How to define a crisis, and whether the CRM can change the types of mitigation measures platforms can implement, were debated during a panel on disinformation. And Lorna Woods, professor of internet law at the University of Essex, questioned whether the CRM can effectively address the issues it was created to tackle, especially when crises are “multiple and distinct” and the mechanism is unilaterally triggered by an EU Board that has yet to be finalized. Some of Woods’ fellow panelists discussing the “DSA in times of crisis” also pointed out that, under the ambiguously defined CRM, the US elections could be considered a “crisis” – and probably should be, regardless of the Commission’s future interpretation.

6. The DSA will be co-opted for political ends. The European Commission must be ready.

It’s only a matter of time before political actors seize on the successes and failures of the DSA. During a panel on “the DSA in times of crisis,” David Kaye, professor of law at the University of California, Irvine, worried that the DSA’s “built-in review process” will be weaponized by political actors. This could mean framing the DSA as a “failed experiment" in hopes of killing the project, or ramping up enforcement against VLOPs for political ends, especially with the aim of censorship. Others, like Daphne Keller, worried that “rule of law” constraints leave an opening for additional politicization. Clearly defining markers of “success” matters, so that when these moments arrive the Commission can say whether the internet is definitively better off because of the DSA.

Large questions looming over the DSA

The experts gathered in the room had as many questions as answers. Their concerns represent a starting point for future analysis.

  • The DSA is not about content moderation; it’s about systems. How do we balance rights with systems? And can platform “resilience” be an alternative way of thinking about “top-down moderation”? (Questions posed by Lorna Woods and Martin Husovec, respectively.)
  • Who do we want to control users’ data? Data should not merely be transferred from platforms to the state. It’s better for a system to control users’ data than for one person, company, or regulatory agency to do so, according to Kate Klonick, associate professor of law at St. John’s University School of Law.
  • How will Digital Services Coordinators oversee DSA enforcement for non-VLOPs? Ireland’s Digital Services Coordinator John Evans, who is also commissioner of Ireland’s new media regulator Coimisiún na Meán, says his office is so far categorizing platforms by risk factors, such as how many children use a service and its overall reach, then matching each risk level with an “attention strategy” that will be refined over time.
  • How do we define ‘transparency’ under the DSA, and what is the Commission really after in compelling it? Transparency is not an end unto itself, and obligatory data disclosures may have little impact on user empowerment, according to Rainer Mühlhoff of the Universität Osnabrück. The Commission must think critically about why it wants the data it seeks and whether it aligns with the goals of the DSA, like effectively mitigating harms to society.
  • Hope (and skepticism) for Article 40 of the DSA. Under Article 40(12), researchers who meet certain conditions can request access to publicly accessible data from VLOPs and VLOSEs to conduct research on systemic risks in the EU. While the principles are clear, to what extent can researchers currently rely on Article 40(12) for research involving automated data collection like scraping? Some designated VLOPs, like TikTok, are not opening their application programming interfaces (APIs) to the public, which the Mozilla Foundation argues is a “minimum viable method for permissioned access to public data.” How will the Commission ensure effective data access for researchers under Article 40 now that the DSA is in full effect?

What is clear is that the DSA's implementation will surface an enormous amount of data and information for a burgeoning community of academics, regulators, policymakers, platform executives, and consultants to sift through going forward. Clarity on any of these issues can only come with time.

Update 2/21/24: A direct quote was added to more accurately reflect panel remarks made by Barbora Bukovská.

Authors

Gabby Miller
Gabby Miller is a staff writer at Tech Policy Press. She was previously a senior reporting fellow at the Tow Center for Digital Journalism, where she used investigative techniques to uncover the ways Big Tech companies invested in the news industry to advance their own policy interests.
