
A New Animal Joins the Circus: On the Nature and Value of Out-of-Court Dispute Settlement Under the DSA

Niklas Eder, Giovanni De Gregorio / Jun 1, 2024

This piece is part of a series published to mark the first 100 days since the full implementation of Europe's Digital Services Act on February 17, 2024. You can read more items in the series here.


Article 21 of the EU’s Digital Services Act (DSA) allows national Digital Services Coordinators (DSCs) to certify organizations to settle content moderation disputes. These organizations, out-of-court dispute settlement (ODS) bodies, can then issue decisions on whether content moderation actions, such as the removal of posts or the demotion of content, were justified or not. These decisions will be non-binding, but platforms have an obligation to cooperate with them in good faith. Since its creation, Article 21 of the DSA has been perceived as a strange provision. It inspires the legal imagination, prompting visions of centralized institutions such as a proposal for the development of a European Centre for the Settlement of Disputes on Social Media Content Moderation. It also gives rise to fundamental objections and concerns, for instance regarding the rule of law. A new animal, it seems, has joined the content moderation circus.

This article describes the new and unique nature of ODS bodies and proposes answers to the following questions:

  • What will be the role of ODS bodies under the DSA?
  • How are they different from other players already operating in the content moderation space?
  • What contribution can they make to improve platform accountability and protect rights and democracy?

In order to answer these questions, we underline three key elements central to the value of ODS bodies: their independence, their ability to provide well-reasoned decisions which are rooted in fundamental rights, and reporting based on granular data analysis.

The New Animal

ODS bodies are different from anything we know so far in the world of content moderation because they are intended to provide independent review of content moderation decisions at scale. They are supposed to cover a great variety of content moderation actions, including but not limited to removals, demotions, account bans, and monetization limitations. Their scope includes the review of decisions taken on the basis of platform terms and conditions as well as national laws, and they are expected to account for fundamental rights.

The ODS regime under the DSA goes beyond existing remedy mechanisms in several ways. Mechanisms like the Oversight Board, which has structures in place aimed at ensuring its independence, issue only a handful of decisions each year and do not cover the broad spectrum of content moderation actions appealable under the DSA. Organizations such as the FSM, which operated under the German NetzDG law that has since been replaced by the DSA, only reviewed potentially illegal content. The creation of independent bodies with a legal mandate to review all sorts of content moderation actions is therefore unprecedented.

Those who follow the content moderation debate closely may be surprised by the EU’s ambition to create a landscape of independent bodies tasked with reviewing content moderation decisions at scale. At-scale content moderation, per the common wisdom, is an impossible task, making the reliance on AI systems to moderate indispensable. Some scholars go so far as to argue that individual remedy is not only practically impossible, but that relying on individual rights to rein in the tech giants is also conceptually flawed. Instead of focusing on individual remedies, we should therefore focus on systemic perspectives. According to this view, individual rights do not provide the appropriate starting point to protect the rights of users and citizens, and the functioning of democracy. Hence, the obvious question is: What can be the added value of creating a system of independent at-scale review of content moderation decisions?

Additional Remedy and Independent Review

We can start our search for value in out-of-court dispute settlement in a modest manner. While the task of at-scale review is no easier for ODS bodies than for platforms, at the very least ODS bodies give users an additional remedy and an independent review. They serve as an alternative or addition to platforms' internal complaint handling systems, in which decisions upon appeal are often made in the same manner as the original decision. Yes, ODS bodies will also make mistakes. But because they are different animals from platforms and independent of them, they will likely make different mistakes than platforms.

There are many reasons why ODS bodies will operate differently from platforms when reviewing content. These become obvious when applying a law and political economy perspective. The main reason is that their decision-making processes are not subject to the same economic logic as platforms’ content moderation. ODS bodies are not driven by commercial interests. They will not have advertising revenues at the back (or front) of their minds when deciding how to design decision-making processes and distribute resources. Because they can charge fees for every case, they will not be tempted to focus their resources on the most lucrative markets. ODS bodies are different animals from platforms because they have different incentives and motivations. They can add value to the content moderation landscape by providing independent review that is not embedded in the logic of surveillance capitalism. Instead, their foundation is the regulatory environment created by the DSA, which is, in turn, rooted in European constitutional law and the rule of law.

Well-Reasoned Decisions Accounting for Fundamental Rights

Having discussed the value of ODS bodies based on the premise that they may not perform the task of at-scale review better than platforms, we can now be more ambitious. What does it take for ODS bodies to save individual remedies in the content moderation space from the death so many predict? How can we ensure that ODS bodies do not only provide an additional, alternative layer of review, but a better one? To save individual remedy as an accountability mechanism, ODS bodies must not fall into the trap of providing only performative review and replicating sweatshop-style content moderation. Instead, they need to issue high-quality decisions that account, where relevant, for fundamental rights, and they need to provide detailed explanations of their decisions, helping users understand what the rules are and why they may have been violated.

ODS bodies delivering well-reasoned decisions may not only ensure the meaningfulness of their work but may also have a broader impact on the content moderation landscape. Doing so may increase pressure on platforms to improve the way in which they justify their own content moderation decisions and set a benchmark against which regulators can measure platforms' content moderation decisions. To date, platforms’ content moderation decisions are accompanied by notoriously poor statements of reasons. They often consist of generic messages stating that one of the platform's policies may have been violated, as underlined by the transparency database published by the European Commission. Platforms’ statements of reasons are in blatant violation of the DSA’s rules, which require detailed explanations (Art. 17 (3) DSA) and accounting for fundamental rights (Art. 14 (4) DSA).

The poor quality of platforms’ justifications of content moderation decisions can hardly be acceptable to a regulator that prides itself on effective enforcement, such as the Commission. Why, then, has the Commission not stepped in yet? We can only speculate that it may be hesitant to set a high bar for statements of reasons. It may well be that platforms make a seemingly compelling argument that providing well-reasoned decisions at scale is impossible (an argument that may be challenged in a world where generative AI systems are robustly employed in content moderation). ODS bodies may be able to prove the contrary, and this is where their greater impact lies: demonstrating that Art. 17 DSA and Art. 14 (4) DSA can and should be enforced.

ODS Bodies as Innovators

Admittedly, providing well-reasoned decisions at scale is a daunting task. But ODS bodies may be best suited to tackle it and to begin making progress on what may be a long but not hopeless journey. They can charge platforms reasonable costs for their decisions, allowing them to allocate the resources required to provide explanations that are meaningful and satisfy the conditions established by the DSA. ODS bodies may be able and willing to invest in processes and systems to conduct fundamental rights assessments efficiently and generate well-reasoned decisions at scale. Due to their unique funding scheme and independence, ODS bodies may have the resources and incentives to become innovators.

Concretely, ODS bodies can position themselves at the forefront of an exciting development: the integration of large language models (LLMs) into content moderation processes. Research on how to use LLMs for this purpose is still in its early stages, but it is developing quickly, with scholars exploring their potential and challenges. ODS bodies need to develop strategies for leveraging the potential of LLMs while ensuring human control and decision-making. LLMs should not make final determinations, but they can help identify relevant provisions and translate human considerations into accessible explanations, as the sketch below illustrates. ODS bodies should develop scalable concepts for fundamental rights assessments and transparent systems to carry out these assessments. They should cooperate with researchers and civil society organizations to ensure their systems satisfy algorithmic transparency requirements while also pushing the frontiers of content moderation.
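To make this concrete, below is a minimal, purely illustrative sketch of such a human-in-the-loop workflow, written in Python. It is not a description of how any ODS body or platform actually works: the names (ReviewCase, suggest_provisions, draft_statement_of_reasons) are hypothetical, and the two steps an LLM would perform are replaced with trivial stand-ins so that the example runs on its own. The design point it illustrates is that the model only narrows the search and drafts text, while the final determination is set exclusively by a human reviewer.

```python
from dataclasses import dataclass, field


@dataclass
class ReviewCase:
    case_id: str
    content: str                       # the disputed post or action context
    platform_decision: str             # e.g. "removal", "demotion", "account suspension"
    candidate_provisions: list[str] = field(default_factory=list)
    draft_reasons: str = ""
    final_decision: str | None = None  # set only by a human reviewer


def suggest_provisions(case: ReviewCase, policy_index: dict[str, str]) -> list[str]:
    """Stand-in for an LLM/retrieval step that flags potentially relevant rules.
    Here it is naive keyword matching, purely to keep the sketch self-contained."""
    content_tokens = set(case.content.lower().split())
    return [policy_id for policy_id, text in policy_index.items()
            if content_tokens & set(text.lower().split())]


def draft_statement_of_reasons(case: ReviewCase) -> str:
    """Stand-in for an LLM drafting step that turns the reviewer's considerations
    into an accessible explanation. A real system would call a language model here."""
    provisions = ", ".join(case.candidate_provisions) or "no matching provisions"
    return f"The contested {case.platform_decision} was reviewed against: {provisions}."


def human_final_determination(case: ReviewCase, uphold: bool) -> ReviewCase:
    """The final determination is always made by a human, as argued above."""
    case.final_decision = "uphold" if uphold else "overturn"
    return case


if __name__ == "__main__":
    policies = {
        "spam": "repetitive unsolicited commercial content",
        "hate speech": "attacks on people based on protected characteristics",
    }
    case = ReviewCase("ODS-001", "unsolicited commercial content posted repeatedly", "removal")
    case.candidate_provisions = suggest_provisions(case, policies)
    case.draft_reasons = draft_statement_of_reasons(case)
    print(human_final_determination(case, uphold=True))
```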

Reporting About Policy Enforcement

Finally, ODS bodies can contribute to making other mechanisms of the DSA more meaningful through detailed reporting. Article 21 (4) DSA requires ODS bodies to report data to the national DSCs relating to their functioning, the disputes they decided on, the time taken to resolve them, and the difficulties they encountered. Based on this data, the DSCs will issue reports. For reporting by ODS bodies to make a meaningful contribution to the content moderation landscape, it should go beyond the data points required by Article 21 (4) DSA, and it should not have DSCs as its sole audience. The reporting of ODS bodies should include granular data about platforms’ enforcement of policies. It should respond to the needs of civil society organizations that aim to identify systemic risks. ODS bodies therefore need to engage with these organizations to understand their needs and focus on useful data points.

Complementing the data that platforms are obliged to share under Art. 40 DSA, aggregated data collected by ODS bodies can provide an additional source of information to hold platforms accountable. Data from ODS bodies can contribute to assessing the risks platforms need to respond to under Art. 34 DSA, such as the dissemination of illegal or harmful content. It can inform the selection of appropriate mitigation measures under Art. 35 DSA, such as how platforms should adapt their terms and conditions and their enforcement systems. Through their reporting, ODS bodies can help bridge a gap in the content moderation accountability space: individual remedy and systemic perspectives need not be conceived as two separate components of content moderation accountability. Insightful reporting by ODS bodies, combined with productive relationships with civil society organizations advocating for systemic accountability, can help both components go hand in hand.
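As a purely hypothetical illustration of what such granular reporting could involve, the following sketch aggregates ODS case outcomes into overturn rates per platform and policy, the kind of breakdown that could point civil society organizations and DSCs to systemic enforcement problems. The record fields and the metric are assumptions made for this example; Article 21 (4) DSA does not prescribe any such schema.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass(frozen=True)
class CaseRecord:
    platform: str        # e.g. "Platform A"
    policy: str          # policy the platform invoked, e.g. "hate speech"
    action: str          # e.g. "removal", "demotion", "account suspension"
    outcome: str         # ODS result: "upheld" or "overturned"
    days_to_decide: int  # time taken to resolve the dispute


def overturn_rates(records: list[CaseRecord]) -> dict[tuple[str, str], float]:
    """Share of platform decisions overturned, broken down by platform and policy."""
    totals: Counter = Counter()
    overturned: Counter = Counter()
    for record in records:
        key = (record.platform, record.policy)
        totals[key] += 1
        if record.outcome == "overturned":
            overturned[key] += 1
    return {key: overturned[key] / totals[key] for key in totals}


if __name__ == "__main__":
    sample = [
        CaseRecord("Platform A", "hate speech", "removal", "overturned", 12),
        CaseRecord("Platform A", "hate speech", "removal", "upheld", 9),
        CaseRecord("Platform B", "spam", "demotion", "overturned", 20),
    ]
    for (platform, policy), rate in overturn_rates(sample).items():
        print(f"{platform} / {policy}: {rate:.0%} of decisions overturned")
```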

What It Takes to Create a Meaningful ODS Ecosystem

As the vision outlined here demonstrates, ODS bodies can provide meaningful remedies for users and have a positive impact on the content moderation space more broadly. Despite their potential, we should also consider the challenges that come with externalizing decisions on rights and freedoms to entities outside the traditional realm of public law. This challenge is one of the core debates in digital constitutionalism. The expansion of private forms of justice provides alternative systems for users, but it also requires constitutional caution to ensure that private ordering does not take the lead and dilute the protection of fundamental rights and democratic values. For these reasons, it is critical to keep in mind the importance of oversight by public institutions, the centrality of the rule of law in defining the proceduralization of content moderation, and the importance of embedding ODS bodies in the larger content moderation accountability system.

It should be obvious that the success of ODS bodies should not be made dependent on voluntary support by platforms, such as through a shadow accreditation system. The point of regulation is not to depend on self-regulation and voluntary commitments by those who are regulated. Rather, the entities enforcing regulation, the Commission and the Member States’ DSCs, have a mandate and an obligation to ensure its effectiveness. For national DSCs, who are competent to certify ODS bodies and to oversee whether platforms comply with the obligation of good-faith cooperation with those bodies, this provides an opportunity to lead on an important aspect of the DSA and to position themselves on a level with the Commission.

Besides regulators, civil society organizations and academics should work closely with ODS bodies to help solve challenges and answer the hard questions ODS bodies will encounter in their work. ODS bodies can become organizations implementing solutions to some of the most difficult questions of at-scale review of content moderation decisions, but they cannot develop these solutions without the cooperation of relevant players in the content moderation space. They need to find their unique role in the circus that they join and make sure their part fits well into the greater show.

Authors

Niklas Eder
Niklas Eder (LLM and Maître en Droit) is the Digital Policy Postdoctoral Researcher at the Centre for Socio-Legal Studies at Oxford Law School, a Visiting Lecturer at King’s College London, and the Co-Founder of User Rights.
Giovanni De Gregorio
Giovanni De Gregorio is the PLMJ Chair in Law and Technology at Católica Global School of Law and Católica Lisbon School of Law. His research interests deal with digital constitutionalism, freedom of expression, privacy and data protection law, and digital policy. Giovanni is the author of the monograph...
