What We Don't Know About DSA Enforcement
Ramsha Jahangir / Apr 8, 2025
Audio of this conversation is available via your favorite podcast service.
At Tech Policy Press, we've been closely following the implementation of the Digital Services Act (DSA), the European Union law designed to regulate online platforms and services.
On April 4, The New York Times reported that the European Commission is considering fining X, formerly Twitter, as part of its ongoing DSA investigation, which began in 2023.
We've discussed at length the extent and quality of transparency from platforms under the DSA, but there is surprisingly limited insight into how the Commission is investigating very large online platforms, or VLOPs, the largest online services covered by the DSA. In most cases, the publicly available documents on these cases are just press releases, and the Commission's enforcement strategies and methods are not spelled out.
To delve into the challenges this lack of transparency presents and how it impacts the public's understanding of the DSA, I spoke to two researchers:
- Jacob van de Kerkhof, a PhD researcher at Utrecht University. His research is focused on the DSA and freedom of expression.
- Matteo Fabbri, a PhD candidate at IMT School for Advanced Studies in Lucca, Italy. Fabbri is also a visiting scholar at the Institute for Information Law at the University of Amsterdam. He recently published a research article titled "The Role of Requests for Information in Governing Digital Platforms Under the Digital Services Act: The Case of X."

Brussels, Belgium – View of the statues decorating the Berlaymont, the headquarters building of the European Commission. Shutterstock
What follows is a lightly edited transcript of the discussion.
Ramsha Jahangir:
Our discussion today centers on transparency about the DSA's enforcement process. We'll unpack why there's a scarcity of publicly available legal documents detailing the methods, sources, and outcomes of these investigations. This lack of transparency raises critical questions about democratic accountability and the public's right to know. To delve into the challenges this lack of transparency presents and how it impacts the public's understanding of the DSA, I'll be speaking to two fantastic DSA researchers.
For listeners who may not be familiar with your work, could each of you tell us about yourself and what aspect of the DSA you're looking into?
Jacob van de Kerkhof:
My name is Jacob van de Kerkhof. I'm a PhD researcher at Utrecht University. I work specifically on the Digital Services Act and freedom of expression. I've done a bit of research on DSA enforcement at the level of the digital service coordinator.
Matteo Fabbri:
I am Matteo Fabbri. I am a PhD candidate at IMT School for Advanced Studies in Lucca, Italy. I am also a visiting scholar at the Institute for Information Law at the University of Amsterdam. In my research, I look at the influence of recommender systems on online users in the context of the DSA, and I have also been looking at the systemic risk framework and how it could be defined within the DSA. My research bridges philosophy, law, and empirical social science.
Ramsha Jahangir:
I'm very happy to have you two here, and looking forward to this very important and timely conversation. So maybe to kick off, let's start with the key procedural steps in a typical DSA investigation. Matteo, you recently published a great paper on the role of requests for information under the DSA and looked at this question in depth. So maybe you can walk us through the first stage and what a typical investigation looks like.
Matteo Fabbri:
So usually, to start an investigation, the Commission would first send requests for information to very large online platforms that it has selected as potentially infringing the DSA. A request for information can be a simple request, in which the Commission just sends a request to the platform, or it can be a request by decision, in which the Commission actually issues an order for the platform to provide information.
The second case, for example, was the one involving TikTok's role in the annulment of the Romanian presidential election that occurred recently. In that case, the Commission issued a retention order for TikTok's data, so it was a request by decision.
Otherwise, the Commission also has further investigative powers, ranging from requesting interviews with people who might be involved with the platform or knowledgeable about what the platform is doing, to entering the platform's premises to access files and data that the platform might not otherwise disclose. It can do the same with the independent auditors that platforms have hired to carry out the independent audit required by Article 37 of the DSA.
So these are the most important steps. As far as I know, only requests for information have been issued so far. There might be some other procedures that have not been publicly announced by the Commission itself.
Following these investigative steps, the Commission might decide to start proceedings, that is, to initiate infringement proceedings, and for this it has to issue a decision. These decisions are usually also announced through press releases, in the same way that requests for information are announced. But some of them are actually available in full, and reviewing them can be a useful way of analyzing the Commission's strategy and approach to enforcing the DSA, which priorities it has, and what it is focusing on.
However, the full legal documents on decisions to initiate proceedings are available in only a minority of cases, and the first was the one concerning X. That is the one I used as a case study in my paper.
Ramsha Jahangir:
Do we have a figure for the total number of requests for information, or RFIs, that have been made publicly available?
Matteo Fabbri:
Actually, there isn't a number that has been published by the Commission itself. But in some public talks I have seen, given by officers working on DSA enforcement, the figure was more than 40 requests for information at this stage.
Ramsha Jahangir:
How many of them have prompted investigations or contributed to the opening of proceedings?
Matteo Fabbri:
I don't think there is a one-to-one correspondence between requests for information and the initiation of proceedings. Usually requests for information are sent in batches that correspond to a specific topic, for example the role of recommender systems and platform design on YouTube, TikTok, Instagram, and other platforms, or, in other cases, the way platforms deal with the presence of illegal products, especially in the case of marketplaces.
So I think the evidence that comes from the requests for information is then used to fill the Commission's knowledge gaps about specific areas of DSA enforcement, and some of these gaps might lead to the initiation of proceedings. But there is no one-to-one correspondence between a request for information and the initiation of proceedings.
In the case of X, which I reviewed in my paper, there was actually a closer connection between the request for information and the decision, because it was the first one to be issued, and the Commission didn't need to collect a lot of information before deciding to initiate the investigation and the infringement proceedings.
Ramsha Jahangir:
Can you walk us through that? What was that process like in the case of X that you've explored?
Matteo Fabbri:
In the case of X, the request for information that the Commission sent out focused on the assessment and mitigation of risks related to the dissemination of illegal content, disinformation, gender-based violence, and any negative effects on the exercise of fundamental rights, the rights of the child, public security, and mental well-being. The Commission aimed to scrutinize X's policies and actions regarding notices on illegal content, complaint handling, risk assessment, and measures to mitigate the risks identified.
Following the submission of this request for information, the Commission identified five areas of concern in which the platform is suspected of having infringed DSA provisions. First, the Commission argued that X failed to diligently assess the systemic risks stemming from the design and functioning of X and its algorithmic systems, and from the use made of its services. This concern also covers the effectiveness of the mitigation measures addressing negative effects on civic discourse and electoral processes, which the Commission considered inadequate.
The second concern regards the content moderation measures that X implemented through Community Notes, basically a collaborative, user-based way of flagging potentially policy-violating content. The Commission thought that this feature is not sufficient to tackle content moderation in a comprehensive way.
The Commission also assessed that X did not take decisions in a diligent, non-arbitrary, and objective manner, and did not respond without undue delay to the notices of illegal content that users flagged on X. So this all concerns the inadequacy of X's content moderation.
Another main point of concern is the blue check mark policy: users may mistakenly take the check mark as a sign of an account's authoritativeness, but it can simply be bought, so it is not really a way of identifying authoritative content on the platform. Another point was the absence of searchable and reliable tools to query the ad repository required by Article 39 of the DSA, the repository that should allow users to search and identify which kinds of target groups are used for online advertising.
The last point concerned the fact that X denied access to data under Article 40(12), that is, access to public data through the API. This request for information prompted the opening of the infringement proceedings, which, in the case of X, followed these same topics. So there is, in this case, a one-to-one correspondence between the topics highlighted in the press release and the topics considered in the full Commission decision that is available as a legal document.
The Commission decision goes through these five main topics, and the preliminary outcomes of the investigation, released in the summer of last year, focus on three of these areas: dark patterns, in particular the blue check mark; advertising transparency; and data access for researchers. For the other two points, those more related to content moderation and complaint handling, the investigation continues.
In its preliminary findings, the Commission argued that it found infringements of the DSA in these three areas. In particular, it argued that X designs and operates its blue check mark in a way that deceives users, because anyone can subscribe to obtain the verified status.
Secondly, X does not comply with the required transparency on advertising, as it does not provide a searchable and reliable advertisement repository, and so it does not allow users to monitor emerging risks stemming from advertising. Thirdly, it fails to provide researchers with access to its public data in line with Article 40(12) of the DSA.
To recap, there is a close connection between the request for information, the initiation of proceedings, and the preliminary findings, especially for the three areas mentioned: dark patterns, advertising transparency, and data access. I think this sums up my case study.
Ramsha Jahangir:
Thanks so much for that very comprehensive recap, Matteo. Jacob, you've been following the role of Digital Services Coordinators, or DSCs, very closely. Could you tell us a little bit about the role of different actors within the DSA and how that contributes to transparency around the enforcement of this regulation?
Jacob van de Kerkhof:
Absolutely. While the Commission is taking center stage in DSA enforcement, as it should, given its competence over VLOPs, there is a lot happening at the national level as well. The advocacy group European Digital Rights has a great database online in which you can check all the complaints that have already been filed with national Digital Services Coordinators, and that allows you to see what's going on.
Because of the competency structure in the DSA, many of those complaints land with the Irish media commission. So they'll have to handle all the information requests and perhaps even issue some of the fines that could come out of non-compliance with the DSA.
So there is a lot happening on that front. Aside from the Digital Services Coordinators, which follow a roughly similar structure to the one Matteo outlined for the Commission's DSA enforcement, with, of course, the added complexity of the competency structure, there are also initiatives by private individuals or NGOs already trying to enforce the DSA in front of national courts.
Here in the Netherlands, we have the case of Danny Mekić, who successfully sued Twitter, or X, for not providing sufficient information upon shadow banning him and for not complying with GDPR requests. In Germany, you have a case of two German NGOs suing X for non-compliance with Article 40(12) on research data access.
Those cases are, of course, very transparent, because they are court decisions that you and I can simply access to see what's going on. As for the rest of DSA enforcement, all the actors that are part of the DSA content moderation framework, meaning the out-of-court dispute settlement bodies and the trusted flaggers, have quite extensive transparency reporting obligations.
So we as users, but also the Commission and regulators, can see at a pretty granular level, for example, what content trusted flaggers are reporting and what action platforms take upon those notices. The question on that end is, what does that information lead us toward? If we know that all of this is happening, can we expect a very proactive approach by platforms in tackling all of this, but then also a very proactive approach to enforcement by the Digital Services Coordinators and the European Commission in cases of non-compliance?
I think a prime example here could be the transparency database, the Commission-maintained database to which platforms are required to upload all of the statements of reasons for their content moderation decisions that they send to users. If you cross-reference that database with the transparency reporting that platforms do, you already see that those transparency reports don't match the content moderation actions that the platforms claim to take in their transparency reporting.
So the question then is, will that lead a very proactive DSC or the Commission to take action and say, "Hey, look, that's a great transparency report you've got there, but only one of these can be correct"? You can either upload correctly into the database or have a correct transparency report, but at this point you cannot have both, because they don't seem to match.
I think the DSA adds some transparency to the process. I fully agree with Matteo's finding that seeing press releases on the Commission's requests for information, and similarly for Digital Services Coordinators, doesn't really tell us much about what they're actually investigating and what they find in that process. But there are so many avenues for the Commission and DSCs to inform themselves throughout this whole DSA framework that, hopefully, this leads to a very proactive role in enforcement, and maybe that, at some point, leads to more transparency from the Commission and the DSC side as well.
Ramsha Jahangir:
Matteo, from your perspective, your study also looked at the role of the DSA enforcement team, which includes DG CONNECT at the Commission, the European Center for Algorithmic Transparency, and external contractors working with the Commission to support enforcement activity. So, what are your findings on the role of these actors?
Matteo Fabbri:
The role of these actors is actually not clearly outlined by the actors themselves. We know that the DSA enforcement team, which is essentially one unit in DG CONNECT, unit F2, carries out the investigative work that supports the initiation of proceedings. This work is also partially supported by the European Center for Algorithmic Transparency, or ECAT, which is part of another directorate-general, the Joint Research Center. Inside ECAT there are two teams, one mostly devoted to research and the other devoted to inspections.
Based on my knowledge, the inspection team works closely with the DSA enforcement team at DG CONNECT. However, we don't know to what extent ECAT's inspection work intertwines with the inspection work done by the DSA enforcement team. We also don't know how this work intersects with the work of the Board for Digital Services, which includes the Digital Services Coordinators, as the DSA actually foresees close collaboration among a wide range of actors, including other agencies of the Union.
For example, we know that there are DSA officers in the Commission's national representations in the member states, and also in the United States, in San Francisco. All of these actors contribute to the enforcement process, but based on different strategies and from different perspectives. Within the DSA enforcement team, there are legal officers, policy officers, and technical officers, also called data scientists.
There are also working groups within the Board for Digital Services; these can actually be seen in the minutes of the board. However, there is no publicly available information on what kinds of strategies and developments each working group has produced since the beginning, in February 2024. And as of this year, February 2025, the minutes have not been shared anymore, so for the last two or three meetings of the board.
There are also contributions, as you have mentioned, by external contractors that have been hired by the Commission through a call for tender worth more than 12 million euros. Even the names of the winners of the call for tender have not been shared, which is quite a particular approach, given that in most cases the winners of public tenders should be announced. However, the work of these contractors has started, and people in networks that collaborate with them know who they are.
The main picture that emerges from this is a lack of transparency regarding the strategy and the actual practice of the enforcement process. On one side, we don't know on what basis the documents they produce are shared; they share some decisions, but not most of them. On the other side, we don't know anything about how they decide on their enforcement strategies.
The DSA, for example, requires standards to be produced on different issues, but the first set of guidelines that the Commission announced it was willing to produce was on the protection of minors, while on other topics, such as recommender systems, choice interfaces, and personalization, it did not announce any set of guidelines or standards, even though these points are also part of the DSA.
This lack of transparency does not allow us as researchers to inspect the enforcement process, or to know how to contribute to the knowledge that the enforcement process needs in order to inspect and investigate all the very large online platforms, which number 25 up to now. It seems very much an approach in which the Commission relies on trusted stakeholders from whom it seeks to gather information, but does not share its own information on the enforcement process.
On this front, even the European Ombudsman has sent a letter to the president of the Commission regarding a case raised by Alexander Fanta, a journalist at Follow the Money, who in 2023 requested access to a systemic risk report. The Commission denied the access request, even though the report would later be published.
This action by the European Ombudsman can be a channel for the Commission to clarify the way in which it applies the general presumption of confidentiality that it has stated as part of its approach to sharing information on enforcement. So it appears that the Commission wants to keep as much information as possible confidential. However, it has not clearly outlined on what basis, and to what extent, it considers this information confidential, but I am sure we will come back to this.
Ramsha Jahangir:
I have a broader question based on your response. There is obviously a need to protect sensitive information, but there's also a need to ensure meaningful transparency, which is the purpose of the DSA. So how can regulators strike this balance? This is a question for both of you; I'm curious to hear your perspectives. What channels are available? You've already pointed to this case involving the Ombudsman and the Commission's administration, but are there any mechanisms in the DSA broadly to challenge or appeal confidentiality decisions by regulators?
Matteo Fabbri:
I think, in general, the DSA does not outline any form of access to enforcement documents. However, it does state the cases in which professional secrecy applies, which are, in general, the cases in which information is considered confidential. But, again, we don't know which information should be considered confidential, or to what extent.
Most of the reasons through which the Commission justifies this confidentiality go back to the regulation on public access to European Union documents, that is, the documents of the Parliament, the Council, and the Commission. In most cases, they refer to the exceptions to public access for documents that concern public security, defense and military matters, international relations, or the financial, monetary, and economic policy of the Community or a member state.
But the set of motivations the Commission focuses on most is protecting the commercial interests of the VLOPs and VLOSEs, the very large online search engines, and the argument that disclosing information would undermine the protection of inspections, investigations, and audits.
They refer to these exceptions, which are outlined in Article 4 of the European regulation on public access to documents, but they do not specify any further motivation, nor do they specify how these general exceptions apply to the DSA itself. That is also why the Ombudsman asked to what extent the information should be considered confidential.
This was asked explicitly, and we still do not have the answer to this question. But I think when the answer is published, we will have more indication as to whether the Commission actually wants to take a precise stance on which documents can be disclosed.
Jacob van de Kerkhof:
It's very hard to add to that, because Matteo gave such a great and comprehensive overview of the considerations there. I suppose there are two small points I could add. One is that, in the very politicized DSA and tech policy landscape we currently have, some level of transparency at the level of the Commission could perhaps help people understand its enforcement choices and provide an additional layer of legitimacy. It might then be easier to explain to our former allies in the United States why we've opted to regulate these platforms and, therefore, to create a better transatlantic understanding of why we're enforcing the DSA in the way that we do.
I think that transparency on those choices, like why we start with enforcing against X and why that is legitimate under current EU law, can help overcome a lot of misunderstanding about what the DSA actually is. I think the Commission, and also the platforms themselves, are quite fearful of releasing too much information about the internal workings of the platforms. They fear potential misuse by malicious actors who, once they know how a platform works and how it makes decisions, might use that information to the detriment of society.
I wonder to what extent that concern is legitimate. I think you have to weigh it within the framework that Matteo sketched before. But it could also be a legitimate reason to be a little bit withholding about whether to release all the information you have. So those would be my very small addendums to your question.
Ramsha Jahangir:
It was a pleasure having you here. This has been a very short but informative discussion. Thank you so much for your time.
Jacob van de Kerkhof:
Thank you.
Matteo Fabbri:
Thanks for inviting me.