Time to Enforce: The EU’s Digital Rulebook is On the Line

Mark Dempsey / Nov 16, 2023
Mark Dempsey is Senior EU Advocacy Officer for ARTICLE 19, based in Brussels.
This article partially draws on the content of a panel discussion that took place at the annual Computers, Privacy, and Data Protection (CPDP) conference in Brussels in May of this year. Hosted by ARTICLE 19, the panel ‘Bridging the gap - enforcing the Digital Markets Act (DMA), learning from the data protection experience’ sought to address several challenges with regard to the enforcement of the DMA rules that interplay with the EU frameworks for data protection, as well as other end users’ rights.
The Digital Markets Act (DMA) is the EU’s new regulatory framework imposing obligations on large platforms in digital markets. It aims to ensure that markets for core platform services (of which 22 have been identified) are ‘contestable and fair,’ which, broadly speaking, means that gatekeepers must not ‘impose unfair conditions on businesses and end users.’ Introduced by the European Commission as a proposal in December 2020, and following a thorough legislative process and inter-institutional negotiations between the European Parliament and the Council, the DMA became law on November 1, 2022, with most rules becoming applicable in May 2023.
Under the DMA, the European Commission can designate companies as gatekeepers if they fulfill three criteria: 1) the company has a strong economic position and a significant impact on the EU Single Market; 2) one of its core platform services operates as ‘an important gateway for business users to reach end users’; and 3) the company ‘enjoys an entrenched and durable position in its operations, or it is foreseeable that it will enjoy such a position in the near future.’ Within six months of being designated, gatekeepers must comply with the DMA’s key obligations and prohibitions. On September 6, 2023, the Commission designated six gatekeepers: Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft.
The importance of choice and the interplay with the GDPR
The recitals of the DMA make clear that two overarching goals of the legislation are contestability and fairness, with the latter appearing 55 times and the former 17 across the text and recitals. But for these goals to be achieved, they must be viewed not as separate and distinct but as strongly interlinked. All provisions, commitments, and remedies must be considered and assessed through this interpretive lens. The other fundamental aim of the DMA is to guarantee choice, which is itself a function of contestability and fairness. Again, it is no coincidence that ‘choice’ appears 23 times in the text and recitals, more often than ‘contestability.’ Reading the recitals together with the relevant articles, it becomes apparent that the DMA aims to address business models by imposing red lines in some cases and ‘nudges’ in others.
If we take a concrete example of a browser such as DuckDuckGo and examine Article 15 together with Recital 72, the outcomes being sought by the Commission become clearer. Recital 72 establishes that:
The data protection and privacy interests of end users are relevant to any assessment of the potential negative effects of the observed practice of gatekeepers to collect and accumulate large amounts of data from end users. Ensuring an adequate level of transparency of profiling practices employed by gatekeepers… facilitates contestability of core platform services.
Article 15 requires gatekeepers to ensure that their transparency measures related to profiling are consistent with the GDPR, including by “providing an independently audited description of the basis upon which profiling is performed.” Here, the Commission’s aim in the DMA is not just to impose transparency obligations that alert end users to the nefarious nature of profiling, but also to enable services that do not engage in profiling to better differentiate themselves in the marketplace.
DuckDuckGo offers stronger privacy protections, yet it is not widely adopted, since users are generally unaware of the profiling enabled by other, more mainstream browsers. When alerted to the extent of profiling by the browsers of companies designated as gatekeepers, users may be more likely to consider switching to a more privacy-oriented browser such as DuckDuckGo. In turn, this could prompt a change in behaviour from the gatekeeper in an effort to regain users. Further pressure comes from the obligation in Article 15(3) to ‘…make publicly available an overview of the audited description...’ of its profiling of users, which will be evaluated by the European Data Protection Board for consistency with the GDPR.
Calls for a holistic approach to the Commission’s enforcement efforts
At ARTICLE 19, we have long been concerned with the excessive market power from which Big Tech profits and the many aspects of its business models that are incompatible with users’ fundamental rights. The DMA can enable a marketplace that is more responsive to users’ rights, but only if the obligations on gatekeepers are not viewed in silos or imposed in a fragmented manner. If compliance with these new rules requires only minor adjustments to the gatekeepers’ practices, they will have little impact on their business models and, consequently, on their gatekeeping role.
To ensure the Commission’s enforcement efforts do not just result in a ‘tick the box’ exercise, we propose the following:
- Publish an ‘enforcement plan’ which identifies priorities and marks enforcement milestones.
- Consider a recent submission by the German government, ‘Proposals from Germany for a strong enforcement of the DMA,’ which, among its seven proposals, calls for active roles for Member States and national competition authorities, and for timely enforcement that makes full use of the DMA’s compliance mechanism.
- From the inception of both the Digital Services Act (DSA) and the DMA, ARTICLE 19 has called on the Commission to engage and work regularly with civil society organizations (CSOs). But we still need clarity, inter alia, on the precise manner and frequency with which third parties will be involved during proceedings under the DMA and the information that the Commission will provide to third parties in each case. On the latter, the Commission will need to establish procedures to ensure this is delivered in a timely manner.
- With the DMA’s foreseen resources meager compared to those of the gatekeepers (the Commission will have 0.3 to 0.7 employees per rule per gatekeeper, while enforcement budgets will be equivalent to just 0.005 to 0.015 percent of gatekeepers’ global turnover), the Commission must get its internal organizational arrangements right by ensuring that its digital policy department (DG CONNECT) and its competition department (DG COMP) work well together and benefit from each other’s expertise.
- Lastly, to avoid the enforcement failures of the GDPR, the Commission must adopt a collaborative, transparent, and structured approach with many actors, including regulators, experts, civil society organizations, gatekeepers, businesses, and end users in a system of participatory enforcement.
By adopting the recommendations above, the Commission has an opportunity to start leveling the playing field and diluting the power of the large platforms. But as the saying goes, the proof of the pudding is in the eating, and for an unseasoned regulator, the challenges are daunting.
- - -
These issues and more will be discussed at a symposium, “Effective DMA Enforcement,” that ARTICLE 19, together with the Hertie School’s Centre for Digital Governance, Università di Trento, and the Amsterdam Centre for European Law and Governance, will host in Brussels on November 22 & 23, 2023.