Unpacking the Politics of the EU's €120M Fine of Musk’s X

Justin Hendrix / Dec 7, 2025

Audio of this conversation is available via your favorite podcast service.

On Friday, the European Commission fined Elon Musk’s X €120 million for breaching the Digital Services Act, delivering the first-ever non-compliance decision under the European Union’s flagship tech regulation. By Saturday, Elon Musk was calling for no less than the abolition of the EU.

To discuss the enforcement action, the politics surrounding it, and a variety of other issues related to digital regulation in Europe, I spoke to Joris van Hoboken, a professor at the Institute for Information Law (IViR) at the University of Amsterdam, and part of the core team of the Digital Services Act (DSA) Observatory.

What follows is a lightly edited transcript of the discussion.

Elon Musk at the Memorial for Charlie Kirk at State Farm Stadium in Glendale, Arizona. Photo by Gage Skidmore, CC BY-SA 4.0, via Wikimedia Commons

Justin Hendrix:

Back in 2022, after Elon Musk announced his bid to purchase the platform then known as Twitter, former European Commissioner Thierry Breton visited Musk at a Tesla plant in Austin, Texas. Part of their discussion focused on the EU's Digital Services Act, the DSA, which was politically agreed right around the time that Musk made his offer to purchase Twitter.

Thierry Breton:

So we're in Austin together with Elon Musk. Thank you very much, Elon, for welcoming me.

Elon Musk:

Thank you. Most welcome.

Thierry Breton:

And of course, we discussed many issues, and I was happy to be able to explain to you the DSA, a new regulation in Europe. And I think that now you understand very well. It fits pretty well with what you think we should do in a platform?

Elon Musk:

No, I think it's exactly aligned with my thinking. I very much agree with... It's been a great discussion. And I really think I agree with everything you said, really. I think we're very much of the same mind. And, you know, I think just anything that my companies can do that would be beneficial to Europe, we want to do that....

Justin Hendrix:

Much has changed since that exchange. For some time, it has been clear that Musk was on a collision course with the Commission, given his management of the platform now known as X. On Friday, the European Commission announced a 120 million euro fine on X for failure to comply with the DSA. Here's a Commission spokesperson announcing the action and the basis for it on Friday.

European Commission spokesperson:

Today, the Commission has issued a fine of 120 million euro to X for breaching the Digital Services Act. This is the first ever fine under the DSA.

X has indeed breached its transparency obligation under the DSA. This includes X's blue check mark. It deceives users. Anyone can pay to obtain the verified status, and X does not meaningfully verify who is behind it. It also includes X's advertising repository, which does not work properly, and X doesn't provide effective data access for researchers. Failure to comply with the non-compliance decision may lead to periodic penalty payments on top of today's fine.

At the same time, we have adopted today and accepted TikTok's commitments to make its own advertising repository work. What does this show? Our objective is not a fine. If you engage constructively with the Commission, we settle cases. If you do not, we take action.

Justin Hendrix:

But despite the measured tone of European regulators, Musk and his allies in the Trump administration did not hold back. On Saturday, Musk called for no less than the abolition of the European Union, which he called a quote, "tyrannical unelected bureaucracy, oppressing the people of Europe," unquote. US Secretary of State Marco Rubio said the fine was not just an attack on X, but quote, "An attack on American tech platforms and the American people by foreign governments," unquote.

What's more, the rhetoric by Trump administration officials and other Republicans is congruent with the White House's newly released National Security Strategy, which says it is the policy of the United States to help Europe restore its quote, "Civilizational self-confidence and Western identity." The document criticizes the European Union as among the transnational bodies that "undermine political liberty and sovereignty," including through quote, "censorship of free speech and suppression of political opposition."

To discuss the fine on X, the political context around it, and other matters at the intersection of tech and regulation, I spoke to an expert, Joris Van Hoboken.

Joris Van Hoboken:

I'm Joris Van Hoboken. I'm a professor of information law at the University of Amsterdam.

Justin Hendrix:

You also are part of a project that we track closely, the DSA Observatory. Can you tell my listeners a little bit about it, what it gets up to?

Joris Van Hoboken:

Yeah, so we launched the DSA Observatory as an initiative in the run-up to the legislative process for the Digital Services Act in 2020. So it's a little while ago now, and we really aim to be a hub of independent expertise on this important new legislation. So we were involved in following the legislative process, but then pretty quickly, because it went so fast (the DSA was adopted at quite record speed), we moved to all sorts of questions of implementation and enforcement, and analysis of the legal framework and how it's performing in practice.

So it's a project that's running at the University of Amsterdam at the Institute for Information Law, and there's quite a lot of colleagues also contributing to it, and we collaborate a bunch with other academics and also with civil society.

Justin Hendrix:

Well, we rely on it at Tech Policy Press. The experts' experts is how I think of the DSA Observatory. So you all have literally written the book on it, and we appreciate all the work that you do.

We're going to talk a little bit about the implications of the first big non-compliance decision and fine under the DSA. All the headlines are about X and the 120 million euros that the Commission has demanded for non-compliance. I want to ask you a question that gets at not so much what's happened, but what happens from here. Most folks are familiar with the idea that essentially Elon Musk and his platform have been found to have done something wrong, and they've got to pay a fine. But what happens next? What are the next steps for X and the Commission legally in the wake of this ruling?

Joris Van Hoboken:

Yeah. So legally speaking, there is a process: X has a certain amount of time to get into compliance. There's obviously still also the chance to enter into some kind of discussions with the European Commission, as we saw in some of the other cases. There's a commitment from the European Commission to get these binding commitments from companies that are under investigation; that seems to be maybe even the preferred mode of operation for the Commission, to agree on how companies should proceed in complying with the DSA.

In this case, there were some clear violations. X is still under investigation, actually, for some other issues that are a bit more politically sensitive, but here it's basically a combination of non-compliance with transparency obligations. So one of the areas is the blue check mark, and the fact that the way it is run and implemented, and what is actually happening, is concluded to be misleading for users.

There are a few things there where one can imagine X could start to comply, maybe just in Europe. They would maybe have to rebuild some of their systems. It could be quite complicated, though, to comply with some of the obligations. So in case they're choosing not to comply, then we'll have to go to court, and that's a possibility. With these types of decisions, the European Commission is acting as a regulator, implementing and enforcing the Digital Services Act, and anybody that is subject to a decision like that can go to court. That's something X could choose to do.

Justin Hendrix:

I want to ask you a question about the proportionality of the fine. That's been a big focus in the media. Everyone always looks at the sum of money. Was this enough money? Does it matter? Everyone puts it in the context of Elon Musk's personal wealth, which of course dwarfs it; a 120 million euro fine is a bar tab by comparison. What do you make of the proportionality of the fine in this context?

Joris Van Hoboken:

Yeah, so first of all, I'm not sure that the European Commission is really testing the boundaries of what is possible. They have signaled, also under the other platform regulation package, the Digital Markets Act, which is more economic regulation, that they're not into going for the maximum fine and the maximum deterrent. They're actually trying to be relatively moderate still, which could also be because of some of the political pressure, and they're really aiming for improvements and showing the law can lead to improvements. In this case, discussion of the fine and how to calculate it has been going around for almost a year now. Apparently the Commission has been looking at what the proper basis for calculating the fine is. What are they actually targeting? Is it just X as the service? Is it the broader corporate structure around X, or is it even Elon Musk and his wealth, because he's the single owner of the company?

And so what they ended up doing is picking a fine that, in relation to the company itself, just the service X, is about 4.5% of revenues, which are calculated to be a bit more than two billion euros, I think. So that's what they went for, but it's clear that maybe more aggressive fines could be possible in the future. I think the European Commission is also considering that escalation is possible in the future, but at this point it's a sizable fine. There are a number of clear violations, but overall, looking at the breadth of obligations under the DSA, it's not that there's a conclusion of violation across the board.

So I think the fine comes across as relatively proportionate. Probably the European Commission also takes into account the unwillingness of X to move into compliance. There have been a bunch of conversations. This is not coming out of the blue. The investigations were opened quite a while ago. That probably fed into their thinking on the fine as well.

Justin Hendrix:

There were the preliminary results of the investigation, I think, what, more than a year ago? Are there any differences between the preliminary results and what we saw yesterday with the announcement of the enforcement? Anything that appears to have changed in the way the Commission has described what they're holding X responsible for?

Joris Van Hoboken:

That's a good question. I would actually still have to look at that; I haven't done that analysis. The thing that I would say stands out in particular is that in the earlier proceedings and the requests for information, there were a number of issues that X was under investigation for: issues related to content moderation, issues related to hate speech and disinformation on the platform, and X maybe not being in compliance with the risk management obligations under Articles 34 and 35. That is left open, and the decision is on the transparency-related provisions. So that's, I think, the thing that stands out in particular: some things are still left open.

Justin Hendrix:

I wanted to press you on that. It's something I definitely wanted to ask about. I think most listeners may not be aware that there are other investigations into X ongoing. Can you just say a word or two about that, about this concept of systemic risk? This almost seems like, in some ways, a more potentially volatile set of issues that the Commission's wading into. The idea of systemic risk is perhaps a lot less defined than consumer deception. I don't know. What do you make of what to expect on that front?

Joris Van Hoboken:

Yeah, I think what to expect on that is also that we, the Commission, and everybody still need to learn how to work with this framework. In the first instance, it's really been the platforms that have been designated as very large online platforms that do an assessment of these risks. And these risks fall into different buckets: you have illegal content, fundamental rights risks, and then some specific societal risks, right? Issues with civic discourse and also the protection of minors.

And so we've now had three years of these risk reports going to the Commission. And I think what we can see is that generally there's a bunch of issues being discussed, but overall, there's a sense that maybe the risk-based approach is ultimately going to be difficult to really operationalize in practice. And we don't yet know how the European Commission is going to give it more teeth, because we haven't seen any kind of direct enforcement action.

In relation to X, what happened in 2023, soon after the Hamas attacks and the start of the Israeli war in Gaza, is that under the former European Commissioner there was an exchange with X on some of the issues with respect to illegal content. And that is actually where some of that investigatory work of the European Commission started. But we haven't seen a lot of updates since on what is actually under investigation.

But overall, the Commission has stated that it appears that X is not in compliance with the risk obligations: with managing illegal content on the platform and managing some of these risks related to hate speech and also civic discourse on the platform, which is not exactly a surprise, no? I think we've seen, under the ownership of Elon Musk, that X has positioned itself in a very particular political way in Europe and also, of course, internationally. And so that is something that is also feeding, I'm sure, into that investigation.

Justin Hendrix:

I want to ask you a little bit about the response that you are seeing from abroad, and how you think it's being heard in Europe right now. The Trump administration has, of course, locked arms with Musk in regarding this as a censorship action, and there's intense political pressure generally on the Commission to produce results, as it were. I don't know. What do you make of all the political noise around this, particularly the noise that's coming from across the Atlantic?

Joris Van Hoboken:

Well, definitely the US administration is good at producing a lot of noise, but it's also, of course, taken seriously. I think maybe the reactions to the X fine are dwarfed by what came out of the administration with the national security strategy. That's a much broader kind of thing, and it is really about the transatlantic relationship.

And it is also about issues of democracy, but it's quite clear that the US government is planning to interfere basically in the democratic politics of Europe, and to be supportive of the far right and the far-right talking points on migration and speech. And so this is, I think, what is drawing people's attention even more than this particular reaction to the X fine, which is also clearly not very convincing. There are possibilities to discuss what the appropriate way is for government and government regulation to deal with some of the online content moderation issues.

And then there have been, of course, the broader accusations of censorship coming out of the administration, which have not necessarily been very convincing. But here we have clear violations of things that don't necessarily have much to do with speech. It's about misleading users. It's about lack of transparency around certain important mechanisms, like transparency about advertising, and also researchers being able to do research on the data of the platform.

I think in that case, it's just showing a willingness of the US administration and people at the highest level to keep somehow weaponizing and pushing and trying to have as much leverage as possible. But I think, yeah, in this case, on the X fine, it's not very convincing.

Justin Hendrix:

Let me ask you about the feeling in Europe as this ruling came down, as this enforcement was announced. What's the mood like in the bubble, or around it? What do you think people were feeling as this was happening?

Joris Van Hoboken:

No, I mean, I think there's been significant pressure building up in the last months and weeks. Just last week, of course, Macron said something about it. There's been pressure building in the European Parliament as well. Overall, civil society is making it very clear that they're at a loss: "Hey, when are you going to really do work on this? When are you really going to enforce it? We're seeing all these preliminary findings, and we don't see any fines, and there are clear violations."

So I think the mood within expert circles is that this is a very good sign that the European Commission is actually going to follow through. It's also, I think, very clear that more decisions will come. And that's generally seen, I think, as a very positive thing: that Europe is standing by the rules that it adopted through democratic processes.

Let's make it very clear: if the European Commission does make mistakes, we do have courts to go to and to offer corrections to any kind of overreach, if that were the case. So I think there's a mood change in that regard, and I think people will be following closely what comes next. And maybe also seeing an appetite to continue working with some of these tools that the DSA has offered, as we in Europe are at the moment going through this new push for competitiveness and also a simplification agenda. Discussions are ongoing about what that is going to mean for the DSA and the DMA. And even though the evaluation and these kinds of things will only be happening in 2027, it's very clear that there needs to be something done to show that all this new regulation actually has value. So I think that's the path the European Commission is on.

And the whole broader ecosystem of entities that can also help with enforcement, including through litigation, is probably feeling supported by this new development.

Justin Hendrix:

So all the news headlines, for the most part, were about X, but other things came out of the Commission in the last few weeks, and also alongside the X enforcement yesterday. Let's talk about TikTok first. It's essentially a kind of settlement with the Commission over the investigation into that platform. What do you make of it, and how does it fit alongside the news around X? What does it tell us about the Commission's enforcement of the Digital Services Act?

Joris Van Hoboken:

No. So, as I mentioned earlier, I think very clearly the European Commission is very interested in coming to these kinds of binding commitments and not getting to the place where they have to issue fines. And I think that maybe signals different things. I think it signals that the Commission believes that ultimately the companies should be in the driving seat of how to organize their compliance. This is something the risk-based approach does as well. I don't think the Commission thinks they have all the expertise on how to do things the best way. And so with these kinds of binding commitments, they also give a certain amount of space to companies to come up with the right measures to get into compliance.

The second thing I would say, though, is on TikTok, for people who are maybe not so familiar with the specifics of this investigation, the ultimate decision, and the commitments that were made. I personally feel that when I think about TikTok and the kinds of issues TikTok has from a broader societal perspective, I'm thinking more about addictive design, recommender system types of issues, disinformation kinds of issues.

And these are not, by and large, the issues that were decided on by the Commission. For me, there's a sense that this is maybe not the end of the discussion between the Commission and TikTok. And there, I think it is interesting to see the broader developments, especially around child safety, of course, because TikTok is also very popular with younger users. And you see the European Parliament putting significant pressure on these topics of addictive design and the impact on minors of services like TikTok. So I do expect we're seeing a bunch of commitments on some specific aspects, but it's not the end of the story for TikTok and DSA enforcement.

Justin Hendrix:

The Commission has a lot of other investigations going under the DSA. Any that you're particularly watching or any that you're aware that we might see enforcement action or judgment on in the near term?

Joris Van Hoboken:

No, I think in the social media space, the Meta investigations are important. Here also, the Commission is not necessarily targeting speech norms and things like that, but they're having issues with the way Meta's services designed the procedures for notifying them of illegal content and issues with content.

So we have laws in Europe that allow anybody to start some kind of legal proceedings about illegal content online. So it's not that people are dependent on the DSA for dealing with issues of illegality and harm. The thing is that a lot of the procedures that exist are costly and difficult, and it may take a long time to go to a court. So really what the DSA tries to do is say there should be this kind of low-threshold procedure available to people to notify services of illegal content, and these procedures should be time-efficient. And they should get to a result that can then also be appealed by users.

So it's really on these types of procedures that the DSA is trying to add value: low thresholds for regular users. And when you see the implementation of this by Meta, and that's what the European Commission has reacted to, it is that the procedures are really difficult for users. Users are also steered in the direction of basically notifying just about terms of service violations instead of complaining that something is illegal or unlawful under European law. And then basically, things get triggered and channeled in particular directions that are not necessarily in the interest of users.

So it's really about the design of these kinds of procedures and their usability for users that Meta is under investigation. And I do expect the Commission takes that very seriously. We have seen this before; it's not the first time this kind of thing happens. We saw it earlier under the German legislation, the NetzDG: the way that Facebook at the time implemented those obligations under German law was misleading. They were designing the compliance in a way that's really not in the spirit of the law. And they were also fined at the time, and I expect that would happen again.

Justin Hendrix:

I think the picture that's sort of emerging, and tell me if you agree with this, is of a kind of reasonable regulator enforcing the law. It's picked off, for the most part, things that are more straightforward to deal with up front, and some of the more complicated things it has yet to deal with. Is that right, generally? I mean, there's so much noise from where I sit over in the United States, and we're so unused on some level to seeing regulatory action against these tech firms, that everyone has this kind of caricature in their mind of fussy European regulators. But I don't know. In many ways this just seems boring.

Joris Van Hoboken:

Yeah. But the overall politics is generally not boring. I do think that you're right, and this makes complete sense: some issues are easier because sometimes rules are just very clear, you comply or you don't. It is directly visible and it can be documented, and that's going to make investigations and decision-making by the regulator much easier. And some of those things are not; some violations are much harder to document. They also require some decisions on precisely how to interpret things like the risk management obligations, to make some proportionality assessments: hey, if we require or think there should be some kind of mitigation, what could it mean for fundamental rights? And then there are the more political issues around how certain types of decisions could be perceived by the public. But just documenting the violations and coming up with solid legal theories about whether there is a violation takes a lot of time.

In some places of the DSA, we are dealing with new types of law, and so that just takes a long time. I think the Commission is being quite careful. They don't want to be caught with a whole bunch of procedural mistakes. They don't want to end up picking a target and then have it blow back in their faces and fail.

So it makes sense, but I do think there was maybe some delay with some of the easier types of non-compliance decisions. There maybe was some delay because of the preference for getting binding commitments, which obviously takes time to negotiate. But now the first real fining decision is out, and it's also one of the politically more sensitive ones. So I think we can expect more of them, some of them relatively straightforward and maybe boring. I do expect we will see some of the more boring stuff being picked up at some point, just because that took more time.

Justin Hendrix:

While I have you, I also want to ask you about another thing that happened this week: the Court of Justice of the European Union delivered a judgment in a case, X versus Russmedia Digital and Inform Media Press. Can you tell my listener just a little bit about what this case is and why it matters?

Joris Van Hoboken:

Yeah. So there's been a discussion in Europe about what precisely the responsibility and liability is of intermediaries online that deal with user-generated content, like social media companies and other types of online platforms. They have these protections against liability. They're conditional, but as long as you don't really have knowledge of specific issues, you should generally not be held liable. That also carries over into the Digital Services Act. And the overall rule, one of the big rules in this space, is that governments and courts should not be allowed to put general monitoring obligations on these kinds of platforms that facilitate user-generated content. So there should not be strict liability. There should not be an obligation on these platforms to check every piece of content that is posted on the platform for issues of illegality.

We've seen over the years some rulings that are pushing on the boundary of what this general monitoring obligation is and whether maybe some specific monitoring can be allowed. We have seen these debates under the copyright directive and the need for some type of filtering of copyright-protected content. And we've also seen cases in areas like defamation.

There's this recent case again, where the court was asked to determine whether hosting providers were responsible in particular for complying with data protection rules under the GDPR. And the court concluded that this was indeed the case: that the hosting provider in the case, Russmedia, which allowed third parties to post advertisements with personal data, was responsible for complying with the GDPR in the handling of this personal data. Some people have concluded that because the court said this is a responsibility under the GDPR (the Russmedia site can be considered a controller for the personal data, jointly with the advertiser), that somehow means there's a general monitoring obligation: that they will have to just monitor everything that comes onto the site.

But the court didn't say that. And in some of these other judgments as well, the court was quite careful not to say, "This means that to comply, the company has to do this and that. They have to do filtering. They have to basically scan all the content and prevent any illegal content." They don't say that. They make clear that there's a compliance obligation, but the way that obligation can be organized doesn't have to entail general monitoring.

But this is an ongoing discussion. The ruling got quite a bit of attention, and it was also picked up in the US. Some people have been alarmed by the development. There are some issues with the ruling, but I think it's important to recognize that if the court doesn't say there has to be general monitoring, we can still conclude, also on the basis of general principles of European law, that hosting providers and social media don't have to take draconian measures that would clearly involve mass surveillance of users, scanning all the content, and being very restrictive toward the freedom of expression of users on the internet.

And so that's the position I have on that. Ultimately, I think what we see with some of these rulings is that maybe there's some mistake in the way the court has ruled, maybe some lack of balance in the ruling, but there will be opportunities in the future to correct that.

Justin Hendrix:

So with all of this going on, I can't help but also ask about the Digital Omnibus and Europe's general effort to relax or cut red tape around its digital regulation. I don't know. What is your thinking on that at the moment? We have a final text to review. A lot of folks in civil society are upset, disappointed, and concerned, I think. Fair to say. What do you make of it from your seat?

Joris Van Hoboken:

No, I mean, unfortunately the first Omnibus Act doesn't look like it's very well crafted. One of the things it does is attack some core principles of the GDPR, and so of data protection in Europe. Data protection has been under attack; it's been said to create too much cost and burden and to be bad for innovation. And it's possible to rethink some of the ways in which the GDPR puts certain burdens. But what is happening is really that some of the big industry talking points on the GDPR and what could change are being picked up: making changes to how the concept of personal data operates, creating more space for reuse of data, also for AI, and restricting important rights of individuals, like getting access to their data, which have really been at the core of data protection laws since the start.

I expect this to blow up. I don't think this is going to go well politically. I think even if some of these changes were adopted as law, they're very vulnerable to litigation, also because the right to data protection is in the Charter. But most of all, I'm quite worried that the whole process of simplification is not going to work the way people maybe hope it will.

As a legal expert, I would say I've also been struggling to keep up with the amount of digital regulation coming out of Europe. It's clear that there's really something that could be done in an effort to systematize some of the laws that have been adopted, to prevent bad forms of overlap and maybe sometimes even conflicting rules. But typically this takes quite a lot of thinking and work. And now just scrapping certain things from the law, I think, is easily going to blow up and make things even more complicated.

Now, what's not happening is that we have some kind of committee that gets to decide which parts of the law get to be scrapped. That's not how it works. All these Omnibus proposals will be legal proposals themselves, and then the European Parliament can say something about them, and the Council can say something about them. And often the compromise that comes out of it is more complicated than the original proposal. And of course new things will easily be added through the simplification process because that seems politically opportune to do. So before you know it, the simplification process is actually making things more complicated. The whole mechanism sounds great, maybe, but I think it's going to be difficult to really get to simpler rules the way this is happening.

Justin Hendrix:

A little more than a year ago, you had a piece on Tech Policy Press called "A Brussels Affect," with Petros Terzis. I think the listener is well familiar with the idea of the Brussels effect, the phenomenon where Europe sets worldwide norms, or seeks to set them, and where its regulators are seen as a gold standard. But your piece kind of asked the question: what are we doing here, right? What's this all for, on some level? As you step back from all of this, even admitting that it's getting difficult now to follow every regulation and all the proceedings that trail from it, where's your head at on the Brussels Affect at this point?

Joris Van Hoboken:

Well, thanks for asking about that, and about the piece, which we really enjoyed working on, and we got good reactions. One of the things we're really questioning is this kind of pride around being the regulator of the world. Do we really want to go there? Just looking at it from a historical perspective, Europe setting the rules for the rest of the world is kind of problematic by itself. And we also question whether, on some of these regulations, that's actually something we would want. Overall, we are saying, let's be much more cautious about this Brussels effect.

If I would add to it now, I mean, the Brussels effect thing is off the table. This is a thing from the last Commission period, and it underpinned some of the push to adopt the DSA and the DMA in record time, and to adopt the AI Act and still pull regulation of general-purpose AI models, like LLMs and things like ChatGPT, into it: to be the first regulating in this space, and then to rely on the mechanism of the European market to have an outsized standard-setting impact internationally.

And here I think we see that the geopolitics have really changed, and just relying on markets to do this kind of work is not enough. Not so long ago there was an opinion piece called the 'Brussels Defect.' It was about this kind of thinking in Europe about open markets, regulation of markets, creating incentives for the players, that being the way you could set standards, and it basically said, "Yeah, the geopolitics have really changed, and that's not really how it works anymore."

Maybe a second thing I would say about it, and that's something we discussed in the piece, is this way of saying: we have these values in Europe, and these values should be put into these regulations. And that's why it's also a good thing that we get these standards that have a more global impact, because we have these beautiful rules, these beautiful values. That, too, is no longer part of the political climate. Like we discussed earlier, the focus is really on competitiveness.

Instead of regulating for our values, the idea now is that deregulation is going to make Europe more competitive, and that making Europe more digitally sovereign actually means being more competitive and deregulating. So a lot has shifted. On the one hand, maybe Europe is partly showing its real face and starting to present itself in this new way. But I do think there's also a serious concern about values becoming an even less serious part of the regulatory mix.

Justin Hendrix:

What's next for the DSA Observatory? I mean, it's a big project to follow all of this. You all have produced so much detailed documentation and scrutinized this thing very, very closely with a magnifying glass. Will you be able to keep at it?

Joris Van Hoboken:

Yeah, we're definitely planning to, and we keep doing a bunch of work on the risk-based approach. We are also really interested in the data access obligations, which allow researchers to document issues of risk and how they're mitigated. What we also do is try to build community and expertise around these laws. The major thing upcoming is a big conference we are organizing on February 16th and 17th, where we're bringing the research community together, including all sorts of other stakeholders working on the DSA. So that's really the big thing at the moment, pulling that all together.

Justin Hendrix:

Before we go, I have to ask you, you play in a band?

Joris Van Hoboken:

Yeah, that's correct. I play in a band, and I have a concert tonight.

Justin Hendrix:

And what kind of music and what do you play?

Joris Van Hoboken:

I play punk rock, and I play guitar.

Justin Hendrix:

Well, I would look forward to coming to see you play some punk rock sometime.

Joris Van Hoboken:

I don't know that we'll be coming to the US anytime soon. We've played a fair amount in the US, but it's become more complicated.

Justin Hendrix:

I'm certain it has. I'm grateful to you for taking the time today to speak to me about this, and grateful for all of the analysis that you do and the support you provide for Tech Policy Press. I look forward to talking to you again about this down the line, when we have more major action by the Commission.

Joris Van Hoboken:

Thanks so much. It was a pleasure.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President of Business Development & In...
