TikTok, Telegram, and Trust: Urgent Lessons from Romania’s Election
Roxana Radu / Jun 25, 2025
December 8, 2024 - Mogoșoaia, Romania: Far-right runoff candidate for the presidency of Romania, Călin Georgescu, speaks to the press at a closed polling station after the elections were legally annulled. Shutterstock
Romania’s 2025 presidential election was far more than a domestic political contest—it was a stress test for Europe’s digital defenses. Amidst rising political instability and economic downturn, the country has become a frontline example of how digital interference, algorithmic manipulation, and platform inaction can collide to undermine democratic processes.
What unfolded in Romania is both a warning and a lesson: in the digital age, electoral integrity can no longer be separated from platform accountability.
The digital threat to electoral integrity
The erosion of electoral integrity in Romania began not at the ballot box, but online. The first round of the presidential election, held in November 2024, was annulled by the Constitutional Court following credible reports of foreign interference and campaign irregularities. Călin Georgescu, a pro-Russian candidate polling at just 5%, surged to first place in the initial tally, allegedly aided by opaque campaign financing, digital influencers, and the recommender system of TikTok, a platform with 9 million users in Romania. The National Audiovisual Council and ANCOM, Romania’s digital services coordinator, reported serious irregularities in TikTok’s handling of political content to the European Commission, which opened formal proceedings against the platform under the Digital Services Act on December 17. Georgescu was later banned from running again.
By the time voters returned to the polls on May 4, 2025, a full-scale information and meme contest was underway online. George Simion, the 38-year-old nationalist leader of the far-right AUR party, campaigned almost exclusively on social media and secured 40% of the vote. Nicușor Dan, the 55-year-old independent candidate and mayor of Bucharest, qualified for the runoff with 20% of the vote, pulling ahead of coalition-backed Crin Antonescu. The result sent a clear message of public rejection of the political establishment and triggered Prime Minister Marcel Ciolacu’s resignation.
The second round of the election unfolded under intense scrutiny. The online information space had become weaponized: personal attacks, fear-driven messaging, and fake narratives targeted Nicușor Dan. An investigation by Global Witness revealed that TikTok pushed nearly three times more far-right content than any other political content to new, politically balanced accounts. Despite repeated warnings from civil society and watchdog groups, platform enforcement remained patchy and poorly scaled to the scope of the threat.
Meanwhile, in a separate investigation focused on TikTok’s commercial ad repository, the European Commission released preliminary findings confirming that the platform had violated the Digital Services Act (DSA). TikTok failed to disclose who sponsored ads, how audiences were targeted, and where funding originated. As the Romanian case illustrates, these failures extended to political content as well, despite TikTok’s commitments and actions following the annulled vote. A second investigation, prompted by allegations of manipulated amplification during the Romanian elections, is now underway and is expected to shed more light on TikTok’s policies on political ads, paid political content, and the role of its recommender systems in amplifying such material.
On the final day of voting (May 18), another platform came under scrutiny when Telegram’s CEO, Pavel Durov, sent a message to all of the app’s users in Romania, alleging meddling by France. Although ultimately ineffective, the move marked a new kind of digital interference: platform owners inserting themselves into national political moments with little to no accountability. Telegram holds around 26% of the domestic messaging market.
Under these extraordinary circumstances, the pro-European Dan won the presidency with 53.6% of the vote. While his victory signals resilience in the face of digital manipulation, it also reflects a broader protest vote: a desire to move past the political paralysis and digital toxicity that defined the previous six months. His stated priorities are pragmatic: closing the economic deficit, responding to regional security concerns, and forming a functional government, likely under reformist Ilie Bolojan. But digital ecosystem challenges barely register on his agenda. This is a risky oversight; the recent election cycle exposed just how deeply those challenges can erode public trust and disrupt democratic stability.
Europe’s reckoning in the age of disinformation
While digital interference supplied the storm clouds, the broader Romanian context made the climate uniquely vulnerable: political instability and economic volatility had already primed the public for distrust. As the final election results show, Romania remained deeply fractured, and many of its fault lines surfaced only in online discussions.
In their newest forms, attacks on democratic processes no longer happen in isolation. They occur both within individual platforms and through coordinated, multi-platform targeting, where content and behaviors migrate, evolve, and amplify. The EU’s Code of Practice on Disinformation has proven insufficient, particularly as major players like Telegram have not signed up to it. The DSA offers a stronger framework, but enforcement remains sluggish, even when it is needed most.
The Romanian presidential elections showed that current measures to protect electoral processes fall short in practice. The expectations laid out in the Commission’s 2024 guidelines on mitigating systemic risks to electoral integrity online have not been met. Major platforms failed to implement adequate mitigation measures or provide meaningful transparency. In Romania, TikTok failed to sanction accounts that aggressively promoted election-related content, despite having a policy banning political advertising. Even when national authorities repeatedly reported electoral irregularities, the platform failed to address the concerns.
The platforms’ risk assessment reports under the DSA, self-designed and self-measured, reflected a minimalist approach to compliance. Instead of treating the DSA as a baseline for responsible action, platforms used it as a ceiling. For the May re-run of the elections, the Election Centre introduced by TikTok provided little real-time transparency into the platform’s actions, despite an increase in the number of Romanian-speaking content moderators. Meanwhile, platforms like Facebook and Instagram maintained their standard approach, releasing transparency reports on a fixed quarterly schedule regardless of unfolding political events or electoral timelines. At the same time, Telegram’s instant messaging app remained outside the scope of the DSA’s enhanced transparency and risk mitigation obligations under Articles 34 and 35, further complicating efforts to track and contain the spread of disinformation across platforms.
Beyond platform-level action, institutional complexities have slowed interventions. Evidence gathering is an important part of the ongoing probe into TikTok’s handling of election risks, yet the DSA sets no legal deadline for concluding formal proceedings. Under the DSA, only the Commission can assess compliance by very large online platforms, relying on cooperation with Ireland’s Coimisiún na Meán, where TikTok has its EU headquarters. This fragmented structure has led to slow, opaque responses at critical moments, revealing the limits of the DSA’s current implementation when it comes to real-time democratic accountability.
To prevent the next electoral crisis, platform accountability requires digital safeguards that can be executed in a timely manner. Platforms must step up and provide real-time, proactive transparency that goes beyond baseline compliance. Under the DSA, transparency mechanisms must not only be available, but also accessible and meaningful to users. That requires clear in-platform explanations and full disclosure of who funds content, how it is targeted, and what amplification methods are in play.
Moreover, policymakers must reckon with the complex ways digital systems interact and reinforce each other. Disinformation moves fluidly between influencers, monetization tools, and algorithms, spanning multiple platforms. Narrow, platform-by-platform risk assessments fail to capture these systemic dynamics. The European Commission should therefore strengthen and refine its guidelines on election-related risks, placing greater emphasis on covert political campaigning, particularly via influencer content, and on the interaction risks among platforms, ad intermediaries, and services like private messaging. These gaps in coverage actively enable voter deception and widen the trust deficit across the broader digital ecosystem.
Brussels must recognize Romania’s experience for what it is: a frontline preview of challenges facing democracies across Europe. Without urgent, robust, and agile oversight mechanisms, electoral integrity will continue to be a weak link in the democratic chain.