
Political Ads Under the Microscope: EU Regulations and Romania’s Legal Framework in the Era of Digital Elections

Introduction

The conclusion of Romania's 2025 presidential elections provides a critical opportunity to examine how evolving regulatory frameworks have reshaped political advertising in the digital age. The elections unfolded in a highly scrutinized digital campaign environment and provided a practical implementation context for the interplay between Regulation (EU) 2024/900 on the transparency and targeting of political advertising, the Digital Services Act (DSA), and Romania's domestic provisions, especially Government Emergency Ordinance No. 1/2025. Inspired by the EU Regulation, the Romanian legal framework introduced enhanced transparency, fairness, and accountability measures tailored to the national electoral context, including definitions of political advertising materials and the identification of political actors.

This analysis examines how these regulatory transformations influenced Romania's 2025 electoral process, with particular attention to the boundaries governing permissible electoral propaganda, notably content disseminated by non-political entities, and to the enforcement of new obligations for digital platforms. Through an examination of key decisions by Romania's Central Electoral Bureau and coordination mechanisms among election supervisors, data protection regulators, and the Digital Services Coordinator, this study illuminates both the successes and limitations of cross-sectoral collaboration while mapping the intersection of supranational regulations, platform responsibilities, and national enforcement mechanisms. The paper highlights how Romania's experience could serve as an early proving ground for the EU's forthcoming "Democracy Shield," where synchronized national and European oversight is meant to act as a safeguard for, rather than a stress test of, democratic integrity.

Regulatory Convergence and Divergence in Digital Political Advertising

Romania’s digital political advertising landscape operates within a sophisticated tripartite regulatory framework comprising the Digital Services Act (DSA), Regulation (EU) 2024/900 on political advertising transparency, and Government Emergency Ordinance No. 1/2025. These instruments converge around the fundamental principle that transparency constitutes the cornerstone of legitimate digital political advertising.

Each regulatory component mandates comprehensive labeling of political content,1 disclosure of sponsors2 and targeting parameters, and the maintenance of publicly accessible advertisement libraries.3 The DSA requires very large online platforms and very large online search engines to ensure clear advertisement identification, including for political advertisements, and to create publicly accessible repositories maintained for at least one year after an advertisement was last presented on their online interfaces.4 Uniquely, the DSA contains explicit provisions on algorithmic transparency, requiring platforms to provide meaningful explanations of their recommender systems.5 Regulation (EU) 2024/900 introduces real-time advertisement spending dashboards during EU elections and establishes an EU-wide advertisement registry, further strengthening transparency mechanisms.6

A significant shared element involves prohibiting profiling based on sensitive data categories, including political opinions or religious beliefs as defined under Article 9(1) GDPR. While Regulation (EU) 2024/900 prohibits profiling using special categories of personal data, it does not ban sentiment-based targeting that exploits emotions such as fear or anger. The Regulation goes a step further by explicitly prohibiting microtargeting based on inferred political preferences, demonstrating a commitment to protecting individuals from manipulative practices and to the principle that data collection be limited to what is necessary. Romania's GEO 1/2025, by contrast, carves out an explicit exemption for issue-advocacy ads even where they evoke emotional responses, such as calls to support an anti-drone law or framing an opponent's court challenge as endangering security, provided they do not tell people whom to vote for. This divergence creates interpretative uncertainty for platforms operating under both regimes.7

Platform accountability mechanisms receive reinforcement across all three frameworks, requiring risk assessments of democratic impacts and crisis protocols during electoral periods. While the DSA relies on decentralized Digital Services Coordinators (DSCs), Romanian legislation centralizes enforcement responsibility within its Central Electoral Bureau (BEC), illustrating divergent governance approaches.

The scope of regulated entities presents a clear point of divergence. The DSA targets Very Large Online Platforms (VLOPs) with over 45 million EU users; Regulation (EU) 2024/900 applies to all political advertisers, including non-platform entities such as media agencies or influencers; and Romania regulates all digital platforms hosting political content, regardless of size.

The definition of what constitutes a political ad also demonstrates both alignment and divergence: the DSA adopts a comprehensive approach encompassing issue-based advertisements, Regulation (EU) 2024/900 expands coverage to include electoral content influencing voting behavior, and Romania's GEO encompasses electoral propaganda materials, including indirect messaging. Beyond the EU definitions, Romanian provisions require political advertisements to include the sponsor's identity, email, and postal address.

For this election, GEO No. 1/2025 broadened the definition of political advertising to include indirect promotion of contestants and expanded labelling requirements for written, audio, and video materials, including those published online.8 According to OSCE findings, the labelling requirements were unclear overall; many candidate and party representatives found them burdensome to apply in the media and on social networks, and ineffective in addressing unlawful campaigning.9

Third-party liability frameworks also differ significantly. The DSA may exempt "mere conduit" services,10 Regulation (EU) 2024/900 holds advertisers liable for unverified claims, while Romania makes platforms liable for unverified user-generated content, even if passively hosted. Enforcement timelines reveal varying levels of rigor: the DSA's standard of acting "without undue delay" for illegal content removal is tightened under GEO 1/2025 to a 5-hour takedown mandate during election periods, while Regulation (EU) 2024/900 requires ad libraries to be updated within 48 hours. Coordinated inauthentic behavior, such as the use of bot networks, is not explicitly addressed by any of the frameworks, despite the BEC's actions against such practices in Romania's 2024 local elections.

Systemic Risks and Electoral Precedents

The unprecedented annulment of Romania’s 2024 presidential elections due to AI-driven interference highlights the real-world stakes of these regulations. Romania’s Constitutional Court cited foreign-linked deepfake campaigns on TikTok that distorted voter perceptions, prompting the EU Commission to invoke the DSA’s crisis protocols and order TikTok to preserve related data. This intervention illustrates the layered enforcement model emerging in Romania: national authorities leverage GEO No. 1/2025 for rapid takedowns, while the EU enforces systemic risk mitigation under the DSA. 

The European Commission intensified digital political advertising oversight during the 2024-2025 electoral cycle, deploying a multi-pronged strategy to safeguard democratic processes across the EU. These efforts reflect a paradigm shift toward proactive platform accountability. In November 2024, the Commission convened major platforms to assess election readiness ahead of Romania’s presidential elections and to share risk mitigation plans addressing disinformation, foreign interference, and algorithmic biases, with particular scrutiny on TikTok’s recommender systems. 

The formal DSA investigation opened by the European Commission against TikTok for systemic risks during Romania's elections includes allegations of insufficient content moderation and opaque political ad policies.11 In May 2025, the Commission informed TikTok of its preliminary view that the company does not fulfil the DSA obligation to publish an advertisement repository, finding that TikTok does not provide the necessary information about the advertisements, the targeted users, and the sponsors. The investigation is still ongoing.12

Throughout this period, the Commission emphasized both proactive and reactive enforcement. Nevertheless, while VLOPs implemented measures countering inauthentic behavior, these proved insufficient to address public concerns, and the platforms' infrequent reporting obligations further limited transparency.13

Election Oversight Authorities and Approach to Election Security in 2025

Romania's electoral process operated under the administration of the Permanent Electoral Authority (PEA), with a temporary structure of electoral bureaus led by the Central Electoral Bureau (BEC). While the European Commission retained its authority over VLOPs, national authorities played a moderate role in content oversight and reporting. As Romania's Digital Services Coordinator under the DSA, the National Authority for Management and Regulation in Communications (ANCOM) joined the BEC's auxiliary technical staff alongside key state institutions to ensure a coordinated approach to handling complaints about illegal or misleading political advertisements.14

During the campaign, several state institutions flagged suspected inauthentic content and accounts to VLOPs, establishing a common platform to prevent duplication in reporting activities.15 ANCOM acted as the designated coordinator with the European Commission for flagging systemic risks, while the BEC adjudicated online campaigning complaints and transmitted content moderation instructions to VLOPs through PEA channels. Proactive monitoring of online behavior, however, was not mandated.16

As noted by the OSCE, authorities adopted fragmented approaches to oversight of the online space, with supervision of the online environment and of the campaign split across institutions, resulting in uncoordinated responses and limited transparency. Transparency regarding the content moderation practices of VLOPs was also limited, as details of enforcement actions and the criteria for content removal were not made public. Authorities reported improved, but still not fully adequate, cooperation with VLOPs, noting that although posts were removed, this was not always done promptly, and reposted or re-edited problematic content often remained accessible.

Analyzing Political Actors and Advertisement Practices in 2025 Digital Elections

The Central Electoral Bureau (BEC) received over 4,000 complaints, primarily concerning inauthentic accounts and unlabeled posts by alleged political actors. These included attempts to influence the visibility of posts and the use of bots, troll farms, and AI-generated material to amplify or suppress candidates' content.17 While the BEC anonymized complainants for data protection reasons, many appeared to share identical initials. The high volume of complaints from a limited number of individuals may have targeted specific candidates and slowed down the BEC's activities.

A review of more than 4,000 BEC decisions shows that the institution applied the notion of "political actor" unevenly and, at times, expansively. Although a formal clarification stated that private individuals may acquire this status if they "predominantly and repetitively" disseminate electoral propaganda on personal accounts, that threshold stretches the concept well beyond its contours in EU and Romanian legislation.

Contrasting case files starkly illustrate just how elastic the BEC's test for "political actor" has become. Over the course of the presidential campaign, the Bureau labelled as political actors (i) a faceless Facebook page that merely reposted pro-candidate memes, (ii) a municipal councilor sharing campaign slogans, and (iii) an anonymous TikTok channel urging spoiled ballots, yet in a separate ruling it refused to apply the label to the sitting Prime Minister, who publicly demanded that one contender quit the race. Citizens unaffiliated with any political party were often declared "political actors," with their posts expressing a voting intention deemed unlawful for lacking proper labeling. Such an elastic and inconsistent interpretation undermines legal certainty and might deter legitimate political expression.

The only constant seems to be the Bureau’s reliance on ambiguous fragments of Art. 3(4) of Regulation 2024/900: sometimes it invokes point (d), the “elected mandate” criterion; more often it falls back on point (g), the catch-all clause covering accounts that “act on behalf of a candidate”, even where no author can be identified. Because the Bureau never explains what evidence establishes that an account “acts on behalf” of anyone, identical behaviors are classified differently from one file to the next.

A lone voter who reposts a poll alongside the slogan “We can and we will” is transformed into an institutional campaigner, while a verified party leader encouraging strategic voting escapes scrutiny because the message is framed as “personal opinion”. Likewise, exhortations to void one’s ballot are alternately treated as lawful political speech or as unlabeled advertising, depending solely on which panel happened to examine the complaint.

The result is chilling uncertainty: creators cannot predict when ordinary advocacy will be rebranded as paid propaganda, and platforms are left to guess which posts must carry transparency labels. Until the Bureau articulates a coherent, rule-based interpretation of the provisions of GEO 1/2025, enforcement will remain ad hoc and susceptible to political pressure. For voters, the opacity further erodes confidence in the fairness of the online campaign.

Conclusion

The 2024–2025 cycle revealed both strengths and gaps in the EU’s approach. While the Political Advertising Regulation and DSA provided robust instruments, delayed implementation timelines and inconsistent platform compliance hindered effectiveness. The Commission’s focus on algorithmic accountability—exemplified by Germany’s stress tests and Romania’s deepfake bans—signals a future where transparency and user control over recommender systems become electoral safeguards. As the European Democracy Shield evolves, its success depends on harmonizing national enforcement priorities with EU-wide standards, ensuring that platforms serve rather than undermine democracies. Romania’s evolving laws position it as a testing ground for reconciling national electoral sovereignty with EU digital governance. 

As also concluded by the OSCE mission, while the legal framework provides an adequate basis for conducting democratic elections, recent changes did not sufficiently address the issues currently impacting public trust. While AI and deepfake regulations pioneer safeguards against emerging threats, they also expose critical gaps in the EU framework, particularly regarding synthetic media and algorithmic accountability. 

Future elections will likely intensify scrutiny of these measures, especially if conflicts arise between GDPR data rights, DSA harmonization goals, and Romania's stringent national rules. For platforms, the challenge involves navigating this tripartite system, where EU transparency, Romanian speed, and AI ethics converge, without fragmenting the single market or stifling democratic discourse.

A pragmatic path for aligning national innovation with EU-wide coherence would be to establish a concrete institutional mechanism, such as a "Digital Democracy Taskforce," pooling the European Commission, ERGA, and national electoral bodies to issue binding thematic guidelines and dispatch joint audit teams during campaign periods. A procedural tool, such as an EU-level regulatory sandbox, would let member states pilot stricter measures, like real-time API access to political-ad libraries, under a common protocol before they are scaled across the Union, addressing the pressing need for harmonization.

  1. Recital 56 and Art. 11 of Regulation (EU) 2024/900, Art. 16 of GEO 1/2025, Recital 95 DSA
  2. Recital 57 and Art. 7 of Regulation (EU) 2024/900, Art. 16 of GEO 1/2025, Art. 26 DSA
  3. Recital 64 and Art. 13 of Regulation (EU) 2024/900
  4. Art. 39 DSA
  5. Recital 70, Arts. 27 and 38 DSA
  6. Art. 12 of Regulation (EU) 2024/900
  7. Romania's legislative landscape continues to evolve with new proposals targeting artificial intelligence (AI) and electoral integrity, further complicating its interplay with existing domestic and European law. The draft law PLx No. 471/2023 (the "Deepfake Regulation") and the parliamentary initiative PLx 184/2025 on the responsible use of AI exemplify Romania's push to address AI-driven risks in elections, while revealing tensions with EU-wide standards. The draft Deepfake Regulation aims to introduce stringent rules against synthetic media, requiring explicit warnings on AI-generated political content and criminal penalties for non-compliance. This aligns with GEO No. 1/2025's prohibition of deepfakes but conflicts with the DSA's narrower focus on transparency rather than content creation tools. Meanwhile, PLx 184/2025 seeks to implement the EU AI Act domestically, adopting its risk-based approach but adding national-specific obligations like algorithmic dataset audits and rigid deepfake regulation. This exceeds the EU AI Act's transparency requirements and risks clashing with the DSA's protections.
  8. Art. 16 (5) GEO 1/2025
  9. As also noted by the OSCE, the GEO was adopted without public consultation, only four months prior to the election, and has not effectively regulated other key concerns, such as the oversight of online political advertising and campaign finance.
  10. Recital 19 DSA
  11. Subsequently, TikTok reported that it had removed a total of 27,217 inauthentic accounts. The investigation specifically examined TikTok’s recommender systems, its handling of coordinated inauthentic behavior, and its policies around political advertising. VLOPs are only required to publish annual reports on their DSA-related activities. Transparency reports are published by the European Commission, with Meta and X’s most recent reporting going up to 30 September 2024, and TikTok’s to 31 December 2024. TikTok released data on 28 April, which showed that it continued to remove some coordinated inauthentic behavior.
  12. Parallel efforts included Germany’s election stress tests in January 2025, where VLOPs like Meta and Google agreed to limit microtargeting to basic demographics (age/gender) and prioritize fact-checked content in feeds.
  13. In January 2025, the European Parliament debated DSA enforcement in the context of ongoing investigations into TikTok and X. The Commission announced the launch of the European Democracy Shield, a cross-border initiative designed to combat disinformation through enhanced transparency, algorithmic accountability, and collaboration among Digital Services Coordinators. This initiative built on existing partnerships with organizations like the European Digital Media Observatory (EDMO) and civil society groups. In March 2025, it published the DSA Elections Toolkit, providing guidelines for Digital Services Coordinators to safeguard elections through real-time monitoring of political ad spend and microtargeting, coordination with fact-checkers and civil society, and stress tests for platforms. See also the Guidelines by the European Commission on the DSA responsibilities of VLOPs during elections, which state that VLOPs should combat disinformation, ensure the integrity of accounts, and clearly identify AI-generated content. The terms and conditions of Meta and TikTok include prohibitions on inauthentic accounts, disinformation, unlabeled AI-generated images, and fake engagement. Meta voluntarily adopted the Commission's "Election Ads Transparency Protocol," expanding its ad libraries to include targeting parameters and spending per demographic. By contrast, Google announced it would withdraw from political advertising in the EU from October 2025, citing regulatory uncertainty.
  14. ANCOM took a proactive stance by organizing a simulation exercise to counter manipulative online behavior and issued public guidance on reporting non-compliant political ads. During the campaign, ANCOM was actively involved in countering disinformation: it warned against cloned websites impersonating Romanian institutions, flagged micro-profiling of vulnerable groups, and worked to dismantle social media disinformation campaigns.
  15. In the first round, the Ministry of Internal Affairs flagged some 450 posts. ANCOM flagged 240 accounts to VLOPs in the first round, and more than 900 posts and accounts in the second round.
  16. According to state authorities, preparations to address online threats increased after the annulment of the November 2024 presidential election. A meeting designed to enhance institutional cooperation, referred to by authorities as a "stress test", was conducted on 27 March with the participation of the European Commission, the election administration, the CNA, and VLOPs. Limited information about its outcome was made available, missing an opportunity to inform the public about state efforts to tackle online threats. Additionally, on 26 March, the BEC established an inter-institutional "Working Group for Online Campaign" to process complaints submitted by citizens, political party representatives, and NGOs regarding online campaigning, and to support the BEC in taking decisions on possible violations, including instructing VLOPs through the PEA to remove unlawful content. Complaints could be submitted online, and during certain periods ahead of election day, the Working Group operated on a 24-hour basis.
  17. This resulted in 2,600 content-removal decisions, bringing the total number of removal decisions since the start of the first-round campaign to more than 9,000.