Policy Brief

The case for a European Union digital enforcement authority

The European Union's digital rulebook could be better enforced by delegating some of the European Commission's powers to an independent agency

Publishing date: 05 March 2026
Issue number: 05/2026

Reading time: 27 minutes


Executive Summary

The European Commission plays the primary role in enforcing the European Union’s digital rules, but is under pressure to relax enforcement to avoid retaliation from the United States. In this context, it risks either succumbing to that pressure, undermining its own authority, or resisting so strongly that it over-uses its enforcement powers to penalise foreign competitors unfairly, driven by the false belief that this would help the EU become less dependent on foreign technology. The risk of upward or downward enforcement bias is problematic because it weakens the effectiveness of the EU’s regulatory framework and distorts market competition dynamics.

A country’s ability to set, preserve and enforce its own rules is perhaps the clearest expression of its autonomy. This Policy Brief asks whether a structural change in the institutional setup would improve the enforcement of the EU’s digital rulebook. We ask whether the European Commission should delegate to an independent agency the digital-enforcement powers given to it by the Digital Markets Act (DMA), Digital Services Act (DSA) and Artificial Intelligence (AI) Act. Independent EU agencies with significant enforcement powers, including the power to impose financial sanctions on private entities, already exist. An EU digital authority could follow this template.

We ask whether the challenge in outsourcing digital enforcement is outweighed by the expected increase in enforcement accuracy. Furthermore, we assess whether the establishment of an EU digital authority is technically, politically and legally feasible. We find that the case for an independent authority is not equally compelling for each of the three regulations examined: a structural separation of enforcement may be currently unsuitable for the DMA and the AI Act, while it is advisable for the DSA. We thus recommend the establishment of an independent EU agency to enforce the DSA, and provide an outline of its possible structure.

 

This Policy Brief benefited from helpful discussions within Bruegel. Many thanks in particular to Griffith Couser, Stephen Gardner, Bertin Martens, Paul Richter and Nicolas Véron for their helpful comments.

 

1 Addressing institutional inertia

European Union regulation of digital markets is in its relative infancy. The Digital Markets Act (DMA, Regulation (EU) 2022/1925) and the Digital Services Act (DSA, Regulation (EU) 2022/2065) only came into force in 2022. Other laws, such as the Artificial Intelligence Act (AI Act, Regulation (EU) 2024/1689), will not be fully enforceable until August 2026 at the earliest. The EU’s digital rulebook, however, may already have been outpaced by events: markets and the political environment have shifted in unanticipated ways since the legislation was passed, reducing its expected effectiveness.

This is not the first instance of public institutions failing to adapt to evolving environments. The 2008 global financial crisis was largely caused by the inability of the United States’ Securities and Exchange Commission and the US Federal Reserve to update their oversight powers to anticipate and mitigate risk in emerging derivatives markets (FCIC, 2011). Similarly, the slow public response to the COVID-19 pandemic in 2020 was rooted in the rigid protocols of public health institutions, which proved inadequate for addressing the global emergency (Schiff and Mallinson, 2023).

In European digital markets, the risk of institutional inertia may be compounded by the European Commission’s role in enforcing the rules. The Commission is unlikely to be an effective market watchdog1. The EU treaties established it as a technocratic body focused primarily on economic integration within the single market. Insulation from political control was seen as a source of legitimacy (Moravcsik, 2002). But the Commission is no longer solely technocratic; it has developed into a political entity (Haroche, 2023) that pursues ambitious goals in areas including climate and technology, and deploys trade and defence policies to advance the EU’s geopolitical ambitions. 

In digital markets, the Commission is under pressure to relax enforcement against US companies because of fear of US retaliation2. The Commission has a history of yielding to trade threats. For example, in July 2021, the European Commission suspended work on a potential EU-wide digital levy following US pressure and US threats of retaliatory tariffs against EU countries that had implemented their own national digital taxes3. In 2024, the Commission removed sovereignty requirements from a proposed cloud service certification scheme after the US threatened to exclude European companies from US government procurement markets4. All these instances have undermined the Commission’s credibility as a digital-rules enforcer.

However, the Commission also risks over-enforcing EU digital regulations by yielding to domestic protectionist pressure instead. It could be tempted to use its enforcement powers to unfairly penalise foreign competitors, driven by the notion that this would help the EU become less dependent on foreign technology5. The DMA is intended to expand competition in digital markets and to directly ensure ‘fairness’ in market outcomes. Fairness, however, is a fuzzy concept, making it sometimes difficult to determine whether DMA measures are intended to protect businesses or consumers (Andriychuk, 2023). DMA enforcement may thus lend itself to being twisted for protectionist purposes. Meanwhile, the DSA and AI Act contain measures guarding against systemic risks that large platforms or AI providers may create. These risks, however, are not yet well-defined in practice; this vagueness could potentially be used to pursue political objectives.

The risk of upward or downward enforcement bias could undermine the effectiveness of the EU regulatory framework, increase business uncertainty, reduce competitiveness and investment incentives and diminish strategic autonomy6. Based on this premise, this paper asks whether a structural change in the institutional setup could improve enforcement of the EU’s digital rules. Specifically, we ask whether the European Commission should delegate its enforcement powers under the DMA, DSA and AI Act to an independent EU agency. 

This paper focuses narrowly on specific, incremental improvements to the current regulatory framework. It does not address other specific but equally relevant questions, such as whether the current distribution of EU digital enforcement power between the supranational and national levels should be refined. Nor does it address the broader question regarding the adequacy of the current EU regulatory framework. 

Independent EU agencies with significant enforcement powers, including the power to impose pecuniary sanctions on private entities, already exist. The European Securities and Markets Authority (ESMA) has direct supervisory authority over critical market infrastructure, such as credit rating agencies and trade repositories (Véron, 2025). Established in 2024, the Anti-Money Laundering Authority (AMLA) will directly supervise 40 complex, cross-border financial groups. Both ESMA and AMLA can bypass national regulators to oversee companies directly, conduct their own investigations and, most importantly, impose heavy financial penalties. The two agencies are legally accountable only to the Court of Justice of the EU (CJEU).

This paper evaluates whether the establishment of an independent digital agency modelled on ESMA or AMLA is warranted. That would be the case if two premises are demonstrated to be true: first, that such an institutional change increases enforcement effectiveness; second, that it is legally and technically feasible. In the paper, we show that these two conditions, for now, are unlikely to be met simultaneously under the DMA or the AI Act, whereas they are likely to be satisfied under the DSA. 

The paper is organised as follows. Section 2 explains the background of the Commission’s enforcement powers. Section 3 evaluates the first premise: the desirability and effectiveness of an independent digital agency. Section 4 evaluates the second premise: the legal and technical feasibility of such an agency. Section 5 compares the merits of an independent authority across the DMA, DSA, and AI Act. Finally, section 6 concludes by proposing a blueprint for establishing a separate agency to assume DSA enforcement duties currently sitting with the Commission.

2 The European Commission as enforcer of the EU’s digital rulebook

The EU’s digital rulebook comprises numerous EU regulations directly applicable to digital markets7. In this paper, we focus on the specific subset of regulations that include a significant enforcement role for the European Commission: the DMA, DSA and AI Act (Table 1). Other regulations that could be considered part of the EU’s digital rulebook do not contain a significant enforcement role for the Commission, often relying entirely on national authorities to carry out key enforcement duties, including conducting investigations and imposing sanctions for infringements8.

In addition to the DMA, DSA and the AI Act, the Commission exerts significant enforcement powers in the digital economy through competition policy: antitrust enforcement and merger control9. This paper, however, does not address the enforcement of EU competition law, as any such institutional change would entail amending the Treaty on the Functioning of the EU (TFEU), posing an overwhelming political challenge (amending the EU Treaty requires complex procedures and the unanimous agreement of all EU countries).

Table 1: Enforcement of the DMA, DSA and AI Act

Digital Markets Act
- What it regulates: core platform services provided by ‘gatekeepers’ (search, social networks, operating systems, app stores)
- Enforcement by the European Commission: sole public enforcer; designates gatekeepers; investigates; fines of up to 10-20 percent of global turnover
- Enforcement role of member states: national authorities assist investigations; national courts handle private enforcement
- Other enforcement bodies: High-Level Group for the DMA (coordination)

Digital Services Act
- What it regulates: intermediary services and online platforms; illegal content; transparency; systemic risks
- Enforcement by the European Commission: supervises VLOPs/VLOSEs; audits risk mitigation and data access; fines of up to 6 percent of global turnover
- Enforcement role of member states: designate Digital Services Coordinators (DSCs) as national enforcers
- Other enforcement bodies: European Board for Digital Services; national DSCs

AI Act
- What it regulates: all AI systems; bans; high-risk rules; transparency; general-purpose AI and systemic-risk models
- Enforcement by the European Commission: AI Office supervises general-purpose and systemic-risk models; fines of up to 3 percent of global turnover
- Enforcement role of member states: national AI competent authorities and market surveillance authorities oversee high-risk systems
- Other enforcement bodies: European AI Board; Scientific Panel; Advisory Forum; notified bodies

Source: Bruegel.

The DMA aims to ensure fairness and contestability in digital markets. It establishes a set of obligations for ‘gatekeepers’: large companies enjoying substantial, entrenched power in EU markets and providing critical ‘core’ digital services such as online search, social networks, operating systems and app stores. The Commission is the sole DMA enforcer, investigating potential infringements, requesting information and conducting inspections. It can impose sanctions of up to 10 percent of a violator’s total worldwide annual turnover (up to 20 percent for repeated violations). National authorities may support the Commission in investigating potential DMA infringements. However, only the Commission can impose sanctions and remedies.

The DSA establishes an EU regulatory framework for online intermediaries to enhance transparency, accountability and user protection, and to reduce the spread of illegal and harmful content online. In contrast to the DMA, enforcement powers under the DSA are shared between the Commission and member-state authorities. The Commission has exclusive competence over ‘very large online platforms’ (VLOPs) and ‘very large online search engines’ (VLOSEs), which have more than 45 million users in the EU.

Obligations for VLOPs and VLOSEs most notably include taking down illegal online content; identifying and analysing systemic risks from the spread of harmful content on their services, and implementing adequate mitigation procedures; and complying with transparency measures, such as maintaining a public database of the advertisements published on their services or enabling vetted researchers to access their internal data to monitor systemic risks. The Commission may impose penalties of up to six percent of the violator’s global annual turnover. National authorities, meanwhile, enforce obligations on smaller online intermediaries that do not meet the ‘very large’ threshold. The obligations enforced by national authorities primarily concern illegal online content.

Finally, the AI Act lays out a comprehensive framework governing the development, market placement, deployment and use of AI systems within the EU. The AI Act is modelled as a ‘product safety’ regulation. It primarily targets providers (ie developers), deployers (ie users) and distributors of AI systems with a tiered obligation structure linked to the risks these systems pose. An AI system is considered high risk if it is part of a product’s safety component or used in areas where the risk of harm is deemed high, such as employment, education or critical infrastructure10. Similarly to the DSA, enforcement is shared between member states and the Commission. The former generally supervise providers, deployers and distributors of high-risk AI systems; the latter enforces measures specifically targeting general-purpose AI (GPAI)11. Providers of GPAI must, for example, ensure that their services comply with EU copyright laws or put in place measures to mitigate systemic risk when present. If they fail to do so, the Commission can impose fines of up to three percent of their global annual turnover.

There is as yet no evidence that the Commission has incorrectly enforced the three regulations. Under the AI Act, GPAI obligations are not yet enforceable, though some are expected to become enforceable as early as August 2026. Regarding the DMA and the DSA, the rules enforced by the Commission became applicable only as of May 2023 (for the DMA) and August 2023 (for the DSA); given the time required to investigate, there have been only a few infringement decisions so far. DMA infringement decisions were taken against Apple (DMA.100109) and Meta (DMA.100055) in March 202512. A DSA infringement decision was taken against X in December 202513. Should these companies appeal to the EU General Court, that would provide an opportunity to assess the accuracy of the Commission’s decisions; however, it will take time before a judicial review of the three decisions is conducted.

Even when a regulation has been in place for a long time, it is hard to find evidence of improper enforcement because enforcement can also be distorted by inaction, which is difficult to quantify. It is difficult to prove that the Commission should have opened an investigation when it did not, or that it failed to sanction a company that it should have, on the basis of information available only to the parties involved14. Thus, there is little factual evidence so far that the available enforcement tools need to be improved.

A case can, however, be made that the risk of distortion has increased. External and internal political pressure on the Commission is high. The US has openly demanded concessions on digital regulation in return for EU steel tariff relief15. Commission Vice President Teresa Ribera openly accused the US of blackmailing the Commission through trade talks to prevent the enforcement of the EU’s digital rulebook16. French President Emmanuel Macron blamed the Commission for being reluctant to tackle large US tech companies17.

Increased political interference in enforcement is associated with a greater risk of distortion (Laffont and Tirole, 1991; Laffont and Tirole, 1993; Laffont, 1999). For instance, in the banking sector, Lambert (2019) found that regulators are 44.7 percent less likely to initiate enforcement actions against banks that lobby them. Similarly, concerning the enforcement of competition policy, Fidrmuc et al (2018) found that the probability of a merger being challenged or blocked is negatively correlated with the acquirers’ lobbying activity before the merger announcement. Mehta et al (2020) showed that mergers in the political districts of powerful US congressional members who serve on committees with antitrust oversight receive relatively favourable antitrust review outcomes.

The perception that there is a high risk of enforcement distortion in the current institutional setting is widespread among stakeholders. In response to a call for evidence during the first review of the DMA, the Commission reported that “several respondents expressed concerns about insufficiently robust enforcement” and that there were “calls for preserving the DMA’s political independence, […] by ensuring that its application remains insulated from broader political considerations”18. Reports have documented enforcement gaps in the DSA, with vetted researchers accusing platforms of implementing delaying tactics to prevent them from accessing their data (Scott et al, 2025). In November 2025, the Socialists & Democrats group in the European Parliament reportedly requested the establishment of a formal inquiry committee into a possible failure by the Commission to enforce the DSA19.

3 Independent regulatory agencies in practice

If the risk of digital regulatory distortion is high, the question is whether delegating enforcement to an independent agency would mitigate it. Insulation from political influence is a primary reason why governments establish independent regulatory agencies. Gilardi (2002) provided empirical evidence that, for western European governments, mandating independence is often a strategic decision to shield enforcement from the political cycle and to enhance the credibility of regulatory actions. Sadeh and Rubinson (2024) listed additional benefits of independence: increased expertise because regulatory agencies are more likely than governments to employ staff with the necessary technical skills (Koop and Hanretty, 2018); and protection against political uncertainty because their working plans are not supposed to change when the government does (Ruffing et al, 2024). 

Independent agencies can also serve as ‘lightning rods’, protecting elected officials from public blame when unpopular choices are made (Heinkelmann-Wild et al, 2023). An independent authority is not accountable to the electorate and is therefore more likely to make decisions that are positive for social welfare but may negatively affect specific constituencies (Maskin and Tirole, 2004). These advantages help explain why international bodies such as the EU, the Basel Committee, and the OECD have promoted independence from political interference as an institutional regulatory model (Koop and Jordana, 2022).

The theoretical connection between enforcement quality and independence has been thoroughly examined in the literature, but empirical evidence remains limited. 

Empirical analyses struggle with imperfect cross-country comparisons due to differences between institutional solutions and regulatory contexts. Koop and Jordana (2022) surveyed the existing literature, finding that studies focusing on utility regulations (such as electricity or telecoms) or financial services tended to identify either a significant positive effect on regulatory outcome quality or no significant effect at all. Koop and Hanretty (2018) analysed competition authorities in 30 OECD countries, finding that formal independence has a positive and highly significant effect on agencies’ ranking in terms of the overall perceived quality of their enforcement and regulatory activity. They also found a positive and significant relationship between an agency’s size and the quality of its work, pointing to scale efficiency effects. Ma (2010) found that substantive independence has a positive effect on antitrust and merger enforcement. However, when independence is granted only on paper, and governments retain some political control over the agency, that effect vanishes – a finding confirmed by Guidi (2015).

Independence would be particularly welcome in the face of external pressure on trade policy, as is the case for the enforcement of the EU’s digital rulebook (see Box 1 for a game-theory analysis). This is especially relevant when the preferences of the principal (in this case, the European Commission, potentially appeasing the US) cannot be aligned with those of the agent (the newly established regulatory authority, which is only concerned with proper enforcement)20.

In conclusion, gaining substantial independence from political control reduces the risk of over- or under-enforcement; if the Commission were to delegate its enforcement powers to a digital authority, this would greatly improve the quality of enforcement of EU digital rules.

Box 1: A game-theory analysis of how regulatory independence can help with foreign interference

To help visualise how an independent EU digital authority would respond to US pressure, consider the following game in reduced form. There are two players: the EU and the US. The EU can either enforce the rules or refrain from doing so. The US can either impose a trade restriction, such as applying an import tariff, or refrain from restricting trade when the EU’s digital rulebook is applied to a US company. Formally, the action space for the EU is S_EU = {E, NE} and for the US is S_US = {T, NT}, where E stands for ‘enforcement’, NE for ‘non-enforcement’, T for ‘trade restriction’ and NT for ‘no trade restriction’. The EU benefits from enforcing its digital rules (it gains B), the US loses L when rules are enforced against US companies, and B > L > 0 (this assumption implies that regulatory action generates value for the economy, for example by enhancing competition, and that this value is higher than the loss experienced by US companies hit by EU enforcement). If the US retaliates, it triggers a trade war, leaving both players worse off than under the status quo: the EU and the US lose D_EU and D_US respectively. D_EU > B ensures that, for the EU, the loss from a trade war is not offset by the benefit of enforcement. Table 2 summarises the payoffs across all four possible scenarios.

Table 2: Reduced-form enforcement game payoff matrix

                         US tariff (T)               US no tariff (NT)
EU enforce (E)           (B − D_EU, −L − D_US)       (B, −L)
EU not enforce (NE)      (−D_EU, −D_US)              (0, 0)

Source: Bruegel.

Under normal circumstances, with all players behaving rationally, the (E, NT) equilibrium emerges, as ‘enforce’ and ‘no trade restriction’ are dominant strategies for the EU and the US, respectively. No matter what the other player does, it is always better for each to play its dominant strategy; (E, NT) is, therefore, a Nash equilibrium: each player’s action is the best response to the action chosen by the other, and no player has an incentive to deviate from it.

Let us then assume that the US credibly commits to retaliating whenever EU enforcement occurs, regardless of whether the short-term consequences are self-defeating21. If that were the case, only two equilibria would be possible, (E, T) and (NE, NT); the two players would then converge to the latter, where neither experiences a loss. Conversely, let us consider a third scenario in which the EU is committed to enforcing its digital regulations because it would no longer be within the Commission’s power to yield to US pressure (in Table 2, this effectively eliminates the bottom row). The US may still threaten to impose a tariff; however, since the EU is no longer able to refrain from enforcement, the US threats would be irrelevant. It would become increasingly untenable for the US to maintain retaliation, as the game would stabilise at an equilibrium in which the US has the worst payoff: −L − D_US.

This does not mean that the US would refrain from making threats. In January 2026, the US State Department threatened the United Kingdom with retaliation in response to the UK’s independent telecom and digital regulator (OFCOM) launching an investigation into X’s alleged lack of compliance with the UK Online Safety Act22. However, the UK government has no control over OFCOM’s investigative and fining powers. Therefore, the US threats are empty: enacting them in the long term would be counterproductive and dynamically unsustainable. Hence, the original Nash equilibrium (E, NT) would likely emerge.
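The equilibrium logic in Box 1 can be checked mechanically. The sketch below is a minimal illustration, not part of the Brief: the payoff numbers are arbitrary placeholders that satisfy the stated assumptions (B > L > 0 and D_EU > B), and it enumerates the pure-strategy Nash equilibria of the game in Table 2.

```python
# Pure-strategy Nash equilibria of the reduced-form enforcement game.
# Placeholder payoff values satisfying the Brief's assumptions:
# B > L > 0 and D_EU > B (all numbers are illustrative only).
B, L, D_EU, D_US = 10, 4, 15, 12

EU_ACTIONS = ["E", "NE"]   # enforce / not enforce
US_ACTIONS = ["T", "NT"]   # tariff / no tariff

# payoffs[(eu, us)] = (EU payoff, US payoff), matching Table 2
payoffs = {
    ("E",  "T"):  (B - D_EU, -L - D_US),
    ("E",  "NT"): (B,        -L),
    ("NE", "T"):  (-D_EU,    -D_US),
    ("NE", "NT"): (0,        0),
}

def nash_equilibria(payoffs):
    """Return profiles where neither player gains by deviating unilaterally."""
    equilibria = []
    for eu in EU_ACTIONS:
        for us in US_ACTIONS:
            eu_best = all(payoffs[(eu, us)][0] >= payoffs[(alt, us)][0]
                          for alt in EU_ACTIONS)
            us_best = all(payoffs[(eu, us)][1] >= payoffs[(eu, alt)][1]
                          for alt in US_ACTIONS)
            if eu_best and us_best:
                equilibria.append((eu, us))
    return equilibria

print(nash_equilibria(payoffs))  # [('E', 'NT')]
```

For any numbers satisfying the assumptions, enforcement is strictly dominant for the EU and no-tariff is strictly dominant for the US, so the enumeration returns (E, NT) as the unique pure-strategy equilibrium, matching the argument above.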

 

The outsourcing of enforcement power by governments involves trade-offs. Ideally, the designated agency should be simultaneously free from political pressure when exercising its authority to investigate infringements and prosecute violations, and accountable for the exercise of its powers and expenditure of public resources. Institutional design choices influence the extent to which these two often-conflicting goals are met (Kovacic and Mariniello, 2016).

Establishing an independent EU digital agency would require defining its relationship with the European Parliament, the European Council and the European Commission. Various measures can strengthen the agency’s ability to resist pressure from political decisionmakers on which cases to pursue and how they are resolved. Granting agency leaders fixed-term appointments and prohibiting their removal from office, except for good cause, is a common approach. For example, ESMA’s chairperson and executive directors are appointed by the Board of Supervisors for five-year terms (renewable once), subject to European Parliament approval23. They may be removed only for serious misconduct or if they no longer meet the conditions necessary for the performance of their duties (eg becoming permanently incapacitated).

In AMLA, the primary decisionmaker sanctioning regulatory breaches and setting pecuniary sanctions is the Executive Board, which includes AMLA’s chairperson and five independent, full-time members. The chairperson is appointed through a procedure involving the Commission, European Parliament and the Council of the EU24, and can be removed only by the Council, under the same conditions as those for removing ESMA’s leadership.

Funding can also provide autonomy if the agency can secure resources without relying on institutional approval. Allowing the agency to collect and retain user fees protects the agency from political interference in the form of budgetary pressure. Both ESMA and AMLA, for example, collect fees from stakeholders that are proportional to the expected cost of regulatory supervision; AMLA’s fees are based on banks’ size and risk profile. 

Enforcement of the EU’s digital rulebook, however, is not uniformly funded. The DMA collects no fees from gatekeepers. Similarly, the AI Act does not require GPAI providers to pay fees; it leaves it to member states to decide whether fees can be imposed on larger AI providers, for example to access regulatory sandboxes. Under the DSA, the Commission levies an annual supervisory fee on VLOPs and VLOSEs, which is proportional to the number of their users in the EU and capped at 0.05 percent of their global annual net income.
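To illustrate how the DSA’s fee mechanism operates, the following sketch computes a platform’s capped supervisory fee. The function name and all company figures are hypothetical; only the user-proportional allocation and the 0.05 percent net-income cap come from the DSA.

```python
DSA_FEE_CAP_RATE = 0.0005  # DSA cap: 0.05% of global annual net income

def dsa_supervisory_fee(total_cost, platform_users, all_users, net_income):
    """Share the supervision cost in proportion to EU user numbers,
    then cap the platform's fee at 0.05% of its global net income.
    (Illustrative sketch; names and figures are hypothetical.)"""
    proportional_share = total_cost * platform_users / all_users
    return min(proportional_share, DSA_FEE_CAP_RATE * net_income)

# Hypothetical numbers: EUR 45m total supervision cost, a platform with
# 100m of 800m total VLOP/VLOSE users, and EUR 9bn global net income.
# The proportional share (EUR 5.625m) exceeds the cap (EUR 4.5m),
# so the cap binds.
print(f"{dsa_supervisory_fee(45e6, 100e6, 800e6, 9e9):,.0f}")
```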

However, an agency that is entirely autonomous can become detached from the policy decisions that shape the regulatory process. For instance, the relevant enforcement units within the Commission are usually involved in the drafting of regulatory proposals. Bringing their expertise from concrete enforcement cases, they can contribute significantly to shaping the legislative text. A separate, autonomous agency may instead be sidelined and lose influence over legislative proposals that affect markets more significantly than any enforcement actions the agency might pursue (Kovacic and Mariniello, 2016).

Accountability can be maintained by subjecting agency decisions to judicial review, ensuring that the agency operates within the bounds of its authority. ESMA and AMLA are both subject to the CJEU’s ultimate review, for example. The CJEU can annul decisions or adjust imposed fines upward or downward. Companies can similarly appeal, ultimately to the CJEU, against a Commission decision based on the DMA, DSA or AI Act.

4 Outsourcing digital enforcement as a feasible option

Is the establishment of a separate digital enforcement agency a realistic option? A new agency can be established through a ‘staff spinoff’ from the European Commission. For example, the European Chemicals Agency was set up this way, moving experts from the Commission’s Joint Research Centre to Finland, where the agency is based (JRC, 2007). AMLA is being set up by the AMLA Task Force, which is based within the Commission’s Directorate General for Financial Stability, Financial Services, and Capital Markets Union (DG FISMA). The Task Force is taking all the preparatory steps for setting up the agency, including selecting facilities and recruitment. AMLA is expected to be fully operational approximately four years from when it was established on 26 June 202425.

The transfer of enforcement powers from the Commission to a new EU digital agency would require legislative amendments. During the legislative process leading to the adoption of the DSA, the European Parliament adopted a resolution explicitly requesting that the Commission assess the feasibility of appointing an existing authority or establishing a new one to carry out enforcement tasks26. This did not happen for the DSA, but such a process was enshrined in the text of the AI Act, which mandates an assessment in 2029 “with regard to the structure of enforcement and the possible need for a Union agency to resolve any identified shortcomings” (AI Act, Article 112). Compared to the European Parliament, EU governments have been more reluctant to establish independent EU enforcement agencies. Yet member states have also been increasingly vocal about the need for stronger enforcement of the EU’s digital rulebook. For example, France has criticised the Commission for its alleged inaction27, and ministers from 13 EU countries, at a meeting in October 2025, discussed creating a single EU digital regulator28.

If sufficient political will were available for institutional restructuring, the final obstacle would be the EU Treaty. In 1958, the Court of Justice of the European Communities (the precursor to today’s CJEU) interpreted the Treaty establishing the European Coal and Steel Community as implying strict limits on the Commission’s ability to delegate regulatory enforcement powers to external agencies. This interpretation is referred to as the ‘Meroni doctrine’, from the case that originated it29. Accordingly, the Commission cannot delegate discretionary powers with a wide breadth of policy implications to external bodies unless the scope is strictly defined and closely supervised by the delegating authority. An independent agency must be a highly technical body that executes clearly prescribed tasks without independent political responsibility.

In principle, the Meroni doctrine seems to significantly undermine the prospects of establishing an independent EU digital agency. However, over the years, Meroni has been superseded in practice. De facto, agencies such as the European Supervisory Authorities have exercised substantial powers in setting technical standards. De jure, in the 2014 ESMA short-selling case30, the CJEU confirmed Meroni but significantly relaxed its restrictions (Simoncini, 2025). The Court stated that delegated enforcement powers may include some discretion, provided there are objective criteria and specific conditions subject to judicial review. This is especially true when specific technical expertise is required to achieve a regulatory goal. The establishment of AMLA in 2024 confirmed that significant delegation of powers is considered compatible with EU law: AMLA selects high-risk institutions that require monitoring, supervises them and, when necessary, issues binding decisions and pecuniary fines.

Regarding the EU’s digital rulebook, the Commission exercises its discretionary powers most prominently by defining who is subject to its enforcement actions: it designates DMA gatekeepers, DSA VLOPs and VLOSEs, and GPAI models under the AI Act31. Moreover, the Commission must assess whether companies are effectively implementing adequate measures to mitigate systemic risks, such as the spread of harmful online content or large-scale incidents arising from the use of large language models. Such an assessment is complex and may involve some subjectivity. Conversely, powers that are considered more technical include running investigations, requesting information, auditing, testing and inspecting, identifying non-compliance or infringements and monitoring ex-post compliance. The imposition of fines and remedies could also be considered technical, if discretion in sanctioning power is limited by well-defined criteria to ensure objectivity once an infringement is proved.

Ultimately, the extent to which the enforcement of the EU digital rulebook can be transferred from the Commission to an external agency will depend on whether the new agency’s decisions can be taken within a framework of bounded discretion (permissible under Meroni), or could become precedent-setting to the extent that they resemble rulemaking (in breach of Meroni).

5 Comparing the DMA, DSA and AI Act

The case for outsourcing enforcement powers from the Commission is not equally compelling across the three regulations examined: currently, there is no strong case for an independent digital agency enforcing the DMA or the AI Act. Conversely, we argue that the enforcement of the DSA would improve significantly if delegated to such an agency.

For the DMA, establishing an independent agency is likely to lower the risk of over- or under-enforcement. However, this would come at a high price. The DMA overlaps strongly with the EU’s competition policy framework: the regulation itself was shaped based on the Commission’s past antitrust experience (Caffarra and Scott Morton, 2021). Some obligations for gatekeepers, such as informing the Commission about the intention to acquire another company, are intended to leverage complementarities with the competition framework – the EU merger regulation, for example (Mariniello, 2025).

Thus, DMA enforcement benefits greatly from synergies between Commission directorates-general (mainly DG-CONNECT and DG-COMP, in this case). Those synergies would be lost if the Commission were to stop enforcing the DMA. A structural separation between competition enforcement and DMA enforcement could, moreover, lessen the ability to coordinate investigations under different laws and increase the likelihood that the same company would be charged twice for the same infringement32. Finally, a DMA-enforcing agency would be most exposed to legal challenges under the Meroni doctrine, precisely because the DMA enforcement powers are reminiscent of those conferred on the Commission by Article 105 of the TFEU.

For the AI Act, the risk of running afoul of the Meroni doctrine is lower, because the AI Act is primarily conceived as a product regulation and the enforcer’s role is largely to ensure that AI systems are properly developed and conform to regulatory standards (a function involving little discretion). This also applies to GPAIs supervised by the Commission, with providers of GPAIs having to keep information on how the model is trained up to date33. However, the regulatory framework for AI is not yet sufficiently stable: AI is evolving rapidly in ways that legislators struggle to anticipate. For example, the Commission’s AI Act proposal, released in April 2021, did not include the provisions on GPAIs; those were rushed into the text by the Council and Parliament after the release of ChatGPT in November 202234, because legislators had not foreseen the need for specific provisions targeting GPAIs.

It is equally telling that the Commission proposed amending the AI Act with its Digital Omnibus proposal in November 2025, before the AI Act had even entered into full force (European Commission, 2025). In such a dynamic context, in which the legislative process is still in flux, the benefit of outsourcing enforcement to an independent agency seems outweighed by the drawback of removing critical expertise from within the Commission35. This, of course, does not preclude the possibility that delegating enforcement powers under the AI Act may become desirable in the future, as the regulatory framework is expected to become more stable. 

For the DSA, the case for an independent EU authority is strongest. Of the three regulations, the DSA is subject to the greatest internal and external political pressure, particularly because of its links to freedom of speech. In July 2025, for example, the US House Judiciary Committee released a staff report focused on the DSA, titled ‘The Foreign Censorship Threat: How the European Union’s Digital Services Act Compels Global Censorship and Infringes on American Free Speech’36. Opposition to the DSA’s alleged potential to censor free speech also comes from within Europe. In January 2026, Polish President Karol Nawrocki vetoed the national law implementing the DSA, warning that its enforcement framework risked enabling “administrative censorship”37. This suggests that the risk of enforcement distortion is highest with the DSA and, therefore, that creating an independent enforcement authority would yield significant benefits by minimising that risk.

The drawbacks of outsourcing enforcement, such as the risk of jeopardising the expertise needed to draft legislation, are limited: DSA obligations are not expected to change anytime soon. The DSA’s text is sufficiently broad to provide the necessary legal basis to tackle harmful online content, and significant discretion is involved in assessing compliance with obligations, such as the measures VLOPs and VLOSEs should put in place to mitigate systemic risks. These obligations, however, are progressively specified through Commission guidance. For example, the Commission releases specific guidelines, such as those on election integrity38 or on protecting minors from addictive design39, that help qualify the risks that platforms must mitigate. It promotes non-binding codes of conduct, such as the one on disinformation40, which show how the risk of spreading disinformation can be alleviated (for example, by avoiding advertising next to disinformation). And it uses delegated acts, such as the Delegated Regulation on Auditing (C(2023) 6807), which directs auditors on how to test claims by VLOPs and VLOSEs. In other words, the Commission is circumscribing the framework within which enforcement operates, limiting its discretion.

6 Policy recommendations

Based on the above discussion, we conclude that the establishment of an independent EU agency to enforce the DSA should be prioritised. The DSA agency should be established in line with the AMLA model. In practice, this means:

Similarly to AMLA’s direct supervision of selected high-risk financial institutions, the DSA agency should directly supervise DSA-designated VLOPs and VLOSEs. The Commission could retain the power of designation. However, the DSA agency would assume day-to-day supervision and inspection powers over these entities, ensuring a neutral, technical application of the law, free from political interference.

The Commission and the DSA agency would share the authority to define the criteria for assessing whether risk-mitigation measures have been implemented. The Commission could have the final word on broader principles, while the DSA agency would define the specific technical metrics and audit standards required of industry to demonstrate compliance. However, the DSA agency would have full, autonomous power to request information, inspect and audit platforms. It should be empowered to conduct investigations independently and to ultimately decide on DSA infringements. The DSA agency should be endowed with the authority to impose administrative fines, establish remedies and monitor ex-post compliance.

Mimicking AMLA, the DSA agency’s governance structure could comprise a General Board, an Executive Board, a chair, an executive director and an Administrative Board of Review with functions, composition and dismissal protections as described in section 3. The General Board would include the heads of national authorities responsible for enforcing the DSA at national level. The DSA agency’s decisions could ultimately be subject to appeal to the CJEU.

To supervise 40 European financial groups, AMLA is expected to employ slightly more than 400 staff members, half of whom will perform direct supervision tasks, organised into joint supervisory teams that include staff from national authorities. Taking that as a rough reference, a back-of-the-envelope calculation suggests that the DSA agency could need 200-250 staff members to supervise the currently designated 21 VLOPs and VLOSEs41. The DSA agency could initially be established through a spin-off from the Commission’s dedicated DSA unit within DG-CONNECT and the European Centre for Algorithmic Transparency (ECAT, currently within the Commission’s Joint Research Centre)42. Similarly to AMLA, the DSA agency’s staff could be supported by seconded national staff joining joint supervisory teams.
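
The back-of-the-envelope calculation can be made explicit. The AMLA reference numbers below are those cited in the text; the assumption that direct-supervision staff scale roughly linearly with the number of supervised entities, and the illustrative range applied to support functions, are ours:

```python
# Back-of-the-envelope staffing estimate for a DSA agency, scaled from AMLA.
# Assumption (ours): direct-supervision headcount scales roughly linearly
# with the number of supervised entities, while support functions (legal,
# IT, administration) scale less than proportionally.

AMLA_TOTAL_STAFF = 400   # expected AMLA headcount (slightly more in practice)
AMLA_SUPERVISED = 40     # financial groups under AMLA direct supervision
DSA_SUPERVISED = 21      # currently designated VLOPs and VLOSEs

direct_share = 0.5       # half of AMLA staff perform direct supervision

# Direct-supervision staff per supervised entity at AMLA: 400 * 0.5 / 40 = 5
direct_per_entity = AMLA_TOTAL_STAFF * direct_share / AMLA_SUPERVISED

# Direct supervision for the DSA agency: 5 * 21 = 105 staff
dsa_direct = direct_per_entity * DSA_SUPERVISED

# Support functions: between fully proportional scaling (21/40 of AMLA's
# 200 support staff) and a higher fixed-cost share (an illustrative 70%).
dsa_support_low = AMLA_TOTAL_STAFF * (1 - direct_share) * DSA_SUPERVISED / AMLA_SUPERVISED
dsa_support_high = AMLA_TOTAL_STAFF * (1 - direct_share) * 0.7

low = dsa_direct + dsa_support_low    # 105 + 105 = 210
high = dsa_direct + dsa_support_high  # 105 + 140 = 245

print(f"Estimated DSA agency staff: {low:.0f}-{high:.0f}")
```

Under these assumptions the estimate lands at roughly 210-245 staff, consistent with the 200-250 range suggested above; the point is the order of magnitude, not the precise figure.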

The need to shore up DSA enforcement against political interference is compelling and urgent. Transferring enforcement powers away from the Commission would require the DSA regulation to be amended. The agency’s practical setup would then require a multi-year transition. Using AMLA as a realistic benchmark, establishing the DSA agency would likely follow a three- to four-year trajectory. A quicker approach could also be considered: the DSA envisages an advisory body, the European Board for Digital Services (EBDS), composed of national DSA enforcers and chaired by the Commission. The new agency could evolve from the EBDS, provided that its institutional structure is transformed radically along the lines of what we propose for the DSA agency, to ensure its full independence from the Commission.

References

Andriychuk, O. (2023) ‘Do obligations for gatekeepers create entitlements for business users?’ Journal of Antitrust Enforcement 11(1): 123–132, available at https://doi.org/10.1093/jaenfo/jnac034

Bruegel Dataset (2023) ‘A dataset on EU legislation for the digital world’, version of 6 June 2024, available at https://doi.org/10.64153/HRCE1630

Caffarra, C. and F. Scott Morton (2021) ‘The European Commission Digital Markets Act: A translation’, VoxEU, 5 January, available at https://cepr.org/voxeu/columns/european-commission-digital-markets-act-translation

European Commission (2025) ‘Proposal for a Regulation as regards the simplification of the implementation of harmonised rules on artificial intelligence (Digital Omnibus on AI)’, COM/2025/836 final, available at https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52025PC0836

FCIC (2011) The Financial Crisis Inquiry Report: Final Report of the National Commission on the Causes of the Financial and Economic Crisis in the United States, Financial Crisis Inquiry Commission, available at https://www.govinfo.gov/app/details/GPO-FCIC

Fidrmuc, J.P., P. Roosenboom and E. Quxian Zhang (2018) ‘Antitrust merger review costs and acquirer lobbying’, Journal of Corporate Finance 51: 72-97, available at https://doi.org/10.1016/j.jcorpfin.2018.05.001

Gilardi, F. (2002) ‘Policy credibility and delegation to independent regulatory agencies: a comparative empirical analysis’, Journal of European Public Policy 9(6): 873-893, available at https://doi.org/10.1080/1350176022000046409

Guidi, M. (2015) ‘The impact of independence on regulatory outcomes: The case of EU competition policy’, Journal of Common Market Studies 53(6): 1195-1213, available at https://doi.org/10.1111/jcms.12280

Haroche, P. (2023) ‘A “geopolitical commission”: Supranationalism meets global power competition’, Journal of Common Market Studies 61(4): 970-987, available at https://doi.org/10.1111/jcms.13440

Heinkelmann-Wild, T., B. Zangl, B. Rittberger and L. Kriegmair (2023) ‘Blame shifting and blame obfuscation: The blame avoidance effects of delegation in the European Union’, European Journal of Political Research 62(1): 221-238, available at https://doi.org/10.1111/1475-6765.12503

JRC (2007) JRC Annual Report 2007, Joint Research Centre, European Commission, available at https://joint-research-centre.ec.europa.eu/system/files/2014-02/jrc_ar_2007.pdf

Koop, C. and C. Hanretty (2018) ‘Political independence, accountability, and the quality of regulatory decision-making’, Comparative Political Studies 51(1): 38-75, available at https://doi.org/10.1177/0010414017695329

Koop, C. and J. Jordana (2022) ‘Regulatory independence and the quality of regulation’, in M. Maggetti, F. Di Mascio and A. Natalini (eds) Handbook of regulatory authorities, Edward Elgar, available at https://doi.org/10.4337/9781839108990

Kovacic, W. and M. Mariniello (2016) ‘Competition Agency Design In Globalized Markets’, Policy Options Paper, E15 Expert Group on Competition Policy and the Trade System, available at https://www3.weforum.org/docs/E15/WEF_Competitition_Policy_Trade_Global_Economy_Towards_Integrated_Approach_report_2015_1401.pdf

Laffont, J.-J. (1999) ‘Political economy, information and incentives’, European Economic Review 43(4-6): 649-669, available at https://doi.org/10.1016/S0014-2921(98)00130-5

Laffont, J.-J. and J. Tirole (1991) ‘The politics of government decision-making: A theory of regulatory capture’, The Quarterly Journal of Economics 106(4): 1089-1127, available at https://doi.org/10.2307/2937958

Laffont, J.-J. and J. Tirole (1993) A theory of incentives in procurement and regulation, MIT Press

Lambert, T. (2019) ‘Lobbying on regulatory enforcement actions: Evidence from US commercial and savings banks’, Management Science 65(6): 2545-2572, available at https://dx.doi.org/10.1287/mnsc.2017.2895

Ma, T.-C. (2010) ‘Competition authority independence, antitrust effectiveness, and institutions’, International Review of Law and Economics 30(3): 226-235, available at https://doi.org/10.1016/j.irle.2010.04.001

Mariniello, M. (2025) ‘Reinforcing EU merger control against the risks of acquisitions by big tech’, Policy Brief 11/2025, Bruegel, available at https://www.bruegel.org/policy-brief/reinforcing-eu-merger-control-against-risks-acquisitions-big-tech

Maskin, E. and J. Tirole (2004) ‘The politician and the judge: Accountability in government’, American Economic Review 94(4): 1034-1054, available at https://www.aeaweb.org/articles?id=10.1257/0002828042002606

Mehta, M.N., S. Srinivasan and W. Zhao (2020) ‘The politics of M&A antitrust’, Journal of Accounting Research 58(1): 5-53, available at https://doi.org/10.1111/1475-679X.12291

Moravcsik, A. (2002) ‘In Defence of the “Democratic Deficit”: Reassessing Legitimacy in the European Union’, Journal of Common Market Studies 40(4): 603-624, available at https://doi.org/10.1111/1468-5965.00390

Motta, M. (2004) Competition Policy: Theory and Practice, Cambridge University Press

Ruffing, E., M. Weinrich, B. Rittberger and A. Wonka (2024) ‘The European administrative space over time: Mapping the formal independence of EU agencies’, Regulation & Governance 18(3): 740-760, available at https://doi.org/10.1111/rego.12556

Sadeh, T. and E. Rubinson (2024) ‘Agency independence and credibility in primary bond markets’, Regulation & Governance 18(4): 1332-1368, available at https://doi.org/10.1111/rego.12623

Schiff, E. and D.J. Mallinson (2023) ‘Trumping the centers for disease control: a case comparison of the CDC’s response to COVID-19, H1N1, and Ebola’, Administration & Society 55(1): 158-183, available at https://doi.org/10.1177/00953997221112308

Scott, M., D. Stockmann, T. Asher and A. Marchese (2025) Building Capacity for Data Access, Analysis + Accountability, Report of the Columbia-Hertie Working Group, available at https://worldprojects.columbia.edu/sites/default/files/2025-10/building-capacity-report.pdf

Simoncini, M. (2025) ‘Eroding Meroni: The Fate of the Delegation to EU Agencies’, STALS Research Paper 7/2025, Sant’Anna Legal Studies, available at https://iris.luiss.it/handle/11385/254398

Véron, N. (2025) Breaking the deadlock on a single supervisor to unshackle Europe’s capital markets union, Blueprint 35, Bruegel, available at https://www.bruegel.org/blueprint/breaking-deadlock-single-supervisor-unshackle-europes-capital-markets-union

Wils, W.P.J. (2003) ‘The principle of ne bis in idem in EC antitrust enforcement: a legal and economic analysis’, World Competition 26(2): 131–148, available at https://kluwerlawonline.com/journalarticle/World+Competition/26.2/WOCO2003001

Endnotes

  1.

    Mario Mariniello, ‘Is it time for an independent European digital authority?’ First Glance, 19 February 2024, Bruegel, https://www.bruegel.org/first-glance/it-time-independent-european-digital-authority.

  2.

    For example, in December 2025, the US imposed visa restrictions on former European Commissioner Thierry Breton, citing concerns about freedom of speech and the targeting of American social media platforms. The visa ban was preceded by US President Donald Trump’s threats against European companies such as Spotify, Siemens, SAP, Mistral and DHL if the EU were to “[insist on] discriminatory and harassing lawsuits, taxes, fines” against US service providers. See Barbara Moens, ‘EU readies tougher tech enforcement in 2026 as Trump warns of retaliation’, Financial Times, 4 January 2026, https://www.ft.com/content/ca6f3062-f286-4a13-b81d-6e2a35c91fdc; and United States Trade Representative, ‘The U.S. continues to engage with the EU on the Digital Markets Act...’, post on X, 16 December 2025, https://x.com/ustraderep/status/2000990028835508258.

  3.

    Jorge Valero, ‘EU puts its digital tax on hold after US pressure’, Euractiv, 12 July 2021, https://www.euractiv.com/news/eu-puts-its-digital-tax-on-hold-after-us-pressure/.

  4.

    The EU wanted EU cloud services to be free from non-EU legal interference, implying that a US judge could not issue a warrant for European data held abroad by US companies. The sovereignty requirements were eventually revised into a lighter obligation. See Foo Yun Chee, ‘EU drops sovereignty requirements in cybersecurity certification scheme, document shows’, Reuters, 3 April 2024, https://www.reuters.com/technology/eu-drops-sovereignty-requirements-cybersecurity-certification-scheme-document-2024-04-03/.

  5.

    An analysis of the European Commission’s archives shows that the use of the words ‘protect European’ in official documents on digital economy-related policy areas increased from an average of four times per year from 2000 to 2019 to 48 times per year between 2020 and 2025. The analysis covers Commission factsheets, infringement decisions, news, press releases, questions and answers, read-outs, speeches and statements in the policy areas: antitrust, AI, competition, competitiveness, cybersecurity, digital economy and society, disinformation, research and innovation, single market, state aid and technology. See the European Commission’s Press Corner website: https://ec.europa.eu/commission/presscorner/home/en.

  6.

    For an overview of the literature on how distorted competition enforcement can affect competitiveness, see Motta (2004).

  7.

    For an EU digital legislation tracking tool see Bruegel Dataset (2023).

  8.

    Examples of these include: the Interoperable Europe Act (Regulation (EU) 2024/903), the General Data Protection Regulation (GDPR, Regulation (EU) 2016/679) and the Data Act (Regulation (EU) 2023/2854).

  9.

    As enshrined in Articles 101 and 102 of the Treaty on the Functioning of the European Union (OJ C 326, 26.10.2012) and in the EU Merger Regulation (Regulation (EU) 139/2004).

  10.

    Annex III of the AI Act lists the areas in which risk is presumed to be high.

  11.

    According to Article 3 of the AI Act: “‘general-purpose AI model’ [GPAIs] means an AI model […] that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market and that can be integrated into a variety of downstream systems or applications”.

  12.

    DMA cases can be found at https://digital-markets-act-cases.ec.europa.eu/search.

  13.

    The Commission also sent preliminary findings of infringement to Meta and TikTok in October 2025 and to TikTok in February 2026. The investigations are ongoing at time of writing. See European Commission press release of 24 October 2025 ‘Commission preliminarily finds TikTok and Meta in breach of their transparency obligations under the Digital Services Act’, https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2503; and European Commission press release of 6 February 2026, ‘Commission preliminarily finds TikTok’s addictive design in breach of the Digital Services Act’, https://ec.europa.eu/commission/presscorner/detail/en/ip_26_312.

  14.

    Anecdotal evidence is not hard to find. For example, in early January 2026, sexualised photos flooded the social network X. See A.J. Vicens and Raphael Satter, ‘Elon Musk’s Grok AI floods X with sexualised photos of women and minors’, Reuters, 3 January 2026, https://www.reuters.com/legal/litigation/grok-says-safeguard-lapses-led-images-minors-minimal-clothing-x-2026-01-02/. Pursuant to the DSA, X had already submitted its assessment of the risk of harmful content dissemination to the Commission in August 2025. That assessment contained no mention of the risk that its proprietary LLM, Grok, could be misused to produce sexualised deepfake images of minors.

  15.

    Antonia Zimmermann, Pieter Haeck, Koen Verhelst and Camille Gijs, ‘US demands digital concessions in return for EU steel tariff relief’, Politico, 24 November 2025, https://www.politico.eu/article/us-steel-tariff-relief-eu-digital-concessions-2025/.

  16.

    Daniel Viaña, ‘The Vice President of the European Commission states that the United States is “blackmailing” the EU’, El Mundo America, 27 November 2025, https://www.mundoamerica.com/news/2025/11/27/69282bf021efa0050f8b458e.html.

  17.

    Pieter Haeck and Océane Herrero, ‘Macron says Brussels is “afraid” of tackling US Big Tech’, Politico, 28 November 2025, https://www.politico.eu/article/brussels-afraid-of-tackling-us-big-tech-emmanuel-macron-says/.

  18.

    See European Commission, 'DMA Review - Summary of the contributions to the targeted consultation, call for evidence and AI consultation', undated, https://digital-markets-act.ec.europa.eu/document/download/244d8f93-e969-41af-bdcc-23e791863449_en.

  19.

    Max Griera, Eliza Gkritsi and Pieter Haeck, ‘Socialists push for formal inquiry into enforcement of EU digital rules’, Politico, 27 November 2025, https://www.politico.eu/article/socialists-formal-inquiry-enforcement-eu-digital-rules-big-tech/.

  20.

    Extensive literature surveyed by Ruffing et al (2024) suggests that the functioning of independent authorities can be understood through a principal-agent dynamic.

  21.

    This appears to have been the approach adopted by the second Trump administration. For example, in early 2025, the US administration introduced a 25 percent tariff on Canada and Mexico targeting ‘non-US content’ in vehicles. The move proved self-defeating, causing significant losses in the US manufacturing sector and significant increases in average car prices. Following the US Supreme Court ruling of 20 February 2026 (Learning Resources, Inc. v. Trump, 607 U.S. 24–1287 (2026)), the Trump administration would not be able to impose such tariffs without the authorisation of the US Congress. However, it would still retain the power to retaliate against a foreign act that is considered discriminatory (this could allegedly be the case for an act of EU enforcement) under Section 301 of the US Trade Act (Pub. L. 93-618, 88 Stat. 1978, 19 U.S.C. ch. 12).

  22.

    Beatrice Nolan, ‘U.K. investigation into X over allegedly illegal deepfakes risks igniting a free speech battle with the U.S.’, Fortune, 12 January 2026, https://fortune.com/2026/01/12/uk-investigation-x-xai-grok-deepfakes-us-europe-free-speech-battle/.

  23.

    In ESMA, investigations are carried out by an Independent Investigating Officer, appointed on an ad-hoc basis by ESMA’s Executive Director. Infringement and fining decisions are made by the Board of Supervisors (comprising the heads of national regulators) under the guidance of ESMA’s chairperson.

  24.

    The Commission proposes a list of candidates for scrutiny by the European Parliament. Based on the Parliament’s feedback, the Chairperson is then formally appointed by the Council. The Council also formally adopts the decision to appoint the other five independent members of AMLA’s Executive Board.

  25.

    See AMLA’s FAQ website: https://www.amla.europa.eu/faqs_en#setting-up-process.

  26.

    European Parliament resolution of 20 October 2020 with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online (2020/2019(INL)), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52020IP0273.

  27.

    See endnote 17.

  28.

    Luca Bertuzzi, ‘EU single digital regulator on agenda at meeting of 13 European ministers’, MLex, 25 September 2025, https://www.mlex.com/mlex/articles/2392242/eu-single-digital-regulator-on-agenda-at-meeting-of-13-european-ministers.

  29.

    Meroni & Co., Industrie Metallurgiche, SpA v High Authority of the European Coal and Steel Community (1958), Case 9-56, https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:61956CJ0009.

  30.

    United Kingdom of Great Britain and Northern Ireland v European Parliament and Council of the European Union (2012), Case C-270/12, https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:62012CJ0270.

  31.

    The regulation envisages numeric thresholds above which companies are presumed to be within the scope of the Commission’s enforcement. For example, a GPAI model is presumed to have high-impact capabilities – and thus systemic risk – if the total computational power used for its training exceeds 10²⁵ floating-point operations (FLOPs). However, the Commission can designate gatekeepers, VLOPs, VLOSEs and GPAIs on its own initiative based on additional information. The designation is subject to judicial review by the CJEU.

  32.

    This would violate the ‘ne bis in idem’ principle, enshrined in Article 50 of the EU Charter of Fundamental Rights; see Wils (2003).

  33.

    Note that providers of GPAI models must also assess and mitigate systemic risk when it is present; thus, the AI Act also contains more general, behavioural obligations that are not strictly product related and that may entail some discretion.

  34.

    Innocenzo Genna, ‘The regulation of foundation models in the EU AI Act’, International Bar Association, 12 April 2024, https://www.ibanet.org/the-regulation-of-foundation-models-in-the-eu-ai-act.

  35.

    The Commission’s AI Office currently participates in shaping the Commission’s AI legislative proposals. It is reasonable to expect that the quality of such proposals would decline significantly, should the Commission lose the technical expertise it has accumulated.

  36.

    See US House Judiciary Committee press release of 25 July 2025, ‘The Foreign Censorship Threat: How the European Union’s Digital Services Act Compels Global Censorship and Infringes on American Free Speech’, https://judiciary.house.gov/media/press-releases/foreign-censorship-threat-how-european-unions-digital-services-act-compels.

  37.

    Fernanda Seavon, ‘Poland’s DSA Veto Shows How National Politics Can Stall EU Tech Rules’, Tech Policy Press, 23 January 2026, https://www.techpolicy.press/polands-dsa-veto-shows-how-national-politics-can-stall-eu-tech-rules/.

  38.

    See European Commission publication of 26 April 2024, ‘Guidelines for providers of VLOPs and VLOSEs on the mitigation of systemic risks for electoral processes’, https://digital-strategy.ec.europa.eu/en/library/guidelines-providers-vlops-and-vloses-mitigation-systemic-risks-electoral-processes.

  39.

    See European Commission publication of 14 July 2025, ‘Commission publishes guidelines on the protection of minors’, https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-protection-minors.

  40.

    See European Commission publication of 13 February 2025, ‘The Code of Conduct on Disinformation’, https://digital-strategy.ec.europa.eu/en/library/code-conduct-disinformation.

  41.

    This number may evolve over time. The full list of VLOPs, updated as of February 2026, is available at: https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses

  42.

    ECAT was launched in April 2023 to support DSA enforcement through scientific and technical expertise, providing an ideal knowledge base to establish an authority with strictly technical enforcement tasks. See the European Centre for Algorithmic Transparency website: https://algorithmic-transparency.ec.europa.eu/index_en.