AI Act: How to Ensure Compliance Through Well-Structured European and National Governance

The new European regulation on artificial intelligence, the AI Act, fundamentally redefines the obligations of companies operating in this field. It is more than just a list of rules: it establishes a clear and ambitious governance framework, organized between the European institutions and the member states. Understanding this structure is essential for any organization that designs, deploys, or uses AI systems.

This article explains the different levels of governance set by the AI Act, the respective roles of European and national authorities, and the key compliance deadlines to meet.

A Two-Tier Governance Structure for the AI Act

The EU’s Strategic Leadership Role

The AI Act is built around a dual-level governance mechanism. At the European level, institutions set the strategy, establish technical safeguards, and coordinate information exchange; at the national level, member states carry out day-to-day implementation and enforcement.

The European AI Office, the European AI Board, and their advisory bodies form the backbone of this structure. These entities are responsible for drafting delegated acts, ensuring coherence of practices, and overseeing general-purpose AI models.

Member States at the Core of Local Implementation

Each member state is tasked with implementing these requirements through specialized national authorities. This includes notifying bodies, market surveillance authorities, and institutions responsible for protecting fundamental rights.

This continuous dialogue between the EU and national authorities ensures harmonized enforcement while allowing states the flexibility to tailor controls to their local realities.

European Authorities Responsible for AI Act Compliance

European AI Office (Article 64)

Operational since February 21, 2024, this central hub drafts delegated acts, manages the future European database, and oversees enforcement of the regulation, particularly for general-purpose AI models.

European Artificial Intelligence Board (Articles 65 & 66)

Starting August 2, 2025, this board will bring together one representative from each member state; the Commission will convene its first meeting on that date.

Advisory Forum (Article 67)

This multistakeholder body will gather both commercial and non-commercial stakeholders to provide additional expertise to the European AI Board and the Commission.

Scientific Panel of Independent Experts (Article 68)

Composed of independent AI experts, this group will offer technical advice to the European AI Office and the European AI Board.

National Authorities: Pillars of AI Regulation Enforcement

National Competent Authorities and Single Points of Contact (Article 70)

By August 2, 2025, each member state must designate at least one notifying authority and at least one market surveillance authority. One of them, by default the market surveillance authority, will serve as the single point of contact for the Commission and the other member states.

Market Surveillance Authorities (Article 74)

These authorities, to be established or empowered by August 2, 2025, will conduct investigations, mandate the withdrawal or update of non-compliant AI systems, and impose penalties in accordance with EU market surveillance rules.

Notifying Authorities (Article 28)

Each notifying authority is a public body (such as a ministry or an accreditation agency) responsible for evaluating, designating, and monitoring future notified bodies. At least one must be operational in each member state by August 2, 2025.

Notified Bodies (Articles 29 to 38)

Independent from public authorities, these entities, designated through the notification process, will issue conformity certificates for high-risk AI systems. The first designations are expected by late 2025 or early 2026, to ensure certificates are available ahead of the August 2, 2026 technical deadline.

Fundamental Rights Protection Authorities (Article 77)

As AI-related incidents can infringe on privacy, equality, or labor rights, each country must identify and publish the list of bodies responsible for protecting these rights. The deadline was November 2, 2024, well ahead of the main 2025 governance deadline. As of now, Hungary and Italy have yet to designate theirs.

In France, three authorities responsible for the protection of fundamental rights have been designated: the Directorate General for Competition Policy, Consumer Affairs and Fraud Control (DGCCRF), the National Commission on Informatics and Liberty (CNIL), and the Defender of Rights.

In contrast, some member states have designated only a single authority (Latvia, for example), while others have designated significantly more, such as Ireland with 9 authorities and Portugal with 14.

Special Case: European Union Institutions

For AI systems deployed or used directly by EU institutions, bodies, or agencies, any references to national competent or market surveillance authorities should be understood as referring to the European Data Protection Supervisor (EDPS).

AI Act Compliance Timeline: Key Deadlines

The compliance calendar is staggered: the European AI Office has been operational since February 2024, fundamental rights protection authorities were due by November 2, 2024, national competent authorities and the first meeting of the European AI Board follow on August 2, 2025, and the first notified bodies are expected by late 2025 or early 2026, ahead of the August 2, 2026 deadline for high-risk AI systems. In practice, this means that while the European AI Office is already in place, member states have just over a year (until August 2, 2025) to build their entire national enforcement framework: regulators, notifying authorities, and market surveillance bodies.

However, the deadline for appointing the bodies responsible for protecting fundamental rights in AI contexts fell slightly earlier, on November 2, 2024. As of today, Hungary and Italy have still not complied.

The list of authorities reported by member states to the European Commission is available on the official website: Governance and enforcement of the AI Act | Shaping Europe’s digital future.

The AI Act introduces multi-layered governance that structures compliance both at the European and national levels. For businesses, it is critical to begin preparing for upcoming controls and certifications now, to ensure their systems meet the requirements by the initial deadlines.

Want to learn more? Discover our AI compliance solutions
