Topic Hub
EU AI Act Compliance
Everything technology companies need to know about the EU AI Act — risk classification, compliance deadlines, and practical steps to prepare before the August 2026 enforcement date.
Overview
The EU AI Act is the world's first comprehensive legal framework for artificial intelligence, establishing binding obligations for companies developing or deploying AI systems in the European market. With the high-risk system deadline of 2 August 2026 now less than four months away, technology companies face a closing window to assess their AI systems, implement compliance programmes, and establish governance frameworks.
This hub brings together our practical guides, regulatory analysis, and free resources to help your team navigate EU AI Act compliance — whether you are a provider building AI systems or a deployer integrating AI into your products and operations.
What You'll Find Here
Practical guides on EU AI Act risk classification and obligations
Provider vs deployer compliance requirements explained
Key deadlines and enforcement timeline
Sector-specific guidance for SaaS, fintech, and healthtech
Downloadable checklists and compliance playbooks
GPAI and general-purpose AI model obligations
Key Dates
Enforcement timeline and deadlines.
2 Feb 2025
Prohibited AI practices provisions became applicable.
2 Aug 2025
General-purpose AI (GPAI) obligations, governance structure, and penalties took effect.
2 Aug 2026
High-risk AI system obligations become enforceable — the primary compliance deadline for most companies.
2 Aug 2027
Obligations take effect for high-risk AI systems that are safety components of products covered by existing EU product safety legislation.
Guides & Articles
In-depth resources on this topic.
AI Governance Frameworks for Responsible Deployment: A Practical Guide
From the EU AI Act to India's governance guidelines, a global regulatory architecture for AI is taking shape. This article maps the landscape and provides a practical roadmap.
Protecting Intellectual Property in AI Development Agreements
As AI development increasingly involves third-party datasets, pre-trained models, and collaborative arrangements, the question of who owns what has never been more commercially significant.
EU AI Act Penalties: What Technology Companies Need to Know Before August 2026
The EU AI Act introduces fines of up to €35 million or 7% of global turnover. With the main enforcement deadline four months away, here is what technology companies need to understand.
Coming soon
EU AI Act: What Tech Companies Need to Know
EU AI Act High-Risk AI Classification Explained
EU AI Act: Provider vs Deployer Obligations
EU AI Act Compliance for SaaS Companies
EU AI Act General-Purpose AI (GPAI) Rules
EU AI Act vs U.S. State AI Laws: A Comparison
Frequently Asked Questions
What is the EU AI Act?
The EU AI Act is the world's first comprehensive legal framework for artificial intelligence. It establishes a risk-based classification system that imposes different obligations depending on how an AI system is used, ranging from outright prohibitions on certain practices to transparency requirements for lower-risk systems.
Who does the EU AI Act apply to?
The Act applies to providers (developers) and deployers (users) of AI systems that are placed on the EU market or whose outputs are used in the EU — regardless of where the company is based. If your AI system affects people in the EU, you likely have obligations under the Act.
What are the penalties for non-compliance?
Penalties are tiered by the severity of the violation, and each cap is the higher of a fixed amount or a percentage of global annual turnover. Prohibited AI practices attract the highest fines: up to €35 million or 7% of turnover. High-risk system violations can attract fines of up to €15 million or 3% of turnover, and providing incorrect information to authorities can result in fines of up to €7.5 million or 1% of turnover.
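The tiered caps above follow a simple rule: each ceiling is the higher of a fixed amount or a percentage of worldwide annual turnover. A minimal sketch of that calculation (illustrative only, not legal advice; tier labels are our own shorthand, and actual fines are set case by case by regulators):

```python
def max_fine_eur(violation_tier: str, global_annual_turnover_eur: float) -> float:
    """Illustrative ceiling for an EU AI Act fine: the higher of a fixed
    amount or a percentage of worldwide annual turnover, by tier."""
    tiers = {
        "prohibited_practice": (35_000_000, 0.07),   # prohibited AI practices
        "high_risk_obligation": (15_000_000, 0.03),  # high-risk system violations
        "incorrect_information": (7_500_000, 0.01),  # misleading authorities
    }
    fixed_cap, pct_cap = tiers[violation_tier]
    return max(fixed_cap, pct_cap * global_annual_turnover_eur)

# Example: a company with €2 bn turnover facing a prohibited-practice fine
print(max_fine_eur("prohibited_practice", 2_000_000_000))  # 140000000.0
```

For smaller companies the fixed amount dominates; for large ones the turnover percentage does, which is why the headline "€35 million or 7%" figure scales with company size.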
How do I know if my AI system is high-risk?
The EU AI Act defines high-risk AI systems in two ways: (1) AI used as a safety component of products already covered by EU product safety legislation, and (2) AI used in specific use cases listed in Annex III, including biometrics, critical infrastructure, education, employment, essential services, law enforcement, migration, and justice. A self-assessment against these categories is the starting point.
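The two routes to high-risk status described above can be expressed as a first-pass screening check. A rough sketch (the area labels are simplified shorthand for the Annex III headings listed in the answer; the Act attaches detailed sub-conditions to each area, so this is a screening aid, not a legal determination):

```python
# Simplified labels for the Annex III high-risk areas named above.
# Each area carries detailed sub-conditions in the Act itself.
ANNEX_III_AREAS = {
    "biometrics", "critical_infrastructure", "education", "employment",
    "essential_services", "law_enforcement", "migration", "justice",
}

def preliminary_screen(is_safety_component: bool, use_case_area: str) -> str:
    """First-pass screen mirroring the two routes to high-risk status."""
    if is_safety_component:
        return "potentially high-risk (product safety route): seek legal review"
    if use_case_area in ANNEX_III_AREAS:
        return "potentially high-risk (Annex III route): seek legal review"
    return "not flagged by this screen; a full assessment is still recommended"

# Example: an AI-driven CV-ranking tool falls under the employment area
print(preliminary_screen(False, "employment"))
```

A "not flagged" result here does not rule out other obligations, such as transparency requirements for limited-risk systems.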
When do companies need to be compliant?
The Act is being phased in. Prohibited AI practices became enforceable in February 2025, and GPAI rules have applied since August 2025. The main deadline — obligations for high-risk AI systems — is 2 August 2026. Companies should already be working towards compliance given the complexity of the requirements.
Does the EU AI Act apply to companies outside the EU?
Yes. The Act has extraterritorial reach, similar to the GDPR. It applies to any provider placing an AI system on the EU market and any deployer using an AI system whose output is used within the EU, regardless of where the provider or deployer is established.
Need help with EU AI Act compliance?
30 minutes. No preparation. No obligation.
Free 30-Min Consultation