AI Compliance Guides for Europe

Operational guidance for the people responsible for AI compliance.

Practical reading for compliance teams, legal departments, data protection officers, cybersecurity owners, and AI system owners who need to turn regulatory requirements into governed execution — across EU AI Act, GDPR, and NIS2.

EU AI Act Compliance
EU AI Act · Art. 5–9

What Is AI Compliance Screening?

A screening session creates the structured basis for determining what the EU AI Act requires of a specific AI system. Here is what that process actually involves — and why informal approaches fail at audit time.

8 min read · May 2025
EU AI Act · Art. 6 & Annex III

EU AI Act Risk Classification: How High-Risk AI Systems Are Identified

Risk classification is not self-evident. It depends on intended use, Annex III context, actor role, and system-specific review. This guide explains how high-risk status is determined — and why it is a governance step, not just a label.

12 min read · May 2025
EU AI Act · Art. 5

EU AI Act Prohibited Practices: What Article 5 Means for Organizations

Article 5 defines the absolute boundary. Before any risk classification begins, every AI system must pass a prohibited-practice gate. What the eight prohibition categories mean in practice — and what a documented check requires.

12 min read · May 2025
EU AI Act · Art. 3

Provider, Deployer, Importer or Distributor? Actor Roles under the EU AI Act

The EU AI Act assigns obligations based on the actor role an organization holds — not just by the AI system it uses. Understanding whether you are a provider, deployer, importer, or distributor determines what you must actually do.

12 min read · May 2025
EU AI Act · Art. 9–17

EU AI Act Obligations: Why Risk Classification Is Only the Beginning

Risk classification tells you the regulatory direction. It does not tell you what to do next. Understanding which obligations apply — and what each requires as evidence — is a separate governance step that classification alone cannot answer.

12 min read · May 2025
Enterprise AI Governance
Enterprise

Why Spreadsheets Fail for AI Governance

Spreadsheets create documentation — they do not create governance. AI governance requires responsibility, workflow, evidence, approval, audit trail, and reconstructable decisions. What the difference means when an auditor asks for evidence.

14 min read · May 2025
Executive · EU AI Act

How Boards Should Read AI Compliance Status

Boards are increasingly accountable for AI governance outcomes. What meaningful AI compliance reporting looks like at executive level — and what a board should be able to ask and verify.

14 min read · May 2025
Enterprise · EU AI Act

What Enterprise Auditors Need from AI Governance Records

Internal and external auditors reviewing AI governance are not assessing intent. They are assessing evidence. What enterprise-grade AI governance records look like and what gaps they expose.

15 min read · May 2025
EU AI Act

Technical Documentation under the EU AI Act

Technical documentation is not a file archive — it is the evidence backbone of AI governance. What Article 11 and Annex IV require and how documentation connects to risk classification, obligations, and audit readiness.

13 min read · May 2025
EU AI Act

Human Oversight under the EU AI Act

Article 14 requires more than a policy that says humans are in the loop. Deployers must assign real oversight to named people with competence, authority, and documented intervention rights.

12 min read · May 2025
EU AI Act

AI Literacy Obligations under the EU AI Act

Article 4 requires that providers and deployers ensure sufficient AI literacy in their staff. What this means in practice, who needs literacy, how it differs by role, and what evidence governance requires.

11 min read · May 2025
EU AI Act

General-Purpose AI Models (GPAI): What Organizations Need to Know

GPAI models sit behind many everyday AI tools. Most organizations are not GPAI providers — but they still need governance over how GPAI-based systems are selected, used, and evidenced.

13 min read · May 2025
EU AI Act

Conformity Assessment: When Self-Assessment Is Not Enough

Conformity assessment determines how a provider demonstrates that a high-risk AI system meets EU AI Act requirements. Self-assessment is available for many systems — but it is not informal, and it is not permanent.

13 min read · May 2025
EU AI Act

Post-Market Monitoring under the EU AI Act

Article 72 makes clear that compliance does not end at approval. High-risk AI systems must be monitored throughout their lifetime — with structured plans, evidence, corrective action, and re-screening when issues arise.

12 min read · May 2025
EU AI Act · Deployers

What Deployers Actually Have to Do

Many organizations assume EU AI Act compliance is mainly a provider problem. It is not. Article 26 gives deployers their own obligations — from following instructions and assigning oversight to managing evidence and approving use.

12 min read · May 2025
EU AI Act · HR

High-Risk AI in HR and Employment

AI in recruitment, evaluation, monitoring, and worker management is one of the most consequential high-risk areas under the EU AI Act. What Annex III means for HR tools, why governance must be use-case-specific, and what evidence is required.

13 min read · May 2025
EAB Compliance Platform

Reading about it is one thing. Governing it is another.

EAB turns EU AI Act, GDPR, and NIS2 compliance into a structured, attributed, audit-ready governance process — not a document folder.

EU-hosted · Anchored to CELEX 32024R1689

Get in Touch
Request More Information

Tell us about your organization and what you’re looking to address. We’ll follow up with the relevant information.