Conformity assessment is the process by which a provider demonstrates that a high-risk AI system meets the applicable EU AI Act requirements before it is placed on the market or put into service. For many high-risk AI systems, the conformity assessment may be based on internal control. For some systems, a notified body may be involved. For systems covered by existing Union harmonisation legislation, AI Act requirements may become part of the relevant sectoral conformity assessment procedure. The governance question is not simply: “Do we need a conformity assessment?” The stronger question is: “Which path applies, which evidence supports it, who owns it, and what happens when self-assessment is not enough?”
Conformity assessment is mainly a provider obligation
Conformity assessment is primarily relevant to providers of high-risk AI systems. A provider that places a high-risk AI system on the market or puts it into service must ensure that the system complies with the applicable requirements and that the correct assessment path has been followed. Deployers usually do not perform provider-side conformity assessment — but deployers still need to know whether the AI system they use has the required provider documentation, instructions, declarations, CE marking where applicable, and evidence needed for responsible use. This makes conformity assessment relevant across the governance chain.
Self-assessment does not mean informal assessment
Internal control, often described as self-assessment, should not be mistaken for informal assessment. It does not mean the provider can simply declare that the system is compliant. It means the provider follows a structured conformity assessment process without the involvement of a notified body, where the AI Act allows that path. A self-assessment still needs evidence, technical documentation, requirements coverage, documented reasoning, traceability, and formal steps such as a declaration of conformity where required. Self-assessment is not a shortcut; it is a responsibility-heavy internal procedure.
When notified body involvement can become relevant
Notified body involvement may be required depending on the type of high-risk AI system and the applicable conformity assessment route. For certain systems, especially some biometric systems and systems connected to product-safety legislation or sector-specific rules, the assessment path can involve third-party conformity assessment. Notified body involvement can also come into play where harmonised standards or common specifications are not fully applied in relevant cases. The details depend on the system category, Annex I or Annex III context, applicable standards, common specifications, and actor role. This is exactly why organizations need structured governance: a simple risk label does not answer the conformity assessment question.
Conformity assessment depends on the system category
Some high-risk systems are safety components of products covered by existing Union harmonisation legislation listed in Annex I. Others are stand-alone high-risk AI systems referred to in Annex III. The conformity assessment procedure can differ. For product-related high-risk systems, AI Act requirements may be integrated into the existing conformity assessment procedure under the relevant sectoral legislation. For many stand-alone Annex III systems, internal control may apply. For certain biometric or special categories, notified body involvement may be relevant. A governance record should show why a specific route was selected.
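As a rough illustration of how this route selection could be recorded in governance tooling, consider a minimal sketch. The field names, categories, and returned route labels are illustrative assumptions, not a legal mapping:

```python
from dataclasses import dataclass

@dataclass
class HighRiskSystem:
    # All fields are illustrative governance flags, not legal definitions.
    annex_i_safety_component: bool  # safety component under Annex I sectoral legislation
    annex_iii_standalone: bool      # stand-alone high-risk system under Annex III
    third_party_category: bool      # category where third-party assessment can apply
    standards_fully_applied: bool   # harmonised standards / common specs fully applied

def assessment_route(system: HighRiskSystem) -> str:
    """Return an illustrative route label; real routing needs legal review."""
    if system.annex_i_safety_component:
        return "integrated into sectoral conformity assessment"
    if system.third_party_category and not system.standards_fully_applied:
        return "notified body involvement"
    if system.annex_iii_standalone:
        return "internal control"
    return "unclear: escalate for review"
```

The point is not the specific branching, which oversimplifies the legal position, but that the selected route and the inputs behind it become an explicit, reviewable record rather than an unstated assumption.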
Standards and common specifications matter
Harmonised standards and common specifications can affect the conformity assessment path and evidence basis. If the provider applies relevant harmonised standards or common specifications, this may support the conformity assessment. If they do not exist, are only partly applied, are restricted, or are not followed, the required route and evidence expectations may change. The organization should document which standards or specifications were considered, whether they were applied (fully or partially), whether restrictions existed, which evidence supports the claim, and whether notified body involvement is required. This information is part of conformity governance.
Conformity assessment is tied to technical documentation
A conformity assessment cannot be stronger than the documentation behind it. Technical documentation is central: it shows how the system meets the applicable requirements across risk management, data governance, transparency, human oversight, accuracy, robustness, cybersecurity, and other relevant areas. If the technical documentation is incomplete, the conformity assessment record is weak. If it is outdated, the assessment may no longer reflect the system. Conformity assessment must therefore connect directly to the evidence readiness of the technical documentation.
Conformity assessment and substantial modification
A high-risk AI system that has already undergone conformity assessment may require a new assessment if it is substantially modified. AI systems can change after initial approval: models are updated, use cases expand, performance changes, new functionality is added, systems are integrated into other products, human oversight changes, or intended purpose changes. The governance record should preserve what was assessed originally and identify changes that may trigger renewed assessment. A conformity assessment is not permanent if the system changes materially.
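The change types listed above can serve as a re-screening checklist. The trigger set here simply mirrors the paragraph; it is not the legal test for "substantial modification":

```python
# Change types from the paragraph above; this set is an illustrative checklist,
# not the legal definition of "substantial modification".
REASSESSMENT_TRIGGERS = {
    "model updated", "use case expanded", "performance changed",
    "functionality added", "integrated into other product",
    "human oversight changed", "intended purpose changed",
}

def flag_for_reassessment(changes: set[str]) -> set[str]:
    """Return the recorded changes that may warrant renewed assessment."""
    return changes & REASSESSMENT_TRIGGERS
```

Recording changes against an explicit trigger list preserves both what was assessed originally and why a renewed assessment was, or was not, opened.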
Deployer relevance: do not assume provider compliance blindly
Deployers of high-risk AI systems often rely on providers, but reliance should be documented. A deployer should know whether it has received instructions for use, relevant documentation, conformity information, declarations, and other provider evidence needed to use the system responsibly. If provider evidence is missing, incomplete, outdated, or unclear, this should appear as a governance gap. A deployer does not need to duplicate provider conformity assessment, but it should not approve deployment blindly. Vendor evidence matters.
Conformity assessment is not the same as AI screening
AI screening helps determine risk class, actor role, prohibited-practice relevance, obligations, and review needs. Conformity assessment is a provider-side compliance procedure for high-risk AI systems under the applicable AI Act route. These are connected, but not identical. Screening may identify that a system is high-risk and provider obligations are relevant — the conformity assessment then becomes part of the obligation and evidence path. Organizations should not confuse a screening result with conformity assessment completion. A risk class is the beginning. Conformity assessment is one possible downstream requirement.
Why a spreadsheet is not enough
Spreadsheets can mark whether conformity assessment is required. They cannot reliably govern the assessment path: which route applies, which standards were considered, which technical documentation supports the result, whether notified body involvement was required, whether a declaration exists, whether the assessment is current, or whether a substantial modification triggered renewed review. Conformity assessment requires structured evidence, actor role, system classification, documentation, approvals, and change history. A spreadsheet status field cannot carry that.
How EAB structures conformity assessment governance
In EAB, conformity assessment is handled as part of obligation and evidence governance. The AI System Registry identifies the system and responsible legal entity. AI Screening and Risk Classification identify whether high-risk logic may apply. Actor Role Assessment determines whether provider-side duties may be relevant to the organization. The Obligation Matrix surfaces conformity assessment as an obligation area where applicable. Evidence Readiness shows whether technical documentation, provider evidence, standards evidence, conformity documentation, declarations, or notified body information are complete, partial, missing, external, outdated, or unclear. Supervisor Approval makes gaps visible before deployment approval. The Re-Screening Queue supports renewed review when substantial modification or documentation changes occur. Audit-Ready Traceability preserves the decision path.
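The evidence states listed above can be modeled as an enumeration with a gap check. Which states block approval is a policy choice; the policy below is assumed for illustration and does not describe EAB's actual implementation:

```python
from enum import Enum

class EvidenceStatus(Enum):
    COMPLETE = "complete"
    PARTIAL = "partial"
    MISSING = "missing"
    EXTERNAL = "external"
    OUTDATED = "outdated"
    UNCLEAR = "unclear"

# Assumed policy: these states surface as gaps before supervisor approval.
BLOCKING = {EvidenceStatus.MISSING, EvidenceStatus.OUTDATED, EvidenceStatus.UNCLEAR}

def approval_gaps(evidence: dict[str, EvidenceStatus]) -> list[str]:
    """Evidence items that should be visible as gaps in the approval step."""
    return sorted(item for item, status in evidence.items() if status in BLOCKING)
```

Treating evidence status as structured data, rather than free text, is what lets gaps surface automatically at the approval step and feed the re-screening queue when documentation goes stale.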