EU AI Act Compliance

Provider, Deployer, Importer, or Distributor? Understanding Actor Roles under the EU AI Act

EU AI Act obligations do not depend on risk classification alone. They depend on the organization's role in relation to the AI system. Before an organization can determine what it must do, it must first understand who it is — provider, deployer, importer, or distributor — in the specific context of each system it governs.

A company may develop an AI system, place it on the market, import it, distribute it, deploy it internally, operate it for customers, modify it, rebrand it, integrate it into another product, or use it in a regulated business process. These are not minor distinctions. They determine which obligations apply, which evidence must be prepared, which responsibilities remain with the organization, and which governance path the system must follow.

This is why Actor Role Assessment must happen before obligations are assigned: what an organization must do under the EU AI Act cannot be answered until its role in relation to the AI system has been established.

Actor role is a governance trigger

Many organizations begin AI compliance with a risk question: is the system high-risk? Does Annex III apply? These questions are important but incomplete without actor-role context. The same AI system can create different obligations depending on whether the organization is the provider, deployer, importer, distributor, or another relevant operator in the AI value chain.

A company that develops and markets an AI system under its own name may carry provider obligations. A company that uses a third-party system under its authority carries deployer obligations. A company that places a non-EU system on the EU market may carry importer obligations. A company that makes a system available in the EU supply chain may carry distributor obligations. The role is therefore not just a legal label — it is a governance trigger that determines which obligations are relevant.

Governance principle

A risk class without actor-role context is not operational governance. The obligation view is only complete when the system, the risk classification, and the organization's actor role are all identified — in that sequence.

Provider: when the organization creates or places the system on the market

A provider is generally the actor that develops an AI system — or has it developed — and places it on the market or puts it into service under its own name or trademark. The provider role is one of the most obligation-intensive positions. A provider may be responsible for system design, conformity-related duties, technical documentation, risk management, data governance, logging, transparency, human oversight design, accuracy, robustness, cybersecurity, quality management, and post-market monitoring.

Critically, the provider role is not limited to traditional software vendors. An organization may become provider-like if it builds an internal AI system, significantly modifies a third-party system, rebrands a system, places it on the market under its own name, or integrates AI capability into a product in a way that shifts responsibility. Many companies underestimate this exposure by assuming they are "only users" because they rely on an external model, platform, or API — but if they package, configure, adapt, or control the system in certain ways, the role analysis may become more complex.

Deployer: when the organization uses the system under its authority

A deployer is generally the actor that uses an AI system under its authority in a business context. For many enterprises, this will be the most common role. Examples include AI systems used in HR, customer service, marketing, fraud detection, credit support, analytics, healthcare administration, legal operations, or internal workflow automation.

The deployer role does not mean "no responsibility." Deployers may need to ensure appropriate use, human oversight, input data relevance, monitoring, record-keeping, transparency toward affected persons, and cooperation with providers or authorities — depending on the system and applicable obligations. The question is not only whether the vendor is compliant. The question is whether the organization's own use is governed.

Importer: when the organization brings a non-EU system to the EU market

An importer is generally an actor established in the EU that places on the market an AI system bearing the name or trademark of a person established outside the EU. For organizations working with non-EU AI vendors, this role can become relevant. The importer position matters because EU market access creates responsibility within the Union. The importer may have duties to verify that certain provider obligations have been addressed before the system can lawfully enter the EU market.

Importer assessment is a supply-chain governance question. The organization must understand whether it is merely using a system, making it available, placing it on the market, or acting as the EU-facing responsible link for a non-EU system — and if the latter, that status must be documented.

Distributor: when the organization makes the system available in the supply chain

A distributor is an actor in the supply chain, other than provider or importer, that makes an AI system available on the EU market. Distribution is not passive in regulatory terms. A distributor may not design the AI system or own the model, but it can still have obligations connected to making the system available — for example, verifying markings, documentation, or instructions before making a high-risk AI system available.

From a governance perspective, distributor assessment prevents a common mistake: assuming that only the developer has obligations. The EU AI Act regulates a chain of responsibility, and each actor in that chain must understand its position.

One organization can have more than one role

Actor Role Assessment must account for overlapping roles. A company may be deployer for one AI system, provider for another, importer for a third, and distributor in a separate product line. It may even hold more than one role in relation to the same system depending on what it does.

An organization may start as deployer of a third-party AI tool. Later, it modifies the system substantially, integrates it into its own product, or markets it under its own brand. The role may change. A company-wide assumption such as "we are only deployers" is too weak. The correct governance question is: which role does this organization have for this AI system, in this use case, at this point in time?

Actor role can change when the system changes

AI systems do not remain fixed. They are configured, adapted, extended, integrated, rebranded, repurposed, and connected to new workflows. A system originally used internally may become part of a customer-facing service. A tool originally for drafting may later be used for decision support. A third-party API may become embedded into a proprietary platform. These changes may affect the actor role — which is why Actor Role Assessment must be connected to re-screening and change governance.

Vendor responsibility does not remove deployer responsibility

One of the most common misunderstandings is the belief that AI compliance belongs entirely to the vendor. A vendor may be the provider — but an organization using the system may still be the deployer with its own responsibilities. The provider may need to supply information, documentation, instructions, and system-level compliance measures. The deployer may need to ensure appropriate human oversight, monitor operation, provide information to affected persons where required, and maintain internal governance records. Vendor responsibility does not eliminate organizational responsibility.

Certainty levels matter in role assessment

Actor-role assessment is not always straightforward. A company that develops and sells an AI system under its own brand is clearly a provider. A company that uses a commercial AI tool internally is likely acting as deployer. But many cases sit between these poles — a SaaS provider embedding external AI into its own product, a European entity introducing a non-EU system, a consulting firm configuring AI tools for clients, a group company centralizing AI procurement across subsidiaries.

In these cases, a simple yes/no role selection is insufficient. The governance record should allow uncertainty to be visible and distinguish between clear determinations, likely determinations, uncertain roles requiring review, and multiple possible roles. Hiding uncertainty creates false assurance. A governed process makes uncertainty visible, assigns responsibility, and preserves the review path.
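To make this concrete, here is a minimal sketch of what such a governance record could look like in code. It is illustrative only: the names (RoleDetermination, Certainty, the field names) are hypothetical and do not describe EAB's actual data model; the point is that the record carries the roles, the certainty level, the rationale, and the reviewer rather than a single yes/no flag.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class ActorRole(Enum):
    PROVIDER = "provider"
    DEPLOYER = "deployer"
    IMPORTER = "importer"
    DISTRIBUTOR = "distributor"


class Certainty(Enum):
    CLEAR = "clear"          # determination is unambiguous
    LIKELY = "likely"        # probable, but rests on stated assumptions
    UNCERTAIN = "uncertain"  # requires review before obligations are assigned
    MULTIPLE = "multiple"    # more than one role plausibly applies


@dataclass
class RoleDetermination:
    """One actor-role determination for one AI system in one use case."""
    system_id: str
    use_case: str
    roles: list[ActorRole]          # more than one role may apply to the same system
    certainty: Certainty
    rationale: str                  # why this determination was made
    reviewer: str                   # who is accountable for the determination
    determined_on: date
    review_due: date | None = None  # set whenever certainty is not CLEAR


# Example: a SaaS product embedding a third-party model sits between deployer
# and provider, so the uncertainty is recorded instead of hidden.
record = RoleDetermination(
    system_id="crm-assist-01",
    use_case="Customer-facing drafting assistant inside our SaaS product",
    roles=[ActorRole.DEPLOYER, ActorRole.PROVIDER],
    certainty=Certainty.MULTIPLE,
    rationale="External model is rebranded and configured inside our product; "
              "provider-side exposure needs legal review.",
    reviewer="AI Governance Lead",
    determined_on=date(2025, 3, 1),
    review_due=date(2025, 6, 1),
)
```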

How EAB structures actor role assessment

In EAB, Actor Role Assessment is part of the AI governance chain — not a footnote after risk classification. The AI System Registry establishes the system context. Business Intake captures how the system is used, whether it supports internal or external processes, and whether affected persons may be involved. Guided Technical Completion adds information about provider, deployment, integration, modification, and operational control. Actor Role Assessment then determines the organization's role, allows certainty levels and rationale to be documented, and connects the result to the Obligation Matrix. Audit-Ready Traceability preserves the role determination, changes, reviewer attribution, and later re-screening history.
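As a rough illustration of why the role determination must reach the obligation view, the sketch below maps roles to obligation areas. The mapping is paraphrased from the descriptions in this article, not quoted from the Act, and the function is hypothetical; in EAB itself the Obligation Matrix is assumed to hold the authoritative, system-specific mapping.

```python
# Obligation areas per actor role, paraphrased from the role descriptions above.
# Illustrative only: not a statement of the Act's requirements.
OBLIGATION_AREAS = {
    "provider": [
        "technical documentation", "risk management", "data governance",
        "logging", "transparency", "human oversight design",
        "accuracy, robustness and cybersecurity", "quality management",
        "post-market monitoring",
    ],
    "deployer": [
        "appropriate use", "human oversight", "input data relevance",
        "monitoring", "record-keeping", "transparency toward affected persons",
        "cooperation with providers and authorities",
    ],
    "importer": [
        "verification of provider obligations before placing on the EU market",
    ],
    "distributor": [
        "verification of markings, documentation and instructions",
    ],
}


def obligation_view(roles: list[str]) -> dict[str, list[str]]:
    """Return the obligation areas relevant to each role held for one system."""
    return {role: OBLIGATION_AREAS[role] for role in roles}


# A company that is both deployer and (potentially) provider for the same
# system sees both obligation sets, not only the vendor's.
print(obligation_view(["deployer", "provider"]))
```

The design point is the same one the governance chain makes in prose: the obligation view is keyed by role, so a system with an uncertain or multiple-role determination cannot be given a complete obligation view until the role question is resolved and recorded.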


Once the actor role is determined, the next step is obligation mapping: translating the EU AI Act obligations that follow from actor role and risk classification into system-specific governance work. Obligation mapping is the step most organizations reach too late.

EAB Compliance Platform

Know your role. Govern your obligations.

EAB connects actor-role assessment to the Obligation Matrix — so your governance record shows not only what was classified, but which responsibilities apply to your specific position in the AI value chain.

EU-hosted · Anchored to CELEX 32024R1689

Get in Touch
Request More Information

Tell us about your organization and what you’re looking to address. We’ll follow up with the relevant information.