GDPR and AI

TOMs, Vendors, and AI Systems

Technical and organisational measures (TOMs) and vendor governance are not separate administrative files when AI systems process personal data. An AI system is rarely isolated — it may rely on a SaaS provider, call an external model API, process personal data through a cloud environment, or depend on processors and subprocessors. This means TOMs and vendor governance are part of the AI system's evidence and accountability structure.

Many organizations document TOMs too generically. Access control exists. Encryption is used. Backups are in place. Policies exist. These statements may be useful for baseline governance, but they are not sufficient when AI systems process personal data. The question is not only "does the organization have TOMs?" — the better question is "which TOMs support this AI system, this processing activity, this vendor relationship, and this governance decision?"

GDPR TOMs and AI governance overlap

Article 32 GDPR requires appropriate technical and organisational measures to ensure a level of security appropriate to risk. In AI contexts, this connects directly to AI governance because AI systems often introduce additional processing, access, inference, monitoring, vendor, and evidence questions. The overlap can include: access control; role management; pseudonymisation; encryption; logging; availability; resilience; backup and recovery; incident handling; data minimization; retention control; processor governance; subprocessor oversight; human review; training and AI literacy; vendor documentation; security of interfaces and integrations; and auditability of system use. These measures can support both GDPR accountability and AI governance evidence. A TOM profile should therefore not remain disconnected from the AI System Registry.

AI systems need system-specific TOM context

Not every AI system requires the same TOM profile. An internal drafting assistant using no personal data may require different measures than an HR AI system processing applicant data. A healthcare support system may require different controls than a marketing personalization tool. A biometric system may require more sensitive access, logging, and data governance controls than a document classification tool. A high-risk AI system may require stronger evidence around security, robustness, oversight, and technical documentation. The TOM profile should reflect the system context — a single generic security statement cannot capture these differences.

Vendor governance is part of AI accountability

AI systems often depend on vendors. The vendor may provide the model, operate the SaaS platform, process personal data, host infrastructure, or supply documentation, instructions for use, and processor terms. This creates accountability questions. Who processes personal data? Who is the processor? Who is the AI provider? Who controls model updates? Who stores logs? Who can access data? Which subprocessors are involved? Which TOMs are contractually documented? Which AI Act evidence is vendor-dependent? Vendor governance is therefore not only procurement — it is part of the AI governance record.

Processor governance and AI provider governance are different

A vendor can be relevant under multiple governance lenses. Under GDPR, the key question may be whether the vendor acts as processor, controller, joint controller, or subprocessor, and what contractual, technical, and organizational controls apply. Under the EU AI Act, the key question may be whether the vendor acts as provider, deployer, importer, distributor, or another actor in the AI value chain. These are not the same categories. A processor under GDPR is not automatically the provider under the AI Act. The same vendor may need to be documented in both GDPR Vendor Governance and AI System Governance — but with different responsibility logic and distinct assessment dimensions.

What an AI-related TOM profile should show

A TOM profile connected to AI systems should be structured enough to support review. It should show: which AI system or processing activity the measures support; which legal entity is responsible; which vendor or processor is involved; which data categories are processed; which access controls apply; which logging or monitoring controls exist; which encryption or pseudonymisation measures apply; which retention controls exist; which incident handling process applies; which subprocessor or hosting dependencies exist; which documentation supports the measures; which evidence is complete, missing, partial, or external; which owner is responsible for maintenance; and when the profile was last reviewed. This turns TOMs into evidence rather than abstract policy language.
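For teams that model this in a structured system, the profile described above can be sketched as a simple record. This is an illustrative sketch only — the class names, fields, and states below (`TomProfile`, `EvidenceState`, `needs_review`) are hypothetical and do not represent EAB's actual data model.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class EvidenceState(Enum):
    # Mirrors the article's "complete, missing, partial, or external"
    COMPLETE = "complete"
    PARTIAL = "partial"
    MISSING = "missing"
    EXTERNAL = "external"

@dataclass
class TomProfile:
    """Illustrative TOM profile linked to an AI system (not EAB's real schema)."""
    ai_system: str                      # which AI system the measures support
    legal_entity: str                   # responsible legal entity
    vendors: list[str]                  # involved vendors or processors
    data_categories: list[str]          # personal data categories processed
    access_controls: list[str]
    logging_controls: list[str]
    encryption_measures: list[str]
    retention_controls: list[str]
    incident_process: str
    evidence: dict[str, EvidenceState]  # per-measure evidence status
    owner: str                          # role responsible for maintenance
    last_reviewed: date

    def needs_review(self, today: date, max_age_days: int = 365) -> bool:
        """Flag profiles whose last review is older than the review cycle."""
        return (today - self.last_reviewed).days > max_age_days
```

Structuring the profile this way makes the review question ("which TOMs support this AI system?") answerable per record rather than per policy document.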

Vendor records must support evidence readiness

A vendor record should not only store contract information. For AI systems, the vendor record may need to support evidence readiness — including data processing agreement status, subprocessor information, TOM documentation, security documentation, hosting location, data transfer information, processor role, AI provider role, instructions for use, technical documentation, risk information, model update information, logging or auditability commitments, incident notification obligations, and support for human oversight. This evidence should be linked to the AI system where relevant. Otherwise, the organization may know the vendor exists but not whether the vendor evidence supports the AI governance decision.

External evidence must remain visible

Many organizations rely on vendors for security documentation, technical documentation, model information, processor terms, subprocessor lists, or audit reports. That is normal. But external evidence must be visible as external evidence — it should not be treated as complete internal control, hidden as not applicable, or assumed without documentation. The evidence state should distinguish: internally documented; externally covered; externally requested; missing; partial; outdated; not applicable; requires review. A vendor promise is not the same as verified evidence. A missing vendor document is not the same as non-applicability.
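The distinction between a vendor promise and verified evidence can be made explicit in tooling. The sketch below is a hypothetical illustration (the enum and the `open_items` helper are not EAB's actual API): it encodes the eight states named above and treats a pending vendor request as still open, not as resolved.

```python
from enum import Enum

class EvidenceState(Enum):
    # The eight states named in the text; the enum itself is illustrative.
    INTERNALLY_DOCUMENTED = "internally documented"
    EXTERNALLY_COVERED = "externally covered"
    EXTERNALLY_REQUESTED = "externally requested"
    MISSING = "missing"
    PARTIAL = "partial"
    OUTDATED = "outdated"
    NOT_APPLICABLE = "not applicable"
    REQUIRES_REVIEW = "requires review"

# States treated as resolved. EXTERNALLY_REQUESTED is deliberately excluded:
# a vendor promise is not the same as verified evidence.
RESOLVED = {
    EvidenceState.INTERNALLY_DOCUMENTED,
    EvidenceState.EXTERNALLY_COVERED,
    EvidenceState.NOT_APPLICABLE,
}

def open_items(evidence: dict[str, EvidenceState]) -> list[str]:
    """Return the evidence items that still block readiness."""
    return [name for name, state in evidence.items() if state not in RESOLVED]
```

Keeping "externally requested" and "missing" visible as open states prevents a vendor document that was merely asked for from being counted as coverage.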

TOMs must connect to the processing activity

Where an AI system processes personal data, TOMs should connect to the processing activity. The record of processing activities (RoPA) should show the processing purpose, data categories, data subjects, recipients, retention, and security measures. The AI system record should show how the system uses personal data and whether AI-specific governance applies. The TOM profile should show the measures supporting that processing. The vendor record should show who operates or processes data. If these objects are disconnected, the organization may struggle to show a coherent privacy governance picture when asked.

TOMs must connect to the AI system record

TOMs should also connect to AI governance. An AI system may require access control evidence because only trained users should operate it. It may require logging because outputs influence decisions. It may require incident handling because system misuse could affect individuals. It may require vendor review because the model provider processes prompts or outputs. It may require encryption because personal data is transferred to a processor. It may require human oversight because automated outputs influence case handling. These are not only GDPR controls — they become evidence for AI governance.

Vendor governance must connect to actor-role assessment

Vendor governance also supports Actor Role Assessment. If an organization uses a third-party AI system, it must understand whether it is deployer, provider, importer, distributor, or another relevant actor. The vendor relationship can affect that role. Does the organization substantially modify the system? Does it offer it to customers? Does it rebrand it? Does it place it on the EU market? Does the vendor provide instructions and documentation? Does the organization control intended purpose? The vendor record can provide evidence for these determinations.

Vendor governance must connect to approval

An AI system should not be approved blindly where vendor evidence is missing. The approving role should know whether vendor-related evidence is complete, partial, missing, external, or outdated. If a vendor processes personal data, the DPA status should be visible. If a vendor supplies an AI system, instructions for use and relevant documentation should be visible where required. If vendor evidence is missing, approval may require completion, escalation, conditional approval, or documented risk acceptance. Vendor governance is therefore part of approval readiness.
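The approval gate described above can be expressed as a simple readiness check. This is a minimal sketch under stated assumptions — the function name, the blocking states, and the return shape are hypothetical, not EAB's actual approval logic.

```python
def approval_readiness(vendor_evidence: dict[str, str]) -> tuple[bool, list[str]]:
    """
    Illustrative approval gate: approval is only unconditional when no
    vendor evidence item is partial, missing, or outdated. The returned
    issues would then drive completion, escalation, conditional approval,
    or documented risk acceptance.
    """
    blocking = {"partial", "missing", "outdated"}
    issues = [
        f"{item}: {state}"
        for item, state in vendor_evidence.items()
        if state in blocking
    ]
    return (len(issues) == 0, issues)
```

The point of the sketch is that the approving role sees the gaps before deciding, rather than discovering them after approval.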

TOMs and vendor governance support audit readiness

Auditors may ask how the organization ensured appropriate security and organizational controls for an AI system. They may ask whether a processor was reviewed. They may ask whether vendor TOMs were assessed. They may ask whether data protection measures support the AI processing activity. They may ask whether the AI system record connects to GDPR documentation. They may ask whether missing vendor evidence was known at approval. A connected TOM and vendor record helps answer these questions — showing that controls, processors, AI systems, evidence, and approval decisions were governed together.

TOMs and AI systems must be reviewed over time

AI systems change. Vendors update services. Subprocessors change. Hosting arrangements change. Model functionality changes. Data categories change. A TOM profile that was sufficient at first approval may require review later. A vendor that was low-risk at onboarding may become critical after the AI system expands. A processor relationship may change when new features are activated. This is why TOM and vendor governance should be connected to re-screening and periodic review. Approved once does not mean controlled forever.

How EAB connects TOMs, vendors, and AI systems

In EAB, TOMs, vendors, and AI systems are connected through governance records. The AI System Registry creates the system-level record. The GDPR module supports processing activities, TOM profiles, vendor governance, and DPIA or DSFA logic. Where an AI system processes personal data, the system record can connect to the relevant processing activity. TOM profiles support privacy, security, and AI governance evidence. Vendor Governance documents processors, AI providers, subprocessors, security documentation, DPA status, external evidence, and review state. Evidence Readiness shows whether vendor and TOM evidence is complete, partial, missing, unclear, external, outdated, or not applicable. Supervisor Approval can take vendor gaps and TOM evidence into account before approval. Audit-Ready Traceability preserves review, changes, approvals, evidence status, and later re-screening.


For a broader view of how AI governance and data protection governance connect across frameworks, see "GDPR and the EU AI Act: Where AI Governance and Data Protection Overlap".

EAB Compliance Platform

TOMs and vendor governance are part of your AI evidence.

EAB connects TOM profiles, vendor records, processing activities, and AI system governance into one linked record — so privacy controls and AI accountability are visible together, not scattered across separate files.

EU-hosted · EU AI Act & GDPR

Get in Touch
Request More Information

Tell us about your organization and what you’re looking to address. We’ll follow up with the relevant information.