Who is a Provider?
A provider is a natural or legal person, public authority, agency, or other body that develops an AI system or a general-purpose AI model, or has an AI system or general-purpose AI model developed, with a view to placing it on the market or putting it into service under its own name or trademark, whether for payment or free of charge.
Examples: An AI software company building a CV-screening tool. A medical device manufacturer embedding diagnostic AI in its product. A tech company releasing a foundation model API.
Who is a Deployer?
A deployer is a natural or legal person, public authority, agency, or other body using an AI system under its authority, except where the AI system is used in the course of a personal non-professional activity.
Examples: A bank using a third-party credit scoring AI. A hospital deploying a diagnostic AI system built by a medical device company. An employer using an HR platform's AI recruitment feature.
Provider Obligations for High-Risk AI (Article 16)
| Obligation | Article | What It Requires |
|---|---|---|
| Quality management system | Art. 17 | Written policies and procedures for AI Act compliance, updated throughout the system's lifecycle |
| Technical documentation | Art. 11, Annex IV | Comprehensive documentation of the system before market placement |
| Record-keeping | Art. 12 | Automatic logging of events; logs retained for specified periods |
| Transparency to deployers | Art. 13 | Instructions for use enabling compliant deployment |
| Human oversight design | Art. 14 | Technical measures enabling effective human oversight |
| Accuracy and robustness | Art. 15 | Appropriate accuracy levels; resilience against errors and adversarial inputs |
| Conformity assessment | Arts. 43–44 | Self-assessment or notified body assessment before market placement |
| EU declaration of conformity | Art. 47 | Formal declaration that the system meets AI Act requirements |
| CE marking | Art. 48 | Affix CE marking to high-risk AI systems |
| EU database registration | Art. 49 | Register the system in the EU AI database before placing on market |
| Post-market monitoring | Art. 72 | Documented post-market monitoring plan and system, maintained throughout the AI system's lifetime |
| Serious incident reporting | Art. 73 | Report serious incidents to the market surveillance authorities of the Member State where the incident occurred, within the deadlines set in Art. 73 |
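Article 12 requires automatic recording of events over the system's lifetime but leaves the concrete log format to the provider. A minimal sketch of such structured event logging might look like the following; all field names here are illustrative assumptions, not anything the Act prescribes.

```python
import json
from datetime import datetime, timezone

def log_event(event_type: str, detail: dict) -> str:
    """Build one timestamped, machine-readable log record as a JSON line.

    Illustrative only: Art. 12 mandates automatic event recording, not
    this particular schema. "event_type" and "detail" are our own names.
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,   # e.g. "inference", "human_override"
        "detail": detail,           # e.g. model version, outcome, operator
    }
    return json.dumps(record, sort_keys=True)

# Example: one inference event serialized as a single JSON line.
line = log_event("inference", {"model_version": "1.4.2", "outcome": "flagged"})
```

Append-only JSON lines are a common choice because they are easy to retain, search, and hand over to auditors, but any durable, tamper-evident store would serve the same purpose.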
Deployer Obligations for High-Risk AI (Article 26)
| Obligation | Article | What It Requires |
|---|---|---|
| Use per instructions | Art. 26(1) | Use the AI system in accordance with the provider's instructions for use |
| Human oversight | Art. 26(2) | Implement the human oversight measures specified by the provider; designate qualified persons for oversight |
| Monitor performance | Art. 26(5) | Monitor for risks arising from use in specific deployment context |
| Report incidents | Art. 26(5) | Report serious incidents or malfunctions to the provider and (where required) national authorities |
| Log retention | Art. 26(6) | Keep logs generated by the AI system that are under deployer control; minimum 6 months where no specific law applies |
| GDPR compliance | Art. 26(8) | Where personal data is processed, comply with GDPR in addition to AI Act obligations |
| Fundamental Rights Impact Assessment | Art. 27 | Required for public bodies and certain private deployers using AI in Annex III areas 5–8; register assessment in EU database |
| Notify affected workers | Art. 26(7) | Where AI systems affect employees, inform workers' representatives and the workers themselves |
When Deployer Becomes Provider
Under Article 25, a deployer (or importer or distributor) is treated as the provider of a high-risk AI system where it puts its own name or trademark on a system already on the market, makes a substantial modification to it, or modifies its intended purpose in a way that makes the system high-risk. Provider obligations then attach to that actor for the modified system.
Separately, where a provider is established outside the EU, its EU-based importer or authorised representative takes on specified provider-facing obligations.
Obligations for Other Actors
Importers (Article 23)
EU-based importers of high-risk AI systems from outside the EU must verify that the provider has conducted a conformity assessment, that technical documentation exists, and that the system bears CE marking. They must not place non-compliant systems on the EU market.
Distributors (Article 24)
Distributors must verify that the CE marking is affixed and that the system is accompanied by the required documentation, including the EU declaration of conformity and instructions for use, before making it available on the market. They must inform providers and importers of any identified non-compliance.
Authorised Representatives (Article 22)
Non-EU providers of high-risk AI must designate an EU-based authorised representative before placing their systems on the EU market. The representative acts as the contact point for national authorities.