Viktoria Compliance

This document is a template provided as a starting point for your compliance documentation. It does not constitute legal advice and should be reviewed by a qualified legal professional before use. Viktoria Compliance accepts no liability for the use of this template.

AI Act Conformity Assessment Procedure — Template


Version 1.0.0 — Last updated 2026-04-25

1. Purpose and Scope

This procedure governs the conformity assessment of [aiSystemName] (version [aiSystemVersion]), classified as high-risk under Article 6 of Regulation (EU) 2024/1689 (the 'AI Act'). The conformity assessment must be completed before [companyName] places the system on the market or puts it into service in the European Union, and is repeated upon each substantial modification. The procedure is owned by the AI Officer [aiOfficer] in coordination with the CISO [cisoName]. It enters into force on 2026-04-26 and is reviewed on or before [reviewDate]. The supervisory authority [supervisoryAuthority] is the addressee of any notification, registration or post-market reporting required under this procedure.

2. Assessment Route

Article 43 distinguishes the following assessment routes: (a) for high-risk systems listed in Annex III, points 2 to 8, [companyName] itself conducts the internal-control conformity assessment of Annex VI; (b) for biometric systems under Annex III, point 1, the internal-control route of Annex VI is available only where [companyName] has applied harmonised standards or, where applicable, common specifications in full; where such standards or specifications do not exist or have not been applied (in full), assessment by a notified body per Annex VII is required; (c) for high-risk systems that are safety components of products covered by the Union harmonisation legislation listed in Annex I, the conformity assessment procedure of that legislation applies. Where a third-party route applies, [notifiedBody] is the notified body engaged. The Risk Classification Decision file determines the applicable route.

3. Requirements Checklist (Articles 8-15)

Pursuant to Article 8, the system must demonstrate compliance with the seven mandatory requirements for high-risk AI systems: Article 9 (risk management system: an iterative process maintained throughout the life cycle); Article 10 (data and data governance: relevance, representativeness, freedom from errors, completeness, appropriate statistical properties, and bias detection and mitigation); Article 11 in conjunction with Annex IV (technical documentation); Article 12 (record-keeping: automatic event logging over the lifetime of the system); Article 13 (transparency and provision of information to deployers); Article 14 (human oversight measures); Article 15 (accuracy, robustness and cybersecurity). For each requirement, the procedure identifies the applicable harmonised standards or common specifications and documents how the system meets them.

4. Evidence Collection

Evidence supporting conformity is collected from across [companyName]'s engineering, security and operations teams: model cards and datasheets; data-set provenance and bias-audit reports; functional and adversarial testing results; security testing, including penetration tests and adversarial-input testing per Article 15(5); accuracy benchmarks against representative datasets; logging architecture and retention policy; oversight design documentation; the transparency notice and instructions for use; the risk register and risk-treatment records; and incident history. Evidence is mapped to the requirements of Articles 8-15 of the AI Act and is retained for ten years after the system is placed on the market or put into service.

5. EU Declaration of Conformity (Article 47)

On successful completion of the conformity assessment, [companyName] signs a written EU declaration of conformity for [aiSystemName] in accordance with Article 47 and Annex V. The declaration states the system identifier, the name and address of the provider, the legal basis of compliance (the relevant harmonised standards or common specifications applied), the name of the notified body where applicable, the EU declaration of conformity reference number, the place and date of issue, and the name and signature of the person empowered to sign on behalf of [companyName]. The declaration, together with the technical documentation, is kept at the disposal of the competent national authorities, including the supervisory authority [supervisoryAuthority], for ten (10) years from the date the AI system is placed on the market or put into service (Article 18 of the AI Act), and is referenced from the AI System Inventory.

6. CE Marking (Article 48)

The CE marking is affixed visibly, legibly and indelibly to the high-risk AI system or, where that is not possible, to its packaging or accompanying documents. The CE marking is followed by the identification number of the notified body where third-party conformity assessment was conducted (per Annex VII). The CE marking signifies that [companyName] takes responsibility for the conformity of the system with all applicable requirements of the AI Act. For high-risk AI systems that are also subject to other Union harmonisation legislation requiring CE marking, the CE marking indicates conformity with the requirements of those acts as well.

7. Post-Market Monitoring (Articles 72-73)

[companyName] establishes a post-market monitoring system proportionate to the nature of the AI technology and the risks of the high-risk AI system, in accordance with Article 72. The system actively and systematically collects, documents and analyses relevant data on the performance of the AI system throughout its lifetime, allowing [companyName] to evaluate the continuous compliance of the system with the requirements of the AI Act. Serious incidents (Article 73), defined in Article 3(49) to include the death of a person or serious harm to a person's health, serious and irreversible disruption of the management or operation of critical infrastructure, infringement of obligations under Union law intended to protect fundamental rights, and serious harm to property or the environment, are reported to the supervisory authority [supervisoryAuthority] no later than fifteen (15) days after [companyName] becomes aware of them, with shorter deadlines for the most serious cases, such as death or disruption of critical infrastructure.

8. Reassessment Triggers

A new conformity assessment is performed when: (a) [companyName] makes a substantial modification to the system within the meaning of Article 3(23) and Article 43(4), i.e. a modification not foreseen in the initial conformity assessment that affects compliance with the requirements of Articles 8-15 or the intended purpose; (b) post-market monitoring identifies a non-conformity that cannot be corrected by ordinary maintenance; (c) a notified body detects a non-conformity during surveillance under Annex VII; (d) the supervisory authority requests reassessment under Article 79. Reassessment results in an updated EU declaration of conformity and, where applicable, an updated entry in the EU database for high-risk AI systems (Article 71).