Preparing for Your First AI Act Audit
At some point, your organisation’s AI Act compliance will be tested. It might be a market surveillance authority conducting a routine inspection. It might be an investigation triggered by a complaint. It might be a notified body conducting a conformity assessment. Whatever the trigger, you need to be ready to demonstrate compliance, not just assert it.
Preparing for an audit isn’t about assembling documents the week before an inspector arrives. It’s about building a compliance file that accurately reflects your practices and can withstand scrutiny. Here’s how to prepare.
What triggers an audit
Understanding what might bring an auditor to your door helps you prioritise your preparation:
Complaints. An individual affected by your AI system (a rejected job applicant, a denied loan applicant, a customer who wasn’t told they were talking to AI) files a complaint with the national competent authority. This is likely to be the most common trigger, mirroring GDPR experience.
Market surveillance. National authorities have the power to conduct proactive market surveillance of AI systems. This includes testing AI systems, requesting documentation, and inspecting compliance.
Serious incidents. If you’ve reported a serious incident under Article 73, expect follow-up. The authority will want to understand what happened, why, and what you’ve done about it.
Sector-specific audits. In regulated sectors (financial services, healthcare, employment), existing sectoral regulators may incorporate AI Act compliance into their inspection programmes.
Conformity assessment. If your high-risk AI system requires third-party conformity assessment, the notified body will audit your compliance as part of the certification process.
Your compliance file
The compliance file is the collection of documentation that demonstrates your compliance with the Act. For high-risk AI systems, this should include:
Technical documentation (Annex IV)
This is the most extensive documentation requirement. Annex IV specifies that technical documentation must include:
- General description of the AI system (intended purpose, developer, version, how it interacts with hardware or software)
- Detailed description of the system’s elements and development process (methods and design choices, system architecture, computational resources, training data, testing methodology)
- Monitoring, functioning, and control of the system (capabilities and limitations, accuracy levels, foreseeable unintended outcomes, human oversight measures, input specifications, output interpretation)
- Risk management system documentation
- Description of changes through the system’s lifecycle
- Harmonised standards or common specifications applied
- EU Declaration of Conformity
- Post-market monitoring plan
Produce the technical documentation before or during development, not retroactively. Documentation written after the fact tends to contain gaps and inconsistencies, and it lacks the reasoning that contemporaneous documentation captures naturally.
Risk management system (Article 9)
Your risk management documentation should include:
- The risk identification and analysis methodology
- Identified risks, their evaluation, and the reasoning
- Risk management measures and their justification
- Residual risk assessment
- Testing conducted to validate risk measures
- Update history showing the iterative nature of the process
Data governance documentation (Article 10)
If you’re a provider, document your data governance practices:
- Training, validation, and testing datasets: origin, scope, characteristics
- Data preparation and processing steps
- Assumptions about information the data represents
- Bias assessment and mitigation measures
- Data quality metrics
Quality management system (Article 17)
Your quality management system documentation should cover:
- Compliance strategy and procedures
- Design, development, and testing techniques
- Examination, test, and validation procedures
- Technical standards applied
- Systems and procedures for data management
- Risk management process
- Post-market monitoring system
- Incident and malfunction reporting procedures
- Communication with authorities, notified bodies, and deployers
- Record-keeping procedures
Instructions for use (Article 13, Annex IV section 9)
Your instructions for use — the documentation you provide to deployers — must meet the requirements discussed in detail in our article on Annex IV compliance.
Logs and records
Maintain the logs generated by your AI system, along with evidence that deployers have been informed of their log retention obligations.
Incident records
Keep records of all incidents, including those assessed as not meeting the serious incident threshold. This demonstrates that you have a functioning incident assessment process.
Structuring for auditability
Create an index
An auditor who receives a USB drive full of PDFs in no particular order will not form a positive impression. Create a clear index that maps each compliance requirement to the specific document that addresses it.
| Requirement | Article | Document | Location | Last Updated |
|---|---|---|---|---|
| Risk management system | Art. 9 | Risk_Management_v3.pdf | /compliance/risk/ | 2026-03-01 |
| Technical documentation | Annex IV | TechDoc_SystemX_v2.pdf | /compliance/techdoc/ | 2026-02-15 |
| Data governance | Art. 10 | DataGov_Policy.pdf | /compliance/data/ | 2026-01-20 |
| Instructions for use | Art. 13 | IFU_SystemX_v2.pdf | /compliance/ifu/ | 2026-02-15 |
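An index like this can also be kept machine-checkable, so missing or stale documents surface before an auditor asks. A minimal sketch, assuming the index is exported as a CSV with the columns shown above; the file names, paths, and 180-day threshold are illustrative, not prescribed by the Act:

```python
import csv
from datetime import date, datetime
from pathlib import Path

STALE_AFTER_DAYS = 180  # illustrative review threshold, not a legal requirement

def check_index(index_csv: str, root: str = ".") -> list[str]:
    """Flag index rows whose document is missing on disk or overdue for review."""
    findings = []
    with open(index_csv, newline="") as f:
        for row in csv.DictReader(f):
            doc = Path(root) / row["Location"].lstrip("/") / row["Document"]
            if not doc.exists():
                findings.append(f"MISSING: {row['Requirement']} -> {doc}")
            updated = datetime.strptime(row["Last Updated"], "%Y-%m-%d").date()
            if (date.today() - updated).days > STALE_AFTER_DAYS:
                findings.append(f"STALE: {row['Requirement']} last updated {updated}")
    return findings
```

Run quarterly (or in CI against the compliance repository) and an empty findings list becomes your evidence that the index is current.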
Version control everything
Every document should be version-controlled with dates, change descriptions, and the responsible person. Auditors will check that documents have been updated as required — a risk management system that hasn’t been updated since initial development is a red flag.
Maintain traceability
An auditor should be able to trace from any compliance claim to the evidence supporting it:
- Risk identification → Risk assessment → Mitigation measure → Validation test → Residual risk evaluation
- Performance claim → Test methodology → Test results → Conditions and limitations
- Incident report → Investigation → Root cause → Corrective action → Verification
If an auditor can’t follow these chains, they’ll flag gaps.
Keep evidence of process, not just outcomes
Documents that show the final risk assessment are useful. Documents that show how you got there — meeting minutes, decision logs, version comparisons, review comments — are more useful. They demonstrate that your compliance is the result of a genuine process, not an after-the-fact creation.
Common gaps that trigger findings
Based on product safety audit experience and early AI Act enforcement signals, these are the areas most likely to cause problems:
Incomplete AI system inventory
The auditor asks what AI systems you have. You provide a list. They then discover AI systems that aren’t on the list — an AI feature in your CRM, a chatbot on a subsidiary’s website, an AI tool a department adopted without approval. An incomplete inventory undermines everything else.
Risk management that isn’t iterative
The auditor reviews your risk management system and finds a single document dated two years ago with no updates. Article 9 requires continuous, iterative risk management. A static document fails the requirement regardless of its content quality.
Performance claims without evidence
Your documentation states that the system achieves 95% accuracy. The auditor asks for the test report. You don’t have one, or the test was conducted on a different dataset, in different conditions, or against different metrics than what’s documented. Unsubstantiated performance claims are a common and avoidable finding.
Human oversight that exists on paper only
You’ve designated someone as responsible for human oversight. The auditor asks what tools they use, how often they review outputs, what their override rate is, and what training they received. If the answers are vague or the designated person can’t explain their oversight activities in concrete terms, the oversight is nominal.
Missing transparency disclosures
The auditor interacts with your chatbot. No AI disclosure appears. They check your marketing materials for AI-generated content without attribution. They look for synthetic content labelling. Transparency violations are easy to detect and easy to prevent.
No incident reporting process
The auditor asks about your serious incident reporting process. You don’t have one, or you have a policy document that nobody has operationalised. They ask if any incidents have occurred. You’re not sure, because nobody has been monitoring for them.
Documentation that doesn’t match reality
The most damaging finding is documentation that describes practices the organisation doesn’t actually follow. If your risk management document says you review the system monthly but there’s no evidence of monthly reviews, that’s worse than not having the document at all — it demonstrates both non-compliance and misrepresentation.
Preparing practically
Conduct a mock audit
Walk through your compliance file as if you were the auditor. For each requirement, ask:
- Is there a document that addresses this?
- Does the document accurately reflect what we actually do?
- Is the document current and version-controlled?
- Can I trace from the requirement to specific evidence?
- Would this withstand questioning?
Identify gaps and address them before a real audit.
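The mock-audit questions above can be run as a simple checklist over each requirement so that gaps are recorded explicitly rather than noted in passing. A minimal sketch; the data shape and the example requirement labels are illustrative:

```python
# The five mock-audit questions from the list above.
QUESTIONS = [
    "Is there a document that addresses this?",
    "Does the document accurately reflect what we actually do?",
    "Is the document current and version-controlled?",
    "Can I trace from the requirement to specific evidence?",
    "Would this withstand questioning?",
]

def mock_audit(answers: dict[str, list[bool]]) -> dict[str, list[str]]:
    """Map each requirement to the mock-audit questions it failed."""
    gaps = {}
    for requirement, results in answers.items():
        failed = [q for q, ok in zip(QUESTIONS, results) if not ok]
        if failed:
            gaps[requirement] = failed
    return gaps
```

The output doubles as the agenda for remediation: each key is a requirement, each value the specific questions it could not answer.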
Designate a compliance contact
Identify the person who will interact with auditors and ensure they understand the full compliance picture. This person should be able to navigate the compliance file, explain the organisation’s approach, and direct auditors to specific evidence. They should also know when to bring in technical experts for detailed questions.
Keep records accessible
Your compliance documentation needs to be retrievable quickly. If an authority requests your risk management system, instructions for use, or technical documentation, you should be able to provide them within days, not weeks. Store documentation centrally with clear organisation and access controls.
Review regularly
Don’t wait for an audit trigger to review your compliance file. Quarterly reviews catch gaps, outdated documents, and inconsistencies before an auditor does. Annual comprehensive reviews ensure the full picture remains coherent.
The organisations that will perform best in AI Act audits are those that treat compliance documentation as a living operational artefact rather than a regulatory response to be assembled under pressure. Build the file as you build the system, maintain it as you maintain the system, and it will be ready whenever it’s needed.