Why I Spent Three Months Reading the EU AI Act
I’ve been building websites and applications since 2005 — for government agencies, large non-profits, SMEs, and mum-and-dad businesses. I’ve got clients across Australia, the UK, the US, and Europe.
Part of the work over those 20 years involved helping organisations navigate GDPR, WCAG accessibility, privacy legislation and the various other regulatory frameworks that arrive periodically and cause a collective headache across the industry.
I thought I understood the pattern. Regulation arrives, creates anxiety, turns out to be manageable with the right guidance, move on. When the EU AI Act came into force in August 2024, my assumption was the same. Then I started reading it.
What shocked me
The compliance burden was the first thing. Not the penalties, though €35 million or 7% of global annual turnover gets your attention, but the volume of work the regulation requires. Technical documentation. A continuous risk management system. Human oversight built into the system design. Staff training. A conformity assessment before you can place the product on the EU market. Post-market monitoring that doesn’t stop after launch. All requiring documentation.
For high-risk systems, the EU AI Act is asking for a genuine engineering and governance programme, documented in detail.
The second thing was the provider question.
Most founders don’t know they’re the provider
Every conversation with developers and founders started the same way: “That’s an EU problem” or “we’re just using OpenAI’s API; it’s their responsibility.”
That’s not how the Act works.
Article 3(3) defines a provider as anyone who develops an AI system and places it on the market under their own name or trademark. The key word is system. Recital 97 is explicit that a raw AI model “does not constitute an AI system on its own” and requires “the addition of further components, such as for example a user interface, to become an AI system.”
If you call the OpenAI or Claude API, wrap it in your product, and ship it to users — you have created an AI system. You are its provider. Not OpenAI. You.
The extraterritorial scope makes this hit harder than people expect. The Act applies to providers who place AI systems on the EU market regardless of where those providers are incorporated. If you have EU users and your product uses AI, the Act applies to you.
Most people’s first reaction is: “Is the EU really going to enforce this against a small startup in Austin or Melbourne?” The same question was asked about GDPR. EU authorities have since issued €4.68 billion in fines to US companies alone — 83% of all GDPR fines issued since 2018. The enforcement model targets non-EU companies. The AI Act uses the same model.
Three months inside 144 pages
I decided to read the full regulation — not a summary, not a consultant’s deck, the text itself. I worked through the articles, the recitals, the annexes. I mapped provider obligations against deployer obligations, and separated what applies to all AI systems from what’s specific to high-risk.
It took months. The regulation is 144 pages of dense legislative text where single provisions only make sense when read alongside three others. I took notes throughout and prepared a checklist of what a provider actually needs to do at each phase, in plain English, tied to the specific article that requires it.
My original plan was to build an online SaaS service. But the volume of documentation, internal systems, and planning this Act requires from an organisation can’t be delivered through a SaaS app.
It became obvious that my notes were the product.
What I built
The notes became ComplyDrive — a 47-point compliance checklist across five phases, with every item tied to a specific article in the regulation. Alongside the checklist, nine sample compliance documents written for a fictional organisation: a Declaration of Conformity, a Fundamental Rights Impact Assessment (FRIA), Risk Management File, Technical Documentation, Data Governance Policy, Instructions for Use, Post-Market Monitoring Plan, Serious Incident Report, and AI System Register.
The checklist is designed for product managers, CTOs, and founders — not lawyers. People who need to understand what the regulation actually requires without wading through 144 pages themselves.
August 2, 2026 is when the full enforcement regime comes into force. The documentation for a high-risk system takes months to prepare. If you’re a provider of a high-risk AI system and you haven’t started, you’re already behind.
I built ComplyDrive because something practical needed to exist. Compliance consultants charge thousands for this work. A practical, accurate, affordable checklist with a one-off pricing model seemed like the right thing to make.
John Pitchers is the founder of ComplyDrive and has been building web applications for government and enterprise clients since 2005. He also runs Viperfish Media and Joomstore.