5 regulators · Sector-based enforcement

The UK regulates AI through existing regulators — and each one has enforcement power.

The UK takes a sector-based approach to AI regulation: the FCA governs AI in financial services, the ICO governs AI and data protection, the MHRA governs AI in medical devices, Ofcom governs AI in communications, and the CMA addresses AI and competition. There is no single comprehensive AI law — but each regulator has real enforcement authority. Companies operating in the UK need to navigate multiple regulatory expectations simultaneously. AI Asset Assurance provides the independent evaluation that satisfies multiple UK regulators in a single engagement.

Request evaluation
REGULATION STATUS
Approach: Sector-based regulation
Status: Active enforcement
Key regulators: FCA, ICO, MHRA, Ofcom, CMA
UK GDPR: Automated decisions — live
AI Asset Assurance coverage: Multi-regulator evaluation
UK regulatory landscape

Five regulators, one evaluation

Each UK regulator brings its own AI expectations. AI Asset Assurance evaluation addresses the common requirements across all five.

FCA — Financial Conduct Authority

Financial services AI

The FCA requires firms to demonstrate that AI-driven decisions in lending, insurance, and investment are fair, explainable, and non-discriminatory. Model risk management expectations align with global standards such as the US Federal Reserve's SR 11-7.

AI Asset Assurance: Independent model validation with Shapley attribution satisfies FCA expectations for explainability and fairness. Structurally independent evaluation provides the "effective challenge" the FCA expects in model risk management.
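Shapley attribution assigns each input feature its average marginal contribution to a single decision, averaged over every way the other features could already be "in play". The sketch below computes exact Shapley values for a toy linear scoring function; the feature names, weights, and baseline are illustrative inventions, not drawn from any FCA-reviewed model or from AI Asset Assurance's actual tooling:

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley attribution: for each feature, average its marginal
    contribution to f over all coalitions of the remaining features."""
    n = len(x)
    idx = list(range(n))
    phi = [0.0] * n

    def v(coalition):
        # Value of a coalition: score with coalition features at their
        # actual values and every other feature at the baseline.
        z = [x[j] if j in coalition else baseline[j] for j in idx]
        return f(z)

    for i in idx:
        others = [j for j in idx if j != i]
        for r in range(n):
            weight = factorial(r) * factorial(n - r - 1) / factorial(n)
            for subset in combinations(others, r):
                s = set(subset)
                phi[i] += weight * (v(s | {i}) - v(s))
    return phi

# Hypothetical credit-scoring function (illustrative weights only)
def score(z):
    income, debt_ratio, history = z
    return 0.4 * income - 0.5 * debt_ratio + 0.3 * history

applicant = [0.9, 0.7, 0.2]
population_avg = [0.5, 0.5, 0.5]    # baseline: an "average" applicant
phi = shapley_values(score, applicant, population_avg)

# Efficiency property: the attributions sum exactly to the gap between
# this decision and the baseline decision.
assert abs(sum(phi) - (score(applicant) - score(population_avg))) < 1e-9
```

For a linear model each attribution reduces to weight × (value − baseline), which makes the output easy to verify; for non-linear models the same subset-enumeration logic applies but grows exponentially, which is why production tooling approximates it by sampling.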

ICO — Information Commissioner's Office

UK GDPR + Data Protection Act 2018

The ICO enforces UK GDPR provisions on automated decision-making (Article 22, as retained in UK law), data protection impact assessments, and the right to explanation for AI-driven decisions affecting individuals.

AI Asset Assurance: Per-decision explanation capability directly supports UK GDPR automated decision-making obligations. Data protection impact assessment documentation integrates with AI Asset Assurance evaluation reports.

MHRA — Medicines and Healthcare products Regulatory Agency

AI as a medical device

AI-based clinical decision support and diagnostic tools may be classified as medical devices under the MHRA framework, requiring conformity assessment, post-market surveillance, and clinical evidence of safety and performance.

AI Asset Assurance: Demographic performance evaluation assesses whether AI medical devices perform equitably across patient populations — a requirement that MHRA conformity assessment increasingly expects.
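At its core, demographic performance evaluation means computing the same accuracy metric separately for each patient group and comparing the results. A minimal sketch, using invented labelled evaluation data and true-positive rate as the metric (this is one illustrative metric choice, not MHRA-mandated methodology):

```python
from collections import defaultdict

def tpr_by_group(records):
    """True-positive rate per demographic group: among cases where the
    correct outcome is positive, how often does the model flag them?"""
    positives = defaultdict(int)   # positive cases seen per group
    hits = defaultdict(int)        # positive cases correctly flagged
    for group, y_true, y_pred in records:
        if y_true == 1:
            positives[group] += 1
            hits[group] += y_pred == 1
    return {g: hits[g] / positives[g] for g in positives}

# Illustrative labelled data: (group, true outcome, model prediction)
records = [
    ("A", 1, 1), ("A", 1, 1), ("A", 1, 0), ("A", 0, 0),
    ("B", 1, 1), ("B", 1, 0), ("B", 1, 0), ("B", 0, 1),
]
rates = tpr_by_group(records)
worst, best = min(rates.values()), max(rates.values())
disparity = worst / best   # ratio of worst-served to best-served group
```

A disparity ratio well below 1.0 flags a gap worth investigating; a threshold of 0.8 is a common screening heuristic (the "four-fifths rule" from US employment-selection guidance), though UK regulators do not mandate a specific cutoff.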

CMA + Ofcom — Competition and communications

AI and market fairness

The CMA examines AI's impact on competition and consumer fairness. Ofcom addresses AI in content moderation and communications. Both regulators have signaled increased scrutiny of algorithmic decision-making.

AI Asset Assurance: Algorithmic fairness evaluation addresses both competition concerns (algorithmic pricing, market manipulation) and content fairness concerns (differential treatment across user demographics).

Five UK regulators. One independent evaluation.

The UK regulates AI through existing regulators — each with enforcement power. AI Asset Assurance evaluation satisfies the common requirements across all five.

Request evaluation