EU AI Act

The EU AI Act Is Enforced. Here Is What That Actually Means for Your Stack.

AuditPulse Intelligence • March 2026 • 6 min read

The Geography Mistake

The most common misunderstanding: US founders believe the EU AI Act does not apply to them. It does. The regulation applies based on where your users are located, not where your company is incorporated. If you have any EU customers, you are in scope.

A Series B company headquartered in San Francisco with 200 EU customers is fully subject to EU AI Act requirements for any AI system that affects those customers.

What High-Risk Actually Means

The EU AI Act categorises AI systems by risk level. High-risk systems face the most stringent requirements.

High-risk AI includes systems used in:

  • Credit scoring and lending decisions
  • Recruitment and hiring processes
  • Medical diagnosis and treatment
  • Insurance risk assessment
  • Educational assessment
  • Critical infrastructure management

If your AI makes or materially influences decisions in any of these categories, you are operating a high-risk AI system. Non-compliance with high-risk obligations carries fines of up to EUR 15 million or 3% of global annual turnover; the most serious violations, such as deploying prohibited AI practices, reach EUR 35 million or 7%.

The Three Gaps We See Most Often

After running diagnostics on hundreds of AI stacks, three compliance gaps appear consistently.

1. No explainability documentation

Article 13 requires high-risk AI systems to provide meaningful information about how decisions are made. In most companies, that knowledge lives in engineers' heads; none of it is documented in a form that satisfies Article 13.
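What "documented in a form that satisfies Article 13" looks like varies by system, but the underlying habit is mechanical: capture a structured record alongside every automated decision, not just the score. The sketch below is illustrative only, not a compliance template; all field names (`model_id`, `top_factors`, etc.) are our assumptions, not terms from the regulation.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class DecisionRecord:
    """Ties one model decision to the information a reviewer would
    need to explain it. Fields are illustrative, not prescribed."""
    model_id: str
    model_version: str
    inputs: dict        # the features the model actually saw
    output: float       # raw score or final decision
    top_factors: list   # human-readable factors behind the score
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def record_decision(model_id, version, inputs, output, top_factors):
    rec = DecisionRecord(model_id, version, inputs, output, top_factors)
    # In practice, write this to an append-only store, not just return it.
    return json.dumps(asdict(rec))

doc = record_decision(
    "credit-scorer", "2.3.1",
    {"income": 52000, "debt_ratio": 0.31},
    0.74,
    ["low debt ratio", "stable income history"],
)
```

The point is not the schema; it is that the record is produced at decision time, automatically, so the explanation exists even when no one asks for it until eighteen months later.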

2. Human oversight exists on paper but not in practice

Article 14 mandates meaningful human oversight capability. Most companies have a policy that says human review is possible. What they do not have is a documented workflow proving the review actually happened.
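The gap closes when review events leave evidence. As a minimal sketch (the class, action names, and fields are our assumptions, not anything the Act specifies), an append-only log that records who reviewed which decision, what they did, and why:

```python
from datetime import datetime, timezone

class ReviewLog:
    """Append-only log of human review events, so oversight is
    evidenced by records rather than by policy text. Illustrative."""

    ACTIONS = {"approved", "overridden", "escalated"}

    def __init__(self):
        self._events = []

    def record_review(self, decision_id, reviewer, action, rationale):
        if action not in self.ACTIONS:
            raise ValueError(f"unknown review action: {action}")
        self._events.append({
            "decision_id": decision_id,
            "reviewer": reviewer,
            "action": action,
            "rationale": rationale,
            "reviewed_at": datetime.now(timezone.utc).isoformat(),
        })

    def reviews_for(self, decision_id):
        """Everything a human did with a given automated decision."""
        return [e for e in self._events if e["decision_id"] == decision_id]

log = ReviewLog()
log.record_review("dec-1042", "a.martin", "overridden",
                  "score penalised a thin credit file; manual check passed")
```

A query like `reviews_for("dec-1042")` is the difference between "review is possible" and "review happened, here is the record."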

3. Bias evaluation is a one-time event

Article 9(7) requires regular evaluation of AI system performance. The typical pattern: one fairness test at launch, never repeated. Models drift. This creates silent regulatory exposure.
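Making the evaluation recurring rather than one-time can be as simple as re-computing the same fairness metric on fresh production data and comparing it against the launch baseline. The sketch below uses demographic parity gap as one example metric; the numbers and the `GAP_THRESHOLD` tolerance are invented for illustration, not legal figures.

```python
def demographic_parity_gap(outcomes):
    """Absolute gap between the highest and lowest positive-outcome
    rates across groups. `outcomes` maps group -> list of 0/1 decisions."""
    rates = [sum(v) / len(v) for v in outcomes.values()]
    return max(rates) - min(rates)

# Launch baseline vs. a later production sample (toy data).
baseline = {"group_a": [1, 1, 0, 1], "group_b": [1, 0, 1, 1]}
current  = {"group_a": [1, 1, 1, 1], "group_b": [0, 0, 1, 0]}

GAP_THRESHOLD = 0.10  # assumed internal tolerance, not a regulatory number

baseline_gap = demographic_parity_gap(baseline)  # groups matched at launch
current_gap = demographic_parity_gap(current)    # groups have diverged
drifted = current_gap > max(baseline_gap, GAP_THRESHOLD)
```

Run on a schedule, a check like this turns drift from silent exposure into a dated record showing the evaluation happened and what it found, whichever way the result goes.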

A Note on Timing

Requirements for high-risk AI systems are in effect now. Companies still in wait-and-see mode are no longer waiting for regulation to arrive. They are waiting to be asked about compliance they do not have.

The founders who navigate this most effectively are those who move from uncertainty to documented posture fastest when it matters. That window is now.

Regulatory Exposure Is Hidden In Your Stack.

Identify critical compliance gaps in your AI architecture before enterprise procurement does.

Run Your Free Diagnostic