
HIPAA and AI: What Healthcare Teams Are Getting Wrong in 2026.

AuditPulse Intelligence • March 2026 • 6 min read

The HIPAA Misconception

Most healthcare AI teams believe that if their data infrastructure is HIPAA compliant, their AI systems are too. This is one of the most dangerous assumptions in healthcare technology.

HIPAA was written before modern AI systems existed. The regulation addresses protected health information - how it is stored, transmitted, and accessed. It does not address what happens when that information is used to train or run an AI model.

This gap creates significant and largely unrecognised exposure for healthcare AI teams.

What HIPAA Actually Covers in an AI Context

The HIPAA Privacy Rule and Security Rule apply to protected health information regardless of how it is used. If your AI model was trained on patient data, or if it processes patient data at inference time, HIPAA applies to that data.

The specific obligations that most healthcare AI teams miss:

Minimum necessary standard. The Privacy Rule requires that only the minimum necessary information is used for a given purpose. Training a model on a full patient record when only demographic and diagnostic data is needed may violate this standard.

Business Associate Agreements. If you share patient data with a third-party AI provider - including foundation model providers via API - that provider is a Business Associate under HIPAA and requires a signed BAA. Most teams have not obtained BAAs from their AI infrastructure vendors.

Audit controls. The Security Rule requires technical security measures that record and examine activity in information systems containing PHI. For AI systems this means logging model inputs and outputs when they contain or are derived from patient data.

De-identification standards. HIPAA provides two methods for de-identifying patient data - the Expert Determination method and the Safe Harbor method. Neither was designed with AI training in mind. Data that meets HIPAA de-identification standards may still allow re-identification when combined with other data sources or used in certain model architectures.
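To make the Safe Harbor method concrete, the sketch below strips direct identifiers from a record before it enters a training pipeline. The field names and the subset of identifier categories shown are hypothetical illustrations, not a complete Safe Harbor implementation - the method requires removing all 18 enumerated identifier categories and having no actual knowledge that the remaining data could re-identify the patient.

```python
# Illustrative sketch: stripping HIPAA Safe Harbor identifiers from a record
# before it enters an AI training pipeline. Field names are hypothetical,
# and this covers only a subset of the 18 identifier categories.
SAFE_HARBOR_FIELDS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "health_plan_id", "device_id",
    "ip_address", "photo_url",
}

def strip_identifiers(record: dict) -> dict:
    """Remove direct identifiers and generalize dates to year only."""
    cleaned = {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}
    # Safe Harbor permits only the year of dates directly related to a patient.
    if "date_of_birth" in cleaned:
        cleaned["birth_year"] = cleaned.pop("date_of_birth")[:4]
    return cleaned

record = {
    "name": "Jane Doe",
    "ssn": "000-00-0000",
    "date_of_birth": "1980-04-12",
    "diagnosis_code": "E11.9",
}
print(strip_identifiers(record))  # {'diagnosis_code': 'E11.9', 'birth_year': '1980'}
```

Note that field-level stripping like this does nothing about the re-identification risk described above: quasi-identifiers that survive Safe Harbor can still be combined with external data sources.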

Where the EU AI Act Intersects

Healthcare AI teams with any EU patients face a compounding compliance challenge. The EU AI Act classifies AI systems used in medical diagnosis and treatment as high-risk under Annex III.

High-risk classification triggers the full suite of EU AI Act requirements including conformity assessment, technical documentation, human oversight, and registration in the EU database of high-risk AI systems.

A healthcare AI company serving both US and EU patients faces HIPAA obligations, EU AI Act obligations, and GDPR obligations simultaneously. The overlap is significant and the gaps between frameworks create specific exposure points.

The Three Most Common HIPAA AI Gaps We See

Missing Business Associate Agreements with AI vendors. If you are sending any patient data to an AI API without a BAA, you have a material HIPAA violation regardless of how secure your own systems are.


Inadequate audit logging for AI inference. Most teams log application events but not model-level inputs and outputs. When those inputs contain or are derived from PHI the absence of logging is a Security Rule violation.

Training data consent gaps. Patient consent for data use in treatment does not extend to AI model training. Using patient records to train a model without specific consent or a valid HIPAA exception creates both regulatory and reputational risk.
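The audit-logging gap above can be sketched as a thin wrapper around inference calls. Everything here is illustrative: `call_model` is a hypothetical stand-in for whatever API or local model you actually use, and a production implementation would write to tamper-evident, access-controlled storage rather than a local logger.

```python
# Illustrative sketch of model-level audit logging for PHI-touching inference.
# `call_model` is a hypothetical stand-in for a real inference call.
import hashlib
import json
import logging
from datetime import datetime, timezone

audit_log = logging.getLogger("phi_inference_audit")
logging.basicConfig(level=logging.INFO)

def call_model(prompt: str) -> str:
    return "example model output"  # placeholder for a real inference call

def audited_inference(prompt: str, user_id: str, purpose: str) -> str:
    """Run inference and record who sent what to the model, and when."""
    output = call_model(prompt)
    audit_log.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "purpose": purpose,  # supports minimum-necessary review later
        # Hash rather than store raw PHI in the log record itself.
        "input_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "output_sha256": hashlib.sha256(output.encode()).hexdigest(),
    }))
    return output
```

Logging hashes rather than raw inputs keeps the audit trail itself from becoming an unsecured copy of PHI, while still letting you prove which inputs produced which outputs.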

The Practical Priority List

If you are building AI on healthcare data, the immediate priorities are:

Audit your vendor relationships for missing BAAs. This is the highest-risk gap and the easiest to remediate - most major AI providers have BAA processes.

Document your de-identification methodology explicitly. Do not assume that removing names and dates of birth is sufficient.

Implement model-level audit logging for any inference that touches PHI.

Conduct a formal risk assessment that specifically addresses AI systems - not just your general HIPAA risk assessment.
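The first priority, auditing vendors for missing BAAs, can start as a simple cross-check of the AI vendors that receive PHI against your register of signed agreements. The vendor names and data structures below are hypothetical; a real inventory would come from procurement and data-flow records.

```python
# Illustrative sketch: cross-check AI vendors that receive PHI against
# signed BAAs. Vendor names are hypothetical examples.
vendors_receiving_phi = ["ModelProviderA", "TranscriptionAPI", "EmbeddingService"]
signed_baas = {"ModelProviderA"}

missing_baas = [v for v in vendors_receiving_phi if v not in signed_baas]
for vendor in missing_baas:
    print(f"HIPAA gap: {vendor} receives PHI with no signed BAA")
```

Trivial as it looks, this inventory step is often where teams first discover that a foundation model API in their stack has been receiving PHI with no agreement in place.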

Regulatory Exposure Is Hidden In Your Stack.

Identify critical compliance gaps in your AI architecture before enterprise procurement does.

Run Your Free Diagnostic