Healthcare

Healthcare Developers Need AI Tools. HIPAA Requires PHI Protection.

Your engineers use Copilot and Claude Code to build patient-facing systems. Those tools transmit code context, including variable names and comments referencing PHI schemas, to third-party servers. Pretense intercepts and protects that traffic before HIPAA liability attaches.

30+ PHI and PII detection patterns
100% of requests logged for audit
On-prem deployment available for air-gapped environments
PDF / JSON compliance report export formats

The Problem

Why existing controls do not address AI coding tool risk for healthcare teams.

PHI appears in code, not just databases

HIPAA applies to protected health information in any form, including code. Function names like fetchPatientMedicalHistory, comments like "returns SSN and DOB from patient record", and variable names like patientInsuranceId are potential PHI indicators. When these appear in AI tool requests, they cross a covered entity boundary.

BAA coverage does not extend to AI coding tools

Your EHR vendor has a BAA. Your cloud provider has a BAA. GitHub Copilot, Cursor, Anthropic, and OpenAI are not your business associates for the purpose of developer tooling. AI coding assistant requests are not covered by standard BAA arrangements.

Traditional DLP cannot distinguish code from PHI

Existing DLP tools scan for PHI patterns in files and communications. They do not intercept AI API traffic and cannot distinguish between a production patient record and a test fixture containing a placeholder SSN. The result is either missed PHI or excessive false positives that block developer productivity.

How Pretense Solves It

PHI pattern detection before LLM transmission

Pretense scans every AI API request for 30+ PHI and PII patterns: SSNs, dates of birth, insurance IDs, patient names, MRNs, and diagnosis codes. When a match is found, the request is blocked with a clear error before any data leaves the developer environment.
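As an illustration of pattern-based screening, the sketch below shows how a request body might be checked against a small set of PHI regexes before forwarding. The pattern names, regexes, and function names are illustrative assumptions, not Pretense's actual detection engine, which covers 30+ patterns.

```typescript
// Hypothetical PHI screening sketch; patterns and names are illustrative.
const PHI_PATTERNS: Record<string, RegExp> = {
  ssn: /\b\d{3}-\d{2}-\d{4}\b/,                                    // 123-45-6789
  dateOfBirth: /\b(?:0[1-9]|1[0-2])\/(?:0[1-9]|[12]\d|3[01])\/(?:19|20)\d{2}\b/, // MM/DD/YYYY
  phone: /\b\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}\b/,                  // US phone formats
};

// Return the names of every pattern that matched the request content.
function findPhiMatches(requestBody: string): string[] {
  return Object.entries(PHI_PATTERNS)
    .filter(([, pattern]) => pattern.test(requestBody))
    .map(([name]) => name);
}

// Block the outbound request if any pattern matched.
function screenRequest(requestBody: string): { allowed: boolean; matches: string[] } {
  const matches = findPhiMatches(requestBody);
  return { allowed: matches.length === 0, matches };
}
```

In a real proxy, this check would run on the serialized request before any bytes reach the LLM provider, returning an error to the developer's tool instead of forwarding.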

Code identifier mutation for HIPAA schema protection

Beyond pattern detection, Pretense mutates all code identifiers. Function names like fetchPatientMedicalHistory become _fn4a2b. PHI-adjacent schema names are never transmitted in recognizable form. The LLM sees synthetic code that preserves structure without revealing PHI-related architecture.
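A minimal sketch of how such a deterministic identifier mapping could work is below. The `_fn4a2b`-style naming follows the example elsewhere on this page; the hashing scheme and map structure are assumptions for illustration, not Pretense's actual mutation algorithm.

```typescript
import { createHash } from "node:crypto";

// original -> synthetic, and the reverse map used to restore the LLM's response.
const mutationMap = new Map<string, string>();
const reverseMap = new Map<string, string>();

// Mutate an identifier to a synthetic token, stable within a session.
function mutateIdentifier(name: string, kind: "fn" | "v"): string {
  const existing = mutationMap.get(name);
  if (existing !== undefined) return existing;
  // A short hash keeps the token deterministic for repeated occurrences.
  const tag = createHash("sha256").update(name).digest("hex").slice(0, 4);
  const synthetic = `_${kind}${tag}`;
  mutationMap.set(name, synthetic);
  reverseMap.set(synthetic, name);
  return synthetic;
}
```

The key property is determinism: every occurrence of `fetchPatientMedicalHistory` in a request maps to the same synthetic token, so the LLM still sees consistent references and the code's structure survives.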

HIPAA audit trail for every AI request

Every AI tool request is logged: timestamp, provider, mutation count, blocked secrets, and request hash. Logs are tamper-evident and exportable as PDF or JSON. This documentation satisfies HIPAA audit requirements for AI tool usage in covered entity environments.
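Based on the fields listed above, an audit record might take roughly the following shape. The interface and function names are assumptions for illustration; only the fields themselves come from the description above.

```typescript
import { createHash } from "node:crypto";

// Assumed shape of one audit record, per the fields listed above.
interface AuditRecord {
  timestamp: string;      // ISO 8601
  provider: string;       // e.g. which LLM provider received the request
  mutationCount: number;  // identifiers mutated in this request
  blockedSecrets: number; // secrets caught and stripped
  requestHash: string;    // hash of the mutated request, not its content
}

function buildAuditRecord(
  provider: string,
  mutatedBody: string,
  mutationCount: number,
  blockedSecrets: number
): AuditRecord {
  return {
    timestamp: new Date().toISOString(),
    provider,
    mutationCount,
    blockedSecrets,
    // Hashing instead of storing the body keeps the log itself free of PHI.
    requestHash: createHash("sha256").update(mutatedBody).digest("hex"),
  };
}
```

Storing a hash rather than the request body is what makes the trail tamper-evident without the log becoming a second copy of sensitive content.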

On-prem deployment for air-gapped compliance environments

Healthcare organizations with strict network isolation can deploy Pretense entirely on-premises via Docker Compose or Kubernetes. No data flows to Pretense infrastructure. The proxy, dashboard, and audit store run in your data center.

Compliance Coverage

Pretense generates audit evidence and compliance documentation for the frameworks that matter to healthcare teams.

HIPAA

PHI detection and blocking at API layer

HITECH

Audit controls for electronic PHI

SOC2 Ready

Audit log exports for Type II

Local-First

PHI never leaves your environment

Zero Data Retention

No Pretense server storage

What the LLM Actually Sees

Pretense transforms proprietary identifiers into synthetic tokens before transmission. Structure and logic are preserved. Your IP is not exposed.

Without Pretense: identifiers exposed

// Sent to LLM provider verbatim
async function fetchPatientMedicalHistory(
  patientId: string,
  includeSSN: boolean
) {
  return await ehrClient.getRecord(
    patientId, ENCRYPTION_KEY
  );
}

With Pretense: synthetic identifiers only

// Pretense-mutated before transmission
async function _fn4a2b(
  _v8c3d: string,
  _v2f1a: boolean
) {
  return await _v9e4b._fn7d2c(
    _v8c3d, _v6b1a
  );
}

After the LLM responds, Pretense reverses every mutation. You receive real, working code with your original identifiers restored byte-for-byte.
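The restore step can be sketched as a simple reverse substitution, assuming a synthetic-to-original map built during mutation. The function name and map argument are illustrative, not Pretense's actual API.

```typescript
// Restore original identifiers in an LLM response, given synthetic -> original pairs.
function restoreIdentifiers(llmOutput: string, reverseMap: Map<string, string>): string {
  // Replace longer tokens first so a token that is a prefix of another
  // (e.g. _v8 inside _v8c3d) is never clobbered.
  const entries = [...reverseMap.entries()].sort((a, b) => b[0].length - a[0].length);
  let restored = llmOutput;
  for (const [synthetic, original] of entries) {
    restored = restored.split(synthetic).join(original);
  }
  return restored;
}
```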

Frequently Asked Questions

Does Pretense serve as a BAA-covered service for HIPAA?

Pretense does not process PHI on its servers. The proxy runs locally on developer machines or in your infrastructure. Because no PHI flows through Pretense infrastructure, BAA requirements for Pretense itself are typically not triggered. Consult your compliance counsel for your specific situation.

What PHI patterns does Pretense detect?

Pretense detects SSNs, dates of birth, phone numbers in standard formats, email addresses, credit card numbers, insurance IDs, and custom patterns you configure. Detection runs on request content before any data is forwarded to an LLM provider.

Can Pretense work with de-identified test data environments?

Yes. Pretense can be configured to apply different rule sets per environment. Development environments with de-identified data can have relaxed rules while production-adjacent environments have full PHI blocking.
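A per-environment policy might look like the sketch below. The keys, values, and fallback behavior are illustrative assumptions, not Pretense's actual configuration schema.

```typescript
// Hypothetical per-environment policy; schema is illustrative only.
type PolicyAction = "block" | "warn" | "allow";

interface EnvironmentPolicy {
  phiPatterns: PolicyAction;
  identifierMutation: boolean;
}

const policies: Record<string, EnvironmentPolicy> = {
  development: { phiPatterns: "warn", identifierMutation: true },  // de-identified test data
  staging: { phiPatterns: "block", identifierMutation: true },
  production: { phiPatterns: "block", identifierMutation: true },  // full PHI blocking
};

// Unknown environments fall back to the strictest policy.
function policyFor(env: string): EnvironmentPolicy {
  return policies[env] ?? policies.production;
}
```

Defaulting unknown environments to the strictest rule set is the safer failure mode for a covered entity: a misconfigured environment name blocks PHI rather than leaking it.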

How does Pretense handle audit log retention for HIPAA?

Pretense stores audit logs in a local SQLite database with configurable retention periods. Logs include all required fields for HIPAA audit controls. They can be exported to your existing compliance storage systems via the SIEM integration.

Protect your healthcare team in 30 seconds

One environment variable. No code changes. No workflow disruption. Pretense intercepts every AI API request from day one.

No credit card required. Free tier available. Local-first, nothing leaves your machine.
