HIPAA Compliant AI for Therapists: 9 Critical Truths

Posted in Compliance, Data Privacy on March 23, 2026 by the Editorial Team (Compliance & Data Privacy)

HIPAA Compliant AI for Therapists: 9 Things to Verify Before You Sign Anything

“HIPAA compliant” is a phrase any vendor can put on their website. There’s no certification. No audit. No stamp. Here’s what actually separates compliant from marketing.

📅 Updated 2026 · 10 min read · 📋 9-question vendor checklist included
Key figures:

  - Record number of healthcare data breaches in 2024 — mental health records hold the highest sensitivity classification
  - $9.8M average cost of a healthcare breach in 2024 — the highest of any industry for 14 consecutive years (IBM)
  - 0 official bodies certify HIPAA compliance — any vendor can claim it
  - 42 CFR Part 2 SUD protections are stricter than HIPAA — and most AI vendors have never assessed them

“HIPAA Compliant” Doesn’t Mean What You Think It Means

There is no HIPAA certification.

No government body reviews an AI tool and issues a compliance stamp. No third-party auditor grants the designation. “HIPAA compliant” is a self-reported claim — which means when you ask a vendor if they’re HIPAA compliant, you’re asking them to grade their own homework. Most will say yes. That answer tells you almost nothing.

What the phrase actually means: a vendor has a signed Business Associate Agreement, documented technical safeguards, and a compliance architecture they can describe specifically and show you. Without those three things, “HIPAA compliant” is marketing. With them, it’s the start of a real evaluation — not the end of one.

“Signing a BAA doesn’t reduce the probability of a breach. It determines who answers for it — and the answer, under HIPAA, is still you.”

Here’s the part most compliance articles skip.

A Business Associate Agreement does not transfer your liability to the vendor. When a vendor you’ve signed a BAA with causes a breach, HHS holds your practice accountable for the violation. The BAA gives you the contractual right to pursue the vendor afterward — after you’ve already been fined, after the breach has been reported, after the damage is done. It’s not protection. It’s recourse.

According to HHS OCR enforcement data, the most common driver of HIPAA actions against covered entities isn’t a sophisticated breach. It’s inadequate business associate oversight. Not hackers. Vendors you didn’t vet. The risk most therapists should be thinking about isn’t “what if someone breaks into our system?” It’s “what if a patient discovers their data touched a non-compliant tool and files a complaint?”

This guide gives you the five requirements that actually mean something, nine vendor questions that go further than “are you compliant?” — and a few things the compliance conversation consistently leaves out.

5 Requirements That Actually Mean Something

Four of five is not mostly compliant. It’s non-compliant. All five, documented, before any data moves.

REQ 01

A Signed BAA — Before Any Data Moves

Required. Not optional. Not retroactive.

Every AI tool that touches patient data requires a signed BAA. The scheduling tool. The note-taking app. The billing platform. Every AI feature inside each of them. Without a signed BAA, using that tool is a HIPAA violation — regardless of how the vendor describes their security, regardless of how long you’ve been using it, regardless of whether any breach has occurred.

A BAA must be executed before any PHI is transmitted.

Not after onboarding. Not “available on request after activation.” Before. A vendor who sends the BAA only once you’ve connected your practice data has already put you in violation — even if they don’t know it and even if you didn’t.

⚠️ Common scenario: Vendor offers a 14-day free trial. BAA is “available upon request after activation.” You connect your EHR on day one. You’ve violated HIPAA before the trial ends. The feature set doesn’t matter at this point.

REQ 02

PHI Redaction — Not Just Encryption

You don’t need to trust the AI model. You need the platform to keep PHI away from it.

Most therapists ask the wrong question. “Is the AI HIPAA compliant?” is not the question. The question is: does my platform prevent patient identifiers from ever reaching the AI in the first place?

If a genuine PHI redaction pipeline exists, the external model’s own compliance posture becomes largely irrelevant — because no PHI reaches it. The platform is the PHI boundary, not the AI vendor. That’s a structurally stronger compliance position, and it completely reframes how you should evaluate AI tools.

What redaction means in practice: before any clinical content leaves the secure environment, the system strips name, DOB, insurance IDs, every HIPAA-defined identifier. The AI processes de-identified text. The output returns to the platform where identifiers are re-attached. The external model sees a patient’s anxiety description. It never sees whose description it is.
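The round trip described above (strip identifiers, send de-identified text, re-attach on return) can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: the patterns and names are invented, and a real Safe Harbor pipeline must cover all 18 HIPAA identifier categories, typically with trained entity recognition rather than regex.

```python
import re
import uuid

# Hypothetical minimal sketch of a redact/re-attach pipeline.
# Only the round-trip structure is the point; real de-identification
# is far more thorough than these toy patterns.
IDENTIFIER_PATTERNS = {
    "NAME": re.compile(r"\b(?:Jane|John) [A-Z][a-z]+\b"),   # stand-in for a real NER step
    "DOB": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
    "INSURANCE_ID": re.compile(r"\bINS-\d{6,}\b"),
}

def redact(text: str):
    """Replace identifiers with opaque tokens; keep a map to restore them later."""
    mapping = {}
    for label, pattern in IDENTIFIER_PATTERNS.items():
        for match in pattern.findall(text):
            token = f"[{label}-{uuid.uuid4().hex[:6]}]"
            mapping[token] = match
            text = text.replace(match, token)
    return text, mapping

def reattach(text: str, mapping: dict) -> str:
    """Restore identifiers once the de-identified text returns to the platform."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

note = "Jane Doe (DOB 04/12/1988, INS-204881) reports reduced anxiety."
safe, mapping = redact(note)
# `safe` contains no identifiers; only this string would leave the platform.
restored = reattach(safe, mapping)
assert restored == note
```

The design point is that the external model only ever sees `safe`; the mapping never leaves the secure environment.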

Ask for the architecture diagram. A vendor with real PHI redaction can draw it. “We use encryption and security best practices” is not an architecture diagram — it’s a non-answer.

REQ 03

Audit Logging — Per Action, Seven Years

Your malpractice defense two years from now depends on what gets logged today.

A patient challenges their clinical record. Their attorney requests documentation of every AI action that touched their file. It’s two years from now. Your platform produces a general activity log. No per-record detail. No action-level timestamps. You are now defending your clinical decisions without the evidence HIPAA required you to keep.

That’s not a hypothetical. It’s a failure mode with a timeline.

HIPAA requires comprehensive access logging: the action taken, the record affected, the clinician who triggered it, and a timestamp. Most state laws require behavioral health practices to retain these logs for a minimum of seven years. Per HHS Security Rule requirements, missing audit logs are themselves a violation — entirely separate from any underlying breach.

⚠️ Red flag: “Activity logs available in your dashboard” is not the same as per-action AI audit logging with patient record-level detail. Ask specifically. Audit logging is not an enterprise upgrade.
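The difference between "dashboard activity" and per-action audit logging is concrete: every entry names an action, a patient record, a clinician, and a timestamp, and entries are append-only. A minimal sketch, with invented field names (not any vendor's actual schema):

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical per-action audit entry; field names are illustrative.
@dataclass(frozen=True)
class AIAuditEntry:
    action: str             # e.g. "draft_progress_note"
    patient_record_id: str  # record-level detail, not just "activity"
    clinician_id: str       # who triggered the AI action
    timestamp: str          # ISO-8601, UTC

def log_ai_action(log_path: str, action: str, record_id: str, clinician_id: str) -> AIAuditEntry:
    """Append one immutable entry per AI action; retention policy is enforced elsewhere."""
    entry = AIAuditEntry(
        action=action,
        patient_record_id=record_id,
        clinician_id=clinician_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    with open(log_path, "a") as f:  # append-only: entries are never rewritten
        f.write(json.dumps(asdict(entry)) + "\n")
    return entry
```

A log in this shape can answer the attorney's request above: filter by `patient_record_id` and produce every AI action, with actor and timestamp, for that one file.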

REQ 04

42 CFR Part 2 — The Regulation Most AI Vendors Haven’t Heard Of

If your client has a substance use history, different rules apply.

A client comes to you for depression and anxiety. They also drink heavily. Both are in the record. Under standard HIPAA, those records are handled like any other mental health records. Under 42 CFR Part 2, the substance use portions carry stricter consent requirements and different disclosure rules — more protective than HIPAA, not less.

Most behavioral health AI vendors have never assessed their compliance with Part 2. Not because it's obscure: SAMHSA significantly updated it in 2024, and it applies to most practices accepting Medicare or Medicaid for SUD-related services. Because nobody asked them to.

When you ask a vendor directly about Part 2, the answer tells you more about the depth of their compliance review than any marketing page will.

Quick test: Ask your vendor: “Are you compliant with 42 CFR Part 2 for substance use disorder records?” If they’ve never heard of it, or ask you to repeat the question, you’re done with that vendor evaluation.

REQ 05

Clinician Approval Gate — AI Drafts, You Sign

The liability line. The AI writes it; you own it.

When AI generates a progress note that enters the official record without clinician review, who carries responsibility for its content? The answer isn’t settled law. But it’s consistently resolved unfavorably for the practice. AI-generated content in the clinical record without a licensed professional in the loop creates liability ambiguity — and ambiguity, in a malpractice or regulatory context, is not your friend.

The fix is structural, not procedural.

Every AI output must require clinician review before entering the record. Not as a preference setting. Not as a best practice recommendation. As a non-configurable architectural requirement — the boundary between AI that supports clinical work and AI that replaces the clinician without their knowledge.
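"Structural, not procedural" has a concrete meaning in code: the only path into the record refuses anything that isn't clinician-signed, so the gate cannot be toggled off in settings. A hypothetical sketch (names and types invented for illustration):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class NoteStatus(Enum):
    AI_DRAFT = "ai_draft"
    SIGNED = "signed"

@dataclass
class ProgressNote:
    content: str
    status: NoteStatus = NoteStatus.AI_DRAFT
    signed_by: Optional[str] = None

def sign(note: ProgressNote, clinician_id: str) -> ProgressNote:
    """Clinician review and sign-off."""
    note.status = NoteStatus.SIGNED
    note.signed_by = clinician_id
    return note

def enter_record(note: ProgressNote) -> None:
    """The only entry point to the record: rejects unsigned AI drafts unconditionally."""
    if note.status is not NoteStatus.SIGNED or note.signed_by is None:
        raise PermissionError("AI draft cannot enter the record without clinician sign-off")
    # ... persist to the clinical record here ...
```

Because the check lives inside `enter_record` itself, there is no preference flag to disable: an unsigned draft simply cannot reach the record.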

✅ Therasoft: Every AI Clinical action requires clinician sign-off before record entry. Non-configurable architectural requirement, not a workflow preference.

⚠️ Something Most Articles Don’t Cover

The Training Data Clause Most Therapists Never Read

Many AI BAAs contain a provision permitting the vendor to use de-identified session data to train or improve their models. This is, technically, HIPAA-compliant — because de-identified data isn’t PHI under HIPAA’s Safe Harbor standard. Your patient’s name is gone. Their diagnosis description, therapeutic patterns, and clinical language are not.

The clinical content of your sessions — stripped of identifiers but otherwise intact — may be used to build commercial AI products. Legally. Without any specific notification that it’s happening.

Most therapists would find this uncomfortable if they knew it was occurring. Most don’t know to look for the clause. Now you do.

What to look for in any BAA: A clause prohibiting the use of your clinical data for model training, product improvement, or any purpose beyond providing the services you contracted for. If the BAA is silent on this, ask directly. If the answer is vague, assume the clause exists in their favor.


9 Questions That Go Beyond “Are You HIPAA Compliant?”

Each question maps to one of the five requirements.

A vendor who gets vague on any of these is answering the question whether they mean to or not. Vagueness is the answer.

  1. Can I have a signed BAA before I connect any data? Not "available upon request." Before any connection. Signed. In hand. (Requirement 1)
  2. Does your BAA prohibit using my session data for model training? The training data clause. If the BAA is silent, the answer is probably no. (Requirement 1 + training data)
  3. What exactly gets sent to your external AI model on each request? Show me the data payload. Patient identifiers included? A diagram, not a description. (Requirement 2)
  4. How long are AI action logs retained, and can I pull per-record reports? Per-action detail, record-level, minimum seven years. "Dashboard activity" is not sufficient. (Requirement 3)
  5. Are you compliant with 42 CFR Part 2 for SUD records? If they've never heard of it, or hesitate, end the call. This regulation is foundational for behavioral health. (Requirement 4)
  6. What happens to my data when I cancel? Timeline, export format, destruction certificate. The BAA must address this. Get it in writing before signing. (Requirement 1)
  7. Are you SOC 2 Type II certified, or pursuing it? Third-party audited security controls. Complements HIPAA. Signals maturity beyond self-attestation. (Overall security posture)
  8. Which of your staff have access to my practice's patient data? Role-based access controls, background check policies, internal access logs. Required under HIPAA's minimum necessary standard. (Requirements 1 & 3)
  9. Can AI output enter the clinical record without clinician review? The only acceptable answer is no. Any yes here is a malpractice and compliance risk, regardless of how it's framed. (Requirement 5)

What Compliant Architecture Actually Looks Like

The five requirements aren’t a checklist — they’re layers. Understanding the sequence shows you exactly where the PHI boundary sits and what breaks down in a non-compliant tool’s architecture.

Therasoft HIPAA Compliant AI — 4-Layer Compliance Architecture

  - Layer 1 — Secure Practice Environment: EHR, scheduling, billing, clinical records (BAA ✓)
  - Layer 2 — PHI Boundary: PHI redaction pipeline — all patient identifiers stripped before external processing (42 CFR ✓)
  - Layer 3 — External AI Processing: de-identified clinical content only; no PHI ever transmitted
  - Layer 4 — Clinician Approval Gate: PHI re-attached, draft reviewed and signed before record entry (Audit ✓)

PHI never leaves Layer 1. External AI only processes de-identified content at Layer 3. Clinician approval at Layer 4 is mandatory. Every action is logged for seven-year retention.


Frequently Asked Questions

Direct answers to the questions we hear most.

Can I use ChatGPT for therapy notes?

No. OpenAI does not offer a signed BAA for standard or Plus ChatGPT accounts. Entering any PHI — patient name, DOB, diagnosis, session content — is a HIPAA violation. This applies to the browser, mobile app, API without a specific enterprise healthcare agreement, and every plugin.

You can technically use ChatGPT with all PHI manually stripped, but this eliminates most clinical utility for patient-specific documentation. Purpose-built behavioral health platforms handle the de-identification automatically as part of a compliant architecture — so you don’t have to manage the PHI boundary manually.

If a vendor has a BAA, are they HIPAA compliant?

A BAA is necessary but not sufficient. It establishes the legal relationship and creates obligations — but the BAA alone doesn’t mean the vendor has PHI redaction in place, audit logging, 42 CFR Part 2 compliance, or clinician approval gates. A vendor can sign a BAA and still transmit PHI to unprotected external models.

The BAA is the starting point of the evaluation, not the end of it. You need the BAA and documented confirmation of the four technical requirements. Anything less means you’re relying on the vendor’s word rather than their architecture.

Is Therasoft AI HIPAA compliant?

Yes, across all five requirements. Therasoft provides a signed BAA covering all AI-powered features. PHI is redacted before any external AI model processes clinical content — patient identifiers never leave the platform. Every AI action is logged per-record with clinician identity and timestamp for seven-year retention. Therasoft AI Clinical addresses 42 CFR Part 2 for SUD-related documentation. And every AI output requires clinician review before entering the official record — non-configurable.

Therasoft’s BAA explicitly prohibits use of clinical session data for model training purposes. You can ask for confirmation of any of these in writing before connecting any practice data.

What should I do if an AI tool I’m using has no BAA?

Stop using it for any purpose involving patient data immediately and document the date. If PHI was transmitted, conduct a breach risk assessment under the HIPAA Breach Notification Rule’s four-factor test. Consult your HIPAA Privacy Officer or healthcare legal counsel if the exposure was significant in volume or duration.

Going forward: no BAA before connection means no connection. Period. Treat it as a non-negotiable precondition before demos, trials, or any data handshake of any kind.

Do payers treat AI-generated notes differently?

No. Payers evaluate clinical content, medical necessity documentation, and the licensed clinician’s signature. They have no visibility into whether AI was involved in drafting the note. AI-assisted notes generated through a compliant platform and reviewed and signed by the responsible clinician are indistinguishable from manually written notes in the billing and audit context.

The compliance question affects the data processing layer — how session content is handled when the note is being generated. It has no bearing on how the final signed document is treated by payers or regulators, provided the clinician approval gate is in place.

What to Do Before the Next Vendor Demo

The behavioral health AI market is full of companies using compliance language fluently. Most of them haven’t done the architecture work that language implies. The difference between a genuinely compliant platform and a non-compliant one is not detectable from a website, a sales call, or a demo. It’s only detectable by asking the right questions and requiring documentation in response.

You don’t need to become a compliance expert to protect your practice.

You need to ask for the BAA before any data moves. Request the PHI redaction architecture diagram. Check the BAA for a training data clause. Verify audit logging retention is per-action, per-record, seven years minimum. Those four moves will tell you more about a vendor’s actual compliance posture than hours on their marketing pages — and they take about ten minutes on a sales call.

🔒 Your pre-activation checklist — all five, documented:

  1. Signed BAA on file, executed before any PHI was transmitted
  2. Written confirmation that PHI is redacted before external AI processing
  3. BAA reviewed for training data clause — explicitly prohibited
  4. Audit logging documented: per-action, per-record, seven-year minimum
  5. Clinician approval gate in place: non-configurable, required before record entry

Therasoft was built to answer these questions clearly, because behavioral health practice owners deserve better than the industry's habit of keeping compliance vague. The Therasoft AI Clinical suite, AI Billing, and Smart Calendar operate within one HIPAA-compliant environment, one BAA, and one audit trail. If you want to verify any of it in writing before connecting a single record, that's exactly the right thing to ask.

Compliant by Architecture. Not by Promise.

PHI redaction. Signed BAA. No training data use. Seven-year audit logging. 42 CFR Part 2 compliance. Mandatory clinician approval gates. Documented, in writing, before you connect a single record.

Sources & Research References

  1. HHS Office for Civil Rights. (2024). HIPAA Security Rule: Technical Safeguards and Documentation Requirements. hhs.gov
  2. HHS Office for Civil Rights. (2024). Guidance on Business Associates Under HIPAA. hhs.gov
  3. HHS Office for Civil Rights. (2024). HIPAA Resolution Agreements and Enforcement Data. hhs.gov
  4. SAMHSA. (2024). 42 CFR Part 2 Final Rule: Confidentiality of Substance Use Disorder Patient Records. samhsa.gov
  5. IBM Security. (2024). Cost of a Data Breach Report 2024. ibm.com
  6. American Psychological Association. (2024). Ethical Principles for AI Use in Psychological Practice. apa.org
  7. Therasoft. (2025). HIPAA Compliant AI for Behavioral Health Practices. therasoft.com
About the Author

The Therasoft Editorial Team is composed of behavioral health technology specialists, licensed practice management consultants, and healthcare content strategists with direct experience in mental health billing, clinical documentation, and EHR implementation. All clinical and regulatory content is reviewed against current HIPAA guidance, payer policy, and peer-reviewed research before publication.

