Quick Answer
AI therapy notes can be safe if the tool meets three conditions: it has a Business Associate Agreement (BAA), it does not retain client data after processing, and you can verify both of those things. The risk is not AI itself. The risk is using a tool that stores session content without proper protections.
Why Trust This Guide
Built around the safety questions therapists actually ask before adopting AI notes
This guide is a practical evaluation framework. It covers the three criteria that determine whether an AI notes tool is safe to use, how to assess recording-based vs. typed-input tools, and the specific mistakes that create exposure. Treat it as a checklist, not a sales pitch.
Safety Floor
BAA + zero retention
A signed BAA is the legal minimum. Zero data retention after processing is the architectural minimum.
Risk Variable
Recording vs. typed
Tools that record sessions transmit full audio. Tools that take typed summaries transmit only what you choose to include.
Verification
Architecture over policy
A vendor confident in its safety lets you inspect what data leaves your browser. Policy pages are not enough.
Sources And Method
HHS explains that covered entities need satisfactory assurances in the form of a BAA when a vendor handles PHI on their behalf.
The Security Rule establishes standards for protecting electronic PHI, including transmission security and access controls.
Google Cloud documents HIPAA and BAA coverage for supported AI infrastructure, including Vertex AI.
This is not legal advice. Verify vendor terms directly and consult your licensing body for jurisdiction-specific guidance.
The Three Criteria
What makes an AI notes tool safe
Safety is not a feeling. It is a set of verifiable conditions. Before you use any AI tool with session content, check these three things. If any one of them is missing, the tool is not safe for clinical use.
1. A signed Business Associate Agreement (BAA)
A BAA is the legal minimum for HIPAA compliance. It is a contract that requires the vendor to protect PHI, report breaches, and not use your data for their own purposes. Without a signed BAA, the tool is not compliant. Full stop. "We take security seriously" is not a BAA.
2. No data retention after processing
A BAA protects you legally, but it does not eliminate risk if the vendor stores session content on their servers. Every stored record is a potential breach target. The safest architecture processes your input in memory and discards it after returning the result. "We encrypt it" is not the same as "we do not store it."
3. Verifiable architecture
Policy pages are easy to write. Architecture is harder to fake. A tool that is truly safe should let you verify what data leaves your browser and what comes back. Open your browser's Network Inspector and watch. If the vendor is confident in their architecture, they will encourage you to check.
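If you want a console-level view as well, you can log outbound requests directly. The sketch below is illustrative and assumes the tool sends data with the browser's fetch API; apps built on XMLHttpRequest or WebSockets will not be caught by it, in which case the Network tab remains the reliable view.

```typescript
// Paste into the browser console before generating a note. Wraps
// window.fetch so each outgoing request logs its URL and body.
// Covers fetch-based apps only; XMLHttpRequest or WebSocket traffic
// still requires the Network tab itself.
const originalFetch = window.fetch.bind(window);

window.fetch = async (input, init) => {
  const url = input instanceof Request ? input.url : String(input);
  // For Request objects the body lives on the Request, not on init.
  console.log("Outbound request:", url, "body:", init?.body ?? "(none / on Request object)");
  return originalFetch(input, init);
};
```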
If a tool passes all three, you can evaluate the rest: note quality, format options, pricing, workflow fit. If it fails any one of them, move on.
Risk Profiles
Recording vs. typed notes: different risk profiles
AI therapy note tools fall into two categories. Each has a different data exposure profile, and that difference matters for safety.
| | Ambient Scribing (Recording) | Generation-Based (Typed Input) |
|---|---|---|
| What gets transmitted | Full session audio, often 50+ minutes | A 3-5 sentence summary you write after the session |
| PHI exposure | Everything the client said, verbatim | Only what you choose to include |
| Client consent | Required for recording in most jurisdictions | No recording occurs; consent requirements vary by jurisdiction |
| Data minimization | Difficult. Audio captures everything. | Built in. You control the input. |
| Breach impact | Full session transcripts exposed | Brief clinical summaries exposed (if stored at all) |
Neither approach is inherently wrong. Both can be HIPAA compliant with the right vendor agreements and architecture. But the risk profile is different, and that difference should inform your decision.
Recording-based tools transmit everything your client says. If something goes wrong with the vendor, that is what is exposed. Typed-input tools transmit only the summary you wrote. You are the filter. For many private practice therapists, that difference is the deciding factor.
Evaluation Checklist
What to check before you use any AI notes tool
Before you type a single word of session content into any tool, run through this list. It takes ten minutes and protects you from the most common compliance gaps.
Ask for the BAA
Not whether they "support HIPAA." Ask for the actual Business Associate Agreement. If the vendor cannot produce one, stop here.
Ask what happens to your data after the request
Does the tool retain your session summaries or transcripts? For how long? On whose servers? "We use encryption" is not an answer to the retention question.
Ask who the underlying AI provider is
Many therapy note tools are built on top of OpenAI, Anthropic, or Google models. The BAA chain matters: the tool vendor needs a BAA with its AI provider, and you need a BAA with the vendor.
Ask whether your data trains AI models
Consumer AI products (ChatGPT Free, Google Gemini consumer) may use your inputs for training. Healthcare-grade deployments explicitly prohibit this. Confirm in writing.
Check whether you can verify the data flow
Open your browser's developer tools. Watch the Network tab while generating a note. Can you see exactly what gets sent? A tool with nothing to hide will let you look. An example of what a clean payload looks like follows at the end of this checklist.
Check the consent requirements for your jurisdiction
Recording-based tools require informed consent in most jurisdictions. Post-session input tools typically do not require consent for the note generation itself, but check your licensing body's guidance.
A useful rule of thumb: If a vendor cannot answer these questions clearly in plain language, that tells you something about their architecture. Vendors who have built safe systems are typically eager to explain how they work.
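To make the payload check concrete, here is a hypothetical example of what a well-behaved typed-input tool's request body could look like in the Network tab. The interface and field names are invented for illustration; the point is what the payload contains and, more importantly, what it does not.

```typescript
// Hypothetical request body for a typed-input note tool, roughly what
// you would hope to see in the Network tab. Field names are invented
// for illustration; real tools will differ.
interface NoteRequest {
  summary: string;                     // the post-session summary you typed
  noteFormat: "SOAP" | "DAP" | "BIRP"; // output format selection
}

const goodPayload: NoteRequest = {
  summary:
    "Client reported improved sleep and used grounding skills twice this week. " +
    "Explored workplace stressors. Plan: continue CBT, review thought records next session.",
  noteFormat: "SOAP",
};

// Red flags if they appear in a payload: audio blobs, full transcripts,
// client identifiers, device fingerprints, or analytics fields bundled
// with clinical content.
console.log(JSON.stringify(goodPayload, null, 2));
```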
What to Avoid
Common safety mistakes therapists make
These are not hypothetical. They come up regularly in conversations with clinicians who are trying to do the right thing but are working with incomplete information.
Using ChatGPT or Gemini consumer plans for session content
ChatGPT Free and Plus do not offer a BAA. Neither does Google Gemini's consumer product. Typing session content into either one is a HIPAA violation regardless of whether you remove the client's name. This is the single most common mistake, and it happens because the tools are convenient and the compliance gap is not obvious.
Assuming "HIPAA compliant" on a website means you are covered
"HIPAA compliant" is a marketing claim until you see the BAA. Some tools claim compliance because they use encryption or because their cloud provider is HIPAA-eligible. Neither of those things is the same as having a BAA with you as the covered entity. Ask for the agreement, not the badge.
Confusing encryption with safety
Encryption protects data in transit and at rest. It does not address whether the vendor stores your data, for how long, or who has access. A tool can encrypt everything perfectly and still retain your session notes on its servers indefinitely. Encryption is necessary but not sufficient.
Not asking about the AI provider behind the tool
Many therapy-specific tools are wrappers around OpenAI, Anthropic, or Google models. The tool vendor may have a BAA with you, but if they do not have a BAA with their AI provider, the chain is broken. Your data flows through an unprotected link. Ask who powers the AI and whether a BAA covers that relationship.
Our Approach
How Reframe handles this
Reframe Practice was built specifically to address the safety concerns above. Here is how each criterion is met.
BAA in place
All processing happens through Google Vertex AI under a signed Business Associate Agreement. The BAA covers the AI provider relationship. Reframe also provides a BAA to you as the covered entity.
Zero data retention
Session content is processed in memory and discarded after the note is returned. No session summaries, no transcripts, no client data stored on Reframe servers. There is nothing to breach because nothing is kept. A generic code sketch of this pattern appears at the end of this section.
No recording
Reframe is a generation-based tool. You type a post-session summary. No audio is recorded, transmitted, or stored. You control exactly what information goes in.
Verifiable architecture
Open your browser's Network Inspector while generating a note. Watch what gets sent and what comes back. The architecture is designed to be inspectable, not just described on a policy page.
HIPAA-compliant by physics, not promises.
We built Reframe this way because a Registered Psychotherapist needed a note tool that met their own safety standard. The approach is simple: if the data is never stored, it cannot be breached. The BAA provides the legal framework. The architecture provides the actual protection.
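For readers who want to see the zero-retention pattern in code, here is a minimal sketch of a stateless note-generation handler. It is illustrative only, not Reframe's actual source: the route, field names, and generateWithModel stub are all hypothetical stand-ins.

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Generic zero-retention pattern: the summary exists only as a local
// variable for the lifetime of this request. No database write, no
// file write, and request bodies are kept out of access logs.
app.post("/generate-note", async (req, res) => {
  const { summary, noteFormat } = req.body as { summary: string; noteFormat: string };

  const note = await generateWithModel(summary, noteFormat);

  // Once this response is sent, the summary goes out of scope and is
  // garbage-collected. There is no stored copy to breach.
  res.json({ note });
});

// Stub standing in for a call to a BAA-covered model API (for example,
// a Vertex AI endpoint). Signature and behavior are hypothetical.
async function generateWithModel(summary: string, format: string): Promise<string> {
  return `${format} note drafted from: ${summary}`;
}

app.listen(3000);
```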
Frequently asked questions
Are AI therapy notes HIPAA compliant?
It depends on the tool, not the technology. HIPAA compliance requires a signed BAA, a clear data retention policy, and ideally zero retention. A tool that signs a BAA but stores your session data on its servers carries more risk than one that processes in memory and discards immediately. Check all three criteria before using any tool with session content.
Is it safe to use AI for therapy notes if I remove client names?
Removing names does not make it safe. HIPAA defines PHI as any data that could identify a patient combined with health information. A presenting problem, a distinctive life situation, or session-specific clinical details can all qualify as PHI without a name. The safety question is about whether the tool has the proper legal agreements and architecture, not about what you leave out of the input.
What is the difference between zero-retention and encrypted storage?
Encrypted storage means your data is protected while it sits on the vendor's servers. Zero retention means your data is never stored at all. Both can be HIPAA compliant, but the risk profiles are different. Encrypted data that is stored can still be breached if the vendor's systems are compromised. Data that is never stored cannot be breached because it does not exist after processing.
Do I need to tell clients I use AI for notes?
For tools that record sessions, most licensing bodies require informed consent for audio capture. For tools where you type a post-session summary with no recording involved, consent requirements vary by jurisdiction. Check your specific licensing body's guidance. When in doubt, adding a line to your intake consent form about using AI-assisted documentation tools is a straightforward way to address it.
See the safety claims in practice
The best way to evaluate any tool is to use it. Generate a note, open your Network Inspector, and see for yourself what data is transmitted and retained. No account required for your first 10 notes.