Quick Answer
No. ChatGPT is not HIPAA compliant. OpenAI does not offer a Business Associate Agreement (BAA) for ChatGPT Free, Plus, or Business, which means using any of those plans with protected health information violates HIPAA. OpenAI does offer a BAA for ChatGPT Enterprise and the API, but not for the standard plans most therapists use.
Why Trust This Guide
Grounded in HHS guidance and OpenAI's own BAA documentation
This page is only useful if it stays anchored to the actual vendor and regulator language. The key issue is not whether ChatGPT feels secure. It is whether the product you are using can legally handle PHI in your workflow.
Free / Plus / Business
No BAA
Standard ChatGPT plans do not offer the contractual protection therapists need for PHI.
Sales-managed plans
Conditional BAA
OpenAI says a BAA can be explored for some sales-managed ChatGPT offerings through its sales team.
API
BAA available
OpenAI allows BAAs for qualifying API use cases, which is why architecture matters as much as the model.
Sources And Method
HHS explains that vendors handling PHI on behalf of covered entities need satisfactory assurances in the form of a BAA.
OpenAI documents which products can support a BAA and explicitly excludes standard ChatGPT plans from that path.
OpenAI outlines which business and healthcare offerings support privacy and compliance workflows, including healthcare-specific paths.
Product terms can change. Verify the exact plan, contract, and retention terms in force before you enter client information.
AI Compliance Cluster
Use this page as the compliance answer inside the broader AI tools cluster
Start here when the question is specifically about ChatGPT and HIPAA. Then use the supporting guides below for the bigger tool-selection framework and documentation workflow decisions.
Hub Guide
AI for Therapists
The full framework for privacy, liability, workflow fit, and rollout decisions.
Documentation
AI Therapy Notes
The practical workflow guide for clinicians evaluating note-generation tools.
Tool Comparison
ChatGPT Alternatives
Compare therapist-safe alternatives when the real job is notes, worksheets, or clinical prep.
What therapists are saying
"I was using ChatGPT for note drafts until I realized there was no BAA. Switched to a tool with zero-retention architecture and sleep better now."
Therapist on r/therapists
"The fact that removing client names does not make ChatGPT HIPAA compliant was a wake-up call. Clinical details alone can identify someone."
Therapist on r/psychotherapy
Is ChatGPT HIPAA compliant? The short answer, by plan
The compliance picture depends entirely on which ChatGPT product you are using. These are different products with different terms, and most therapists are on the ones that offer no HIPAA protection at all.
ChatGPT Free
Not HIPAA compliant
No BAA available. OpenAI may use conversations to improve models. Entering client PHI here is a HIPAA violation.
ChatGPT Plus ($20/mo)
Not HIPAA compliant
No BAA available. Conversations are still subject to OpenAI's standard privacy terms. Same HIPAA exposure as the free plan.
ChatGPT Business
Not HIPAA compliant
No BAA available. OpenAI's current BAA documentation explicitly excludes ChatGPT Business from the BAA path.
ChatGPT Enterprise
BAA available (with conditions)
BAA may be available through sales-managed procurement, but it requires organizational setup, legal review, and admin controls. Most solo practitioners cannot access this path.
OpenAI API
BAA available for developers
BAA is available for developers building on the API. If you are using a third-party therapy tool built on the OpenAI API, compliance depends on whether that tool has its own BAA in place with both OpenAI and with you.
The practical reality: The overwhelming majority of therapists who use ChatGPT are on the Free or Plus plans. Neither plan offers a BAA. This means the tool most therapists think of when they say "ChatGPT" is not an option for any client-related work.
What HIPAA actually requires for AI tools like ChatGPT
HIPAA (the Health Insurance Portability and Accountability Act) requires that covered entities, which includes therapists in private practice, only share protected health information with vendors who have signed a Business Associate Agreement. If you searched for "HIPAA compliant ChatGPT," "ChatGPT HIPAA compliance," or "HIPAA compliant AI for therapy notes," the short answer is the same: standard ChatGPT plans do not meet the threshold.
A BAA is a legal contract. It requires the vendor to protect the PHI you share with them, report any breaches, not use the PHI for their own purposes (like training AI models), and delete data on your request. Without a signed BAA, you have no legal framework protecting the client information you share.
The BAA requirement applies to any tool that handles PHI on your behalf, regardless of how peripheral the use feels. "Just drafting notes" still counts. "Just generating worksheet prompts" still counts. If you type in session content that relates to a client and could identify them, you are handling PHI.
What counts as PHI
PHI is any information that relates to a patient's health and could be used to identify them. You do not need to include a name. Any of the following, in combination with health information, can constitute PHI: age, occupation or employer, location details, dates of service, family or legal circumstances, or a distinctive combination of presenting problems.
Therapists sometimes ask: "What if I just remove the name?" This is a reasonable instinct, but it does not solve the HIPAA problem. A note about "a 34-year-old trauma survivor in the middle of a custody battle who is a nurse at Toronto General Hospital" identifies the client without using their name. HIPAA regulators look at the totality of the information, not just whether a proper name appears.
The safe assumption: any content drawn from a real client encounter is PHI until proven otherwise.
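A toy sketch makes the "just remove the name" failure concrete. This is an illustration only, not a de-identification tool; the note text and helper names are invented, and real de-identification must address HIPAA's full identifier list, not a single string substitution:

```python
import re

# Toy illustration only: a naive "remove the name" redactor.
# It shows why stripping a proper name does not de-identify a note.

def strip_name(note: str, client_name: str) -> str:
    """Remove the client's name from a note (and nothing else)."""
    return re.sub(re.escape(client_name), "[REDACTED]", note)

note = ("Jane Doe, a 34-year-old trauma survivor in a custody battle, "
        "works as a nurse at Toronto General Hospital.")

redacted = strip_name(note, "Jane Doe")

# Quasi-identifiers that survive naive redaction. Combined with the
# health information, these can still identify the client.
quasi_identifiers = ["34-year-old", "custody battle", "Toronto General Hospital"]
remaining = [q for q in quasi_identifiers if q in redacted]

print(redacted)
print(f"Identifying details still present: {remaining}")
```

Every quasi-identifier survives the redaction, which is exactly the "totality of the information" problem regulators look at.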
What OpenAI actually says
OpenAI is not vague about this. Their documentation is clear, if you read it carefully.
OpenAI's standard Terms of Service state that ChatGPT is not designed for use cases requiring HIPAA compliance. Their usage policies explicitly prohibit using ChatGPT for certain sensitive healthcare applications without appropriate enterprise arrangements in place.
OpenAI's default privacy settings for Free and Plus accounts allow them to use conversations to improve their models. This means client information you enter may be used as training data. Even if OpenAI anonymizes that data before using it in training, the act of transmitting PHI to a system without a BAA is already a HIPAA violation at the point of transmission, before any downstream use occurs.
OpenAI's own language on Enterprise vs. consumer plans
Consumer plans (Free, Plus)
No BAA. Conversations may be used for model training unless you opt out (and opting out is not a substitute for a BAA). Not designed for regulated healthcare data.
Enterprise plan
BAA available. OpenAI states that Enterprise conversations are not used for model training by default. Includes additional administrative controls. Requires volume commitment and IT configuration.
API (developers)
BAA available for API customers. API inputs are not used for training by default. Relevant for developers building healthcare apps on OpenAI infrastructure, not for therapists using ChatGPT directly.
OpenAI's current business privacy and BAA documentation is the authoritative source for current terms. These policies do change over time. Verify directly before making any compliance decisions, and have legal counsel review before relying on any sales-managed BAA.
The BAA problem, explained plainly
A Business Associate Agreement is not a technicality. It is a signed legal contract between you (the covered entity) and any vendor who handles PHI on your behalf (the business associate). Without it, sharing PHI with a vendor is a HIPAA violation regardless of your intent, how secure the vendor's systems appear, or whether a breach ever occurs.
The BAA requirement exists to create accountability. When a vendor signs a BAA, they agree to specific obligations: they will protect the PHI, they will not use it for their own purposes, they will report breaches to you within specific timeframes, and they will delete the data when the relationship ends. These are legally enforceable commitments, not marketing claims.
The "nobody has been caught" logic does not hold. HIPAA compliance is not about whether you will be audited. It is about what liability you are carrying. If a client's data is later part of a breach at OpenAI or any vendor, and you had entered that data without a BAA, you are personally liable under HIPAA. The breach does not need to be your fault for the liability to attach.
A useful analogy: driving without car insurance is technically possible and most drivers will never get into a serious accident. But the risk structure is fundamentally different from driving insured. The compliance question for HIPAA is not "what is the probability I get caught?" It is "what am I liable for if something goes wrong?"
For most therapists, the answer to that second question, combined with the simplicity of choosing HIPAA-compliant alternatives, makes continued ChatGPT use with client data a liability they do not need to carry.
What about ChatGPT Enterprise? Is it actually HIPAA compliant?
ChatGPT Enterprise does offer BAA capability. This is a meaningful distinction from the consumer plans, and it is worth understanding clearly.
With Enterprise, OpenAI commits to: not using your conversations to train models, providing data processing agreements, supporting HIPAA compliance frameworks, and offering additional administrative controls like SSO, audit logs, and custom retention settings. A signed BAA with OpenAI Enterprise is a legally enforceable HIPAA compliance instrument.
The practical barriers for solo practitioners
Cost
ChatGPT Enterprise is priced for organizations, not individual practitioners. Pricing is sales-managed and typically tied to organizational procurement rather than a simple solo checkout flow.
Setup requirements
Enterprise requires IT configuration: SSO setup, admin console management, domain verification, and policy configuration. Most solo practitioners do not have IT staff and are not equipped to manage enterprise software procurement and configuration on their own.
Legal review required
A BAA is only as protective as the terms within it. Enterprise BAAs are complex documents. Having legal counsel review the agreement before signing is prudent and adds to the cost and timeline of adoption.
Data still leaves your device
Even with a BAA, ChatGPT Enterprise sends your prompts to OpenAI's servers for processing. This is fundamentally different from zero-retention architecture, where data never persists. "HIPAA compliant by contract" and "HIPAA compliant by architecture" are two different compliance postures with different risk profiles.
Compliance by contract vs. compliance by architecture
There are two fundamentally different models for HIPAA-compliant AI tools. Understanding which category a tool falls into changes the risk profile significantly.
Compliant by contract (BAA model)
Your data goes to the vendor's servers. The vendor has promised (in writing) to protect it. You are protected by the legal agreement if something goes wrong. Examples: ChatGPT Enterprise, Quill, Mentalyc.
Compliant by architecture (zero-retention model)
Your data is processed for the request and not retained in the main database afterward. The protection is built into the system design, not just a legal contract. Example: Reframe Practice.
For most therapists, especially those in solo or small group practice, purpose-built HIPAA-compliant therapy tools offer a better risk profile at a lower cost and complexity than configuring ChatGPT Enterprise for healthcare use.
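The two compliance postures differ at the system level, not just on paper. A hypothetical sketch of the difference (all function and variable names are invented for illustration; this is not any vendor's actual code):

```python
# Hypothetical sketch contrasting the two compliance postures.

stored_records = []  # stands in for a vendor's database

def handle_request_contract_model(session_text: str) -> str:
    """BAA model: data is processed AND retained on the vendor's servers.
    Protection comes from the contract; a breach of the vendor's
    systems could still expose the retained data."""
    stored_records.append(session_text)  # persisted after the request
    return f"note drafted from {len(session_text)} chars"

def handle_request_zero_retention(session_text: str) -> str:
    """Zero-retention model: data exists only for the lifetime of the
    request. Nothing is written to durable storage afterward."""
    result = f"note drafted from {len(session_text)} chars"
    # session_text goes out of scope here; nothing is persisted
    return result

out = handle_request_zero_retention("client session content")
handle_request_contract_model("client session content")
print(f"records retained by contract model: {len(stored_records)}")
```

The point of the sketch: in the contract model there is always a growing store of clinical text that a future breach could reach, while in the zero-retention model there is nothing left to breach after the request completes.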
The audit risk: what enforcement actually looks like
HIPAA enforcement is handled by the HHS Office for Civil Rights (OCR). OCR investigates potential violations through two primary channels: complaints filed by individuals (including clients) and breach reports submitted by covered entities when an incident occurs.
Random audits do exist, but most enforcement actions begin with a breach or a complaint. This is the key insight: you are not safe just because no audit is scheduled. You are exposed at the moment something goes wrong, and at that point, the question is whether you had the required safeguards in place.
HIPAA civil penalty tiers (2024 OCR schedule)

| Tier | Culpability | Penalty per violation |
|---|---|---|
| 1 | No knowledge of the violation | $100 – $50,000 |
| 2 | Reasonable cause | $1,000 – $50,000 |
| 3 | Willful neglect, corrected within 30 days | $10,000 – $50,000 |
| 4 | Willful neglect, not corrected | $50,000 minimum |

These are the statutory base amounts; OCR adjusts them annually for inflation, with annual caps up to roughly $1.9 million per violation category.
Using a tool without a BAA because you did not know it was required would likely fall under "reasonable cause" or "willful neglect," depending on how long the practice continued. Ignorance of the BAA requirement is not a defense for a licensed mental health provider, who is a covered entity subject to HIPAA.
The APA data point: A 2025 APA survey found that 67% of psychologists cite data breaches as their number one concern about AI tools in clinical practice. The concern is well-founded. The gap between "my AI tool seems fine" and "my AI tool has a BAA and I understand what that means" is where most HIPAA exposure lives.
Criminal penalties are a separate category and reserved for intentional violations. For the typical therapist who used ChatGPT in good faith without understanding the BAA requirement, civil penalties are the relevant risk, and they are substantial enough to take seriously.
What therapists use instead
The good news: HIPAA-compliant AI options exist for every major clinical workflow. The question is which compliance model you prefer.
For progress notes, the strongest options are tools purpose-built for therapy documentation. For worksheet and material generation, the full comparison of ChatGPT alternatives for therapists covers the major players. The summary below focuses on the compliance picture.
| Tool | BAA available | Data storage | Solo-friendly | Notes / Worksheets |
|---|---|---|---|---|
| ChatGPT Free/Plus | No | Retained (may train models) | Yes (but not compliant) | General AI |
| ChatGPT Enterprise | Yes | Retained (not for training) | Not practical | General AI |
| Quill | Yes | Retained with BAA | Yes | Notes only |
| Mentalyc | Yes | Retained with BAA | Yes | Notes only |
| Reframe Practice | Yes (via Google Vertex AI) | Zero retention (in-memory) | Yes | Notes + Worksheets |
The distinction between "retained with BAA" and "zero retention" matters in practice. When a vendor retains your session data, even under a BAA, that data exists on their servers. It is protected by contract, but a breach of their systems could still expose it. Zero-retention architecture reduces retained clinical text and lowers breach exposure.
Reframe Practice: HIPAA compliant by architecture
Reframe Practice generates worksheets, progress notes, session prep materials, and treatment plan drafts. All processing happens in-memory using Google Vertex AI (with a signed BAA). Nothing is stored on Reframe's servers after your request completes. If you want to verify this yourself, open your browser's network inspector while generating a worksheet. Watch what gets sent and what comes back.
Try it free
Not an endorsement that any particular tool is right for your practice. HIPAA compliance is a floor, not a ceiling. Review BAA terms directly, consult your licensing body's guidance on AI tools, and make decisions appropriate to your jurisdiction and client population.
Free: Documentation Time Audit
Track exactly where your documentation hours go each week. A simple self-audit worksheet used by over 400 therapists to find recoverable time in their practice admin.
Free download. No spam. Unsubscribe anytime.
Frequently asked questions
Is ChatGPT HIPAA compliant in 2026?
No. Standard ChatGPT plans are not HIPAA compliant. OpenAI does not offer a Business Associate Agreement (BAA) for Free, Plus, or Business. Using those plans with protected health information violates HIPAA. OpenAI says some sales-managed ChatGPT offerings and qualifying API use cases can support a BAA, but that is not the standard therapist setup.
Can I use ChatGPT for therapy notes without violating HIPAA?
Not on the standard Free or Plus plans. Any session content you type into ChatGPT that could identify a patient, including presenting problems, symptoms, or clinical details, counts as protected health information (PHI). Entering PHI into a system without a BAA is a HIPAA violation regardless of intent or whether a breach occurs.
Does OpenAI offer a BAA for ChatGPT?
OpenAI says BAAs can be explored for certain sales-managed ChatGPT offerings and are available for qualifying API use cases. Standard ChatGPT plans, including Free, Plus, and Business, do not include BAA capability. Most therapists use those standard plans, which means most therapists using ChatGPT with client data are out of compliance.
Is ChatGPT Enterprise HIPAA compliant?
Certain sales-managed ChatGPT offerings can support a Business Associate Agreement, which is the foundational requirement for HIPAA compliance. However, HIPAA compliance also depends on how your organization configures data controls, access permissions, and breach response. These plans require procurement, legal review, and admin controls that most solo practitioners do not have.
What happens if I use ChatGPT with client data?
Using ChatGPT (Free or Plus) with protected health information creates personal HIPAA liability. HIPAA penalties range from $100 to $50,000 per violation, with annual caps up to $1.9 million per violation category. Most violations surface during breach investigations rather than random audits. If a data incident later involved OpenAI's systems and you had entered client data without a BAA, you would have no legal protection.
Can I use ChatGPT if I don't include the client's name?
No, this is a common misconception. HIPAA defines protected health information as any data that could identify a patient in combination with health information. You do not need to include a name for information to qualify as PHI. Unique clinical details, a specific combination of presenting problems, a distinctive life situation, or session-specific content can all be PHI even without a name attached.
Is the ChatGPT API HIPAA compliant?
OpenAI does offer a Business Associate Agreement for API customers. Developers building HIPAA-compliant applications on top of the OpenAI API can enter into a BAA. However, the BAA is with the developer, not the end user. If you are using a third-party tool built on the ChatGPT API, HIPAA compliance depends on whether that third-party tool has its own BAA with OpenAI and whether it offers a BAA to you as the covered entity.
What AI tools for therapists are actually HIPAA compliant?
Tools with HIPAA compliance fall into two categories. BAA-based compliance: the vendor signs a Business Associate Agreement with you and promises to protect your data. This includes tools like Quill, Mentalyc, and ChatGPT Enterprise. Architecture-based compliance: data is processed in-memory for the request and not retained in the main database afterward. Reframe Practice uses zero-retention architecture, backed by a Google Vertex AI Business Associate Agreement.
Is Google Gemini HIPAA compliant for therapists?
Standard Google Gemini (the consumer product at gemini.google.com) is not HIPAA compliant for therapist use. Google Workspace with Healthcare add-on does support BAA capability, and Google Cloud services (Vertex AI) can be used under a BAA. The same logic applies to Gemini as to ChatGPT: the consumer-facing product lacks the BAA infrastructure required for HIPAA compliance.
Does deleting a ChatGPT conversation make it HIPAA compliant?
No. Deleting a conversation removes your access to it, not OpenAI's data processing logs. Without a Business Associate Agreement, you have no contractual guarantee about what happens to data after deletion. The compliance gap is the missing BAA, not the retention UI.
What is the difference between HIPAA compliance and HIPAA security?
HIPAA compliance is the full framework: signed BAAs, documented policies, breach procedures, staff training. HIPAA security is specifically the Technical Safeguards: encryption, access controls, audit logs, transmission protection. A tool can have strong security features but still not be HIPAA compliant without a signed BAA.
How do I know if an AI tool is HIPAA compliant?
Ask three questions before using any AI tool with client data. First: does the vendor offer a signed Business Associate Agreement? If they say 'we take security seriously' without offering a BAA, that is not HIPAA compliance. Second: does the tool retain your data after processing? A BAA helps, but a vendor who retains all your session notes creates ongoing risk even with legal protections in place. Third: who is the underlying AI provider and do they have a BAA?
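The three questions can be phrased as a simple screening checklist. A sketch only, with invented field names; the real answers have to come from the vendor's BAA documentation, not from marketing copy:

```python
from dataclasses import dataclass

# Sketch of the three-question HIPAA screen. The Vendor fields are
# invented for illustration and must be filled in from the vendor's
# actual BAA documentation.

@dataclass
class Vendor:
    name: str
    offers_signed_baa: bool               # Q1: will they sign a BAA with you?
    retains_data_after_processing: bool   # Q2: is your data stored afterward?
    upstream_ai_provider_has_baa: bool    # Q3: is the underlying model covered?

def hipaa_screen(v: Vendor) -> list[str]:
    """Return unresolved compliance concerns (empty list = passes the screen)."""
    concerns = []
    if not v.offers_signed_baa:
        concerns.append("no signed BAA: not HIPAA compliant, full stop")
    if v.retains_data_after_processing:
        concerns.append("retains clinical data: ongoing breach exposure even with a BAA")
    if not v.upstream_ai_provider_has_baa:
        concerns.append("underlying AI provider not covered by a BAA")
    return concerns

consumer_chatbot = Vendor("generic consumer chatbot", False, True, False)
print(hipaa_screen(consumer_chatbot))
```

Note that the first question is a hard gate: a "no" there ends the evaluation, while the other two distinguish contract-based from architecture-based compliance.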
Does ChatGPT offer a BAA (Business Associate Agreement)?
Not on the standard plans. ChatGPT Free, Plus, and Business do not include a BAA. OpenAI does offer a BAA for ChatGPT Enterprise through a sales-managed procurement process, and for qualifying API use cases. Most therapists use the standard plans, which means a ChatGPT BAA is not available to them without an enterprise contract.
Is ChatGPT Enterprise HIPAA compliant in 2026?
ChatGPT Enterprise can support HIPAA compliance in 2026 through a Business Associate Agreement available via sales-managed procurement. However, compliance also depends on organizational configuration: data controls, access permissions, SSO, audit logs, and breach response. Solo practitioners typically cannot access this path due to procurement requirements and cost. Having a BAA also does not mean data never leaves your device. It means the vendor has agreed contractually to protect it.
Is there a HIPAA compliant AI for therapy notes?
Yes. Several HIPAA compliant AI tools exist specifically for therapy notes. Tools like Quill and Mentalyc offer BAA-based compliance. Reframe Practice uses zero-retention architecture, where data is processed in-memory and not retained in the main database afterward, backed by a Google Vertex AI BAA. The key question is whether the tool has a signed BAA with both the AI provider and with you as the covered entity.
Related guides
AI for therapists: what is actually safe
The broader framework for evaluating privacy, liability, and workflow fit before rollout.
AI SOAP notes for therapists
How note-generation tools work, where review still matters, and what the workflow looks like in practice.
Best therapy aid tools
A clinician-facing comparison of documentation, worksheet, and session-prep tools.
ChatGPT alternatives for therapists
Which HIPAA-safer alternatives map best to the jobs therapists are trying to get done.
SEO for therapists
The full visibility playbook for therapy practices: Google Business Profile, local search, service pages, and trust signals.