HIPAA Compliant Voice AI: What Healthcare Practices Need to Know
Every patient call your AI handles involves protected health information. This guide covers the real HIPAA requirements for voice AI—not marketing language.

Medically reviewed by Dr. Charles Sweet, MD, MPH, board-certified psychiatrist and medical advisor to Linear Health.
Every patient call your AI handles involves protected health information. The patient's name. Their appointment history. Their insurance details. The reason they're calling. The scheduling actions the AI takes on their behalf. All of it is PHI under HIPAA, and all of it creates obligations for your practice and your vendor.
That should be straightforward. In practice, it's a mess. "We're HIPAA compliant" has become a checkbox that vendors slap on their website like a certification badge, whether they've earned it or not. Saying it is easy. Having the documentation, the architecture, and the legal agreements to back it up is something else entirely.
This guide is for the person at your practice who needs to sign off on a voice AI deployment and wants to know, specifically, what HIPAA actually requires here. Not marketing language. The real requirements.
What HIPAA says about voice AI
Under HIPAA, any entity that processes PHI on behalf of a covered entity (that's your practice) is a business associate. An AI voice agent that answers patient calls, looks up scheduling data, or touches insurance information absolutely qualifies. There's no gray area, no workaround, no "we only process voice data so it doesn't count."
Any voice AI vendor handling protected health information must sign a Business Associate Agreement. This is federal law, not a best practice. The BAA defines what the vendor can do with patient data, how they protect it, how they report breaches, and what happens to your data when the contract ends. If a vendor tells you their standard terms of service "already cover HIPAA requirements," that's your cue to end the conversation. They either don't understand the regulation or they're hoping you don't.
Five compliance checkpoints for healthcare voice AI
HIPAA compliance for voice AI requires encryption in transit and at rest, role-based access controls, comprehensive audit trails, documented retention and deletion policies, and breach notification procedures. Here's what each of those actually means when the data in question is a recorded phone call with a patient.
1. Encryption, everywhere
Every voice call, transcript, and patient record that moves through the system needs to be encrypted. TLS 1.2 or higher for data in transit. AES-256 for data at rest. That covers the audio stream during the conversation, the transcript afterward, any patient data pulled from or written to your EHR, and any backups or logs that contain PHI. Ask vendors where data exists in plaintext. If they can't give you a clear answer, the architecture probably has gaps.
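On the in-transit side, "TLS 1.2 or higher" is something your own integration code can enforce rather than take on faith. A minimal sketch using Python's standard library; this is an illustrative client-side check, not any vendor's actual configuration:

```python
import ssl

def make_phi_tls_context() -> ssl.SSLContext:
    """Build a client TLS context that refuses anything below TLS 1.2.

    create_default_context() also enables certificate and hostname
    verification by default, which connections carrying PHI need.
    """
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # the floor HIPAA guidance expects
    return ctx

ctx = make_phi_tls_context()
print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # → True
```

Wrapping a socket with this context will fail the handshake against any endpoint that only speaks TLS 1.0 or 1.1, which is exactly the behavior you want when the payload is a patient transcript.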
2. Access controls that mean something
Not everyone at the vendor's company should have access to your patients' call recordings. HIPAA requires role-based access controls that limit PHI access to people who genuinely need it for their specific function. Your practice admin should be able to configure who on your team can see what. And the vendor should be able to tell you exactly who on their side can access your data, and under what circumstances. If the answer is vague, that's a problem.
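In practice, "limit PHI access to people who need it for their specific function" comes down to a deny-by-default role-to-permission mapping. A hedged sketch, with hypothetical role and permission names (real systems would load this from the platform's admin configuration):

```python
# Hypothetical role-permission map for illustration only.
ROLE_PERMISSIONS = {
    "front_desk":     {"view_schedule", "book_appointment"},
    "practice_admin": {"view_schedule", "book_appointment",
                       "view_recordings", "manage_users"},
    "billing":        {"view_insurance"},
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: unknown roles and unlisted permissions both fail."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can_access("front_desk", "view_recordings"))  # → False
```

The important property is the default: a role that isn't in the map, or a permission that isn't granted, gets nothing. Any RBAC scheme that fails open instead of closed is the vague answer this section warns about, expressed in code.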
3. Audit trails that hold up
Every interaction the AI has with PHI needs a log. Every call handled. Every EHR lookup. Every appointment booked or changed. Every time a human at the vendor accesses your call data. These logs need to be tamper-proof and kept for at least six years. This isn't just a compliance checkbox. It's how you demonstrate, if you're ever audited, exactly what happened with patient data and when.
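One common way to make a log tamper-evident is hash chaining: each entry includes a hash of the previous entry, so altering any past record breaks every hash after it. A simplified sketch of the idea (production systems would add signing and write-once storage on top):

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(log: list, event: dict) -> dict:
    """Append an audit entry chained to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    body = {"ts": time.time(), "event": event, "prev": prev_hash}
    payload = json.dumps(
        {"ts": body["ts"], "event": body["event"], "prev": body["prev"]},
        sort_keys=True,
    ).encode()
    body["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(body)
    return body

def verify_chain(log: list) -> bool:
    """Recompute every hash; any edited or reordered entry breaks the chain."""
    prev = GENESIS
    for e in log:
        payload = json.dumps(
            {"ts": e["ts"], "event": e["event"], "prev": e["prev"]},
            sort_keys=True,
        ).encode()
        if e["prev"] != prev or e["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = e["hash"]
    return True
```

This is why "tamper-proof" is a property you can ask a vendor to demonstrate, not just assert: if their logs are chained or signed, retroactive edits are detectable by recomputation.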
4. Data lifecycle policies
How long are call recordings stored? Where do transcripts live? What happens to your data if you stop using the vendor? HIPAA requires documented retention and deletion policies. The vendor needs to tell you their data lifecycle from creation to purge, and they need to certify complete deletion when the contract ends. "We'll delete it" isn't sufficient. You need a timeline and confirmation.
5. Breach notification that's fast enough
HIPAA gives business associates a 60-day window to report breaches. That's the legal ceiling, not the target. The vendors who take this seriously commit to notification within 24 to 72 hours. Ask to see their actual incident response plan. Not a summary on their website. The plan itself. It should cover how breaches get detected, who gets notified, what information you receive, and what remediation steps they take.
Mistakes practices make that nobody warns them about
Even organizations that take HIPAA seriously can trip on voice AI because the technology is new and the compliance implications aren't always intuitive.
Recording calls without handling consent correctly. State laws vary. Some require one-party consent for recording, others require all-party consent. Your voice AI needs to handle this based on the patient's location, not just yours. The AI should inform callers that the conversation may be recorded. This sounds basic, but it's frequently missed.
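The practical rule this implies: when either party is in an all-party-consent state, apply the stricter standard. A hedged sketch of that logic; the state list is an illustrative subset, not legal advice, and must be verified against current law:

```python
# Illustrative subset of all-party-consent states. Verify against
# current state law before relying on any such table in production.
ALL_PARTY_STATES = {"CA", "FL", "IL", "PA", "WA"}

def consent_mode(caller_state: str, practice_state: str) -> str:
    """Apply the stricter rule when either party is in an
    all-party-consent state."""
    if caller_state in ALL_PARTY_STATES or practice_state in ALL_PARTY_STATES:
        return "all_party"
    return "one_party"

print(consent_mode("CA", "TX"))  # → all_party
```

Note that the decision keys on the caller's location as well as the practice's, which is exactly the distinction the paragraph above says gets missed.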
Leaving the voice AI out of the risk assessment. HIPAA requires regular risk assessments that cover every system handling PHI. Deploy voice AI without updating your risk assessment and you've created a compliance gap, even if the vendor is technically compliant on their end. Your auditor will find it.
Assuming the vendor handles everything. A BAA doesn't mean you can stop thinking about compliance. Your practice is still the covered entity. You're responsible for ensuring the vendor meets their obligations, that your staff uses the system correctly, and that your policies align with how the AI processes patient data. Compliance is shared responsibility, not delegation.
No audit trail for what the AI did in your EHR. Some voice AI products log that a call happened but don't log the specific actions the AI took: which records it accessed, which appointments it modified, which patient data it pulled. That's insufficient. Every PHI access and every action on PHI needs to be recorded.
Ten questions for your next vendor call
Bring this list to your next evaluation. If the vendor can't answer these clearly, you've learned something important.
- Do you provide a signed Business Associate Agreement? Can we review a draft before proceeding?
- What encryption standards do you use for data in transit and at rest?
- What SOC certifications do you hold? (SOC 2 Type II certification verifies that security controls have been tested over time, not just at a single point. This matters significantly more than Type I.)
- Who at your company can access our patient call data, and under what circumstances?
- How long are call recordings and transcripts retained?
- What is your breach notification timeline? Can we see your incident response plan?
- Do you have a BAA with your cloud infrastructure provider?
- How do you handle state-specific call recording consent requirements?
- Can you provide audit logs showing all PHI access and modifications?
- What happens to our data if we terminate the contract?
For a broader evaluation framework covering compliance alongside scheduling, EHR integration, and ROI, see the full Buyer's Guide.
The bottom line
HIPAA compliance for AI voice agents in healthcare isn't rocket science, but it's rigorous. Enterprise-grade encryption, access controls, audit trails, retention policies, breach procedures, and a signed BAA. That's the list. Vendors who built their platform for healthcare from day one will have all of it. Vendors who bolted healthcare onto a generic voice product may have gaps they haven't discovered yet.
Your patients trust your practice with their health information. That trust extends to every system that touches their data, including the AI that picks up the phone when they call. Verify everything. Assume nothing.
Frequently Asked Questions
Is voice AI HIPAA compliant?
Not automatically. HIPAA compliance requires a signed Business Associate Agreement, enterprise-grade encryption in transit and at rest, role-based access controls, comprehensive audit trails, documented retention and deletion policies, and breach notification procedures. Always verify compliance with specific documentation.
Do I need a BAA with my voice AI vendor?
Yes. Any voice AI vendor processing protected health information on behalf of your practice is a business associate under HIPAA. A signed BAA is a federal requirement, not optional.
What's the difference between SOC 2 Type I and SOC 2 Type II?
SOC 2 Type I verifies that security controls exist at a single point in time. SOC 2 Type II verifies those controls have operated effectively over a sustained period (typically 6 to 12 months). Type II provides significantly stronger assurance that security practices are consistent and reliable.
Can voice AI record patient calls under HIPAA?
Call recording is permissible under HIPAA with appropriate safeguards, but state laws on consent vary. Some states require all-party consent. Your voice AI must inform patients that the call may be recorded and handle consent based on applicable state law, not just your practice's location.
What happens to patient data if I stop using a voice AI vendor?
Your BAA should specify data handling upon termination, including timelines for PHI deletion and certification that data has been purged. Negotiate this provision before signing, not when you've already decided to leave.

Sami scaled Simple Online Healthcare to $150M and built a multi-specialty telehealth clinic across 20 specialties and all 50 states.






