Healthcare organizations face strict rules when using AI tools that handle patient data. The Health Insurance Portability and Accountability Act (HIPAA) requires safeguards for electronic protected health information (ePHI). Microsoft Copilot can process clinical data only when deployed under a Business Associate Agreement (BAA). This article explains which Copilot configurations meet HIPAA requirements and how healthcare teams can use Copilot for clinical documentation, data summarization, and administrative tasks without violating compliance rules.
Key Takeaways: Copilot HIPAA Compliance in Healthcare
- Business Associate Agreement (BAA) with Microsoft: Required before any Copilot service can access ePHI. Without a BAA, Copilot must not receive patient data.
- Microsoft 365 E5 or G5 license with Copilot add-on: The only licensing path that includes the full compliance controls needed for HIPAA workloads.
- Customer-managed encryption keys and audit logging: Must be enabled to meet HIPAA access control and audit control standards.
How Copilot Handles ePHI Under HIPAA
Microsoft Copilot is not a single product. It includes Copilot in Microsoft 365, Copilot in Azure, and Copilot in Dynamics 365. Each service has different data handling policies. For HIPAA compliance, the key requirement is that Microsoft signs a BAA with your organization. The BAA contractually binds Microsoft to safeguard ePHI and limits how Microsoft can use the data.
When a BAA is in place, Copilot can process patient data that resides in Microsoft 365 services such as Exchange Online, SharePoint Online, and Teams. The Copilot system grounds its responses on your tenant data. It does not train its base models on your ePHI. Microsoft states that Copilot for Microsoft 365 uses your data only to generate responses and does not store that data for model improvement.
However, Copilot features that connect to public web search or third-party plugins are not HIPAA eligible. You must disable these features in the Copilot admin settings. The Copilot pane in Microsoft 365 apps like Word and Teams uses the Microsoft Graph to surface data. If your tenant contains ePHI, the Graph search must be scoped to specific sites and libraries that contain only de-identified data or that are covered under the BAA.
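One way to reason about this scoping rule is as an allowlist: a grounding source is acceptable only if it is a site you have explicitly reviewed as de-identified or BAA-covered. The sketch below illustrates that logic in Python; the site URLs and allowlist are hypothetical, and in a real tenant scoping is done through the admin center rather than application code.

```python
# Hypothetical allowlist check: may this SharePoint site be used as a
# Copilot grounding source? Site paths below are illustrative examples,
# not real tenant locations.
from urllib.parse import urlparse

BAA_COVERED_SITES = {
    "/sites/clinical-deid",      # de-identified research data only
    "/sites/admin-operations",   # administrative content, no ePHI
}

def is_allowed_grounding_source(site_url: str) -> bool:
    """Return True only if the site path is explicitly allowlisted."""
    path = urlparse(site_url).path.rstrip("/").lower()
    return path in BAA_COVERED_SITES

print(is_allowed_grounding_source("https://contoso.sharepoint.com/sites/clinical-deid"))   # True
print(is_allowed_grounding_source("https://contoso.sharepoint.com/sites/patient-records")) # False
```

The deny-by-default shape matters: any site not positively reviewed stays out of scope, which mirrors how the BAA boundary should be enforced.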
Data Residency and Processing Location
Microsoft offers data residency commitments for HIPAA customers. You can choose the region where your Copilot data is stored and processed. The Microsoft 365 admin center shows the default data location for your tenant. For healthcare organizations in the United States, Microsoft typically stores data in US data centers. You can verify this in the admin center under Settings > Org Settings > Organization Profile > Data Location.
Audit Logging Requirements
HIPAA requires audit controls that record who accessed ePHI and when. Copilot for Microsoft 365 generates audit events in the Microsoft Purview compliance portal. You must enable audit logging for Exchange, SharePoint, and Teams. The audit log captures Copilot interactions that involve ePHI, including the prompt text and the documents Copilot referenced. These logs must be retained for at least six years to meet HIPAA documentation requirements.
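The six-year floor is easy to check mechanically once you know each workload's configured retention. The sketch below assumes simplified policy dictionaries with a `retention_days` field; the real values come from your Purview audit retention policies, and the policy names here are invented.

```python
# Illustrative check that audit-log retention meets HIPAA's six-year
# documentation requirement. Policy dicts are simplified stand-ins for
# what Purview audit retention policies actually expose.
from datetime import timedelta

HIPAA_MIN_RETENTION = timedelta(days=6 * 365)  # six years, ignoring leap days

def meets_hipaa_retention(policy: dict) -> bool:
    """policy["retention_days"] is the configured audit retention in days."""
    return timedelta(days=policy["retention_days"]) >= HIPAA_MIN_RETENTION

policies = [
    {"name": "ExchangeAudit", "retention_days": 2555},  # roughly seven years
    {"name": "TeamsAudit", "retention_days": 365},      # a common default
]
non_compliant = [p["name"] for p in policies if not meets_hipaa_retention(p)]
print(non_compliant)  # ['TeamsAudit']
```

A workload left at a short default retention is exactly the kind of gap this check surfaces before an audit does.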
Steps to Configure Copilot for HIPAA Compliance
Before deploying Copilot in a healthcare environment, you must complete several configuration steps. Skip any step and your Copilot deployment may violate HIPAA rules.
- Sign a Business Associate Agreement with Microsoft: Contact your Microsoft account representative or licensing partner and request a BAA amendment for your Enterprise Agreement or Microsoft Customer Agreement. The BAA must explicitly cover Copilot for Microsoft 365. Do not proceed until the signed BAA is in place.
- Assign the correct licenses: Each user who will access Copilot needs a Microsoft 365 E5 or G5 license plus the Copilot for Microsoft 365 add-on. Lower-tier licenses such as E3 or Business Premium do not include the compliance features required for HIPAA workloads. Verify license assignments in the Microsoft 365 admin center under Billing > Licenses.
- Disable web search and third-party plugins: In the Microsoft 365 admin center, go to Settings > Org Settings > Copilot. Turn off the option Allow Copilot to use web search and disable all third-party plugins. These features send data to external services that are not covered by your BAA.
- Enable customer-managed encryption keys: Use Microsoft Purview Customer Key to supply your own encryption keys for Copilot data, supporting the HIPAA encryption and access control safeguards. Set this up in the Purview compliance portal under Data Lifecycle Management > Microsoft 365 Customer Key.
- Configure audit logging and retention: In the Purview compliance portal, go to Audit > Audit log and enable auditing for Exchange, SharePoint, and Teams. Set the audit log retention period to at least six years, and create a retention policy for the audit logs under Data Lifecycle Management > Retention.
- Restrict data access with sensitivity labels: Apply Microsoft Purview sensitivity labels to documents that contain ePHI and configure Copilot to respect them. In the Microsoft 365 admin center under Settings > Org Settings > Copilot, enable the option Respect sensitivity labels when grounding responses. This prevents Copilot from surfacing ePHI from labeled documents to unauthorized users.
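The license verification step above can be scripted against data shaped like the Microsoft Graph licenseDetails response. The SKU part numbers below are assumptions for illustration; confirm the exact strings in your own tenant before relying on them.

```python
# Sketch of the license check: every Copilot user needs an E5/G5 base
# license plus the Copilot add-on. SKU part numbers are assumed values,
# not verified constants; check them against your tenant's subscribedSkus.
BASE_SKUS = {"SPE_E5", "SPE_G5"}        # assumed E5 / G5 part numbers
COPILOT_SKU = "Microsoft_365_Copilot"   # assumed Copilot add-on part number

def copilot_license_gaps(user_skus: dict[str, set[str]]) -> list[str]:
    """Return users missing either an E5/G5 base license or the Copilot add-on."""
    return [
        user for user, skus in user_skus.items()
        if not (skus & BASE_SKUS and COPILOT_SKU in skus)
    ]

tenant = {
    "dr.lee@contoso.com":    {"SPE_E5", "Microsoft_365_Copilot"},
    "nurse.kim@contoso.com": {"SPE_E3", "Microsoft_365_Copilot"},  # wrong base tier
}
print(copilot_license_gaps(tenant))  # ['nurse.kim@contoso.com']
```

Running a check like this before rollout catches users who were granted the add-on without the compliant base license.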
Common Compliance Mistakes with Copilot in Healthcare
Copilot Accesses Patient Data Without a Signed BAA
The most frequent violation occurs when an organization deploys Copilot before the BAA is signed. Copilot can read any data in the user’s mailbox and documents. If that data includes ePHI and no BAA exists, the organization is in violation of HIPAA. Always verify the BAA status in the Microsoft 365 admin center under Settings > Org Settings > Services & add-ins > Microsoft Copilot. The page shows whether a BAA is active.
Users Enable Web Search in Copilot
When a user asks Copilot a clinical question, Copilot may send the prompt to Bing web search if the feature is enabled. This sends ePHI to Microsoft’s public search service, which is not covered by the BAA. The fix is to disable web search globally in the admin center and also block it at the user level using Conditional Access policies.
Copilot Returns ePHI to Unauthorized Users
If sensitivity labels are not applied, Copilot may surface ePHI from a SharePoint document to a user who does not have the correct permissions. For example, a nurse could ask Copilot to summarize a patient’s chart and receive data from a document they should not see. Apply sensitivity labels to all ePHI documents and enable label-aware grounding in Copilot settings.
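The fix amounts to a filter applied before any document reaches the response generator: drop every item whose label the requesting user is not cleared for. The sketch below is a hypothetical model of that behavior; the label names and clearance sets are invented, and the real enforcement happens inside Copilot's label-aware grounding, not in your code.

```python
# Illustrative label-aware filter: before documents are handed to a
# summarizer, drop any item whose sensitivity label the requesting user
# is not cleared for. Labels and clearances here are hypothetical.
def filter_by_label(docs: list[dict], user_clearances: set[str]) -> list[dict]:
    """Keep only documents whose sensitivity label the user may read."""
    return [d for d in docs if d["label"] in user_clearances]

docs = [
    {"name": "clinic-schedule.docx", "label": "General"},
    {"name": "patient-chart-1042.docx", "label": "ePHI-Restricted"},
]
# A user cleared only for "General" never sees the restricted chart.
visible = filter_by_label(docs, {"General"})
print([d["name"] for d in visible])  # ['clinic-schedule.docx']
```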
Copilot in Microsoft 365 vs Copilot in Azure for Healthcare
| Item | Copilot in Microsoft 365 | Copilot in Azure |
|---|---|---|
| Primary use case | Clinical documentation, email summaries, meeting notes | Custom AI models for clinical decision support, medical imaging |
| Data grounding | Microsoft Graph data from Exchange, SharePoint, Teams | Azure data sources including Azure SQL, Cosmos DB, custom APIs |
| BAA availability | Included with Microsoft 365 E5/G5 and Copilot add-on | Included with Azure Enterprise Agreement and HIPAA-enrolled subscription |
| Model training | Microsoft does not train models on tenant data | Customer can choose to train models on their data using Azure AI Studio |
| Plugin support | Third-party plugins must be disabled for HIPAA | Custom plugins can be built within Azure environment and controlled |
| Audit logging | Microsoft 365 Purview audit log | Azure Monitor and Azure Activity Log |
| Encryption control | Customer Key via Purview | Azure Key Vault with customer-managed keys |
For most healthcare administrative workflows, Copilot in Microsoft 365 is the correct choice. For custom clinical AI applications that require fine-tuned models on patient data, Copilot in Azure offers more flexibility but requires more technical setup to maintain HIPAA compliance.
You can now assess whether your current Microsoft tenant meets the HIPAA requirements for Copilot. Start by confirming your BAA status and disabling web search. Next, configure sensitivity labels and audit logging. For advanced protection, enable Customer Key encryption. If you manage a large healthcare system, consider using Copilot in Azure for custom clinical models while keeping Copilot in Microsoft 365 for administrative tasks. The most important action is to run a compliance assessment in the Microsoft Purview compliance portal before allowing any user to access Copilot with patient data.
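The pre-deployment checks above can be folded into a single readiness gate: every control must be in place before any user gets Copilot access to tenant data containing ePHI. The control names in this sketch are illustrative labels for the steps in this article, not fields from any real API.

```python
# Readiness gate over the controls described in this article. Field
# names are illustrative; populate the state dict from your own
# configuration review, not from an API this sketch invents.
REQUIRED_CONTROLS = [
    "baa_signed",
    "web_search_disabled",
    "plugins_disabled",
    "customer_key_enabled",
    "audit_logging_enabled",
    "sensitivity_labels_applied",
]

def copilot_ready(state: dict) -> tuple[bool, list[str]]:
    """Return (ready, list of missing controls) for a tenant configuration."""
    missing = [c for c in REQUIRED_CONTROLS if not state.get(c, False)]
    return (not missing, missing)

state = {c: True for c in REQUIRED_CONTROLS}
state["web_search_disabled"] = False  # one gap blocks the whole rollout
print(copilot_ready(state))  # (False, ['web_search_disabled'])
```

Treating the controls as all-or-nothing reflects the compliance reality: a single unchecked item, such as web search left enabled, is enough to put ePHI outside the BAA.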