Microsoft Copilot Privacy Impact Assessment Template Guidance

Organizations adopting Copilot for Microsoft 365 must complete a Privacy Impact Assessment (PIA) to meet compliance requirements and understand data flows. A PIA documents how Copilot processes prompts, retrieves data through the Microsoft Graph, and generates responses from your tenant content. Without a structured assessment, organizations risk exposing sensitive information or violating data protection regulations such as GDPR or HIPAA. This article provides a PIA template structure and explains each section so you can document your Copilot deployment accurately.

Key Takeaways: Copilot PIA Template Sections

  • Microsoft 365 admin center > Copilot > Data sources: Controls which Microsoft Graph data Copilot can read for grounded responses.
  • Microsoft Purview compliance portal > Data Classification > Sensitive info types: Defines what content Copilot must not access or process.
  • Azure Active Directory > Conditional Access > Copilot app: Restricts which users and devices can interact with Copilot in your tenant.


What a Privacy Impact Assessment Must Cover for Copilot

A PIA for Copilot documents how personal data flows through the system and what controls exist to protect it. Copilot does not use your data to train foundation models. It retrieves data only from your Microsoft 365 tenant through the Microsoft Graph. This means the data Copilot accesses is the same data users already have permission to see. The PIA must confirm that access permissions are correctly set and that no sensitive data is exposed to unauthorized users.

The assessment also covers data residency, encryption in transit and at rest, and logging of Copilot interactions. Microsoft provides audit logs in the Microsoft 365 admin center and the Purview compliance portal. These logs record which user sent a prompt, which data sources were queried, and what response was generated. The PIA must state how long logs are retained and who can review them.

Data Flow Diagram for Copilot

Include a diagram showing the data path from the user prompt to the response. The prompt travels from the Copilot interface to the Copilot service, which sends a request to the Microsoft Graph. The Graph retrieves relevant data from Exchange, SharePoint, OneDrive, Teams, and other workloads. The Copilot service then generates a response and sends it back to the user. No data leaves the Microsoft 365 boundary. The PIA must show that data never passes through third-party servers or is stored outside your tenant.

Structure of the Copilot PIA Template

Use the following sections to build your Copilot PIA. Each section addresses a specific compliance requirement. Modify the template to match your organization’s data protection policies.

Section 1: System Description

Describe Copilot and its purpose in your organization. State which Microsoft 365 workloads Copilot accesses. List the user roles that have Copilot licenses. Mention the Copilot version and whether you use Copilot for Microsoft 365, Copilot Pro, or both.

Section 2: Data Collection and Processing

List all data types that Copilot processes. This includes user prompts, email content, documents, meeting transcripts, calendar items, and chat messages. Explain that Copilot does not store prompts or responses outside the tenant. State that data is processed in real time and not used for model training.

Section 3: Data Access Controls

Document how access permissions are managed. Copilot uses existing Microsoft 365 permissions. If a user cannot view a document in SharePoint, Copilot cannot access it either. Describe your Conditional Access policies for the Copilot app. Mention any sensitivity labels that block Copilot from processing certain content.
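The permission-trimmed access model above can be illustrated with a minimal sketch. All names here are hypothetical for illustration; this is not a real Microsoft Graph API, only a model of the rule that Copilot can ground a response only in content the requesting user can already open.

```python
# Hypothetical sketch of permission-trimmed retrieval: a user only gets
# grounding data they could already open themselves. Document and user
# names are illustrative, not a real Microsoft Graph call.

def visible_documents(user, documents, acl):
    """Return only the documents the user is already permitted to read."""
    return [doc for doc in documents if user in acl.get(doc, set())]

documents = ["budget.xlsx", "roadmap.docx", "hr-salaries.xlsx"]
acl = {
    "budget.xlsx": {"alice", "bob"},
    "roadmap.docx": {"alice"},
    "hr-salaries.xlsx": {"hr-team"},
}

# Grounding for alice excludes the HR file she cannot open in SharePoint.
print(visible_documents("alice", documents, acl))  # ['budget.xlsx', 'roadmap.docx']
```

The same logic is why the PIA's main access-control task is verifying existing SharePoint and OneDrive permissions rather than configuring Copilot-specific ones.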

Section 4: Data Residency and Storage

State where your tenant data is stored geographically. Confirm that Copilot does not move data to a different region. Mention the encryption standards used: TLS 1.2 for data in transit and AES-256 for data at rest. Include the data retention policy for Copilot audit logs.

Section 5: Third-Party Data Sharing

Confirm that Copilot does not share data with any third-party services unless you have enabled a plugin. If plugins are used, list them and describe what data each plugin receives. State that no data is shared with OpenAI or any external AI model provider.

Section 6: Incident Response and Breach Notification

Describe how your organization handles a data breach involving Copilot. Reference your existing incident response plan. State that Microsoft 365 audit logs will be the primary source for investigating Copilot-related incidents. Include contact information for your data protection officer.


Completing the PIA Template: Step-by-Step Instructions

  1. Gather prerequisite documentation
    Collect your Microsoft 365 tenant architecture diagram, user license report, and existing data protection impact assessment framework. You will need these to fill in the template sections accurately.
  2. Open the Microsoft 365 admin center
    Go to Admin centers > Microsoft 365 admin center. Navigate to Settings > Org settings > Copilot. Review the data source settings and note which workloads Copilot is allowed to access.
  3. Review Copilot audit logs
    In the Microsoft Purview compliance portal, go to Audit > Search audit log. Filter by Workload: Copilot. Export a sample log to understand what data is recorded. Include the log retention period in the PIA.
  4. Document Conditional Access policies
    In the Azure AD admin center, go to Security > Conditional Access > Policies. Locate any policy that targets the Copilot app. Record the policy name, conditions, and grant controls in the PIA.
  5. List all active Copilot plugins
    In the Copilot interface, open the Plugins menu. Write down each plugin name and its publisher. For each plugin, note what data it receives. If a plugin is third-party, include a data processing agreement reference.
  6. Complete the risk assessment table
    Create a table with columns: Risk Description, Likelihood, Impact, Mitigation. Example risk: Unauthorized access to sensitive data via Copilot. Mitigation: Sensitivity labels block Copilot from processing labeled content. Assign a risk score and document the mitigation.
  7. Obtain sign-off
    Send the completed PIA to your data protection officer, legal team, and security team for review. Store the signed document in your compliance repository. Update the PIA whenever you change Copilot settings or add new workloads.

Common Mistakes When Completing the Copilot PIA

Assuming Copilot Has Its Own Data Storage

Some assessors write that Copilot stores prompts in a separate database. This is incorrect. Copilot processes data in memory and does not persist prompts or responses outside the tenant. The PIA must state that data is ephemeral and only logged for audit purposes.

Omitting Plugin Data Flows

If your organization uses third-party plugins like Jira or ServiceNow, the PIA must describe the data flow to those services. Each plugin can send prompt data to its own server. Document the plugin, the data sent, and the data processing agreement in place.

Not Updating the PIA After Configuration Changes

When you enable a new workload or change data source settings, the PIA becomes outdated. Set a recurring review cycle every six months or after any major Copilot update. Assign a responsible person to track changes and update the document.
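The six-month review cycle can be tracked with a trivial date calculation. The 182-day approximation of six months is an assumption; adjust it to your compliance calendar.

```python
# Compute the next PIA review date from the last review, following the
# recurring six-month cycle recommended above. The 182-day value is an
# approximation of six months; adjust to your compliance calendar.
from datetime import date, timedelta

def next_review(last_review, cycle_days=182):
    """Return the due date for the next scheduled PIA review."""
    return last_review + timedelta(days=cycle_days)

print(next_review(date(2024, 1, 15)))  # 2024-07-15
```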

Copilot PIA Data Flow Comparison

| Item | Without Plugins | With Third-Party Plugins |
| --- | --- | --- |
| Data source | Microsoft Graph only | Microsoft Graph plus plugin APIs |
| Data storage | No persistent storage | Plugin may store data on its own servers |
| Data sharing | None | Data shared with plugin publisher |
| Audit logging | Microsoft 365 audit log | Microsoft 365 audit log plus plugin logs |
| Compliance scope | Microsoft 365 compliance controls | Must include plugin vendor compliance |

Use this table in your PIA to show the difference between a default Copilot deployment and one with plugins. It helps reviewers understand where additional data protection measures are needed.

You can now complete a Privacy Impact Assessment for Copilot using the template structure provided. Start by gathering your tenant documentation and reviewing the data source settings in the Microsoft 365 admin center. For a thorough assessment, include the data flow diagram and the risk mitigation table. Review the PIA every six months and after any configuration change to maintain compliance with data protection regulations.
