How to Detect Sensitive Data Leaks Through Copilot Prompts

When business users paste confidential information into Copilot prompts, that data may be processed outside your organization’s security boundaries. This risk grows as employees use Copilot to summarize contracts, draft emails with customer PII, or analyze internal financial reports. The cause is often a lack of awareness about what Copilot sends to Microsoft’s AI services and how it is logged. This article explains how to audit prompt logs, identify exposed sensitive data, and configure controls to prevent leaks before they happen.

Key Takeaways: Detecting and Preventing Data Leaks in Copilot Prompts

  • Microsoft 365 admin center > Audit > Search audit log: Review Copilot interaction logs to identify prompts containing sensitive terms like credit card numbers or social security numbers.
  • Microsoft Purview > Data Loss Prevention > Policies: Create DLP rules that block or warn when users paste sensitive data into Copilot prompts.
  • Microsoft 365 admin center > Settings > Copilot > Data security: Enable tenant-level controls to restrict which data Copilot can access and how prompts are stored.

How Copilot Processes Prompts and Where Data Can Leak

When a user types a prompt in Copilot, the text is sent to Microsoft’s AI infrastructure for processing. The prompt may include data from the current document, email, or chat context. If the user manually pastes sensitive information such as a customer’s home address, a trade secret, or a financial account number, that content becomes part of the AI request. Microsoft logs these interactions for service improvement and security auditing unless your tenant is configured otherwise.

The data leak risk is not about Copilot intentionally exposing secrets to other users. Instead, it is about three specific vectors:

Prompt Content Stored in Audit Logs

Every Copilot interaction is recorded in the Microsoft 365 audit log. The log contains the full prompt text and the AI’s response. If an administrator or attacker gains read access to these logs, they can see every piece of sensitive data users typed. This is the most common leak path because audit logs are often broadly accessible.

Data Sent to Microsoft’s AI Service

Prompt data is transmitted to Microsoft’s Azure OpenAI endpoints. While Microsoft states that data is not retained for training in commercial tenants, the data is processed in Microsoft’s data centers. If your organization has strict data residency requirements, prompts containing sensitive data may violate compliance policies.

Contextual Data Automatically Included

Copilot can pull context from the current file, email thread, or Teams conversation. A user may not realize that their prompt includes hidden sensitive data from the surrounding content. For example, a prompt like “Summarize this contract” may include the contract’s pricing terms and client names even if the user did not type them explicitly.

Steps to Audit Copilot Prompts for Sensitive Data

To detect leaks, you must review the actual prompt text logged in your tenant. The following steps guide you through searching the audit log, filtering for Copilot events, and identifying sensitive content.

  1. Open the Microsoft 365 audit log
    Sign in to the Microsoft 365 admin center at admin.microsoft.com and go to Compliance > Audit; the Compliance link opens the Microsoft Purview portal, where the audit search runs. You need an audit-viewing role such as View-Only Audit Logs, or Global Administrator rights. If audit logging is not enabled, turn it on from the Audit page.
  2. Filter for Copilot interactions
    In the audit log search, set the Activities filter to Copilot interaction. This returns all prompts and responses generated by Copilot across Word, Excel, PowerPoint, Teams, and Outlook. Set a date range that covers the period you want to review, such as the last 30 days.
  3. Export the audit records
    Click Search to run the query. When results appear, click Export all results to download a CSV file. This file contains columns for CreationTime, UserId, Operation, and AuditData. The AuditData column holds the full JSON record including the prompt text.
  4. Parse the JSON to extract prompt text
    Open the CSV in a tool like Excel or PowerShell. Extract the AuditData JSON field. Look for the Prompt property inside the JSON. This contains the exact text the user typed. If you see Context property, it includes any data Copilot automatically pulled from the file or conversation.
  5. Search for sensitive patterns
    Use a text search or script to find patterns such as credit card numbers, Social Security numbers, email addresses, or internal project codes. For example, search for regex patterns such as \d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4} for formatted credit card numbers or \d{3}-\d{2}-\d{4} for US SSNs. These simple patterns miss unformatted numbers and can produce false positives, so treat matches as leads and flag any matching records for investigation.
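
Steps 3 through 5 can be sketched as one small script. Python is used here as an alternative to Excel or PowerShell; the AuditData, Prompt, UserId, and CreationTime field names follow the export described above, and the simple regexes only catch formatted numbers, so treat this as a starting point rather than a complete scanner.

```python
import csv
import json
import re

# Simple patterns for formatted sensitive data. A production scan should
# use broader patterns (or Purview sensitive info types) to reduce misses.
PATTERNS = {
    "credit_card": re.compile(r"\b\d{4}-\d{4}-\d{4}-\d{4}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_export(csv_path):
    """Yield (user, time, label, match) for every sensitive pattern found
    in the Prompt field of an exported Copilot audit CSV."""
    with open(csv_path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            try:
                audit = json.loads(row.get("AuditData") or "{}")
            except json.JSONDecodeError:
                continue  # skip malformed or truncated records
            prompt = audit.get("Prompt", "")
            for label, pattern in PATTERNS.items():
                for match in pattern.findall(prompt):
                    yield row.get("UserId"), row.get("CreationTime"), label, match
```

Each hit carries the user and timestamp, so flagged records can be traced back to the original audit entry for investigation.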

Common Detection Failures and How to Avoid Them

Audit Logs Are Not Enabled

If audit logging is off, no Copilot prompt data is recorded. Go to the Microsoft 365 admin center, navigate to Compliance > Audit, and click Start recording user and admin activity. This enables logging for all Microsoft 365 services including Copilot. Without this step, you cannot detect leaks through logs.

Prompt Text Is Truncated in the CSV Export

The AuditData JSON can be long, and Excel truncates cell content beyond its 32,767-character limit. Use PowerShell or a JSON parser to extract the full prompt. For example, this PowerShell pipeline reads the CSV and adds a Prompt column parsed from AuditData: Import-Csv audit.csv | ForEach-Object { $_ | Add-Member -NotePropertyName Prompt -NotePropertyValue ($_.AuditData | ConvertFrom-Json).Prompt -PassThru }. This ensures you see the complete prompt text.
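
If you prefer to avoid spreadsheets entirely, a short Python pass can re-export the full prompt text as JSON Lines, which has no cell-size limit. This is a minimal sketch assuming the AuditData, CreationTime, and UserId columns described above:

```python
import csv
import json

def dump_full_prompts(csv_path, out_path):
    """Re-export full prompt text from an audit CSV as JSON Lines,
    so no spreadsheet cell limit can truncate it."""
    with open(csv_path, newline="", encoding="utf-8-sig") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for row in csv.DictReader(src):
            audit = json.loads(row["AuditData"])
            dst.write(json.dumps({
                "time": row["CreationTime"],
                "user": row["UserId"],
                "prompt": audit.get("Prompt", ""),
            }) + "\n")
```

The resulting .jsonl file can then be searched with grep or fed to the pattern-scanning step without any truncation risk.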

Users Paste Data from External Sources

A user may copy sensitive data from a non-Microsoft app like a CRM or a PDF reader and paste it into Copilot. The audit log captures this pasted text. However, the log does not show the source application. To confirm the leak source, interview the user or cross-reference the prompt timestamp with other app logs.

Copilot Data Security Controls vs Standard Microsoft 365 DLP

| Item | Copilot Data Security Controls | Standard Microsoft 365 DLP |
| --- | --- | --- |
| Scope | Controls what data Copilot can access and how prompts are processed | Monitors and blocks sensitive data across all Microsoft 365 apps and services |
| Configuration location | Microsoft 365 admin center > Settings > Copilot > Data security | Microsoft Purview > Data Loss Prevention > Policies |
| Key feature | Restrict Copilot from reading specific SharePoint sites or OneDrive folders | Block sharing of credit card numbers, SSNs, or custom sensitive info types via email or file sharing |
| Prompt blocking | Cannot block specific prompt content; only restricts data sources | Can block or warn when users paste sensitive data into any app including Copilot |
| Audit capability | Logs all interactions but does not classify sensitive data | Classifies sensitive data in audit logs and provides alerts |

Use Copilot data security controls to limit which data the AI can read. Use standard DLP policies to detect and block sensitive content in prompts. The two tools work together. For example, you can set a DLP rule that triggers when a user pastes a credit card number into Copilot, and also restrict Copilot from reading your finance SharePoint site.

What to Do When You Find a Sensitive Data Leak

If your audit log review reveals a prompt containing sensitive data, take these actions immediately.

  1. Determine whether the data was exposed only in the audit log or also in Copilot’s response. If the response was shared with other users, the data may have reached unintended recipients.
  2. Revoke access to the audit log for non-essential users to limit further exposure.
  3. Notify your security team and follow your organization’s incident response plan.
  4. Configure a DLP policy that blocks similar patterns in the future. For example, create a DLP rule that detects Social Security numbers in Copilot prompts and warns the user before the prompt is sent.
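
The first triage step can be sketched as a small check over a flagged record. This assumes a Prompt field as described above plus a hypothetical Response field holding the AI's reply; adjust the field names to match the AuditData schema in your tenant's export:

```python
import json

def exposure_scope(audit_json, sensitive_value):
    """Classify a flagged audit record: was the sensitive value only in
    the prompt, or also echoed in Copilot's response (and therefore
    possibly shown to other users)?

    'Response' is a hypothetical field name; verify it against your
    tenant's actual AuditData schema before relying on this check."""
    record = json.loads(audit_json)
    in_prompt = sensitive_value in record.get("Prompt", "")
    in_response = sensitive_value in record.get("Response", "")
    if in_response:
        return "prompt-and-response"  # may have reached other users
    if in_prompt:
        return "prompt-only"          # confined to the audit log
    return "not-found"
```

Records classified as prompt-and-response warrant the full incident response path; prompt-only records are primarily an audit log access problem.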

You can now audit Copilot prompts for sensitive data by searching the Microsoft 365 audit log and parsing the JSON records. Next, configure DLP policies in Microsoft Purview to automatically detect and block sensitive content in future prompts. For advanced protection, enable Copilot data security controls to restrict which files the AI can read. Use the audit log export script regularly to catch leaks early.