Microsoft Copilot Insider Risk Management Signals Explained


Insider risk management in Microsoft 365 helps security teams detect, investigate, and respond to potential data leaks, malicious actions, or policy violations by employees. Copilot adds a new signal layer by analyzing user interactions with AI-powered tools such as Copilot in Word, Excel, and Teams. This article explains what Insider Risk Management signals are, how Copilot generates them, and how administrators can configure and interpret them. By the end, you will understand the data flow from Copilot queries to risk alerts and how to tune policies for your organization.

Key Takeaways: Insider Risk Management Signals from Copilot

  • Copilot interaction logs: Every query sent to Copilot across Microsoft 365 apps is recorded in the Microsoft Purview compliance portal as an audit event.
  • Signal-based risk scoring: The insider risk management engine uses Copilot activity signals to calculate a risk score based on policy rules and user baseline behavior.
  • Policy templates for Copilot: Prebuilt templates in Microsoft Purview target Copilot misuse scenarios such as data exfiltration, prompt injection, or access to sensitive content.

What Are Insider Risk Management Signals from Copilot?

Insider risk management signals are specific user actions that the Microsoft Purview compliance platform ingests and analyzes for risky behavior. With Copilot, these signals originate from the interaction between a user and the Copilot service across Microsoft 365 applications. Each time a user sends a prompt to Copilot, the system records metadata such as the application used, the content of the prompt, the data sources accessed, and the output returned.

The signal is not the raw prompt text itself. Instead, Copilot generates an audit event in Microsoft 365 that contains structured attributes: user identifier, timestamp, application ID, data sensitivity label of the accessed content, and a hash of the prompt for privacy. Microsoft Purview then applies its insider risk management policies to these signals. If a policy condition is met (for example, a user sends a prompt that references a document labeled “Highly Confidential” and then pastes the output into an external messaging app), the system creates an alert for the security team.
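As a rough illustration, the structured attributes described above might look like the following. The field names and the choice of SHA-256 are assumptions for the sketch, not Microsoft's actual schema:

```python
import hashlib
from datetime import datetime, timezone

def make_copilot_signal(user_id, app_id, prompt, sensitivity_label):
    """Build an illustrative audit event: metadata plus a prompt hash,
    never the raw prompt text (field names are hypothetical)."""
    return {
        "user_id": user_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "app_id": app_id,
        "sensitivity_label": sensitivity_label,
        # Only a one-way hash of the prompt is kept, preserving privacy.
        "prompt_hash": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
    }

signal = make_copilot_signal(
    "alice@contoso.com", "Word",
    "Summarize the Q3 revenue forecast", "Highly Confidential")
print(signal["sensitivity_label"], signal["prompt_hash"][:16])
```

The point of the hash is that the event remains correlatable (the same prompt always yields the same hash) without the text ever being stored.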

Understanding these signals requires knowing the three layers of data collection:

Layer 1: Audit Log Generation

Every Copilot interaction generates an audit record in the Microsoft 365 audit log. This record includes the Workload field set to “Copilot,” the Operation field set to “CopilotInteraction,” and details such as the app name (for example, Word, Excel, Teams) and the type of interaction (prompt, response, or data source access). These audit logs are retained for 90 days by default and up to 10 years with an appropriate add-on license.

Layer 2: Insider Risk Policy Ingestion

The insider risk management engine in Microsoft Purview continuously scans the audit log for events that match active policies. For Copilot-specific policies, the engine looks for patterns such as repeated queries about sensitive financial data, queries made outside normal working hours, or queries that return content from a restricted SharePoint site. The engine does not store the full prompt text; it stores a hashed version and the sensitivity label of the data Copilot accessed.
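A hypothetical sketch of the kind of pattern matching described above might look like this. The event shape, working hours, and threshold are all assumptions for illustration, not the actual Purview engine logic:

```python
from datetime import datetime

WORK_HOURS = range(8, 18)  # assumed 08:00-18:00 working hours

def matches_policy(events, user_id, query_threshold=10):
    """Return the reasons (if any) a user's recent events match the
    patterns described above: high query volume, off-hours prompts,
    or access to a restricted source."""
    user_events = [e for e in events if e["user_id"] == user_id]
    reasons = []
    if len(user_events) >= query_threshold:
        reasons.append("repeated-queries")
    for e in user_events:
        if datetime.fromisoformat(e["timestamp"]).hour not in WORK_HOURS:
            reasons.append("off-hours")
            break
    if any(e.get("source_restricted") for e in user_events):
        reasons.append("restricted-source")
    return reasons

events = [
    {"user_id": "alice", "timestamp": "2024-05-01T02:30:00",
     "source_restricted": True},
]
print(matches_policy(events, "alice", query_threshold=1))
```

Note that the function only ever inspects metadata (timestamps, counts, source flags), mirroring the fact that the engine never sees the full prompt text.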

Layer 3: Risk Score Calculation

Each Copilot signal contributes to a user’s cumulative risk score. The score is calculated by comparing the user’s current activity to their historical baseline. For example, if a user who normally sends 5 Copilot prompts per day suddenly sends 50 prompts in one hour, the risk score increases. The score also factors in the sensitivity of the data involved. A prompt that accesses a document labeled “Top Secret” carries more weight than one accessing a public document.
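A minimal model of this scoring idea, using made-up label weights (Microsoft does not publish its actual scoring formula):

```python
# Assumed weights per sensitivity label -- illustrative only.
LABEL_WEIGHTS = {"Public": 1.0, "Confidential": 3.0, "Highly Confidential": 5.0}

def risk_score(current_rate, baseline_rate, labels):
    """Score = activity spike relative to the user's baseline, scaled by
    the most sensitive label touched (a simplified model of the text above)."""
    spike = current_rate / max(baseline_rate, 1)
    weight = max(LABEL_WEIGHTS.get(label, 1.0) for label in labels)
    return spike * weight

# A user who normally sends 5 prompts suddenly sends 50,
# touching a Highly Confidential document: 10x spike * 5.0 weight.
print(risk_score(50, 5, ["Public", "Highly Confidential"]))  # → 50.0
```

The same 10x spike against only public documents would score 10.0, which is why the sensitivity label on the accessed content matters as much as the volume.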

Configuring Insider Risk Policies for Copilot Signals

To start using Copilot signals in insider risk management, you must enable the Copilot audit feed and create or modify a policy in the Microsoft Purview compliance portal. The following steps assume you have the required roles: Insider Risk Management Admin or Global Admin.

Enable Copilot Audit Logging

  1. Sign in to Microsoft Purview
    Go to https://compliance.microsoft.com and sign in with your admin credentials.
  2. Navigate to Audit
    In the left navigation, select Audit under Solutions.
  3. Verify Copilot logging is on
    Under the Search tab, confirm that the audit log is enabled. If not, click Start recording user and admin activity.
  4. Search for Copilot events
    Use the Activity filter and search for “CopilotInteraction” to verify that events are appearing, which confirms the audit feed is working.
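Once events are flowing, you can also verify the feed programmatically against an exported audit log. The sketch below filters an audit CSV export for Copilot events; the column names (`Operations`, `UserIds`, `AuditData`) and the `AppHost` property follow the export format at the time of writing, so verify them against your own export:

```python
import csv
import io
import json

def copilot_events(csv_text):
    """Return only the CopilotInteraction rows from an audit log export."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r for r in rows if r.get("Operations") == "CopilotInteraction"]

# A two-row sample mimicking an audit export (values are made up).
sample = """CreationDate,UserIds,Operations,AuditData
2024-05-01T10:00:00,alice@contoso.com,CopilotInteraction,"{""AppHost"": ""Word""}"
2024-05-01T10:05:00,bob@contoso.com,FileDownloaded,"{}"
"""

for e in copilot_events(sample):
    # AuditData is a JSON blob embedded in the CSV cell.
    print(e["UserIds"], json.loads(e["AuditData"]).get("AppHost"))
```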

Create an Insider Risk Policy for Copilot

  1. Open Insider Risk Management
    In Microsoft Purview, select Insider Risk Management from the left navigation.
  2. Choose a policy template
    Click Policies then Create policy. Select the template named Data leaks by risky users or Security policy violations by risky users. Both support Copilot signals.
  3. Select Copilot as a trigger
    In the “Choose activities to score” section, check the box for Copilot interaction. You can also add other activities like Download from SharePoint or Send email with sensitive content.
  4. Define thresholds
    Set the risk score threshold for generating an alert. For initial testing, use the default “Medium” threshold. Adjust later based on false positive rates.
  5. Assign users and start the policy
    Select the user group to monitor. You can create a group of high-risk users or apply the policy to all users. Click Create to activate the policy.

Common Misconceptions and Limitations of Copilot Signals

Copilot Signals Are Not Real-Time

Insider risk management processes signals in near real time, but expect a delay of 5 to 15 minutes between a Copilot interaction and the appearance of an alert, because the audit log ingestion and risk scoring pipeline run on a batch cycle. Do not rely on these signals for immediate blocking of user actions. Use them for post-event investigation and trend analysis.

Signal Content Is Hashed, Not Plain Text

Microsoft Purview does not store the exact text of a Copilot prompt or response. Instead, it stores a cryptographic hash of the prompt. This means you cannot search alerts by exact keyword. You can, however, see the sensitivity label of the data Copilot accessed and the application used. For full content review, you must export the audit log and use a third-party tool or Microsoft’s eDiscovery features.
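Because only a hash is stored, keyword search is impossible, but an exact known string can still be matched by hashing it the same way. This sketch assumes SHA-256; Microsoft does not document the actual hash function used:

```python
import hashlib

def hash_prompt(text):
    """Hash a prompt the same way the (assumed) audit pipeline would."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hashes as they might appear in stored signals.
stored_hashes = {hash_prompt("export customer list to csv")}

# A partial keyword never matches -- only the exact original text does.
print(hash_prompt("customer list") in stored_hashes)                # False
print(hash_prompt("export customer list to csv") in stored_hashes)  # True
```

This is exactly the trade-off the design makes: exact-match correlation survives, substring search does not.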

Copilot Signals Do Not Cover All Apps

Currently, Copilot interaction signals are generated only for Microsoft 365 apps that have the Copilot integration enabled. These include Word, Excel, PowerPoint, Outlook, Teams, and the Microsoft 365 web interface. Copilot in Windows and Copilot in Edge do not generate insider risk signals. If your organization uses Copilot across non-Microsoft platforms, you need separate monitoring solutions.

False Positives Are Common in Early Deployment

When you first enable Copilot signals, users who are heavy early adopters may trigger many alerts simply because they use Copilot frequently. To reduce noise, configure the “User baseline” setting in the policy to require a deviation of at least 3 standard deviations from the user’s normal activity before scoring. Also, exclude known test accounts and IT admin accounts from monitoring.
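The 3-standard-deviation tuning described above amounts to a simple statistical filter, sketched here over a hypothetical history of daily prompt counts:

```python
import statistics

def deviates(history, today, sigmas=3.0):
    """True if today's prompt count sits at least `sigmas` standard
    deviations above the user's historical mean -- the baseline
    tuning described above."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return today >= mean + sigmas * stdev

history = [4, 5, 6, 5, 4, 6, 5]  # typical daily Copilot prompt counts

print(deviates(history, 50))  # heavy spike, well past 3 sigma → True
print(deviates(history, 7))   # mild uptick, within 3 sigma → False
```

A heavy early adopter whose *history* already shows high counts will have a high mean and will not trip this filter, which is precisely why baseline-relative scoring reduces the early-deployment noise.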

| Item | Copilot Interaction Signal | Standard Audit Event |
| --- | --- | --- |
| Description | Records a user’s prompt to Copilot and the data sources accessed | Records any user action in Microsoft 365, such as file download or email send |
| Data stored | Hashed prompt, sensitivity label, app name, timestamp | Action type, object name, user ID, timestamp |
| Risk scoring | Yes, by insider risk policies | Yes, by insider risk policies |
| Privacy protection | Prompt text is hashed, not stored in plain text | Object names and file paths are stored in plain text |
| Supported apps | Word, Excel, PowerPoint, Outlook, Teams, M365 web | All Microsoft 365 workloads |
| Alert delay | 5 to 15 minutes | 5 to 15 minutes |

Insider risk management signals from Copilot give security teams a new way to detect risky behavior tied to AI usage. The key takeaway is that these signals are metadata-based and privacy-preserving by design. They are best used as part of a layered detection strategy alongside traditional data loss prevention and user behavior analytics. To get started, enable Copilot audit logging in your tenant, create a pilot policy with a small group of users, and tune the thresholds based on the alert volume you observe. Regularly review the “Copilot interaction” activity in the insider risk dashboard to identify patterns that may indicate a data exfiltration attempt or policy violation.