How to Stop Copilot From Using Your Prompts for Model Training

When you use Microsoft Copilot, your prompts and the responses generated may be used by Microsoft to improve its AI models. This data processing happens by default for certain Copilot experiences, especially those connected to the consumer service rather than your organization’s tenant. If you want to prevent your prompts from being used for model training, you need to adjust specific settings in your Microsoft 365 environment or in your personal Microsoft account, depending on how you sign in. This article explains how Copilot handles your data, the exact steps to opt out, and the limitations of each method.

Key Takeaways: Opting Out of Copilot Model Training

  • Microsoft 365 admin center > Settings > Org settings > Copilot > Data controls: Toggle off “Allow Microsoft to use your data to improve AI models” to stop prompt processing for training.
  • Copilot for Microsoft 365 tenant boundary: Prompts processed within your tenant are not used for model training by default; only consumer Copilot and Bing Chat data may be used.
  • Microsoft Privacy Dashboard > Activity history > Copilot: View and delete past prompts that may have been used for training before you opted out.

Why Microsoft Uses Prompts for Model Training

Microsoft collects prompts and responses from Copilot to refine its large language models. This data helps improve accuracy, reduce harmful outputs, and adapt the models to real-world usage patterns. The collection applies to Copilot experiences that are not fully isolated within a Microsoft 365 tenant boundary.

For example, when you use Copilot through the consumer service at copilot.microsoft.com or through the Copilot app on Windows, your prompts may be sent to Microsoft’s central AI infrastructure for training. In contrast, Copilot for Microsoft 365 processes prompts within your organization’s tenant boundary, and those prompts are not used for model training by default. The key difference is whether the data leaves your tenant’s secure environment. Microsoft’s Data Protection Addendum for commercial customers explicitly states that prompts and responses from Copilot for Microsoft 365 are not used to train foundation models; the consumer Copilot service, however, operates under a different data handling policy.

Data Flow for Consumer Copilot

When you use the free Copilot service, your prompts travel to Microsoft’s cloud service outside your tenant. Microsoft may store these interactions for up to 30 days for safety monitoring and model improvement. During this period, your data can be used to retrain the underlying AI models. This practice is disclosed in Microsoft’s Privacy Statement under the section on AI and machine learning. If you are signed in with a personal Microsoft account, this policy applies. If you are signed in with a work or school account, the tenant-level controls take precedence.

Data Flow for Copilot for Microsoft 365

Copilot for Microsoft 365 processes your prompts entirely within your organization’s Microsoft 365 tenant. The data does not leave the tenant boundary for model training. Microsoft’s contractual commitments under the Data Protection Addendum ensure that prompts and responses are not used to train AI models. This means that for enterprise users, the default behavior already prevents model training on your prompts. However, if you also use the consumer Copilot service with the same account, you need to manage the opt-out settings separately.

Steps to Stop Copilot From Using Your Prompts for Model Training

The method you use depends on whether you are an end user with a personal Microsoft account, an IT administrator managing a tenant, or a user signed in with a work or school account. Follow the section that matches your scenario.

For IT Administrators: Disable Model Training at the Tenant Level

  1. Sign in to the Microsoft 365 admin center
    Go to admin.microsoft.com and sign in with an account that has Global Administrator or at least the AI Administrator role assigned.
  2. Navigate to Org settings
    In the left navigation pane, select Settings and then Org settings.
  3. Open Copilot settings
    On the Services tab, find and select Copilot. If you do not see this entry, your tenant may not have Copilot for Microsoft 365 enabled.
  4. Go to Data controls
    In the Copilot settings page, select the Data controls tab.
  5. Toggle off model training
    Find the setting labeled Allow Microsoft to use your data to improve AI models. Set the toggle to Off. This prevents Microsoft from using any prompts and responses from your tenant for model training.
  6. Save changes
    Click Save at the bottom of the page. The setting takes effect immediately for all users in the tenant.

For End Users: Disable Model Training on Consumer Copilot

If you use Copilot with a personal Microsoft account, you can opt out through the Microsoft Privacy Dashboard.

  1. Open the Microsoft Privacy Dashboard
    Go to account.microsoft.com/privacy and sign in with your personal Microsoft account.
  2. Select Activity history
    On the Privacy Dashboard, click Activity history from the left menu.
  3. Find Copilot activity
    Scroll down to the Copilot section. You will see a list of your recent prompts and interactions.
  4. Clear your Copilot activity
    Click Clear all Copilot activity. This deletes your stored prompts and interactions. Note that clearing history only removes past data; it does not opt you out of future collection, which is controlled by the data sharing setting in the next step.
  5. Adjust data sharing settings
    On the same page, click Manage your data sharing preferences. Under the Copilot section, set the toggle for Improve AI models to Off.

For Users Signed In with a Work or School Account

If you are signed in to Copilot with a work or school account, the tenant-level setting from the admin center controls data usage, so you do not need to take individual action. However, if you also use the consumer Copilot service with a personal account, you must manage the consumer settings separately using the Privacy Dashboard steps above.

If Copilot Still Uses Prompts After Opting Out

After changing the settings, you may still see references to your prompts being used for training. This can happen for several reasons.

Copilot Prompts Still Appear in Activity History

Clearing your activity history removes past data but does not prevent future collection. You must also toggle off the Improve AI models setting in the Privacy Dashboard. If the toggle is on, new prompts will continue to be used for training. Verify that the toggle is set to Off after clearing history.

Tenant-Level Toggle Does Not Affect Consumer Copilot

The admin center setting only applies to Copilot for Microsoft 365 within the tenant. If users in your organization also use the free Copilot service at copilot.microsoft.com while signed in with a personal account, those prompts are not covered by the tenant toggle. Instruct users to manage their personal Microsoft account settings separately.

Copilot in Windows Shows Different Behavior

The Copilot button in Windows 11 may use the consumer service or the work account depending on the sign-in state. If you are signed in with a personal account, the consumer data policy applies. To avoid this, sign out of your personal account in Windows and use only your work account for Copilot. You can also disable the Copilot button entirely through Group Policy: User Configuration > Administrative Templates > Windows Components > Windows Copilot > Turn off Windows Copilot.
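If Group Policy is not available (for example, on Windows 11 Home), the same policy can be applied directly in the registry. This is a minimal sketch that sets the `TurnOffWindowsCopilot` value, the per-user registry value behind the Group Policy setting above; run it in a Command Prompt as the affected user:

```shell
REM Apply the "Turn off Windows Copilot" policy for the current user.
reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot" /v TurnOffWindowsCopilot /t REG_DWORD /d 1 /f

REM Confirm the value was written.
reg query "HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot" /v TurnOffWindowsCopilot
```

A sign-out or restart may be needed before the Copilot button disappears. Note that this hides Copilot on the device; it is not a substitute for the data-sharing opt-outs described earlier.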

Copilot for Microsoft 365 vs Consumer Copilot: Data Handling Differences

| Item | Copilot for Microsoft 365 | Consumer Copilot |
| --- | --- | --- |
| Data processing location | Within tenant boundary | Microsoft cloud service outside tenant |
| Default model training usage | Not used for training | Used for training by default |
| Opt-out method | Admin center > Org settings > Copilot > Data controls | Privacy Dashboard > Activity history > Manage data sharing preferences |
| Data retention for training | None | Up to 30 days |
| User control | Tenant admin only | Individual user |

Understanding these differences helps you apply the correct opt-out method. For enterprise users, the tenant-level toggle is the only control needed. For personal users, the Privacy Dashboard is the correct place to manage settings. For users who switch between work and personal accounts, both controls must be configured independently.