Organizations in India that deploy Microsoft Copilot must understand how the Digital Personal Data Protection Act, 2023 (DPDP Act) applies to their data processing activities. The DPDP Act governs how personal data is collected, stored, and processed within India. Microsoft has published compliance documentation and contractual commitments to help organizations meet these requirements. This article explains the key DPDP Act clauses that affect Copilot usage, the technical measures Microsoft provides, and the steps administrators must take to align their Copilot deployment with Indian data protection law.
Key Takeaways: Copilot and DPDP Act Compliance for Indian Organizations
- Microsoft Product Terms (DPDP Addendum): Microsoft commits to process personal data only on documented instructions from the customer, aligning with DPDP Act consent requirements.
- Microsoft 365 admin center > Compliance > Data Lifecycle Management: Controls data retention, deletion, and export policies that satisfy DPDP Act data minimization and storage limitation principles.
- Copilot pane > Settings > Data Sources: Administrators must restrict Copilot to tenant-specific Microsoft Graph data to avoid processing third-party personal data without valid consent.
How the DPDP Act Applies to Copilot Data Processing
The Digital Personal Data Protection Act, 2023 applies to any entity that processes digital personal data within India, and to processing outside India in connection with offering goods or services to data principals in India. Copilot, when integrated with Microsoft 365 services, processes personal data through its grounding in Microsoft Graph data, files, emails, and user prompts. The key obligations under the DPDP Act that directly affect Copilot usage include:
Consent and Notice Requirements
Under Section 6 of the DPDP Act, data fiduciaries must obtain consent from data principals that is free, specific, informed, unconditional, and unambiguous, signified by a clear affirmative action. For Copilot, this means administrators must ensure users have been informed about how Copilot uses their data. Microsoft provides a Data Processing Agreement that designates the customer as the data fiduciary and Microsoft as the data processor. The consent notice must be presented to users before they first interact with Copilot.
Data Minimization and Purpose Limitation
The DPDP Act permits processing only for a lawful purpose (Section 4) and limits consent to the personal data necessary for the specified purpose (Section 6(1)). Copilot should not process more personal data than necessary to respond to a user prompt. Microsoft achieves this by grounding Copilot responses in the user's Microsoft Graph data rather than external or third-party data sources, unless those are explicitly configured. Administrators must audit which data sources Copilot can access to ensure they do not include personal data unrelated to the business purpose.
Storage Limitation and Deletion
Section 8 of the DPDP Act mandates that personal data be retained only as long as necessary for the purpose for which it was processed. Microsoft stores Copilot interaction logs and generated content in the user’s Exchange Online mailbox and SharePoint sites. Administrators must configure data lifecycle policies in the Microsoft 365 compliance center to automatically delete or archive Copilot-related content after the required retention period. The default retention for Copilot chat history is 30 days, but this can be extended or shortened through retention labels.
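A retention schedule is easier to enforce when deletion dates are computed mechanically from the label applied to content. The sketch below illustrates that calculation; the label names and day counts are assumptions drawn from the periods discussed in this article, not Microsoft defaults.

```python
from datetime import date, timedelta

# Illustrative retention schedule; the label names and periods are assumptions,
# not Microsoft defaults -- align them with your own DPDP retention schedule.
RETENTION_DAYS = {
    "copilot-chat": 30,               # general Copilot interactions
    "business-correspondence": 90,
    "legally-mandated-record": 7 * 365,
}

def deletion_due(created: date, label: str) -> date:
    """Return the date on which content under the given label falls due for deletion."""
    return created + timedelta(days=RETENTION_DAYS[label])

print(deletion_due(date(2024, 1, 1), "copilot-chat"))  # 2024-01-31
```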
Steps to Configure Copilot for DPDP Act Compliance
To align your Copilot deployment with the DPDP Act, follow these steps in the Microsoft 365 admin center and compliance center. These steps assume you have global administrator or compliance administrator privileges.
- Review and accept the Microsoft DPDP Addendum
Go to the Microsoft 365 admin center. Select Settings > Org settings > Services > Microsoft Copilot. Under the Data Protection section, review the DPDP Addendum. This document contains Microsoft's contractual commitments to process personal data only on your instructions. Accept the addendum to activate the compliance terms for your tenant.
- Restrict Copilot data sources to tenant Microsoft Graph
In the Microsoft 365 admin center, select Settings > Org settings > Microsoft Copilot. In the Data Sources section, ensure that only Microsoft Graph data is enabled. Disable any third-party connectors and public web data sources. This prevents Copilot from processing personal data from external sources without valid consent.
- Configure data retention policies for Copilot content
In the Microsoft 365 compliance center, select Data Lifecycle Management > Retention policies. Create a new retention policy for Copilot-generated content. Set the retention period to match your organization's data retention schedule under the DPDP Act. For example, set 90 days for general business correspondence and 7 years for legally mandated records. Apply the policy to the Exchange Online mailboxes and SharePoint sites where Copilot content is stored.
- Enable audit logging for Copilot interactions
In the compliance center, select Audit > Audit log search and confirm that audit logging is turned on for your tenant. This logs all Copilot prompts and responses. Under the DPDP Act, you must be able to demonstrate compliance with your data processing obligations, and audit logs provide the evidence needed for regulatory inspections.
- Create a consent notice for Copilot users
Draft a notice that explains to users how Copilot processes their personal data. Include the purpose of processing, the types of data processed, and the retention period. Distribute the notice through your organization's internal communication channels. Under the DPDP Act, consent must be free, specific, informed, unconditional, and unambiguous. Update the notice whenever you change Copilot data sources or retention policies.
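The final step above calls for a notice covering purpose, data types, and retention. A minimal sketch of generating such a notice from those three inputs is shown below; the wording is illustrative only and should be reviewed by your legal team before distribution.

```python
def render_consent_notice(purpose: str, data_types: list, retention: str) -> str:
    """Render a plain-text DPDP consent notice for Copilot users.

    A minimal sketch; the wording is illustrative, not legal language.
    """
    lines = [
        "Notice: Microsoft Copilot processes your personal data.",
        f"Purpose of processing: {purpose}",
        "Types of data processed: " + ", ".join(data_types),
        f"Retention period: {retention}",
        "You may withdraw consent or request erasure at any time",
        "by contacting your organization's data protection contact.",
    ]
    return "\n".join(lines)
```

Generating the notice from the same values used in your retention policy keeps the two from drifting apart when either changes.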
Common Compliance Gaps and How to Address Them
Copilot accesses personal data of non-consenting individuals
If Copilot processes personal data of individuals who have not consented, you violate Section 6 of the DPDP Act. This can happen when Copilot indexes shared mailboxes, public calendars, or distribution groups that contain personal data of external parties. To fix this, use Microsoft Purview data classification to identify and label personal data. Then configure Copilot to exclude data from shared mailboxes or public folders through the Data Sources settings.
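The exclusion logic described above amounts to filtering an inventory of reachable sources by how likely each is to hold non-consenting parties' data. The sketch below illustrates that filter; the `kind` values are illustrative labels of my own, not Microsoft Graph resource types.

```python
# Hypothetical inventory of Copilot-reachable sources; the 'kind' values are
# illustrative labels, not Microsoft Graph resource types.
sources = [
    {"name": "user mailbox", "kind": "personal"},
    {"name": "shared mailbox", "kind": "shared"},
    {"name": "public calendar", "kind": "public"},
    {"name": "OneDrive files", "kind": "personal"},
]

# Shared and public sources are the ones likely to hold personal data of
# external parties who never saw your consent notice.
EXCLUDED_KINDS = {"shared", "public"}

def allowed_sources(inventory):
    """Keep only sources unlikely to contain third-party personal data."""
    return [s["name"] for s in inventory if s["kind"] not in EXCLUDED_KINDS]
```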
Data retention exceeds the purpose for which it was collected
Copilot chat history and generated documents may be retained longer than necessary. Under Section 8 of the DPDP Act, you must delete personal data when the purpose is fulfilled. Configure a retention label in the compliance center with a deletion action after 30 days for general Copilot interactions. For Copilot-generated documents that contain personal data, set a shorter retention period unless legal holds apply.
No mechanism for data principals to request deletion
Under Section 12 of the DPDP Act, data principals have the right to correction and erasure of their personal data. If a user asks you to delete Copilot-generated content that contains their personal data, you must be able to locate and delete it. Use the Microsoft Purview eDiscovery tool to search for content containing the data principal's name or email address. Export the results and delete the content from Exchange Online and SharePoint. Document the deletion for compliance records.
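The workflow above has two mechanical pieces: a search string for the eDiscovery query and a record of the deletion for your compliance file. The sketch below shows both; the KQL `OR` operator and quoted phrases are standard query syntax, while the record schema is an illustrative assumption, not a Purview API.

```python
def ediscovery_query(name: str, email: str) -> str:
    """Build a KQL search string matching content that mentions the data principal.

    Quoted phrases and OR are standard KQL; feeding this into an eDiscovery
    search is the manual step described above, not an API call.
    """
    return f'"{name}" OR "{email}"'

def deletion_record(principal: str, items_deleted: int, deleted_on: str) -> dict:
    """Minimal compliance record documenting an erasure request (illustrative schema)."""
    return {
        "data_principal": principal,
        "items_deleted": items_deleted,
        "deleted_on": deleted_on,
        "basis": "DPDP Act, Section 12 erasure request",
    }
```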
Copilot Data Processing Compared to DPDP Act Obligations
| DPDP Act Obligation | Microsoft Copilot Default Behavior | Administrator Action Required |
|---|---|---|
| Consent and notice | No consent notice is shown to users | Create and distribute a consent notice |
| Data minimization | Grounds in user Microsoft Graph data | Restrict data sources to tenant Graph only |
| Purpose limitation | Processes data only for user prompts | Audit data source permissions regularly |
| Storage limitation | 30-day retention for chat history | Configure retention policies with deletion |
| Right to deletion | No automated deletion mechanism | Use eDiscovery to locate and delete content |
| Data security | Encryption in transit (TLS) and at rest | Enable customer-managed keys if required |
Administrators should review each row and confirm that their tenant settings match the required action column. The default Copilot configuration does not fully satisfy DPDP Act obligations without additional configuration.
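The row-by-row review suggested above can be automated as a simple gap check: map each obligation to the tenant setting that satisfies it and flag whichever settings are missing. The keys and setting names below are illustrative, not admin-center fields.

```python
# Obligations from the table above mapped to the tenant setting that satisfies
# each one; keys and setting names are illustrative, not admin-center fields.
REQUIRED = {
    "consent_notice_distributed": "Consent and notice",
    "data_sources_tenant_only": "Data minimization",
    "retention_policy_configured": "Storage limitation",
    "audit_logging_enabled": "Purpose limitation",
}

def compliance_gaps(tenant_settings: dict) -> list:
    """Return the obligations whose required setting is missing or disabled."""
    return [obligation for key, obligation in REQUIRED.items()
            if not tenant_settings.get(key, False)]

gaps = compliance_gaps({"consent_notice_distributed": True,
                        "audit_logging_enabled": True})
# -> ["Data minimization", "Storage limitation"]
```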
You can now configure Copilot to comply with the DPDP Act by restricting data sources, setting retention policies, and creating a consent notice. Next, run a data classification scan using Microsoft Purview to identify any personal data that Copilot might process without valid consent. As an advanced step, consider enabling customer-managed keys for Copilot data storage to meet the DPDP Act's security safeguard obligations under Section 8(5).