Australian businesses using Microsoft Copilot must understand how the Privacy Act 1988 and the Notifiable Data Breaches scheme apply to their Copilot deployments. Copilot processes vast amounts of Microsoft 365 data, including emails, documents, and calendar entries, to generate responses. This creates new risks of personal information exposure that fall under the jurisdiction of the Office of the Australian Information Commissioner (OAIC). This article explains the legal obligations, the specific Copilot behaviors that can trigger data breach notification requirements, and the steps your organization must take to stay compliant.
Key Takeaways: Copilot and Australian Privacy Law Compliance
- Microsoft Purview > Data lifecycle management > Retention policies: Configure retention labels to prevent Copilot from accessing expired or sensitive personal information.
- Microsoft Purview > Data Loss Prevention (DLP) policies > Copilot for Microsoft 365: Detect and block Copilot from generating responses that contain credit card numbers, tax file numbers, or health identifiers.
- Microsoft Purview > Audit > Audit log search: Review Copilot interaction logs to identify potential data breaches within the scheme's 30-day assessment window.
How Copilot Interacts with Personal Information Under the Privacy Act
The Privacy Act 1988 defines personal information as information or an opinion about an identified individual, or an individual who is reasonably identifiable. Copilot can access this information from Microsoft 365 sources: emails, SharePoint sites, OneDrive files, Teams chats, and calendar items. When a user asks Copilot a question, the AI model retrieves relevant data from these sources to generate a response. This process is called grounding. The key compliance issue is that Copilot does not distinguish between general business data and personal information: it treats all accessible content equally. If an employee has permission to view a file containing someone's tax file number, Copilot can include that number in a response to another user who has access to the same file. This is not a breach by itself. But if Copilot exposes that personal information to an unauthorized party, it can become a notifiable data breach under the Notifiable Data Breaches scheme.
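To make the permission model concrete, here is a minimal Python sketch of permission-scoped grounding. The Document class and grounding_candidates function are hypothetical illustrations, not part of any Microsoft API; they simply model the point that Copilot only retrieves content the asking user can already open, so any personal information inside those files can still surface in a response.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """A simplified stand-in for a Microsoft 365 item (email, file, chat message)."""
    name: str
    text: str
    allowed_users: set[str] = field(default_factory=set)

def grounding_candidates(user: str, documents: list[Document]) -> list[Document]:
    """Model of grounding: only documents the requesting user can already open are retrieved."""
    return [doc for doc in documents if user in doc.allowed_users]

# A file legitimately shared with a colleague still carries a tax file number,
# so a Copilot-style answer grounded on it can repeat that number to the colleague.
docs = [
    Document("payroll-note.docx", "Jane Citizen, TFN 123 456 782", {"alice", "bob"}),
    Document("q3-plan.docx", "Quarterly targets and budget.", {"alice"}),
]

for doc in grounding_candidates("bob", docs):
    print(f"'{doc.name}' is in scope for bob's prompt: {doc.text}")
```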
What Constitutes a Notifiable Data Breach with Copilot
A notifiable data breach occurs when three conditions are met. First, there is unauthorized access to or disclosure of personal information. Second, the breach is likely to result in serious harm to the affected individuals. Third, your organization has not been able to prevent the likely risk of serious harm through remedial action. With Copilot, a data breach can happen in two ways. The first is direct exposure: a user asks Copilot a question, and the response contains personal information about a third party that the user should not have seen. The second is indirect exposure: a user shares a Copilot response containing personal information with someone outside the organization. In both cases, your organization must assess whether the breach is likely to cause serious harm. Serious harm includes identity theft, financial loss, and damage to reputation.
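As an illustration only (not legal advice and not an OAIC tool), the three conditions can be written down as a simple checklist. The CopilotIncident fields below are assumptions chosen for this sketch, not an official assessment schema.

```python
from dataclasses import dataclass

@dataclass
class CopilotIncident:
    """Facts gathered about a suspected Copilot exposure (illustrative fields only)."""
    unauthorized_access_or_disclosure: bool  # e.g. a response containing personal information left the tenant
    serious_harm_likely: bool                # identity theft, financial loss, reputational damage
    remediation_prevented_harm: bool         # e.g. the response was recalled before anyone read it

def is_notifiable(incident: CopilotIncident) -> bool:
    """All three Notifiable Data Breaches conditions must hold for the breach to be notifiable."""
    return (
        incident.unauthorized_access_or_disclosure
        and incident.serious_harm_likely
        and not incident.remediation_prevented_harm
    )

incident = CopilotIncident(True, True, False)
print("Notifiable:", is_notifiable(incident))  # Notifiable: True
```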
Steps to Prevent and Respond to Copilot-Related Data Breaches
Your organization must implement technical controls and an incident response process specific to Copilot. The following steps address both prevention and notification obligations under the Privacy Act.
- Configure Microsoft Purview Data Loss Prevention (DLP) policies for Copilot
Go to the Microsoft Purview compliance portal at compliance.microsoft.com. Under Solutions, select Data Loss Prevention. Create a new DLP policy. Under Locations, select Copilot for Microsoft 365. Choose the sensitive information types that apply to your organization, such as Australian Tax File Number, Medicare Card Number, or Credit Card Number. Set the action to Block. This prevents Copilot from generating responses that contain these data types. A sketch of what these sensitive information types look for appears after this list.
- Set up sensitivity labels to restrict Copilot access
In the Microsoft Purview portal, go to Information Protection > Sensitivity labels. Create or edit a label that marks the content as sensitive. Under Auto-labeling, configure rules that detect personal information patterns. Then go to the Microsoft 365 admin center > Settings > Org settings > Copilot for Microsoft 365. Under Data sources, exclude SharePoint sites and OneDrive folders that contain high-risk personal information. This stops Copilot from grounding its responses on that content.
- Enable audit logging for Copilot interactions
In the Microsoft Purview portal, go to Audit and turn on audit log recording if it is not already on. Under Search, filter by Activity: Copilot interaction. Set the date range to the last 30 days. Review the results weekly for any Copilot responses that contained personal information, and export them to a CSV file for your records. A short script for pre-filtering that export appears after this list.
- Create a data breach response plan for Copilot incidents
Document a step-by-step procedure for when a Copilot data breach is detected. Include these actions:
1. Isolate the affected user account by resetting the password and revoking Copilot access.
2. Identify the exact Copilot response that caused the breach by reviewing the audit log.
3. Determine whether the response was shared externally.
4. Assess whether the breach is likely to result in serious harm; the Privacy Act requires this assessment to be completed within 30 days of becoming aware of the suspected breach.
5. If serious harm is likely, notify the OAIC and the affected individuals as soon as practicable.
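For the DLP step above, the policy itself is configured entirely in Purview, but it helps to understand roughly what the built-in Australian sensitive information types look for. The Python sketch below approximates the commonly documented Tax File Number and Medicare card number check-digit rules; treat it as an illustration for testing sample data, not a replacement for Purview's detection, and verify the rules against current OAIC and Services Australia guidance.

```python
import re

def is_valid_tfn(candidate: str) -> bool:
    """Approximate 9-digit Australian Tax File Number check (weighted sum divisible by 11)."""
    digits = re.sub(r"[\s-]", "", candidate)
    if not re.fullmatch(r"\d{9}", digits):
        return False
    weights = (1, 4, 3, 7, 5, 8, 6, 9, 10)
    return sum(w * int(d) for w, d in zip(weights, digits)) % 11 == 0

def is_valid_medicare(candidate: str) -> bool:
    """Approximate 10-digit Medicare card number check (weighted checksum over the first 8 digits)."""
    digits = re.sub(r"[\s-]", "", candidate)
    if not re.fullmatch(r"[2-6]\d{9}", digits):  # leading digit range is an assumption to verify
        return False
    weights = (1, 3, 7, 9, 1, 3, 7, 9)
    checksum = sum(w * int(d) for w, d in zip(weights, digits[:8])) % 10
    return checksum == int(digits[8])

print(is_valid_tfn("123 456 782"))        # True - widely used synthetic test TFN
print(is_valid_medicare("2123 45670 1"))  # True - synthetic test number
```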
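For the audit logging step above, once a search has been exported to CSV you can pre-filter the records before the weekly review. This sketch assumes the usual unified audit log export columns (CreationDate, UserIds, Operations, AuditData) and assumes the Copilot operation name is CopilotInteraction; confirm both against an export from your own tenant.

```python
import csv
import json

def copilot_events(export_path: str):
    """Yield Copilot interaction records from an audit log search CSV export."""
    with open(export_path, newline="", encoding="utf-8-sig") as handle:
        for row in csv.DictReader(handle):
            # Operation name is an assumption; check the Operations column in your export.
            if row.get("Operations") != "CopilotInteraction":
                continue
            details = json.loads(row.get("AuditData") or "{}")
            yield {
                "when": row.get("CreationDate"),
                "user": row.get("UserIds"),
                "app": details.get("AppHost"),  # assumed field; inspect AuditData for your schema
            }

if __name__ == "__main__":
    for event in copilot_events("copilot-audit-export.csv"):
        print(event)
```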
Common Compliance Gaps and How to Fix Them
Organizations often overlook specific Copilot behaviors that lead to privacy breaches. The following issues are the most frequent causes of notifiable data breaches with Copilot.
Copilot accesses personal information in shared mailboxes
Shared mailboxes often contain personal information such as customer inquiries or employee records. Copilot can read these mailboxes if the user has Full Access permission. To fix this, go to Exchange admin center > Recipients > Shared mailboxes. Select the shared mailbox. Under Mailbox delegation, remove the Full Access permission for users who do not need it. Then, in the Microsoft 365 admin center > Settings > Org settings > Copilot for Microsoft 365, under Data sources, uncheck Exchange Online mailboxes if the shared mailbox contains high-risk data.
Copilot returns personal information from Teams chat history
Teams chat history includes conversations that may contain personal information such as home addresses or medical details. Copilot can retrieve this data when a user asks a question about past chats. To prevent this, go to Microsoft Purview > Data lifecycle management > Retention policies and create a policy for Teams chat messages with a retention period of 30 days followed by deletion. Messaging policies in the Microsoft Teams admin center control whether users can delete their own messages, not how long chat history is kept, so the Purview retention policy is what actually removes old chat data from Copilot's reach.
Copilot generates responses with personal information from external users
External users such as guests or vendors can share files that contain personal information. If an internal user has access to those files, Copilot can include that data in responses. To mitigate this, go to Microsoft 365 admin center > Settings > Org settings > Copilot for Microsoft 365. Under Data sources, uncheck OneDrive for Business and SharePoint. This restricts Copilot to only Exchange Online data. Alternatively, create a DLP policy that blocks Copilot from generating responses containing personal information from external sources.
Copilot Risk Levels Under the Notifiable Data Breaches Scheme
| Risk Scenario | Likelihood of Serious Harm | Notification Required |
|---|---|---|
| Copilot exposes a tax file number to an unauthorized internal user | High | Yes, notify the OAIC as soon as practicable after assessment |
| Copilot exposes a customer email address in a response shared externally | Medium | Assess case-by-case; notify if harm is likely |
| Copilot exposes a general business document with no personal information | Low | No notification required |
| Copilot exposes health information from a Teams chat to an unauthorized user | High | Yes, notify OAIC and affected individuals |
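If your incident response runbook is scripted, the table can also be encoded as a simple lookup so first responders apply consistent triage. The scenario keys and guidance strings below are hypothetical labels invented for this sketch and should be adapted to your own policy.

```python
# Hypothetical triage helper encoding the risk table above.
RISK_TABLE = {
    "tfn_internal_exposure": ("High", "Notify the OAIC as soon as practicable after assessment."),
    "email_address_shared_externally": ("Medium", "Assess case-by-case; notify if serious harm is likely."),
    "business_document_no_personal_info": ("Low", "No notification required."),
    "health_info_unauthorized_user": ("High", "Notify the OAIC and affected individuals."),
}

def triage(scenario: str) -> str:
    likelihood, action = RISK_TABLE.get(scenario, ("Unknown", "Escalate to the privacy officer."))
    return f"Likelihood of serious harm: {likelihood}. {action}"

print(triage("email_address_shared_externally"))
```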
Your organization can now configure Copilot to comply with the Privacy Act and the Notifiable Data Breaches scheme. Start by implementing the DLP policies and sensitivity labels described above. Then run a weekly audit log review to catch Copilot-generated personal information exposures early. For advanced protection, enable Microsoft Purview Communication Compliance to automatically monitor Copilot responses for sensitive data patterns.