Singapore – Microsoft has acknowledged a fault in its Microsoft 365 Copilot system that allowed the AI assistant to summarise confidential emails despite existing data protection safeguards. According to BleepingComputer, the issue emerged in January and was first identified on 21 January under tracking reference CW1226324.
The problem affected the Copilot ‘work tab’ chat function, which was found to access and condense email content stored in users’ Sent Items and Drafts folders. In some cases, the emails carried sensitivity labels intended to prevent automated processing under organisations’ data loss prevention policies.
Microsoft 365 Copilot Chat, an AI-driven assistant integrated across business applications including Word, Excel, PowerPoint, Outlook and OneNote, has been available to paying Microsoft 365 enterprise customers since September 2025. The company stated that the malfunction stemmed from a coding error that enabled the tool to process labelled emails contrary to its intended configuration.
Microsoft began deploying a corrective update in early February and has been monitoring its rollout, contacting a portion of affected customers to confirm the effectiveness of the fix. While the incident is classified as an advisory, a designation typically reserved for service matters with limited impact, Microsoft has not disclosed how many organisations were involved or provided a definitive timeline for full resolution.
In an update issued after initial reports, BleepingComputer said Microsoft had indicated that the issue was addressed and that the configuration change had been implemented globally for enterprise clients.
The corrective code had reached most affected environments, with deployment continuing in a small number of more complex systems. For customers who have received the update, no new emails would be affected.

