Microsoft has confirmed that a bug allowed its Copilot AI to aggregate customers' sensitive emails for weeks without permission.
The bug, first reported by BleepingComputer, has allowed Copilot Chat to read and summarize the contents of emails since January, even when customers had data loss prevention policies in place to stop their sensitive information from being fed into Microsoft's large language model.
Copilot Chat gives paying Microsoft 365 customers AI-powered chat functionality inside Microsoft's Office software products, including Word, Excel, and PowerPoint.
Microsoft said the error, visible to administrators as advisory CW1226324, means that drafted and sent email messages labeled "Confidential" are "incorrectly processed by Microsoft 365 Copilot Chat."
The tech giant said it began rolling out a fix for the bug in early February. A Microsoft spokesperson didn't reply to a request for comment, including when asked how many customers were affected by the bug.
Earlier this week, the European Parliament's IT department told lawmakers it had blocked built-in AI features on its work-issued devices, citing concerns that the AI tools could potentially upload sensitive correspondence to the cloud.