Description: A vulnerability in Microsoft 365 Copilot reportedly allowed users to access and summarize files without generating audit log entries, undermining traceability and compliance. Security researcher Zack Korman disclosed the issue to Microsoft, which reportedly classified it as "important" and fixed it on August 17, 2025, but chose not to notify customers or assign a CVE.
Entities
Alleged: Microsoft and Microsoft 365 Copilot developed and deployed an AI system, which harmed Microsoft 365 Copilot enterprise customers and organizations relying on audit logs for compliance and security.
Alleged implicated AI system: Microsoft 365 Copilot
Incident Stats
Incident ID
1218
Report Count
1
Incident Date
2025-07-04
Editors
Daniel Atherton
Incident Reports
Reports Timeline
Like most tech companies, Microsoft is going all-in on AI. Their flagship AI product, Copilot (in all its various forms), allows people to utilize AI in their daily work to interact with Microsoft services and generally perform tasks. Unfor…
Variants
A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Similar Incidents

Game AI System Produces Imbalanced Game
· 11 reports
Biased Sentiment Analysis
· 7 reports