AI Incident Database

Incident 1218: Microsoft 365 Copilot Vulnerability Allegedly Allowed File Access Without Audit Log Entry

Description: A vulnerability in Microsoft 365 Copilot reportedly allowed users to access and summarize files without generating audit log entries, allegedly undermining traceability and compliance. Security researcher Zack Korman disclosed the issue to Microsoft, which reportedly classified it as "important" and fixed it on August 17, 2025, but reportedly chose not to notify customers or assign a CVE.
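The core claim is that Copilot could read and summarize a file without emitting the corresponding audit record. A minimal sketch of how a defender might cross-check for that gap is below. The record shape loosely mirrors Microsoft 365 Unified Audit Log entries (which do log an operation such as "CopilotInteraction" with referenced resources), but the exact schema, field names, and data here are illustrative assumptions, not the researcher's method:

```python
# Hypothetical cross-check: compare files Copilot is known to have touched
# against the audit records actually emitted, and surface any gaps.
# Field names ("Operation", "ObjectId") are assumptions modeled on the
# Microsoft 365 Unified Audit Log; the data below is made up for illustration.

def find_unaudited_accesses(accessed_files, audit_records,
                            interaction_op="CopilotInteraction"):
    """Return files Copilot accessed that have no matching audit record."""
    audited = {
        rec.get("ObjectId")
        for rec in audit_records
        if rec.get("Operation") == interaction_op
    }
    return [f for f in accessed_files if f not in audited]

# Example: Copilot summarized two files, but only one produced an audit entry.
accessed = [
    "https://contoso.sharepoint.com/a.docx",
    "https://contoso.sharepoint.com/b.docx",
]
log = [
    {"Operation": "CopilotInteraction",
     "ObjectId": "https://contoso.sharepoint.com/a.docx"},
]

print(find_unaudited_accesses(accessed, log))
# ['https://contoso.sharepoint.com/b.docx']
```

In practice the left-hand list ("files Copilot accessed") is exactly what the missing log entries make hard to obtain, which is why the reported gap undermines compliance: the audit log is normally the source of truth for that list.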

Entities

Alleged: Microsoft and Microsoft 365 Copilot developed and deployed an AI system, which harmed Microsoft 365 Copilot enterprise customers and Organizations relying on audit logs for compliance and security.
Alleged implicated AI system: Microsoft 365 Copilot

Incident Stats

Incident ID
1218
Report Count
1
Incident Date
2025-07-04
Editors
Daniel Atherton

Incident Reports

Reports Timeline

Copilot Broke Your Audit Log, but Microsoft Won’t Tell You
pistachioapp.com · 2025

Like most tech companies, Microsoft is going all-in on AI. Their flagship AI product, Copilot (in all its various forms), allows people to utilize AI in their daily work to interact with Microsoft services and generally perform tasks. Unfor…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

Nest Smoke Alarm Erroneously Stops Alarming · Jan 2014 · 6 reports
Game AI System Produces Imbalanced Game · Jun 2016 · 11 reports
Biased Sentiment Analysis · Oct 2017 · 7 reports


2024 - AI Incident Database
