AI Incident Database
Entities

Bing

Incidents involved as Deployer

Incident 621 (3 Reports)
Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

2023-11-10

Microsoft’s AI Image Creator, integrated with Bing and Windows Paint, reportedly produced disturbingly violent and graphic images featuring members of minority groups and public figures such as Joe Biden and Pope Francis.


Incidents implicated systems

Incident 1174 (2 Reports)
Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories

2025-02-26

Lasso Security reported that Microsoft Copilot could return content from GitHub repositories that had briefly been public but were later set to private or deleted. Lasso attributed this to Bing's caching system, which stored "zombie data" from over 20,000 repositories. The cached content allegedly included sensitive information such as access keys, tokens, and internal packages. Microsoft reportedly classified the issue as low severity and applied only partial mitigations.


Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer of a system in an incident and another entity is its deployer, both are marked as related entities.

Windows Paint

Incidents involved as Deployer
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Microsoft

Incidents involved as both Developer and Deployer
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures
  • Incident 1174 (2 Reports): Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories

Bing users

Incidents involved as Deployer
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

AI Image Creator

Incidents involved as Deployer
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Sikh people

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

President Joe Biden

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Pope Francis

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Navajo people

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Minorities

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Hillary Clinton

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

General public

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

Donald Trump

Incidents Harmed By
  • Incident 621 (3 Reports): Microsoft AI Is Alleged to Have Generated Violent Imagery of Minorities and Public Figures

GitHub users

Incidents Harmed By
  • Incident 1174 (2 Reports): Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories

GitHub repositories

Incidents Harmed By
  • Incident 1174 (2 Reports): Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories

GitHub

Incidents Harmed By
  • Incident 1174 (2 Reports): Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories

Incidents implicated systems
  • Incident 1174 (2 Reports): Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories

Microsoft Copilot

Incidents implicated systems
  • Incident 1174 (2 Reports): Microsoft Copilot Reportedly Able to Access Cached Data from Since-Private GitHub Repositories
