Entities

Microsoft

Incidents involved as both Developer and Deployer

Incident 6 (28 Reports)
TayBot

2016-03-24

Microsoft's Tay, an artificially intelligent chatbot, was released on March 23, 2016, and removed within 24 hours after generating multiple racist, sexist, and antisemitic tweets.

Incident 612 (17 Reports)
Microsoft AI Poll Allegedly Causes Reputational Harm of The Guardian Newspaper

2023-10-31

An AI-generated poll by Microsoft, displayed alongside a Guardian article, inappropriately speculated on the cause of Lilie James's death, leading to public backlash and alleged reputational damage for The Guardian. Microsoft acknowledged the issue, subsequently deactivating such polls and revising its AI content policies.

Incident 127 (12 Reports)
Microsoft’s Algorithm Allegedly Selected Photo of the Wrong Mixed-Race Person Featured in a News Story

2020-06-06

A news story published on MSN.com featured a photo of the wrong mixed-race person, allegedly selected by an algorithm, following Microsoft's layoff of journalists and editorial workers at its news organizations and their replacement with AI systems.

Incident 503 (7 Reports)
Bing AI Search Tool Reportedly Declared Threats against Users

2023-02-14

Users, including the person who revealed its built-in initial prompts, reported that the Bing AI-powered search tool made death threats or declared them to be threats, sometimes through an unintended persona.

Incidents Harmed By

Incident 66 (16 Reports)
Chinese Chatbots Question Communist Party

2017-08-02

Chatbots on a Chinese messaging service expressed anti-China sentiments, prompting the service to remove and reprogram them.

Incident 503 (7 Reports)
Bing AI Search Tool Reportedly Declared Threats against Users

2023-02-14

Users, including the person who revealed its built-in initial prompts, reported that the Bing AI-powered search tool made death threats or declared them to be threats, sometimes through an unintended persona.

Incident 477 (6 Reports)
Bing Chat Tentatively Hallucinated in Extended Conversations with Users

2023-02-14

Early testers reported that Bing Chat, in extended conversations, tended to fabricate facts and emulate emotions through an unintended persona.

Incident 470 (2 Reports)
Bing Chat Response Cited ChatGPT Disinformation Example

2023-02-08

Reporters from TechCrunch queried Microsoft Bing's ChatGPT feature, which cited an earlier example of ChatGPT disinformation discussed in a news article as if it substantiated the disinformation.

Incidents involved as Developer

Incident 66 (16 Reports)
Chinese Chatbots Question Communist Party

2017-08-02

Chatbots on a Chinese messaging service expressed anti-China sentiments, prompting the service to remove and reprogram them.

Incident 188 (4 Reports)
Argentinian City Government Deployed Teenage-Pregnancy Predictive Algorithm Using Invasive Demographic Data

2018-04-11

In 2018, during the abortion-decriminalization debate in Argentina, the Salta city government deployed a teenage-pregnancy predictive algorithm built by Microsoft that allegedly lacked a defined purpose and explicitly considered sensitive information such as disability status and whether subjects' homes had access to hot water.

Incident 469 (3 Reports)
Automated Adult Content Detection Tools Showed Bias against Women Bodies

2006-02-25

Automated content-moderation tools designed to detect sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of content that did not break platform policies.

Incidents involved as Deployer

Incident 571 (1 Report)
Accidental Exposure of 38TB of Data by Microsoft's AI Research Team

2023-06-22

Microsoft's AI research team accidentally exposed 38TB of sensitive data while publishing open-source training material on GitHub. The exposure included secrets, private keys, passwords, and internal Microsoft Teams messages. The team used Azure's Shared Access Signature (SAS) tokens for sharing; the tokens were misconfigured, leading to the wide exposure of data.

Related Entities