Entities
Microsoft
Incidents involved as both Developer and Deployer
Incident 6 · 27 Reports
TayBot
2016-03-24
Microsoft's Tay, an artificially intelligent chatbot, was released on March 23, 2016, and removed within 24 hours after generating multiple racist, sexist, and anti-Semitic tweets.
Incident 127 · 12 Reports
Microsoft’s Algorithm Allegedly Selected Photo of the Wrong Mixed-Race Person Featured in a News Story
2020-06-06
A news story published on MSN.com featured a photo of the wrong mixed-race person, allegedly selected by an algorithm, after Microsoft laid off journalists and editorial workers at its organizations and replaced them with AI systems.
Incident 102 · 2 Reports
Personal voice assistants struggle with black voices, new study shows
2020-03-23
A study found that voice recognition tools from Apple, Amazon, Google, IBM, and Microsoft disproportionately made errors when transcribing black speakers.
Incident 454 · 2 Reports
Emotion Detection Models Showed Disparate Performance along Racial Lines
2018-11-09
Emotion detection tools by Face++ and Microsoft's Face API allegedly scored smiling or ambiguous facial photos of Black people as showing negative emotion more often than those of white people.
Incidents Harmed By
Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
2017-08-02
Chatbots on a Chinese messaging service expressed anti-China sentiments, prompting the service to remove and reprogram them.
Incidents involved as Developer
Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
2017-08-02
Chatbots on a Chinese messaging service expressed anti-China sentiments, prompting the service to remove and reprogram them.
Incident 188 · 4 Reports
Argentinian City Government Deployed Teenage-Pregnancy Predictive Algorithm Using Invasive Demographic Data
2018-04-11
In 2018, during the abortion-decriminalization debate in Argentina, the Salta city government deployed a teenage-pregnancy predictive algorithm built by Microsoft that allegedly lacked a defined purpose and explicitly considered sensitive information such as disability status and whether a subject's home had access to hot water.
Related Entities
Tencent Holdings
Incidents Harmed By
- Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
Incidents involved as Deployer
- Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party
Turing Robot
Incidents Harmed By
- Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party