Amazon
Incidents involved as both Developer and Deployer
Incident 34 · 35 Reports
Amazon Alexa Responding to Environmental Inputs
2015-12-05
There are multiple reports of Amazon Alexa products (Echo, Echo Dot) reacting to and acting upon unintended stimuli, usually from television commercials or news reporters' voices.
Incident 37 · 33 Reports
Female Applicants Down-Ranked by Amazon Recruiting Tool
2016-08-10
Amazon shuts down internal AI recruiting tool that would down-rank female applicants.
Incident 15 · 24 Reports
Amazon Censors Gay Books
2008-05-23
Amazon's book store "cataloging error" caused books containing gay and lesbian themes to lose their sales rankings, and therefore their visibility, on the sales platform.
Incident 21 · 7 Reports
Warehouse robot ruptures can of bear spray and injures workers
2018-12-05
Twenty-four Amazon workers in New Jersey were hospitalized after a robot punctured a can of bear repellent spray in a warehouse.
Incidents Harmed By
Incident 625 · 5 Reports
Proliferation of Products on Amazon Titled with ChatGPT Error Messages
2024-01-12
Products named after ChatGPT error messages, ranging from lawn chairs to religious texts, are proliferating on Amazon. These names, apparently copied verbatim from AI-generated errors, indicate a lack of editing and undermine the perceived authenticity and reliability of product listings.
Incident 528 · 2 Reports
Amazon Algorithmic Pricing Allegedly Hiked up Price of Reference Book to Millions
2023-04-08
Amazon's pricing algorithm was implicated in the unusually high price, reaching millions of dollars, of a reference book about flies, allegedly because two competing sellers each used a paid service that set their product's price based on the other's.
Incident 575 · 1 Report
Amazon Rife with Many Allegedly AI-Generated Books of Suspect Quality
2023-06-28
Amazon's Kindle Unlimited young adult romance bestseller list was flooded with allegedly AI-generated books that made little to no sense, disrupting the rankings. These books were reported to be "clearly there to click farm." Despite being removed from the bestseller list, many remained available for purchase. The incident raised concerns about the integrity of the platform and the potential financial impact on legitimate authors.
Incidents involved as Developer
Incident 111 · 5 Reports
Amazon Flex Drivers Allegedly Fired via Automated Employee Evaluations
2015-09-25
Amazon Flex's contract delivery drivers were dismissed via an automated performance evaluation with minimal human involvement, based on indicators affected by factors outside the drivers' control, and without any chance to contest or appeal the decision.
Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women Bodies
2006-02-25
Automated content moderation tools designed to detect sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of content that did not violate platform policies.
Incident 778 · 2 Reports
Amazon's Alexa Reportedly Shows Political Preference Error in Trump-Harris Presidential Race Queries
2024-09-04
Amazon's Alexa was found to provide politically biased responses when asked about the 2024 presidential candidates. It refused to give reasons to vote for Donald Trump, citing neutrality, while offering detailed endorsements for Kamala Harris. Amazon labeled the discrepancy an "error" and reportedly corrected it.
Incidents involved as Deployer
Incident 395 · 4 Reports
Amazon Forced Deployment of AI-Powered Cameras on Delivery Drivers
2021-03-02
Amazon delivery drivers were forced to consent to algorithmic collection and processing of their location, movement, and biometric data through AI-powered cameras, or be dismissed.
Incident 116 · 2 Reports
Amazon's AI Cameras Incorrectly Penalized Delivery Drivers for Mistakes They Did Not Make
2021-09-20
Amazon's automated performance evaluation system involving AI-powered cameras incorrectly punished delivery drivers for non-existent mistakes, impacting their chances for bonuses and rewards.
Related Entities
Alexa Device Owners
Incidents Harmed By
- Incident 34 · 35 Reports
Amazon Alexa Responding to Environmental Inputs
- Incident 778 · 2 Reports
Amazon's Alexa Reportedly Shows Political Preference Error in Trump-Harris Presidential Race Queries
Microsoft
Incidents involved as both Developer and Deployer
- Incident 102 · 2 Reports
Personal voice assistants struggle with black voices, new study shows
- Incident 587 · 1 Report
Apparent Failure to Accurately Label Primates in Image Recognition Software Due to Alleged Fear of Racial Bias