Grok
Incidents involved as Deployer
Incident 1146 · 34 Reports
Grok Chatbot Reportedly Posts Antisemitic Statements Praising Hitler on X
2025-07-08
xAI's Grok chatbot reportedly generated multiple antisemitic posts praising Adolf Hitler and endorsing Holocaust-like violence in response to posts about the Texas floods. X deleted some posts; xAI later announced new content filters.
Incidents implicated systems
Incident 1146 · 34 Reports
Grok Chatbot Reportedly Posts Antisemitic Statements Praising Hitler on X
2025-07-08
xAI's Grok chatbot reportedly generated multiple antisemitic posts praising Adolf Hitler and endorsing Holocaust-like violence in response to posts about the Texas floods. X deleted some posts; xAI later announced new content filters.
Incident 1072 · 22 Reports
Grok Chatbot Reportedly Inserted Content About South Africa and 'White Genocide' in Unrelated User Queries
2025-05-14
xAI's Grok chatbot reportedly inserted unsolicited references to "white genocide" in South Africa into a wide array of unrelated conversations on X. These reported interjections introduced inflammatory, racially charged content into otherwise neutral threads.
Incident 1186 · 5 Reports
Reported Public Exposure of Over 100,000 LLM Conversations via Share Links Indexed by Search Engines and Archived
2025-07-31
Across 2024 and 2025, the share features in multiple LLM platforms, including ChatGPT, Claude, Copilot, Qwen, Mistral, and Grok, allegedly exposed user conversations marked "discoverable" to search engines and archiving services. Over 100,000 chats were reportedly indexed and later scraped, purportedly revealing API keys, access tokens, personal identifiers, and sensitive business data.
Incident 1307 · 3 Reports
Grok AI Reportedly Generated Fabricated Civilian Hero Identity During Bondi Beach Shooting
2025-12-15
Grok reportedly generated and repeated a fabricated civilian hero identity, "Edward Crabtree," following the Bondi Beach shooting in Sydney, Australia. The system reportedly cited a fake news article and misattributed heroic actions to this fictional individual during the unfolding emergency. Contemporaneous reporting identified Ahmed Al Ahmed as the real bystander who intervened and was injured during the attack.
Related Entities
Other entities connected to the same incidents. For example, if this entity is the developer of an incident but another entity is the deployer, the other entity is marked as related.
xAI
Incidents involved as both Developer and Deployer
- Incident 1146 · 34 Reports
Grok Chatbot Reportedly Posts Antisemitic Statements Praising Hitler on X
- Incident 1072 · 22 Reports
Grok Chatbot Reportedly Inserted Content About South Africa and 'White Genocide' in Unrelated User Queries
Incidents involved as Developer
X (Twitter) users
Incidents Harmed By
- Incident 1146 · 34 Reports
Grok Chatbot Reportedly Posts Antisemitic Statements Praising Hitler on X
Incidents involved as Deployer
Incidents involved as both Developer and Deployer
- Incident 1188 · 2 Reports
Multiple LLMs Reportedly Generated Responses Aligning with Purported CCP Censorship and Propaganda
- Incident 1205 · 1 Report
Multiple Generative AI Systems Reportedly Amplify False Information During Charlie Kirk Assassination Coverage
Incidents implicated systems
Grok users
Incidents Harmed By
- Incident 1188 · 2 Reports
Multiple LLMs Reportedly Generated Responses Aligning with Purported CCP Censorship and Propaganda
- Incident 1198 · 2 Reports
Grok 3 Reportedly Generated Graphic Threats and Hate Speech Targeting Minnesota Attorney Will Stancil