Incident 672: Lavender AI System Reportedly Directs Gaza Strikes with High Civilian Casualty Rate

Description: The AI system "Lavender" has reportedly been used by the Israel Defense Forces (IDF) to identify bombing targets in Gaza with minimal human oversight, allegedly resulting in high civilian casualty rates. The system, designed to speed up target identification, is reported to have led to significant errors and mass casualties.

Alleged: Unit 8200 and Israel Defense Forces developed and deployed an AI system, which harmed Palestinians and Gazans.

Incident Stats

Incident ID: 672
Report Count: 2
Incident Date: 2024-04-03
Editors: Daniel Atherton
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
972mag.com · 2024

In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author --- a m…

Israeli Military Using AI to Select Targets in Gaza With 'Rubber Stamp' From Human Operator: Report
yahoo.com · 2024

Israel has been using an artificial intelligence system called Lavender to create a “kill list” of at least 37,000 people in Gaza, according to a new report from Israel’s +972 magazine, confirmed by the Guardian. Lavender is the second AI s…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.