AI Incident Database
Incident 672: Lavender AI System Reportedly Directs Gaza Strikes with High Civilian Casualty Rate

Responded
Description: The AI system "Lavender" has reportedly been used by the Israel Defense Forces (IDF) to identify targets in Gaza with minimal human oversight, allegedly resulting in high civilian casualty rates. The system, designed to speed up target identification, reportedly led to significant errors and mass casualties.
Editor Notes: "The Gospel" is an AI-based decision support system used by the Israel Defense Forces (IDF) to recommend buildings and structures as bombing targets in Gaza. It works alongside another AI system called "Lavender," which generates a database of individuals linked to Hamas or PIJ militants. While "Lavender" identifies human targets, "The Gospel" focuses on selecting physical targets, significantly increasing the number of potential bombing sites and accelerating the targeting process.


Entities

Alleged: An AI system developed and deployed by Unit 8200 and the Israel Defense Forces harmed Palestinians and Gazans.

Incident Statistics

ID: 672
Report Count: 7
Incident Date: 2024-04-03
Editors: Daniel Atherton
Applied Taxonomies: MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.

7.3. Lack of capability or robustness

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.

7. AI system safety, failures, and limitations

Entity

Which, if any, entity is presented as the main cause of the risk.

Human

Timing

The stage in the AI lifecycle at which the risk is presented as occurring.

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.

Intentional

Incident Reports

Report Timeline


‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
972mag.com · 2024

In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author --- a m…

Israeli Military Using AI to Select Targets in Gaza With 'Rubber Stamp' From Human Operator: Report
yahoo.com · 2024

Israel has been using an artificial intelligence system called Lavender to create a “kill list” of at least 37,000 people in Gaza, according to a new report from Israel’s +972 magazine, confirmed by the Guardian. Lavender is the second AI s…

‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
theguardian.com · 2024

The Israeli military's bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in th…

Israel Defence Forces’ response to claims about use of ‘Lavender’ AI database in Gaza
theguardian.com · 2024
Post-incident response from The Guardian

IDF statement in response to an article about the use of the AI-powered database named Lavender in the bombardment of Gaza:

Some of the claims portrayed in your questions are baseless in fact, while others reflect a flawed understanding of …

Israel offers a glimpse into the terrifying world of military AI
washingtonpost.com · 2024

It's hard to concoct a more airy sobriquet than this one. A new report published by +972 magazine and Local Call indicates that Israel has allegedly used an AI-powered database to select suspected Hamas and other militant targets in the bes…

What War by A.I. Actually Looks Like
nytimes.com · 2024

In November the left-wing Israeli outlets +972 magazine and Local Call published a disturbing investigation by the journalist Yuval Abraham into the Israel Defense Forces' use of an artificial intelligence system for identifying targets in …

Inside Israel’s Bombing Campaign in Gaza
newyorker.com · 2024

Since the war began in Gaza, more than six months ago, the Israeli magazine +972 has published some of the most penetrating reporting on the Israel Defense Forces' conduct. In November, +972, along with the Hebrew publication Local Call, fo…

Variants

Une "Variante" est un incident qui partage les mêmes facteurs de causalité, produit des dommages similaires et implique les mêmes systèmes intelligents qu'un incident d'IA connu. Plutôt que d'indexer les variantes comme des incidents entièrement distincts, nous listons les variations d'incidents sous le premier incident similaire soumis à la base de données. Contrairement aux autres types de soumission à la base de données des incidents, les variantes ne sont pas tenues d'avoir des rapports en preuve externes à la base de données des incidents. En savoir plus sur le document de recherche.
