AI Incident Database

Report 420

Associated Incidents

Incident 293 Report
Image Classification of Battle Tanks

Tales from the Trenches: AI Disaster Stories (GDC talk)
lobste.rs · 2016

His team was working on running simulations of long-distance manned spaceflight. In particular, the goal of their simulations was to find an algorithm that would optimally allocate food, water, and electricity to 3 crew members. They decided to try a genetic algorithm, with the success criterion being that one or more crew members would survive for as many days as possible before resources ran out.

It started off fairly predictably: 300 days, 350 days, 375 days of survival. Then, fairly abruptly, the algorithm shot up to around 900 days of survival. The team couldn't believe it! They had been fairly pleased with the 375-day results as it was.

As they started digging into how this new algorithm worked, they discovered a small problem. The algorithm had arrived at a solution wherein it would immediately withhold food and water from two of the crew members, causing them to die of starvation and dehydration. From there, it would simply redirect the surplus resources to the surviving crew member.

The team realised that the success criterion of "one or more crew members would survive for as long as possible" was not the criterion they actually wanted. Once they adjusted the algorithm to keep all of the crew alive, it settled back in at around 350 days' worth of resources.
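The mismatch between the two objectives can be sketched in a few lines of Python. This is a toy model, not the team's actual simulator: it assumes a fixed pool of daily rations and crew members who survive one day per ration they receive.

```python
# Toy model of the mis-specified fitness function described above.
# Assumptions (not from the talk): POOL total rations, each crew
# member surviving one day per ration allocated to them.

def survival_days(shares, pool):
    """Days each crew member survives, given their fraction of the pool."""
    return [round(share * pool) for share in shares]

POOL = 900  # enough rations to feed 3 crew for 300 days

fair = [1/3, 1/3, 1/3]        # feed everyone equally
degenerate = [1.0, 0.0, 0.0]  # starve two, feed one

# Original criterion: one or more crew members survive as long as possible.
def fitness_any(shares):
    return max(survival_days(shares, POOL))

# Corrected criterion: the whole crew must survive as long as possible.
def fitness_all(shares):
    return min(survival_days(shares, POOL))

print(fitness_any(fair))        # 300
print(fitness_any(degenerate))  # 900 -- the "murderous" optimum wins
print(fitness_all(fair))        # 300
print(fitness_all(degenerate))  # 0   -- corrected objective rejects it
```

Under the original `max`-style criterion the degenerate allocation scores highest, so a genetic algorithm will converge on it; switching to a `min`-style criterion makes starving any crew member score zero.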

It’s often the simple underlying assumptions that distinguish murderous spaceships from spaceships that keep their crew alive a little longer in extreme conditions.

