AI Incident Database

Report 3093

Associated Incidents

Incident 540 · 5 Reports
Tesla Failed to Yield to Detected Pedestrian on Crosswalk, Reportedly Violated Traffic Law

No, It's Not Amazing That A Tesla Using FSD Blew Through A Crosswalk
jalopnik.com · 2023

Recently, a Tesla fan account shared a video of a Tesla using so-called Full Self-Driving on a two-lane road. As the video plays, you can see the car approach a marked crosswalk. On the center screen, the car’s sensors detect a pedestrian crossing, but unlike the cars traveling in the opposite direction, the Tesla doesn’t stop and yield to the pedestrian. Instead, it continues driving as normal.

“One of the most bullish / exciting things I’ve seen on Tesla Full Self-Driving Beta 11.4.1. It detected the pedestrian, but rather than slamming on the brakes it just proceeded through like a human would knowing there was enough time to do so,” they wrote.

Sorry, but that’s not remotely exciting or amazing in any way. And while it may be what some humans do, it’s also against the law. A pedestrian crossing at a marked crosswalk has the right of way, and you’re supposed to yield to them. The fact that the Tesla didn’t stop is a major problem. Why doesn’t the software know to yield to pedestrians?

Is that just too complex a problem to solve yet? Is it because Tesla thinks it knows better than the people who make the laws? Either way, it’s clear that FSD shouldn’t be allowed to operate in areas where pedestrians may be present.

The fact that Tesla fans look at that video and see amazing technology rather than a major flaw in FSD is also terrifying, even if it’s not exactly surprising. They seem to think that Teslas should be allowed to break the law if the car thinks it can do so safely. In this one instance, no one got hurt, but that’s not the point. The fact that you’re a special fancy person in a special fancy car doesn’t give you the right to ignore the rules of the road. And it’s not as if FSD has a perfect track record for safety.

At a time when pedestrian deaths continue to rise, we need to be doing more to keep people safe, not cheering on irresponsible uses of half-baked technology.
