AI Incident Database

Report 3094

Associated Incidents

Incident 540 · 5 Reports
Tesla Failed to Yield to Detected Pedestrian on Crosswalk, Reportedly Violated Traffic Law

Watch Tesla's FSD Beta 11.4.1 Ignore Pedestrian On Marked Crosswalk
insideevs.com · 2023

A short video of a Tesla with Full Self-Driving activated is doing the rounds online, adding more controversy to the ADAS that has received more than its fair share of criticism ever since its first deployment in October 2020.

The footage shared by Whole Mars Catalog on Twitter shows a Tesla with the latest FSD Beta 11.4.1. update driving in sunny weather with perfect visibility in a suburban area in California.

As the Tesla approached a pedestrian crossing at 28 mph, we can see that a pedestrian had already started to cross the road from the left when the car was roughly 50 yards away. It's worth noting that the crossing was well marked with signs on both sides of the road, in addition to having the traditional lines painted on the asphalt.
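For context, a back-of-the-envelope check using the figures above (28 mph, roughly 50 yards) suggests the car had several seconds and ample distance in which to stop. The gentle-braking deceleration of 3 m/s² used below is an assumed typical value, not a figure from the article:

```python
# Back-of-the-envelope check using the figures reported in the article.
# The speed (28 mph) and distance (~50 yards) come from the video description;
# the comfortable-braking deceleration (3 m/s^2) is an assumed typical value.

MPH_TO_MS = 0.44704   # metres per second in one mile per hour
YARD_TO_M = 0.9144    # metres in one yard

speed = 28 * MPH_TO_MS        # ~12.5 m/s
distance = 50 * YARD_TO_M     # ~45.7 m
comfortable_decel = 3.0       # m/s^2, assumed gentle braking rate

time_to_crosswalk = speed and distance / speed          # seconds until arrival
stopping_distance = speed**2 / (2 * comfortable_decel)  # v^2 / (2a)

print(f"Time to reach crosswalk: {time_to_crosswalk:.1f} s")
print(f"Distance needed to stop gently: {stopping_distance:.1f} m "
      f"of {distance:.1f} m available")
```

On these assumed numbers the car would reach the crosswalk in roughly 3.7 seconds and could have come to a gentle stop in well under the available distance, which is consistent with the article's point that yielding was entirely feasible.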

So what did the Tesla do? It "saw" the pedestrian and hesitated briefly – notice that the speed dipped at one point – but instead of applying the brakes it kept driving, passing by just as the pedestrian had almost reached the middle of the crossing.

YouTuber Whole Mars Catalog appeared to be thrilled with how FSD Beta proceeded, though.

"One of the most bullish / exciting things I've seen on Tesla Full Self-Driving Beta 11.4.1. It detected the pedestrian, but rather than slamming on the brakes it just proceeded through like a human would knowing there was enough time to do so," he tweeted.

As Jalopnik pointed out, that's not at all exciting, it's clearly dangerous behavior – and an obvious violation of traffic laws. A driver approaching a marked crosswalk is required to yield to pedestrians, and the fact the Tesla didn't even attempt to stop is a big problem – especially since the car's display shows the system detected the pedestrian.

In a series of replies to his original tweet, Whole Mars Catalog argued the Tesla "started slowing a little bit in case it decided it needed to brake but saw it was safe to proceed."

That's clearly not a get-out-of-jail-free card, and the fact that no one was hurt in this instance does not mean it was the right thing to do. Nor does the fact that some human drivers sometimes behave this way mean FSD Beta should imitate dangerous human behavior.

It's pretty clear that the video exposes a major flaw in the FSD Beta system, and some of the replies in the Twitter thread also reveal that some Tesla fans believe ignoring the rules of the road is not that big of a deal. Nothing could be further from the truth.
