AI Incident Database

Report 2953

Related Incidents

Incident 533 · 2 Reports
Tesla FSD Misidentified Truck Hauling Traffic Lights as Trail of Traffic Lights

Watch Tesla Autopilot Get Bamboozled by a Truck Hauling Traffic Lights
futurism.com · 2021

A Tesla Model 3 owner encountered an unusual glitch while using the Autopilot assisted driving system on the highway: the car seemed to detect an endless trail of traffic lights all the way down the road as it traveled at upwards of 80 mph.

In video footage of the car's display that the driver uploaded to Reddit, it looks like traffic lights are being blasted out of the truck in front of them, making the drive look like a car-themed "bullet hell" style video game.

After much speculation among other redditors, the author posted a follow-up video revealing that they had been driving behind a truck hauling deactivated traffic lights. It's a funny-looking glitch, to be sure, but the system's inability to figure out what's going on shows how astoundingly difficult it is to prepare autonomous driving systems for the incredible range of edge cases they might encounter in the real world.

Safety First

On one hand, it's good that the assisted driving system was able to repeatedly recognize that it was, in fact, staring at traffic lights. And the car never seemed to try to screech to a halt as if it had encountered a red light, since a maneuver like that could have proven disastrous.

However, much like a similar glitch where a Tesla mistook a stop sign printed on a billboard for the real thing, the fact that the system couldn't piece together the context of the situation is still an issue. The Tesla's failure to realize that the lights were cargo rather than signals installed in the middle of the highway is a clear sign that Tesla isn't ready for full autonomy, no matter how many times CEO Elon Musk says so.

"I guess this scenario was probably not part of the system's training data," University of Birmingham and MIT mathematician Max Little said on Twitter. "A good illustration of how it will likely be impossible to reach full driving autonomy just by recording 'more data.'"
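The "context" the system missed can be made concrete with a toy filter. The sketch below is purely illustrative and has nothing to do with Tesla's actual perception stack: it assumes hypothetical distance readings for a detected light over two frames, and uses the fact that a light installed beside the road is stationary in the world frame (so the ego car closes on it at roughly its own speed), while a light carried as cargo on a lead truck keeps a roughly constant distance.

```python
# Toy sketch (NOT Tesla's system): distinguish fixed infrastructure from
# cargo by estimating a detection's world-frame speed from two ranges.

def world_speed(dist_t0, dist_t1, ego_speed, dt):
    """Estimated world-frame speed (m/s) of a detected object."""
    closing_speed = (dist_t0 - dist_t1) / dt   # how fast we approach it
    return ego_speed - closing_speed           # ~0 for a fixed object

def is_fixed_infrastructure(dist_t0, dist_t1, ego_speed, dt, tol=2.0):
    """Treat a detection as an installed light only if it is roughly
    stationary in the world frame (within a tolerance in m/s)."""
    return abs(world_speed(dist_t0, dist_t1, ego_speed, dt)) < tol

ego = 35.0  # ego speed in m/s (~80 mph)

# A real roadside light 100 m ahead: after 1 s we are 35 m closer.
print(is_fixed_infrastructure(100.0, 65.0, ego, 1.0))  # True

# A light riding on a truck we follow at a constant 20 m gap.
print(is_fixed_infrastructure(20.0, 20.0, ego, 1.0))   # False
```

The point of the toy is Little's: a purely per-frame classifier has no way to make this distinction, because the cue is temporal and contextual rather than anything visible in a single image.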


2024 - AI Incident Database
