AI Incident Database

Report 2418

Related Incidents

Incident 434 · 8 Reports
Sudden Braking by Tesla Allegedly on Self-Driving Mode Caused Multi-Car Pileup in Tunnel

Tesla behind eight-vehicle crash was in ‘full self-driving’ mode, says driver
theguardian.com · 2022

The driver of a 2021 Tesla Model S told California authorities the vehicle was in “full self-driving mode” when the technology malfunctioned, causing an eight-vehicle crash on the San Francisco Bay Bridge last month.

The crash on Thanksgiving Day resulted in two juveniles being transported to hospital and led to lengthy delays on the bridge. The incident was made public in a police report on Wednesday.

It is the latest in a series of accidents blamed on Tesla technology. The electric automaker’s chief executive, Elon Musk, has heavily promoted “Full Self-Driving” (FSD) software, sold as a $15,000 add-on to Tesla vehicles, but it faces legal, regulatory and public scrutiny.

After the San Francisco accident, the driver told police the FSD software malfunctioned.

The police report said the vehicle was traveling at 55mph when it changed lanes and braked abruptly, slowing the car to about 20mph. That led to another vehicle hitting the Tesla and a chain reaction of crashes, according to Reuters.

However, police were unable to determine whether the software was in operation or whether the driver’s account was accurate. The report was made public after a records request.

The crash occurred hours after Musk said Tesla would make FSD software available to anyone in North America who requested it. It previously offered the system only to drivers with high safety scores.

The police report said that if FSD malfunctioned, the driver should have manually taken control. Tesla has repeatedly said its advanced self-driving technology requires “active driver supervision” and its vehicles “are not autonomous”.

Drivers are also warned when they install FSD that it “may do the wrong thing at the worst time”.

The National Highway Traffic Safety Administration (NHTSA), which is investigating Tesla after reports of braking “without warning, at random, and often repeatedly in a single drive”, did not immediately comment on the San Francisco crash.

Last summer, NHTSA upgraded the investigation to what it calls an engineering analysis. The chair of the National Transportation Safety Board, Jennifer Homendy, has questioned whether “full self-driving” is an accurate description of the technology – and said Tesla must do more to prevent misuse.
