AI Incident Database

Report 2417

Related Incidents

Incident 434 · 8 Reports
Sudden Braking by Tesla Allegedly on Self-Driving Mode Caused Multi-Car Pileup in Tunnel

Tesla ‘full self-driving’ triggered an eight-car crash, a driver tells police
cnn.com · 2022

A driver told authorities that their Tesla’s “full self-driving” software braked unexpectedly and triggered an eight-car pileup in the San Francisco Bay Area last month that led to nine people being treated for minor injuries, including one juvenile who was hospitalized, according to a California Highway Patrol traffic crash report.

CNN Business obtained the report detailing the crash through a public records request Wednesday. California Highway Patrol reviewed videos that show the Tesla vehicle changing lanes and slowing to a stop.

California Highway Patrol said in the Dec. 7 report that it could not confirm if “full self-driving” was active at the time of the crash. A highway patrol spokesperson told CNN Business on Wednesday that it would not determine if “full self-driving” was active, and Tesla would have that information.

The crash occurred around lunchtime on Thanksgiving, snarling traffic on Interstate 80 east of the Bay Bridge as many people traveled to holiday events; two lanes of traffic were closed for about 90 minutes. Four ambulances were called to the scene.

The pileup took place just hours after Tesla CEO Elon Musk had announced that Tesla’s driver-assist software “full self-driving” was available to anyone in North America who requested it. Tesla had previously restricted access to drivers with high safety scores on its rating system.

“Full self-driving” is designed to keep up with traffic, steer in the lane and abide by traffic signals. It requires an attentive human driver prepared to take full control of the car at any moment. It’s delighted some drivers but also alarmed others with its limitations. Drivers are warned by Tesla when they install “full self-driving” that it “may do the wrong thing at the worst time.”

The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately caused eight vehicles to crash, all of which had been traveling at typical highway speeds.

Tesla’s driver-assist technologies, Autopilot and “full self-driving,” are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.”

The agency has received hundreds of complaints from Tesla drivers. Some have described near crashes and concerns about their safety. This summer NHTSA upgraded the investigation to what it calls an engineering analysis, an indication that it’s seriously considering a recall.

NHTSA told CNN Business a few days after the Thanksgiving Day crash that it was gathering additional information from Tesla and law enforcement about the crash.

Read Source
