AI Incident Database

Report 200

Related Incidents

Incident 321 · 14 Reports
Tesla Model X on Autopilot Crashed into California Highway Barrier, Killing Driver

Tesla car that crashed and killed driver was running on Autopilot, firm says
theguardian.com · 2018

Tesla has said a car that crashed in California last week, killing its driver, was operating on Autopilot.

The 23 March crash on highway 101 in Mountain View is the latest accident to involve self-driving technology. Earlier this month, a self-driving Volvo SUV that was being tested by the ride-hailing service Uber struck and killed a pedestrian in Arizona.

Federal investigators are looking into the California crash, as well as a January crash of a Tesla Model S that may have been operating under the Autopilot system.

In a blogpost, Tesla said the driver of the sport-utility Model X that crashed in Mountain View, 38-year-old Apple software engineer Wei Huang, “had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.

“The driver had about five seconds and 150 meters of unobstructed view of the concrete divider … but the vehicle logs show that no action was taken.”

Tesla also said the concrete highway divider had previously been damaged, increasing its impact on the car. The vehicle also caught fire, though Tesla said no one was in the vehicle when that happened.

The company said its Autopilot feature can keep speed, change lanes and self-park but requires drivers to keep their eyes on the road and hands on the wheel, in order to be able to take control and avoid accidents.

Autopilot does not prevent all accidents, Tesla said, but it does make them less likely.

“No one knows about the accidents that didn’t happen,” Tesla said, “only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe.

“There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year.”
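Taken at face value, Tesla's figures imply a roughly 72% reduction in fatalities, i.e. a claimed safety level about 3.6 times better than the worldwide average. A minimal sketch of that arithmetic (the inputs are Tesla's quoted claims, not independently verified statistics):

```python
# Figures as quoted in the article; the per-year framing is Tesla's claim.
worldwide_deaths = 1_250_000   # "about 1.25 million automotive deaths worldwide"
claimed_saved = 900_000        # "about 900,000 lives saved per year"

# Deaths that would remain if every vehicle matched the claimed safety level
implied_remaining = worldwide_deaths - claimed_saved      # 350,000

reduction = claimed_saved / worldwide_deaths              # fraction of deaths avoided
safety_multiple = worldwide_deaths / implied_remaining    # how many times safer

print(f"Implied reduction: {reduction:.0%}")              # Implied reduction: 72%
print(f"Implied safety multiple: {safety_multiple:.2f}x") # 3.57x
```

The comparison assumes Tesla's fleet-level fatality rate can be extrapolated to all driving conditions worldwide, which critics of the claim have disputed.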

The company added that it “care[s] deeply for and feel[s] indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.

“None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”

