AI Incident Database

Report 212

Related Incidents

Incident 321 · 14 Reports
Tesla Model X on Autopilot Crashed into California Highway Barrier, Killing Driver
Tesla in fatal California crash was on Autopilot
bbc.com · 2018

Electric carmaker Tesla says a vehicle involved in a fatal crash in California was in Autopilot mode, raising further questions about the safety of self-driving technology.

One of the company's Model X cars crashed into a roadside barrier and caught fire on 23 March.

Tesla says Autopilot was engaged at the time of the accident involving the driver, 38, who died soon afterwards.

But it did not say whether the system had detected the concrete barrier.

"The driver had received several visual and one audible hands-on warning earlier in the drive," a statement on the company's website said.

"The driver's hands were not detected on the wheel for six seconds prior to the collision."

"The driver had about five seconds and 150m (490ft) of unobstructed view of the concrete divider... but the vehicle logs show that no action was taken," the statement added.

Tesla's Autopilot system does some of the things a fully autonomous vehicle can do: it can brake, accelerate and steer by itself under certain conditions. However, it is classified as a driver-assistance system and is not intended to operate independently, so the driver is meant to keep their hands on the wheel at all times.

In 2016, a Tesla driver was killed in Florida when his car failed to spot a lorry crossing its path.

It led the company to introduce new safety measures, including turning off Autopilot and bringing the car to a halt if the driver lets go of the wheel for too long.

The accident in California comes at a difficult time for self-driving technology.

Earlier this month, Uber was forbidden from resuming self-driving tests in the US state of Arizona.

It followed a fatal crash in the state in which an autonomous vehicle hit a woman who was walking her bike across the road.

It was thought to be the first time an autonomous car had been involved in a fatal collision with a pedestrian.

The company suspended all self-driving tests in North America after the accident.
