AI Incident Database

Report 1205

Police chased the 'unresponsive' driver of a Tesla S that was on Autopilot for 7 miles in California. How can that happen?
abcnews.go.com · 2018

In the early morning hours, California Highway Patrol chased a grey Tesla S for an unfathomable seven miles down Highway 101 as the driver slept, police said.

Redwood City Area CHP officers said they observed Alexander Joseph Samek, a local Los Altos politician, driving at around 3:30 a.m. PST on Nov. 30. Police followed Samek with lights and siren on, but he remained “unresponsive,” and “appeared to be asleep at the wheel,” according to the arrest report.

Assuming that the car was on Autopilot, police drove in front of Samek and "began slowing directly in front of the Tesla in hopes that the ‘driver assist’ feature had been activated and the Tesla would slow to a stop as the patrol vehicle came to a stop," the arrest report said. Samek was charged on suspicion of driving under the influence.

But what is befuddling transportation analysts and Tesla watchers is that the chase could even go on for that long. Tesla's "Autopilot" feature requires a driver to touch the steering wheel every minute, or the system alerts the driver and gradually brings the car to a stop. It seems that in this case, Autopilot may not have worked, or the driver somehow subverted the process, experts say.

(Artur Widak/NurPhoto via Getty Images) The wheel of a Tesla Model S P100D, May 8, 2018.

Tesla declined to comment on the accident or confirm the car was in Autopilot mode. But on Sunday night, Musk tweeted: "Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here."

Exactly. Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here. — Elon Musk (@elonmusk) December 3, 2018

In a follow-up tweet, Musk said that Autopilot could not distinguish between different types of emergency vehicles, but that it would be able to in the near future. "We’re adding police car, fire truck & ambulance to the Tesla neural net in coming months," he wrote.

We’re adding police car, fire truck & ambulance to the Tesla neural net in coming months — Elon Musk (@elonmusk) December 3, 2018

Redwood City CHP is familiar with the Tesla Autopilot feature in part because of a fatal crash the agency investigated in March. A 38-year-old engineer at Apple died after he did not place his hands on the wheel in time when the car was in Autopilot mode, Tesla said.

The March crash is being investigated by the National Transportation Safety Board.

Dan Edmunds, director of vehicle testing at Edmunds, an automotive research firm, has been reviewing partially automated vehicles, and called Tesla’s Autopilot a misleading term for an "overhyped automated cruise control system." He said it was difficult to come up with an explanation for such a long car chase, and it underscored shortcomings with Tesla's safety features.

"Certainly somebody could defeat the one-minute timeout that allows you to put your hands on the wheel and the car could go longer," Edmunds told ABC News. "Cadillac's Super Cruise system would not have allowed you to behave this way because Super Cruise does something that Tesla doesn't do and should do. It has sensors that look at your head to see which way it's pointed to make sure your chin's up and not down against your shirt, and also looks at your eyeballs to see where they're looking. So even if your head's up, and you look off to the side, it will warn you and eventually disengage."

"The fact that it doesn't monitor the driver's head position and line of sight is really a major shortcoming," Edmunds said. "Just because somebody has their hands on the wheel, maybe the guy's leaning on it, passed out, with just enough force to make it think that he's got his hands on the wheel. The car isn't really sure what the driver's looking at. It doesn't matter if you have your hands on the wheel or not, it matters if you're looking out the windshield at the cars ahead."

