AI Incident Database

Report 1209

Related Incidents

Incident 67 · 23 Reports
Sleeping Driver on Tesla AutoPilot

Drunk Tesla Driver Tells Cops Autopilot Was in Charge
fortune.com · 2018

The California Highway Patrol (CHP) says a driver was found passed out in his Tesla with a very high blood alcohol content on San Francisco’s Bay Bridge on Friday. The driver, according to CHP, claimed the car had been “set on autopilot” in an apparent attempt to defend himself.

The highway patrol, seemingly unimpressed, arrested the unnamed driver, charged him with suspicion of driving under the influence, and towed his car, noting on Twitter that “no it didn’t drive itself to the tow yard.”

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL — CHP San Francisco (@CHPSanFrancisco) January 19, 2018

Tesla did not confirm that the driver had actually engaged the autopilot system, though it has in the past used driver data in accident investigations. The autopilot system is designed to get the driver's attention if it detects a challenging situation, and to bring the car to a stop if the driver does not respond.


At the most abstract level, the incident invites us to ask questions about driver responsibility in the age of autonomous vehicles. In the future, will it be okay for us to get in our cars while inebriated, and let them take us home?

Maybe. But for now, that hardly matters. Tesla's "autopilot" is not fully autonomous driving, though it can look like it for short stretches and under specific conditions. Tesla is clear that drivers using autopilot should remain alert and retain responsibility for their vehicle.

But Tesla drivers don’t always seem to get that message. Over-reliance on autopilot might have contributed to a 2016 fatality involving a distracted driver. The investigation following the crash concluded, in part, that Tesla didn’t have sufficient safeguards to ensure driver attention while using autopilot.

There is still no confirmation that autopilot was in fact engaged in the Friday incident, and since no accident or injuries apparently resulted, it is unlikely to lead to further official investigation into driver responsibility. It is concerning nonetheless in light of the ongoing rollout of Tesla's Model 3, which offers autopilot as an option. Tesla may still have work to do in educating its customers about the limitations of autopilot, or in implementing further controls to prevent drivers from misusing it.

This article has been updated to reflect communication with Tesla.


2024 - AI Incident Database