AI Incident Database

Report 2681

Related Incidents

Incident 478 · 13 Reports
Tesla FSD Reportedly Increased Crash Risk, Prompting Recall

Tesla’s Full Self-Driving Recall Targets a 'Fundamental' Flaw
wired.com · 2023

After years selling its controversial Full Self-Driving software upgrade for thousands of dollars, Tesla today issued a recall for every one of the nearly 363,000 vehicles using the feature. The move was prompted by a US government agency saying the software had in “rare circumstances” put drivers in danger and could increase the risk of a crash in everyday situations.

Recalls are common in the auto industry and mostly target particular parts or road situations. Tesla’s latest recall is sweeping, with the National Highway Traffic Safety Administration saying the Full Self-Driving software can break local traffic laws and act in a way the driver doesn’t expect in a grab bag of road situations.

According to the agency’s filing, those include driving through a yellow light on the verge of turning red; not properly stopping at a stop sign; speeding, due to failing to detect a road sign or because the driver has set their car to default to a faster speed; and making unexpected lane changes to move out of turn-only lanes when going straight through an intersection. Drivers will be able to continue to use the feature as Tesla builds a software patch for the defects.

The situations highlighted by the recall appear to be united by a design flaw that some safety experts argue has long been at the heart of Tesla’s driver assistance technology: the notion that drivers can let the software handle the driving—but are also expected to intervene at a moment’s notice when the software needs help.

Humans do not work that way, says Philip Koopman, who studies self-driving car safety as an associate professor at Carnegie Mellon University. “That’s a fundamental issue with this technology: You have a short reaction time to avoid these situations, and people aren’t good at that if they’re trained to think that the car does the right thing,” he says. The car is designed to buzz and beep when it determines that the human driver needs to take over.

Today’s recall shows that the US government is “dipping its toe in the water” when it comes to setting firmer limits on not only Tesla’s ambitious technology, but all automakers’ advanced driver assistance features, Koopman says. These features are meant to make driving more fun, less tedious, and safer, but they also require carmakers to make tricky decisions around the limits of human attention and how to market and explain their technology’s capabilities.

Tesla’s approach has been unique. Led by CEO Elon Musk, it has bucked government scrutiny, criticized lawmakers, and in some cases built technology faster than regulators could regulate. “This is an interesting exercise in NHTSA figuring out how to use its authority with Tesla,” Koopman says.

A statement provided by NHTSA spokesperson Lucia Sanchez said that the agency detected the issues cited in the new recall through analyses related to an investigation opened in 2022. The probe looked into why vehicles using Tesla’s Autopilot feature have a history of colliding with stationary first responder vehicles.

Read Source

2024 - AI Incident Database
