AI Incident Database

Report 2929

Associated Incidents

Incident 52 · 53 Reports
Tesla Vehicle Running on Self-Driving Mode Crashes on City Streets

Tesla's Autopilot mode is on trial in California
qz.com · 2023

The first trial over a claim that robotic driving technology endangered human life is underway in a California court. The case concerns Tesla's Autopilot software, which allegedly caused an accident on a city road in 2019.

According to Reuters, the plaintiff in the case is Los Angeles resident Justine Hsu, who first sued Tesla in 2020 after her Model S, operating in the semi-autonomous mode, swerved into a barrier. She says in court filings that her airbag deployed with so much force that it "knocked out teeth, and caused nerve damage to her face" and broke her jaw. Hsu claims the airbag system, and the entire design of the Autopilot system that Tesla launched in 2015, was flawed. She is seeking more than $3 million in damages.

The electric car maker has denied any wrongdoing. It defends itself in part by pointing out that Hsu activated Autopilot on a city street, despite the car’s user manual warning against that. Tesla maintains that its cars are not fully autonomous, and that drivers should be ready “to take over at any moment.”

Tesla attorney Michael Carey claims that Hsu had time to brake the vehicle, yet still drove straight into the barrier. “The evidence proving distraction is pretty straightforward,” he said.

Autopilot’s safety record

Tesla vehicles running on the Autopilot software were involved in 273 crashes in 2021, according to data from the National Highway Traffic Safety Administration. That means Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems during that year.
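As a quick sanity check on that share, the article's figures of 273 Tesla crashes out of 392 total driver-assistance crashes do work out to roughly 70 percent:

```python
# Crash counts cited in the article from NHTSA data on
# advanced driver-assistance systems in 2021.
tesla_crashes = 273
total_adas_crashes = 392

share = tesla_crashes / total_adas_crashes
print(f"{share:.1%}")  # 69.6%, i.e. "nearly 70 percent"
```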

Tesla CEO Elon Musk has long promoted Tesla's "Full Self-Driving" (FSD) software, selling it as a $15,000 add-on to the company's vehicles. Automation is a major part of the company's plans for future revenue growth, so investors are likely to monitor the outcome of the trial closely. The company's shares dropped by 8% when the incident was reported in 2019.

While flaws in Tesla's Autopilot have previously been linked to deaths around the world, none of those cases has gone to trial, making the outcome of the San Francisco case a critical point in how robotic car software will be designed in the future, and a major precedent for similar trials.


2024 - AI Incident Database