AI Incident Database

Report 1278

Related Incidents

Incident 712 · 7 Reports
Google admits its self driving car got it wrong: Bus crash was caused by software

Google autonomous SUV involved in serious crash after a van runs red light in Mountain View
bizjournals.com · 2016

A Lexus SUV with Google’s self-driving technology was involved in a serious crash on Friday after a human-operated vehicle ran a red light in Mountain View.

Video obtained by 9to5Google shows the Google-owned vehicle crossing the intersection at El Camino Real and Phyllis Avenue after the light had turned green for a full six seconds. A van marked "Interstate Batteries" then ran the red light. The Lexus was in self-driving mode.

All airbags deployed in the Lexus and there were no reported injuries. Google employees monitoring the autonomous car were visibly shaken, according to witnesses at the incident. The Google car suffered severe body damage and broken windows on its right side. It was towed away on a flatbed truck.

“Thousands of crashes happen every day on U.S. roads and red-light running is the leading cause of urban crashes in the U.S.,” Google said in a statement, per 9to5Google. “Human error plays a role in 94 percent of these crashes, which is why we’re developing fully self-driving technology to make our roads safer.”

Reports of crashes involving autonomous vehicles are becoming more frequent as self-driving cars hit public roads. Most of the accidents, however, are the result of human error. In July, Google’s self-driving car project experienced its first injury accident in Mountain View: three Google employees suffered whiplash when a Google self-driving SUV was rear-ended after coming to a stop at the intersection of Phyllis Avenue and Martens Avenue.

Google isn’t the only Silicon Valley company with self-driving safety concerns. Earlier this year, Tesla saw the first fatality involving one of its vehicles with the Autopilot feature engaged. Shortly after that accident, there was a non-fatal incident in China in which a Tesla operating on Autopilot sideswiped a Volkswagen parked on a Beijing highway. The incident did not result in any injuries, and Tesla said that the driver was not holding the steering wheel.

Last week, the Obama administration released its Federal Automated Vehicles policy, which detailed a 15-point Safety Assessment for self-driving car manufacturers. Under the new regulations, companies like Google and Tesla will be required to share vast amounts of data with federal regulators regarding the building and testing of self-driving cars.

The companies will have to provide details on how the cars operate, how they record data, crash information and how they guard against hacking. They will also need to provide answers on how a vehicle’s software will manage ethical situations. The government will publish the responses in an annual report.

Read the source
