AI Incident Database

Report 805

Associated Incidents

Incident 452: Defamation via AutoComplete

Defamation Case Attacks Google Autocomplete Results
dmlp.org · 2012

Google searches employ two features: autocomplete and Google Instant. These work together to complete your search terms and to load search results automatically while you're typing. While you're probably thankful for the few seconds this saves, or the way it triggers a connection you couldn't recall, Bettina Wulff (wife of former German President Christian Wulff) would be unlikely to agree with you these days. Type Wulff's name into Google, and the first autocomplete suggestions you'll see are "Bettina Wulff escort" and "Bettina Wulff prostituierte" (German for "prostitute"). Wulff is now suing Google for defamation, along with German TV host Günther Jauch and over 30 bloggers and media outlets. Wulff's suit against Google focuses on the results of this autocomplete feature.

Years ago, rumors began that Bettina Wulff had a former career as a prostitute named "Lady Victoria," possibly intended to damage Christian Wulff's political career. Wulff denies these rumors in her new biography, and asserts that they have harmed her reputation and family life. Wulff has filed a defamation suit in the Hamburg District Court to force Google to remove these false, damaging terms from the results of its autocomplete function. So far, Google has not removed the suggested terms; according to Spiegel Online, the company denies responsibility and claims that the products of the autocomplete function are driven by an algorithm relying on, among other things, popular search terms selected by users.

Without going into too much detail about the search algorithms, suggested autocomplete searches are all real searches performed by Google users. The algorithm weighs popularity foremost, but also considers geography, relevance, and your prior search history, among other objective factors, when producing suggestions. Google's autocomplete feature and Google Instant thus consider more than popularity alone, and there are times when Google's algorithms limit search results or alter their ranking to reflect policy considerations.
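The ranking the article describes can be sketched as a weighted blend of those factors. The following is a purely illustrative toy model, not Google's actual implementation; the function name, weights, and data structures are all invented here for illustration.

```python
# Toy sketch (NOT Google's code) of ranking autocomplete candidates by a
# weighted blend of the factors the article lists: global popularity,
# geographic relevance, and the user's own search history.

def rank_suggestions(prefix, query_log, user_history, region_counts, region,
                     w_pop=1.0, w_geo=0.5, w_hist=2.0):
    """Return candidate completions for `prefix`, best-scored first."""
    candidates = [q for q in query_log if q.startswith(prefix)]

    def score(q):
        popularity = query_log[q]                # global search volume
        geo = region_counts.get((q, region), 0)  # searches from this region
        personal = user_history.get(q, 0)        # this user's own searches
        return w_pop * popularity + w_geo * geo + w_hist * personal

    return sorted(candidates, key=score, reverse=True)

query_log = {"berlin weather": 900, "berlin wall": 700, "berlin marathon": 300}
print(rank_suggestions(
    "berlin", query_log,
    user_history={"berlin marathon": 5},
    region_counts={("berlin wall", "DE"): 500},
    region="DE",
))
# → ['berlin wall', 'berlin weather', 'berlin marathon']
```

Note how regional and personal signals can promote a globally less popular query, which is why two users typing the same prefix may see different suggestions.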

Thus, if Google chose to alter the results of a search for Bettina Wulff, it would not be the first time Google has censored or altered Autocomplete results for policy reasons. For example, pressure from the entertainment industry and government officials has impacted Google's searches. In an attempt to fight piracy, Google's search function will not complete words such as "bittorrent", "torrent" and "rapidshare." Similarly, up until recently, Google had blocked the term "bisexual" from autocomplete search results, only removing "bisexual" from its banned words after a campaign by BiNet and other advocacy groups. Should false speech, particularly that injurious to its subject, be its next consideration?
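The policy-driven suppression described above — dropping completions that contain blocked terms such as "torrent" or "rapidshare" — amounts to a simple blocklist filter. The sketch below is invented for illustration and is not Google's actual code; only the blocked terms themselves come from the article.

```python
# Illustrative blocklist filter (not Google's implementation): candidate
# completions containing any blocked word are dropped before the
# suggestion list is shown to the user.

BLOCKED_TERMS = {"bittorrent", "torrent", "rapidshare"}  # terms named in the article

def filter_suggestions(candidates, blocked=BLOCKED_TERMS):
    """Keep only completions that contain no blocked word."""
    return [c for c in candidates if not set(c.lower().split()) & blocked]

print(filter_suggestions(
    ["ubuntu torrent", "ubuntu download", "ubuntu iso torrent", "ubuntu iso"]
))
# → ['ubuntu download', 'ubuntu iso']
```

A word-level blocklist like this is blunt: it suppresses every completion containing the term, regardless of context, which is exactly the kind of editorial judgment the article argues Google already exercises.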

Bettina Wulff is not the first individual to sue Google over false autocomplete results, and there is some precedent for finding defamation arising out of search results. Earlier this year in Japan, a man sued Google after its autocomplete results linked him to a crime he had never committed, allegedly resulting in irretrievable damage. The Japanese court ordered Google to delete the identified "false" autocomplete terms. Google was also fined last year when autocomplete suggested "crook" after the name of a French insurance company, and an Italian court ordered Google to filter out libelous search results that falsely suggested fraud. It would not be surprising, then, if a case tried in Germany, where defamation is codified in the criminal code, follows the lead of these other nations.

As in many other situations involving liability for content, this case highlights American exceptionalism in the realm of free speech. The United States takes a very speech-protective approach to libel, particularly with respect to public figures like Wulff. While the specific elements of defamation vary slightly from state to state, to establish liability for defamation in the case of public figures, the injured party must prove that the speaker acted with actual malice, i.e., knowledge of falsity or a high degree of awareness of probable falsity.

Under this rigorous standard, it seems unlikely that this case or a similar defamation case would succeed in a U.S. court. Google, in devising its algorithm, did not knowingly intend to publish false statements of fact. At most, a court might determine that Google failed to take adequate precautions against defamatory combinations of words popping up, but "actual malice" depends on knowledge of falsehood and not objective perceptions of negligence. Moreover, even under a negligence standard it might be unreasonable to expect Google to police for truthfulness every potential autocomplete suggestion resulting from an inquiry on its search engine.

In addition, popularity is the primary factor considered in producing search results, and with regards to this element, Google is not itself speaking in a traditional manner but rather collecting and presenting the speech of others. As such, Google would li

Read the Source
