Citation record for Incident 45

Suggested citation format

Yampolskiy, Roman. (2011-04-05) Incident Number 45. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 45
Report Count: 29
Incident Date: 2011-04-05

CSET Taxonomy Classifications

Taxonomy Details

Full Description

Between 2011 and 2018, Google was sued in multiple countries on charges of defamation: its search engine's autocomplete feature suggested defamatory statements about businesses and people in China, Ireland, and Germany, and its image search associated an Australian man with the Melbourne criminal underworld.

Short Description

Google's autocomplete feature, along with its image search results, led to the defamation of people and businesses.

Severity

Negligible

Harm Type

Harm to civil liberties, Other: Reputational harm / social harm (libel and defamation)

AI System Description

Google's autocomplete function in its search engine suggests keywords based on aggregated data from similar past searches; Google Images' image classification model

System Developer

Google

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Machine learning, natural language processing model

AI Applications

recommendation engine, decision support, image recognition, forecasting

Location

Global

Named Entities

Google

Technology Purveyor

Google

Beginning Date

06/2011

Ending Date

06/2018

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Laws Implicated

Law of defamation in Australia, EU Electronic Commerce Directive, judgment of Ribeiro PJ in Oriental Press Group Ltd & anor v Fevaworks Solutions Ltd [2013] 5 HKC 253 in Hong Kong

Data Inputs

User Google search input, photos

Incidents Reports

Google has lost a case in Italy over the defamatory nature of autocomplete suggestions, according to a lawyer for the complainant.

On Tuesday, lead counsel Carlo Piana wrote on his blog that the Court of Milan has upheld its earlier decision to order Google to filter out libellous search suggestions. These are the suggestions that pop up in Google's search input bar, proposing what the user might be wanting to search for.

People searching via Google for Piana's client, who remains publicly unnamed, were apparently presented with autocomplete suggestions including truffatore ("con man") and truffa ("fraud").

The order (PDF, in Italian) is dated 31 March, although Piana only made its contents public on Tuesday. Google lost its bid to claim the protection of the E-Commerce Directive's safe-harbour provisions, which partly shield hosting providers and ISPs from liability for content held on or transmitted over their systems, because the court viewed the autocomplete suggestions as being produced by Google itself.

Content filter

"Google argued that it could not be held liable because it is a hosting provider, but we showed that this is content produced by them (and by the way, they do filter out certain content, including terms that are known to be used to distribute copyright-infringing material), although through automated means," Piana wrote.

The lawyer said the suit is "by no means an endorsement to censorship", as the allegations had been fully discussed with Google before the court action was even considered and only two phrases were put forward to be filtered out of autocomplete.

"All cases are different, therefore there is no assurance that similar cases would see the same outcome," Piana said. He added that this case had "caused a lot of trouble to the client, who has a public image both as an entrepreneur and provider of educational services in the field of personal finance".

In a statement on Tuesday, Google said it was "disappointed" by the Court of Milan's decision.

"We believe that Google should not be held liable for terms that appear in autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself," the company said. "We are currently reviewing our options."

This is not the first time Google has fallen foul of Italy's authorities. In February 2010, three Google executives were convicted in absentia over a video uploaded to the site, in which an autistic child was shown being bullied. In January this year, Italian authorities also forced Google to make concessions regarding Google News and AdSense, in order to close an antitrust investigation in the country....

Google loses autocomplete defamation case in Italy

Google Found Liable For Autocomplete Suggestions In Italy

from the oh-come-on dept

Here's yet another ridiculously bad ruling for search engines in Italy. Glyn Moody points us to the news of a blog post by a lawyer involved in the case (against Google) who is happy that his side prevailed and that Google is liable for search autocomplete suggestions. The case involved someone who was upset that doing a Google search on his name popped up "con man" ("truffatore") and "fraud" ("truffa") as autocomplete Google search suggestions. We've seen similar cases elsewhere, and France has (most of the time) also ruled against Google.

Of course, this is ridiculous for a variety of reasons. Google is not "creating" this content. It's suggesting results based on what users are searching. Clearly, people are searching on this particular individual along with the two terms. That's not Google's fault. Yet Google is liable for it?

One interesting footnote: part of the reason the court ruled the way it did was that it noted Google already edited autocomplete suggestions for issues related to copyright infringement. Funny. That's the issue we warned about when Google made the silly decision (following pressure from the US government) to start blocking certain keywords from autocomplete. The court seems to see this as proof that Google can and should be responsible for the content in that autocomplete box... Once again, it looks like the company would have been better off not meddling.


A Milan judge has found Google Italy guilty of defamation because of the way its search engine linked the name of an Italian businessman to the word "fraud" and has ordered the company to modify the operation of its Autocomplete service.

The ruling by Judge Roberto Bichi was published March 24 and rejects a Google appeal against an earlier Milan court ruling that upheld the complaint of a businessman, identified in press reports Tuesday only as "AB." Bichi also ordered the company to pay a total of €3,800 ($5,550) in costs and damages.

AB, an entrepreneur in the financial services sector who uses the Internet to promote his business, complained that Google's Suggest search/Autocomplete function linked his name to the words "fraud" and "fraudster" (truffa and truffatore).

The connection was particularly unfortunate since he was a trader and clients might naturally search for his name in connection with the word "trading." AB complained that the word "fraud" came up as a suggestion after his name even without the user beginning to type a further term and that Google failed to take corrective action when the problem was drawn to its attention by his lawyer.

Bichi and a panel of two other judges ruled that the association of the plaintiff's name with the word "fraud" was liable to lead users "to doubt the moral integrity of the individual" and "to suspect him of illicit conduct."

The fact that the links did not actually lead to defamatory information about AB was not a valid excuse for Google's conduct, the judges said. Many users would not bother to click on the links and would come away with a negative impression of AB, while there was no evidence to support Google's contention that Internet users were capable of discriminating intelligently about the information they found online, the judges said.

In its appeal against the earlier ruling, Google argued that it provided a neutral hosting service and that the selection of information was carried out automatically by its proprietary software, with no active human intervention.

The company also argued that if it intervened to prevent its users from having access to information posted by third parties it could open itself to complaints and requests for compensation.

Writing in his blog, Carlo Piana, the lead counsel for AB, said there was no question of the court ruling opening the door to censorship. His client had discussed the case with Google before going to court, and was seeking the elimination of only two search terms, Piana wrote.

There was no question of the ruling creating a precedent, Piana said. "All cases are different."

"Google argued that it could not be held liable because it is a hosting provider, but we showed that this is content produced by them, although through automated means. Therefore in this case the search engine cannot avail itself of the safe harbor provision of the [European Union's] e-commerce directive," Piana wrote.

Technology expert Guido Scorza took a contrary view, arguing in his blog that there was nothing defamatory about Google's conduct.

The search engine had simply registered the fact that a number of users had combined AB's name with the words "fraud" and "fraudster," Scorza wrote. "The suggestions merely recounted the history of other people's searches and made them available to new users," he said.

The Milan ruling follows similar controversies in France, Sweden and Brazil, and comes just over a year after three Google executives were handed suspended six-month prison sentences in Milan for allowing a video showing the bullying of a handicapped boy to be posted on Google Video.

It also has elements in common with a ruling by a Rome court last month that ordered Yahoo to remove links from its search engine that led to pirated copies of an Iranian film.

Google was disappointed by the court's decision because it failed to take account of the fact that Autocomplete was based on the search behaviors of prior users, the company said in a written statement. "For the moment we are considering all our options," it said....

Google guilty of defamation, Italian court rules

A popular Irish hotel has sued Google for defamation because Google’s autocomplete feature suggests to searchers that the hotel is in receivership.

Searchers looking for the Ballymascanlon Hotel — a four-star property that’s reportedly one of the most popular wedding venues in northeast Ireland and is not in financial trouble — see “ballymascanlon hotel receivership” as an autocomplete suggestion as soon as they’ve typed only eight letters of the hotel name. According to a recent Sunday Times article (quoted here by TJ McIntyre), some brides have contacted the hotel “in tears” after seeing the autocomplete suggestion, no doubt fearing that their wedding plans would have to be scrapped.

As Mark Collier writes, the hotel isn’t seeking punitive damages from Google; the suit only asks for an injunction to stop Google from showing the autocomplete suggestion about receivership, and for Google to pay the hotel’s legal fees.

Collier also details how the hotel made multiple attempts to contact Google about the issue and resolve it away from court – beginning with online channels and eventually escalating to attorney’s letters and even including the autocomplete problem in a DMCA complaint filed in March.

Previous Autocomplete Cases

Google has already faced similar complaints in other countries, and hasn’t fared well in the courts. The company lost two cases last year in France; see our articles Google Loses French Lawsuit Over Google Suggest and Google Convicted Again In France Over Google Suggest.

Earlier this year, Google also lost cases in Italy and Argentina.

How Autocomplete Works

Google has explained many times that autocomplete suggestions come from actual search activity. In Danny Sullivan's article, How Google Instant's Autocomplete Suggestions Work, the company commented on the Italian case I mentioned above:

We believe that Google should not be held liable for terms that appear in Autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself.

But Google’s argument that autocomplete suggestions are algorithmic doesn’t seem to stand up to legal scrutiny, perhaps because the company has manually removed piracy-related terms in the past, and its help pages list other cases — pornography, violence, hate speech, etc. — where suggestions will be removed.

I’m certainly not a lawyer, nor do I play one on Search Engine Land. So, whether that happens again in Ireland is anyone’s guess at this point.

(Thanks to Mark Collier for tipping us to this story. If you have news tips to share, please contact us.)...

Irish Hotel Sues Google For Defamation Over Autocomplete Suggestion

A new lawsuit alleges that Google's search engine has an anti-Semitism problem.

French anti-discrimination organization SOS Racisme, in association with the Union of Jewish Students of France, the Movement Against Racism and for Friendship Among Peoples and other organizations, is suing Google because its autocomplete feature suggests the word "Jewish" in searches involving certain public figures, including News Corporation chairman Rupert Murdoch and actor Jon Hamm, reports The Times of Israel.

Indeed, querying the search engine for "Jon Hamm," for example, yields "Jon Hamm Jewish" as one of the top results.

According to Google's website, its algorithm for the Google Instant autocomplete feature "predicts and displays search queries based on other users' search activities and the contents of web pages indexed by Google." In addition, the search engine says it strives to "reflect the diversity of content on the web (some good, some objectionable)" and so has a narrow set of removal policies for pornography, violence, hate speech, etc. -- though not narrow enough for SOS Racisme, it seems.

A lawyer for SOS Racisme, Patrick Klugman, told Agence France Presse (AFP) that Google's autocomplete algorithms have resulted in "the creation of what is probably the biggest Jewish file in history," according to The Times of Israel. As an "ethnic file," this compilation is outlawed in the country.

Local reports pointed out by The Hollywood Reporter explain that the plaintiffs contend users of Google in France and across the world are systematically confronted with the unsolicited association of the term "Jew" with prominent names in the world of politics, media, and business. A hearing for the lawsuit is scheduled for Wednesday.

The Hollywood Reporter also writes that the last lawsuit Google saw in France due to its autocomplete feature occurred in 2009, when two French companies sued the search engine because its autocomplete feature suggested the French word for "scam" in searches for said companies' names.

Just over a month ago, a man in Japan won an injunction against Google to have the autocomplete feature turned off when someone searched the man's name. Apparently, the search engine was connecting the man's name with crimes he had not committed and, according to Japan Times, "likely played a role in the sudden loss of his job several years ago and caused several companies to subsequently reject him when he applied for new jobs."...

Google Instant's Allegedly 'Anti-Semitic' Results Lead To Lawsuit In France

Google has been ordered to disable part of its autocomplete function in Japan after complaints it violates privacy.

An unidentified man took the search giant to court over concerns that typing in his name linked him with crimes he was not involved with.

Lawyer Hiroyuki Tomita said the effect on the man's reputation has meant he has found it hard to find work.

Google has so far not carried out the court's request - but said it was "reviewing the order".

"A Japanese court issued a provisional order requesting Google to delete specific terms from autocomplete," the company said in a statement on Monday.

"The judge did not require Google to completely suspend the autocomplete function."

Autocomplete is a function on many of Google's search services which uses a mixture of algorithms and stored user data to predict what a person is searching for.

For example, a search for BBC will automatically suggest "news", "iPlayer" and "weather".
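The mechanism described above, which ranks past user queries matching a typed prefix by popularity, can be illustrated with a minimal sketch. The query log and the `autocomplete` helper below are hypothetical, not Google's actual implementation:

```python
from collections import Counter

# Hypothetical log of queries previously typed by users.
query_log = [
    "bbc news", "bbc news", "bbc news",
    "bbc iplayer", "bbc iplayer",
    "bbc weather",
]

def autocomplete(prefix, log, k=3):
    """Return up to k past queries starting with the prefix, most popular first."""
    counts = Counter(q for q in log if q.startswith(prefix.lower()))
    return [query for query, _ in counts.most_common(k)]

print(autocomplete("bbc", query_log))  # ['bbc news', 'bbc iplayer', 'bbc weather']
```

The key point for the legal disputes in these reports is visible in the sketch: every suggestion is a string some user actually searched, surfaced purely because of its frequency.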

'Irretrievable damage'

Mr Tomita said a search for his client's name would wrongly associate him with crimes committed by a man with the same name.

He argued that the autocomplete feature as a whole was problematic as it directed users to potentially false or misleading information.

"It could lead to irretrievable damage such as a loss of job or bankruptcy just by showing search results that constitute defamation or a violation of the privacy of an individual person or small and medium-sized companies," Mr Tomita told Japanese news agency Kyodo.

Google defended the system, arguing that as results were generated mechanically - rather than by an individual - it was not an invasion of privacy.

"These searches are produced by a number of factors including the popularity of search terms," the company said.

"Google does not determine these terms manually - all of the queries shown in autocomplete have been typed previously by other Google users."

It is not the first time the feature has come under scrutiny. In December 2010, Google tweaked autocomplete so that terms relating to piracy did not automatically appear.

However, the company attracted further criticism after it refused to remove sites hosting illegal copyright material from its search results....

Google ordered to change autocomplete function in Japan

The problem has its roots in the American service mentality. One could presumably imagine Google as a somewhat overzealous assistant. "Rest your fingers," says the friendly search engine provider. Right from the very first letter that we type in the search box, it rushes to guess what we might be looking for. "S." Is it SPIEGEL? Samsung? Savings and loan? Skype?

It's pure service-mindedness, but for Bettina Wulff it's a nightmare. The wife of former German President Christian Wulff wants the search engine to cease suggesting terms that she finds defamatory. This has nothing to do with the search results, but rather with the recommendations made by Google's "Autocomplete" function, a service that is also offered by competitors like Bing and Yahoo. All one has to do is type her first name and the first letter of her last name to get search suggestions such as "Bettina Wulff prostitute," "Bettina Wulff escort" and "Bettina Wulff red-light district."

Google acts as if all this were unavoidable. "The search terms in Google Autocomplete reflect the actual search terms of all users," says a company spokesman. He also spoke of the "algorithmic result of several objective factors, including the popularity of search terms," which sounds far more complex and typically vague, but basically amounts to the same shoulder-shrugging response: One cannot accuse an automatic mechanism of defamation. The company maintains that the search engine only shows what exists. It's not its fault, argues Google, if someone doesn't like the computed results.

How We Perceive the World

Google increasingly influences how we perceive the world. What are we more afraid of? That behind the computing processes stands a merciless machine, or the opaque and arbitrary decisions of a large US corporation?

Both are to be feared and, in the case of Google, both come into play. Contrary to what the Google spokesman suggests, the displayed search terms are by no means solely based on objective calculations. And even if that were the case, just because the search engine means no harm, it doesn't mean that it does no harm. The Autocomplete function, the usefulness of which Google so guilelessly praises as a means of giving one's fingers a rest, undeniably helps spread rumors. Assuming that someone unsuspectingly begins to look for information on "Bettina Wulff" and is offered "prostitute," "Hanover" and "dress" as additional search terms -- where, independent of their actual interests, will users most likely click?

And everyone who selects the most exciting suggestion adds to the popularity of this search, and thus increases the probability that others will see this suggestion in the future.

Perhaps this is one reason why we find these functions and their algorithms so unsettling -- because they so relentlessly expose human behavior. Google is a rumormonger for the simple reason that people are rumormongers. When we hear that there is a rumor concerning Bettina Wulff, we want the details.

Diligently Searching

Who looked up these terms so diligently that they became popular enough to appear in the Autocomplete suggestion box in the first place? Indeed, such unsubstantiated rumors don't reach the top of the search list by merely surfacing on some obscure website in a dark corner of the Web. It may well have been the politicians and journalists who spread the false rumor that Ms. Wulff had been a prostitute -- a rumor she has vehemently denied.

For many months, they looked so hard and long for details on the Internet that the algorithms at Google and other search engines eventually concluded that it would be helpful to suggest the term "prostitute" to people who were looking for "Bettina Wulff" -- just as they recommend "iphone 5" to people who type in "iph."

Anyone who looks for "Angela Merkel" will, depending on their country location, be given "Zeuthen" as an additional search term. After pursuing the initial results here, they will find fairly skeptical news stories about the rumor that the chancellor supposedly wants to move to this town southeast of Berlin.

Until recently, anyone who followed the search suggestions on Bettina Wulff found no newspaper articles, no professional search results and no denials, only the rumor itself. Anyone with a little imagination -- and on the Internet there are certainly people who fall into this category -- could see a conspiracy in the deafening silence of the traditional media on a story that appeared to permeate the Web. The fact that the purported story was not being reported made the rumor even more plausible for those who were spreading it.

A Tacit Agreement Not to Report

In fact, there was apparently a tacit agreement among journalists not to report on the rumor, despite the fact that so many people had heard it. Even critical reporting aimed at refuting the rumor was off-limits, no doubt due to concerns that Ms. Wulff would take legal action against the publishers.

This case shows how dangerous it can be in th...

Google Autocomplete: Former German First Lady Defamation Case

Google searches employ two features: autocomplete and Google instant. These work together to complete your search terms and to automatically load search results while you're typing. While you're probably thankful for the few seconds this saves, or the way it triggers a connection you couldn't recall, Bettina Wulff (wife of former German President Christian Wulff) would be unlikely to agree with you these days. Type Wulff's name into Google, and the first autocomplete suggestions you'll see are "Bettina Wulff escort," and "Bettina Wulff prostituierte." Wulff is now suing Google for defamation, along with German TV host Günther Jauch and over 30 bloggers and media outlets. Wulff's suit against Google focuses on the results of this autocomplete feature.

Years ago, rumors began that Bettina Wulff had a former career as a prostitute named "Lady Victoria," possibly intended to damage Christian Wulff's political career. Wulff denies these rumors in her new biography, and asserts that they have harmed her reputation and family life. Wulff has filed a defamation suit in the Hamburg District Court to force Google to remove these false, damaging terms from the results of its autocomplete function. So far, Google has not removed the result terms; according to Spiegel Online, the company denies responsibility and claims that the products of the autocomplete function are driven by an algorithm relying on, among other things, popular search terms selected by users.

Without going into too much detail about the search algorithms, suggested autocomplete searches are all real searches done by Google users. The algorithm considers popularity foremost, but also considers geography, relevance, and your prior search history, among other objective factors, when providing these results. In this manner, then, Google's autocomplete feature and Google instant consider more than popularity, and there are times in which Google's algorithms limit search results or alter their ranking to reflect policy considerations.

Thus, if Google chose to alter the results of a search for Bettina Wulff, it would not be the first time Google has censored or altered Autocomplete results for policy reasons. For example, pressure from the entertainment industry and government officials has impacted Google's searches. In an attempt to fight piracy, Google's search function will not complete words such as "bittorrent", "torrent" and "rapidshare." Similarly, up until recently, Google had blocked the term "bisexual" from autocomplete search results, only removing "bisexual" from its banned words after a campaign by BiNet and other advocacy groups. Should false speech, particularly that injurious to its subject, be its next consideration?
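The term-level removals described above (piracy keywords, formerly "bisexual") amount to a blocklist applied on top of the popularity ranking. A hedged sketch of that filtering step, using the piracy terms named in the paragraph as an illustrative list rather than Google's actual one:

```python
# Illustrative blocklist, modeled on the piracy-related terms mentioned above.
BLOCKED_TERMS = {"bittorrent", "torrent", "rapidshare"}

def filter_suggestions(suggestions, blocked=BLOCKED_TERMS):
    """Drop any suggestion containing a blocked term (case-insensitive)."""
    return [
        s for s in suggestions
        if not any(term in s.lower() for term in blocked)
    ]

raw = ["ubuntu torrent", "ubuntu download", "ubuntu install"]
print(filter_suggestions(raw))  # ['ubuntu download', 'ubuntu install']
```

This is exactly the capability the Italian court pointed to: once such a filter exists for one category of term, a plaintiff can argue it could equally be applied to allegedly defamatory ones.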

Bettina Wulff is not the first individual to sue Google over false autocomplete results, and there is some precedent for finding defamation arising out of search results. Earlier this year in Japan, a man sued Google after its autocomplete results linked him to a crime he had never committed, allegedly resulting in irretrievable damage. The Japanese court ordered Google to delete the identified "false" autocomplete terms. Google was also fined last year when autocomplete suggested "crook" after the name of a French insurance company, and an Italian court ordered Google to filter out libelous search results that falsely suggested fraud. It would not be surprising, then, if a case tried in Germany, where defamation is codified in the criminal code, follows the lead of these other nations.

As in many other situations involving liability for content, this case highlights American exceptionalism in the realm of free speech. The United States takes a very speech-protective approach to libel, particularly with respect to public figures like Wulff. While the specific elements of defamation vary slightly from state to state, to establish liability for defamation in the case of public figures, the injured party must prove that the speaker acted with actual malice -- i.e., knowledge of falsity or a high degree of awareness of probable falsity.

Under this rigorous standard, it seems unlikely that this case or a similar defamation case would succeed in a U.S. court. Google, in devising its algorithm, did not knowingly intend to publish false statements of fact. At most, a court might determine that Google failed to take adequate precautions against defamatory combinations of words popping up, but "actual malice" depends on knowledge of falsehood and not objective perceptions of negligence. Moreover, even under a negligence standard it might be unreasonable to expect Google to police for truthfulness every potential autocomplete suggestion resulting from an inquiry on its search engine.

In addition, popularity is the primary factor considered in producing search results, and with regards to this element, Google is not itself speaking in a traditional manner but rather collecting and presenting the speech of others. As such, Google would li...

Defamation Case Attacks Google Autocomplete Results

Such a ruling would mean that Google would not be liable if information displayed via its 'autocomplete' function was defamatory, said media law specialist Ian Birdsey of Pinsent Masons, the law firm behind Out-Law.com.

Autocomplete suggests words or characters for completing a partial search on Google.

Last week a court in Australia ruled that Google should have to pay damages to Milorad Trkulja, a TV presenter who had complained that the internet giant had defamed him, according to a report by the BBC.

Trkulja was shot in a Melbourne restaurant in 2004 by a gunman wearing a balaclava. He claimed that, following the shooting and subsequent reporting of the incident, his name had become associated with the images of alleged criminals when users typed his name into the 'Google Images' search function.

Trkulja sued Google claiming that the company had failed to remove the defamatory link between him and the alleged criminals when he requested such action. The Supreme Court of Victoria accepted Google's argument that it had innocently disseminated the material but said that that defence was only applicable up to the point at which the company received Trkulja's complaint and held Google to be liable for defaming the man as a result of its inaction, according to the BBC's report.

This Australian case follows a similar ruling in Japan after a court there ordered Google to stop its search engine technology from suggesting "specific terms" that have linked a man's name to crimes he did not commit.

The unnamed man sued Google after claiming that the terms the company's autocomplete software suggests in association with his name caused him to lose his job and has subsequently put off potential new employers, according to a report at the time by Kyodo news agency on the Japan Times website.

In a similar ruling in France Google was fined $65,000 by a court after its search engine suggested the French word for 'crook' when users typed-in the name of an insurance company.

However, in the UK in 2009 the High Court ruled that Google is not the publisher of defamatory words that appear in its search results. Mr Justice Eady ruled that even when notified that its results contained libellous words Google was not liable as a publisher.

Google's liability for defamatory words that appear via its 'autocomplete' suggestions is as yet untested in the UK. However, Ian Birdsey said that it is unlikely that a UK court would come to a different conclusion from the one arrived at by the High Court in 2009.

"Although the issue of Google's liability for its 'autocomplete' search function has yet to be dealt with by UK courts, the High Court in 2009 did determine that Google was a mere facilitator of the information displayed on its search results because it did not authorise the appearance of the information on users' screens in a 'meaningful sense'," Birdsey said. "If the UK courts were to assess whether Google was liable for defamation as a result of the way its 'autocomplete' system suggests terms to users I think the courts would draw similar conclusions and find that Google is not a publisher."

"There has to be recognition that Google search terms are the product of input by its users. It is unfair to view Google as a traditional publisher of suggested search terms as a result of this," he added.

“In his judgement Mr Justice Eady said that there was a ‘degree of international recognition that the operators of search engines should put in place [a take-down policy] (which could obviously either be on a voluntary basis or put upon a statutory footing) to take account of legitimate complaints about legally objectionable material’,” Birdsey said. "The European Commission is currently looking to reform 'notice and takedown' rules that govern illegal material posted on the internet and has asked whether search engines, among other intermediaries, should be deemed to be ‘hosts’ of content. It is to be hoped that the Commission’s plans make clear whether search engines do have responsibility for removing illegal content and what that process should be."

To be considered libellous under common law in the UK, comments must be published (that is, communicated to someone other than the person being defamed) and not be justified by a range of defences, including that the comments are true or were expressed as an opinion.

In the UK laws on defamation are also written into legislation. Under the Defamation Act a person can claim a defence against allegations of defamation if they can show that they were neither the author, editor or publisher of the comments, "took reasonable care in relation to its publication" and "did not know, and had no reason to believe, that what he did caused or contributed to the publication of a defamatory statement". The Act defines 'publisher' as meaning "a commercial publisher, that is, a person whose business is issuing material to the public, or a section of the public, who issues material containing the statement in...

Google would not be held liable for defamatory 'autocomplete' suggestions in UK, says expert

Tim Hornyak/CNET

A Japanese court has ordered Google to modify its autocomplete function so that it does not suggest a connection to crimes when a Japanese man's name is entered, adding that the Web giant must pay 300,000 yen ($3,100) to the plaintiff.

The ruling by the Tokyo District Court comes after its injunction last year backing the plaintiff, a Tokyo man who has not been identified. Google did not follow the injunction.

The man claimed that when Google users begin typing his name, the search engine would automatically suggest criminal acts he did not commit. The links would produce articles slandering him, he said.

The plaintiff said that along with the autocomplete function slandering him, he unexpectedly lost his job and was repeatedly rejected when he applied for other work.

"A situation has been created by which illegally submitted documents can be easily viewed," chief judge Hisaki Kobayashi was quoted as saying by the Mainichi Shimbun newspaper.

The court did not rule that the search functions were directly responsible for the loss of the plaintiff's job.

But the decision marks the first time a Japanese court has ordered Google to change these search terms, according to the plaintiff's lawyer, Hiroyuki Tomita.

"This [autocomplete feature] can lead to irretrievable damage, such as job loss or bankruptcy, just by displaying search results that constitute defamation or violation of the privacy of an individual person or small and medium-size companies," Tomita was quoted as saying last year by Kyodo News.

Google runs data centers in Hong Kong, Singapore, and Taiwan, but not in Japan, and so the court can't compel it to make the changes.

Google told CNET it had no comment on the case, but is studying the matter carefully.

The decision follows Google's loss of a case in Italy and another in France over autocomplete results.

Earlier this year, an Australian surgeon sued Google in California over autocomplete suggestions that he was bankrupt. Google, meanwhile, has indicated that it is not responsible for autocomplete results because they are generated automatically.

(Via AFP)...

Google loses autocomplete defamation suit in Japan

searchenginepeople.com · 2013

Some people say you never get a second chance to make a first impression. It's funny how much time and energy companies and brands devote to marketing activities when a big source of trouble is there, in front of their eyes and needs to be fixed before anything else. As people type their name, Google suggests one or more words to immediately complete their search. Mind reading? Probably not. Complex algorithms? Probably yes. One thing is sure: your reputation is being affected by what people instantly read. Will they decide to trust you?

From bars to search bars

Google Suggest helps people do what they used to do in bars in front of a much more limited audience: spreading rumors. It doesn't matter whether that Mike is a good guy or not: some people heard that other people heard that he did something wrong, so that's it. Mike is a scammer, a criminal, a bad person.

Is this right? On the one hand, it's a time saver for searchers. On the other hand, it can be a reputation bomb for people, companies and brands of any type. Google faced several lawsuits because of defamatory autocomplete suggestions and agreed to reduce certain types of content. However, this is still one of the most important factors in one's online reputation.

How to deal with the problem

Nobody wants their name to be associated with something negative. Removing automatic suggestions is not something you can do overnight, but I'm going to share a simple and actionable list of things to do to start taking control of your search bar reputation.

1. Own the search term

Let's be practical and reduce the negative impact of the autocomplete. Before doing anything else, you should build an optimized page for "YOURNAME scam" or anything else that might come immediately from Google Suggest. It will not prevent people from judging you, but at least you have the chance to express your point of view and explain why that term appeared and why it has nothing to do with you / it is false etc. For example, our promotion team was able to hide two hate sites spreading false information about a famous financial expert by creating new minisites matching the same keywords they were targeting.

2. Investigate

Why did that search term appear? Google Suggest is based on real searches, but how can we be sure they were not manipulated by somebody who wants to hurt your brand? There are cases of companies hiring cloud workers to google defamatory phrases to damage competitors. Legal actions may be needed: a good online reputation management firm can help you with that.

3. Don't manipulate

Some people have experimented with altering the autocomplete mechanism to promote their best face, content and reputation. You may be tempted by the idea of getting rid of the unwanted term(s) this way, but keep in mind this is illegal and may end up causing more damage.

4. Outrank

We know that the best place to hide a dead body is on page 2 of Google search results. The biggest SEO-oriented action you can take is to work on the promotion of your positive content so that it naturally outranks hate sites. As a consequence, the negative terms will be less visible, less searched and less prominent in the autocomplete.

5. Remove?

This is not something easy. These are the main types of filtered suggestions:

Hate or violence related suggestions

Personally identifiable information in suggestions

Porn & adult-content related suggestions

Legally mandated removals

Piracy-related suggestions

What about your case? Legal action is the only possible way in certain situations. Google has actually lost several cases brought by companies damaged by the autocomplete function, especially in France and Italy.

Do you know any other good things to do when dealing with undesired terms showing up in the search bar?...

How to Deal with Google Suggest Defamation

This morning, the Federal Supreme Court (Bundesgerichtshof) held Google liable for a functionality of its search engine, the autocomplete function. The claimants had requested that Google cease the publication of autocomplete results that suggested “fraud” or “Scientology” as additional search terms when the claimants’ names were searched. Google will now have to stop the publication of such “predictions” if and when it has become aware that automatically created predictions infringe the rights of third parties.

The predictions of additional search terms are generated automatically by a Google algorithm, based inter alia on the search terms entered by Google users. Google has been using the function since April 2009.

Google has been taken to court across the globe on this issue, winning in Italy in March 2013 and losing in Japan in April 2013, for example. Today’s ruling appears to be the first one from a court of last instance. And if Google were not enough to get the matter into the headlines: Germany’s former first lady, Bettina Wulff, had commenced a similar action, which had been stayed pending the outcome in today’s case. Bettina Wulff is seeking to stop the publication of autocomplete results linking her to search words such as “escort” or “red light”. Her case had triggered a broad debate about the legal limits on the publication of search results.

The Court of Appeals (Oberlandesgericht) Köln had found, in the previous instance, in favour of Google. It did not attribute the “statements” generated by the autocomplete function to Google (“Den […] Ergänzungssuchbegriffen ist nicht der Charakter eigenständiger inhaltlicher Aussagen der Suchmaschine bzw. deren Betreibers […] beizumessen.”)

The Federal Supreme Court did not agree. Its line of arguments is as follows: The publication of predictions as a result of the autocomplete function constitutes a violation of personality rights (Persönlichkeitsrecht), if they imply a statement that is untrue. On the other hand, not every violation of personality rights by the search engine triggers Google’s liability. The autocomplete function per se is perfectly legal. The issue is the missing safeguard against the publication of results of a defamatory nature (“… dass sie keine hinreichenden Vorkehrungen getroffen hat, dass die von der Software generierten Suchvorschläge Rechte Dritter verletzen.”)

The Federal Supreme Court also does not find that there is a general duty to scrutinize the search results for potential infringements of third party rights prior to publication (“Der Betreiber einer Suchmaschine ist regelmäßig nicht verpflichtet, die […] Suchergänzungsvorschläge vorab auf etwaige Rechtsverletzungen zu überprüfen.”)

This duty is only triggered if and when Google becomes aware of a violation of third party rights. In practice, Google will now have to investigate the defamatory or slanderous nature of suggested search terms if a cease and desist letter comes in, and will then have to take appropriate action.
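The duty the court describes, suppressing a specific prediction once a notice establishes that it is infringing, amounts in practice to a blocklist applied before suggestions are served. The following minimal sketch illustrates that idea; the `Suggester` class, its data, and all names are invented for illustration and do not reflect Google's actual implementation.

```python
# Hypothetical sketch of notice-based suppression of autocomplete predictions.
# Predictions keep being generated automatically; only reported ones are withheld.

class Suggester:
    def __init__(self, predictions):
        # predictions: dict mapping a query prefix to ranked suggestion strings
        self.predictions = predictions
        self.blocked = set()  # suggestions taken down after a valid notice

    def report_infringing(self, suggestion):
        """Record a suggestion found infringing after a cease-and-desist notice."""
        self.blocked.add(suggestion.lower())

    def suggest(self, prefix):
        """Serve predictions for a prefix, filtering out taken-down suggestions."""
        return [s for s in self.predictions.get(prefix, [])
                if s.lower() not in self.blocked]

engine = Suggester({"john doe": ["john doe fraud", "john doe lawyer"]})
engine.report_infringing("John Doe fraud")
print(engine.suggest("john doe"))  # the reported prediction is no longer served
```

The key point mirrored here is the court's allocation of duties: no pre-publication screening of every prediction, only removal after notification.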

Technically, the claimants’ right to request Google to cease and desist is based on Sec. 823, 1004 German Civil Code (BGB) and Art. 1, 2 Basic Law (Grundgesetz), that is, on a combination of civil law tort concepts and the fundamental rights of human dignity and personal freedom. The Federal Supreme Court weighs the respective rights of the claimants on the one hand and of Google, which also enjoy constitutional protection, on the other, and on balance finds in favour of the claimants. I refrain from a personal comment on the matter, since my firm acts for one of the parties involved.


Federal Supreme Court: Google Liable for Defamatory Autocomplete Search Terms

Note: A version of the following also appears on the Tow Center blog.

In Germany, a man recently won a legal battle with Google over the fact that when you searched for his name, the autocomplete suggestions connected him to “scientology” and “fraud,” two things that he felt had defamatory insinuations. As a result of losing the case, Google is now compelled to remove defamatory suggestions from autocomplete results when notified, in Germany at least.

Court cases arising from autocomplete defamation aren’t just happening in Germany, though. In other European countries like Italy, France, and Ireland, and as far afield as Japan and Australia, people (and corporations) have brought suit alleging these algorithms defamed them by linking their names to everything from crime and fraud to bankruptcy or sexual conduct. In some cases such insinuations can have real consequences for finding jobs or doing business. New services, such as brand.com’s “Google Suggest Plan”, have even arisen to help people manipulate and thus avoid negative connotations in search autocompletions.

The Berkman Center’s Digital Media Law Project (DMLP) defines a defamatory statement generally as, “a false statement of fact that exposes a person to hatred, ridicule or contempt, lowers him in the esteem of his peers, causes him to be shunned, or injures him in his business or trade.” By associating a person’s name with some unsavory behavior it would seem indisputable that autocomplete algorithms can indeed defame people.

So if algorithms like autocomplete can defame people or businesses, our next logical question might be to ask how to hold those algorithms accountable for their actions. Considering the scale and difficulty of monitoring such algorithms, one approach would be to use more algorithms to keep tabs on them and try to find instances of defamation hidden within their millions (or billions) of suggestions.

To try out this approach I automatically collected data on both Google and Bing autocompletions for a number of different queries relating to public companies and politicians. I then filtered these results against keyword lists relating to crime and sex in order to narrow in on potential cases of defamation. I used a list of the corporations on the S&P 500 to query the autocomplete APIs with the following templates, where “X” is the company name: “X,” “X company,” “X is,” “X has,” “X company is,” and “X company has.” And I used a list of U.S. congresspeople from the Sunlight Foundation to query for each person’s first and last name, as well as adding either “representative” or “senator” before their name. The data was then filtered using a list of sex-related keywords, and words related to crime collected from the Cambridge US dictionary in order to focus on a smaller subset of the almost 80,000 autosuggestions retrieved.
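The audit method described above (expand each entity name through query templates, then flag returned suggestions against keyword watch lists) can be sketched as follows. The suggestion data is mocked and all function names are my own; the original study pulled live results from the Google and Bing autocomplete APIs and used much longer keyword lists.

```python
# Sketch of the autocomplete audit pipeline: template expansion plus
# keyword filtering. Illustrative only; data and names are hypothetical.

CRIME_KEYWORDS = {"scam", "fraud", "theft", "robbery", "corruption"}

def expand_templates(company):
    """Build the query variants used to probe an autocomplete API."""
    templates = ["{x}", "{x} company", "{x} is", "{x} has",
                 "{x} company is", "{x} company has"]
    return [t.format(x=company) for t in templates]

def flag_suggestions(suggestions, keywords=CRIME_KEYWORDS):
    """Return the suggestions containing any watch-list keyword."""
    return [s for s in suggestions
            if any(k in s.lower().split() for k in keywords)]

# Mocked responses for one probe query (not real API output)
mock_results = ["torchmark corporation job scam",
                "torchmark corporation careers",
                "torchmark corporation reviews"]

print(expand_templates("Torchmark"))
print(flag_suggestions(mock_results))  # only the scam-related suggestion remains
```

Filtering on whole words (via `split()`) rather than substrings avoids false positives such as "scampi"; the trade-off is missing multi-word phrases, which a real audit would handle with a richer matcher.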

Among the corporate autocompletions that I filtered and reviewed, there were twenty-four instances that could be read as statements or assertions implicating the company in everything from corruption and scams to fraud and theft. For instance, querying Bing for “Torchmark” returns as the second suggestion, “torchmark corporation job scam.” Without really digging deeply it’s hard to tell if Torchmark corporation is really involved in some form of scam, or if there’s just some rumors about scam-like emails floating around. If those rumors are false, this could indeed be a case of defamation against the company. But this is a dicey situation for Bing, since if they filtered out a rumor that turned out to be true it might appear they were trying to sweep a company’s unsavory activities under the rug. People would ask: Is Bing trying to protect this company? At the same time they would be doing a disservice to their users by not steering them clear of a scam.

While looking through the autocompletions returned from querying for congresspeople it became clear that a significant issue here relates to name collisions. For relatively generic congressperson names like “Gerald Connolly” or “Joe Barton” there are many other people on the internet with the same names. And some of those people did bad things. So when you Google for “Gerald Connolly” one suggestion that comes up is “gerald connolly armed robbery,” not because Congressman Gerald Connolly robbed anyone but because someone else in Canada by the same name did. If you instead query for “representative Gerald Connolly” the association goes away; adding “representative” successfully disambiguates the two Connollys. The search engine has it tough though: Without a disambiguating term how should it know you’re looking for the congressman or a robber? There are other cases that may be more clear-cut instances of defamation, such as on Bing “Joe Barton” suggesting “joe barton scam” which was not corrected when adding the title “representative” to the front of the query. That seems to be more of a legitimate instance of defamation since even with the disambiguation it’s sti...

Algorithmic Defamation: The Case of the Shameless Autocomplete

Google argued that the "entire basis of the internet could be compromised" if it was forced to police what search suggestions people see....

Hong Kong tycoon can sue Google over 'autocomplete' search suggestions, court rules

BEIJING (Reuters) - A Hong Kong court has ruled that a local tycoon can sue Google Inc for defamation because searches for his name on Google suggest adding the word 'triad', a reference to Hong Kong's notorious organized crime groups.

Searches in both English and Chinese for Albert Yeung Sau-shing, the founder and chairman of Hong Kong-based conglomerate Emperor Group, will automatically suggest phrases related to organized crime using Google's 'autocomplete' function.

On Tuesday, the High Court of Hong Kong dismissed Google's argument that it was not responsible for the autocomplete suggestions related to Yeung and that the court did not have personal jurisdiction over the U.S. search giant.

Google frequently finds itself embroiled in legal issues over what results are shown by its search engine. The European Union's top court in May ruled that people have a right to request that years-old personal information that is no longer relevant be removed from Internet search results.

"There is a good arguable case that Google Inc is the publisher of the Words and liable for their publication," said Marlene Ng, the deputy high court judge, in her ruling.

Google declined to comment on the verdict.

Yeung is seeking damages from Google for libel and wants the company to remove the defamatory search suggestions, court documents said.

Google argued that autocomplete works according to an automated algorithm and the company is not responsible for the resulting suggestions, which change depending on what a critical mass of users search for.

"The entire basis of the internet will be compromised if search engines are required to audit what can be assessed by users using their search tools," court documents attributed Gerard McCoy, Google's lawyer, as saying.

"It would be impossible for Google Inc to manually interfere with or monitor the search processes given the billions of searches conducted by Google Search," McCoy said according to the documents.

Because Google did not contest that the autocomplete suggestions were defamatory and carried criminal associations, it may end up paying a large amount of money if Yeung sues successfully.

"In my view, it cannot be said at this stage that damages for reputational damage in Hong Kong are likely to be minimal if Yeung wins at trial," Ng said.

(Reporting by Paul Carsten; Additional reporting by Venus Wu in HONG KONG; Editing by Simon Cameron-Moore)...

Hong Kong Court Rules Tycoon Can Sue Google For Defamation Over Search Results


Google’s autocomplete feature has earned meme status for the hilarious and sometimes disturbing search terms it suggests.

For example, if you type “why are” into Google search right now, the number one suggested search is: “Why are manhole covers round?”

But autocomplete isn’t all fun and games. When Hong Kong business tycoon Albert Yeung Sau-shing Googled his name, the autocomplete feature suggested the word “triad,” a term that, in Asia, is associated with organized crime.

Now Yeung is suing Google for libel.

Yeung is the founder and chairman of Emperor Group, a sprawling business empire that includes property development, entertainment and financial services. He has been found guilty of crimes including illegal bookmaking and perverting the course of public justice, and has been fined for insider trading.

Albert Yeung, left, chairman of the Hong Kong media conglomerate Emperor Entertainment Group, and his wife attend the premiere of his film “Shinjuku Incident” in 2009. (AP Photo/Vincent Yu, File)

However, he has tried to rehabilitate his image through charity work. He is not one to tolerate even a minor blemish to his reputation — or his tablecloth for that matter: “If any sauce drops on the dining table, I would be very unhappy, as though my business had failed,” he once said, according to the South China Morning Post.

Google tried to have the case dismissed, but Tuesday a judge in Hong Kong said it could go to trial. The decision follows a trend in overseas courts sympathetic to those who want to hold Google responsible for what the Internet says about them. Deputy High Court Judge Marlene Ng cited Europe’s recent “right to be forgotten” ruling requiring Google to remove embarrassing or outdated search results upon request.

To prove libel in Hong Kong — and the United States — you have to show the accused published a defamatory statement about you. At issue in this case is whether Google can be regarded as the “publisher” of terms suggested by its search algorithm.

Autocomplete uses predictive search, which makes suggestions based on past Googling. For Yeung, “triad” appeared in autosearch because others were searching for his name and the word — or because those words appeared together in pages indexed by Google. Google, however, says it can’t “publish” libelous search results like a newspaper might publish a libelous article because it uses automated search algorithms without human input.
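The predictive mechanism described here, ranking past queries by how often users searched them and matching them by prefix, can be illustrated with a toy model. This is a sketch of the general principle only; Google's actual algorithm is proprietary and far more complex, drawing on many more signals than raw query counts.

```python
# Toy model of predictive search: suggestions are past queries that share the
# typed prefix, ranked by search frequency. Names and data are illustrative.

from collections import Counter

class PrefixSuggester:
    def __init__(self):
        self.counts = Counter()  # query string -> number of times searched

    def record(self, query):
        """Log a user search; repeated searches raise a query's rank."""
        self.counts[query.lower()] += 1

    def complete(self, prefix, limit=3):
        """Return past queries starting with the prefix, most frequent first."""
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda qn: (-qn[1], qn[0]))  # frequency, then alphabetical
        return [q for q, _ in matches[:limit]]

s = PrefixSuggester()
for q in ["albert yeung triad"] * 5 + ["albert yeung emperor group"] * 2:
    s.record(q)
print(s.complete("albert yeung"))
```

The example shows why a "critical mass" of users matters: a suggestion only surfaces, and only ranks highly, because enough people searched the same phrase.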


Yeung doesn’t dispute the search process is automated. However, he argues Google has control over suggested search terms because it designed the search algorithm. Therefore, he says, Google is the “publisher” of search results.

The court acknowledges that “to influence or change what the autocomplete instant results show will require a large number of users with unique internet protocol (IP) addresses to type the desired search query into Google Search on an ongoing basis.”

Nonetheless, the judge ruled that Yeung has a “good arguable case” and cleared it to move forward.

Writing for Gigaom, Jeff John Roberts said the ruling is worrisome because it “may give prominent figures like Yeung a new tool to silence public opinion they disagree with. At the same time, the decision cites — and builds on — a worrying worldwide rush to censor Google.”

This isn’t the first time Google has been sued over its suggested search results. Last year, a German court ruled in favor of a nutritional supplement company who sued to remove suggested results linking it to Scientology and fraud. An Italian court in 2011 ordered Google to filter suggested searches that implied a plaintiff was a con man....

Can Google be sued for a mere search suggestion? A Hong Kong judge says yes.

The South Australian Supreme Court has found that Google published defamatory statements that appeared in autocomplete and related search terms on its search engine, after it received notice of the defamatory material and failed to remove it.

His Honour Justice Blue reasoned that the defamatory phrase was 'generated by Google programs as a result of Google's programming' and that 'the mere fact that the words are programmed to be generated because the user or others have previously searched for those words makes no difference' to the question of publication. His Honour decided that there was no reason why Google should not be held accountable for these 'publications' after it was put on notice by the plaintiff. It is worth noting that the plaintiff did not seek to argue that Google should be liable for the period prior to notification.

In his judgment, his Honour referred to the 'only authority' on this point as supporting this conclusion, being the decision of the High Court of Hong Kong in Dr Yeung Sau Shing Albert v Google Inc. In that case, the High Court dismissed an application by Google to have the proceeding stayed or dismissed on the basis that, among other things, the plaintiff had 'a good arguable case' that Google was the publisher of defamatory statements that appeared in Google Autocomplete and Related Search results.

His Honour Justice Blue also followed the reasoning of the Victorian Supreme Court in Trkulja v Google Inc LLC (No 5) and held that Google was the publisher of defamatory search results (comprising the title, snippet and URL) after it received notification and failed to remove the defamatory results within a reasonable time.

A further hearing is scheduled to decide the remaining issues of the defences of triviality and time limitation, the application for an extension of time, causation and quantum of damages.

You can read the judgment in full here....

Google held to be a publisher of defamatory autocomplete and related search terms

Is Google liable for defamation for not removing defamatory information in search results? Is Google liable for defamation as a secondary publisher by including hyperlinks to a website that contains defamatory materials when the hyperlink is included in search results? Finally, is Google liable for defamation when its Autocomplete and Related Search features produce suggested search inquiries that are defamatory? According to the recent decision of an Australian Court in Duffy v Google Inc., [2015] SASC 170 (27 October 2015), yes to all, at least once Google has received notice of these activities and fails to stop them within a reasonable period of time.

The case contains a treasure trove of Commonwealth law summarizing decisions on the liability of Internet intermediaries and service providers for defamation committed or facilitated by their online activities. Sadly, the decision, and the cases against Google relied on in the decision, illustrate the plight of individuals who sought and were denied the help asked for from Google to stop publishing information that damaged their reputations. Invariably, after being forced to litigate against one of the world’s wealthiest companies, their claims were vindicated by the courts which made findings that Google has a corporate responsibility to act, at least once it is put on notice that its search results are alleged to be defamatory. See, for example, Trkulja v Google Inc LLC (No 5) [2012] VSC 533 (summarized here), Tamiz v Google Inc [2013] EWCA Civ 68 (summarized here), Dr Yeung Sau Shing Albert v Google Inc [2014] HKCFI 1404, A v Google New Zealand Ltd [2012] NZHC 2352, Rana v Google Australia Pty Ltd [2013] FCA 60, Bleyer v Google Inc LLC [2014] NSWSC 897.

The Duffy v Google case arose from six articles that were published on the Ripoff Report website about the plaintiff, Dr. Duffy, and later published on other sites ostensibly derived from the Ripoff Report articles. The plaintiff notified Google and asked that the offending text and hyperlinks be removed from its search indexes. After many attempts to get Google to change its search results, it removed the material relating to the six Ripoff Report webpages but not the other webpages. The plaintiff also notified Google that searches for her name on its websites resulted in the display by its Autocomplete utility of the defamatory alternative search term “Janice Duffy Psychic Stalker” and requested its removal. Google did not comply.

The first issue in the case was whether the paragraphs (title, snippet and URL) displayed by the Google websites to users in response to searches for Dr Duffy’s name were published by Google. An example of a search alleged to be defamatory is the following:

R1 Ripoff Report Janice Duffy – Psychic Stalker Psychics Beware Of… Dr Janice Duffy is truly an embarrassment to her profession as a Senior Researcher in Adelaide Australia #2 Consumer Comment. Respond to this report… www.ripoffreport.com/…Janice-Duffy…/janice-duffy-psychic-stalker-98d93.htm Cached

Google argued it could not be liable for defamatory information published in search results as the information was produced without human intervention through the use of its computer systems and processes. It contended that for it to be liable it would have to authorize or accept responsibility for the publication. The court rejected this defense finding that Google played a critical role in publishing the defamatory content, once it became aware of what its systems were disseminating. Google could not be likened to a passive telecommunications carrier given the active role its systems played in generating and transmitting the offending information to the public.

I reject Google’s contention that a defendant can only ever be a publisher if the defendant authorises or accepts responsibility for the publication… Google was the sole operator and controller of the Google website. The paragraphs resided on Google’s website. The paragraphs were communicated by Google to the user conducting a search. Google played a critical role in communicating the paragraphs to the user. The physical element of publication is present. Google did not play the passive role of a mere conduit such as an internet service provider who merely provides access to the internet or a telecommunications carrier who merely provides access to the telephone network. Google played an active role in generating the paragraphs and communicating them to the user. The mere fact that the words are programmed to be generated because they appear on third party webpages makes no difference to the physical element. It makes no difference to the physical element whether a person directly composes the words in question or programs a machine which does so as a result of the program. I agree with the analysis of Beach J in Trkulja v Google Inc LLC (No 5)[140] in this respect… The mere fact that the paragraphs were generated automatically by Google’s software programs does not prevent...

Google liable for defamation through search and autocomplete features: Duffy v Google

In May 2013, the German Federal Court of Justice stated that Google’s predictions within the autocomplete function of its web search engine can violate the right of personality.[1] The right of personality ensures that a person’s (or even a company’s[2]) personality (reputation) is respected and can be freely developed.[3] Only the individual shall, in principle, decide how he/she wants to present himself/herself to third parties and the public.[4]

Facts of the case

A stock corporation, which sold food supplements and cosmetics online, and its chairman filed an action for an injunction and financial compensation against Google based on a violation of their right of personality.[6] Google runs a web search engine under the domain “www.google.de” (among others), which allows Internet users to search for information online and access third party content through a list of search results.

In 2009, Google implemented a so-called “autocomplete” function which, while the user is typing a search term into the search mask, shows word combinations in a new window as predictions for the user's search. These predictions are based on an algorithm which evaluates the number of searches on specific terms by other users. If users typed the full name of the chairman into the search engine in May 2010, the autocomplete function showed the predictions “Betrug” (fraud) or “Scientology”. The claimants stated that the chairman had no connection to Scientology and that he was under no investigation for fraud. Furthermore, they argued that no search result showed a connection between the chairman and fraud or Scientology. Therefore, they saw these predictions as a violation of their right of personality.

The Regional Court Cologne decided in favour of Google and dismissed the case as unfounded.[7] The Higher Regional Court Cologne upheld this judgement.[8] The claimants filed an appeal to the German Federal Court of Justice.

The decision

The German Federal Court of Justice set aside the judgement of the Higher Regional Court Cologne and referred the case back to this court.[10]

The Federal Court of Justice held that:

the predictions (“Betrug”/“Scientology”) expressed the existence of a factual connection between the chairman and these negatively connoted terms and thus violated the right of personality[11] (the Higher Regional Court Cologne had taken a different view, holding that the predictions only expressed that other users had typed in these word combinations or that the terms could be found in linked third-party content);

the claimants' right of personality outweighed Google's freedom of expression[12] and commercial freedom[13] in a trade-off, because false expressions do not need to be accepted;

the violation was directly attributable to Google, because it designed the software, exploited the users' behaviour, and suggested the predictions to the users;

the national implementation[14] of the provisions of the Electronic Commerce Directive,[15] which grant intermediaries (access, caching, and host providers) immunity from liability to a certain extent,[16] was not applicable in this case, because the predictions were not third-party content that Google merely made accessible or presented, but Google's own content;

the basis for a liability of the search engine provider is not the fact that it developed and used the software, because these actions are protected by the provider's commercial freedom;[17]

liability can only be based on the fact that the provider did not take the necessary precautions to prevent the violation of a right of personality, as part of the so-called “Stoererhaftung” (interferer's liability: the liability of a person, the “Stoerer”, who is not a perpetrator or participant himself, but who contributed willingly and adequately causally to the infringement of a protected legal interest; it requires a breach of a reasonable duty of care[18]);

the search engine provider has, in principle, no obligation to monitor the predictions generated by the software beforehand and is only responsible once it becomes aware that the predictions violate someone's rights;

if the provider is notified of a violation by the victim, it is also required to prevent future violations.[19]

In April 2014, the High...

Judgement of the German Federal Court of Justice on Google's autocomplete function

Are Google search results capable of defaming someone? A High Court hearing today may shed light on this very 21st-century question.

The case concerns Milorad Trkulja, who was shot in the back in a Melbourne restaurant in 2004 by an unknown assailant.

The incident coincided with gangland activity in the city, and subsequent Google searches wrongly linked Mr Trkulja with figures like crime boss Tony Mokbel.

Mr Trkulja sued Google for defamation and won in 2012.

He later began another proceeding, claiming that when searching phrases such as "Melbourne underworld criminals", defamatory text, autocomplete predictions and images were returned displaying him alongside convicted felons.

In 2017, the High Court granted him special leave to appeal against a Victorian Court of Appeal's decision, which agreed with Google that the case had no real prospect of successfully proving defamation.

Legal experts have debated whether search engines like Google can be considered "publishers" under Australian defamation law.

But Professor David Rolph, a media law specialist at the University of Sydney, said the case may hinge on a separate, similarly sticky issue.

If search results bring something defamatory to the attention of internet users, should those results be considered capable of defamation?

Milorad Trkulja sued Google for search results that wrongfully connected him with crime figures such as Tony Mokbel [pictured].

Who is the 'ordinary, reasonable search engine user'?

To decide if something is defamatory, courts often apply a hypothetical person test: would an ordinary, reasonable reader see the material as damaging?

The question has adapted to new technologies: "The ordinary and reasonable television viewer, the ordinary reasonable radio listener, for instance," Professor Rolph explained.

When ruling against Mr Trkulja, the Victorian Court of Appeal used "the ordinary, reasonable search engine user" to understand how Google's search results would have been understood.

The search results in question returned images of police commissioners and actors, as well as convicted gang figures, Queensland University of Technology Law School associate professor Nicolas Suzor pointed out.

The court noted there were thumbnails of, "a former chief commissioner of Victoria Police, two well-known crime reporters, a barrister dressed in wig and gown," as well as "the late Marlon Brando".

Given this variety of results, Dr Suzor believes the Victorian Court of Appeal came to a reasonable conclusion.

"An ordinary internet user knows how to interpret search results now, and so just because the applicant's face turns up on the page with certain potentially defamatory keywords, [it] isn't actually likely to hurt his reputation," he suggested.

The Appeals Court also determined autocomplete predictions were incapable of being defamatory — in particular, that the ordinary reasonable person would understand they were previous search terms by other users and not a statement by Google.

Legal experts are waiting to see if the High Court agrees.

Defamation in the internet age

There are concerns defamation law is struggling to adapt to the internet era.

Websites can be hosted anywhere in the world, which can make it difficult to find the right person to blame.

"You've got a plaintiff coming before the court and they've been harmed or wronged, and there's no-one we can point to that is within the reach of the law that we can ask to fix that wrong," Dr Suzor said.

"That's why search engines are now becoming the target."

In Australia, Mr Trkulja is among a handful of people to have successfully sued Google.

In 2015, for example, Google was found legally responsible by the South Australian Supreme Court for search results that linked to defamatory content, but the courts have come to conflicting decisions.

Professor Rolph questioned whether defamation law that works for newspapers, radio or television can be transposed directly onto new technology, or whether a greater rethink is needed.

"You have very wealthy media companies like Google, that are providing technology that allows people to search content which is generated by millions of users over which the media company has no control," he said.

Dr Suzor is concerned that if Mr Trkulja ultimately succeeds, search engines will have an incentive to remove content, but never an incentive not to.

He is worried free speech may ultimately lose out.

"[Companies like Google may] face monetary damages if they fail to remove content, but they can't be sued for removing content wrongfully," he said.

"We need to think about an open and just way of dealing with these issues that doesn't leave it all to the discretion of a private company."

Google declined to comment....

Defamation via Google search case to be discussed by High Court

CANBERRA, Australia -- An Australian man who alleges Google defamed him won a court battle on Wednesday to sue the search engine giant. Milorad "Michael" Trkulja was shot in the back in 2004 in a restaurant in Melbourne, Australia's second largest city.

The Australian High Court unanimously ruled in favor of Trkulja, supporting his allegation that a Google search of his name could indicate to an ordinary person he was "somehow associated with the Melbourne criminal underworld."

Trkulja had successfully argued in the Victoria state Supreme Court in 2012 that Google defamed him by publishing photos of him linked to hardened criminals of Melbourne's underworld.

The Power of Google

Four years later, the Victorian Court of Appeal overturned the decision, finding the case had no prospect of successfully proving defamation. The High Court disputed that ruling and ordered Google to pay Trkulja's legal costs.

Google searches for "Melbourne criminal underworld photos" bring up images of Trkulja alongside gangland figures, his lawyer Guy Reynolds told the High Court in March.

However, Google's lawyers argued it would be "irrational" for someone to assume photos in a Google image search for underworld figures are all of criminals, because the same search would also bring up the Google logo, movie posters, images of crime victims and photos of actor Marlon Brando.

Trkulja is also claiming defamation around Google's "autocomplete" options for his name, which have included phrases like "is a former hit man," ''criminal" and "underworld."

How did Google get so big?

However, the court heard autocomplete is an automatic function and that previous searches influence future suggestions.

The defamation suit is expected to go back to the Victoria Supreme Court for trial.

Trkulja said he would continue the legal action until he gets the result he wants, fearful someone will see the images and tell his grandchildren he's a hardened criminal.

"I will sue Google ... and I will sue them till they stop. I want them to block my pictures," he said. "I'm not a criminal, I've never been involved and I will make sure these people are not going to ruin my family - I have grandchildren," he added.

Google said in a statement: "We will continue to defend the claim. We decline to comment further on ongoing legal matters."...

Australian to sue Google for defamation over search results

A MAN who claims Google has defamed him has won his High Court battle to sue the search engine giant.

The court ruled in favour of Milorad "Michael" Trkulja in a judgment on Wednesday, supporting his claim that search engine results could indicate to an ordinary person he was "somehow associated with the Melbourne criminal underworld".

Mr Trkulja, who was shot in the back in a Melbourne restaurant in 2004, successfully argued in the Victorian Supreme Court in 2012 that Google defamed him by publishing photos of him linked to hardened criminals of Melbourne's underworld.

Milorad Trkulja. Picture: AAP

Four years later the Victorian Court of Appeal overturned the decision, finding the case had no prospect of successfully proving defamation.

Google searches for "Melbourne criminal underworld photos" bring up images of Mr Trkulja alongside gangland figures Mick Gatto, Carl Williams, Chopper Read, Mario Condello and Mark and Jason Moran, his lawyer Guy Reynolds told the High Court in March.

However, Google's lawyers argued it would be "irrational" for someone to assume photos in a Google image search for underworld figures are all of criminals, because the same search would also bring up the Google logo, movie posters, images of crime victims and photos of actor Marlon Brando.

Mr Trkulja also claimed defamation around Google's "autocomplete" options for his name, which have included phrases like "is a former hit man", "criminal" and "underworld".

However the court heard autocomplete is an automatic function and that previous searches influence future suggestions.

In its decision, the High Court of Australia ruled the Google search results were capable of defaming Mr Trkulja.

“It would be open to a jury to conclude that an ordinary reasonable person using the Google search engine would infer that the persons pictured whose identities are unknown are persons, like the notorious criminals with whom they are pictured, in some fashion opprobriously connected with criminality and the Melbourne criminal underworld,” the judgment said. “So to conclude, as the Court of Appeal observed, might result in the list of persons potentially defamed being large and diverse. But contrary to the Court of Appeal's apparent reasoning, that does not mean that the conclusion is unsound. It means no more than that, in such cases, the liability of a search engine proprietor, like Google, may well turn more on whether the search engine proprietor is able to bring itself within the defence of innocent dissemination than on whether the content of what has been published has the capacity to defame.”

Google was also ordered to pay the costs of the appeal.

More to come....

Australian man allowed to sue Google for defamation after taking case to High Court

Entertainment promoter Milorad Trkulja claims Google has continued to disseminate content unfairly linking him to the Melbourne criminal underworld

A court cleared the way for a rare defamation action against Google on Wednesday after a man claimed the global internet giant published material linking him to Australia's criminal underworld.

Entertainment promoter Milorad Trkulja was shot in the back at a Melbourne restaurant in a 2004 crime that was never solved.

In 2012, Google was ordered to pay Aus$200,000 (US$150,000) in damages to Trkulja, who claimed he was defamed by material that implied he was a major crime figure and had been the target of a professional hit.

Trkulja then launched further proceedings against the online behemoth relating to images and text that he said continued to link him to underworld figures, according to the Australian Broadcasting Corporation.

A Victorian state court ruled in favour of Google, but Australia's High Court has now upheld an appeal by Trkulja, paving the way for his defamation action.

At least some search results for Trkulja "had the capacity to convey... that the appellant was somehow associated with the Melbourne criminal underworld", the court said.

Google has denied the claims, saying it had innocently disseminated material published by others.

In the 2012 decision, a jury ruled Google had failed to act when Trkulja's lawyers wrote to them demanding action over the "grossly defamatory" content.

The judge at the time likened the internet giant to a library or newsagent which has at times been considered a publisher in defamation cases.

Trkulja argued his reputation was critical to his work as a promoter and had been seriously damaged by the defamatory material.

There has been legal debate in Australia about whether search engines like Google can be considered "publishers" under Australian defamation law, even if they did not create the content.

Previous court rulings have given conflicting views.

© 2018 AFP...

Australia court paves way for Google 'underworld' defamation case

Melbourne man to sue over ‘Melbourne criminal underworld photos’ search results that show his face

This article is more than 9 months old

Melbourne man Milorad “Michael” Trkulja has won his high court battle to sue the search engine Google for defamation over images and search results that link him to the Melbourne criminal underworld.

Trkulja said he would continue legal action against Google until it removed his name and photos from the internet.

Trkulja, who was shot in the back in a Melbourne restaurant in 2004, successfully argued in the Victorian supreme court in 2012 that Google defamed him by publishing photos of him linked to hardened criminals of Melbourne’s underworld.

Four years later the Victorian court of appeal overturned the decision, finding the case had no prospect of successfully proving defamation.

The high court disputed that ruling in a judgment on Wednesday and ordered Google to pay Trkulja’s legal costs.

Trkulja said he would continue the legal action until he got the result he wanted.

“I will sue Google … and I will sue them til they stop. I want them to block my pictures,” he said. “I’m not a criminal, I’ve never been involved and I will make sure these people are not going to ruin my family – I have grandchildren.”

Google searches for “Melbourne criminal underworld photos” bring up images of Trkulja alongside gangland figures Mick Gatto, Carl Williams, Chopper Read, Mario Condello and Mark and Jason Moran, Trkulja’s lawyer Guy Reynolds told the high court in March.

However, Google’s lawyers argued it would be “irrational” for someone to assume photos in a Google image search for underworld figures all showed criminals, because the same search would also bring up the Google logo, movie posters, images of crime victims and photos of actor Marlon Brando.

In a unanimous judgment led by the chief justice, Susan Kiefel, the court said it was to be assumed someone searching for members of the Melbourne criminal underworld would “rationally suppose” the people whose pictures or names appeared, or at least some of them, were members of such.

The court found while it was clear some of those pictured, such as Brando, were not criminals, it could be concluded someone who was relatively unknown, such as Trkulja, could be connected with criminality or the underworld.

Trkulja also claimed defamation around Google’s “autocomplete” options for his name, which have included phrases like “is a former hit man”, “criminal” and “underworld”.

However, the court heard autocomplete was an automated function and that previous searches influenced future suggestions.

Comment is being sought from Google....

Man wins right to sue Google for defamation over image search results

Ever googled yourself and not liked what you've found? What if the search engine posted your photo next to that of gangster Chopper Read?

In a landmark case, the High Court has allowed Google to be sued for defamation.

The court today gave Milorad Trkulja the green light to sue the search engine after he argued a Google search for 'Melbourne criminal underworld photos' brought up images of him alongside gangland figures, including Chopper Read and Mick Gatto.

The defamation case will set a powerful precedent for the responsibilities of search engines and individuals' right to deletion, also known as 'the right to be forgotten'.

Mr Trkulja argued the search results created a "false innuendo" suggesting he had been involved in crime and this had damaged his reputation, including one incident when he had been snubbed at a wedding.

Mr Trkulja also claims to have been defamed by Google's autocomplete function and text-based search results referring to him. He says the autocomplete options for his name included phrases like 'is a former hit man', 'criminal' and 'underworld'.

In 2012, Mr Trkulja successfully argued in the Victorian Supreme Court that Google defamed him by publishing the photos, but this was overturned on appeal. The case then went to the High Court, which today decided in Trkulja's favour.

Google's lawyers argued it would be "irrational" for someone to assume the photos in a Google image search for underworld figures are all of criminals, but the court found that was not the case. It found "an ordinary reasonable person" using Google would infer that people pictured alongside the criminals would be "connected with criminality".

A spokesperson for Google said the company "will continue to defend the claim", but would not comment further on ongoing legal matters....

You can now sue Google over defamatory 'innuendo' of search results

It has been a huge week for defamation law.

Last Thursday, the NSW Government announced a push to reform Australia’s uniform defamation laws. It is calling for a “cyber-age reboot”. That proposal was backed by a “statutory review” of the NSW Defamation Act. At a meeting of the Council of Attorneys-General, the states and territories agreed to reconvene a working party to consider reform of equivalent statutes around Australia.

The following Wednesday, the High Court delivered its most important defamation judgment in years. In a case that fits perfectly with the theme of the NSW proposals, Milorad “Michael” Trkulja succeeded in his appeal against Google. The Court found that Trkulja could sue the American company for defamation in respect of search results which potentially indicated that he had ties to Melbourne’s criminal underworld.

The next morning, the Victoria Court of Appeal allowed Bauer Media’s appeal from the judgment that awarded Rebel Wilson A$4.5 million in damages. The Court held that Wilson was entitled to A$600,000, and not to millions extra for lost opportunity to earn from roles that she may have been offered had the defendant not defamed her in its gossip magazines. The previous assessment of damages depended on the spread of the defamatory allegations on the internet via the “grapevine effect”.

The record for Australia’s largest defamation judgment is now barrister Lloyd Rayney’s A$2.6 million defamation win against the State of Western Australia, litigated by Perth firm Bennett + Co. If Rayney’s current appeal is successful, that figure may increase even further.

There’s a lot to think about.

The NSW proposal to allow large corporations to sue for defamation is particularly worrying. It would have a significant chilling effect on journalism.

But the issue that the NSW government chose to highlight from its statutory review was that defamation law is ill-equipped for the digital era. I agree that the way we communicate has completely changed in the 13 years since our Uniform Defamation Acts were introduced.

Trkulja v Google shows it is time for reform

Trkulja was shot in the back in a Melbourne restaurant in 2004. As you’d expect, people wrote about it on the internet. Google provided access to that content through its search engine: web crawlers discovered web pages relevant to Trkulja, indexed them, and ranked them via its Google Search algorithms.
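The crawl, index, and rank pipeline described here can be sketched in miniature. The following is an illustrative toy under invented assumptions (the page URLs and the `search` helper are made up, and ranking by raw term frequency stands in for Google's far more complex algorithms):

```python
from collections import defaultdict

# Hypothetical "crawled" documents: url -> page text.
pages = {
    "news/shooting": "trkulja shot in melbourne restaurant",
    "blog/underworld": "melbourne underworld figures melbourne crime",
    "misc/pizza": "best pizza in melbourne",
}

# Indexing step: build an inverted index, term -> {url: occurrence count}.
index = defaultdict(dict)
for url, text in pages.items():
    for word in text.split():
        index[word][url] = index[word].get(url, 0) + 1

def search(term: str) -> list[str]:
    # Ranking step: order matching pages by how often the term appears.
    hits = index.get(term.lower(), {})
    return sorted(hits, key=lambda url: -hits[url])

print(search("melbourne"))  # 'blog/underworld' ranks first (two occurrences)
```

Even at this scale, the point made in the surrounding discussion is visible: the associations a results page conveys are a by-product of whatever the crawled pages happen to say, not an editorial statement composed by the provider.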

The result of those processes was that Trkulja was associated with some shady figures through Google search. A Google image search for his name would display Trkulja’s picture with those of Melbourne criminals. The results pages contained keywords like “melbourne criminals” and “melbourne underworld photos”.

Google’s autocomplete results would also cast him in a poor light, returning terms like “michael trkulja criminal” or “michael trkulja underworld”. The results page linked to content which described Trkulja as a “former hitman”.

Trkulja sued, claiming that this computer-generated material defamed him. Google argued that the claim was so weak that it should come to an end even before a trial. Victoria’s Supreme Court rejected Google’s argument.

But the Victorian Court of Appeal allowed Google’s appeal, agreeing that the claim had no prospect of success. It found that the ordinary, reasonable person would not understand that the search results conveyed “imputations” which damaged Trkulja’s reputation. In their view, ordinary people would understand that there may be a disconnect between the words you type into Google and the results that follow.

On further appeal, the High Court unanimously decided that the Court of Appeal was wrong. At least some of the search results complained of had the capacity to convey the idea that Trkulja was associated with dodgy characters. Trkulja was given “the green light to sue” Google. Trkulja’s claim can now proceed.

Even before this case, you could sue Google for defamation

Like other foreign companies, Google is not immune to litigation because it is based overseas. On old principles, Google can be responsible for third party content which it “published” by sharing. It might have a defence of “innocent dissemination”, but perhaps not if the defamed person drew the problem to the company’s attention.

People have won against Google before. A few years ago, Janice Duffy succeeded in her claim that Google should be responsible for linking to defamatory websites. So in a sense, yesterday’s judgment is nothing really new.

It does provide some clarity on whether something like search results has the “capacity” to convey defamatory meaning. It is likely that Google will continue to be sued by all sorts of people who are aggrieved by search results that cast them in a poor light.

The case also demonstrates that ou...

Protecting Google from defamation is worth seriously considering

A recent High Court decision has opened the door to defamation proceedings that could affect search engine providers, as well as other businesses using search engine optimisation to improve their online profile.

Search engines have, without question, made access to information much easier and faster. Indeed, without search engines, it would be very difficult to navigate the trillions of internet pages and find meaningful results expeditiously. The algorithms behind search engines like Google are also now so developed that they are able to predict what you are looking for through their auto-complete function. What happens though, if those functions return results that are claimed to be untrue and potentially defamatory?

In the recent decision of Trkulja v Google Inc [2018] HCA 25, the High Court of Australia allowed an action to continue in which Google Inc. (a US company) is alleged to be liable in defamation for publishing search results that include images of Mr Trkulja mixed with images of convicted Melbourne criminals, as well as text referring to him and predictions generated by Google's autocomplete functionality. The decision holds the door open to defamation proceedings. While it clearly affects search engines like Google, it may also have implications for businesses using search engine optimisation to increase their online profile (eg media providers where key words could be lumped together in an unintentional manner).

Relevant defamation law

The law of defamation provides a remedy where a person's reputation is damaged by the publication of unjustifiable derogatory information.

In order to sue, it is necessary to prove that the defamatory material that is complained of was in fact published by the party being sued. In order to prove that fact, it must be shown that the alleged publisher:

was in some degree an accessory to the communication of the material in issue; and

intentionally participated in the communication of the allegedly defamatory material.

As to whether something is actually defamatory depends on what ordinary reasonable people would understand by the matter complained of, and whether it would cause them to think less of the person in question.

The events leading up to the Google defamation case

Milorad "Michael" Trkulja is an Australian resident who was shot in the back during a shooting in a Melbourne restaurant in 2004. This incident led to several articles in which there were references to certain Melbourne crime figures and investigations.

Mr Trkulja alleges that when Google image searches were performed during 2012 - 2014 of "Melbourne criminal underworld photos" and "Melbourne underworld criminals", images of him were mixed with images of convicted Melbourne criminals, and the pages contained various phrases such as "melbourne criminals", "melbourne criminal underworld figure", "melbourne criminal underworld photos", "Melbourne underworld crime" etc. It was also alleged that searches of Michael Trkulja's name associated him through the autocomplete function with terms like "is a former hit man", "criminal" and "underworld".

When Mr Trkulja issued proceedings in the Victorian Supreme Court, Google responded by applying to have the proceeding summarily dismissed on three bases:

first, that it did not publish the images;

second, that the matters in issue were not defamatory of Mr Trkulja; and

third, that Google was entitled to immunity from suit as a matter of public interest.

The Supreme Court dismissed Google's application, finding that:

by intentionally participating in the communication of the allegedly defamatory search results, there was a basis for alleging that Google was the publisher of the images;

the fact that the images were often returned alongside well-known underworld figures meant that it was arguable that the material was defamatory as it suggested that Mr Trkulja was a convicted criminal; and

the immunity proposed by Google was not in the public interest.

Google appealed the Court's decision on all three bases. The Court of Appeal did not decide the first ground (but noted its view that the innocent dissemination defence would likely be available), and rejected the third ground. However, it upheld Google's contention that Mr Trkulja would have no prospect of success in claiming that the matters in issue were capable of being defamatory.

Mr Trkulja disagreed and appealed to the High Court.

Appealing to the High Court

The primary issue before the High Court was whether or not the defamation proceedings should have been summarily dismissed by the Court of Appeal.

In a joint judgment, the High Court upheld Mr Trkulja's appeal, finding that:

it was strongly arguable that Google's intentional participation in the communication of the allegedly defamatory results to Google search engine users supports a finding that Google 'published' the allegedly defamatory results; and

some of the search results complained of had the capacity to convey to any ordinary reasonable pe...

When auto-complete goes wrong: High Court gives green light to defamation action for Google search results


Defamation Update: Google under fire again

CANBERRA, Australia — An Australian man who alleges Google defamed him won a court battle on Wednesday to sue the search engine giant.

Milorad "Michael" Trkulja was shot in the back in 2004 in a restaurant in Melbourne, Australia's second largest city.

The Australian High Court unanimously ruled in favor of Trkulja, supporting his allegation that a Google search of his name could indicate to an ordinary person he was "somehow associated with the Melbourne criminal underworld."

Trkulja had successfully argued in the Victoria state Supreme Court in 2012 that Google defamed him by publishing photos of him linked to hardened criminals of Melbourne's underworld.

Four years later, the Victorian Court of Appeal overturned the decision, finding the case had no prospect of successfully proving defamation. The High Court rejected that finding and ordered Google to pay Trkulja's legal costs.

Google searches for "Melbourne criminal underworld photos" bring up images of Trkulja alongside gangland figures, his lawyer Guy Reynolds told the High Court in March.

However, Google's lawyers argued it would be "irrational" for someone to assume photos in a Google image search for underworld figures are all of criminals, because the same search would also bring up the Google logo, movie posters, images of crime victims and photos of actor Marlon Brando.

Trkulja is also claiming defamation around Google's "autocomplete" options for his name, which have included phrases like "is a former hit man," "criminal" and "underworld."

However, the court heard that autocomplete is an automated function and that previous searches influence future suggestions.

The defamation suit is expected to go back to the Victoria Supreme Court for trial.

Trkulja said he would continue the legal action until he gets the result he wants, fearful that someone will see the images and tell his grandchildren he is a hardened criminal.

"I will sue Google ... and I will sue them till they stop. I want them to block my pictures," he said. "I'm not a criminal, I've never been involved and I will make sure these people are not going to ruin my family — I have grandchildren," he added.

Google said in a statement: "We will continue to defend the claim. We decline to comment further on ongoing legal matters."...

Australian court rules man can sue Google for defamation