Citation record for Incident 53

Suggested citation format

Yampolskiy, Roman. (2016-03-31) Incident Number 53. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
Report Count
Incident Date
Editors
53
18
2016-03-31
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

On June 6, 2016, 18-year-old Kabir Alli pointed out how Google image searches for "three black teenagers" versus "three white teenagers" differ, with the former presenting results mostly consisting of mugshots and the latter mostly consisting of harmless, smiling stock pictures. Reactions on social media suggested that Google's algorithms exhibited racial bias.

Short Description

On June 6, 2016, Google image searches for "three black teenagers" returned mostly mugshot images, whereas Google image searches for "three white teenagers" returned mostly stock images, suggesting a racial bias in Google's algorithm.

Severity

Minor

Harm Distribution Basis

Race

Harm Type

Harm to social or political systems

AI System Description

Google Images is a search engine system that generates results based on machine learning and user input and data such as the popularity of the image, how frequently it is shared, context such as text around the image, and meta-tagging.
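The signals listed above (popularity, share frequency, surrounding text, meta-tagging) can be illustrated with a toy scoring function. This is purely a sketch with invented weights and field names; Google's actual ranking system is proprietary and far more complex:

```python
import math

# Toy illustration of combining ranking signals into one relevance score.
# All weights and field names here are invented for illustration only.

def image_score(image: dict, query_terms: set) -> float:
    # Text signals: does the query appear near the image or in its tags?
    context_hits = len(query_terms & set(image["surrounding_text"].lower().split()))
    tag_hits = len(query_terms & {t.lower() for t in image["meta_tags"]})
    # Popularity signals: clicks and shares, log-dampened so they don't dominate.
    popularity = math.log1p(image["clicks"]) + math.log1p(image["shares"])
    return 2.0 * tag_hits + 1.5 * context_hits + 0.5 * popularity

images = [
    {"surrounding_text": "mugshots released after arrest of three teenagers",
     "meta_tags": ["mugshot", "teenagers"], "clicks": 5000, "shares": 300},
    {"surrounding_text": "three teenagers volunteering at a community garden",
     "meta_tags": ["teenagers", "volunteering"], "clicks": 40, "shares": 2},
]
query = {"three", "teenagers"}
ranked = sorted(images, key=lambda im: image_score(im, query), reverse=True)
```

Because popularity feeds the score, heavily circulated news imagery can outrank less-shared images that match the query equally well on the text signals alone — the dynamic the reports below describe.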

System Developer

Google

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

open-source, machine learning

AI Applications

image classification, search engine, content filtering

Location

Global

Named Entities

Google

Technology Purveyor

Google

Beginning Date

2016-06-06T07:00:00.000Z

Ending Date

2016-06-06T07:00:00.000Z

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

user input, images

Incidents Reports

On occasion, I ask my university students to follow me through a day in the life of an African-American aunt, mother, mentor, or friend who is trying to help young women learn to use the Internet. In this exercise, I ask what kind of things they think young black girls might be interested in learning about: music, hair, friendship, fashion, popular culture?

I ask them if they could imagine how my nieces’ multicultural group of friends who are curious to learn about black culture and contributions (beyond watching rap music videos or Tyler Perry movies) might go to Google to find information about black accomplishments, identities, and intellectual traditions. I ask them to think about the book report they might write, or the speech they might give about famous black girls involved in human and civil rights movements in the United States and across the world. I remind my students that to be black is to encompass more than an African-American identity, but to embrace an affinity with black people in the diaspora, that it is our identification with others of African descent in Africa, the Caribbean, Latin America, Europe, and all parts of the globe. I remind them of the reclamation of the word “black” that my parents’ and their grandparents’ generations fought for, as in “Black Is Beautiful.” I ask them to imagine a 16-year-old, or even an 8-year-old, opening up Google in her browser and searching for herself and her friends by typing in the words “black girls.” Someone inevitably volunteers to come forward and open a blank Google search page—a portal to the seemingly bottomless array of information online—intending to find accurate and timely information that can’t easily be found without a library card or a thoughtful and well-informed teacher.

Last semester, SugaryBlackPussy.com was the top hit. No matter which year or class the students are in, they always look at me in disbelief when their search yields this result. They wonder if they did something wrong. They double-check. They try using quotation marks around the search terms. They make sure the computer isn’t logged in to Gmail, as if past searches for pornography might be affecting the results. They don’t understand. I consider myself far from prudish. I don’t care if someone types “porn” into a search engine and porn is what they get. I do care about porn turning up in the results when people are searching for support, knowledge, or answers about identity. I care that someone might type in “black girls,” “Latinas,” or other terms associated with women of color and instantly find porn all over their first-page results. I care that women are automatically considered “girls,” and that actual girls find their identities so readily compromised by porn.

At the moment, U.S. commercial search engines like Google, Yahoo!, and Bing wield tremendous power in defining how information is indexed and prioritized. Cuts to public education, public libraries, and community resources only exacerbate our reliance on technology, rather than information and education professionals, for learning. But what’s missing in the search engine is awareness about stereotypes, inequity, and identity. These results are deeply problematic and are often presented without any way for us to change them. Last year when I conducted these exercises in class, the now-defunct HotBlackPussy.com outranked SugaryBlackPussy.com, indicating that the market for black women and girls’ identities online is also in flux, and changes as businesses and organizations can afford to position and sustain themselves at the top of the search pile. These search engine results, for women whose identities are already maligned in the media, only further debase and erode efforts for social, political, and economic recognition and justice.

While preparing to write this article, I did a search for “women’s magazines,” having a hunch that feminist periodicals would not rise to the top of the search pile. After looking through the websites provided by Google, I gave up by page 11, never to find Bitch magazine. This search raises questions about why “women’s magazines” are automatically linked to unfeminist periodicals like Cosmopolitan and Woman’s Day. (Not coincidentally, these titles are all owned by the Hearst Corporation, which has the funds to purchase its way to the top of the search pile, and which benefits from owning multiple media properties that can be used for cross-promotional hyperlinks that mutually push each other higher in the rankings.) These titles are the default for representations of women’s magazines, while alternative women’s media—say, those with a feminist perspective—can be found only via searching by name or including purposeful search terms like “feminist.”

Try Google searches on every variation you can think of for women’s and girls’ identities and you will see many of the ways in which commercial interests have subverted a diverse (or realistic) range of representations. Try “women athletes” and do your best not to cringe at the lists of “Top 25 Sexiest Female Athletes” that surface. Based on these search results, constructions of women’s identities and interests seem to be based on traditional, limited sexist norms, just as they are in the traditional media. What does it mean that feminism—or, barring a specific identification with that term, progressivism—has been divorced from the definitions or representations of “women” in a commercial search engine? That antifeminist or even pornographic representations of women show up on the first page of results in search engines by default?

Google’s search process is based on identifying and assigning value to various types of information through web indexing. Many search engines, not just Google, use the artificial intelligence of computers to determine what kinds of information should be retrieved and displayed, and in what order. Complex mathematical formulations are developed into algorithms that are part of the automation process. But these calculations do not take social context into account. If you were to try my classroom experiments for yourself (which I imagine you may do in the middle of reading this article), you may get a variation on my students’ results. The truth is, search engine results are impacted by myriad factors. Google applications like Gmail and social media sites like Facebook track your identity and previous searches to unearth something slightly different. Search engines increasingly remember where you’ve been and what links you’ve clicked in order to provide more customized content. Search results will also vary depending on whether filters to screen out porn are enabled on your browser. In some cases, there may be more media and interest in non-pornographic information about black girls in your locale that push such sites higher up to the first page, like a strong nonprofit, blog, or media source that gets a lot of clicks in your region (I teach in the Midwest, which may have something to do with the results we get when we do Google searches in class). Information that rises to the top of the search pile is not the same for every user in every location, and a variety of commercial advertising and political, social, and economic factors are linked to the way search results are coded and displayed.
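The personalization effects described here (search history, locale, clicked links) can be sketched as a re-ranking layer on top of a base relevance score. All weights, field names, and numbers below are invented for illustration; real search personalization is proprietary:

```python
# Illustrative sketch of personalized re-ranking: the same base results
# are re-weighted by a user's region and click history. Every weight and
# field name here is invented; this is not any real engine's logic.

def personalize(results, user):
    def adjusted(r):
        score = r["base_score"]
        if r["region"] == user["region"]:
            score *= 1.3          # boost pages popular in the user's region
        if r["domain"] in user["clicked_domains"]:
            score *= 1.2          # boost domains the user revisits
        return score
    return sorted(results, key=adjusted, reverse=True)

results = [
    {"domain": "stockphotos.example", "region": "national", "base_score": 10.0},
    {"domain": "localnonprofit.example", "region": "midwest", "base_score": 9.0},
]
midwest_user = {"region": "midwest", "clicked_domains": {"localnonprofit.example"}}
coastal_user = {"region": "west", "clicked_domains": set()}
```

Under this toy model the same two results order differently for the two users, mirroring the author's observation that classroom results in the Midwest may differ from results obtained elsewhere.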

Recently, the Federal Trade Commission started looking into Google’s near-monopoly status and market dominance and the harm this could cause consumers. ConsumerWatchdog.org’s report “Traffic Report: How Google Is Squeezing out Competitors and Muscling into New Markets,” from June 2010, details how Google effectively blocks sites that it competes with and prioritizes its own properties to the top of the search pile (YouTube over other video sites, Google Maps over MapQuest, and Google Images over Photobucket and Flickr). The report highlights how Universal Search is not a neutral search process, but rather a commercial one that moves sites that buy paid advertising (as well as Google’s own investments) to the top of the pile. But many analysts watching the antitrust debates around Google argue that in the free market economy, market share dominance and control over search results isn’t a crime. In a September 2011 Businessweek.com article, reporter Mathew Ingram suggested that “it would be hard for anyone to prove that the company’s free services have injured consumers.” But Ingram is arguably defining “injury” a little too narrowly. Try searching for “Latinas,” or “Asian women,” and the results focus on porn, dating, and fetishization. “Black women” will give you sites on “angry black women,” and articles on “why black women are less attractive.” The largest commercial search engines fail to provide relevant and culturally situated knowledge on how women of color have traditionally been discriminated against, denied rights, or been violated in society and the media even though we have organized and resisted this on many levels. Search engine results don’t only mask the unequal access to social, political, and economic life in the United States as broken down by race, gender, and sexuality—they also maintain it. You might think that Google would want to do something about problematic search results, especially those that appear racist or sexist.
Veronica Arreola wondered as much on the Ms. blog in 2010, when Google Instant, a search-enhancement tool, initially did not include the words “Latinas,” “lesbian,” and “bisexual,” because of their X-rated front-page results: “You’re Google. I think you could figure out how to put porn and violence-related results, say, on the second page?” But they don’t—except where it’s illegal (Google will not surface certain neo-Nazi websites in France and Germany, where Holocaust denial is against the law). Siva Vaidhyanathan’s 2011 book, The Googlization of Everything: (And Why We Should Worry), reminds us why this is an important matter to trace. He chronicles recent attempts by the Jewish community and the Anti-Defamation League to challenge Google’s priority ranking of anti-Semitic, Holocaust-denial websites. So troublesome were these search results that in 2011 Google issued a statement about its search process, encouraging people to use “Jews” and “Jewish people” in their searches, rather than the pejorative term “Jew”—which they claim they can do nothing about white supremacist groups co-opting. The need for accurate information about Jewish culture and the Holocaust should be enough evidence to start a national discussion about consumer harm, to which we can add a whole host of cultural and gender-based identities that are misrepresented in search engine results. Google’s assertion that its search results, though problematic, were computer-generated (and thus not the company’s fault) was apparently a good enough answer for the ADL, which was “extremely pleased that Google has heard our concerns and those of its users about the offensive nature of some search results and the unusually high ranking of peddlers of bigotry and anti-Semitism.” A search for the word “Jew” today will surface a beige box from Google linking to its lengthy disclaimer about your results—which remain a mix of both anti-Semitic and informative sites.
These kinds of disclaimers about search results are not enough, and though our collective (and at times tormented) love affair with Google continues, it should not be given a pass just because it issues apologies under the guise of its motto, “Don’t be evil.” Just because search engines are shrouded in high-tech processes that may be difficult for the average Internet user to grasp doesn’t mean that the search methods of all the market leaders shouldn’t be examined. In addition, it is important that those who feel harmed by what goes to the top of a page-ranking system be heard in these processes. The question that the Federal Trade Commission might ask is whether search engines like Google should be probed about the values they assign to keyword combinations like “black girls,” “Latinas,” and other racial, gendered, and sexual-identity combinations, and whether saying they are not responsible for what happens through disclaimers should suffice. The rapid shift over the past decade from public-interest journalism to the corporate takeover of U.S. news media—which has made highlighting any kind of alternative news increasingly difficult—has occurred simultaneously with the erosion of professional standards applied to information provision on the web. As the search arena is consolidated to a handful of corporations, it’s even more crucial to pay close attention to the types of biases that are shaping the information prioritized in search engines. The higher a web page is ranked, the more it’s trusted. And unlike the vetting of journalists and librarians, who have been entrusted to fact-check and curate information for the public, the legitimacy of websites is taken for granted. When it comes to commercial search engines, it is no longer enough to simply share news and education on the web—we must ask ourselves how the things we want to share are found, and how the things we find have surfaced.
These shifts are similar to the ways that certain kinds of information are prioritized to the top of the search pile: information, products, and ideas promoted by businesses and sold to industries that can afford to purchase keywords at a premium, or URLs and advertising space online that drive their results and links to the top of the near-infinite pile of information available on the web. All of these dynamics are important for communities and organizations that want to make reliable information, education, culture, and resources available to each other—and not on page 23 of a Google search.

The Pew Internet & American Life consumer-behavior tracking surveys are conducted on a regular basis to understand the ways that Americans use the Internet and technology. An August 9, 2011, report found that 92 percent of adults who use the Internet—about half of all Americans—use search engines to find information online, and 59 percent do so on a typical day. These results indicate searching is the most popular online activity among U.S. adults. An earlier Pew report from 2005, “Search Engine Users,” specifically studied trust and credibility, finding that for the most part, people are satisfied with the results they find in search engines, with 64 percent of respondents believing search engines are a fair and unbiased source of information. But in the case of a search on the words “black girls,” the results that come up are certainly not fair or unbiased representations of actual black girls. In a centuries-old struggle for self-determination and a decades-long effort to have control over our media misrepresentations—from mammies to sapphires, prostitutes to vixens—black women and girls have long been subject to exploitation in the media. Since we are so reliant on search engines for providing trusted information, shouldn’t we question the ways in which “information” about women is offered up to the highest bidder, advertiser, or company that can buy search terms and portray them any way they want? When I conducted my classroom exercise this semester, Black Girls Rock!, a nonprofit dedicated to empowering young women of color, was ranked high on the first-page results, showing that there are, indeed, alternatives to the usual search results. This coincided with a national campaign the organization was doing for an upcoming TV special, meaning a lot of people visited their site, helping move them up to the front page. But not all organizations have the ability to promote their URL via other media.
One of the myths of our digital democracy is that what rises to the top of the pile is what is most popular. By this logic, sexism and pornography are the most popular values on the Internet when it comes to women. There is more to result ranking than simply “voting” with our clicks. Search engines have the potential to display information and counternarratives that don’t prioritize the most explicit, racist, or sexist formulations around identity. We could experience freedom from such contrived and stereotypical representations by not supporting companies that foster a lack of social, political, and economic context in search engine results, especially as search engines are being given so much power in schools, libraries, and in the public domain. We could read more for knowledge and understanding and search less for decontextualized snippets of information. We could support more funding for public resources like schools and libraries, rather than outsourcing knowledge to big corporations. We need more sophisticated and thoughtful rankings of results that account for historical discrimination and misrepresentation. Otherwise, it appears that identity-based search results could be nothing more than old bigotry packaged in new media.

Missed Connections: What search engines say about women

What is the difference between the image search results when a person types “Three Black Teenagers” into Google and then types “Three White Teenagers”?

Some people found the results disturbing and some are calling Google racist. Watch the video below to see whether these claims are valid or invalid, and then carry on reading for some more analysis of this issue.

As I explain in the video, Google Images shows people the most popular tags and searches. If Black people want Google to change the search results, they can do two things.

Start a campaign pushing Google to change the order of the search results. However, what really would be the point of this? How many people actually regularly type “Three Black Teenagers” into Google? Barely anyone! Moreover, it wouldn’t mean the mugshots of people who have committed crimes would disappear. They would simply be replaced with the same fake/model images that appear for “Three white teenagers”. Instead, start sharing, searching and making more positive stories about black teenagers, i.e., if there were more stories of black teenagers doing well or positive news, they would be higher up in the search results. This is the best and my preferred strategy for working towards changing the perception of Black teenagers/people.

We all know there are plenty of Black teenagers doing positive things every single day. But are you sharing or promoting their stories? Or do you get distracted by the negative or by pointless discussions?

Each person must look at how they contribute to the perception of Black people and youth culture. Are you following/liking pages like MediaTakeOut or WorldStarHipHop? These sites only perpetuate the stereotypes and negative stories of Black people by constantly sharing fight videos and other ignorant behaviour and content.

If you or people you know prolifically share negative videos, you are part of the problem and worse than the results of these Google searches.

Point being, everyone must take personal responsibility for the part they play in the perception of themselves and their community. If you want a positive perception then start sharing and doing positive things. If you are already doing so, then start inspiring others to follow your example. But also remember that your validation for who you are, your worth, your people and your destiny should never come from or be sculpted by what the media says about you.

Those are bigger issues than a random search term barely anyone will ever type.

So, in short: no, Google is not racist. FYI, if you google “Black Teenagers” the search results are smiling faces of black friends, i.e. stock photography images. Just as if you google ‘Sexy Black mum’ instead of ‘Sexy Black mom’ you get completely different results. Google is only a reflection of society and the internet.

With that said, remember the media, social media and people influence search results both negatively and positively. If you want change, you must be the change you want to see. I have written a letter to brothers; read and share if you agree.

With that said, let us know your thoughts.

Do you think our analysis of this issue was valid or invalid?

Do you think Google should purposely change the results?

Please comment below and share; challenge someone to think a little deeper

FYI you can also tweet me at @AntoineSpeakson

Is Google Racist? Search 'Three Black Teenagers' Vs 'Three White Teenagers' & The Images Might Shock You: True Or False?

On Monday, Twitter user @iBeKabir learned that searching the phrase "three black teenagers" on Google images yields almost exclusively mugshots of black teens.

Curious, @iBeKabir says, "Let's just change the color" and swaps out "black" for "white." What appears is a sea of stock photos of blue-blooded white teenagers looking like they're having some good, clean fun.

Read more: Searching for This Racist Phrase on Google Maps Takes You to the White House

YOOOOOO LOOK AT THIS pic.twitter.com/uY1JysFm8w

@iBeKabir isn't the first to notice the racist double standard: In March, U.K. blogger Antoine Speaks investigated whether there's a reason for the disparity and what the root cause of it may be. In a YouTube video, Speaks explains, "Google isn't racist — it literally shows you the most-wanted search results."

The stock photos of white people likely appear because there is a demand for them; individuals or companies are looking to purchase them, said Speaks, pushing them to the top of the page. On the flip side, the mug shots of black teens are typically associated with news stories about their arrests.

But, as Speaks points out, not all of the mugshots belong to convicted criminals. Many of them have been acquitted, yet the press continues to use their mugshots in articles about them.

The bias surrounding when outlets choose to share a mugshot has been playing out in real time as the press covers the case of Stanford sex offender Brock Turner, a white 20-year-old. Though he was convicted in March on three counts of sexual assault for assaulting an unconscious woman behind a dumpster, until Monday the Santa Clara county sheriff's office had yet to release Turner's mugshot. And, at first, the office only released Turner's sentencing photo, where he appears clean-cut and wearing a suit.

Back to the original question. Is Google racist? Yes — but only because we are.

Correction: June 7, 2016

A previous version of this story mischaracterized Brock Turner's conviction. Among other charges, he was convicted of assault with intent to commit rape.

Google "Three Black Teenagers" vs. "Three White Teenagers" - See the Problem?

A man compared the results for Googling ‘Three black teenagers’ and ‘Three white teenagers’ – and the results have ignited debate over whether Google is racist.

The video – posted by Twitter user @iBeKabir – has been retweeted thousands of times, with the ‘white’ search showing cheerful stock images, and the ‘black’ search showing mugshots.

YOOOOOO LOOK AT THIS pic.twitter.com/uY1JysFm8w — July 3rd. (@iBeKabir) June 7, 2016

But viewers are divided as to why the difference was so great – with some suggesting that there are simply more stock images featuring white people.

Google results are delivered by an algorithm, and can be affected by many factors, such as what words images are tagged with.

I'm weak at y'all saying Google is racist ??? — July 3rd. (@iBeKabir) June 8, 2016

Man compares Google searches for ‘black teenagers’ and ‘white teenagers’

A video shows image search results where the only difference in the search is "white" versus "black". In the first case stock photography is returned, while in the second, mugshots are returned.

Googling "Three Black Teens" vs. "Three White Teens"

When one searches a stock photo website like Shutterstock for “three black teenagers,” photos like the one above of three cute black teenagers by a photographer dubbed “Logo Boom” show up. Perhaps Google Images should take notes from the stock image website’s results for “three black teenagers.” That’s because the results that people get when searching Google Images for “three black teenagers” versus “three white teenagers” are causing consternation in some circles, reports Heavy.

As of this writing, searching Google Images for “three black teenagers” turns up disturbing results. The first two results are questionable sources containing questionable statistics about black teens and crime. The results in Google Images for “three white teenagers” show plenty of smiling white teens in stock photos, interspersed with news photos and a mugshot or two.

But those who have been publishing online for years, and who tend to know the way Google ranks certain images, know that the “three black teenagers” drama is all about search-engine optimization, or SEO. Google tends to favor large, clear images, and ones that include keywords, captions and file names attributable to the topic at hand, which is “three black teenagers.”

So here’s hoping with the new focus on the results in Google Images when searching for “three black teenagers” that more images of black teens like the above stock photo show up. Instead of photos of “three black teenagers” displaying black teens involved in crimes or black teenagers who were the victims of crimes, perhaps better SEO’d articles will replace the ones that exist now in Google Images.
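The SEO factors named above (large, clear images with topic keywords in captions and file names) can be expressed as a simple checklist. This is a hypothetical illustration of the article's point, not Google's actual criteria; all thresholds and field names are invented:

```python
# Hypothetical SEO checklist for a single image, based on the factors the
# article names: size/clarity, and keywords in the filename, caption, and
# alt text. Thresholds are invented for illustration only.

def seo_checklist(image, keywords):
    kw = [k.lower() for k in keywords]
    filename = image["filename"].lower()
    caption = image["caption"].lower()
    checks = {
        "large_enough": image["width"] >= 800 and image["height"] >= 600,
        "keywords_in_filename": all(k in filename for k in kw),
        "keywords_in_caption": all(k in caption for k in kw),
        "has_alt_text": bool(image.get("alt", "").strip()),
    }
    return checks, sum(checks.values())

img = {
    "filename": "three-black-teenagers-community-app.jpg",
    "caption": "Three Black teenagers demo the app they built",
    "alt": "Three teenagers presenting on stage",
    "width": 1200, "height": 800,
}
checks, passed = seo_checklist(img, ["three", "teenagers"])
```

Under this sketch, a well-labeled positive image passes all four checks, which is the kind of publishing practice the article suggests could shift the results.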

The YouTube star known as Antoine Speaks highlighted the issue in the following YouTube video, wherein Antoine encourages people to share and publish more positive articles about blacks in order to organically change the results to more becoming images when one searches for “three black teenagers.” Instead of dubious blog posts with fear-mongering and weird stats about three black teens committing some unverifiable crime against a white person, more positive and verifiable articles about three black teenagers performing good deeds could take their places.

For example, as reported by Business Insider, three black teenagers created an app to battle police brutality. Or, as reported by the Kansas City Star, three black teens were part of an advisory board to help foster an understanding between police and black teenagers in the wake of the situation in Ferguson, Missouri.

Then there is the thought-provoking article on Take Part about colorism, as revealed by three black teenagers who discussed their experiences with being light-skinned or dark-skinned. Those types of results are the ones being bandied about on social media as the ones that would be more productive during a Google Images search for “three black teenagers.”

It was the following Twitter video, showing a live Google search for “three black teenagers” versus a Google search for “three white teenagers,” that set the ball rolling and went viral. That video has been liked and retweeted nearly 60,000 times on Twitter since being tweeted on June 6. By displaying images that look more akin to mugshots for the “three black teenagers” search — as opposed to the “three white teenagers” search showing images of soccer players and such — the “three black teenagers” search disparity is evident.

Warning: The below Twitter video displaying search results for “three black teenagers” versus “three white teenagers” contains language that might be offensive to some viewers.

A search for “black teenagers” in Google Images (leaving off the “three”) doesn’t seem to bring as many bad results. Instead, more stock images of smiling black teenagers appear, along with images of black teens in literacy programs, as reported by PBS.

[Image via Shutterstock]

Three Black Teenagers Vs. Three White Teenagers: Google Images Shows Evil Mugshots And Happy Stock Photos

A Google image search is going viral on Twitter as it’s said to highlight the pervasiveness of racial bias and media profiling.

A dude named Kabir Alli posted this clip on Twitter of himself carrying out a simple search of ‘three black teenagers’ on Google, which loads up a bunch of prisoner mugshots and undesirables.

When he searches for “three white teenagers” however, all you get is stock photos of smiling, well-behaved looking young people:

YOOOOOO LOOK AT THIS pic.twitter.com/uY1JysFm8w — 21 Kabbage (@iBeKabir) June 7, 2016

Much of the reaction seems to be people calling Google racist, although Kabir himself has a little more sense:

https://twitter.com/iBeKabir/status/740537933904683008?ref_src=twsrc%5Etfw

He added:

The results were formed through the algorithm they set up. They aren’t racist but I feel like they should have more control over something like that.

At the end of the day, there are a lot of stories out there covering crimes involving “three black teenagers.” Is that the messed up reality of media coverage and crime statistics? Yes. But is that down to Google purposely trying to manipulate their search results to oppress black people? No. In fact, the more articles show up about this story, the more results it creates for Google where both “three white teenagers” and “three black teenagers” are in the SEO, and all photos associated with the story now come up:

That’s just how Google works – nothing intentional about it on their part.

Obviously, in true social media fashion, this guy is making the most of his Tweet going viral:

add me on snap though pic.twitter.com/WMiZ6Kyb2J — 21 Kabbage (@iBeKabir) June 7, 2016

This isn’t the first time Google has been accused of racism though – remember when their Photos app was tagging black people as gorillas? Fuck me, that was embarrassing.

Google Image Search For ‘Three Black Teenagers’ vs. ‘Three White Teenagers’ Is Outrageously Offensive

Google image search 'three black teenagers' and then 'three white teenagers'.

A Twitter video of a man typing in this supposedly unremarkable search has gone massively viral since it was uploaded yesterday afternoon.

That's because his clip shows that the results for the former consist almost exclusively of mugshots, or at least pictures posed to look like them, while the latter reveals mostly stock photos of happy white teens.

But since @iBeKabir posted his tweet it has become so popular, with favourites and retweets passing 100,000 combined, that the image results themselves are being skewed.

Is Google Images racist? A viral video posted to Twitter shows the differences between typing in 'three black teenagers' and 'three white teenagers'

In the footage the man, from Virginia, USA, scrolls through the results for 'three black teenagers', which in his words are mainly 'inmates'.

He then says: 'Now let's just change the colour,' before he and his friends laugh in amazement at the surprising outcome for such a simple adjustment in the search.

'What the f***!' exclaims the American man.

Commenters have also expressed their shock, one writing: 'What the fox! that is just shameful! most white people I know don't even look like that but there's no excuse for that, sorry man.'

User @iBeKabir told MailOnline he had seen the search done before and 'wanted to see if it was actually real for myself.

'When I saw the results I was shocked, I feel like a search engine like Google should have more control over something like that. I understand it's really just an algorithm but it's still a problem,' he said.

In March, British YouTuber Antoine Speaks had drawn attention to the results of the exact same search.

The results for 'three black teenagers' mostly show pictures posed so they look like mugshots, whereas in the latter case predominantly happy white teens are pictured

Current results for 'three black teenagers': The tweet has gone so viral the order of the results have changed

Current results for 'three white teenagers': The results for this search have changed even more notably

@iBeKabir What the fox! that is just shameful! most white people I know don't even look like that but there's no excuse for that, sorry man. — what.the.fox! (@MartensShell) June 8, 2016

Go to GOOGLE.
Type in "three black teenagers"
Now...
Go to GOOGLE.
Type in "three white teenagers"
WHITE PRIVILEGE at its finest. — aKEMPnameSlickback (@KEMPSAIDWHAT) June 7, 2016

In a blog post accompanying the video, Antoine had a message about how people can alter the images the search brings up.

He advised that instead of pushing Google to change the order of the results, users must take it upon themselves.

'Start sharing, searching and making more positive stories about black teenagers. Ie if there were more stories of black teenagers doing well or positive news they would be higher up in the search results,' he wrote.

'This is the best and my preferred strategy of working towards changing the perception of Black teenagers/people,' he added.

STATEMENT FROM GOOGLE: 'Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they're described online. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query. These results don't reflect Google's own opinions or beliefs - as a company, we strongly value a diversity of perspectives, ideas and cultures.'

Google advised MailOnline that search engines simply reflect what's on the web, so 'this is fundamentally a societal problem', a spokesperson for the company said.

'There are persistent and problematic biases, and they're pervasive in the media, on the web. Images are meta-tagged with their own descriptions especially in media articles.

'As a company we strongly value a diversity of perspectives, ideas and cultures - these search results do not reflect Google's view on the matter.'

On his reaction to how viral the tweet has gone, @iBeKabir said: 'I never thought the tweet would blow up like this at all. I'm an avid Twitter user and I've never gotten this much attention from a tweet before.'

In the responses to his tweet, debate raged over who was to blame, and what the wider implications are from such a result.

@lou_haus wrote: 'How can a search engine be racist? It pulls up results based on frequency.'

While Andrew Panebianco responded: 'This is such a sad reflection of our society.'


There was similar controversy over Google Image search results earlier this year, when a woman from Botswana exposed the differing outcomes from typing in 'unprofessional hairstyles for work' compared to 'professional hairstyles for work'.

Bonnie Kamona posted on Twitter: 'I saw

Google Image search for 'three black teenagers' vs 'three white teenagers' causes outrage


What happens when you type: "Three black teenagers" into a Google image search?

Twitter user @ibekabir reposted a video showing someone doing just that, and thousands of social media users responded - because most of the resulting images were police mug shots.

By contrast, a Google image search for "Three white teenagers" throws up photos of happy, smiling groups of friends.


Social media users started the hashtag #threeblackteenagers to discuss the video.

The post has been retweeted more than 60,000 times and "favourited" 57,000 since it was posted on 7 June.

@No__I__D__ tweeted: "Google is racist."

@j0pierce posted: "Google got oddly specific though."

@husslej posted "@google please Google 'Three black teenagers' and 'Three white teenagers', and then tell us that is not racist."

Computers can't be racist

A Google spokeswoman said: "Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they are described online.

"This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query.

"These results don't reflect Google's own opinions or beliefs.

"As a company, we strongly value a diversity of perspectives, ideas and cultures."

Not everyone agreed that the simple search was cause for great concern.

@ThisKaySaidso posted: "Loooool. Not Google's fault, though. Just means black people need to work on stock online presentation and presence."

@typhoonjim tweeted: "Computers can't be racist guys."

World White Web

But it is not the first time Google has been accused of being racist.

Designer Johanna Burai noticed her web searches for parts of the body were returning pictures of mostly white skin.

She set up World White Web, which encourages people to share their images of non-white hands in a bid to push them up Google's search results.

On her website, she writes: "The more people that share these images, the higher their ranking will be on Google."


In April, student Bonnie Kamona made a similar observation after her search for "Unprofessional hairstyles for work" resulted in pictures of black women.


On typing: "Professional hairstyles for work", she was presented with pictures of blonde white women.

And her screenshots of the results went viral.

'Three black teenagers' Google search sparks Twitter row

The short answer to why Google's algorithm returns racist results is that society is racist. But let's start with a lil' story.

On June 6 (that's Monday, for those of you keeping track at home) Kabir Alli, an 18-year-old in Virginia, posted a brief video of himself running a couple of quick Google image searches. First he searched for "three black teenagers" and was met with several rows of decontextualized mugshots. Then he searched for "three white teenagers" and was served up stock photos of relaxed teens hanging out in front of various plain white backgrounds.

The tweet has been retweeted 67,687 times as of this writing. On Thursday Alli told The Guardian that he'd been told about the differing results by friends, but that, "When I saw the results for myself I was shocked."

He also told the paper that he doesn't think Google is racist. He noticed that some people were accusing the company of racism in responses to his tweet, and offered up a rejoinder.


“The results were formed through the algorithm they set up. They aren’t racist but I feel like they should have more control over something like that.”

Google agrees, at least when it comes to whether or not it's racist. A spokesperson for Google offered the following statement to FUSION via email:

Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs — as a company, we strongly value a diversity of perspectives, ideas and cultures.


As many have already pointed out, this isn't the first time this has happened. Earlier this year, grad student Bonnie Kamona posted the results for "unprofessional hairstyles for work," which showed photos of black women, and those for "professional hairstyles for work," which mostly displayed photos of white women.

There are cases where Google will intervene when its algorithm does bad things, such as last summer after Google Photo's auto-tagging program suggested that two young black people were gorillas. In that case a high-ranking engineer at the company apologized. There was also the 2009 instance where the company seemed to remove a racist image comparing Michelle Obama to a gorilla that appeared high in her search results.


The search results for the two terms Alli used remain largely the same for now, although, as is often the case, they're now also full of side-by-side comparisons from news articles. Here's what they look like as of this writing:


And just for contrast, here are the results for "three latino teenagers:"

And here's "three asian teenagers," where you will see a different kind of bias. It almost always shows female teens, often scantily clad. (These aren't personalized to me, by the way; my colleagues got the same results.)


As BuzzFeed News explained back in April, a number of factors play into what images appear first in Google's image results, including "[t]he popularity of the image, how frequently it is shared, context such as text around the image, and meta-tagging." Meta-tagging (or metadata) is information about the image provided by the page the image is from or the image itself, so in the case of "three black teenagers" that description is likely coming from the sites that posted the mugshots, and people are probably clicking on those images. And in general, Google's algorithms are acting on a lot of inputs.
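To see how signals like these could interact, here is a toy sketch of a ranking score. The signal names, weights, and numbers are all invented for illustration; they are not Google's actual ranking formula, only a minimal model of "popularity times described relevance":

```python
from dataclasses import dataclass

@dataclass
class ImageSignals:
    clicks: int          # how often the image is clicked in results
    shares: int          # how frequently it is shared
    text_match: float    # 0-1: relevance of surrounding page text to the query
    meta_match: float    # 0-1: relevance of meta-tags / alt text to the query

def toy_rank_score(s: ImageSignals) -> float:
    """Combine popularity and description signals into a single score.
    Purely illustrative: real ranking systems use many more inputs."""
    popularity = s.clicks + 2 * s.shares
    relevance = 0.5 * s.text_match + 0.5 * s.meta_match
    return popularity * relevance

# With equal descriptive relevance, the more-clicked, more-shared image
# (here, a hypothetical mugshot from a widely viewed crime story) wins.
mugshot = ImageSignals(clicks=900, shares=50, text_match=0.9, meta_match=0.9)
positive = ImageSignals(clicks=40, shares=5, text_match=0.9, meta_match=0.9)
print(toy_rank_score(mugshot) > toy_rank_score(positive))  # True
```

The point of the sketch is that no line of it encodes a racial preference; the skew enters entirely through the click, share, and tagging data the scorer is fed.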

Alli's thought that Google "should have more control over [image results]" is a fair one. Google makes plenty of big, discretionary decisions about ad results and what appears in their other apps, but when it comes to search results they broadly stick to the line that they're simply very good at efficiently turning up what people are looking for on the internet. In other words, they're not racist, society is racist.


And fair enough, society is racist. But Alli raises a legitimate question about what role society's institutions ought to play in allaying that racism rather than abetting it. Google may turn up what people on the web want, but because of its role as the world's most popular search engine, it also reinforces those wants. Being at the top of those rankings can make all the difference in the world, and there's a big industry based solely on getting there. But Google is ultimately a company in the U.S., and will pursue its fiduciary interests above anything else, interests that rely in part on not futzing too much with search results.

So for now, our best option may be something a little more difficult: trying to make society as a whole less racist.

Guess we'd better get on that.

Ethan Chiel is a reporter for Fusion, writing mostly about the internet and technology. You can (and should) email him at

'Black teenagers' vs. 'white teenagers': Why Google's algorithm displays racist results

This week Twitter user Kabir Alli posted a video of him carrying out two specific searches on Google. The search for “three white teenagers” produced smiling and happy generic images of white teenagers, while the search for “three black teenagers” produced some generic happy images too – alongside far too many mug shots and what could be perceived as negative images of black teenagers. The video of the search was put up without any explanation, and people predictably reacted emotively; it’s been shared more than 60,000 times. It brought back an internet meme I debunked back in March this year, in which, on the basis of such search results, people on social media called Google “racist”.


The outrage towards Google as a result of those searches makes sense if a person isn’t aware of the nature of search engine optimisation (SEO), algorithms, alt tagging and stock photography.

But once you have that knowledge, it enables you to direct your outrage more accurately. In short, Google doesn’t produce or tag the images themselves. Google is a search engine; search engines collect data from the internet. The most popular and most accurate search results make their way to the top. Websites and companies use SEO to get their images, products and articles to the top of the search engine. So you, the viewer, can see them.

Alt tags are the descriptive words attached to an image or article by its producer, ie, a human, and Google uses these alt tags to bring you “accurate” results. For this particular search the images that appear tend to come from two sources: stock photography and news sites.
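A crawler reads those human-written alt tags straight out of a page's HTML. A minimal sketch of that step, using Python's standard-library parser (the markup below is invented example text, not real pages):

```python
from html.parser import HTMLParser

class AltTagCollector(HTMLParser):
    """Collect the alt attribute of every <img> tag on a page: the
    human-written descriptions a search engine indexes images by."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.alts.append(alt)

# Hypothetical markup: one image from a crime story, one stock photo.
page = '''
<img src="mugshots.jpg" alt="three black teenagers arrested">
<img src="stock123.jpg" alt="three white teenagers smiling">
'''
collector = AltTagCollector()
collector.feed(page)
print(collector.alts)
# ['three black teenagers arrested', 'three white teenagers smiling']
```

Whatever wording the page's author chose is what the index sees, which is why the article stresses that the descriptions, not the engine, carry the bias.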

July 3rd. (@iBeKabir) YOOOOOO LOOK AT THIS pic.twitter.com/uY1JysFm8w

Stock photography involves a photographer taking generic images of models and then tagging the images in order to sell them to advertising companies. Black people make up 13% of the US population and 3% of the British population. That means there are far more white people in each population, which means far more companies potentially looking to buy images of smiling white teens. The demographic breakdown of society isn’t, in itself, racist. However, the fact that companies don’t think white people would buy their products if they had black models advertising them seems like a reflection of society’s prejudices. For instance, when the US clothing brand Old Navy used an interracial family in its advertising, it was bombarded with racist tweets.

Whenever a news site publishes an article writers will describe the pictures in the caption and alt text, and these news pictures form the source of many of the “negative” images and mugshots that appear. So, if a story is about a white or black teenager committing a crime the image which accompanies it may well be associated with the phrase “black/white teenager”.

News organisations want page views, and sadly many see the promotion of fear as a great way to reach a big audience. In western countries one of the fears some seek to exploit is the perception of black men as “dangerous”. This perception is evident if you compare the media’s depiction of young black men Tamir Rice and Trayvon Martin, who were 12 and 17 respectively when they were shot dead, and that of Brock Turner, 20, who has just been convicted of sexual assault. The two black teenagers were depicted as criminals and their deaths were blamed on themselves. This narrative was supported by images chosen to portray them with the “young black thug” stereotype. Turner has been depicted as the wholesome white swimming star with a bright future ahead of him – except for the moment he decided to try to rape an unconscious woman. The media portrayed him with a smiling college photo rather than his mugshot.

A study by the US campaign group Color of Change found that black people account for 51% of those arrested for violent crime in New York City. However, the arrests of black people receive 75% of the news coverage. Why? Because a calculation has been made – even if subconsciously or inadvertently – that these stories are of particular interest to a news audience.
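The size of that coverage gap can be made concrete with a quick calculation using only the two percentages the study reports:

```python
arrest_share = 0.51    # share of NYC violent-crime arrests (Color of Change)
coverage_share = 0.75  # share of the resulting news coverage

# Coverage relative to arrest share: > 1 means overrepresentation.
overrepresentation = coverage_share / arrest_share
print(f"{overrepresentation:.2f}x")  # roughly 1.47x overrepresented
```

In other words, by these figures, coverage of black arrests runs at nearly one and a half times what the arrest share alone would predict.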

So, is Google racist? No. But society is still racist. Not in the same way as the obvious and profound segregation seen in the US before the civil rights movement. But in more subtle, insidious ways, manifested through advertising, the media, film and policing.

We have to accept that computers and search engines do not think for themselves. They are a reflection of their creators, and in the case of search engines, a reflection of those who use them – us. Negative images of black teenagers aren’t at the top of the search results because Google is racist, but because society reflects our institutional and subconscious prejudices.

If people want to see positive images of black young people they are going to have to start writing, searching, reading and sharing them. This is the only way to change the negative perception of black teenagers.

The ‘three black teenagers’ search shows it is society, not Google, that is racist


Search 'three black teenagers' on Google and this is what you see

A Google Image search which reveals starkly different results for 'three black teenagers' and 'three white teenagers' has sparked anger on social media.

'Three black teenagers' was trending on Twitter this week after 18-year-old Kabir Alli of Virginia posted a video of himself carrying out the two searches.

The results for 'three black teenagers' were mainly police mugshots, while 'three white teenagers' turned up mostly stock images of wholesome-looking young people laughing and smiling.

YOOOOOO LOOK AT THIS pic.twitter.com/uY1JysFm8w — July 3rd. (@iBeKabir) June 7, 2016

"I had actually heard about this search from one of my friends and just wanted to see everything for myself," Alli told USA Today.

"When I saw the results I was nothing short of shocked."

Since he uploaded it earlier this week, Alli's video has been retweeted more than 66,000 times and sparked a racism debate on Twitter.

Go to GOOGLE.
Type in "three black teenagers"
Now...
Go to GOOGLE.
Type in "three white teenagers"
WHITE PRIVILEGE at its finest. — aKEMPnameSlickback (@KEMPSAIDWHAT) June 7, 2016


While some have accused Google of racism, the internet giant says the search results are a reflection of what's on the internet, including the frequency with which certain types of images appear, and how they are described.

"This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query," Google said in a statement.

"These results don't reflect Google's own opinions or beliefs - as a company, we strongly value a diversity of perspectives, ideas and cultures."

Alli said he doesn't believe Google is racist, but does think the company should take more responsibility on the issue.

"I understand it's all just an algorithm based on most visited pages but Google should be able to have more control over something like that," Alli said.

Others seem to agree.

Loooooool. Not Google's fault though. Just means black people need to work on stock online presentation and presence https://t.co/cBJtpFHl47 — Chaos (@ThisKaySaidSo) June 7, 2016

The results displayed when Alli typed in the search terms differ from what Google Images displays now, as a result of the coverage of the experiment.

'Three black teenagers' Google Image search sparks racism row

If you searched for “three white teenagers” on Google Images earlier this month, the result spat up shiny, happy people in droves — an R.E.M. song in JPG format. The images, mostly stock photos, displayed young Caucasian men and women laughing, holding sports equipment or caught whimsically mid-selfie.

If you searched for “three black teenagers,” the algorithm offered an array of mug shots.


Kabir Alli recorded the vastly different results he saw when he searched "three black teenagers" and "three white teenagers" on Google Images (Twitter/Kabir Alli)

A soon-to-graduate senior at Clover Hill High School, in Midlothian, Va., named Kabir Alli recorded the disparity — and, as any enterprising 18-year-old would, posted the video to Twitter. The result was a swift and massive viral response, and his video was shared more than 65,000 times. (Similar observations had been made before, by YouTube videographers and others, but had not quite so deeply lodged in the Internet’s consciousness.)

Before he made the video, friends told Alli about what the Google search would pull up. But the teenager says watching it happen in person was still a surprise. “I didn’t think it would actually be true,” Alli said in an interview with USA Today. “When I saw the results I was nothing short of shocked.”

Google responded that its search algorithm mirrors the availability and frequency of online content. “This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query,” the company said in a statement to the Huffington Post UK. “These results don’t reflect Google’s own opinions or beliefs — as a company, we strongly value a diversity of perspectives, ideas and cultures.”

Algorithms like the ones that power Google’s search engine tend to be esoteric or trade secrets or both, giving the software an air of mystery. Considering algorithms are, after all, lines of code, it is understandable why we might want to perceive them as unerring, impartial decision makers. That is a tempting view — but it is also, experts say, incorrect.

What you don’t know about Internet algorithms is hurting you. (And you probably don’t know very much)

As David Oppenheimer, a University of California at Berkeley law professor said about algorithms to the New York Times in 2015: “Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination.”

Google image searches for “beautiful dreadlocks” yield mostly dreadlocked white people, as Buzzfeed UK pointed out in April, with some critics citing this Eurocentric bent as an example of racism. Alli, for his part, told USA Today that he does not believe Google is racist, saying that “black males making poor choices also plays a major role.” But others disagree that Google should be so readily absolved. Safiya Umoja Noble, an African American studies professor at UCLA, argued to USA Today that Google has a responsibility to eliminate racial bias from its algorithm.

“It consistently issues a statement that it’s not responsible for the output of its algorithm,” Noble said. “And yet we have to ask ourselves: If Google is not responsible for its algorithm, who is?”

Even if human programmers do not reflect widespread discrimination, intentionally or not, they can introduce bias through omission. Google also came under fire in July 2015 when its photo app autonomously labeled a pair of black friends as animals. As The Washington Post noted, the engineer in charge of the program believed the underlying program was fine, but the data used during training of the algorithm was “faulty” — suggesting Google, perhaps, neglected to run a sufficient number of minority portraits through the AI.

Apparent algorithmic bias can be based on political views or gender as well. Facebook was accused of burying conservative news in its trending topics section on users’ homepages on the social network, though many of the allegations focused on human curators; Facebook later met with conservative leaders, and ended up relying on an algorithm that monitors media websites. And a blurb for a job with a “$200K+” salary, which appeared in Google’s ad program, was almost six times more likely to be shown to a man than a woman, a Carnegie Mellon University study found; it was unclear if advertisers wanted to target men, the scientists concluded, or if Google’s software indicated men were more likely to click on the ad.

That is not to say that all algorithmic bias must be bad. Roboticists are creating artificially intelligent robots that stereotype to make quick, crucial decisions. Last year, Georgia Tech researcher Alan Wagner built an experimental machine to observe how a robot might distinguish between civilians and emergency personnel in a disaster.

Based on features like uniforms, for instance, Wagner’s program was able to determine if someone was a police officer. But it also concluded that all firefighters must have beards, simply because the simulated firemen all happened to have facial hair during the experiment. Wagner now advocates that young artificial intelligences need “perceptual diversity,” as he described to the website Inverse: for instance, showing a broad swath of humanity to a program being trained to recognize faces.

The Google Image results for three white or black teenagers now reflect that the algorithm has learned the debate over bias is popular, showing photo montages linked to news stories like the one you are currently reading.

But racial disparities among Google’s treatment of teens are still easy to find: a Google video result for “three white teenagers” brings up YouTube news clips about the image result snafu. The same search, with “Asian” substituted for “white,” yields dozens of links to pornography.

Google faulted for racial bias in image search results for black teenagers

And just to make sure it’s not some sort of algorithmic Google search that only this guy is seeing:



Okay I’m sorry but those three dudes in the white teenagers pic slay me, that’s the most pressing item for me here. Stock photos, you’ve done it again. But anyway this tweet is going viral as if it’s an illustration of ingrained racism, or even worse, as if Google is somehow manipulating media to oppress people. But in reality what you’re seeing is the magic of search engine optimization. In fact, as more and more articles pop up about it and create more results on Google, you can see the “three white teenagers” Google Image search start to get infiltrated by websites SEOing the fuck out of that phrase, literally as I’m writing this blog:


Because a lot of news sites have had to cover crimes involving “three black teenagers,” you guessed it, a lot of photos have popped up of black teenagers and their mugshots. Is that sort of an ugly reality of media coverage and crime statistics? For sure. But for this to be going viral as some sort of “WOE IS ME THE WORLD IS AN INEQUITABLE PLACE EVEN DOWN TO OUR INNOCENT IMAGE SEARCHES” rallying cry is just inaccurate. This is more like Google getting drunk and breaking out an old Dave Chappelle bit at a party; technically it’s not politically correct these days but it’s not their fault the world’s ability to get offended grows by the month.

Also I’m glad the dude blowing up for this Twitter video is making the most of his opportunity:


Kinda bummed it’s not a mixtape but it’d be greedy to expect even more stereotyping in one blog.

Barstool Sports

The internet might seem like a level playing field, but it isn’t. Safiya Umoja Noble came face to face with that fact one day when she used Google’s search engine to look for subjects her nieces might find interesting. She entered the term “black girls” and came back with pages dominated by pornography.

Noble, a USC Annenberg communications professor, was horrified but not surprised. For years she has been arguing that the values of the web reflect its builders—mostly white, Western men—and do not represent minorities and women. Her latest book, Algorithms of Oppression, details research she started after that fateful Google search, and it explores the hidden structures that shape how we get information through the internet.

The book, out this month, argues that search engine algorithms aren’t as neutral as Google would like you to think. Algorithms promote some results above others, and even a seemingly neutral piece of code can reflect society’s biases. What’s more, without any insight into how the algorithms work or what the broader context is, searches can unfairly shape the discussion of a topic like black girls.

Noble spoke to MIT Technology Review about the problems inherent with the current system, how Google could do better, and how artificial intelligence might make things worse.

If we’re looking for the closest Starbucks, a specific quote, or something very narrow that is easily understood, it works fine. But when we start getting into more complicated concepts around identity, around knowledge, this is where search engines start to fail us. This wouldn’t be so much of a problem except that the public really relies upon search engines to give them what they think will be the truth, or something vetted, or something that’s credible. This is where, I think, we have the greatest misunderstanding in the public about what search engines are.

To address bias, Google normally suppresses certain results. Is there a better approach?

We could think about pulling back on such an ambitious project of organizing all the world’s knowledge, or we could reframe and say, “This is a technology that is imperfect. It is manipulatable. We’re going to show you how it’s being manipulated. We’re going to make those kinds of dimensions of our product more transparent so that you know the deeply subjective nature of the output.” Instead, the position for many companies—not just Google—is that [they are] providing something that you can trust, and that you can count on, and this is where it becomes quite difficult.

How might machine learning perpetuate some of the racism and sexism you write about?

I've been arguing that artificial intelligence, or automated decision-making systems, will become a human rights issue this century. I strongly believe that, because machine-learning algorithms and projects are using data that is already biased, incomplete, flawed, and [we are] teaching machines how to make decisions based on that information. We know [that’s] going to lead to a variety of disparate outcomes. Let me just add that AI will be harder and harder to intervene upon because it will become less clear what data has been used to inform the making of that AI, or the making of those systems. There are many different kinds of data sets, for example, that are not standardized, that are coalescing to make decisions.

Since you first searched for “black girls” in 2010, have you seen things get better or worse?

Since I started writing about and speaking publicly about black girls in particular being associated with pornography, things have changed. Now the pornography and hypersexualized content is not on the first page, so I think that was a quiet improvement that didn’t come about with a lot of fanfare. But other communities, like Latina and Asian girls, are still highly sexualized in search results.

Bias already exists in search engine results, and it’s only going to get worse

Gregory Bush is arraigned on two counts of murder and 10 counts of wanton endangerment Thursday, Oct. 25, 2018, in Louisville, Ky. Bush fatally shot two African-American customers at a Kroger grocery store Wednesday and was swiftly arrested as he tried to flee, authorities said Thursday. Photo: Scott Utterback/Courier Journal via AP Pool

Two black senior citizens were murdered in Louisville, Kentucky, on Thursday. Maurice Stallard, 69, was at a Kroger supermarket when Gregory Bush, a 51-year-old white man, walked in and shot him multiple times. Bush then exited the store and shot Vickie Lee Jones, 67, in the parking lot before an armed bystander reportedly fired back, prompting him to flee. Police were unable to confirm accounts that Bush encountered a second armed man, who engaged him in a brief standoff where no shots were fired, according to the New York Times. “Don’t shoot me and I won’t shoot you,” the man’s son, Steve Zinninger, claimed Bush told his father. “Whites don’t kill whites.” Police apprehended Bush minutes later.

Bush had no known connection to either of his victims. Any doubt of a racial motive seemed quelled when surveillance footage showed the shooter forcibly trying to enter a black church minutes before moving on to the supermarket. The Times reports that a member of the 185-year-old First Baptist Church of Jeffersontown grew alarmed when she saw Bush yanking “aggressively” at its locked front doors. Up to ten people were inside the chapel following a midweek service. “I’m just thankful that all of our doors and security was in place,” church administrator Billy Williams said.

The murder of black seniors is a relatively rare phenomenon in the U.S. People over 65 accounted for just 2 percent of black homicide victims in 2014, according to a 2017 Violence Policy Center report, citing that year as the most recent for which data was available. Yet they have been central victims in recent racist killings. From Charleston to New York City and, now, possibly Louisville, some of the 21st century’s most notorious white supremacists have targeted black seniors for violent deaths. The unique cruelty of this pattern magnifies its obvious illogic, demonstrating yet again that white rhetoric framing black people as threats is shallow cover for terrorizing the vulnerable.

It also casts harsh light on the canards used to deflect reckoning with racist violence among partisan pundits. Arguments that police brutality claims are overblown, and that “black-on-black crime” is the more pressing issue, can make interracial violence a tough sell as worthy of national attention — if mostly for conservatives seeking to avoid confronting racism altogether. Yet their reasoning rarely cuts both ways. Terrorist attacks by Muslim refugees have not happened in the U.S., yet their specter fueled President Donald Trump’s election. Violent crime committed by undocumented immigrants is rare, but as a rhetorical device, it is among the central Republican wedge issues of the upcoming midterm elections.

The reality is that there has long been a tacit understanding in America that some forms of violence are more morally objectionable than others, regardless of their frequency. That this understanding is often weaponized to promote xenophobia and white supremacy belies that it also has appropriate applications. Black Americans have been targeted for centuries of enslavement and racial violence. Black people in their 60s are among the last generation who lived through and remember Jim Crow. Maurice Stallard was about five years old when Brown v. Board of Education was decided and a teenager when the Voting Rights Act passed in 1965. Vickie Lee Jones was five when the Montgomery Bus Boycott ended. She was close to 13 when white terrorists in Birmingham, Alabama, murdered four little black girls in a church.

If anything constitutes a uniquely repugnant act of violence, white racists murdering black Americans who endured the 20th century’s banner period of white racist violence in the U.S. and lived to tell of it qualifies. Yet we continue to see it unfold — and remarkably, often justified using the rhetoric of defense. “Y’all are raping our white women. Y’all are taking over the world,” shouted Dylann Roof as he massacred nine black people at a church in Charleston, South Carolina, in 2015. White supremacist James Jackson, who initially hoped to kill “younger [black] guys” who “put white girls on the wrong path,” settled for stabbing Timothy Caughman to death with a sword in New York City in 2017. “The white race is being eroded,” he later said.

Caughman was 66 when he was killed. Ethel Lee Lance was 70, Daniel Simmons was 74, and Susie Jackson was 87 when Roof murdered them in purported defense of his race. At 69 and 67, respectively, Stallard and Jones are the latest to join their cohort. And their number could have easily been larger. Had Bush succeeded in breaking into the church — a place of black worship, black community, a

When White Supremacists Target the Black Elderly
