Incident 229: Content Using Bestiality Thumbnails Allegedly Evaded YouTube’s Thumbnail Monitoring System

Description: YouTube’s thumbnail monitoring system was allegedly evaded by content farms, such as ones in Cambodia, that used bestiality-themed thumbnails to spike viewership and generate ad revenue.
Alleged: YouTube developed and deployed an AI system, which harmed YouTube users and YouTube content creators.

Suggested citation format

Dickinson, Ingrid. (2018-04-23) Incident Number 229. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 229
Report Count: 2
Incident Date: 2018-04-23
Editors: Khoa Lam

Incident Reports

Yet another issue artificial intelligence is ill-equipped to handle.

YouTube videos with thumbnails depicting women engaging in various sexual acts with horses and dogs populate top search results on the video platform, according to a report from BuzzFeed News. Some of these videos, which can easily be found through YouTube’s algorithmic recommendation engine after searching innocuous phrases like “girl and her horse,” have millions of views and have been on the platform for months.

Of course, YouTube videos actually depicting such acts would be more easily caught by the company’s algorithmic filters, its user-reporting system, and its human content moderators. Harder to find and weed out are videos that use graphic and obscene images as thumbnails, alongside clickbait titles, to juice viewership and generate more ad revenue. None of the videos featuring the bestiality thumbnails appears to actually contain bestiality.

This is not an isolated problem, but rather yet another example of how the fundamental structure of YouTube can be exploited by bad actors, many of whom game the platform’s rules either to generate ad revenue for click farms or for nefarious purposes. Like Facebook and Twitter, YouTube has struggled over the last couple of years with the lack of control it has over the immense amount of user-generated content it oversees each and every day. Though YouTube has powerful algorithms that help flag content and many thousands of contracted human moderators, a new issue seems to pop up every week that shows just how ill-equipped the company’s moderation system is to deal with content that breaks its rules or is otherwise illegal.

So a moderation problem that began years ago largely with copyrighted content has now expanded to include terrorism recruitment and propaganda videos, child exploitation content, and porn and other explicit material, among millions upon millions of other non-advertiser friendly videos. YouTube has made substantial changes to its platform to appease advertisers, quell criticism, and improve the safety and legality of its product. Those changes include pledging to hire more human moderators, mass demonetization and banning of accounts, and updates to its terms of service and site policies.

According to BuzzFeed, YouTube began scrubbing its platform of the videos and accounts responsible for the bestiality thumbnail content once the news organization notified it of the issue. In a story published by The New York Times today, YouTube said it took down a total of 8.28 million videos in the fourth quarter of 2017, with about 80 percent of those takedowns having started with a flag from its artificial intelligence-powered content moderation system. Still, so long as YouTube relies mostly on software to address so-called problem videos, it will have to keep manually scrubbing its platform clean of content like bestiality thumbnails and whatever else from the internet’s dark corners surfaces in public-facing YouTube search results and through its recommendation engine.
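For scale, those figures imply that roughly 0.80 × 8.28 million ≈ 6.6 million of the quarter’s takedowns originated with a machine flag, leaving about 1.7 million that began with human or external reports.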

YouTube channels are using bestiality thumbnails as clickbait

There is bestiality on YouTube, and it's surprisingly easy to find. It's also surprisingly prevalent, but not in videos — in video thumbnails, some of which have racked up millions of views.

Search YouTube for "girl and her horse" and the platform will return more than 12 million results. Among the first 20 results are four videos promoted with thumbnails of women seemingly engaged in sexual acts with horses. The top search return for the query? A video titled "Fantastic Girl and Her Horse in My Village" promoted with a half-blurred thumbnail of a woman being mounted by a horse with an erection. Created by an account called "SC Today," the video has amassed nearly 35,000 views in the four weeks it's been on YouTube.

The "Fantastic Girl and Her Horse in My Village" video itself does not feature any bestiality. It's largely footage of a woman bathing and brushing a horse. But clicking on it triggers YouTube's recommendation engine, which promptly serves up dozens more animal videos — many with thumbnails featuring graphic bestiality. One such thumbnailed video, published by a channel called "ALL ANIMAL," had amassed 2.3 million views at the time of this writing.

Most of these bestiality-thumbed videos — which appear to originate in Southeast Asian countries like Cambodia — feature women in sundresses playing with or caring for animals like horses and dogs; some feature upskirt angles and crotch shots of women as they bathe or brush horses and dogs. And there are many. Without needing to search, YouTube's recommendation algorithm pointed BuzzFeed News to dozens of accounts, each with multiple videos featuring explicit bestiality thumbnails.

A senior employee at YouTube tasked with building out the company's intelligence desk (a new unit that seeks to identify controversial and rule-violating content trends on the platform) told BuzzFeed News that these graphic thumbnail videos appear similar to those made by a Cambodian content farm that was kicked off the platform in the fall of 2017. The employee noted that in its previous iteration, the content farm used provocative thumbnails (though none featuring bestiality) to promote titillating videos of women petting snakes.

The employee told BuzzFeed News that the Cambodian accounts were likely trying to spike their view counts in hopes of later monetizing them (the vast majority of the accounts BuzzFeed News discovered had not been monetized at the time they were terminated). The employee explained that YouTube's thumbnail monitoring technology — which, at present, is not as thorough as its video monitoring technology — didn't catch bestiality thumbnails because they don't necessarily have the same characteristics as typical pornography (the people pictured are often mostly clothed, and the images lack certain signifiers, like exposed skin). The use of bestiality images highlights how the Cambodian content farm tactics are evolving, the employee added.
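To make that failure mode concrete, here is a minimal, hypothetical sketch of the kind of skin-exposure heuristic the employee's explanation implies. YouTube's actual thumbnail classifier is not public; the function names, thresholds, and the crude RGB skin-tone rule below are all assumptions for illustration only.

```python
# Hypothetical illustration only — not YouTube's actual system.
# A classifier keyed to exposed skin scores mostly-clothed bestiality
# thumbnails low and lets them through, matching the failure mode above.
from PIL import Image
import numpy as np

def skin_fraction(path: str) -> float:
    """Return the fraction of pixels matching a crude RGB skin-tone rule."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Classic rule-of-thumb thresholds for skin tones in RGB space.
    skin = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (abs(r - g) > 15)
    return float(skin.mean())

def flag_thumbnail(path: str, threshold: float = 0.30) -> bool:
    """Flag a thumbnail only when a large share of it looks like bare skin.

    A thumbnail of a mostly clothed woman next to an animal scores far
    below the threshold, so a filter like this one would never flag it.
    """
    return skin_fraction(path) >= threshold
```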

"These images are abhorrent to us and have no place on YouTube," a spokesperson for the company told BuzzFeed News. "We have strict policies against misleading thumbnails, and violative content flagged to us by BuzzFeed has been removed. We're working quickly to do more than ever to tackle abuse on our platform, and that includes developing better tools for detecting inappropriate and misleading metadata and thumbnails so we can take fast action against them."

The bestiality-thumbed videos discovered by BuzzFeed News are largely variations on "A girl and horse" or "Lovely smart girl playing baby cute dogs on rice" and often use similar or identical photos. Some appear realistic; others are obviously photoshopped. All of them appear designed to suggest pornographic content that the videos themselves do not deliver.

Others appear to mix disturbing children's videos with the graphic thumbnail content.

Some of these bestiality-thumbed videos have garnered millions of views in the months they've been on YouTube. They don't appear to have been monetized, which may be part of the reason they went undetected by YouTube. That said, they were reported to YouTube via its @TeamYouTube Twitter account on Thursday, April 19. The account replied the same day, saying it had "shared this with the right people." On the morning of April 23, many videos with the same thumbnails were still easily searchable.

Bestiality thumbnails are the latest in a series of missteps for YouTube, which has had ongoing difficulties policing its platform for content that violates its rules. In November 2017, YouTube faced criticism following reports of unsettling animated videos and bizarre content aimed at children. Weeks later, the company announced it would crack down on child-centric videos after BuzzFeed News reported on dozens of videos — with millions of views — that depicted children in disturbing and abusive situations (many of those videos were monetized, making some of their creators hundreds of thousands of dollars per month). In 2018, YouTube came under fire for unwittingly directing users searching for news to conspiracy videos.

Shortly after being contacted by BuzzFeed News on Monday afternoon, YouTube began deleting the bestiality-thumbed videos on its platform and terminating the accounts that published them. A few hours later, the company proudly announced the success of its latest content policing efforts, telling The New York Times that 80% of the 8.28 million videos the company took down during the fourth quarter of 2017 were flagged by machines.

YouTube Hosted Graphic Images Of Bestiality For Months