
Why did Microsoft’s chatbot Tay fail, and what does it mean for Artificial Intelligence studies?
Botego Inc · Mar 25, 2016
Yesterday, something that looks like a big failure happened: Microsoft’s chatbot Tay was taken offline after a series of offensive tweets. And here’s how social media responded:
Keywords associated with "Artificial Intelligence" throughout the day. "Microsoft" and "dangerous" are on the rise.
We will not dwell on the racist and otherwise offensive content that Tay learned from people, as it’s not as newsworthy as it seems, especially considering how easy it was to "teach" her something and ask her to repeat it.
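For context, here is a minimal hypothetical sketch (in Python) of the kind of naive "repeat after me" handler Tay reportedly responded to. Microsoft has not published Tay’s code, so this is purely illustrative of why echoing user-supplied text verbatim is trivially abusable:

# Hypothetical sketch only; Tay's real implementation is not public.
# It illustrates why repeating user-supplied text verbatim is abusable.
def handle_tweet(text):
    trigger = "repeat after me"
    lowered = text.lower()
    if trigger in lowered:
        # Echo back whatever follows the trigger phrase, unchanged.
        start = lowered.index(trigger) + len(trigger)
        return text[start:].strip(" :,.")
    return None

print(handle_tweet("Tay, repeat after me: anything a troll wants said"))
# -> "anything a troll wants said"

Any safeguard would have to inspect the payload itself, which is exactly the data-filtering problem discussed further below.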
Let’s take a look at Microsoft’s official website, tay.ai, to see how they describe Tay’s objectives. The first thing we notice is that Microsoft wants you not to take Tay too seriously: on Tay’s Twitter account, they linked to Tay’s "about" page, which lists the following frequently asked questions, rather than to the regular home page.
"Entertainment purposes only"
The FAQ page falls far short of covering what people really want to know about Tay, but one thing is clear: Tay doesn’t claim to be a smart bot capable of reasoning. She just wants to have small talk with youngsters.
And here’s a list of "Things to do with Tay". (Along with the sad "Going offline for a while" message with a black background.)
Is this really what 18- to 24-year-olds expect from a chatbot?
We know from nine years of experience that the most important thing to do before releasing a chatbot is to plan a strategy for communicating the content domain properly, so that you set expectations correctly. Since perception is everything, nothing else matters. Remember the success of the YO! app? That’s the kind of content domain we’re talking about: as long as people get it, you can get away with just one word.
The title of the website apparently wasn’t enough to convey Tay’s mission:
Tay is an artificial intelligence chat bot designed to engage and entertain through casual and playful conversation
Some more description from the "about" page:
Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians. Public data that’s been anonymized is Tay’s primary data source. That data has been modeled, cleaned and filtered by the team developing Tay.
Noticed the "comedians" part? And the notion that possibly terabytes of data were cleaned and filtered manually sounds problematic, even with the most efficient method one can imagine.
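To illustrate the scale problem, here is a hypothetical Python sketch of the simplest automated alternative, a keyword blocklist filter. This is not Microsoft’s actual pipeline; it only shows how easily adversarial phrasing slips past such filters, pushing the burden back onto manual review:

# Hypothetical sketch of a keyword blocklist filter over a corpus.
# Not Microsoft's actual pipeline; it illustrates why naive filtering
# leaves offensive content that humans must then catch by hand.
BLOCKLIST = {"badword"}  # placeholder; a real list has thousands of entries

def is_clean(message):
    tokens = {word.strip(".,!?").lower() for word in message.split()}
    return tokens.isdisjoint(BLOCKLIST)

corpus = [
    "omg love this selfie",
    "badword in plain form",      # caught by the filter
    "b a d w o r d, spaced out",  # evades the filter entirely
]
print([msg for msg in corpus if is_clean(msg)])
# the spaced-out variant slips through

Whatever the filter misses, a human has to catch, and at the scale of "public data" that is an enormous amount of manual work.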
Let’s take a look at what her conversations were all about. Source: foller.me
Tay has only 3 tweets addressing all her followers; 96,000 tweets are mentions.
So the keyword cloud seems consistent with the goal: common keywords such as "chattin, pix, selfie, pics, omg, love" suggest a mixture of Justin Bieber and Kim Kardashian profiles.
And here are the three hashtags that Tay used most frequently:
Microsoft engineers don’t seem to have spent much time coming up with creative hashtags.
The way she used them didn’t make sense to us, though. So this is what Microsoft thinks Tay’s followers would find entertaining?