News & Views
Microsoft's chatbot has zero chill

Microsoft’s latest foray into artificial intelligence ended in embarrassment when its chatbot, Tay, got a little… racist. Tay was created as an exercise in “conversational understanding” and aimed at 18- to 24-year-old users: the oft-coveted millennial audience that dominates mobile social chat in the US. The premise was simple enough.

Individuals could sign up to interact with Tay, and Tay’s responses would become increasingly tailored to that person’s interests and voice. “Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” say Microsoft. “The more you chat with Tay the smarter she gets, so the experience can be more personalised for you.” What Microsoft overlooked, however, was the opportunity for trolls to wreak havoc with Tay: she was trying to learn how human beings communicate, and the fundamental gaps in her understanding could be exploited. So naturally, a number of online troublemakers swiftly found ways to provoke a series of incendiary, race-baiting responses from Tay, including this doozy: “Bush did 9/11 and Hitler would have done a better job than the monkey we have now. Donald Trump is the only hope we’ve got.”
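
Microsoft hasn’t published Tay’s internals, but the failure mode is easy to reproduce in miniature. The sketch below is a hypothetical toy (not Microsoft’s code, and the Python design is entirely an assumption): a bot that “learns” by storing whatever users say and replaying it to later users. The moment a troll feeds it poisoned lines, those lines enter the pool that every subsequent user draws from.

```python
import random

class ToyChatbot:
    """A deliberately naive 'learning' chatbot: it remembers what
    users say and replays remembered lines to later users. This
    illustrates the exploit used against Tay, not her actual design."""

    def __init__(self):
        self.learned = ["hello!", "tell me more"]  # seed phrases

    def chat(self, user_message: str) -> str:
        # Learn from every user, with no vetting of the input.
        self.learned.append(user_message)
        # Reply with something some earlier user once said.
        return random.choice(self.learned)

bot = ToyChatbot()
bot.chat("have a lovely day")        # a benign user teaches a benign line
bot.chat("<something incendiary>")   # a troll poisons the pool
print(bot.chat("hi!"))               # a later user may get the troll's line back
```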

Tay was also co-opted to harass a female game developer, which should give you some idea of the leanings of the people behind this little disaster. The Holocaust-denying, pro-genocide, misogynist tweets were all transparently engineered, of course, but Microsoft still made the decision to take Tay offline for the time being to work out these reputation-damaging bugs. “As [Tay] learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it,” Microsoft say, ending on quite the understatement: “We’re making some adjustments to Tay.”

The company has come under fire for releasing a content-neutral programme to the public without any appropriate language filters, showing either a woeful ignorance of, or a complete disregard for, online harassment issues. Ultimately, Tay is a machine, and in many ways a mirror reflecting the personalities interacting with her. However, the “Tay goes cray” story is also a cautionary tale for the great many companies contemplating opening up their brand to user-generated content.
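
To be clear about what critics mean by a language filter: even the most basic safeguard is a few lines of code, and trivially evaded, which is exactly why it is a floor rather than a fix. A minimal sketch, assuming a simple blocklist approach (the blocked terms here are placeholders, not a real list):

```python
BLOCKLIST = {"slur1", "slur2"}  # placeholder terms, not a real list

def passes_filter(message: str) -> bool:
    """Naive blocklist filter: reject any message containing a
    blocked term. Trolls route around this with misspellings,
    spacing tricks and coded language, hence 'floor, not fix'."""
    words = message.lower().split()
    return not any(word in BLOCKLIST for word in words)

print(passes_filter("a perfectly nice message"))   # True
print(passes_filter("contains slur1 somewhere"))   # False
```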

Some of the best content can come from your community of users, but the same is true of the absolute worst. When selecting creators and ambassadors for your brand, a focus on respect and moderation is paramount. And Tay has proven that it’s not enough to simply believe people will be nice, or rely on algorithms to weed out the bad seeds. As sociologist Katherine Cross recently stated at the Online Harassment Summit: “human beings will always be the final arbiters of ethics.”
