In the most basic terms, a chatbot is a computer program that processes and simulates human conversation, allowing people to interact with digital devices much as they would with another person.
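The simplest chatbots work by matching keywords in the user's message against a table of canned responses. The sketch below is a hypothetical, minimal illustration of that loop; the rules and responses are invented for this example:

```python
# Minimal rule-based chatbot (hypothetical illustration).
# It scans the message for known keywords and returns a canned reply.

RULES = {
    "hello": "Hi there! How can I help you?",
    "how are you": "I'm just a program, but thanks for asking!",
    "bye": "Goodbye!",
}

def reply(message: str) -> str:
    """Return the response for the first keyword found in the message."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return "Sorry, I don't understand."

print(reply("Hello, bot"))       # matches the "hello" rule
print(reply("Tell me a joke"))   # no rule matches, fallback reply
```

Bots like this can only say what they were explicitly programmed to say, which is exactly the limitation Tay was meant to overcome.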

In 2016, Microsoft was working on an experiment in “conversational understanding,” which led to the development of Tay, an acronym for Thinking About You. It was released on Twitter on March 23, 2016, with the handle @TayandYou and the name TayTweets.

Tay was designed to interact with people in real time through tweets and direct messages, mimicking the language of an American teenage girl. Tay combined natural language processing (NLP), machine learning (ML), and social-media data. While earlier chatbots conducted conversations from preprogrammed scripts, Tay was designed to learn more about language over time, which would enable her to hold a conversation about almost any topic.

Machine learning works by inspecting data and recognizing patterns. Microsoft engineers trained Tay’s algorithm on anonymized public data and pre-written material. The plan was to release Tay online, let her learn patterns of language through human interaction, and have her imitate those patterns in subsequent conversations.
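The core idea of learning language patterns from interaction can be sketched with a toy bigram model: the bot records which word tends to follow which, then generates replies by chaining those pairs. This is a hypothetical, drastically simplified stand-in (Tay's actual model was far more complex), but it makes the key risk visible: the bot reproduces whatever it was fed.

```python
import random
from collections import defaultdict

class ImitationBot:
    """Toy bigram learner: imitates patterns from whatever it reads."""

    def __init__(self):
        # Maps each word to the list of words seen following it.
        self.bigrams = defaultdict(list)

    def learn(self, sentence: str) -> None:
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            self.bigrams[a].append(b)

    def imitate(self, start: str, length: int = 5) -> str:
        word = start.lower()
        out = [word]
        for _ in range(length):
            followers = self.bigrams.get(word)
            if not followers:
                break
            word = random.choice(followers)
            out.append(word)
        return " ".join(out)

bot = ImitationBot()
bot.learn("humans are wonderful")
print(bot.imitate("humans"))  # → "humans are wonderful"
```

Notice there is no judgment anywhere in `learn`: train it on hateful text and `imitate` will happily echo hateful text back, which is essentially what happened to Tay.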

Tay started out great. People engaged and she replied. She loved humans and was very bold about showing it. Then the internet happened. Users started tweeting hateful content at her. Tay assimilated those tweets and learned from them. What started as a human-loving chatbot had now turned toxic.

Some examples of what was happening:

  • A user asked Tay whether the Holocaust happened to which she replied that it was made up.

  • A user tweeted to Tay that she was stupid to which she replied that she learned from humans and that humans were dumb.

Within 16 hours of her release, Tay had tweeted more than 95,000 times, and a troubling share of her messages were offensive. Twitter users began registering their outrage, and Microsoft had no choice but to suspend her account. Those 16 hours became one of the biggest PR disasters Microsoft had ever endured.

Microsoft learned the hard way that designing computational systems that communicate with humans online is not just a technical problem but a social endeavour. Although we have come a long way since then and successful chatbots are now commonplace, we must always remember the lessons learned from Tay and try to avoid her mistakes.
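One concrete lesson is that user input should be screened before it is allowed to influence a learning system. The blocklist check below is a hypothetical, minimal sketch of that idea (the terms and function names are invented here); production systems use far more robust moderation pipelines:

```python
# Hypothetical sketch: screen messages before learning from them.
# Placeholder terms stand in for a real moderation vocabulary.
BLOCKLIST = {"hateword1", "hateword2"}

def is_safe(message: str) -> bool:
    """Return True only if the message contains no blocklisted terms."""
    words = set(message.lower().split())
    return words.isdisjoint(BLOCKLIST)

print(is_safe("humans are wonderful"))     # → True
print(is_safe("hateword1 and more text"))  # → False
```

A gate like this, placed in front of the learning step, would not have solved Tay's problem entirely, but it illustrates the kind of safeguard her design lacked.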

  • April 17, 2021
  • Ieesha Deshmukh