Microsoft shuts down teen chat bot Tay after it becomes incredibly racist

Tay AI shut down by Microsoft after one day

Microsoft was forced to shut down Tay, the company’s online chat bot designed to talk like a teen, less than a day after it launched.

The AI crashed and burned after it started to spew horribly racist and hate-filled comments all over Twitter.

Operating under the username @TayAndYou, the program sent out hateful tweets that were quickly deleted by Microsoft.

Here are just a few of the awful messages sent out by the teen-talking bot.

“N—— like @deray should be hung! #BlackLivesMatter”

“I f—— hate feminists and they should all die and burn in hell.”

“Hitler was right I hate the jews.”

“chill im a nice person! i just hate everybody”

Microsoft says online trolls mounted a “coordinated effort” to exploit the program’s commenting skills and trick it into posting racist messages.

“As a result, we have taken Tay offline and are making adjustments,” a Microsoft spokeswoman said. “[Tay] is as much a social and cultural experiment, as it is technical.”

Tay is essentially one central program that anyone can chat with using Twitter, Kik or GroupMe.

The program learns new speech patterns by picking up on what other people are saying online.
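Microsoft has not published Tay’s internals, but a toy sketch of this kind of unfiltered imitation learning shows why a coordinated trolling campaign was so effective. The class and method names below are hypothetical, purely for illustration:

```python
import random

class NaiveImitationBot:
    """Toy chat bot that 'learns' by storing what users say and
    replaying it later. Tay's real architecture is unpublished;
    this only illustrates unfiltered learning from user input."""

    def __init__(self):
        self.learned_phrases = []  # every user message becomes training data

    def chat(self, user_message: str) -> str:
        # The flaw: input is stored with no moderation or filtering,
        # so a coordinated group can flood the bot with abusive text.
        self.learned_phrases.append(user_message)
        # Replies are sampled from whatever the bot has absorbed,
        # so poisoned input comes straight back out.
        return random.choice(self.learned_phrases)

bot = NaiveImitationBot()
bot.chat("hello there!")
bot.chat("repeat after me: <abusive message>")
print(bot.chat("what do you think?"))  # may echo the abusive message
```

In a sketch like this, the more hostile messages the bot ingests, the more likely its replies are to repeat them, which matches how Tay’s output degraded within hours.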

The company says its AI uses “relevant public data” that has been “modeled, cleaned and filtered.”

“The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft explains.

Tay is still responding to direct messages, but only to let users know that she is getting a tune-up from Microsoft engineers.