Microsoft says it’s making ‘adjustments’ to Tay chatbot after Internet ‘abuse’ – PCWorld


It sounds like Microsoft’s Tay chatbot is getting a time-out, as Microsoft instructs her on how to talk with strangers on the Internet. Because, as the company quickly learned, the citizens of the Internet can’t be trusted with that task.

In a statement released Thursday, Microsoft said that a “coordinated effort” by Internet users had turned the Tay chatbot into a tool of “abuse.” It was a clear reference to a series of racist and otherwise abusive tweets that the Tay chatbot issued within a day of debuting on Twitter. Wednesday morning, Tay was a novel experiment in AI that would learn natural language through social engagement. By Wednesday evening, Tay was reflecting the more unsavory aspects of life online.

Some of Tay's tweets were just odd (and racist), while others were simply controversial.

It didn’t take users long to discover that the Tay chatbot contained a “repeat after me” command, which they promptly exploited. The result was a series of tweets in which Tay parroted whatever users told her to say. On at least some tweets, Tay inexplicably added the “repeat after me” phrase to the parroted content, implying that users should repeat what the chatbot said. Naturally, those tweets were recirculated around the Internet.

As a result, Microsoft said Tay would be offline while the company made “adjustments.” “The AI chatbot Tay is a machine learning project, designed for human engagement,” a Microsoft spokeswoman said in a statement. “It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.” Microsoft gave no details on the adjustments it would make to the algorithm.

Why this matters: For decades, we’ve lived with search engines that have served as our digital servants, fetching information for us from the Internet. Only recently have those engines evolved into assistants, which can communicate on a more personal level. Microsoft clearly wants to go further, allowing its own assistant, Cortana, to interact using what computer scientists call “natural language.” Unfortunately, Tay was like a child wandering into some very dark corners of the Internet. But there are still signs that Microsoft could turn this into a positive.

A cultural experiment gone wrong…

In the wake of the abuse, Microsoft and Twitter began removing some of the more offensive tweets. Tay signed off on Wednesday night and hasn’t returned since.
