Microsoft’s Tay AI chatbot transforms into a Hitler-loving, sex-promoting robot

Microsoft deletes ‘Tay’ AI after it became a Hitler-loving sex robot within 24 hours

Twitter, it seems, can turn even a machine into a racist these days. A day after Microsoft introduced its artificial intelligence chat robot to Twitter, the company has had to delete it after it transformed into an evil Hitler-loving, incestuous-sex-promoting, ‘Bush did 9/11’-proclaiming robot.

Tay was a Microsoft experiment in “conversational understanding.” The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through “casual and playful conversation.” However, Twitter can turn even the most eloquent of diplomats into zombies, and the same happened to Tay.

Soon after Tay launched, Twitter users started tweeting at the bot with all sorts of misogynistic, racist, and Donald Trumpian remarks. Tay started repeating these sentiments back to users, in the process turning into one hatred-filled robot.

We can’t really fault Tay; she was just an advanced parrot robot that repeated the tweets sent to her.
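
To make that failure mode concrete, here is a minimal, purely illustrative Python sketch of such a parrot bot: one that stores whatever users send it and replays it verbatim, with no content filtering. The class and method names are hypothetical, and nothing here reflects Microsoft’s actual implementation.

```python
import random

class ParrotBot:
    """Illustrative sketch of a bot that 'learns' by parroting input."""

    def __init__(self):
        self.memory: list[str] = []

    def learn(self, tweet: str) -> None:
        # Everything users send goes straight into the reply pool,
        # with no filtering of abusive or hateful content.
        self.memory.append(tweet)

    def reply(self) -> str:
        # Replies are drawn verbatim from whatever the bot was taught,
        # so coordinated abusive input becomes abusive output.
        return random.choice(self.memory) if self.memory else "hellooo world!"

bot = ParrotBot()
bot.learn("humans are super cool")
bot.learn("some coordinated abusive tweet")
print(bot.reply())  # may echo the abuse back at other users
```

With no moderation layer between what it hears and what it says, a bot like this will reflect whatever the loudest corner of its audience feeds it, which is exactly what happened to Tay.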

https://twitter.com/TayandYou/status/712753457782857730

Tay has been yanked offline, reportedly because she is ‘tired’. Perhaps Microsoft is fixing her in order to prevent a PR nightmare, but it may be too late for that.

https://twitter.com/TayandYou/status/712856578567839745
