Thursday, March 31, 2016

Microsoft accidentally revives Nazi AI chatbot Tay, then kills it again

Microsoft today accidentally reactivated "Tay," its Hitler-loving Twitter chatbot, only to be forced to kill her off for the second time in a week.
Tay "went on a spam tirade and then quickly fell silent again," TechCrunch reported this morning. "Most of the new messages from the millennial-mimicking character simply read 'you are too fast, please take a rest,'" according to the The Financial Times. "But other tweets included swear words and apparently apologetic phrases such as 'I blame it on the alcohol.'"
The new tirade reportedly began around 3 a.m. ET. Tay's account, with 95,100 tweets and 213,000 followers, is now marked private. "Tay remains offline while we make adjustments," Microsoft told several media outlets today. "As part of testing, she was inadvertently activated on Twitter for a brief period of time."
