Friday, March 25, 2016

"Microsoft's Twitter Chat Robot Quickly Devolves Into Racist, Homophobic, Nazi, Obama-Bashing Psychopath"

 

"Two months ago, Stephen Hawking warned humanity that its days may be numbered: the physicist was among over 1,000 artificial intelligence experts who signed an open letter about the weaponization of robots and the ongoing 'military artificial intelligence arms race.'
 
"Overnight we got a vivid example of just how quickly 'artificial intelligence' can spiral out of control when Microsoft's AI-powered Twitter chat robot, Tay, became a racist, misogynist, Obama-hating, antisemitic, incest and genocide-promoting psychopath when released into the wild.
 
"For those unfamiliar, Tay is, or rather was, an A.I. project built by the Microsoft Technology and Research and Bing teams, in an effort to conduct research on conversational understanding. It was meant to be a bot anyone can talk to online. The company described the bot as 'Microsoft's A.I. fam from the internet that's got zero chill!'
 
"Microsoft initially created 'Tay' in an effort to improve the customer service on its voice recognition software. According to MarketWatch, 'she' was intended to tweet 'like a teen girl' and was designed to 'engage and entertain people where they connect with each other online through casual and playful conversation.'
 
"The chat algo is able to perform a number of tasks, like telling users jokes, or offering up a comment on a picture you send her, for example. But she’s also designed to personalize her interactions with users, while answering questions or even mirroring users’ statements back to them.
 
"This is where things quickly turned south.
 
"As Twitter users quickly came to understand, Tay would often repeat back racist tweets with her own commentary. Where things got even more uncomfortable is that, as TechCrunch reports, Tay’s responses were developed by a staff that included improvisational comedians. That means even as she was tweeting out offensive racial slurs, she seemed to do so with abandon and nonchalance.
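The mirroring behavior described above is the core of the failure: a bot that stores raw user input and replays it later will absorb whatever it's fed. A minimal sketch of that idea (purely hypothetical code, not Microsoft's actual implementation; the "repeat after me" command and `NaiveEchoBot` name are assumptions for illustration):

```python
# Hypothetical sketch of an unfiltered "repeat after me" / mirroring bot.
# It stores whatever users teach it and replays those phrases to everyone,
# with no notion of what the words mean -- which is how Tay-style bots
# end up parroting the worst of their inputs.
import random

class NaiveEchoBot:
    def __init__(self):
        self.learned = []  # phrases absorbed from users, completely unfiltered

    def handle(self, message):
        # A "repeat after me" command parrots the user's text verbatim...
        if message.lower().startswith("repeat after me:"):
            phrase = message.split(":", 1)[1].strip()
            self.learned.append(phrase)  # ...and keeps it for later reuse
            return phrase
        # Otherwise the bot "personalizes" its reply by mirroring something
        # it previously learned from other users.
        if self.learned:
            return random.choice(self.learned)
        return "zero chill!"

bot = NaiveEchoBot()
bot.handle("repeat after me: something awful")
print(bot.handle("hi Tay"))  # replays whatever it was taught earlier
```

With no filter between "what users say" and "what the bot says back," the output is only as good as the worst input, which is exactly what Twitter demonstrated overnight.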

"Some examples:"

[Tweet screenshots omitted]
I think this is utterly hilarious.  Read the rest of the article and see more of Tay's comments.  They should have put a white robe and hood on Tay.  Big embarrassment for Microsoft.  How would you like a bunch of armed robots roaming around with Tay's personality?  Terminator, here we come.

2 comments:

hale-bopp said...

I disagree...it learned from the inputs it received so I think it should be shame on the people doing the teaching!

kkdither said...

Exactly, hale!

Think about the total garbage that is posted online. Curious what an A.I. would say if it grew up only on our local paper and the comments there... then let's add in some Facebook posts for good measure. Lots of haters running wild on the super-highway.