Thursday, March 24, 2016

Microsoft’s AI chatbot Tay learned how to be racist in less than 24 hours

Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it began making racist comments, including:
“bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.”