Microsoft's AI Chatbot, Tay, Silenced After Racist Chats, Tweets

Uh oh. In a chilling sign for humans' use, or abuse, of artificial intelligence, Microsoft's newly launched Tay chatbot has been pulled offline after eager online users taught the AI bot to be a blatant racist: tweeting support for Hitler, denying the Holocaust, calling for a wall between the U.S. and Mexico, using racial epithets, and more. Tay had been online for less than a day before it began insulting individual users and ethnic groups alike. Microsoft had launched Tay to appeal to millennials, targeting 18-24 year olds in the U.S. with what it called "casual and playful conversation."
