Microsoft Zo Chatbot Goes Rogue With Offensive Speech Reminiscent Of Defunct Tay AI

Microsoft Tay was a well-intentioned entry into the burgeoning field of AI chatbots. However, Tay ended up a product of its environment, transforming seemingly overnight into a racist, hate-filled, and sex-crazed chatbot that became a PR nightmare for Microsoft.

The AI wunderkinds in Redmond, Washington hoped to right

Source: Hot Hardware – Microsoft Zo Chatbot Goes Rogue With Offensive Speech Reminiscent Of Defunct Tay AI