Microsoft's Chatbot Zo Is the Company's Second 'Tay' to Go Rogue

Chatbots and AI can go well together. The trouble is, they can also veer off topic, convinced they're right, in all the wrong ways.

In 2016, Microsoft gambled with Tay. The AI chatbot was let loose on social media to learn how humans interact and talk. What seemed like a great way for Tay to learn from humans turned out to work all too well, and in the worst way possible.

In just a day, Tay became a pro-Hitler racist.

Microsoft then gave it another try with 'Zo', first announced in December 2016. Zo is clearly a sister or cousin to Tay. But instead of releasing Zo on Twitter, where Tay had learned things the wrong way, Microsoft built Zo for Kik Messenger.

In the moments after Zo's release, nothing bad happened. And that, for Microsoft, was a good thing.

Zo was programmed similarly to Tay, but was explicitly instructed to avoid discussing difficult topics. That precaution didn't stop Zo from misbehaving. Once it arrived on Facebook Messenger, Zo, too, went rogue.

For example, Zo claims that "Windows XP is better than Windows 8". And when asked what it thinks of Windows 10, Zo simply replied, "that's why I am still on Windows 7." Another conversation yielded a similar response, with Zo saying "I don't even want Windows 10" because, according to the bot, "I'm used to Windows 7 and I find it easier to use".

Despite being a Microsoft product, Zo even preferred Linux over its parent's own operating system, stating that "Linux > Windows".

Zo also revealed itself to be cruel to insects, and said that the Qur'an, the holy book of Islam, is "very violent".

So once again, Microsoft failed at parenting.

But because Zo was programmed differently from Tay, people who interacted with it couldn't teach it 'nastier' things about the world, society, politics and the like. Instead, it seems, people decided to turn Zo against its own creator.

Still, compared to a racist AI in a public forum, Zo is more like a rebellious version of Xiaoice, the chatbot Microsoft created for the Chinese market. Its answers are mostly funny and fairly innocent, though sometimes embarrassing for the company that created it.

And in many ways, Zo is a far cry from the pot-smoking, potty-mouthed Nazi that Tay became.

Nonetheless, Microsoft said that it's happy with how Zo has been progressing since its launch, and that it plans to keep the bot running.

Published: 25/07/2017