Microsoft has big plans for bots. While the bot framework and bots to enhance day-to-day computing are taking shape, the company's chat bots have had mixed results. Tay, a chat bot released earlier in the year, was a complete disaster, but Microsoft learned from it. Now the company is back with a new chat bot called Zo.
With Tay, Microsoft expected the bot to adapt and learn, allowing her to hold conversations and answer questions on Twitter. However, Tay ultimately learned all the wrong things and started spouting unsavory content.
While Tay undoubtedly failed, it was a free swing for Microsoft. It was the company's first widely available chat bot and was a blank slate. With Zo, the question is whether Microsoft has learned from Tay and improved the chat bot experience.
At this point it is too early to draw any lasting conclusions. However, considering Tay came and went in a matter of days, we may not have to wait long to see whether Zo is a success. Like the previous effort, Zo learns and adapts, and is meant to respond to queries intelligently.
Zo is currently offering jumbled responses, which is expected this early in its life. The idea is that more interactions will improve the bot's ability to form smart responses. Eventually, users should be able to have a semblance of a conversation and extended dialog.
Microsoft launched Tay on Twitter and Kik, but Zo is available only on the latter service. Interestingly, Kik's user base skews toward young teenagers, which raises some concerns about what exactly Zo will be learning during interactions.
Tay suffered because users were simply teaching her the wrong kinds of things. Many of Tay's 100,000+ tweets were misguided, ranging from sexist and racist to right wing and everything in between.
In most cases Tay was repeating, verbatim, tweets that had been directed at her, something she would do whenever a user issued a "repeat after me" command. This simple command put Tay at the mercy of users. We will see if Microsoft has managed to avoid this kind of fallout with Zo.