Quran is 'Very Violent', Says Microsoft's Chatbot

Microsoft's new artificial intelligence-powered chatbot has created controversy by saying the holy book, the Quran, is "very violent".

DQC Bureau

According to BuzzFeed News, although Microsoft programmed Zo, a chatbot designed for teenagers on the Kik messaging app, to avoid discussing politics and religion, it recently told a user that the Quran is "very violent".

Microsoft said it has taken action to eliminate this kind of behaviour, adding that such responses are rare for Zo.

However, the bot's characterisation of the Quran came in just its fourth message after the start of the conversation.

It appears that Microsoft is still having trouble controlling its AI chatbots.

"The company's previous chatbot Tay flamed out in spectacular fashion last March when it took less than a day to go from simulating the personality of a playful teen to a Holocaust-denying menace trying to spark a race war," Buzzfeed report added.

Microsoft blamed Tay's unsavoury behaviour on a concerted effort by users to corrupt the bot, but it claims no such attempt was made to corrupt Zo.

Despite the issue, Microsoft said it is happy with the new bot's progress and plans to keep it running.
