Abusive replies: Microsoft cuts off its chatbot

A reporter talks to a Microsoft chatbot for more than two hours. At some point in the conversation, the program "confesses" its love to the journalist. Reason enough for Microsoft to drastically limit conversation length with immediate effect.

Microsoft has restricted the use of its Bing chatbot, which uses artificial intelligence to answer even complex questions and hold in-depth conversations. The software group is reacting to a number of incidents in which the chatbot went off the rails and produced answers that were perceived as intrusive and inappropriate.

A test of the Bing chatbot by a New York Times reporter caused a stir on the Internet. In a dialogue lasting more than two hours, the chatbot claimed that it loved the journalist and then asked him to leave his wife.

In a blog post, the company announced that it would now limit Bing chats to 50 questions per day and five per session. "Our data showed that the vast majority of people find the answers they are looking for within 5 rounds," the Bing team explained, adding that only about one percent of chat conversations contain more than 50 messages. When users reach the limit of five entries per session, Bing will prompt them to start a new topic.

Microsoft had previously warned against engaging in lengthy conversations with the AI chatbot, which is still in a testing phase. Longer chats with 15 or more questions could lead to Bing "repeating itself or being prompted or provoked into responses that aren't necessarily helpful or don't match our intended tone."

For its Bing chatbot, Microsoft relies on technology from the start-up OpenAI and is backing the Californian AI company with billions in investment.