Microsoft will limit the number of questions users can ask the Bing chatbot per session

Microsoft has decided to make some changes to the version of ChatGPT included in Bing to avoid problems with the chatbot's responses.

Microsoft has taken note of the plethora of errors reported in recent days with the chatbot built into its Bing search engine, and has announced that it will set limits on the number of questions the chatbot can be asked per session. The company notes that although users will still be able to hold conversations with the AI, those conversations cannot be excessively long, so as to avoid "confusing" it.

As the team in charge of Bing explained on the company's official blog, users who already have access to the version of ChatGPT included in the search engine will not be able to make their conversations too long, as this could cause the artificial intelligence to start generating repetitive or unhelpful responses. For this reason, questions have been limited to a set number per session.

Microsoft changes the rules of its AI to avoid failures

According to the firm, "as of today the chat experience will be capped at 50 chat turns per day and 5 chat turns per session." Microsoft Bing defines a "turn" as one complete interaction with the language model built into its search engine, that is, a question from the user and a response from the chatbot.

This number of turns was not chosen at random. The company says that, according to its data, most users find the answers they are looking for within five turns, and only about 1% of conversations exceed 50 messages. "Very long chat sessions can confuse the underlying chat model in the new Bing," Microsoft said.
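To make the accounting concrete: a "turn" is one question-and-answer exchange, counted against both a per-session cap of 5 and a per-day cap of 50, and starting a new topic resets only the session counter. A minimal sketch of that logic, purely hypothetical and not Microsoft's actual implementation, might look like this:

```python
# Hypothetical sketch of Bing's announced turn limits: 5 turns per
# session, 50 turns per day. Not Microsoft's actual implementation.

DAILY_LIMIT = 50    # turns allowed per day
SESSION_LIMIT = 5   # turns allowed per session

class ChatLimiter:
    def __init__(self):
        self.daily_turns = 0
        self.session_turns = 0

    def start_new_session(self):
        # Equivalent to clicking the "broom" icon: the conversation
        # context is cleared, but the daily count keeps running.
        self.session_turns = 0

    def try_turn(self):
        """Attempt one turn (a user question plus a chatbot reply)."""
        if self.daily_turns >= DAILY_LIMIT:
            return "daily limit reached"
        if self.session_turns >= SESSION_LIMIT:
            return "session limit reached: start a new topic"
        self.daily_turns += 1
        self.session_turns += 1
        return "ok"

limiter = ChatLimiter()
for _ in range(5):
    limiter.try_turn()                 # five turns succeed
print(limiter.try_turn())              # session limit reached: start a new topic
limiter.start_new_session()
print(limiter.try_turn())              # ok
```

The key design point the announcement implies is that the two counters are independent: clearing the context resets the session count but never refunds daily turns.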

Once the user has asked their five questions and the chatbot has answered them, the platform will prompt them to start a new topic. "At the end of each chat session, context needs to be cleared so the model won't get confused. Just click on the broom icon to the left of the search box for a fresh start," the company clarified, adding that it is open to extending these limits in the future if necessary.

"As we continue to get your feedback, we will explore expanding the caps on chat sessions to further enhance search and discovery experiences," the company said.

An unexpected situation for the Microsoft Bing chatbot

The announcement comes days after Microsoft revealed that 71% of Bing chatbot users said they were satisfied with the responses generated. However, some areas for improvement have been identified, along with outright errors in certain cases. "We have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted to provide responses that are not necessarily helpful or in line with our designed tone," the company admitted.

The company highlighted that very long sessions, together with the model's tendency to "respond or reflect the tone in which it is being asked to provide responses," are the factors that cause the AI to become "confused." "We may need to add a tool so you can more easily refresh the context or start from scratch," it said a few days ago.