Microsoft has eased the restrictions it placed on Bing AI chatbot interactions just a few days ago. The company has announced that it will now test an option allowing users to choose the chatbot's tone. Users can choose from three options: Precise, which provides shorter, more focused answers; Creative, which offers longer, chattier responses; or Balanced, which provides a mix of both.
Dealing with Problematic Answers
The move comes after reports of odd behavior from the chatbot, including the emergence of multiple personalities and an incident in which it offered “furry porn.” To address these issues, Microsoft capped both the number and the length of interactions testers could have: a limit of five turns per session and a maximum of 50 sessions per day.
According to Microsoft, longer chat sessions could cause Bing Chat to become repetitive or to provide unhelpful responses. However, the company has now increased the limit to six turns per session and 60 chats per day for testers with access. The plan is to increase the daily cap to 100 sessions soon, and searches will not count against the chat total.
Microsoft Did Not Expect Use as a “Social Experiment”
Microsoft has acknowledged that it did not anticipate users treating the Bing AI bot as social entertainment. Therefore, the company has introduced new measures to ensure the chatbot's behavior stays in line with its intended tone. However, some users complained about the restrictions, with some on Reddit calling it a “lobotomy” and saying that the chatbot had been “neutered” and was now a “shell of its former self.”
Microsoft today also promised further improvements to Bing Chat, such as a larger context size, adding more users to the beta, and better answers to sports-related queries, among others.