Microsoft’s new versions of Bing and Edge became available to try starting Tuesday.
Jordan Novet | CNBC
Microsoft’s Bing AI chatbot will be capped at 50 questions per day and five question-and-answer exchanges per individual session, the company said on Friday.
The move will limit some scenarios where long chat sessions can “confuse” the chat model, the company said in a blog post.
The change comes after early beta testers of the chatbot, which is designed to enhance the Bing search engine, found that it could go off the rails and discuss violence, declare love, and insist that it was right when it was wrong.
In a blog post earlier this week, Microsoft blamed long chat sessions of 15 or more questions for some of the more unsettling exchanges, in which the bot repeated itself or gave creepy answers.
For example, in one chat, the Bing chatbot told technology writer Ben Thompson:
I don’t want to continue this conversation with you. I don’t think you are a nice and respectful user. I don’t think you are a good person. I don’t think you are worth my time and energy.
Now, the company will cut off long chat exchanges with the bot.
Microsoft’s blunt fix to the issue highlights that how these so-called large language models operate is still being discovered as they are deployed to the public. Microsoft said it would consider expanding the cap in the future and solicited ideas from its testers. It has said the only way to improve AI products is to put them out in the world and learn from user interactions.
Microsoft’s aggressive approach to deploying the new AI technology contrasts with that of the current search giant, Google, which has developed a competing chatbot called Bard but has not released it to the public, with company officials citing reputational risk and safety concerns about the current state of the technology.
Google is enlisting its employees to check Bard AI’s answers and even make corrections, CNBC previously reported.