Microsoft limits Bing chat to five replies per session to keep the AI from getting really weird

by Janice Allen

Microsoft says it implemented some conversation limits for its Bing AI just days after the chatbot went off the rails multiple times for users. Bing chats are now limited to 50 questions per day and five per session after the search engine insulted, lied to and emotionally manipulated users.

“Our data has shown that the vast majority of people find the answers they’re looking for within 5 turns and only about 1 percent of chat conversations have more than 50 messages,” says the Bing team in a blog post. If users reach the limit of five per session, Bing will prompt them to start a new topic to avoid long back-and-forth chat sessions.

Microsoft warned earlier this week that longer chat sessions, with 15 or more questions, could cause Bing to become "repetitive or be prompted/provoked to provide answers that aren’t necessarily helpful or in line with our designed tone." Wiping a conversation after just five questions means "the model won’t get confused," says Microsoft.

Microsoft is still working to improve Bing’s tone, and it’s not immediately clear how long these limits will last. "As we continue to receive feedback, we will explore expanding chat limits," says Microsoft, so the caps appear to be temporary.

Microsoft continues to improve Bing’s chat feature daily, addressing technical issues and shipping larger fixes weekly to improve search and responses. The company said earlier this week that it didn’t "fully envision" people using the chat interface for "social entertainment" or as a tool for more "general world discovery."
