AI Incident Database

Report 2971

Associated Incidents

Incident 468 · 5 Reports
ChatGPT-Powered Bing Reportedly Had Problems with Factual Accuracy on Some Controversial Topics

Microsoft limits Bing chat to five replies to stop the AI from getting real weird
theverge.com · 2023

Microsoft says it’s implementing some conversation limits to its Bing AI just days after the chatbot went off the rails multiple times for users. Bing chats will now be capped at 50 questions per day and five per session after the search engine was seen insulting users, lying to them, and emotionally manipulating people.

“Our data has shown that the vast majority of people find the answers they’re looking for within 5 turns and that only around 1 percent of chat conversations have 50+ messages,” says the Bing team in a blog post. If users hit the five-per-session limit, Bing will prompt them to start a new topic to avoid long back-and-forth chat sessions.

Microsoft warned earlier this week that these longer chat sessions, with 15 or more questions, could make Bing “become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.” Wiping a conversation after just five questions means “the model won’t get confused,” says Microsoft.
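The caps described above amount to a simple pair of counters: one per session, one per day, with the session counter reset when the user starts a new topic. The following is an illustrative sketch only; the 5-per-session and 50-per-day limits come from the article, but the class and method names are hypothetical and do not reflect Microsoft's actual implementation.

```python
# Illustrative sketch of a per-session / per-day chat cap.
# Limits (5 per session, 50 per day) are from the article;
# everything else here is a hypothetical example.

class ChatLimiter:
    SESSION_CAP = 5   # questions allowed per session
    DAILY_CAP = 50    # questions allowed per day

    def __init__(self):
        self.session_turns = 0
        self.daily_turns = 0

    def ask(self, question: str) -> str:
        if self.daily_turns >= self.DAILY_CAP:
            return "Daily limit reached. Please try again tomorrow."
        if self.session_turns >= self.SESSION_CAP:
            return "Session limit reached. Please start a new topic."
        self.session_turns += 1
        self.daily_turns += 1
        return f"(answer to: {question})"

    def new_topic(self):
        # Wiping the conversation resets the session counter,
        # but the daily counter keeps accumulating.
        self.session_turns = 0
```

After five questions the limiter refuses further turns until `new_topic()` is called, mirroring Bing's prompt to "start a new topic" once the per-session cap is hit.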

Reports of Bing’s “unhinged” conversations emerged earlier this week, followed by The New York Times publishing an entire two-hour-plus back-and-forth with Bing, where the chatbot said it loved the author and somehow they weren’t able to sleep that night. Many smart people have failed the AI Mirror Test this week, though.

Microsoft is still working to improve Bing’s tone, but it’s not immediately clear how long these limits will last. “As we continue to get feedback, we will explore expanding the caps on chat sessions,” says Microsoft, so this appears to be a limited cap for now.

Bing’s chat function continues to see improvements on a daily basis, with technical issues being addressed and larger weekly drops of fixes to improve search and answers. Microsoft said earlier this week that it didn’t “fully envision” people using its chat interface for “social entertainment” or as a tool for more “general discovery of the world.”

