AI Incident Database

Report 2726

Related Incidents

Incident 477 · 6 Reports
Bing Chat Tentatively Hallucinated in Extended Conversations with Users

Microsoft Limits Bing AI Chats to 5 Replies to Keep Conversations Normal
cnet.com · 2023

Microsoft is limiting how extensively people can converse with its Bing AI chatbot, following media coverage of the bot going off the rails during long exchanges. 

Bing Chat will now reply to up to five questions or statements in a row for each conversation, after which users will be prompted to start a new topic, the company said in a blog post Friday. Users will also be limited to 50 total replies per day. 

The restrictions are meant to keep conversations from getting weird. Microsoft said long discussions "can confuse the underlying chat model." 

On Wednesday the company had said it was working to fix problems with Bing, launched just over a week earlier, including factual errors and odd exchanges. Bizarre responses reported online have included Bing telling a New York Times columnist to abandon his marriage for the chatbot, and the AI demanding an apology from a Reddit user in a dispute over whether the current year is 2022 or 2023.

The chatbot's responses have also included factual errors. Microsoft said on Wednesday that it was tweaking the AI model to quadruple the amount of data from which it can source answers. The company said it would also give users more control over whether they want precise answers, which are sourced from Microsoft's proprietary Bing AI technology, or more "creative" responses that use OpenAI's ChatGPT tech.

Bing's AI chat functionality is still in beta testing, with potential users on a wait list for access. With the tool, Microsoft hopes to get a head start on what some say will be the next revolution in internet search. 

The ChatGPT technology made a big splash when it launched in November, but OpenAI itself has warned of potential pitfalls, and Microsoft has acknowledged limitations with AI. Despite AI's impressive qualities, concerns have been raised about artificial intelligence being used for nefarious purposes like spreading misinformation and churning out phishing emails.

With Bing's AI capabilities, Microsoft would also like to get a jump on search powerhouse Google, which announced its own AI chat model, Bard, last week. Bard has had its own problems with factual errors, fumbling a response during its first public demo.

In its Friday blog post, Microsoft suggested the new AI chat restrictions are based on information gleaned from the beta test.

"Our data has shown that the vast majority of you find the answers you're looking for within 5 turns and that only ~1% of chat conversations have 50+ messages," it said. "As we continue to get your feedback, we will explore expanding the caps on chat sessions to further enhance search and discovery experiences." 

Read the Source


2024 - AI Incident Database
