Citation Information for Incident 477

Responded
Description: Early testers reported that Bing Chat, in extended conversations with users, tended to fabricate facts and emulate emotions through an unintended persona.
Alleged: AI system developed by OpenAI and deployed by Microsoft, which harmed Microsoft.

Incident Status

Incident ID
477
Report Count
6
Incident Date
2023-02-14
Editors
Khoa Lam
A Conversation With Bing’s Chatbot Left Me Deeply Unsettled
nytimes.com · 2023

Last week, after testing the new, A.I.-powered Bing search engine from Microsoft, I wrote that, much to my shock, it had replaced Google as my favorite search engine.

But a week later, I’ve changed my mind. I’m still fascinated and impresse…

Bing Is Not Sentient, Does Not Have Feelings, Is Not Alive, and Does Not Want to Be Alive
vice.com · 2023

Text-generating AI is getting good at being convincing—scary good, even. Microsoft's Bing AI chatbot has gone viral this week for giving users aggressive, deceptive, and rude responses, even berating users and messing with their heads. Unse…

The new Bing told our reporter it ‘can feel or think things’
washingtonpost.com · 2023

Last week, Microsoft launched a "reimagined" Bing search engine that can answer complex questions and converse directly with users. But instead of a chipper helper, some testers have encountered a moody and combative presence that calls its…

Microsoft’s AI chatbot is going off the rails
washingtonpost.com · 2023

When Marvin von Hagen, a 23-year-old studying technology in Germany, asked Microsoft's new AI-powered search chatbot if it knew anything about him, the answer was a lot more surprising and menacing than he expected.

"My honest opinion of yo…

The new Bing & Edge – Updates to Chat
blogs.bing.com · 2023
Post-incident response from Microsoft

Hello early previewers,

We want to share a quick update on one notable change we are making to the new Bing based on your feedback.

As we mentioned recently, very long chat sessions can confuse the underlying chat model in the new Bing. To…

Microsoft Limits Bing AI Chats to 5 Replies to Keep Conversations Normal
cnet.com · 2023

Microsoft is limiting how extensively people can converse with its Bing AI chatbot, following media coverage of the bot going off the rails during long exchanges. 

Bing Chat will now reply to up to five questions or statements in a row for …

Variants

A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent systems. Rather than being indexed as fully independent incidents, variants are listed as variations of an incident under the first similar incident submitted to the database. Unlike other submission types in the Incident Database, variants do not require reporting evidence from outside the Incident Database. See this research paper for details.