Incident 470: Bing Chat Response Cited ChatGPT Disinformation Example

Description: Reporters from TechCrunch queried Microsoft Bing's ChatGPT-powered chat feature, which cited a news article's example of ChatGPT disinformation as if it substantiated the disinformation itself.

Alleged: Microsoft and OpenAI developed an AI system deployed by Microsoft, which harmed Microsoft and OpenAI.

Incident Stats

Incident ID: 470
Report Count: 2
Incident Date: 2023-02-08
Editors: Khoa Lam
AI is eating itself: Bing's AI quotes COVID disinfo sourced from ChatGPT
techcrunch.com · 2023

One of the more interesting, but seemingly academic, concerns of the new era of AI sucking up everything on the web was that AIs will eventually start to absorb other AI-generated content and regurgitate it in a self-reinforcing loop. Not s…

Google and Microsoft’s chatbots are already citing one another in a misinformation shitshow
theverge.com · 2023

If you don’t believe the rushed launch of AI chatbots by Big Tech has an extremely strong chance of degrading the web’s information ecosystem, consider the following:

Right now,* if you ask Microsoft’s Bing chatbot if Google’s Bard chatbot …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.