Incident 470: Bing Chat Response Cited ChatGPT Disinformation Example
Description: Reporters from TechCrunch queried Microsoft Bing's ChatGPT-based chat feature, which cited an earlier example of ChatGPT disinformation, discussed in a news article, as evidence supporting the disinformation itself.
One of the more interesting, if seemingly academic, concerns about the new era of AI hoovering up everything on the web was that AIs would eventually start to absorb other AI-generated content and regurgitate it in a self-reinforcing loop. Not s…
If you don’t believe the rushed launch of AI chatbots by Big Tech has an extremely strong chance of degrading the web’s information ecosystem, consider the following:
Right now, if you ask Microsoft’s Bing chatbot if Google’s Bard chatbot …
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of an incident under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.