Incident 577: Bankrate's Resumption of AI-Generated Content Allegedly Continuing to Produce Inaccurate and Misleading Information
Description: Bankrate and its sister site CNET, both owned by Red Ventures, resumed publishing AI-generated articles while claiming thorough human fact-checking. However, the new articles allegedly contain numerous factual errors, including inaccurate statistics and misleading information. Despite public criticism, the company defended its use of AI and blamed out-of-date datasets for the errors. Beyond the errors themselves, the incident raises questions about the ethical use of AI in journalism, especially given the company's insistence that the content is "fact-checked."
The finance site Bankrate has resumed publishing AI-generated articles, and it insists that this time each one is meticulously fact-checked by a human journalist before publication.
"This article was generated using automation te…