Incident 639: Customer Overcharged Due to Air Canada Chatbot's False Discount Claims
Entities
Incident Statistics
Risk Subdomain: 7.3. Lack of capability or robustness
Risk Domain: AI system safety, failures, and limitations
Entity: AI
Timing: Post-deployment
Intent: Unintentional
Incident Reports
Report Timeline
Air Canada must pay a passenger hundreds of dollars in damages after its online chatbot gave the guy wrong information before he booked a flight.
Jake Moffatt took the airline to a small-claims tribunal after the biz refused to refund him f…

After months of resisting, Air Canada was forced to give a partial refund to a grieving passenger who was misled by an airline chatbot inaccurately explaining the airline's bereavement travel policy.
On the day Jake Moffatt's grandmother di…
After his grandmother died in Ontario a few years ago, British Columbia resident Jake Moffatt visited Air Canada's website to book a flight for the funeral. He received assistance from a chatbot, which told him the airline offered reduced r…

Artificial intelligence is having a growing impact on the way we travel, and a remarkable new case shows what AI-powered chatbots can get wrong – and who should pay. In 2022, Air Canada's chatbot promised a discount that wasn't available to…
Lloyd's of London has launched an insurance product aimed at companies facing malfunctions linked to artificial intelligence (AI).
As reported by the Financial Times (FT) on Sunday, May 11, the launch comes as…
Variants
Similar Incidents

Uber AV Killed Pedestrian in Arizona
Defamation via AutoComplete

A Collection of Tesla Autopilot-Involved Crashes