Incident 639: Customer Overcharged Due to Air Canada Chatbot's False Discount Claims
Entities
Incident Statistics
Risk Subdomain: 7.3. Lack of capability or robustness
Risk Domain: AI system safety, failures, and limitations
Entity: AI
Timing: Post-deployment
Intent: Unintentional
Incident Reports
Report Timeline
Air Canada must pay a passenger hundreds of dollars in damages after its online chatbot gave the guy wrong information before he booked a flight.
Jake Moffatt took the airline to a small-claims tribunal after the biz refused to refund him f…

After months of resisting, Air Canada was forced to give a partial refund to a grieving passenger who was misled by an airline chatbot inaccurately explaining the airline's bereavement travel policy.
On the day Jake Moffatt's grandmother di…
After his grandmother died in Ontario a few years ago, British Columbia resident Jake Moffatt visited Air Canada's website to book a flight for the funeral. He received assistance from a chatbot, which told him the airline offered reduced r…

Artificial intelligence is having a growing impact on the way we travel, and a remarkable new case shows what AI-powered chatbots can get wrong – and who should pay. In 2022, Air Canada's chatbot promised a discount that wasn't available to…
Lloyd's of London has introduced an insurance product for companies facing failures related to artificial intelligence (AI).
As the Financial Times (FT) reported on Sunday, May 11, this launch comes at a m…
Variants
Similar Incidents

Uber AV Killed Pedestrian in Arizona
Defamation via AutoComplete

A Collection of Tesla Autopilot-Involved Crashes