AI Incident Database

Incident 1109: Year-long AI Surveillance Pilot in Two South Australian Aged Care Facilities Reportedly Overwhelmed Staff with False Positives

Description: Between March 2021 and March 2022, an AI-enabled video and audio monitoring system was trialed in two South Australian aged care facilities. According to an independent audit commissioned by South Australia Health, the system produced over 12,000 false alerts, overwhelming staff and contributing to at least one missed real incident. The report concluded the system did not reach accuracy levels acceptable to staff or management during the pilot.
Editor Notes: The AI-enabled CCTV surveillance trial in two South Australian aged care facilities reportedly began around March 2021 and concluded around March 2022. This incident was ingested into the AIID on 06/20/2025.
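For a sense of scale, the reported figures imply a heavy daily alert burden on staff. The sketch below is a rough back-of-the-envelope calculation only; the even distribution of alerts over time and across the two facilities is an assumption for illustration, not something the reports state.

```python
# Rough scale of the reported false-alert load during the pilot.
# Source figures: "more than 12,000 false alerts" over a 12-month
# trial at two facilities. The even spread is an illustrative assumption.
false_alerts = 12_000   # reported lower bound
pilot_days = 365        # March 2021 to March 2022
facilities = 2          # Northgate House and Mount Pleasant District Hospital

alerts_per_day = false_alerts / pilot_days
alerts_per_facility_per_day = alerts_per_day / facilities

print(f"~{alerts_per_day:.0f} false alerts/day across both sites")
print(f"~{alerts_per_facility_per_day:.0f} false alerts/day per facility")
```

Even under this simple averaging, staff at each facility would have faced a steady stream of false alarms every day, which is consistent with the "alert fatigue" described in the audit coverage below.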


Entities

Alleged: Sturdie Trade Services developed an AI system deployed by South Australia Health, which harmed Residents and staff of Northgate House, Residents and staff of Mount Pleasant District Hospital, Northgate House, and Mount Pleasant District Hospital.
Alleged implicated AI system: AI-enabled video and audio surveillance system for fall/scream detection

Incident Stats

  • Incident ID: 1109
  • Report Count: 5
  • Incident Date: 2021-03-01
  • Editors: Daniel Atherton

Incident Reports


Aged care CCTV trial using artificial intelligence resulted in too many false reports, minister says
abc.net.au · 2022

A federally-funded CCTV trial in residential aged care homes alerted staff to so many incidents it became "a case of the boy who cried wolf", South Australia's Health Minister says.

The 12-month trial was looking into the potential of CCTV …

‘Botched’ aged care AI camera trial generates 12,000 false alerts
innovationaus.com · 2022

A 12-month pilot of AI-based surveillance technology designed to detect falls and abuse in two South Australian aged care homes generated more than 12,000 false alerts, a review has found.

The sheer number of alerts created alert fatigue th…

South Australia's aged care AI trial produced 12,000 false alarms
itnews.com.au · 2022

The Australian-first project was intended to pilot the use of cameras and AI to aid monitoring of residents under care, with a view to making the lives of staff easier.

However, a review of the pilot by PwC [pdf] showed the technology produ…

AI video surveillance care home trial produced too many false alarms, say auditors
ifsecglobal.com · 2022

A federally funded CCTV trial in two South Australian care homes -- where artificial intelligence was used to detect falls and screams -- produced more than 12,000 false incidents over a 12-month period, as Ron Alalouff reports.

An audit of…

Protecting the vulnerable, or automating harm? AI’s double-edged role in spotting abuse
theconversation.com · 2025

Artificial intelligence is rapidly being adopted to help prevent abuse and protect vulnerable people -- including children in foster care, adults in nursing homes and students in schools. These tools promise to detect danger in real time an…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

  • Australian Automated Debt Assessment System Issued False Notices to Thousands (Jul 2015 · 39 reports)
  • Australian Retailers Reportedly Captured Face Prints of Their Customers without Consent (May 2022 · 2 reports)
  • Facial Recognition Trial Performed Poorly at Notting Hill Carnival (Aug 2017 · 4 reports)


2024 - AI Incident Database