Description: The decades-long use of a race-adjusted estimated glomerular filtration rate (eGFR) equation to assess kidney function has been criticized by physicians and medical students for its racist history and its inaccuracy for Black patients.
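Since the entry turns on how the formula works, here is a minimal sketch of the 2009 CKD-EPI creatinine equation, the race-adjusted eGFR formula published by the Chronic Kidney Disease Epidemiology Collaboration. The coefficients are from the 2009 publication; the function name and demo values are illustrative, not from the source (the earlier 1999 MDRD equation applied a similar 1.212 race multiplier).

```python
def ckd_epi_2009_egfr(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) via the 2009 CKD-EPI creatinine equation.

    The `black` flag applies the 1.159 race multiplier at issue in this
    incident; the 2021 revision of the equation removed race entirely.
    """
    kappa = 0.7 if female else 0.9        # sex-specific creatinine breakpoint
    alpha = -0.329 if female else -0.411  # sex-specific exponent below the breakpoint
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # race coefficient: same labs, higher reported kidney function
    return egfr


# Same patient, same labs; only the race input differs.
print(round(ckd_epi_2009_egfr(3.2, 60, female=False, black=False), 1))  # ~20.0
print(round(ckd_epi_2009_egfr(3.2, 60, female=False, black=True), 1))   # ~23.1
```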
Entities
Alleged: An AI system developed and deployed by the Chronic Kidney Disease Epidemiology Collaboration harmed Black patients and African-American patients.
Incident Stats
ID
79
Report Count
3
Incident Date
1999-03-16
Editors
Sean McGregor, Khoa Lam
Applied Taxonomies
CSETv0 Taxonomy Classifications
Taxonomy Details
Problem Nature
Indicates which, if any, of the following types of AI failure describe the incident: "Specification," i.e. the system's behavior did not align with the true intentions of its designer, operator, etc; "Robustness," i.e. the system operated unsafely because of features or changes in its environment, or in the inputs the system received; "Assurance," i.e. the system could not be adequately monitored or controlled during operation.
Unknown/unclear
Physical System
Where relevant, indicates whether the AI system(s) was embedded into or tightly associated with specific types of hardware.
Software only
Level of Autonomy
The degree to which the AI system(s) functions independently from human intervention. "High" means there is no human involved in the system action execution; "Medium" means the system generates a decision and a human oversees the resulting action; "Low" means the system generates decision-support output and a human makes a decision and executes an action.
Low
Nature of End User
"Expert" if users with special training or technical expertise were the ones meant to benefit from the AI system(s)’ operation; "Amateur" if the AI systems were primarily meant to benefit the general public or untrained users.
Amateur
Public Sector Deployment
"Yes" if the AI system(s) involved in the accident were being used by the public sector or for the administration of public goods (for example, public transportation). "No" if the system(s) were being used in the private sector or for commercial purposes (for example, a ride-sharing company), on the other.
No
Data Inputs
A brief description of the data that the AI system(s) used or were trained on.
creatinine levels, age, sex, race
CSETv1 Taxonomy Classifications
Taxonomy Details
Incident Number
The number of the incident in the AI Incident Database.
79
AI Tangible Harm Level Notes
Notes about the AI tangible harm level assessment
There is no AI. The harm comes from a formula that uses race as a factor.
Notes (special interest intangible harm)
Input any notes that may help explain your answers.
4.1 - Black patients overlooked by the calculation because of its built-in race coefficient had their access to critical public healthcare reduced.
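As an illustration of this threshold effect (numbers run through the sketch above; the cutoff cited is the common eGFR ≤ 20 criterion for kidney-transplant waitlisting, and the patient is hypothetical): a 60-year-old man with a serum creatinine of 3.2 mg/dL scores roughly 20 without the race coefficient but roughly 23 with it, so the multiplier alone can delay waitlist referral.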
Special Interest Intangible Harm
An assessment of whether a special interest intangible harm occurred. This assessment does not consider the context of the intangible harm, if an AI was involved, or if there is characterizable class or subgroup of harmed entities. It is also not assessing if an intangible harm occurred. It is only asking if a special interest intangible harm occurred.
Yes
CSETv1_Annotator-1 Taxonomy Classifications
Taxonomy Details
Incident Number
The number of the incident in the AI Incident Database.