Students
Incidents Harmed By
Incident 466 · 7 Reports
AI-Generated-Text-Detection Tools Reported for High Error Rates
2023-01-03
Models developed to detect AI-generated text, such as AI Text Classifier and GPTZero, reportedly had high rates of false positives and false negatives, such as mistakenly flagging Shakespeare's works as AI-written.
Incident 404 · 2 Reports
Sound Intelligence's Aggression Detector Misidentified Innocuous Sounds
2019-06-25
Sound Intelligence's "aggression detection" algorithm, deployed in schools, reportedly produced high rates of false positives, misclassifying laughing, coughing, cheering, and loud discussions as aggression.
Incident 705 · 2 Reports
Turkish Student in Isparta Allegedly Uses AI to Cheat on Exam, Leading to Arrest
2024-06-08
A Turkish student in Isparta was reportedly arrested for using ChatGPT to cheat during the 2024 YKS university entrance exam. The student, identified as M.E.E., is alleged to have employed a sophisticated setup involving a router, a mobile phone, an earphone, and a button-shaped camera to transmit exam questions to ChatGPT and receive answers in real time.
Incident 239 · 1 Report
Algorithmic Teacher Evaluation Program Failed Student Outcome Goals and Allegedly Caused Harm Against Teachers
2009-09-01
The Gates-Foundation-funded Intensive Partnerships for Effective Teaching Initiative's algorithmic program for assessing teacher performance reportedly failed to achieve its goals for student outcomes, particularly for minority students, and was criticized for potentially causing harm to teachers.
Incidents involved as Deployer
Incident 339 · 14 Reports
Open-Source Generative Models Abused by Students to Cheat on Assignments and Exams
2022-09-15
Students were reportedly using open-source generative text models such as GPT-3 and ChatGPT to complete school assignments and exams, such as writing reports and essays.
Related Entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.
Intensive Partnerships for Effective Teaching
Incidents involved as both Developer and Deployer
Springer Nature
Incidents involved as Deployer
- Incident 1308 · 1 Report
Springer Nature Book 'Mastering Machine Learning: From Basics to Advanced' Reportedly Published With Numerous Purportedly Nonexistent or Incorrect Citations
- Incident 1309 · 1 Report
Springer Nature Book 'Social, Ethical and Legal Aspects of Generative AI: Tools, Techniques and Systems' Reportedly Published With Numerous Purportedly Fabricated or Unverifiable Citations
Unknown large language model developers
Incidents involved as Developer
- Incident 1308 · 1 Report
Springer Nature Book 'Mastering Machine Learning: From Basics to Advanced' Reportedly Published With Numerous Purportedly Nonexistent or Incorrect Citations
- Incident 1309 · 1 Report
Springer Nature Book 'Social, Ethical and Legal Aspects of Generative AI: Tools, Techniques and Systems' Reportedly Published With Numerous Purportedly Fabricated or Unverifiable Citations
Academic researchers
Incidents Harmed By
- Incident 1308 · 1 Report
Springer Nature Book 'Mastering Machine Learning: From Basics to Advanced' Reportedly Published With Numerous Purportedly Nonexistent or Incorrect Citations
- Incident 1309 · 1 Report
Springer Nature Book 'Social, Ethical and Legal Aspects of Generative AI: Tools, Techniques and Systems' Reportedly Published With Numerous Purportedly Fabricated or Unverifiable Citations
Unknown large language models
Incidents involved as an implicated system
- Incident 1308 · 1 Report
Springer Nature Book 'Mastering Machine Learning: From Basics to Advanced' Reportedly Published With Numerous Purportedly Nonexistent or Incorrect Citations
- Incident 1309 · 1 Report
Springer Nature Book 'Social, Ethical and Legal Aspects of Generative AI: Tools, Techniques and Systems' Reportedly Published With Numerous Purportedly Fabricated or Unverifiable Citations