CSETv1 Taxonomy Classifications
Incident Number: 37

CSETv1_Annotator-1 Taxonomy Classifications
Incident Number: 37

CSETv1_Annotator-2 Taxonomy Classifications
Incident Number: 37

Notes (special interest intangible harm): Resumes featuring language commonly associated with women were downgraded.
Special Interest Intangible Harm: yes
Date of Incident Year: 2014
Estimated Date: No
Multiple AI Interaction: no
CSETv0 Taxonomy Classifications
Problem Nature: Specification
Physical System: Software only
Level of Autonomy: Medium
Nature of End User: Expert
Public Sector Deployment: No
Data Inputs: Resumes
Incident Reports
- View the original report at its source
- View the report at the Internet Archive
Amazon has scrapped a “sexist” tool that used artificial intelligence to decide the best candidates to hire for jobs.
Members of the team working on the system said it effectively taught itself that male candidates were preferable.
The arti…
Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.
The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…
Amazon discovered a problem with using artificial intelligence to hire: their AI was biased against women.
The Seattle-based company developed computer programs designed to filter through hundreds of resumes and surface the best candidates,…
Machine learning, one of the core techniques in the field of artificial intelligence, involves teaching automated systems to devise new ways of doing things, by feeding them reams of data about the subject at hand. One of the big fears here…
Image caption: The algorithm repeated bias towards men, reflected in the technology industry. (Getty Images)
An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, acco…
Machine learning technology is becoming increasingly common across various industries, from policing to recruiting. But reports have shown that many of these systems have long-standing problems regarding discrimination. To avoid amplifying …
SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.
The team had been building computer programs since 2014 to review job applicants’ resu…
Image caption: Amazon sign, with dude. (David Ryder/Getty Images)
Thanks to Amazon, the world has a nifty new cautionary tale about the perils of teaching computers to make human decisions.
According to a Reuters report published Wednesday, the tech giant d…
Why Global Citizens Should Care
Gender discrimination in the workplace prevents women from achieving their full potential. Eliminating gender inequality in the workforce would greatly increase economic activity. When half of the populati…
Amazon has been forced to scrap its AI recruitment system after it was discovered to be biased against female applicants.
The AI was developed in 2014 by Amazon as a way of filtering out most candidates to provide the firm with the top five…
Algorithms are often pitched as being superior to human judgement, taking the guesswork out of decisions ranging from driving to writing an email. But they're still programmed by humans and trained on the data that humans create, which mean…
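The mechanism this report describes, a model inheriting bias from the historical data humans created, can be illustrated with a minimal sketch. All data and names below are hypothetical, and the word-counting rule is deliberately naive; it is not Amazon's actual system, only a toy showing how biased labels become biased scores:

```python
# Minimal sketch (hypothetical data): a naive resume scorer "trained" on
# historical hiring outcomes. Because past decisions were biased, the
# learned word weights reproduce the bias.
from collections import Counter

# Hypothetical historical data: (resume words, hired?)
history = [
    ("java python chess club captain", True),
    ("c++ java rowing team", True),
    ("python women's chess club captain", False),
    ("java women's coding society", False),
]

hired, rejected = Counter(), Counter()
for words, outcome in history:
    (hired if outcome else rejected).update(words.split())

def score(resume: str) -> int:
    # Each word scores +1 if seen more often in hired resumes,
    # -1 if seen more often in rejected ones.
    return sum(
        (hired[w] > rejected[w]) - (rejected[w] > hired[w])
        for w in resume.split()
    )

# "women's" only ever appeared in rejected resumes, so any resume
# containing it is penalized for that word alone.
print(score("java chess club"))          # prints 1
print(score("java women's chess club"))  # prints 0
```

The two inputs differ only by the word "women's", yet the learned scorer ranks the second lower, which is the same failure mode the reports attribute to Amazon's tool: the system never needed an explicit gender field, because proxy words in the training data carried the historical bias.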
Amazon had to scrap its AI hiring tool because it was ‘sexist’ and discriminated against female applicants, a report from Reuters has found.
Amazon’s hopes for creating the perfect AI hiring tool were dashed when it realised that the algori…
Image caption: Amazon CEO Jeff Bezos. (David Ryder/Getty Images)
Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women, Reuters reports.
Engineers reportedly found the AI was unfavorable toward fema…
What is artificial intelligence (AI)? We look at the progress of AI and automation in Australia compared to the rest of the world and how the Australian workforce may be affected by this movement.
Will the rise of AI take away our jobs?
Amazon ditches sexist AI
It’s not news to learn that AI can be something of a bigot.
Amazon scrapped an algorithm designed to become a recruitment tool because it was too sexist.
Did you hear the one about my wife — well, she… is a really n…
AI may have sexist tendencies. But, sorry, the problem is still us humans.
Amazon recently scrapped an employee recruiting algorithm plagued with problems, according to a report from Reuters. Ultimately, the applicant screening algorithm di…
London | Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.
The program was created by a team at Amazon's Edinburgh office in 2014 as a way to sort through CVs and pick out the m…
Amazon decided to scrap a machine learning (ML) algorithm it was creating to help automate the recruitment process because the model kept favoring male candidates, Reuters revealed. The discrimination against female candidates has been put …
Specialists had been building computer programs since 2014 to review résumés in an effort to automate the search process
This article is more than 5 months old
Amazon’s machine-learning specialists unc…
Artificial intelligence (AI) human resourcing tools are all the rage at the moment and becoming increasingly popular. The systems can speed up, simplify and even decrease the cost of the hiring process, becoming every recruiter's dream come …
Amazon trained a sexism-fighting, resume-screening AI with sexist hiring data, so the bot became sexist
Some parts of machine learning are incredibly esoteric and hard to grasp, surprising even seasoned computer science pros; other parts of…
Amazon has scrapped its artificial intelligence hiring tool after it was found to be sexist.
Photo: © 2014, Ken Wolter
A team of specialists familiar with the project told Reuters that they had been building computer programmes since 2014…
So AI may be the future in hiring and recruitment, but it certainly isn't there yet, it seems.
If you're basing its learning on history, which quite possibly may have been biased towards men, then it is likely that it will discriminate agains…
Amazon’s AI gurus scrapped a new machine-learning recruiting engine earlier this month. Why? It transpired that the AI behind it was sexist. What does this mean as we race to produce ever-better artificial intelligence, and how can we under…
The tech giant canned its experimental recruitment system, which was riddled with problems, according to Reuters.
Back in 2014, Amazon set up the recruiting system, hoping to mechanize the entire hiring process. It used artificial intellig…
Amazon recently scrapped an experimental artificial intelligence (AI) recruiting tool that was found to be biased against women. At this point, I hope you might have a few questions, such as: What is an AI recruiting tool and how does it wo…
However, bias also appears for other unrelated reasons. A recent study into how an algorithm delivered ads promoting STEM jobs showed that men were more likely to be shown the ad, not because men were more likely to click on it, but because…
Last December Synced compiled its first “Artificial Intelligence Failures” recap of AI gaffes from the previous year. AI has achieved remarkable progress, and many scientists dream of creating the Master Algorithm proposed by Pedro Domingos…
It was supposed to make finding the right person for the job easier. However, an AI tool developed by Amazon to sift through potential hires has been dropped by the firm after developers found it was biased against picking women.
From prici…
In 1964, the Civil Rights Act barred the humans who made hiring decisions from discriminating on the basis of sex or race. Now, software often contributes to those hiring decisions, helping managers screen résumés or interpret video intervi…
I’m at home playing a video game on my computer. My job is to pump up one balloon at a time and earn as much money as possible. Every time I click “Pump,” the balloon expands and I receive five virtual cents. But if the balloon pops before …
Artificially intelligent hiring tools do not reduce bias or improve diversity, researchers say in a study.
"There is growing interest in new ways of solving problems such as interview bias," the Cambridge University researchers say, in the …
Similar Incidents
AI Beauty Judge Did Not Like Dark Skin
Biased Sentiment Analysis
Racist AI behaviour is not a new problem