Incident Status
CSETv1 Taxonomy Classes
Taxonomy Details
Incident Number
37
CSETv1_Annotator-1 Taxonomy Classes
Taxonomy Details
Incident Number
37
CSETv1_Annotator-2 Taxonomy Classes
Taxonomy Details
Incident Number
37
Notes (special interest intangible harm)
Resumes featuring language commonly associated with women were downgraded.
Special Interest Intangible Harm
yes
Date of Incident Year
2014
Estimated Date
No
Multiple AI Interaction
no
CSETv0 Taxonomy Classes
Taxonomy Details
Problem Nature
Specification
Physical System
Software only
Level of Autonomy
Medium
Nature of End User
Expert
Public Sector Deployment
No
Data Inputs
Resumes
Incident Reports
Report Timeline
- View the original report at its source
- View the report at the Internet Archive
Amazon has scrapped a “sexist” tool that used artificial intelligence to decide the best candidates to hire for jobs.
Members of the team working on the system said it effectively taught itself that male candidates were preferable.
The arti…
- View the original report at its source
- View the report at the Internet Archive
Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.
The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…
- View the original report at its source
- View the report at the Internet Archive
Amazon discovered a problem with using artificial intelligence to hire: their AI was biased against women.
The Seattle-based company developed computer programs designed to filter through hundreds of resumes and surface the best candidates,…
- View the original report at its source
- View the report at the Internet Archive
Machine learning, one of the core techniques in the field of artificial intelligence, involves teaching automated systems to devise new ways of doing things, by feeding them reams of data about the subject at hand. One of the big fears here…
- View the original report at its source
- View the report at the Internet Archive
[Image copyright: Getty Images. Caption: The algorithm repeated bias towards men, reflected in the technology industry]
An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, acco…
- View the original report at its source
- View the report at the Internet Archive
Machine learning technology is becoming increasingly common across various industries, from policing to recruiting. But reports have shown that many of these systems have long-standing problems regarding discrimination. To avoid amplifying …
- View the original report at its source
- View the report at the Internet Archive
SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.
The team had been building computer programs since 2014 to review job applicants’ resu…
- View the original report at its source
- View the report at the Internet Archive
[Image: Amazon sign. Credit: David Ryder/Getty Images]
Thanks to Amazon, the world has a nifty new cautionary tale about the perils of teaching computers to make human decisions.
According to a Reuters report published Wednesday, the tech giant d…
- View the original report at its source
- View the report at the Internet Archive
Why Global Citizens Should Care
Gender discrimination in the workplace prevents women from achieving their full potential. Eliminating gender inequality in the workforce would greatly increase economic activity. When half of the populati…
- View the original report at its source
- View the report at the Internet Archive
Amazon has been forced to scrap its AI recruitment system after it was discovered to be biased against female applicants.
The AI was developed in 2014 by Amazon as a way of filtering out most candidates to provide the firm with the top five…
- View the original report at its source
- View the report at the Internet Archive
Algorithms are often pitched as being superior to human judgement, taking the guesswork out of decisions ranging from driving to writing an email. But they're still programmed by humans and trained on the data that humans create, which mean…
- View the original report at its source
- View the report at the Internet Archive
Amazon had to scrap its AI hiring tool because it was ‘sexist’ and discriminated against female applicants, a report from Reuters has found.
Amazon’s hopes for creating the perfect AI hiring tool were dashed when it realised that the algori…
- View the original report at its source
- View the report at the Internet Archive
[Image: Amazon CEO Jeff Bezos. Credit: David Ryder/Getty Images]
Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women, Reuters reports.
Engineers reportedly found the AI was unfavorable toward fema…
- View the original report at its source
- View the report at the Internet Archive
What is artificial intelligence (AI)? We look at the progress of AI and automation in Australia compared to the rest of the world and how the Australian workforce may be affected by this movement.
Will the rise of AI take away our jobs?
- View the original report at its source
- View the report at the Internet Archive
Amazon ditches sexist AI
It’s not news to learn that AI can be something of a bigot.
Amazon scrapped an algorithm designed to become a recruitment tool because it was too sexist.
Did you hear the one about my wife — well, she… is a really n…
- View the original report at its source
- View the report at the Internet Archive
AI may have sexist tendencies. But, sorry, the problem is still us humans.
Amazon recently scrapped an employee recruiting algorithm plagued with problems, according to a report from Reuters. Ultimately, the applicant screening algorithm di…
- View the original report at its source
- View the report at the Internet Archive
London | Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.
The program was created by a team at Amazon's Edinburgh office in 2014 as a way to sort through CVs and pick out the m…
- View the original report at its source
- View the report at the Internet Archive
Amazon decided to scrap a machine learning (ML) algorithm it was creating to help automate the recruitment process because the model kept favoring male candidates, Reuters revealed. The discrimination against female candidates has been put …
- View the original report at its source
- View the report at the Internet Archive
Specialists had been building computer programs since 2014 to review résumés in an effort to automate the search process
This article is more than 5 months old
Amazon’s machine-learning specialists unc…
- View the original report at its source
- View the report at the Internet Archive
Artificial intelligence (AI) human resourcing tools are all the rage at the moment and becoming increasingly popular. The systems can speed up, simplify and even decrease the cost of the hiring process, becoming every recruiter's dream come …
- View the original report at its source
- View the report at the Internet Archive
Amazon trained a sexism-fighting, resume-screening AI with sexist hiring data, so the bot became sexist
Some parts of machine learning are incredibly esoteric and hard to grasp, surprising even seasoned computer science pros; other parts of…
- View the original report at its source
- View the report at the Internet Archive
Amazon has scrapped its artificial intelligence hiring tool after it was found to be sexist.
Photo: © 2014, Ken Wolter
A team of specialists familiar with the project told Reuters that they had been building computer programmes since 2014…
- View the original report at its source
- View the report at the Internet Archive
So AI may be the future in hiring and recruitment, but it certainly isn't there yet, it seems.
If you're basing its learning on history which quite possibly may have been biased towards men, then it is likely that it will discriminate agains …
- View the original report at its source
- View the report at the Internet Archive
Amazon’s AI gurus scrapped a new machine-learning recruiting engine earlier this month. Why? It transpired that the AI behind it was sexist. What does this mean as we race to produce ever-better artificial intelligence, and how can we under…
- View the original report at its source
- View the report at the Internet Archive
The tech giant canned their experimental recruitment system riddled with problems, according to Reuters.
Amazon, back in 2014, set up the recruiting system, hoping to mechanize the entire hiring process. It used artificial intellig…
- View the original report at its source
- View the report at the Internet Archive
Amazon recently scrapped an experimental artificial intelligence (AI) recruiting tool that was found to be biased against women. At this point, I hope you might have a few questions, such as: What is an AI recruiting tool and how does it wo…
- View the original report at its source
- View the report at the Internet Archive
Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.
The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…
- View the original report at its source
- View the report at the Internet Archive
However, bias also appears for other unrelated reasons. A recent study into how an algorithm delivered ads promoting STEM jobs showed that men were more likely to be shown the ad, not because men were more likely to click on it, but because…
- View the original report at its source
- View the report at the Internet Archive
Last December Synced compiled its first “Artificial Intelligence Failures” recap of AI gaffes from the previous year. AI has achieved remarkable progress, and many scientists dream of creating the Master Algorithm proposed by Pedro Domingos…
- View the original report at its source
- View the report at the Internet Archive
It was supposed to make finding the right person for the job easier. However, an AI tool developed by Amazon to sift through potential hires has been dropped by the firm after developers found it was biased against picking women.
From prici…
- View the original report at its source
- View the report at the Internet Archive
In 1964, the Civil Rights Act barred the humans who made hiring decisions from discriminating on the basis of sex or race. Now, software often contributes to those hiring decisions, helping managers screen résumés or interpret video intervi…
- View the original report at its source
- View the report at the Internet Archive
I’m at home playing a video game on my computer. My job is to pump up one balloon at a time and earn as much money as possible. Every time I click “Pump,” the balloon expands and I receive five virtual cents. But if the balloon pops before …
- View the original report at its source
- View the report at the Internet Archive
Artificially intelligent hiring tools do not reduce bias or improve diversity, researchers say in a study.
"There is growing interest in new ways of solving problems such as interview bias," the Cambridge University researchers say, in the …
Variants
Similar Incidents