Graded Discussion 2
Amazon's experimental recruiting tool automatically scored job candidates from 1 (the lowest match) to 5 (the highest match) for an open position. By late 2015, the company realized that the system was not rating candidates in a gender-neutral way: it had taught itself that male candidates were preferable. The system penalized resumes that included the word "women" and downgraded graduates of two all-women's colleges. Although Amazon made changes to the program, there was still no guarantee that the system would not discriminate by gender in other ways.
According to the article, the discrimination arose because the computer models were trained to screen applications by observing patterns in resumes submitted to the company over a ten-year period, and most of the resumes from that period came from men.
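The mechanism behind this kind of bias can be illustrated with a small sketch. Everything below is hypothetical (the toy resumes, the hire/reject labels, and the simple log-odds scoring rule are illustrations, not Amazon's actual model), but it shows how a model trained on historically skewed data can end up assigning a negative weight to a word like "women" even though the word says nothing about skill:

```python
import math
from collections import Counter

# Hypothetical historical data: most past applicants (and hires) were men,
# so gender-coded words cluster in the rejected pile by historical accident.
hired = [
    "python engineer chess club captain",
    "java developer rugby team lead",
    "c++ engineer debate club",
]
rejected = [
    "python engineer women chess club captain",
    "java developer women in tech mentor",
]

def word_scores(pos_docs, neg_docs):
    """Smoothed log-odds per word: positive means associated with hiring."""
    pos = Counter(w for d in pos_docs for w in d.split())
    neg = Counter(w for d in neg_docs for w in d.split())
    vocab = set(pos) | set(neg)
    return {w: math.log((pos[w] + 1) / (neg[w] + 1)) for w in vocab}

scores = word_scores(hired, rejected)
print(scores["women"] < 0)    # True: the word itself is penalized
print(scores["python"] == 0)  # True: appears equally in both piles
```

The model never "sees" gender directly; it simply learns that a token correlated with rejected resumes predicts rejection, which is exactly the pattern the article describes.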
In early 2017, Amazon lost faith in its AI recruitment-screening tool and abandoned the project.