
Sexist AI

According to an article published on GlobalNews.ca, Amazon.com machine learning specialists uncovered a significant problem: their new recruiting engine did not like women. The team had been building the recruitment computer program since 2014 to review job applicants' resumes, with the aim of automating the search for top talent. Automation has been a key factor in Amazon's dominance in the e-commerce sector.

The company's experimental tool gave candidates scores ranging from 1 (the lowest match) to 5 (the highest match) for an open position. By late 2015, the company realized that its new system was not rating candidates in a gender-neutral way: Amazon's system had taught itself that male candidates were preferable. It penalized resumes that included the word "women's" and downgraded graduates of two all-women's colleges. Although Amazon made changes to the program, there was still no guarantee that the system would not be gender biased.

According to the article, the discrimination arose because the computer models were trained to screen applicants by observing patterns in resumes submitted to the company over a 10-year period, and during that period most of the resumes came from men.
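The mechanism can be illustrated with a toy sketch. This is not Amazon's system, and all the data and the scoring rule below are invented for illustration; it only shows how a model that scores words by their association with past hiring outcomes can end up penalizing a word like "women's" when the historical data is male-dominated.

```python
# Toy illustration: a word-association scorer trained on a tiny,
# hypothetical historical hiring dataset where most hired resumes
# came from men. All resumes and words here are made up.
from collections import Counter

hired = [
    "software engineer java leadership",
    "machine learning python engineer",
    "java backend engineer leadership",
]
rejected = [
    "women's chess club captain python",
    "graduate of women's college java",
]

hired_counts = Counter(w for r in hired for w in r.split())
rejected_counts = Counter(w for r in rejected for w in r.split())

def score(resume: str) -> int:
    """Score a resume by the net association of its words with past hires."""
    return sum(hired_counts[w] - rejected_counts[w] for w in resume.split())

# A resume mentioning "women's" inherits the negative association,
# even though the word says nothing about the candidate's skill.
print(score("python engineer"))          # → 3
print(score("women's python engineer"))  # → 1
```

The point of the sketch is that no one programmed the penalty: it falls out of the training data, which is exactly the failure mode the article describes.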

In early 2017, Amazon lost faith in its AI tool for recruitment screening and abandoned the project.
