>It was supposed to make finding the right person for the job easier. However, an AI tool developed by Amazon to sift through potential hires has been dropped by the firm after developers found it was biased against picking women.
>From pricing items to warehouse coordination, automation has been a key part of Amazon’s rise to e-commerce domination. And since 2014, its developers have been creating hiring programs aimed at making the selection of top talent as easy and as automated as possible.
>“Everyone wanted this holy grail,” one of the anonymous sources told Reuters about the ambitions for the software.
>“They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”
>However, a leak by several people familiar with the program gives an insight into some of the mishaps in the AI-based hiring software’s development, and how it taught itself to penalize women… for being women.
>It was in 2015 that human recruiters first noticed discrepancies with the tool, when it seemingly marked down female candidates for software development and other technical roles at the firm, fields dominated by men.
>When the engine came across words like “women’s” on a resume, or if a candidate had graduated from an all-women’s college, it unfairly excluded female candidates from selection, the sources said.
https://www.rt.com/usa/440893-amazon-sexist-ai-recruitment-tool/
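The article doesn't detail Amazon's actual model, but the failure mode it describes is easy to reproduce. A minimal sketch, assuming a toy bag-of-words scorer trained on hypothetical historical hiring data in which past hires skew male (all resumes and names here are invented for illustration):

```python
from collections import Counter

# Hypothetical historical data: (resume tokens, hired?) pairs.
# Because past hires skew male, tokens like "women's" end up
# correlated with rejection through no fault of the candidate.
history = [
    (["software", "engineer", "chess", "club"], True),
    (["software", "engineer", "robotics"], True),
    (["java", "developer", "chess", "captain"], True),
    (["software", "engineer", "women's", "chess", "club"], False),
    (["developer", "women's", "college", "robotics"], False),
    (["java", "engineer", "robotics"], True),
]

def token_scores(data):
    """Crude per-token score: +1 for each hire, -1 for each rejection."""
    score = Counter()
    for tokens, hired in data:
        for t in set(tokens):
            score[t] += 1 if hired else -1
    return score

def rank(resume, scores):
    """Score a resume by summing its tokens' learned scores."""
    return sum(scores[t] for t in set(resume))

scores = token_scores(history)

# Two resumes identical except for the single token "women's":
a = ["software", "engineer", "chess", "club"]
b = ["software", "engineer", "women's", "chess", "club"]
print(rank(a, scores), rank(b, scores))  # → 4 2
```

The model has no concept of gender; it just learns that "women's" predicts rejection in the training data, so resume `b` scores lower than an otherwise identical resume `a`. That is the pattern the sources describe, at toy scale.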
>create AI designed to pick the right people for the job
>AI prefers men over women
>AI is put down because that can't possibly be right