
myBisL2 t1_j1soeup wrote

A good real-life example comes from Amazon a few years ago. They built an AI recruiting tool that "learned" to prefer male candidates. From this article:

>In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter.

What it came down to was that the tech industry tends to be male-dominated, so the historical resumes fed to the AI skewed heavily male. The model picked up on that skew and learned the pattern that "successful" candidates don't have words like "women's" on their resumes and didn't attend all-women's colleges.
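Here's a minimal sketch of how that kind of bias gets learned, assuming scikit-learn and a tiny made-up dataset. This illustrates the general mechanism, not Amazon's actual system:

```python
# Toy illustration (NOT Amazon's system): a text classifier trained on
# historically biased hiring outcomes learns to penalize the token "women".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Made-up past resumes and whether the candidate was hired. The labels
# reflect a male-dominated history: resumes mentioning "women's" were
# mostly not hired, so the bias is baked into the training data.
resumes = [
    "chess club captain software engineer",
    "women's chess club captain software engineer",
    "software engineer hackathon winner",
    "women's coding society president software engineer",
    "systems programmer open source contributor",
    "women's college graduate systems programmer",
]
hired = [1, 0, 1, 0, 1, 0]  # biased historical outcomes

vectorizer = CountVectorizer()  # bag-of-words features
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women" (the default
# tokenizer splits "women's" into "women"). It comes out negative,
# meaning the model actively downgrades resumes containing it.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

The point is that the model never sees gender directly; it just latches onto whatever tokens correlate with the biased historical labels, and "women" happens to be one of them.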
