Comments


neutral-spectator t1_j5ae7cn wrote

I had a dream last night that I was going to look at an apartment for rent, and when I got there the apartment was fully sentient and used AI to decide who got to live in it. I was scanned and "denied" right there on the sidewalk. And I remember thinking, in my dream, that this was still somehow better than going through the whole process the old way with a human.

6

Ok-Welder-4816 t1_j5cia81 wrote

At least you waste less time per apartment, I guess?

Better than companies (in the job market) that string you along through 3+ interviews just to deny you.

4

cryptoderpin t1_j5a3fy2 wrote

Use an AI bot against the AI bot with more AI bots, duhhhhh.

3

Art-Zuron t1_j5aygjq wrote

Amazon already built a biased hiring AI that it had to scrap after it started penalizing resumes from women, so it's a bit late.

From what I understood, the AI used historical hiring data to identify ideal candidates. It looked for traits that were common in the resumes and applications of people who got hired. The issue, however, was that the majority of past applicants and hires were men, so the AI treated the characteristics those resumes had in common as the markers of a good hire.

This created a feedback loop in which it kept favoring candidates whose resumes looked like the existing, mostly male workforce.
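To make that feedback loop concrete, here's a toy sketch (made-up data and feature names, not Amazon's actual pipeline) of how a ranker "trained" on past hires can end up rewarding a demographic proxy more than the skills themselves:

```python
# Toy sketch of the feedback loop described above: hypothetical data and
# feature names, not Amazon's actual system.
from collections import Counter

# Historical outcomes: most past applicants (and hires) share the same
# demographic proxy feature, here called "club_A".
history = [
    ({"python", "club_A"}, True),   # hired
    ({"java", "club_A"}, True),     # hired
    ({"python", "club_A"}, True),   # hired
    ({"python", "club_B"}, False),  # rejected
    ({"java", "club_B"}, False),    # rejected
]

# "Training": weight each resume feature by how often it appears among hires.
hire_weights = Counter()
for features, hired in history:
    if hired:
        hire_weights.update(features)

def score(features):
    # Candidates are ranked by overlap with past hires' features, so the
    # demographic proxy (club_A) ends up outweighing the actual skills.
    return sum(hire_weights[f] for f in features)

print(score({"python", "club_A"}))  # 5 -- same skills, majority-group proxy
print(score({"python", "club_B"}))  # 2 -- same skills, minority-group proxy
```

Every new hire picked by this score makes the training data even more skewed toward the proxy feature, which is the loop described above.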

3

AlanzAlda t1_j5aw2qt wrote

Good, recruiters in general suck... especially for tech jobs. Yeah, it turns out that some English major probably isn't the best judge of technical talent, just whether or not the resume is pretty.

2

morenewsat11 OP t1_j59ptls wrote

It was never going to be an easy process; the research highlights some of the challenges of using AI tools to filter job applications.

>The researchers interviewed more than 2,250 executives across the United States, United Kingdom and Germany. They found more than 90 per cent of companies were using tools like ATS to initially filter and rank candidates.
>
>But they often weren't using it well. Sometimes, candidates were scored against bloated job descriptions filled with unnecessary and inflexible criteria, which left some qualified candidates "hidden" below others the software deemed a more perfect fit.

...

>"I advise HR practitioners they have to look into and have open conversations with their vendors: 'OK, so what's in your system? What's the algorithm like? … What is it tracking? What is it allowing me to do?" said Pamela Lirio, an associate professor of international human resources management at the Université de Montréal.
>
>Lirio, who specializes in new technologies, says it's also important to question who built the AI and whose data it was trained on, pointing to the example of Amazon, which in 2018 scrapped its internal recruiting AI tool after discovering it was biased against female job applicants.
>
>The system had been trained on the resumes of past applicants — who were, overwhelmingly, men — so the AI taught itself to down-rank applicants whose resumes mentioned competing in women's sports leagues or graduating from women's colleges.
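Regarding the "bloated job descriptions" point quoted above, here's a rough hypothetical sketch (invented criteria, not any specific ATS product) of how scoring against a long, inflexible requirements list can leave a strong candidate "hidden" below a checkbox-perfect one:

```python
# Hypothetical requirements and candidates, for illustration only.
required = {"python", "sql", "kubernetes", "salesforce", "six sigma"}

candidates = {
    "checkbox_perfect": {"python", "sql", "kubernetes", "salesforce", "six sigma"},
    "strong_but_hidden": {"python", "sql", "kubernetes", "docker", "airflow"},
}

def ats_score(skills):
    # Score is just the fraction of required keywords matched; any missing
    # item pushes an otherwise qualified candidate below the "perfect fit".
    return len(skills & required) / len(required)

for name, skills in sorted(candidates.items(), key=lambda kv: ats_score(kv[1]), reverse=True):
    print(f"{name}: {ats_score(skills):.2f}")
```

The second candidate may be perfectly capable of doing the job, but a rigid keyword match ranks them below whoever happens to mirror the job posting word for word.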

1