Background-Net-4715 OP t1_j1qrk94 wrote

Now scheduled to come into force in April, Local Law 144 of 2021 would bar companies from using any “automated employment decision tools” unless they pass an audit for bias. Any machine learning, statistical modeling, data analytics, or artificial intelligence-based software used to evaluate employees or potential recruits would have to undergo the scrutiny of an impartial, independent auditor to make sure it isn't producing discriminatory results.

The audit would then have to be made publicly available on the company’s website. What’s more, the law mandates that both job candidates and employees be notified the tool will be used to assess them, and gives them the right to request an alternative option.

6

D-redditAvenger t1_j1quoqo wrote

Seems reasonable, and it's good that the audits are made public. It will be important to see what exactly "bias" means in this context.

4

Background-Net-4715 OP t1_j1qvdk7 wrote

Yes definitely - that's part of what's causing the delay. Companies will need to explain what goes into their algorithms, what decisions the model makes, what the accuracy rate is, who built the model, and so on. It seems complicated, but I guess that's because it's the first law of its kind.

2

killcat t1_j1tbmxu wrote

5 will get you 10 that "discriminatory outcomes" will be the test for whether the AI has "bias," as opposed to examining the actual programming.

1

cheapsexandfastfood t1_j22hlj9 wrote

I think it'll be easy to test with a bunch of resumes of different races and genders equally spread along a qualification spectrum.

Just have it pick from this test set and if the results are statistically tilted in any way it fails.
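That kind of test could be sketched roughly as below. This is a hypothetical illustration, not the audit procedure the law actually prescribes: it assumes a pass/fail rule based on the adverse-impact ratio (the EEOC's "four-fifths rule" is one common choice), and `score_fn` stands in for whatever screening tool is being audited. Every demographic group gets an identical qualification spread, so any tilt in selection rates can only come from the group attribute.

```python
def audit_tool(score_fn, groups, n_per_group=500, threshold=0.8):
    """Adverse-impact check: compare each group's selection rate against
    the highest group's rate, and fail if any ratio drops below the
    threshold (0.8 = the four-fifths rule)."""
    rates = {}
    for group in groups:
        # Identical, evenly spread qualifications for every group, so the
        # cohorts differ only in the demographic attribute.
        resumes = [{"group": group, "years_exp": q % 11}
                   for q in range(n_per_group)]
        selected = sum(1 for r in resumes if score_fn(r))
        rates[group] = selected / n_per_group
    best = max(rates.values())
    ratios = {g: r / best for g, r in rates.items()}
    return rates, ratios, all(x >= threshold for x in ratios.values())

# Hypothetical screening rule that looks only at qualifications,
# never at group membership -- so it should pass the check.
fair_tool = lambda r: r["years_exp"] >= 5
rates, ratios, passed = audit_tool(fair_tool, ["A", "B"])
```

A real audit would use actual resume data and a richer qualification model, but the core idea is the same: hold qualifications fixed, vary only the protected attribute, and check whether the selection rates stay statistically balanced.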

1