
eeeeeeeeeepc t1_ixh2szl wrote

>Writing in the IEEE Technology and Society Magazine, Chugh points to the landmark case Ewert v. Canada as an example of the problems posed by risk assessment tools in general. Jeffrey Ewert is a Métis man serving a life sentence for murder and attempted murder. He successfully argued before the Supreme Court of Canada that tests used by Corrections Services Canada are culturally biased against Indigenous inmates, keeping them in prison longer and in more restrictive conditions than non-Indigenous inmates.

The court ruled only that the tests might be culturally biased and that the prison authorities needed to do more research on them. Ewert's own expert did not claim to know the direction of the bias.

The same wishful thinking appears later in the article:

>"Ewert tells us that data-driven decision-making needs an analysis of the information going in—and of the social science contributing to the information going in—and how biases are affecting information coming out," Chugh says.

>"If we know that systemic discrimination is plaguing our communities and misinforming our police data, then how can we be sure that the data informing these algorithms is going to produce the right outcomes?"

>Subjectivity is needed

Does "systemic discrimination" mean that police focus enforcement on indigenous communities, or that they ignore crimes there? Again this is a bias claim of indeterminate direction and size. If we think that differences in crime reporting and clearance rates exist, let's estimate them, adjust the data, and respond rationally rather than retreating into "subjectivity" not disciplined by mathematical consistency.
