HuskyCriminologist t1_jees02o wrote
In case anyone is curious, I found the original report. After browsing it for a bit I'm... well, I'm not skeptical, but I'm not convinced either.
The 95% confidence intervals on the half of the results that do have a p-value < 0.05 are massive. Just as an example, the Average Effect of All Sites was between -0.34 and 0.00, i.e., the interval is consistent with anything from a 34% reduction in homicides to no effect at all. There were also several sites where homicides skyrocketed after the implementation of Safe Streets. Sandtown-Winchester's homicide rate went up by a staggering 44% compared to the synthetic control (i.e., an increase theoretically attributable to Safe Streets), Belair-Edison's doubled (+103%), and Brooklyn's went up by 27%.
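To make that interval width concrete, here's a quick back-of-envelope in Python on the quoted numbers. It assumes a plain symmetric, normal-approximation CI, which may or may not match whatever the report actually did; the point is just that the upper edge of the interval sits right at zero.

```python
# Back-of-envelope on the interval quoted above. Assumes a symmetric,
# normal-approximation 95% CI, which may not be exactly the report's method.
lo, hi = -0.34, 0.00

point_estimate = (lo + hi) / 2        # midpoint, roughly -0.17
implied_se = (hi - lo) / (2 * 1.96)   # implied standard error, roughly 0.087

print(f"point estimate ~ {point_estimate:.2f}, implied SE ~ {implied_se:.3f}")
# With the upper bound sitting exactly at 0.00, "an effect somewhere between a
# 34% drop and nothing at all" is about as strong a claim as this supports.
```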
On the other hand, the observed non-fatal shooting rates don't match the homicide rates at all. Generally speaking, non-fatal and fatal shooting rates move roughly in lockstep. That's not to say they can't move in different directions, or at different rates, but it is weird and surprising when that happens. This report shows Sandtown-Winchester's homicide rate going up 44% compared to the synthetic control while its non-fatal shootings dropped 53% compared to the expected value. Belair-Edison saw 21% fewer non-fatal shootings than expected, alongside 103% more homicides than expected. On the flip side, Belvedere's non-fatal shootings were 459% (not a typo) higher than expected, but its homicide rate was 40% lower than expected.
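If you want to see how out of step those numbers are, here's a minimal sketch using only the three sites with both figures quoted above; a real check would use every site in the report, not my hand-picked three.

```python
# Quick check of the "fatal and non-fatal shootings move in lockstep" intuition,
# using only the site-level changes quoted above (Sandtown-Winchester,
# Belair-Edison, Belvedere). A proper check would use all sites in the report.
from statistics import correlation  # Python 3.10+

homicide_change = [0.44, 1.03, -0.40]   # change vs. synthetic control
nonfatal_change = [-0.53, -0.21, 4.59]  # change vs. expected non-fatal shootings

print(correlation(homicide_change, nonfatal_change))
# Strongly negative (~ -0.9) for these three sites, i.e. the two series are
# moving in opposite directions, which is exactly the oddity described above.
```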
This makes absolutely no sense, until you read a bit further and see that the p-value for the impact of Safe Streets on homicides is 0.381. In other words, if Safe Streets had no effect on homicides at all, you'd still expect to see an "impact" this large roughly 38% of the time from random chance alone.
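Here's a toy simulation of what a p-value around 0.38 actually buys you. The effect size and noise level are made-up placeholders and this is not the report's method; it just shows that with this much noise, an "effect" of the estimated size shows up close to 38% of the time even when nothing is going on.

```python
# Toy illustration of p ~ 0.38 (NOT the report's method; effect size and noise
# level are made-up placeholders). Under a null where the program does nothing,
# how often does random noise alone produce an effect at least this large?
import random

random.seed(0)
observed_effect = -0.10                                        # hypothetical pooled estimate
null_effects = [random.gauss(0, 0.12) for _ in range(10_000)]  # pure noise, no true effect

p = sum(abs(e) >= abs(observed_effect) for e in null_effects) / len(null_effects)
print(p)  # ~0.4 with these numbers: indistinguishable from chance
```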
I'm not saying this study is worthless, but it certainly looks like a case of "we spent a shit ton of money on this report, so we have to publish something."