
New Data on Racism in Criminal Risk Software

Imagine a dystopian future in which you are assigned a number of key characteristics at birth, only instead of being assigned these labels by genetics or upbringing, the government assigns them to you based on arbitrary, uncontrollable factors. A child’s educational track is determined within moments of emerging from the womb. His career path is determined before he can focus his eyes. His income, not just his potential for income but his actual pay, is decided and set in stone before he can support the weight of his own head.

[Image: the Department of Justice building]

It makes for entertaining fiction, especially when a hero rises from among the people and overthrows the system. But what if the scenario were not only real but also impenetrable, meaning there is no hero coming to save us?

New data has been published by advocacy groups concerning this very situation, only instead of assigning someone's career path or wealth, the justice system is using software to predict whether or not someone will commit a crime. Unfortunately, the software is about as accurate as pulling a name out of a hat, and it is horrifically skewed against black people. Even more alarming, it is actually being used to determine things like how long a defendant's sentence will be.

A report published last week by ProPublica laid out a bone-chilling finding: the software was wrong nearly equally often for white and black defendants, but the errors fell in opposite directions. When the software was wrong about white defendants' likelihood of committing another crime, it predicted that they would not reoffend, when in fact they did. When it was wrong about black defendants, it labeled them likely repeat offenders, when in fact they did not reoffend. Essentially, the algorithms looked at the defendants' race and all but said that whites won't break the law and blacks will.
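To make that distinction concrete, here is a minimal, purely illustrative Python sketch. The numbers are made up for the example; this is not ProPublica's data and not the vendor's algorithm. It shows how the two kinds of error are measured separately for each group: the false positive rate (flagged as likely to reoffend but did not) versus the false negative rate (rated unlikely to reoffend but did), and how two groups can have the same overall accuracy while their errors point in opposite directions.

```python
# Illustrative only: hypothetical predictions and outcomes, not real case data.

def error_rates(records):
    """records: list of (predicted_reoffend, actually_reoffended) booleans."""
    false_pos = sum(1 for pred, actual in records if pred and not actual)
    false_neg = sum(1 for pred, actual in records if not pred and actual)
    negatives = sum(1 for _, actual in records if not actual)  # did not reoffend
    positives = sum(1 for _, actual in records if actual)      # did reoffend
    fpr = false_pos / negatives if negatives else 0.0  # wrongly labeled high risk
    fnr = false_neg / positives if positives else 0.0  # wrongly labeled low risk
    return fpr, fnr

# Hypothetical groups with identical overall accuracy (80%) but opposite errors:
# group_a's mistakes all over-predict risk; group_b's all under-predict it.
group_a = [(True, False)] * 4 + [(True, True)] * 6 + [(False, False)] * 10
group_b = [(False, True)] * 4 + [(False, False)] * 6 + [(True, True)] * 10

print("Group A (FPR, FNR):", error_rates(group_a))  # high FPR, zero FNR
print("Group B (FPR, FNR):", error_rates(group_b))  # zero FPR, high FNR
```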

Obviously, that is a severe oversimplification of what is taking place, but at the end of ProPublica's study, that was the bare-bones finding. Even more upsetting, the organization discovered that this "indicator score" of a person's likelihood to reoffend was being used in real-world decisions, especially sentencing, early release, and similar determinations. As if that weren't bad enough, the score is based on information from the defendant's criminal record (even if that record contains only a single arrest) plus the individual's own answers to a questionnaire that reads more like a freshman psychology project than a genuine assessment. Additional factors, including educational level and employment status, have been added to the software's metrics, but the developer treats the algorithms themselves as proprietary.

The company that produces the software disputes ProPublica's methodology, findings, and overall assessment of how this widely used software is put into practice. US Attorney General Eric Holder had called for the justice system to study how the software was being used and how effective it actually is, but that study was never conducted, according to ProPublica.