
Executive Summary of “Algorithmic Injustice”

Don’t blame the algorithm — as long as there are racial disparities in the justice system, sentencing software can never be entirely fair.

The article “Algorithmic Injustice” explores allegations of racism in criminal sentencing algorithms that are used in courts across America to help judges assess a defendant’s risk of re-offending. The article focuses on a controversy raised by a 2016 ProPublica investigation that found racially disparate outcomes in COMPAS, a widely used sentencing algorithm.

The article makes a central argument:

  • There is a “fairness paradox”: Racial disparities of many kinds can result from sentencing algorithms, and reducing one kind inevitably means increasing another. As long as disparities exist in the justice system itself, in arrest and sentencing rates, it is mathematically impossible for a sentencing algorithm to eliminate all disparities at once, as the sketch below illustrates.
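
To make the paradox concrete, here is a minimal numerical sketch in Python. The counts are hypothetical, not drawn from COMPAS or from ProPublica's data. Two groups receive risk scores that are equally predictive in both (same predictive value, same sensitivity), yet because the groups' underlying re-offense rates differ, their false positive rates cannot match:

    # A minimal sketch of the fairness paradox, using hypothetical counts.
    # (These numbers are illustrative; they are not drawn from COMPAS or ProPublica.)

    def rates(tp, fp, tn, fn):
        """Compute three common metrics from confusion-matrix counts."""
        ppv = tp / (tp + fp)  # predictive value: labeled high-risk who reoffend
        tpr = tp / (tp + fn)  # sensitivity: reoffenders labeled high-risk
        fpr = fp / (fp + tn)  # false positive rate: non-reoffenders labeled high-risk
        return ppv, tpr, fpr

    # Group A: 1,000 defendants, 50% reoffend.
    ppv_a, tpr_a, fpr_a = rates(tp=300, fp=200, tn=300, fn=200)
    # Group B: 1,000 defendants, 30% reoffend; the only difference is this base rate.
    ppv_b, tpr_b, fpr_b = rates(tp=180, fp=120, tn=580, fn=120)

    print(f"Group A: PPV={ppv_a:.2f}  sensitivity={tpr_a:.2f}  FPR={fpr_a:.2f}")
    print(f"Group B: PPV={ppv_b:.2f}  sensitivity={tpr_b:.2f}  FPR={fpr_b:.2f}")
    # Group A: PPV=0.60  sensitivity=0.60  FPR=0.40
    # Group B: PPV=0.60  sensitivity=0.60  FPR=0.17

Matching the false positive rates instead would push the predictive values apart; under unequal base rates, some disparity always remains. That is the paradox.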
[Chart: Risk Assignment (© The New Atlantis)]

Other key claims:

  • The racial disparities uncovered by ProPublica — for example, non-reoffending black defendants were nearly twice as likely as white defendants to have been mistakenly labeled “high-risk” — are a mathematical consequence of the fact that black people are arrested and convicted at higher rates than white people.
  • Present debates are bogged down by unclear terminology. Common terms like “accuracy” have different meanings in statistics than in everyday language, and even experts use them inconsistently.
  • Much of the dispute over sentencing algorithms arises from unstated disagreements about how statistical metrics like sensitivity and false positive rate map onto ideas of fairness and justice like disparate treatment and disparate impact; a brief sketch after this list illustrates the point. The article includes charts (1, 2, 3) and tables (1, 2) that aim to clear up confusion around statistical terms and translate them into accessible moral language.
  • Criminal sentencing algorithms aim to make sentencing decisions more fair by insulating them from bias and other flaws of human judgment. But the fairness paradox shows that algorithms cannot be neutral — that each is based on a value choice. And algorithms now in wide use threaten to entrench existing racial disparities in arrest and conviction rates.
  • Calls to address racial disparities through transparency, or by getting rid of algorithms, will not address these ethical concerns. Which disparities ought to be addressed by criminal sentencing algorithms — that is, which definitions of fairness and equality ought to be upheld — is a question that must be decided by the public and policymakers.
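
To illustrate the terminological point, here is a second minimal sketch, again with hypothetical counts: a single set of predictions yields four different numbers, any of which might be called "accuracy" in everyday speech:

    # One set of hypothetical predictions, four distinct "accuracy"-sounding numbers.
    tp, fp, tn, fn = 400, 100, 300, 200  # hits, false alarms, correct rejections, misses

    accuracy = (tp + tn) / (tp + fp + tn + fn)  # share of all labels that were correct
    sensitivity = tp / (tp + fn)                # share of actual reoffenders flagged
    fpr = fp / (fp + tn)                        # share of non-reoffenders falsely flagged
    ppv = tp / (tp + fp)                        # share of high-risk labels that were right

    print(f"accuracy={accuracy:.2f}  sensitivity={sensitivity:.2f}  "
          f"FPR={fpr:.2f}  PPV={ppv:.2f}")
    # accuracy=0.70  sensitivity=0.67  FPR=0.25  PPV=0.80

Which number a speaker has in mind changes the conclusion: a defender of an algorithm can cite equal predictive values across groups while a critic cites unequal false positive rates, both reading the same data.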
Tafari Mbadiwe, “Executive Summary of ‘Algorithmic Injustice’,” TheNewAtlantis.com, April 24, 2018.