This is the title of an article published on May 23, 2016 by ProPublica, a “non-profit newsroom that produces investigative journalism for the public interest.” Its four authors are Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. Here’s what’s coming down the pike, or so they assert: “a landmark sentencing reform bill currently pending in Congress [that] would mandate the use of such assessments [algorithms] in Federal prisons” to assess the risk of prisoners.
First, after reading S. 2123, the “Sentencing Reform and Corrections Act of 2015,” introduced in the Senate by Sen. Grassley (R-IA) on 10/01/2015, this blogger found no reference to any mandatory use of algorithms in any courtroom, anywhere. The bill does amend the federal criminal code with respect to reduced drug sentences for low-level and crack-cocaine offenders, and for firearms used in drug crimes, but it says nothing about the use of predictive algorithms to forecast the ‘possibilities’ of arrestees engaging in future violent crimes, and nothing about any bias against blacks.
Second, regarding the algorithm software engineered by various companies: Northpointe’s COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is one example. Currently, these states use Northpointe or similar risk-rating tools: Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin. The information is used to determine bond amounts, or who may be set free, thereby limiting any interpretation of the individual or their crime by a human being! The software predicts future risk based on a set of 137 questions, answered in part by the defendant and in part by the court. It’s sort of psychology-in-a-can. The answers are fed into the software, your risk is spit out and rated from low to high, and “the results are given to the judge during criminal sentencing.” One could say this is like a credit-score rating, only your future is sealed with this number. You are labeled, and your fate awaits.
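To make the pipeline above concrete, here is a minimal Python sketch of how a questionnaire-based tool might turn answers into a 1–10 score. Northpointe’s actual COMPAS model is proprietary, so the question names, category weights, and thresholds below are entirely hypothetical; this only illustrates the general idea of converting answers to numbers, weighting them, and bucketing the result from low to high.

```python
# Hypothetical sketch of a questionnaire-based risk score.
# COMPAS's real model is proprietary; these categories, weights,
# and cutoffs are invented purely for illustration.

# Likert answers mapped to numeric values.
LIKERT = {"Strongly Disagree": 0, "Disagree": 1, "Not Sure": 2,
          "Agree": 3, "Strongly Agree": 4}

# Toy category weights (hypothetical).
WEIGHTS = {"criminal_attitudes": 2.0,
           "family_criminality": 1.5,
           "focus_difficulty": 1.0}

def risk_score(answers: dict) -> int:
    """Scale the weighted answer total onto a 1-10 score."""
    raw = sum(WEIGHTS[cat] * value for cat, value in answers.items())
    worst = sum(w * 4 for w in WEIGHTS.values())  # all worst-case answers
    return max(1, round(10 * raw / worst))

def risk_label(score: int) -> str:
    """Bucket the numeric score into low / medium / high."""
    return "low" if score <= 4 else "medium" if score <= 7 else "high"

answers = {
    "criminal_attitudes": LIKERT["Disagree"],  # attitude question
    "family_criminality": 4,                   # e.g. a parent was incarcerated
    "focus_difficulty": LIKERT["Agree"],
}
score = risk_score(answers)
print(score, risk_label(score))  # a single number seals the defendant's fate
```

The unsettling part the article highlights is exactly what this toy shows: whoever picks the weights and cutoffs decides the score, and the defendant never sees them.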
Third, regarding the scoring: what if it is completely wrong, and you are now marked, or serving a longer jail sentence, because of it? The facts are surfacing that many of those sentenced were never told about this rating system, and discovered it only later, finding themselves contesting a database. How frustrating! Sitting in jail and fighting a scorecard.
So, how can this array of questions be biased against blacks? Here are some of the questions and categories reviewed.
Categories: Current Charges, Criminal History, Criminal Attitudes, Non-Compliance, Family Criminality, Peers, Substance Abuse, Residence/Stability, Education and Vocation, Social Isolation, Leisure and Recreation, Anger.
Here is a sampling of questions that probably weigh quite heavily in comparison to the other categories.
Criminal Attitudes (1 of 11 questions)
Question: “Some people just don’t deserve any respect and should be treated like animals.” How would you answer?
Strongly Disagree Disagree Not Sure Agree Strongly Agree
Family Criminality (1 of 8, Question #1 has 7 subsets)
Question: “Was one of your parents (or a parent figure who raised you) ever sent to jail or prison?”
And last, but not least, a question taken from one final category.
Question: “Is it difficult to keep your mind on one thing for a long time?”
No Yes Unsure
Open this link and take the questionnaire. See what you think.
In closing, I might rate myself a high-risk #10, based on that last question. Yes, it was difficult to keep my mind on one thing for a long time: the content of this article! Further, I am not black. What a sad state of affairs: our criminal justice system has now evolved into a canned questionnaire of nonsense and babble.
By Sharla Esparza