By Shelby Brown —
With Kentucky’s incarceration rate above the U.S. average, criminal justice experts are developing algorithms to counter bias in 911 call dispatching and courtroom sentencing. The John Schnatter Center for Free Enterprise hosted a panel discussion on criminal justice reform Jan. 23.
Experts included Professor Gregory DeAngelo of West Virginia University, Professor Anna Harvey of New York University, Professor Megan Stevenson of George Mason University and U of L Professor Michael Losavio.
Stevenson said the artificial intelligence formulas can generate bail amounts, sentence duration, possibility of parole and how much supervision will be administered during parole. The algorithm uses information like the offense, criminal history and age.
“They also include, sometimes, more controversial input such as the education level that a person has, their employment status, the zip code they live in. Now these inputs do correlate with future offending, statistically, but they’re also direct socioeconomic markers and are correlated with race and poverty,” Stevenson said.
Stevenson said such inputs can raise ethical issues, since factors like education level, zip code and criminal history can reflect racially disparate practices.
Stevenson said when a judge hands down a sentence, they’re attempting to predict whether the individual will offend again.
“So if we’re already in the business of trying to predict crime, why not make it a little bit more formal? At least with these risk assessment algorithms it’s transparent. You know what inputs are being included in this decision and what inputs are not. Race, for instance, is an input that is not allowed to be used in these. So even though there’s indirect ways they embed racial biases, they don’t have the same direct racism that many judges might have,” Stevenson said.
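Stevenson’s point about transparency — that the inputs to a risk assessment tool are visible in a way a judge’s intuition is not — can be illustrated with a minimal sketch. The weights, thresholds and recommendations below are invented for illustration and do not come from any real instrument discussed on the panel; the sketch only shows how the inputs she named (offense, criminal history, age) might combine into a score, with race nowhere among them.

```python
# Hypothetical risk-assessment sketch. Weights and cutoffs are invented;
# real tools are calibrated on recidivism data. The point is that every
# input is explicit and inspectable, and race is not among them.

def risk_score(offense_severity: int, prior_convictions: int, age: int) -> int:
    """Return a simple additive risk score (higher = higher predicted risk)."""
    score = 0
    score += offense_severity * 2   # more serious current offense, higher score
    score += prior_convictions      # each prior conviction adds one point
    if age < 25:                    # youth correlates statistically with
        score += 3                  # reoffending in many actuarial tools
    return score

def recommend(score: int) -> str:
    """Map a score to a coarse release recommendation."""
    if score <= 4:
        return "low risk: release on recognizance"
    if score <= 9:
        return "moderate risk: supervised release"
    return "high risk: detain or set high bail"
```

For example, a 30-year-old with a minor offense and no priors scores 2 and falls in the low-risk band, while a 22-year-old with a serious offense and four priors scores 13 and falls in the high-risk band. Whether those bands are fair is exactly the debate Stevenson describes, but the logic is open to inspection.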
Harvey said equal treatment requires science and data. She described two types of policing: proactive and reactive. Proactive policing, most often a routine traffic stop, involves officers using discretion. Harvey said racial bias in proactive policing is a pressing question.
“I want to know if an officer, given two cars that are driving exactly the same way, and one of them is being driven by a white driver and the other one is being driven by a non-white driver, I want to know that the officer behaves exactly the same irrespective of the race of the driver,” Harvey said.
Reactive policing involves the answering of 911 calls.
Harvey cited the 2014 shooting of 12-year-old Tamir Rice. In the 911 call, the operator asked three times what race Rice was, Harvey said. The call was coded as priority one, the highest priority, meaning a dangerous situation where officers must arrive in under five minutes. Despite his age and the fact that the gun was a toy, Rice was fatally shot.
Harvey questioned how often calls in which a race is identified have their priority elevated, and whether identifying race in calls about victims results in a lower priority.
“When calls come in, they’re being assigned priority codes subjectively, by humans, who have, as we probably all do, various biases and baggage that they bring,” Harvey said.
Harvey said her lab is working on an algorithm to assign priority to a 911 call based on call history. She hopes this will help develop equity in emergency response.
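A minimal sketch of the idea Harvey described might look like the following. The call categories, outcome records and priority cutoffs here are all hypothetical; the sketch only shows the principle that priority is derived from the recorded outcomes of past calls of the same type rather than from an operator’s on-the-spot judgment, with no input for anyone’s race.

```python
# Hypothetical history-based 911 triage sketch. Categories, data and
# thresholds are invented for illustration. Priority comes from how
# often past calls of the same type turned out to be dangerous.

from collections import defaultdict

class CallTriage:
    def __init__(self):
        # call_type -> list of 1/0 flags: did the call involve real danger?
        self.history = defaultdict(list)

    def record(self, call_type: str, was_dangerous: bool) -> None:
        """Log the outcome of a resolved call for future triage."""
        self.history[call_type].append(1 if was_dangerous else 0)

    def priority(self, call_type: str) -> int:
        """Return priority 1 (highest) to 3 based on historical danger rate."""
        outcomes = self.history[call_type]
        if not outcomes:
            return 1  # no history yet: err on the side of a fast response
        rate = sum(outcomes) / len(outcomes)
        if rate >= 0.5:
            return 1
        if rate >= 0.2:
            return 2
        return 3
```

In this toy version, a call type whose past incidents were rarely dangerous gets priority three, while one that has frequently involved real danger gets priority one, regardless of who the caller describes.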
DeAngelo said the criminal justice system is often relied upon as a source of revenue, despite its underpaid public servants. He said traffic citations generated $7.5 billion to $15 billion in revenue over the last five years, and drug possession accounts for approximately $12.5 billion in revenue annually.
With pressure on the criminal justice system to bring in money, the incentive to issue more citations will grow, DeAngelo said.
Losavio said data holds people accountable in distributed statewide systems.
Photo by Arry Schofield / The Louisville Cardinal