If you get arrested in New Jersey, you could be granted or denied bail based on the recommendation of an algorithm.
That is because, as of the first day of 2017, the state replaced its broken bail system with an algorithm called the Public Safety Assessment (PSA), bringing the power of math and data science to an area that has often relied on nothing more scientific than gut instinct.
“When we first launched our initiative as an organization five years ago, we took a look at the whole criminal justice system and tried to find the areas where we could have the biggest positive impacts on fairness, public safety and efficiency,” Matt Alsdorf, vice president of Criminal Justice for the Texas-based Laura and John Arnold Foundation, which designed the algorithm, told Digital Trends. “There were a lot to focus on, but we thought that focusing on the front-end of the system — the initial decisions that are made in a potential criminal case — was somewhere we could really make a difference.”
The PSA is designed to predict whether a person is likely to pose a risk if released before trial. It was built on an analysis of a dataset of 1.5 million cases from around the U.S., and it weighs nine factors about a defendant: age at current arrest; whether the current offense is violent; whether a charge was pending at the time of the offense; prior misdemeanor convictions; prior felony convictions; prior violent convictions; prior sentences to incarceration; and prior failures to appear in court, both within the past two years and earlier. From these data points, the algorithm predicts how likely someone is to commit new criminal activity if released, or to fail to show up in court.
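To make the mechanics concrete, here is a minimal Python sketch of a point-based risk score in the spirit of the PSA. The overall shape (raw points per factor, converted to the simple scales a judge sees) follows public descriptions of the tool, but every weight, cut point, and field name below is invented for illustration and does not reproduce the PSA's actual scoring.

```python
# A point-based pretrial risk score in the spirit of the PSA.
# Every weight and cut point here is INVENTED for illustration;
# the PSA's real, published scoring differs.

from dataclasses import dataclass

@dataclass
class Defendant:
    age_at_arrest: int
    current_violent_offense: bool
    pending_charge_at_offense: bool
    prior_misdemeanor_conviction: bool
    prior_felony_conviction: bool
    prior_violent_convictions: int        # lifetime count
    prior_sentence_to_incarceration: bool
    ftas_past_two_years: int              # recent failures to appear
    fta_older_than_two_years: bool

def failure_to_appear_points(d: Defendant) -> int:
    """Raw points for the failure-to-appear scale (hypothetical weights)."""
    pts = 0
    if d.pending_charge_at_offense:
        pts += 1
    if d.prior_misdemeanor_conviction or d.prior_felony_conviction:
        pts += 1
    pts += 2 * min(d.ftas_past_two_years, 2)  # recent no-shows weigh most
    if d.fta_older_than_two_years:
        pts += 1
    return pts                                # 0..7

def new_criminal_activity_points(d: Defendant) -> int:
    """Raw points for the new-criminal-activity scale (hypothetical weights)."""
    pts = 0
    if d.age_at_arrest <= 22:                 # youth treated as a risk factor
        pts += 2
    if d.pending_charge_at_offense:
        pts += 2
    if d.prior_felony_conviction:
        pts += 1
    pts += min(d.prior_violent_convictions, 3)
    if d.prior_sentence_to_incarceration:
        pts += 1
    pts += min(d.ftas_past_two_years, 2)
    return pts                                # 0..11

def violence_flag(d: Defendant) -> bool:
    """The real PSA also raises a separate flag for violent reoffending risk;
    this stand-in just checks the violence-related factors."""
    return d.current_violent_offense and d.prior_violent_convictions > 0

def to_six_point_scale(raw: int, max_raw: int) -> int:
    """Bucket raw points onto the 1-to-6 scale a judge actually sees."""
    return min(raw * 6 // (max_raw + 1) + 1, 6)

d = Defendant(
    age_at_arrest=20, current_violent_offense=False,
    pending_charge_at_offense=True, prior_misdemeanor_conviction=True,
    prior_felony_conviction=False, prior_violent_convictions=0,
    prior_sentence_to_incarceration=False,
    ftas_past_two_years=1, fta_older_than_two_years=False,
)
print("FTA risk:", to_six_point_scale(failure_to_appear_points(d), max_raw=7))
print("NCA risk:", to_six_point_scale(new_criminal_activity_points(d), max_raw=11))
```

Point systems like this are deliberately simple by design: a judge can see exactly which factor contributed which points, something a black-box model could not offer.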
If the idea of using algorithms to decide on bail sounds a bit, well, Orwellian to you, Alsdorf is keen to assuage those fears. For one thing, the algorithm only makes recommendations to a judge, who is free to follow or ignore them. It is also worth noting that many of the potentially biased factors that bail decisions were previously accused of weighing (such as a person’s educational attainment, family structure, or employment status) turned out not to be strong predictors of reoffending or skipping court, and therefore do not appear in the algorithm.
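The kind of analysis that rules such factors out can be surprisingly simple: measure each candidate feature's standalone predictive power on historical cases and keep only the strong ones. The sketch below shows one common way to do that; the data, column names, and 0.55 threshold are all hypothetical, not drawn from the foundation's actual methodology.

```python
# Screening candidate predictors by single-feature AUC on historical
# outcomes (1 = failed to appear or reoffended). Weak predictors, such
# as years of schooling in this toy example, fall out.

import numpy as np
from sklearn.metrics import roc_auc_score

def screen_features(X: np.ndarray, y: np.ndarray, names: list[str],
                    min_auc: float = 0.55) -> list[str]:
    """Return the features whose standalone AUC clears a minimum bar."""
    kept = []
    for j, name in enumerate(names):
        auc = roc_auc_score(y, X[:, j])
        strength = max(auc, 1.0 - auc)   # direction-agnostic strength
        if strength >= min_auc:
            kept.append(name)
    return kept

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)                  # toy outcomes
prior_ftas = y + rng.normal(0, 0.8, size=1000)     # correlated with outcome
years_of_school = rng.normal(12, 2, size=1000)     # unrelated noise
X = np.column_stack([prior_ftas, years_of_school])
print(screen_features(X, y, ["prior_ftas", "years_of_school"]))
# Only "prior_ftas" survives the screen.
```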
“The goal is to provide judges with better research-based, data-driven guidance about who should be in and who should be out of jail during the pretrial period,” Alsdorf said.