Courts use risk algorithms to set bail: A step toward a more just system?

Court systems in more than two dozen US cities and states are using algorithms that assess flight risk without considering race, gender, or socioeconomic status, in an attempt to remove implicit bias from the equation.

A Cook County Sheriff's police car patrols the exterior of the Cook County Jail in Chicago, Illinois, on January 12, 2016. (Jim Young/Reuters)

In an age when mass incarceration and a racially biased criminal justice system are at the forefront of national conversation, we often hear about the 2.2 million people incarcerated in the United States.

Much of the conversation revolves around those in state and federal prisons. But less frequently discussed is a smaller subset of the incarcerated population: the 744,600 Americans held in local jails, more than half of whom have not yet been convicted of a crime. While some of these pretrial arrestees are considered a threat, many others are detained simply because they can't afford to bail themselves out. 

It's a system that favors the rich and punishes the poor, civil rights groups say. Furthermore, studies show that minorities are disproportionately affected by the current bail system: courts are more likely to view African Americans and Latinos as flight risks or public threats, often resulting in higher bail or mandatory pretrial detention.

Now, due to pushback from civil rights advocates and a desire to save government money, an increasing number of courts have begun using computer algorithms to assess risk. Such tools, proponents say, remove any implicit bias from the equation, producing a more objective assessment. 

As the pool of research grows and the science of risk assessment becomes more refined, "We actually have increasingly good models of who poses a risk and who doesn't pose a risk," John Pfaff, a professor of law at Fordham University, tells The Washington Post. 

The latest pretrial risk assessment tool is the Public Safety Assessment, developed by the Laura and John Arnold Foundation. Drawing from a database of over 1.5 million cases from more than 300 jurisdictions across the US, the algorithm calculates the probability that a defendant will commit a new crime, commit a new violent crime, or fail to return to court. 

The assessment takes into consideration a number of factors, including pending charges, prior convictions, whether the current offense is violent, and whether the person has failed to appear at other pretrial hearings. But unlike a human assessor, it's blind to race, gender, level of education, socioeconomic status, and neighborhood, all of which can affect a judge's decision, whether subconsciously or consciously. 
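To make the idea concrete, a point-based risk score of this general kind can be sketched in a few lines. The factor names below mirror those mentioned in the article, but the weights, caps, and thresholds are invented for illustration; they are not the Arnold Foundation's actual Public Safety Assessment formula, which is more elaborate.

```python
# Hypothetical illustration of a point-based pretrial risk score.
# Weights and thresholds are invented for demonstration only.

def failure_to_appear_score(pending_charge: bool,
                            prior_convictions: int,
                            prior_failures_to_appear: int) -> int:
    """Sum points for each risk factor (hypothetical weights)."""
    score = 0
    if pending_charge:
        score += 1
    score += min(prior_convictions, 2)         # cap contribution at 2 points
    score += min(prior_failures_to_appear, 4)  # past no-shows weigh heaviest
    return score

def risk_category(score: int) -> str:
    """Map a raw score to a coarse release recommendation."""
    if score <= 1:
        return "low"
    if score <= 4:
        return "moderate"
    return "high"

# Note: race, gender, education, income, and neighborhood never enter
# the computation -- the "blindness" the article describes.
print(risk_category(failure_to_appear_score(True, 3, 0)))  # prints "moderate"
```

The design point is that the inputs are limited to a fixed list of case-history facts, so any factor left off that list, such as race or income, cannot influence the output even indirectly through a human assessor's judgment.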

In San Francisco, one of 29 jurisdictions (including three states and several major cities) currently using the algorithm, the Arnold assessment was implemented months after national civil rights group Equal Justice Under Law brought a lawsuit against the city, saying its bail system unfairly punished poor arrestees.

Before adopting the Arnold system, the city relied on assessments from case managers at a nonprofit organization working under contract with the S.F. Sheriff's Department, who conducted face-to-face interviews with defendants before their first court appearance. Now, the case managers simply run arrestees' data through the algorithm, eliminating the need for an interview. 

But the new system has faced some resistance in San Francisco, where many judges are choosing to ignore the algorithm's recommendations. 

"I think it has the potential to be a move in the right direction, but when it is watered down or misunderstood or rejected unreasonably, then it's not clear what good it will do," Deputy Public Defender Danielle Harris told the San Francisco Chronicle. "We were excited about having more research and more data being brought into decision-making, but we've been disappointed." 

There are a number of reasons why a judge or attorney may be skeptical of the system, Cherise Fanno Burdeen, chief executive officer of the Pretrial Justice Institute, tells The Christian Science Monitor in an email. Some, she speculates, may feel as though their professional discretion is more valuable than a score produced by data; others may worry that the algorithm could identify too many people as low-risk. And for elected officials such as judges, prosecutors, and sheriffs, the influence of campaign contributions from bail bondsmen and the insurance companies that underwrite them may play a role. 

However, she says, any challenges that come with data-driven systems are "outweighed dramatically by the benefits of being able to identify and release the appropriate people." 

Ms. Burdeen compares the use of algorithms in the criminal justice system to the use of assessment tools in other fields that rely on evidence rather than personal discretion, such as medicine or auto insurance. 

"I don't want a doctor to imagine what he thinks my vitals are," she says. "I want him to take the vitals that are the ones science shows are predictive of my possible diagnosis." 

R. Andrew Murray, the district attorney of Mecklenburg County, N.C., says he was skeptical of the Arnold system at the start of his jurisdiction's yearlong trial period. 

"I'm expected to do everything I can to keep the public safe," Mr. Murray told the New York Times. "If we're letting more people out earlier in the proceeding, based on more limited information, I'm going to be concerned."

However, he says, a year after implementing the system, the jail population of Charlotte (located in Mecklenburg County) had decreased by 20 percent, with no increase in crime. 

"It's saved the community a lot of tax dollars, there's not been an ill effect, and we've kept a lot of individuals from going through that turmoil," Murray said. 

But while they've proven successful in individual jurisdictions, could tools such as the Arnold assessment result in significant change on a larger scale? 

"There's lots of really promising initiatives that are trying to increase the levels of intellectual and evidentiary rigor with which we've been making these decisions," says Alec Karakatsanis, a civil rights attorney with Equal Justice Under Law, in a phone interview with the Monitor. 

However, he adds, while "these are all encouraging efforts," what is really needed is a broader societal rethinking of the concept of bail in "a culture that is indifferent to, and completely tolerates," holding people in jail simply because they can't pay their way out. 

"None of these changes are going to have an impact until … we confront this everyday systemic brutality," Mr. Karakatsanis says. 
