Should Philadelphia Count on an Algorithm for Bail Reform?

Debate is brewing over whether individuals’ risk assessments will be based on factors out of their control.

Police officers stand guard during a protest in Philadelphia during the 2016 Democratic National Convention. (John Minchillo/AP)

As is the case in many cities, jails in Philadelphia are overcrowded and full of people who have been found guilty of nothing. These people are behind bars because they cannot afford to pay bail, a circular problem that is costing the city millions. The city is currently holding about a thousand inmates over the capacity of its jails, and more than half of these individuals have not been found guilty of a crime, according to Newsworks.

In an effort to release more of that pre-trial inmate population, Philadelphia lawmakers are debating employing a new risk-assessment algorithm, which would determine who should be released and how high bail amounts should be set.

Proponents of the new program argue that a data-driven system would be fairer than the subjective decisions of bail commissioners. But critics contend that the algorithm is not the best solution to this problem, and could unfairly deny release to individuals because of policing patterns in the neighborhoods they happen to be from.

Opponents claim the algorithm’s apparent weighing of factors, including past arrest records, age, gender, and employment history, could penalize individuals based on characteristics beyond their control, or on mistakes made years ago.
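Neither the exact factors nor their weights have been made public, but the general shape of such a tool is straightforward: each factor gets a weight, the weighted values are summed into a score, and the score drives the release or bail recommendation. The sketch below is purely illustrative; the factor names, weights, and threshold are hypothetical, not the ones in the Philadelphia model.

```python
# Illustrative sketch of a weighted risk-assessment score.
# All factor names, weights, and the release threshold are
# hypothetical -- the real model's inputs have not been published.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 0.35,            # count of past arrests
    "age_under_25": 0.25,             # 1 if defendant is under 25, else 0
    "unemployed": 0.20,               # 1 if no current employment, else 0
    "prior_failures_to_appear": 0.20, # count of missed court dates
}

def risk_score(defendant: dict) -> float:
    """Weighted sum of factor values; higher means 'riskier'."""
    return sum(weight * defendant.get(factor, 0)
               for factor, weight in HYPOTHETICAL_WEIGHTS.items())

def recommend(defendant: dict, threshold: float = 1.0) -> str:
    """Turn the score into a release/detain recommendation."""
    return "detain / set high bail" if risk_score(defendant) >= threshold else "release"

# Example: a young defendant with two prior arrests and no job.
print(recommend({"prior_arrests": 2, "age_under_25": 1, "unemployed": 1}))
```

Even in this toy version, the critics' objection is visible: the inputs that move the score, such as prior arrests or age, are facts the defendant cannot change at the bail hearing.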

“That’s a price. It’s a price you may choose to pay,” Richard Berk, the algorithm’s creator and a professor of criminology at the University of Pennsylvania, said at a recent council hearing, according to the technology blog Technical.ly.

For now, it appears that some factors, including race and zip code, will be excluded from the algorithm, but Berk says that could compromise the results.

"How many aggravated assaults, how many homicides, how many rapes are you prepared to trade for an instrument that's race-neutral?," said Berks to Newsworks.

Critics have pointed out that there may be better ways to release people from the city’s overcrowded jails.

“We can do this without the algorithm,” says Paul Hetznecker, a veteran criminal defense attorney in Philadelphia. “All we have to do is ask, ‘Is this a serious crime? Does this retail theft or possession of cocaine require incarceration?’”

Hetznecker contends that police bias and the concentration of aggressive policing in majority-black parts of cities could skew the algorithm’s computations of what an offender looks like in the first place. Studies have, for example, found that Philadelphia police often stop minorities without meeting “reasonable suspicion” standards.

“This is racial profiling through computer models,” says Hetznecker. “They’re creating categories of people who are supposedly more likely to commit crimes by looking at what their addresses are or at the past arrest histories of their families, stripping individuals of every protection afforded by the Constitution.”

It has yet to be decided which factors the algorithm will include and how heavily each will be weighed. So far, local media have not been able to obtain a complete breakdown of the factors used in the algorithm, and this lack of access is troubling. “The problem is that people who are not statisticians are not able to be critical consumers of the product,” David Robinson, a data-ethics analyst, told City & State.

Investigative reports have found similar programs to be racially biased. A recent ProPublica investigation found that a risk-assessment algorithm in Broward County, Florida, exhibited bias against African-Americans, routinely labeling black low-level offenders “riskier” than white defendants charged with more serious crimes.
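ProPublica's Broward County analysis worked in part by comparing error rates across racial groups, for instance how often people who did not go on to reoffend were nonetheless labeled high-risk. A minimal version of that kind of audit, run here on made-up data rather than the Broward records, might look like this:

```python
# Minimal sketch of a ProPublica-style fairness audit: compare
# false-positive rates (labeled high-risk but did not reoffend)
# across groups. The rows below are fabricated for illustration.

def false_positive_rate(rows: list) -> float:
    """Share of non-reoffenders who were labeled high-risk."""
    non_reoffenders = [r for r in rows if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Each row: group label, model's high-risk flag, actual outcome.
rows = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": True},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": True,  "reoffended": True},
    {"group": "B", "high_risk": False, "reoffended": False},
]

for group in ("A", "B"):
    group_rows = [r for r in rows if r["group"] == group]
    print(f"group {group} false-positive rate: "
          f"{false_positive_rate(group_rows):.2f}")
```

A gap between the two groups' false-positive rates is the kind of disparity ProPublica reported: one group's non-reoffenders bear the cost of the model's mistakes more often than the other's.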

Beyond the potential problems with this particular program, Hetznecker also worries that such predictive analytics could undermine assumptions that are vital to the civil liberties guaranteed by the criminal justice system.

“If you are at a bail hearing and the bail commissioner’s algorithm looks at the arrest history that your parents or siblings have, you’ve completely undermined due process,” says Hetznecker. “You are going to be considered a risk because of where you come from. You are no longer presumed innocent.”

About the Author

  • George Joseph
    George Joseph is an editorial fellow at CityLab, originally from Denton, Texas. He covers schools, policing, and surveillance.