Police officers stand guard during a protest at the 2016 Democratic National Convention in Philadelphia. John Minchillo/AP

Debate is brewing over whether individuals’ risk assessments will be based on factors out of their control.

As is the case in many cities, jails in Philadelphia are overcrowded and full of people who have been found guilty of nothing. These people are behind bars because they cannot afford to pay bail, and holding them is costing the city millions. The city is currently holding about a thousand inmates over the capacity of its jails, and more than half of these individuals have not been convicted of a crime, according to Newsworks.

In an effort to release more of that pre-trial inmate population, Philadelphia lawmakers are debating whether to adopt a new risk-assessment algorithm, which would determine who should be released and how high bail amounts should be set.

Proponents of the new program argue that a data-driven system would be fairer than the subjective decisions of bail commissioners. But critics contend that the algorithm is not the best solution to this problem, and could unfairly deny release to individuals because of policing patterns in the neighborhoods they happen to be from.

Opponents claim the algorithm’s apparent weighing of factors, including past arrest records, age, gender, and employment history, could penalize individuals based on characteristics beyond their control, or on mistakes made years ago.
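The mechanics at the center of the dispute are, at their simplest, a weighted sum: each factor a defendant brings to the hearing adds to a score, and the score drives the recommendation. The sketch below is purely illustrative; the factor names, weights, and threshold are invented for this article, not drawn from the Philadelphia instrument, whose details have not been made public.

```python
# Hypothetical sketch of a weighted risk score. The factors, weights,
# and threshold here are invented for illustration; they are NOT the
# actual Philadelphia instrument.

FACTOR_WEIGHTS = {
    "prior_arrests": 0.8,            # count of past arrests
    "age_under_25": 1.5,             # 1 if under 25, else 0
    "unemployed": 0.6,               # 1 if no current employment, else 0
    "prior_failures_to_appear": 1.2, # count of missed court dates
}

RELEASE_THRESHOLD = 3.0  # scores below this suggest pre-trial release


def risk_score(defendant: dict) -> float:
    """Sum each factor's value times its weight."""
    return sum(w * defendant.get(f, 0) for f, w in FACTOR_WEIGHTS.items())


def recommend_release(defendant: dict) -> bool:
    return risk_score(defendant) < RELEASE_THRESHOLD


# A young, employed defendant with one prior arrest:
d = {"prior_arrests": 1, "age_under_25": 1}
print(risk_score(d))         # 2.3
print(recommend_release(d))  # True
```

Note that every input is fixed at the moment of the hearing: nothing the defendant can do in court changes the score, which is precisely the critics’ objection.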

“That’s a price. It’s a price you may choose to pay,” Richard Berk, the algorithm’s creator and a professor of criminology at the University of Pennsylvania, said at a recent council hearing, according to the technology blog Technical.ly.

For now, it appears that some factors, including race and zip code, will be excluded from the algorithm, but Berk says that exclusion could compromise the results.

"How many aggravated assaults, how many homicides, how many rapes are you prepared to trade for an instrument that's race-neutral?," said Berks to Newsworks.

Critics have pointed out that there may be better ways to release people from the city’s overcrowded jails.

“We can do this without the algorithm,” says Paul Hetznecker, a veteran criminal defense attorney in Philadelphia. “All we have to do is ask, ‘Is this a serious crime? Does this retail theft or possession of cocaine require incarceration?’”

Hetznecker contends that police bias and the concentration of aggressive policing in majority-black parts of cities could skew the algorithm’s computations of what an offender looks like in the first place. Studies have, for example, found that Philadelphia police often stop minorities without meeting “reasonable suspicion” standards.

“This is racial profiling through computer models,” says Hetznecker. “They’re creating categories of people who are supposedly more likely to commit crimes by looking at what their addresses are or at the past arrest histories of their families, stripping individuals of every protection afforded by the Constitution.”

Which factors the algorithm will include, and how heavily it will weigh each, has yet to be decided. So far, local media have not been able to obtain a complete breakdown of the factors used in the algorithm, and that lack of access is itself a concern. “The problem is that people who are not statisticians are not able to be critical consumers of the product,” David Robinson, a data-ethics analyst, told City & State.

Investigative reports have found similar programs to be racially biased. A ProPublica investigation found that a risk-assessment algorithm used in Broward County, Florida, was biased against African-Americans, routinely labeling black low-level offenders “riskier” than white defendants charged with more serious crimes.
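The bias ProPublica documented was a gap in error rates: among defendants who did not go on to reoffend, black defendants were far more likely than white defendants to have been labeled high-risk. A minimal sketch of that kind of audit, run here on invented toy records rather than real data, might look like this:

```python
# Hypothetical audit sketch: compare false-positive rates across groups.
# The records below are invented; a real audit, like ProPublica's in
# Broward County, would use actual risk labels and outcomes.

from collections import defaultdict

# Each record: (group, labeled_high_risk, actually_reoffended)
records = [
    ("black", True, False), ("black", True, True), ("black", False, False),
    ("white", False, False), ("white", True, True), ("white", False, False),
]


def false_positive_rates(records):
    """Per group: share of non-reoffenders who were labeled high-risk."""
    flagged = defaultdict(int)    # non-reoffenders labeled high-risk
    negatives = defaultdict(int)  # all non-reoffenders
    for group, high_risk, reoffended in records:
        if not reoffended:
            negatives[group] += 1
            if high_risk:
                flagged[group] += 1
    return {g: flagged[g] / negatives[g] for g in negatives}


print(false_positive_rates(records))
# {'black': 0.5, 'white': 0.0} -- an error-rate gap of the kind
# ProPublica reported, reproduced here on toy data.
```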

Beyond the potential problems with this particular program, Hetznecker also worries that such predictive analytics could undermine assumptions that are vital to the civil liberties guaranteed by the criminal justice system.

“If you are at a bail hearing and the bail commissioner’s algorithm looks at the arrest history that your parents or siblings have, you’ve completely undermined due process,” says Hetznecker. “You are going to be considered a risk because of where you come from. You are no longer presumed innocent.”
