Police officers stand guard during a protest in Philadelphia during the 2016 Democratic National Convention. John Minchillo/AP

Debate is brewing over whether individuals’ risk assessments will be based on factors out of their control.

As is the case in many cities, jails in Philadelphia are overcrowded and full of people who have been found guilty of nothing. These people are behind bars because they cannot afford to pay bail, a circular problem that is costing the city millions. The city is currently holding about a thousand inmates over the capacity of its jails, and more than half of these individuals have not been found guilty of a crime, according to Newsworks.

In an effort to release more of that pre-trial inmate population, Philadelphia lawmakers are debating whether to employ a new risk-assessment algorithm, which would determine who should be released and how high bail amounts should be set.

Proponents of the new program argue that a data-driven system would be fairer than the subjective decisions of bail commissioners. But critics contend that the algorithm is not the best solution to this problem, and could unfairly deny release to individuals because of policing patterns in the neighborhoods they happen to be from.

Opponents claim the algorithm’s apparent weighing of factors, including past arrest records, age, gender, and employment history, could penalize individuals based on characteristics beyond their control, or on mistakes made years ago.

“That’s a price. It’s a price you may choose to pay,” Richard Berk, the algorithm’s creator and a professor of criminology at the University of Pennsylvania, said at a recent council hearing, according to the technology blog Technical.ly.

For now, it appears that some factors, including race and zip code, will be excluded from the algorithm, but Berk says that could compromise the results.

"How many aggravated assaults, how many homicides, how many rapes are you prepared to trade for an instrument that's race-neutral?," said Berks to Newsworks.

Critics have pointed out that there may be better ways to release people from the city’s overcrowded jails.

“We can do this without the algorithm,” says Paul Hetznecker, a veteran criminal defense attorney in Philadelphia. “All we have to do is ask, ‘Is this a serious crime? Does this retail theft or possession of cocaine require incarceration?’”

Hetznecker contends that police bias and the concentration of aggressive policing in majority-black parts of cities could skew the algorithm’s computations of what an offender looks like in the first place. Studies have, for example, found that Philadelphia police often stop minorities without meeting “reasonable suspicion” standards.

“This is racial profiling through computer models,” says Hetznecker. “They’re creating categories of people who are supposedly more likely to commit crimes by looking at what their addresses are or at the past arrest histories of their families, stripping individuals of every protection afforded by the Constitution.”

Which factors will be included, and how heavily each will be weighed by the algorithm, is still to be decided. So far, local media have not been able to obtain a complete breakdown of the factors used in the algorithm. This lack of access is troubling. “The problem is that people who are not statisticians are not able to be critical consumers of the product,” David Robinson, a data-ethics analyst, told City & State.

Investigative reports have found similar programs to be racially biased. A recent ProPublica investigation found that a risk-assessment algorithm in Broward County, Florida, developed a bias against African-Americans, routinely labeling low-level offenders “riskier” than white defendants charged with more serious crimes.

Beyond the potential problems of this program, Hetznecker also worries that such predictive analytics could undermine assumptions that are vital to the civil liberties guaranteed by the criminal justice system.

“If you are at a bail hearing and the bail commissioner’s algorithm looks at the arrest history that your parents or siblings have, you’ve completely undermined due process,” says Hetznecker. “You are going to be considered a risk because of where you come from. You are no longer presumed innocent.”
