[Image: A map of California's Bay Area with a pop-up window reading, "A 28-year-old intersex Asian gender nonconforming person got stopped on foot and felt relieved." (Raheem.Ai)]

Although the new app Raheem.Ai stems from an incident of brutality, it is built for sharing all police interactions, good and bad, to support solutions to end police violence.

During his time as an army engineer, Brandon Anderson handled intelligence sent back from field soldiers and base commanders, seeing firsthand the impact that data had on governmental decision-making. When he returned to civilian life, Anderson decided to use his knowledge of data gathering and dispersal to create a tool that could accomplish a goal close to his heart: protecting communities from police brutality.

That tool, an app called Raheem.Ai, hosted on Facebook Messenger, allows people to report their interactions with police officers. It has received funding from Google and Barack Obama’s My Brother’s Keeper initiative, and last week Anderson was named a 2018 Echoing Green Fellow.

The app prompts users to enter the time and location of the incident, how they felt about it, and demographic data about the officer and about themselves.
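
Based on the fields described above, a single report might look something like the minimal sketch below. The class and field names are illustrative assumptions, since the article does not publish the app's actual data model.

```python
# Hypothetical sketch of one Raheem.Ai report record; the article lists
# these fields but not the app's actual schema, so all names are assumed.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class InteractionReport:
    occurred_at: datetime     # time of the incident
    latitude: float           # location of the incident
    longitude: float
    feeling: str              # e.g. "disrespected" or "relieved"
    reporter_age: int         # demographic data about the reporter
    reporter_race: str
    reporter_gender: str
    officer_race: str         # demographic data about the officer
    officer_gender: str
```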


Some of the information is then shared on an interactive map. One red dot in Oakland states, “A 36-year-old Black genderqueer person was pulled over and felt disrespected.” Not all of the reports are negative; a green dot in San Francisco reads, “A 28-year-old intersex Asian gender nonconforming person got stopped on foot and felt relieved.”

While he was building the app, Anderson spoke with police departments—rank-and-file officers as well as police chiefs—and said the response he often got was that, given all the interactions that take place between police officers and members of the community, the percentage of people killed is relatively low.

“How, then, are you measuring impact in the community?” Anderson asked. “Is it solely by the number of people we don’t kill? That’s probably not a good metric. I’ve had my own experiences with police—I’m alive, but it doesn’t mean they didn’t impact me in traumatic ways.” Anderson’s own partner was killed by police, a driving force behind Raheem.Ai’s creation.

The app gathers experiences that are already being told: “These stories are what every police chief hears,” said Chris Burbank, Director of Law Enforcement Engagement at The Center for Policing Equity and the former Chief of Police at the Salt Lake City Police Department. “When someone gets stopped you hear ‘My friend gets stopped, [or] my sister.’ The idea of having the data available, having more feedback on how the officers interact with people, I think is very significant.”


Anderson wants to bring those stories out of the shadows. He started digging to understand how people were experiencing policing in their neighborhoods and learned that most people didn’t report their experiences with police officers, especially negative ones.

“People don’t trust the system,” Anderson said, which means they rarely give feedback about police encounters. Anderson wants Raheem.Ai to make it easier for people to document and share their experiences with the police, and to create a data-driven way for communities and cities to measure the impact of policing.

The importance of the app, said Burbank, is that it takes the information one step further. “It’s more formalized, so it starts to take on a better tone and the richness of data because there’s some accountability in it—it’s not just the arbitrary re-telling of a story.”


Anderson has three goals for the app: to reduce underreporting, to use the data to advance policy solutions to end police violence, and to arm communities with tools to engage in participatory budgeting. That last aim may seem disconnected from the rest, but Anderson sees it as key to re-thinking the way policing is conceived and funded in cities. Money currently spent on the police force could be diverted to mental health services, he said.

“The reason that police in Ferguson had gas masks, tanks, Humvees—it’s because they had a program that invested in that gear, and that gear was kept because they said, ‘We’ll need this some day,’” said Anderson. “We need to find new ways—better ways—to invest that money, that will provide opportunities that will ultimately solve crime.”

In his book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, University of the District of Columbia law professor Andrew Ferguson wrote that once police misconduct can be viewed as a system failure, rather than an individual problem, it becomes easier to address. “Normally, when you think about what police do and how they interact, it feels human: that every incident is its own unique incident,” said Ferguson. “But the more you quantify, you see police are doing similar things across the country. Once you see that repetition in data, you start seeing that these are systemic issues.”

“The reality of an app that can reveal these patterns is another data point to show that this is a structural problem, and needs a structural response,” he said. Ferguson likens the community-generated data to Yelp reviews: the strength in numbers of individual voices.

“We start trusting those things in part because the numbers support the intuition that there may be something going wrong there,” he said. “There’s a sense in many communities that there’s something broken in the police-community relationship, but it’s an extra validation if you can see over and over again citizens giving negative reviews to interactions with police.”

However, Ferguson did note that the more granular the data is, the more useful it becomes. The app allows users to see only the age, gender, race, and location of those stopped, as well as how the interaction made them feel. They cannot see information about the police officer involved.

That data is available, just not to the general public—at least, not yet. While the app also records more detailed information, like the incident’s case number and the officer’s badge number, such information is currently shared only with a city’s mayor’s office, due to privacy and legal concerns.
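
A minimal sketch of how that tiered visibility could work, reusing the hypothetical field names from the sketch above: sensitive identifiers like case and badge numbers are stripped before a report reaches the public map. This is an illustration of the idea, not the app's actual implementation.

```python
# Hypothetical field-level redaction: only these keys ever reach the
# public map; case and badge numbers stay in the restricted record.
PUBLIC_FIELDS = {
    "occurred_at", "latitude", "longitude", "feeling",
    "reporter_age", "reporter_race", "reporter_gender",
}

def redact_for_public_map(report: dict) -> dict:
    """Drop restricted fields from a report before it is plotted
    as a dot on the public-facing map."""
    return {key: value for key, value in report.items() if key in PUBLIC_FIELDS}

full_report = {
    "occurred_at": "2018-06-25T14:30:00",
    "latitude": 37.8044, "longitude": -122.2712,
    "feeling": "disrespected",
    "reporter_age": 36, "reporter_race": "Black",
    "reporter_gender": "genderqueer",
    "case_number": "18-001234",   # restricted: shared only with the mayor's office
    "officer_badge": "A5521",     # restricted: shared only with the mayor's office
}
assert "officer_badge" not in redact_for_public_map(full_report)
```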

Burbank does see the app’s potential to be a double-edged sword. The flip side, he said, is that you are potentially giving criminals a voice, and if an officer’s information is posted publicly, he worries that they could be personally targeted—a concern voiced about some other community-sourced police accountability platforms.

But Burbank still said he would endorse the product if he were still a police chief. The key details the app can provide on the back end—location, time of day, officer involved, person involved, race, ethnicity, gender, gender identity—are significant, he said, because departments can begin to look at how many people a given officer stops and searches, and what those demographics are.

Armed with that data, said Burbank, “I can make an informed decision about whether I want this officer representing me in public, and I can have a data set to back up that decision—and that’s what’s been lacking in policing.”
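
The kind of back-end breakdown Burbank describes could be produced with a simple aggregation over those reports. The sketch below reuses the hypothetical field names from the earlier examples; it illustrates the idea rather than the app's actual analytics.

```python
# Hypothetical aggregation: tally stops per officer, broken down by the
# reported race of the person stopped, using the assumed field names above.
from collections import Counter, defaultdict

def stops_per_officer(reports: list[dict]) -> dict[str, Counter]:
    """Return, for each officer badge number, a count of stops keyed by
    the reporter's race, so a chief can review stop patterns per officer."""
    tallies: dict[str, Counter] = defaultdict(Counter)
    for report in reports:
        tallies[report["officer_badge"]][report["reporter_race"]] += 1
    return dict(tallies)
```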
