In an era of fewer resources, police could solve more crimes faster if they could leverage the Big Data long buried within their own departments.
About seven years ago, researchers from the University of Memphis approached the city’s police department with the idea that they might be able to detect patterns in local crime – geographic hot spots on the city’s map and moments in time when they’re most likely to flare up – if they could just have access to the department’s crime data. Police departments produce reams of this stuff: arrest warrants, crime-scene reports, traffic citations, mug shots, dispatch transcripts and incident times. But that data has traditionally been painstaking to cross-reference – to mine for connections, let alone future trends.
The researchers ultimately turned the department on to analytics software called SPSS, which had for years been used to crunch data in a host of disciplines not necessarily connected to crime. The department launched a pilot program with it to analyze trends, as part of a strategy of fighting crime through real-time data mining.
“It brought about some resistance from some of the station command staff because, whereas the crime analysts had been doing a certain thing, now we’re going to completely disturb the normalcy of what’s gone on,” says John F. Williams, the police department's crime analysis manager. “We’re going to create an entirely brand new methodology in our approach to reducing crime in the Memphis area.”
You can almost picture that iconic moment in Moneyball, when the grizzled veterans learn that their time-honed human intuition may be replaced by an algorithm. But, of course, analytics work. Williams pulls up some of the most recent results at hand. Comparing the period of March 1-13 this year to the same stretch of 2006, the year in which Memphis really got this program underway, serious crime (the homicides, robberies, rapes, vehicle thefts, etc.) fell by 31.2 percent in the city. And it’s not just that crime dropped, or that officers can now hand over stronger cases to prosecutors.
“The resources that we have now,” Williams says, “will allow us to pretty much solve a crime far faster than what we had in the past.”
This question of time is an intriguing one. The FBI’s Uniform Crime Reports release data – and rate cities – on a mind-boggling array of metrics, from arrest and clearance rates to per-capita crime frequency.
“But I’ve never seen an index on Time to Solve a Crime,” says Michael Valocchi, a vice president of IBM who works with the company's Smarter Cities initiatives. “You always see crime statistics, number of killings, number of robberies. You never see a statistic that says, ‘You know what, robberies are going to happen, but I solve them four times faster than my neighboring city.’”
You’d want to live there, right?
• • • • •
In a conference room of the U.S. Capitol Visitor Center in Washington earlier this month, Chriss Knisley was demonstrating a pair of IBM’s favorite new crime-fighting tools for some congressional staff and colleagues. IBM acquired SPSS back in 2009, and did the same late last year with Knisley’s software company, i2. On a computer monitor, Knisley had pulled up a program called COPLINK, which sucks into one massive database all that disjointed information that was once scribbled down by hand.
“Most police officers view those [reports] as a black hole,” he says. “They write up a history report, it gets filed in the system, they never see it again, never use it again. Our goal is to make that information useful to solve crimes.”
He was showing off a real-life incident study, the case of a 10-year-old girl kidnapped in Tucson from a playground, right before the eyes of two other young girls and not far from her parents. Ten-year-old girls aren’t usually great witnesses, and in this case they offered up only a few vague descriptors: two guys, either white or Hispanic, one called the other “Waydo,” and they drove off in a two-door red car of some kind.
Knisley starts to plug this information into his system, as the officer actually did at the scene of the crime. First the software spits out 29 names. Then Knisley asks for a second male associate, and a two-door red car. Each piece of information is connected to myriad others: convicted criminals are linked to the names of every accomplice and victim known in the system, every car and street address they’ve ever been associated with, every alias they’ve gone by and arrest warrant they’ve been served. IBM has even been working on software that can reconcile typos and misspelled identities, correctly pegging Chriss Knisley, Chris Knisley and Chris Knisely as the same man.
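IBM doesn’t publish the matching logic behind that identity reconciliation, but the underlying idea – scoring how similar two name strings are and merging records that clear a threshold – can be sketched in a few lines of Python with the standard library’s difflib. The names, the threshold and the helper function here are illustrative assumptions, not IBM’s actual code:

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Return True when two names are close enough to plausibly be one person."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Hypothetical name fields pulled from separate reports in the database.
records = ["Chriss Knisley", "Chris Knisley", "Chris Knisely", "Mark Cleverley"]

# Which records likely refer to the same man, typos and all?
matches = [name for name in records if similar("Chris Knisley", name)]
```

A production record-linkage system would weigh more than spelling – dates of birth, addresses, known associates – but the same score-and-threshold pattern sits underneath.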
Eventually, the system comes up with a suspect with 53 prior arrests – several for child abductions. In the real-life scenario, dispatches were sent out for all of the vehicles associated with his name, and the kidnapped child was recovered just half an hour after her abduction.
The software here was relying only on data already contained within the police department (this does suggest that a first-time offender would have been much harder to catch). But that data has never been this useful before.
“The problem of these islands of information is arguably the No. 1 problem,” says Mark Cleverley, director for IBM Public Safety Solutions, who was standing nearby. “And it’s an old problem.”
Combine these tools with other streams of information from outside of a police department, and they become even more effective. Subpoenaed phone or financial transaction records can instantly be visualized for connections. Or otherwise mundane insights – say, that the first Tuesday of every month is a common payday for local cash workers – can be woven into predictive models.
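To make the payday example concrete: one hedged sketch of how a calendar insight gets “woven in” is simply to encode it as a feature and count offenses against it, surfacing a pattern an analyst might then test. The incident log below is invented for illustration, not Memphis data:

```python
from collections import defaultdict
from datetime import date

def is_first_tuesday(d: date) -> bool:
    # The first Tuesday of a month is the Tuesday falling on days 1-7.
    return d.weekday() == 1 and d.day <= 7

# Hypothetical incident log: (date, offense) pairs a department might export.
incidents = [
    (date(2012, 3, 6), "robbery"),    # a first Tuesday
    (date(2012, 3, 6), "robbery"),
    (date(2012, 3, 7), "burglary"),
    (date(2012, 4, 3), "robbery"),    # another first Tuesday
    (date(2012, 4, 10), "theft"),
]

# Bucket counts by the calendar feature to expose a hypothesis worth policing.
counts = defaultdict(int)
for when, offense in incidents:
    counts[(is_first_tuesday(when), offense)] += 1
```

If robberies cluster in the `(True, "robbery")` bucket across months, that’s exactly the kind of hypothesis – not a prediction about any individual – that Cleverley describes.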
“Predictive policing is really the use of information or analytics to generate a hypothesis about a state or a jurisdiction,” Cleverley says. “But what it isn’t is predicting that this individual is going to do this particular thing. That’s a little bit far-fetched.”
• • • • •
Memphis has been using its tools both to predict future crime trends – and deploy officers in anticipation of them – and to aid officers on the ground with instantaneous access to all the types of information Knisley was previewing. Officers in Memphis carry PDAs that automatically geocode incident reports the moment they’re filed into the system from a crime scene.
Those officers are, in essence, pulling and feeding information all in real time. Before this was possible, an officer might have called into a radio dispatcher to pull a suspect’s criminal history. Or, worse yet, he or she might have driven back to the station to rifle through paperwork. Maybe you can wait that long to recover a stolen car stereo. But other crimes – kidnapped children, for one – are a different matter.
“If an agency of this size is still taking paper reports,” Williams says, “their data cannot be as accurate as it should. When an officer takes a paper report, sometimes it may be three days to a week before that paper report is actually physically entered into the records management system by someone.”
Officials from other cities – Cincinnati, Baltimore, Boston, Las Vegas – have been traveling to Memphis for the past few years to watch the system at work. Digging through his email, Williams comes up with a new request to come visit next month, from an academic in Tokyo.
In cities across the U.S., urban crime has been falling for the past decade, and this national context is worth remembering as Memphis touts its tumbling numbers. But as more departments upgrade to these analytical tools – and then link with each other across jurisdictions in the process – it’s possible that cash-strapped cities could accelerate the public-safety gains of the last decade by, among other things, turning time-consuming cases into 30-minute ones.