Laura Bliss is a staff writer at CityLab, covering transportation and the environment. She also authors MapLab, a biweekly newsletter about maps. Her work has appeared in the New York Times, The Atlantic, Los Angeles magazine, and beyond.
Why do some intersections have such high pedestrian injury rates? It’s not all about the number of cars.
In 2011, there were 15 injury-causing crashes at Seventh Avenue and West 23rd Street in New York City. Nine involved pedestrians struck by vehicles. The intersection had one of the highest rates of pedestrian injury anywhere in the city. So city traffic engineers targeted the Manhattan crossroads for a major safety improvement project in 2013.
Once an open concourse of unseparated car lanes, Seventh Avenue now has two high-visibility pedestrian safety islands and specially marked left turn lanes squeezed between islands and curb. Left turns are banned altogether on one side of 23rd Street, and all four corners of the intersection have audible crosswalk signals.
Crashes causing pedestrian injuries there have gone down 68 percent since, according to city documents. Citywide, pedestrian deaths are ticking up. But fatalities have dropped 34 percent in areas that have seen major street design changes since 2005.
That’s great news. But what, specifically, drives down injuries in such dangerous spots? If you shortened a crosswalk on Fifth Avenue, would it reduce crashes, and by how much? How about narrower lanes? Or banning left turns? Since most improvement projects involve multiple design changes, it’s hard to know exactly what helped, and by how much.
“This is kind of what everyone who works in public policy wants,” says Rob Viola, the director of safety policy and research at the NYC Department of Transportation. “You want to know which piece is doing the most.”
In August 2015, the city teamed up with Datakind, a nonprofit that provides data science services to the social sector, to build a tool capable of projecting the impact of any given engineering intervention, on any given New York City street. An ambitious goal, for sure, and the first step was a bear by itself. Without a good estimate of citywide traffic volumes, it’d be impossible to know the true rate of crashes in any given spot, or gain an accurate sense of what, besides the mere presence of cars, diminishes safety.
“A lot of the things that come up as important to a predictive crash model are related to traffic exposure,” says Michael Dowd, the Datakind data scientist who was the lead on the project.* “The complexity of an intersection, separated roadways, the roadbed area: these are all characteristics connected to how many cars the street is handling.”
New York City has traffic counts for thousands of corridors reaching back to 2008, but not every street and corner is covered, and not every year. So Datakind spent nearly two years developing an “exposure model” capable of estimating car volumes, using exact traffic counts where they did exist and a machine learning model that predicts volumes where they didn’t. Essentially, artificial intelligence software (which Microsoft provided) “reads” real counts on thousands of corridors, “learns” the contours of high- and low-volume streets, and then spits back predictions for similar locations.
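The core idea, as described, is to predict a street's traffic volume from the volumes of streets with similar characteristics. A toy version of that "learn from similar locations" approach can be sketched with a simple nearest-neighbor estimate; the feature set, the numbers, and the method here are illustrative assumptions, not DataKind's actual pipeline.

```python
# A minimal sketch of an "exposure model": estimate traffic volume for
# a street without counts from streets with similar characteristics.
# Features, counts, and the k-nearest-neighbor method are all
# illustrative assumptions, not the project's actual model.

import math

# (lanes, road width in feet, nearby retailers) -> observed daily car count
counted = {
    (6, 60, 40): 30000,   # wide, busy avenue
    (4, 44, 25): 18000,
    (2, 30, 5):  4000,    # narrow side street
    (2, 26, 2):  2500,
}

def predict_volume(features, k=2):
    """Average the counts of the k most similar counted corridors."""
    dists = sorted(
        (math.dist(features, f), volume) for f, volume in counted.items()
    )
    return sum(volume for _, volume in dists[:k]) / k

# A wide corridor with no official count gets an estimate between
# the two high-volume examples it most resembles:
print(predict_volume((5, 50, 30)))  # prints 24000.0
```

A real model would use many more features (signal timings, transit schedules, retail density, and so on, as the article lists) and a learned regressor rather than a hand-rolled distance average, but the shape of the task is the same: fit on corridors with real counts, predict for the rest.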
The data scientists then amassed dozens of datasets that reflect the shape of traffic in New York City—crash rates, street widths, locations of pedestrian plazas, signal timings, bus lanes, bike lanes, transit schedules, the volume of retailers, among many others—to tease out whether any of these particular interventions had a statistically significant effect on crash rates. This would be the underpinning of the vaunted “if, then” predictive safety model.
What did they find? Well, nothing, sort of. Once Datakind ran the exposure model, not one engineering approach stood out as a statistically significant driver of lower crash rates.
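What "not statistically significant" means here can be illustrated with a toy comparison: take exposure-adjusted crash rates at intersections with and without a given intervention, and ask whether the observed gap could plausibly be chance. The data and the permutation test below are invented for illustration; they are not the study's numbers or methodology.

```python
# Toy illustration of testing whether one intervention (say, pedestrian
# islands) is associated with lower crash rates once traffic exposure is
# controlled for. Data and method are illustrative assumptions only.

import random

# Hypothetical crashes per million vehicles, normalized for exposure:
with_islands    = [1.1, 0.9, 1.4, 1.2, 1.0, 1.3]
without_islands = [1.3, 1.0, 1.5, 1.1, 1.4, 1.2]

def mean(xs):
    return sum(xs) / len(xs)

def permutation_p_value(a, b, trials=10_000, seed=0):
    """Fraction of random relabelings whose mean gap is at least as
    large as the observed one (a two-sided permutation test)."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        gap = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if gap >= observed:
            hits += 1
    return hits / trials

p = permutation_p_value(with_islands, without_islands)
print(f"p = {p:.3f}")  # far above 0.05: no significant effect in this tiny sample
```

With only a handful of observations per group, even a real effect can fail to clear the significance bar, which is essentially the "data issue" Dowd describes: too few improvement projects to isolate any single intervention's contribution.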
That doesn’t mean street design has no effect on safety: The 23rd-and-Seventh tweak showed how effective it can be. But there haven’t been enough safety improvement projects to support the kind of nuanced analysis the city hopes to do. “It’s a data issue,” says Dowd.
Viola agrees. “I think with a few more years of building, and more tweaks to the model, we can get closer to that holy grail,” he says.
While the city slowly builds a more robust network of safe-street interventions, the traffic exposure model can still help engineers suss out best practices. If a certain location has lots of traffic and few crashes, there might be something to learn from the street’s design. The city could also study how policies like congestion pricing or eliminating street parking might affect crash rates.
New York City isn’t the only place to benefit from this work: Datakind also worked with Seattle and New Orleans to develop similar tools. All three are also “Vision Zero” cities—they’ve signed on to the global effort to reduce traffic fatalities—so there’s hope that all member metros can learn from the process, too. Data-driven models may never be perfect, but in an era of limited funds and rising pedestrian fatalities, city leaders need all the help they can get to figure out why some street designs save lives.
*CORRECTION: A previous version of this article misattributed this quote to Jake Porway, founder and executive director of Datakind.