Data analysts are trying to give community development advocates the tools they need to fight displacement and economic decline.
“I know it when I see it” is as true for gentrification as it is for pornography. Usually, it’s when a neighborhood’s property values and demographics are already changing that the worries about displacement set in—rousing housing advocates and community organizers to action. But by that time, it’s often hard to pause and put safeguards in place for the neighborhood’s most vulnerable residents.
But what if there were an early warning system that detects where price appreciation or decline is about to occur? Predictive tools like this have been developed around the country, most notably by researchers in San Francisco. Their value is clear: they let city leaders and non-profits pinpoint, ahead of time, where to preserve existing affordable housing, where to build more, and where to attract business investment. But these tools are often too academic or too obscure, and it’s not yet clear how policymakers and planners are actually using them.
Mallach’s non-profit focuses on revitalizing distressed neighborhoods, particularly in “legacy cities.” These are towns like St. Louis, Flint, Dayton, and Baltimore that have experienced population loss and economic contraction in recent years, and suffer from property vacancies, blight, and unemployment. Mallach is interested in understanding which neighborhoods are likely to continue down that path, and which will do a 180-degree turn. Right now, he can make those predictions intuitively, based on his observations of neighborhood characteristics like housing stock, median income, and race. But an objective assessment can help confirm or refute his hypotheses.
That’s where Steif comes in. Having consulted with cities and non-profits on place-based data analytics, Steif has developed a number of algorithms that predict the movement of housing markets using expensive private data from entities like Zillow. Mallach suggested he try his algorithms on Census data, which is free and standardized.
The phenomenon he tested was ‘endogenous gentrification’—the idea that an increase in home prices spreads from wealthy neighborhoods to the less expensive ones around them, like a wave. In his blog post, Steif explains:
Typically, urban residents trade off proximity to amenities with their willingness to pay for housing. Because areas in close proximity to the highest quality amenities are the least affordable, the theory suggests that gentrifiers will choose to live in an adjacent neighborhood within a reasonable distance of an amenity center but with lower housing costs.
Steif used Census data from 1990 and 2000 to predict housing-price change in 2010 across 29 big and small legacy cities. His algorithms took into account the relationship between a census tract’s median home price and those of the tracts around it, each tract’s proximity to high-cost areas, and the spatial patterns in home-price distribution. They also folded in variables like race, income, and housing supply, among others.
After cross-checking the 2010 predictions against actual home prices, he projected neighborhood change all the way to 2020. Overall, his algorithms computed the speed and breadth of the wave of gentrification over time reasonably well, though they were more reliable for some cities, like St. Louis, than for others, like New Haven. More granular, nimble data could sharpen the forecasts, Steif says.
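The general approach described above—scoring each tract against its neighbors’ prices, fitting a model on one decade of change, then projecting the next—can be sketched in a few lines. This is not Steif’s actual model: the tract prices below are made up, the k-nearest-neighbor average is a simple stand-in for his spatial-lag features, and an ordinary least-squares fit stands in for his algorithms.

```python
import numpy as np

# Toy data: 6 census tracts with (x, y) centroids and median home
# prices (in $1,000s) for 1990 and 2000. All values are invented.
coords = np.array([[0, 0], [1, 0], [2, 0], [0, 1], [1, 1], [2, 1]], dtype=float)
price_1990 = np.array([200, 120, 90, 180, 110, 85], dtype=float)
price_2000 = np.array([240, 160, 100, 210, 150, 95], dtype=float)

def spatial_lag(coords, values, k=2):
    """Mean value of each tract's k nearest neighbors -- a crude
    stand-in for the 'prices of surrounding tracts' feature."""
    lag = np.empty(len(values))
    for i in range(len(values)):
        d = np.linalg.norm(coords - coords[i], axis=1)
        d[i] = np.inf                      # exclude the tract itself
        lag[i] = values[np.argsort(d)[:k]].mean()
    return lag

def features(coords, prices):
    """Own price, neighbors' price, and distance to the priciest tract."""
    dist_to_peak = np.linalg.norm(coords - coords[np.argmax(prices)], axis=1)
    return np.column_stack([np.ones(len(prices)), prices,
                            spatial_lag(coords, prices), dist_to_peak])

# Fit the 1990 -> 2000 price change by least squares...
beta, *_ = np.linalg.lstsq(features(coords, price_1990),
                           price_2000 - price_1990, rcond=None)

# ...then reuse the fitted coefficients to project the next decade.
predicted_2010 = price_2000 + features(coords, price_2000) @ beta
print(predicted_2010.round(1))
```

In a real analysis the toy arrays would be replaced by tract-level Census/ACS medians, the validation step would compare `predicted_2010` against observed 2010 prices before projecting onward to 2020, and the model would include the demographic covariates (race, income, housing supply) the article mentions.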
Here’s what the neighborhood change in Detroit, Minneapolis, and Pittsburgh looks like, mapped:
With more resources and more granular data, Steif could bake his algorithms into a web tool like the one mocked up below:
This kind of web app would not only predict the probability of redevelopment, but also track how it’s happening in real time, showing new construction permits, rehab permits, and evictions for each parcel of land in the city. That could help not-so-data-savvy community development groups get ahead of the gentrification wave before it crashes down on them.
Of course, tools like this can also be used by developers or average Joes looking to profit off neighborhood change—and already are. The point is to make sure the playing field is level. “Technology and open data can democratize this information,” he says. “The private market will always respond faster to market potential but that doesn’t mean that this intelligence shouldn’t be used to promote equity.”