Right now, 15 minutes might be the best we can do. But with the right tools, we could completely overhaul the way we forecast tornadoes.
For those caught in the path of the EF4 tornado in Oklahoma last week, 16 minutes was all they had to prepare themselves for the 200 mph winds that would flip cars, twist steel, and all but completely destroy a 1.3-mile-wide swath of the town.
Sixty or so years ago, they would have been lucky to have any warning at all. A predecessor of the National Weather Service has been issuing tornado warnings only since 1938. Before that, the word "tornado" had been banned by the Army Signal Corps, which used to monitor the weather. To the corps, tornadoes were much too unpredictable to track, and the word incited more panic than meaningful response.
The House Environment Subcommittee is drafting a bill to spur forecasting research and technology procurement in the National Weather Service, in part to extend lead times for tornado warnings. The National Weather Service had previously received $23.7 million in forecasting funds in the Sandy Relief bill. The current bill would seek to spur investments in forecasting research (there's no specific dollar figure yet) and to replace satellite systems that will degrade over the next few years.
Previously, the Government Accountability Office had warned of a potential satellite data gap in 2016 or 2017 because old satellites were not being replaced. Also propelling the bill is the fact that European forecasters are outpacing the Americans (European weather models more accurately predicted the path of Hurricane Sandy than the American models did). Barry Lee Myers, CEO of AccuWeather, put it in stark terms at a recent congressional hearing on the bill: "From a national-security standpoint, relying on other countries for better weather models places America in a weak and subservient position."
In a lot of ways, and especially compared with the tracking of large, slow-moving storms such as hurricanes, we haven't progressed very far in issuing tornado warnings. The 16 minutes of warning in Moore, Oklahoma, is close to the average of 13 minutes. Meanwhile, the rate of false alarms for tornado warnings is near 75 percent, according to NOAA. And there's not much to do in the path of a tornado besides shelter in place.
"Because we can see hurricanes, because they are large, and because they move relatively slowly over land and sea areas, we can evacuate people," Myers testified. "With regard to tornadoes, we do the opposite, we expect people to ride out the storm in their bathtubs. That's unacceptable."
For the near term, or at least the next decade, those 15 or so minutes might be the best we can do. But NOAA's Lans Rothfuz explains that the following decade might see a complete overhaul in the way tornadoes are forecast. Right now, the only way to issue a warning for a tornado is to detect one forming. NOAA calls this "warn on detection." The ideal is a "warn on forecast" system, in which warnings come from a complete computer simulation of a storm system rather than from spotting a tornado already taking shape.
"If you're relying on just detecting, some thunderstorms last for about an hour and then they are gone," says Rothfuz, who works in the National Storm Prediction Center. "And so, if you're relying on detection, the best you can hope for is maybe 15 to 30 minutes of lead time."
There are three obstacles to making this transition, Rothfuz says:
One: We need better science—more accurate and complex weather forecasting models that can recreate a storm on a computer. A model that can faithfully simulate a storm can predict its path with greater reliability.
Two: We need better, denser weather data. "In order to get the data that these models need, we have to take it down to finer and finer resolutions," he says. Currently, weather balloons monitor conditions every 100 miles (to use the resolution metaphor, this makes for a pixelated map). Increasing data points means implementing more advanced radars, such as the Phased Array Radar system, which collects data in about one-sixth the time of more common systems. Or using new technologies, such as GPS profiling, which analyzes regular GPS signals as they bend through the atmosphere. Or crowd-sourcing—collecting data from passenger cars equipped with temperature and rain sensors.
Three: We need the sheer computer horsepower to run dense data through complex models.
These changes are in the works, but they may take a while to come to fruition. "We're still talking about 10 years out for that type of technology," Rothfuz says.
Top image: The evening light illuminates a car and a tree stump in an area heavily damaged by the May 20 afternoon tornado in Moore, Oklahoma. (Lucas Jackson/Reuters)
This post originally appeared on National Journal, an Atlantic partner site.