In late spring of 2012, climatic chaos descended upon the Midwest and Great Plains in the midst of the growing season. A drought is supposed to unfold on a timeline of seasons to years, but in the two weeks between June 12 and 26, the High Plains went from what a monitoring group called “abnormally dry” to “severe drought.” The affected area ballooned from covering 30 percent of the continental US in May to over 60 percent by August, and agricultural losses ran into the tens of billions of dollars.
The region had crashed into a “flash” drought—think of it like a flash flood, only far bigger and therefore far more consequential. It’s a phenomenon science is just beginning to understand, let alone predict. But today in the journal Nature Climate Change, two dozen researchers—atmospheric scientists, computer scientists, climate scientists, and more—are publishing a perspective piece trying to get their community to agree on a standard definition for a flash drought, and to set research priorities for the future. Why, for instance, do flash droughts happen in the first place? How can scientists get better at predicting them and giving water managers warning? And if climate change is making the world drier in general, what does that mean for flash droughts?
“I think the challenge with drought, just in general, that makes it so much different than any other hazard—much more challenging and very costly—is the fact that it has a very potentially large spatial footprint and a very potentially long temporal footprint,” says Mark Svoboda, director of the National Drought Mitigation Center and coauthor on the new paper. “Compared to a flood, earthquake, hurricane, tornado, there they have a relatively small impact area, and they last a very short amount of time.”
Here’s the first tricky bit: Calling a drought a “drought” is both an objective and subjective science. The objective side comes from raw data about precipitation and soil moisture. “But there are also these things that are coming from people on the ground, their opinions and their subjective observations,” says Angeline Pendergrass, an atmospheric scientist at the National Center for Atmospheric Research and lead author on the new paper. “And so this is a very rich data set, but it’s also not entirely objective.”
By people on the ground, she doesn’t mean old folks sitting on porches saying that their joints aren’t acting up, so there must not be any rain coming. She means water managers, who control the distribution of water to residents and industries, and local officials who talk to farmers and other workers likely to be economically affected by drought.
These insights are combined with those objective measurements by the US Drought Monitor—a collaboration of federal agencies, like the USDA and NOAA, and Svoboda’s National Drought Mitigation Center—which updates a map every Thursday showing which parts of the country are in drought and how severely they are affected. The group ranks conditions from D0 (“abnormally dry” yet still not a drought) to D4 (“exceptional,” or the worst-case scenario).
It was this body that determined the High Plains jumped from D0 to D2 (“severe drought”) over those two weeks in 2012. “So one measure of intensification is changing by two categories of drought in two weeks,” says Pendergrass.
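That intensification measure is simple enough to sketch in a few lines of code. This is only an illustration, not anything the Drought Monitor actually runs: the `flash_drought_onset` function and its inputs are hypothetical, while the D0–D4 categories come from the Monitor’s own scale.

```python
# Illustrative sketch (not the Drought Monitor's actual method): flag a
# flash drought when an area's category worsens by at least two D-levels
# within two weeks, the intensification measure Pendergrass describes.
# Function and variable names here are hypothetical.

# Map the Monitor's categories to numeric levels; None means no drought.
CATEGORY_ORDER = {None: -1, "D0": 0, "D1": 1, "D2": 2, "D3": 3, "D4": 4}

def flash_drought_onset(weekly_categories, jump=2, window_weeks=2):
    """Return True if any span of `window_weeks` weekly snapshots
    worsens by at least `jump` drought categories."""
    levels = [CATEGORY_ORDER[c] for c in weekly_categories]
    for i in range(len(levels) - window_weeks):
        if levels[i + window_weeks] - levels[i] >= jump:
            return True
    return False

# The 2012 High Plains case: D0 to D2 ("severe drought") in two weeks.
print(flash_drought_onset(["D0", "D1", "D2"]))  # True
```

Because the Monitor’s maps update weekly, each list entry stands in for one Thursday snapshot, so a two-category jump across two entries matches the 2012 example.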
The problem is that this score deals in subjective judgments, and necessarily so. For scientists to actually quantify how things might get worse in the future thanks to climate change, they need objective measurements. “We don’t have a US Drought Monitor for the future,” Pendergrass says. “We need to have a different, more objective definition of flash drought in order to be able to even quantify how models project that flash drought could change.”