Predicting the Future of Drought Prediction

Better instruments and models could help scientists forecast droughts years in advance

Photo of farmer viewing dry land
Photo: Scott Olson/Getty Images

As extreme weather events go, droughts—like the one that singed Russia’s wheat crop two summers ago and the one that engulfed the United States in July—are about as tricky as it gets. Unlike hurricanes and tornadoes, drought does not have an obvious start or end. In fact, there isn’t even a clear definition for it, making it hard to measure and monitor, let alone predict. But with better observations of the earth, oceans, and atmosphere and improvements in computer modeling, scientists think they’ll be able to foresee the chances of drought up to a decade in advance, and better predict droughts that arise suddenly or last longer than a few months.

Today, scientists can forecast drought with any significant certainty only about three months ahead for most parts of the world.

“Forecasting drought is an inexact science,” says Brian Fuchs, a climatologist at the National Drought Mitigation Center (NDMC) at the University of Nebraska–Lincoln. “Drought is typically characterized by slow onset and slow recovery. To pick up signals of something that happens over weeks or months is hard for computer models.”

Drought involves a seemingly inexhaustible list of factors—from local ones such as groundwater level, stream flow, soil moisture, and vegetation, to large-scale global weather patterns such as El Niño and La Niña. All of these change over different time periods, from days to decades, and many are tied to each other. Global warming tops off the chaotic mix. 


Image: National Drought Mitigation Center
Drying Up: A screen shot from the U.S. Drought Monitor in late July. Other countries are working on similar systems.

Nevertheless, scientists in the United States have produced some of the most sophisticated tools to monitor and predict drought. Resource planners and policymakers rely mainly on the U.S. Drought Monitor, an online map of dryness that has been updated weekly since it was unveiled in 2000 by the Department of Agriculture, the National Oceanic and Atmospheric Administration (NOAA), and the NDMC. The monitor combines several mathematical indices calculated by feeding computer models with temperature, precipitation, and soil moisture data. The physical data comes from NOAA’s polar-orbiting satellites and on-the-ground temperature, rain, and snow gauges. “The U.S. leads in drought monitoring,” Fuchs says. “Many other countries are trying to emulate the U.S. Drought Monitor.”
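The Drought Monitor's "convergence of evidence" boils those indices down to a single category, D0 (abnormally dry) through D4 (exceptional drought), keyed to how historically rare current conditions are. The sketch below shows that final percentile-to-category mapping, using the percentile ranges the Drought Monitor publishes; the blending of multiple indices into one percentile is simplified here to a plain average, which is an assumption (the real map reflects expert judgment, not a formula):

```python
def blend_indices(percentiles):
    """Toy blend: average the percentile ranks of several drought
    indices (e.g., precipitation, soil moisture, streamflow).
    The actual Drought Monitor weighs these by expert judgment."""
    return sum(percentiles) / len(percentiles)

def usdm_category(percentile):
    """Map a historical percentile rank (0-100, lower = drier)
    to a U.S. Drought Monitor category."""
    if percentile <= 2:
        return "D4"  # exceptional drought (driest ~2% of years)
    if percentile <= 5:
        return "D3"  # extreme drought
    if percentile <= 10:
        return "D2"  # severe drought
    if percentile <= 20:
        return "D1"  # moderate drought
    if percentile <= 30:
        return "D0"  # abnormally dry
    return None      # no drought designation

# Example: three indices all ranking among the driest few percent
print(usdm_category(blend_indices([3, 6, 9])))  # -> D2
```

The percentile framing is what lets one map compare a dry spell in Nebraska with one in Nevada: each region is ranked against its own history.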

While monitoring has been done for decades, forecasting drought is still in its infancy. Today, it blends science and art, and the only place it’s done on a national scale is at NOAA’s Climate Prediction Center (CPC) in Camp Springs, Md. Twice every month, meteorologists there subjectively produce the U.S. Seasonal Drought Outlook, which predicts conditions for the next three months.

To create the Drought Outlook, scientists mix data from the Drought Monitor with soil moisture information and the CPC’s three-month forecast of temperature and rainfall. They also incorporate current climate conditions along with past heat and precipitation.
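The Outlook is ultimately a human judgment, but its core logic, which is to carry the current Monitor state forward unless the seasonal forecast argues for change, can be caricatured in a few lines. Every threshold and output label below is illustrative, not CPC's:

```python
def drought_outlook(current_category, precip_forecast, temp_forecast):
    """Toy persistence-based outlook, loosely mirroring how forecasters
    combine the Drought Monitor state with the CPC's three-month
    temperature and precipitation outlooks. All logic illustrative.

    current_category: None or "D0".."D4" from the Drought Monitor
    precip_forecast, temp_forecast: anomaly signs, -1 / 0 / +1
    (below normal / near normal / above normal)
    """
    if current_category is None:
        if precip_forecast < 0:
            return "drought may develop"
        return "no drought expected"
    # Above-normal rain with no extra heat tends to ease a drought;
    # below-normal rain or above-normal heat tends to entrench it.
    if precip_forecast > 0 and temp_forecast <= 0:
        return "drought improves or ends"
    if precip_forecast < 0 or temp_forecast > 0:
        return "drought persists or intensifies"
    return "drought persists"

print(drought_outlook("D2", precip_forecast=-1, temp_forecast=+1))
# -> drought persists or intensifies
```

The hard part, of course, is the three-month temperature and rainfall forecast feeding this logic, which is where the climate model comes in.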

The temperature and rainfall outlook comes from NOAA’s climate model, which runs on an IBM Power6 mainframe computer and takes half a day to finish one simulation run. The model, which was unveiled seven years ago, couples ocean and atmosphere models that had to be run separately before, explains Dan Collins, a forecaster at the CPC. The newer, improved version that went into operation this year also includes things like sea- and land-ice melting and changing levels of carbon dioxide in the atmosphere. 

Collins says that more-powerful computers would make it possible to include more-detailed physics and more climate system components. “You want to better simulate the way in which clouds are formed or water evaporates from soil,” he says. Greater computing prowess would also mean that each simulation step would cover a shorter time and distance—a few minutes and a few kilometers as opposed to the hours and tens of kilometers used now. This would improve the ability to capture smaller, more rapid changes in temperature and precipitation. 
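The computational price of that finer grid is steep: refining the horizontal spacing by some factor multiplies the number of grid columns by that factor squared, and numerical stability (the CFL condition) forces the time step to shrink by roughly the same factor, adding one more power. A back-of-the-envelope sketch, assuming this simple scaling and ignoring vertical refinement unless requested:

```python
def relative_cost(refine_factor, refine_vertical=False):
    """Rough cost multiplier for running a grid model at finer resolution.

    refine_factor: how much finer the horizontal grid spacing gets
    (e.g., 10 for going from ~30-km cells to ~3-km cells).
    Cell count grows as factor**2 horizontally (factor**3 if the
    vertical levels are refined too), and the CFL stability condition
    shrinks the allowed time step by the same factor: one more power.
    """
    spatial_dims = 3 if refine_vertical else 2
    return refine_factor ** (spatial_dims + 1)

# Going from tens of kilometers and hourly steps to a few kilometers
# and minutes-long steps costs about a thousand times the compute:
print(relative_cost(10))  # -> 1000
```

That cubic-to-quartic growth is why each jump in forecast resolution has historically had to wait for a new generation of supercomputer.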

Other advances expected in the coming years include more extensive ground observation networks and NOAA’s next-generation polar satellite, which is scheduled for launch in 2016 and will be equipped with advanced visual and infrared imagers that produce data with higher temporal and spatial resolution. These will give better temperature, precipitation, and soil moisture data, which will improve the accuracy of drought prediction, says Fuchs. Already, the Drought Monitor maps have gotten more refined over the past 12 years because of resolution improvements in satellite gear, he says. What’s more, NASA is developing satellite-based instruments that are better at measuring soil moisture, he adds. The Drought Monitor currently relies on soil moisture values that are estimated indirectly by computer models from rainfall and temperature observations. 
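Those model-based soil moisture estimates typically come from a simple water-balance scheme, sometimes called a "leaky bucket": precipitation fills the bucket, while temperature-driven evaporation drains it. A minimal sketch of one time step, with made-up coefficients (the capacity and evaporation parameters here are illustrative, not NOAA's):

```python
def step_soil_moisture(w, precip_mm, temp_c,
                       capacity_mm=760.0, evap_coeff=0.5):
    """One time step of a toy leaky-bucket water balance.

    w: current soil moisture (mm of stored water)
    precip_mm: precipitation this step (mm)
    temp_c: mean temperature (deg C), driving evaporative demand
    capacity_mm, evap_coeff: illustrative parameters only.
    Returns updated soil moisture, clipped to [0, capacity].
    """
    # Crude evaporative demand: grows with temperature above freezing.
    demand = max(0.0, temp_c) * evap_coeff
    # Wetter soil loses water faster; bone-dry soil loses none.
    evaporation = demand * (w / capacity_mm)
    w = w + precip_mm - evaporation
    return min(max(w, 0.0), capacity_mm)

# A hot, dry month draws the bucket down; a cool, wet one refills it.
w = 400.0
w = step_soil_moisture(w, precip_mm=5.0, temp_c=30.0)   # drier
w = step_soil_moisture(w, precip_mm=120.0, temp_c=10.0) # wetter
```

Direct satellite measurements of soil moisture would replace this kind of inference with observation, which is why Fuchs flags them as a meaningful upgrade.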

Collins anticipates that longer-term drought prediction could be possible five years from now. That’s when NOAA scientists are expecting to next upgrade the climate model, to include decade-long climate fluctuations. They hope to do this by simulating known long-term ocean fluctuations such as the Atlantic multidecadal oscillation. Fuchs says, “I anticipate continuing to see drought monitoring and prediction evolve, and technology will be a driving factor.”


Smokey the AI

Smart image analysis algorithms, fed by cameras carried by drones and ground vehicles, can help power companies prevent forest fires


The 2021 Dixie Fire in northern California is suspected of being caused by Pacific Gas & Electric's equipment. The fire is the second-largest in California history.

Robyn Beck/AFP/Getty Images

The 2020 fire season in the United States was the worst in at least 70 years, with some 4 million hectares burned on the West Coast alone. These West Coast fires killed at least 37 people, destroyed hundreds of structures, caused nearly US $20 billion in damage, and filled the air with smoke that threatened the health of millions of people. And this was on top of a 2018 fire season that burned more than 700,000 hectares of land in California, and a 2019-to-2020 wildfire season in Australia that torched nearly 18 million hectares.

While some of these fires started from human carelessness—or arson—far too many were sparked and spread by power lines and other electrical infrastructure. The California Department of Forestry and Fire Protection (Cal Fire) calculates that nearly 100,000 burned hectares of those 2018 California fires were the fault of the electric power infrastructure, including the devastating Camp Fire, which wiped out most of the town of Paradise. And in July of this year, Pacific Gas & Electric indicated that blown fuses on one of its utility poles may have sparked the Dixie Fire, which burned nearly 400,000 hectares.

Until these recent disasters, most people, even those living in vulnerable areas, didn't give much thought to the fire risk from the electrical infrastructure. Power companies trim trees and inspect lines on a regular—if not particularly frequent—basis.

However, the frequency of these inspections has changed little over the years, even though climate change is bringing the drier, hotter conditions that lead to more intense wildfires. In addition, many key electrical components, including insulators, transformers, arrestors, and splices more than 40 years old, are past their expected service lives. Many transmission towers, most built for a 40-year lifespan, are entering their final decade.
