1 June 2011—The clouds spat out an intermittent drizzle of rain as Jane Lubchenco, administrator of the National Oceanic and Atmospheric Administration (NOAA), stood outside NOAA’s Satellite Operations Facility and revealed the North American hurricane season forecast for 2011. "NOAA’s forecast team is calling for an above-normal season this year," Lubchenco said.
This hurricane season, which begins today and runs through the end of November, is likely to produce 6 to 10 hurricanes in the Atlantic basin, Lubchenco said. Of those storms, 3 to 6 are expected to become major hurricanes, rated Category 3 or above, with sustained winds of at least 178 kilometers per hour. In other words, Lubchenco told the assembled reporters and TV cameras, coastal communities should get ready, because NOAA’s outlooks have a strong track record. Last year’s outlook was "amazingly accurate," Lubchenco said: NOAA predicted 8 to 14 hurricanes, with 3 to 7 major ones, and North America got 12 hurricanes, 5 of them major. "It just blew my mind," said Lubchenco. "So we know we’re getting pretty good at making the outlook."
How do the wizards inside NOAA foresee future destruction and spy the winds whirling toward us months in advance? It starts at the Satellite Operations Facility, in Suitland, Md., where the antenna dishes that stud the roof pull down more than 16 gigabytes of environmental data every day from a fleet of 15 weather satellites. Inside the building’s cavernous control room, the satellites’ stewards calmly put their "birds" through their daily paces.
The polar-orbiting satellites, which provide most of the data for NOAA’s forecast models, keep an eye on Earth from 870 km up. They circle from pole to pole 14 times each day while the globe rotates eastward beneath them, giving the satellites a different view with each orbit. Because these spacecraft are aging, operators are eagerly looking forward to the next-generation polar satellite, which is scheduled to launch (if the U.S. Congress comes through with the funding) in 2015. As he oversees the control room, shift supervisor Ron Rademacher explains that the main improvements will be in the visual- and infrared-imaging systems. "It’s just like the evolution of cellphones and computers," he says. "You’re not changing what kind of data you’re getting, but you can get a lot more of it, and it’s better quality."
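The 14-orbits-per-day figure follows directly from the satellites’ altitude. A quick sanity check with Kepler’s third law (not NOAA code; the constants are standard textbook values for Earth):

```python
import math

# Estimate the orbital period of a satellite at ~870 km altitude
# and the resulting number of orbits per day.
MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000        # mean Earth radius, m

altitude_m = 870_000
a = R_EARTH + altitude_m                         # semi-major axis of a circular orbit
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)  # Kepler's third law
orbits_per_day = 86_400 / period_s

print(f"Orbital period: {period_s / 60:.1f} minutes")   # ~102 minutes
print(f"Orbits per day: {orbits_per_day:.1f}")          # ~14
```

One orbit takes roughly 100 minutes, so the satellite does indeed lap the planet about 14 times a day, seeing a fresh swath of Earth each time.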
The cloud systems and ocean currents that whirl slowly beneath the satellites’ cameras are all fed into NOAA’s climate models, explains Gerry Bell, who serves as lead forecaster for the hurricane season at NOAA’s Climate Prediction Center. "Knowing what the West African monsoon is doing is critical," Bell tells IEEE Spectrum, "because that’s ultimately where a lot of these storm systems originate. And there’s very little climate data coming out of West Africa." The satellites also provide hard-to-obtain data about El Niño or La Niña weather patterns, which form over the equatorial Pacific Ocean. "We have an array of buoys out there, and we have ships out there, but satellite data is critical," Bell says.
Each April, all the satellite data about both atmospheric and ocean conditions are fed into a computer model of the global climate; those conditions are the starting points for a season-long simulation. The simulation runs on an IBM Power6 mainframe computer in Gaithersburg, Md., which can reach speeds of 69.7 teraflops. Jae Schemm, who oversees the models at the Climate Prediction Center, explains that it takes about three days of compute time to complete one eight-month run of the season. "It’s very expensive. That’s why we can’t do it routinely," she says.
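Some back-of-the-envelope arithmetic puts "very expensive" in perspective (the 30-day months and the sequential-run scenario below are simplifying assumptions, not figures from NOAA):

```python
# Rough arithmetic on the cost of a seasonal run, using figures from the text.
SIM_MONTHS = 8
SIM_DAYS = SIM_MONTHS * 30        # ~240 simulated days, assuming 30-day months
WALL_CLOCK_DAYS_PER_RUN = 3       # "about three days of compute time" per run

# The model churns through roughly 80 simulated days per day of computing.
sim_days_per_wall_day = SIM_DAYS / WALL_CLOCK_DAYS_PER_RUN
print(f"{sim_days_per_wall_day:.0f} simulated days per wall-clock day")

# Running this year's 19 simulations back to back would take nearly two
# months of machine time, which hints at why the runs can't be done routinely.
runs = 19
print(f"{runs} sequential runs: {runs * WALL_CLOCK_DAYS_PER_RUN} wall-clock days")
```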
This year, Schemm’s team managed to complete 19 simulation runs in April and May to provide information for NOAA’s hurricane outlook. "In these model runs, we actually see tropical storms developing in the Atlantic basin and track their progress," Schemm says. "That’s how we come up with the number of storms we expect for the season." The storm counts produced by NOAA’s model runs are then combined with other statistical estimates based on current climate conditions to produce the final hurricane outlook.
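The article doesn’t spell out how 19 runs are distilled into a single range, but one common approach is to treat the runs as an ensemble and report a central interval of the simulated storm counts. A hypothetical sketch, with entirely invented counts:

```python
# Hypothetical ensemble of hurricane counts from 19 simulation runs.
# These numbers are invented for illustration; they are not NOAA's.
ensemble_counts = [7, 9, 6, 10, 8, 7, 11, 9, 8, 6, 10, 7, 9, 8, 12, 7, 9, 10, 8]

ensemble_counts.sort()
n = len(ensemble_counts)

# Report roughly the central two-thirds of the ensemble as the forecast range,
# trimming the most extreme runs from each end.
trim = n // 6
low, high = ensemble_counts[trim], ensemble_counts[-1 - trim]
print(f"Forecast range: {low} to {high} hurricanes")  # 7 to 10 for these counts
```

In practice, as the article notes, such model-derived counts are blended with statistical estimates before the final outlook is issued.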
As sophisticated as NOAA’s climate model is, it has just gotten an upgrade: on 30 March, the newest version of the model went into operation. In the previous version, the atmospheric and ocean models exchanged information once every "day" of the simulation. In the new version, the two models swap data every time the ocean model takes a step forward, says Schemm. The new model also begins to incorporate some of the subtleties of global warming. It includes information about melting sea ice near the poles, as well as the annual readings of carbon dioxide concentrations in the atmosphere. "In the previous version we just used a fixed concentration of CO2, but it’s changing every year," explains Schemm. "There’s a gradual linear increase over the years."
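The difference between the two coupling schemes can be sketched as a toy loop (this is an illustration of the scheduling change only, not the real CFS model; the one-hour ocean time step is an assumption):

```python
# Toy illustration of the coupling-frequency change: the old scheme exchanged
# fields once per simulated day; the new scheme exchanges them on every
# ocean time step.
OCEAN_STEP_HOURS = 1      # assumed ocean time step, for illustration
HOURS_PER_DAY = 24

def run_coupled(sim_days: int, swap_every_ocean_step: bool) -> int:
    """Count atmosphere-ocean data exchanges over a simulated season."""
    swaps = 0
    for _day in range(sim_days):
        for _step in range(HOURS_PER_DAY // OCEAN_STEP_HOURS):
            # ... advance the ocean and atmosphere models one step ...
            if swap_every_ocean_step:
                swaps += 1    # new scheme: exchange fields every ocean step
        if not swap_every_ocean_step:
            swaps += 1        # old scheme: exchange once per simulated "day"
    return swaps

print("old scheme, ~8 months:", run_coupled(240, False))  # 240 exchanges
print("new scheme, ~8 months:", run_coupled(240, True))   # 5760 exchanges
```

With an hourly ocean step, the new scheme exchanges data 24 times as often, letting each model react to the other within the same simulated day.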
The new model wasn’t used for this year’s hurricane outlook, and it won’t be used next year, either. Schemm says that it must first be tested on historical climate conditions dating back to 1981. The simulation results for each hurricane season can then be compared with the output of earlier climate models and with the real storm counts produced by each year’s wind patterns and ocean temperatures. The Climate Prediction Center hasn’t yet had the computer time to test the new model against all 30 hurricane seasons.
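Hindcast verification of this kind boils down to comparing predicted and observed counts season by season. A sketch using the real observed Atlantic hurricane counts for 2005-2010 but entirely invented model predictions, scored with mean absolute error:

```python
# Observed Atlantic hurricane counts (real historical values).
observed = {2005: 15, 2006: 5, 2007: 6, 2008: 8, 2009: 3, 2010: 12}

# Hypothetical hindcast counts for an old and a new model version.
# These predictions are invented for illustration.
old_model = {2005: 11, 2006: 8, 2007: 8, 2008: 7, 2009: 6, 2010: 9}
new_model = {2005: 13, 2006: 6, 2007: 7, 2008: 9, 2009: 4, 2010: 11}

def mean_abs_error(pred: dict) -> float:
    """Average absolute difference between predicted and observed counts."""
    return sum(abs(pred[y] - observed[y]) for y in observed) / len(observed)

print(f"old model MAE: {mean_abs_error(old_model):.2f} hurricanes")
print(f"new model MAE: {mean_abs_error(new_model):.2f} hurricanes")
```

A lower error across the 30 hindcast seasons would be the evidence that the new model has earned its place in the operational outlook.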
Bell notes that just five years ago, NOAA barely used any climate models to produce its hurricane season forecasts and instead relied on statistical tools. He says he’s delighted to see the models evolving and improving so rapidly. Once the new model goes into service, Bell says, NOAA forecasters can take the next step: predicting not just how many dangerous storms the Atlantic Ocean will give birth to but also how many will make landfall and which stretches of coastline should prepare for a blow. "We don’t do that now," says Bell. "But with the continued development of these models, it’s now at least a possibility."
A correction to this article was made on 3 June 2011.
Senior Editor Eliza Strickland joined IEEE Spectrum in March 2011 and was initially assigned the Asia beat. She got down to business several days later when the Fukushima Daiichi nuclear disaster began. Strickland shared a Neal Award for news coverage of that catastrophe and wrote the definitive account of the accident's first 24 hours. She next moved to the biomedical engineering beat and managed Spectrum's 2015 special report, "Hacking the Human OS." That report spawned the Human OS blog about emerging technologies that are enabling a more precise and personalized kind of medicine. The blog reports on wearable sensors, big-data analytics, and neural implants that may turn us all into cyborgs. Over the years, Strickland watched as artificial intelligence (AI) technology made inroads into the biomedical space, reporting on crossovers between AI and neuroscience research and IBM Watson's ill-fated efforts in AI health care. These days she oversees Spectrum's coverage of all things AI. Strickland has reported on science and technology for nearly 20 years, writing for such publications as Discover, Nautilus, Sierra, Foreign Policy, and Wired. She holds a master's degree in journalism from Columbia University.