Storm forecasting could become dramatically more accurate thanks to research examining the individual cells that make up severe thunderstorms and tornadoes.
The prediction of thunderstorms has never been an exact science. But a research team from the University of Oklahoma and the federal government is poised to dramatically improve weather forecasting with supercomputer analyses of the individual cells that make up severe thunderstorms and tornadoes.
The numerical weather prediction models widely used today suffer from coarse resolution, with grid spacing of 10 kilometers or more, says Ming Xue, director of Oklahoma’s Center for Analysis and Prediction of Storms (CAPS). Tracking the progress of individual storm cells, which can be as small as a few kilometers across, requires greater computational power than is generally available to forecasters. Major storm systems are composed of many such cells.
“Without such resolution, you can’t really tell whether you will get thunderstorms or not,” Xue says. A typical forecast “does not explicitly predict individual cells. From the routine forecasts all we get is three-hour accumulated precipitation. You can’t really tell within the three hours when the precipitation actually falls.”
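To see why resolution drives computing demand, consider the grid arithmetic. The sketch below assumes an illustrative domain size and vertical level count (not the actual CAPS/NOAA configuration); going from 10-kilometer to 2-kilometer spacing multiplies the number of grid points, and the finer grid demands smaller time steps on top of that:

```python
# Back-of-the-envelope comparison of grid sizes at 10 km vs. 2 km spacing.
# The domain dimensions and vertical level count are illustrative
# assumptions, not the actual CAPS/NOAA configuration.

DOMAIN_X_KM = 4000  # assumed east-west extent of the forecast domain
DOMAIN_Y_KM = 2500  # assumed north-south extent
LEVELS = 50         # assumed number of vertical levels

def grid_points(spacing_km):
    """Total grid points at a given horizontal spacing."""
    nx = DOMAIN_X_KM / spacing_km
    ny = DOMAIN_Y_KM / spacing_km
    return int(nx * ny * LEVELS)

coarse = grid_points(10)  # ~10 km grid, typical of routine forecasts
fine = grid_points(2)     # ~2 km grid, fine enough to resolve storm cells

print(f"10 km grid: {coarse:>11,} points")
print(f" 2 km grid: {fine:>11,} points ({fine // coarse}x more)")
# Finer grids also force smaller time steps (the CFL stability condition),
# so total work grows even faster than the point count alone suggests.
```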
CAPS has teamed with the National Oceanic and Atmospheric Administration (NOAA) to run analyses at 2-kilometer resolution across two-thirds of the United States on a Cray supercomputer at the Pittsburgh Supercomputing Center.
The team ran 10 models simultaneously in what is known as ensemble forecasting, which averages numerous model runs to lessen the impact of uncertainties and errors in any single forecast. The forecasts, which ran from mid-April through early June, were the first in which Xue’s team applied ensemble forecasting to individual storm cells. Each analysis took eight hours to complete and predicted weather up to 33 hours into the future.
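The core idea of ensemble forecasting can be sketched in a few lines. In this illustrative example, the grid shape and the synthetic member fields are assumptions; only the 10-member count comes from the project:

```python
import numpy as np

# Minimal sketch of ensemble averaging, assuming each member's output is a
# 2-D forecast field (e.g., accumulated precipitation) on a shared grid.
# The grid shape is a placeholder; only the 10-member count is from the
# article.

N_MEMBERS = 10           # the 10 models run simultaneously
GRID_SHAPE = (500, 800)  # assumed (rows, cols) of the forecast grid

rng = np.random.default_rng(seed=42)

# Stand-ins for 10 model runs started from slightly perturbed initial
# conditions; in practice each member is a full numerical forecast.
members = [rng.gamma(2.0, 1.0, GRID_SHAPE) for _ in range(N_MEMBERS)]

# The ensemble mean damps errors peculiar to any single member...
ensemble_mean = np.mean(members, axis=0)

# ...while the member-to-member spread measures forecast uncertainty:
# where members disagree strongly, the prediction is less trustworthy.
ensemble_spread = np.std(members, axis=0)

print("mean field average :", round(ensemble_mean.mean(), 3))
print("spread average     :", round(ensemble_spread.mean(), 3))
```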
Eventually, the strategies CAPS and NOAA have developed could improve predictions of all types of weather. The biggest roadblock to widespread adoption is the availability of computing power.
“In order to do what we did, we needed 700 processors to run overnight,” Xue says. “The actual routine availability of this operationally is more than five years away.”
Xue says the forecasts tracked individual cells to his satisfaction about two-thirds of the time. Forecasts will be more accurate next year, when his researchers begin using radar data rather than only information from weather balloons, satellites and aircraft.
“We are the group who is best at using radar data, but it’s a very significant effort. This year is the first year, so it took a lot of setup,” Xue says.
The University of Oklahoma and NOAA have enough funding to continue the experiments over the next two spring storm and tornado seasons, he says.
Researchers presented their results at the American Meteorological Society’s Conference on Weather Analysis and Forecasting in late June. Cray has begun publicizing the project to trumpet the power of the Cray XT supercomputer.
Each day during the experiment, trillions of bytes of data were generated, archived and transferred from the Pittsburgh Supercomputing Center to the National Weather Center in Norman, Okla.