Researchers have developed a computer algorithm that quickly and accurately assimilates actual observational data into air-quality models to generate more reliable forecasts.
Because of the high stakes involved in meeting air-quality targets, scientists, city officials and regulators all want a more accurate way not only to measure air quality but also to predict where pollution “hot spots” will occur and to plan additional control strategies. The margins are thin: when air-quality monitors and environmental regulators inspect a city’s pollution levels, a difference of one or two parts per million in the concentration of pollutants such as ozone and carbon monoxide can mean the difference between achieving a target and having to implement additional costly provisions to get failing areas back on track, researchers at Argonne National Laboratory said.
The method combines data assimilation with the Ensemble Adjustment Kalman Filter (EAKF) algorithm. According to the researchers, when scientists feed measurement data into their models, the uncertainties in those measurements compound the uncertainties already present in the model. Compensating for these uncertainties requires a mathematically rigorous analysis, so the researchers launch many simulations with slightly different initial conditions. This ensemble-based approach provides a better way to correct for uncertainty.
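To illustrate the idea, here is a minimal sketch of the EAKF update for a single observed quantity, following Jeffrey Anderson’s standard formulation of the filter. The function name, the ensemble size and the carbon-monoxide values are illustrative, not taken from the Argonne system: the point is simply that each ensemble member is shifted and shrunk deterministically toward a value consistent with both the model spread and the measurement uncertainty.

```python
import numpy as np

def eakf_update(ensemble, obs, obs_var):
    """Deterministic EAKF update of a scalar state ensemble.

    ensemble: array of model forecasts for one observed quantity
    obs:      the measured value
    obs_var:  the measurement error variance
    """
    prior_mean = ensemble.mean()
    prior_var = ensemble.var(ddof=1)
    # Combine model and observation uncertainties (Gaussian product).
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    # Shift the ensemble to the new mean and shrink its spread;
    # unlike a perturbed-observation filter, no random noise is added.
    shrink = np.sqrt(post_var / prior_var)
    return post_mean + shrink * (ensemble - prior_mean)

# Toy example: 50 slightly different model runs predicting CO (ppb),
# corrected by a monitor reading of 70 ppb with variance 4.
rng = np.random.default_rng(0)
prior = rng.normal(loc=80.0, scale=10.0, size=50)
posterior = eakf_update(prior, obs=70.0, obs_var=4.0)
```

After the update, the ensemble mean lies between the model forecast and the observation (weighted by their uncertainties), and the ensemble spread is reduced, which is exactly the "correction for uncertainty" described above.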
“By incorporating observation data into our models, we can refine our predictions,” said Rao Kotamarthi, an environmental scientist at Argonne. “Meteorologists have been doing it for a while, but people in the chemical trace gas and aerosol modeling community have just started doing it.” The ensemble methods will give policy-makers another tool to guide their decisions, Kotamarthi added.
To demonstrate the algorithm’s effectiveness, the researchers performed three sets of numerical experiments, testing both the procedure itself and the range of key parameters used to implement it. The model domain covers much of the northeastern United States.
The first two numerical experiments use idealized measurements derived from defined model runs, and the last test uses measurements of carbon monoxide from approximately 220 Air Quality System monitoring sites over the northeastern United States, maintained by the U.S. Environmental Protection Agency. In each case, the proposed method provided better results than the method without data assimilation, researchers said.
Although Kotamarthi’s model looks at carbon-monoxide emissions, he said researchers could use similar algorithms to examine the atmospheric concentrations of carbon dioxide and other greenhouse gases and aerosols. Kotamarthi and Argonne environmental scientist Paul Hovland have initiated a NASA-funded project to develop data assimilation methods for environmental chemical models that can incorporate satellite measurements of several atmospheric gases.

Data assimilation may also boost researchers’ ability to project likely climate scenarios on the “near-term decadal scale”—approximately 10 to 20 years—which would help public officials assess the consequences of their decisions concerning climate change, researchers said.
Complex algorithms have been all the rage this year as scientists focus on myriad environmental problems. For example, IBM earlier this year said its researchers had created specialized algorithms to help model and manage natural disasters such as wildfires, floods and diseases.
The idea is to use high-level mathematical techniques, which IBM calls stochastic programming, to speed up and simplify complex tasks such as determining the fastest route to deliver packages, detecting fraud in health insurance claims, automating complex risk decisions for international financial institutions, scheduling supply-chain and production operations at a manufacturing plant to maximize efficiency, or detecting patterns in medical data for new insights and breakthroughs.