
Method helps reduce uncertainty of global climate prediction

Caption: Research scientist Chris E. Forest (seated), Professor Peter H. Stone and research scientist Andrei P. Sokolov (right) gather around a monitor displaying their climate model, which reduces the uncertainty of global climate change predictions. On the screen is a temperature record for the global mean surface temperature from 1900 to 1998.
Credits: Photo / Donna Coveney

Three MIT researchers reported at the fall meeting of the American Geophysical Union (AGU) that they have come up with a way to quantify the uncertainties of global climate change prediction and to reduce some of them.

The MIT 2D Climate Model can tell where each of the major global climate change models falls in terms of probability with respect to observations, said research scientists Chris Forest and Andrei P. Sokolov and Professor Peter H. Stone, all of the Department of Earth, Atmospheric and Planetary Sciences.

The MIT model yields an objective measure of how well each model measures up to what actually happened, based on the most recent 100 years of recorded data. In fact, these results indicate that the uncertainty in the net cooling effect due to sulfate aerosols and other pollutants is significantly less than what will be presented in the Third Assessment Report of the Intergovernmental Panel on Climate Change.

The researchers presented their work in San Francisco during a December 17 AGU session on climate change detection and attribution. A paper describing the method behind the results presented at the meeting is slated to be published in the journal Climate Dynamics.

"Once you have the probabilities, you can talk about uncertainties of future global warming, dependent on emissions of greenhouse gases. The probability of different scenarios can be determined by what we have here," Dr. Forest said.

Climate change models calculate the potential future impact of changes in concentrations of greenhouse gases such as carbon dioxide, methane, chlorofluorocarbons and others. Some of these are produced by natural sources and some are tied to human activities. No one knows for sure what the future holds in terms of concentrations of these gases in the atmosphere. This variable, coupled with the Earth's chaotic weather system, adds up to a lot of uncertainty.

Running the 10 or so existing major climate models on supercomputers is slow and expensive, so researchers "tend to talk about just a handful of predictions" of future climate conditions for the planet, Dr. Forest said.

"The problem is that certain features of these models are uncertain. Our 2D model allows major uncertainties to be varied, so you can see how changing one parameter affects the probability of the outcome and test this value against the others," he said.

In addition, the 2D model, though sophisticated, is inexpensive to run.

FORCING A CHANGE

Radiative forcing is a change imposed upon the climate system by greenhouse gases, for instance, that modifies the system's radiative balance. Many climate models seek to quantify the ultimate change in Earth's temperature, rainfall and sea level from a specified change in radiative forcing.
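For readers who want a number to attach to the idea, a widely used simplified expression (a standard textbook approximation, not something specific to the MIT model) gives the radiative forcing from a change in carbon dioxide concentration as

\[
\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
\]

where C is the current carbon dioxide concentration and C0 a preindustrial reference value. Doubling carbon dioxide thus yields a forcing of roughly 3.7 watts per square meter.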

In addition to greenhouse gases, other variables affecting climate change are sulfur dioxide emissions, which are converted into sulfate aerosol particles that reflect sunlight back to space and change the amount of heat absorbed by the Earth; the rate of heat uptake by the oceans, which is not well known and has not figured prominently in assessments of existing models; and climate sensitivity, which is the likely temperature change on Earth if the carbon dioxide concentration in the atmosphere were to double.
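One way to see how these three quantities interact is a zero-dimensional energy-balance sketch (a deliberately simplified stand-in for the MIT 2D model):

\[
c_{\mathrm{eff}}\,\frac{dT}{dt} \;=\; \Delta F_{\mathrm{GHG}}(t) \;+\; \Delta F_{\mathrm{aer}}(t) \;-\; \frac{F_{2\times}}{S}\,T,
\]

where S is the climate sensitivity (equilibrium warming per carbon dioxide doubling), F_{2×} ≈ 3.7 W/m² is the doubled-CO2 forcing, ΔF_aer is the (negative) aerosol forcing, and c_eff is an effective heat capacity set by how rapidly the oceans take up heat. A higher sensitivity or weaker aerosol cooling produces more simulated warming; faster ocean heat uptake delays it.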

Each global climate change model and forcing scenario typically is tested against the observed climate change data to produce a "confidence interval," or likelihood that the model and scenario are correct. Some predictions were long assumed to have a certain confidence interval. But, Dr. Forest pointed out, "those calculations don't look at the results in terms of physical properties of the system. For instance, if you change the ocean parameter or the aerosol parameter, is the likelihood better or worse? They can't separate the effect of each parameter the way we've done it here."
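A generic way to turn such a model-versus-observation comparison into a likelihood (a standard Gaussian goodness-of-fit, offered as an illustration rather than the precise statistic used in the Climate Dynamics paper) is

\[
\mathcal{L} \;\propto\; \exp\!\left(-\tfrac{1}{2}\, r^{\mathsf{T}}\,\Sigma^{-1}\, r\right),
\qquad r \;=\; T_{\mathrm{obs}} - T_{\mathrm{model}},
\]

where r is the vector of differences between the observed and simulated temperature-change patterns and Σ is the covariance of natural climate variability. Parameter settings whose residuals are large compared with natural variability receive low likelihood, which is what allows the effect of each parameter to be separated.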

Varying climate sensitivity, ocean heat uptake and the strength of the aerosol forcing, the MIT researchers ran the 2D model for 1860 to 1995 and compared the simulated temperature change with the recent observed record to see which parameter settings were consistent with it. They found that certain combinations were far less likely than others.
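The flavor of that experiment can be sketched in a few lines of Python. This toy uses a zero-dimensional energy balance on a coarse parameter grid, with made-up forcing histories and a synthetic "observed" record; the actual study used the far more detailed MIT 2D model and real observations.

```python
import numpy as np

# Toy zero-dimensional energy-balance model -- not the MIT 2D model.
# c_eff * dT/dt = F_ghg(t) + aer_scale * F_aer(t) - (F2X / sens) * T
F2X = 3.7                      # W/m^2, forcing for doubled CO2
years = np.arange(1860, 1996)  # simulation period used in the study
n = len(years)

# Idealized (made-up) forcing histories, for illustration only.
f_ghg = np.linspace(0.0, 2.5, n)   # smoothly rising greenhouse forcing
f_aer = -np.linspace(0.0, 1.0, n)  # unit-strength aerosol cooling

def run_model(sens, c_eff, aer_scale, dt=1.0):
    """Integrate the energy balance forward; returns temperature anomaly (K)."""
    temp = np.zeros(n)
    for i in range(1, n):
        forcing = f_ghg[i - 1] + aer_scale * f_aer[i - 1]
        feedback = (F2X / sens) * temp[i - 1]
        temp[i] = temp[i - 1] + dt * (forcing - feedback) / c_eff
    return temp

# Stand-in "observed" record: a mid-range run plus noise (synthetic data).
rng = np.random.default_rng(0)
t_obs = run_model(sens=2.5, c_eff=30.0, aer_scale=1.0) + rng.normal(0, 0.05, n)

# Coarse grid over the three uncertain parameters.
best = None
for sens in [1.0, 1.5, 2.5, 4.5, 6.0]:          # climate sensitivity (K)
    for c_eff in [10.0, 30.0, 90.0]:            # effective ocean heat capacity
        for aer_scale in [0.0, 0.5, 1.0, 1.5]:  # aerosol forcing strength
            resid = t_obs - run_model(sens, c_eff, aer_scale)
            misfit = float(resid @ resid)       # sum-of-squares goodness of fit
            if best is None or misfit < best[0]:
                best = (misfit, sens, c_eff, aer_scale)

print("best fit: misfit=%.3f S=%.1f c_eff=%.0f aerosol=%.1f" % best)
```

In the actual study the misfit is judged against the covariance of natural climate variability, as sketched above, so each grid point receives a probability rather than just a ranking by fit.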

For example, the likelihood is very low for the combination of a high climate sensitivity, weak aerosol forcing and a slow rate of ocean heat uptake. Nor is it likely that the climate sensitivity is less than around 1°C for any combination of aerosol forcing and ocean heat uptake. For either of those combinations to be correct, the Earth would have had to warm either more than it has to date, or hardly at all.

"These results demonstrate that if the model diagnostics, i.e., patterns of temperature change, are carefully chosen, they can reduce the uncertainty of physically important model parameters that affect climate change projections," the authors wrote.

This work is supported by the MIT Joint Program on the Science and Policy of Global Change and the National Oceanic and Atmospheric Administration.

A version of this article appeared in MIT Tech Talk on January 10, 2001.
