Where Do Climate Change Projections Come From?

Humans are conducting an unprecedented, deliberate yet uncontrolled experiment using our planet as its subject. Human-induced emissions of greenhouse gases and other pollutants, together with changes in land use such as deforestation, are altering the properties of our climate system in ways that are already detectable (Hegerl et al. 2007). The experiment is continuing, with future emissions projected to steadily raise greenhouse gas concentrations in the atmosphere. This is because greenhouse gases like CO2, unlike many other gas species, have long atmospheric lifetimes measured in decades and centuries, so that emissions accumulate over the years and increasingly alter the natural state of the system.

Climate Central, Princeton, NJ, USA email: [email protected]

R. Knutti

Institute for Atmospheric and Climate Science, ETH (Swiss Federal Institute of Technology), Zurich, Switzerland

D. Lobell and M. Burke (eds.), Climate Change and Food Security, Advances in Global Change Research 37, DOI 10.1007/978-90-481-2953-9_3, © Springer Science + Business Media, B.V. 2010

Because we are changing the natural climate state like never before, it would be unreliable to simply extrapolate current trends into the future in order to predict what we will experience as a result. This is particularly true when we focus on regional changes, which are most important for devising adaptation measures. The interactions and reactions of the system are too complicated to be approximated by statistical models. In fact, as we will see, they are often complicated enough to present a challenge even for process-based, dynamical climate models. Rather, climate scientists use numerical models to construct surrogates of the real system, in order to perform a controlled, and replicable, version of the experiment. In this fashion they can test different assumptions about future anthropogenic emissions and other parameters regulating the climate system, and thus offer a range of climate change scenarios that spans a substantial portion of the relevant uncertainties, at least with regard to the known unknowns.

There exists a hierarchy of climate models, from simple energy-balance models that can only approximate the trajectory of global mean temperature, to models of intermediate complexity (Claussen et al. 2002) that resolve only very large regions, to global coupled models, which are the subject of this chapter. These extremely complex computer models, also called atmosphere-ocean general circulation models (GCMs), divide the surface of the Earth, the depths of the oceans and the layers of the atmosphere into grid boxes. GCMs describe the evolution of a host of climate variables at each grid box and at each time step (between a few minutes and an hour) by solving differential equations derived from well-established physical laws, such as conservation of energy and angular momentum.
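The simplest rung of this hierarchy can be written down in a few lines. The sketch below is a toy zero-dimensional energy-balance model; the parameter values are illustrative textbook choices (an effective heat capacity corresponding to roughly 200 m of ocean, a typical feedback parameter, and the canonical 3.7 W/m2 forcing for doubled CO2), not taken from any particular model:

```python
# Toy zero-dimensional energy-balance model: C dT/dt = F(t) - lam * T,
# where T is the global mean temperature anomaly (K) and F an imposed forcing.
C = 8.36e8               # effective heat capacity (J m^-2 K^-1), ~200 m of ocean
lam = 1.2                # climate feedback parameter (W m^-2 K^-1)
dt = 365.25 * 24 * 3600  # one-year time step, in seconds

T = 0.0  # temperature anomaly, starting from equilibrium
for year in range(100):
    # Forcing ramps up linearly to the 2xCO2 value of 3.7 W/m^2 over 70 years
    F = 3.7 * min(year / 70.0, 1.0)
    T += dt * (F - lam * T) / C  # forward-Euler step of the energy balance

# T approaches the equilibrium response F/lam = 3.7/1.2 ~ 3.1 K from below,
# lagging because of the system's thermal inertia (time scale C/lam ~ 22 years)
print(round(T, 2))
```

Even this caricature reproduces a key behavior of the full models: the ocean's heat capacity delays the warming, so the simulated temperature at any given time is well below the equilibrium value implied by the forcing at that time.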

In the typical climate change experiment the simulation starts from conditions representative of the climate of pre-industrial times (around 1850), and is performed by letting the system evolve according to the laws of physics, undisturbed (i.e. not prescribing any observed changes), except for so-called external forcings to the system. Some of these external forcings occur naturally, like changes in solar irradiance (the 11-year solar cycle, for example) or volcanic eruptions, which may be energetic enough to spew large quantities of aerosols into the stratosphere. The volcanic dust acts as a reflective cloud, partially shielding the surface of the Earth from incoming radiation and thus producing a short-lived cooling effect on the order of a few years.

Particularly important in climate change experiments are increasing atmospheric concentrations of greenhouse gases, which are another form of external forcing, but anthropogenic rather than natural. They are imposed according to standard scenarios agreed upon by the scientific community, reflecting hypotheses about the future evolution of socio-economic, technological and political factors. Climate model simulated changes are therefore termed projections rather than predictions, because they are usually conditional on the assumed storyline or scenario. The system responds to these protracted anthropogenic forcings by altering its behavior in a trend-like fashion, rather than by cyclical or episodic changes which are typically the result of natural disturbances. These changes can be assessed by analyzing the output of a GCM experiment which is typically at least two and a half centuries long, producing simulations of climate from pre-industrial conditions out to the end of the twenty-first century, and taking on the order of weeks to be carried out on super-computers at research centers around the world.

As both our scientific understanding of climate processes and our computing power improve, more and more processes at increasingly finer scales can be represented explicitly in these simulations. The size of a GCM grid box is limited by the amount of computer power available. Doubling the resolution of a model grid (for example, going from 250 km by 250 km grid boxes, typical of current models, to 125 km by 125 km) makes the model about ten times slower to run. Even at relatively fine resolutions there always remains the need to approximate those processes that act at scales not explicitly represented. It is these approximations that are the source of large uncertainties, since many of the fine-scale processes are responsible for the physical feedbacks that ultimately determine the direction and size of the changes of the system in response to its perturbations. Furthermore, fine-scale processes are critical in determining the statistics of climate at local scales, which are usually the most relevant in determining impacts.
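Where does the factor of ten come from? The back-of-envelope arithmetic below is an illustrative sketch, not a property of any particular model; the exact factor depends on the model's numerics and vertical resolution:

```python
# Rough cost of doubling horizontal resolution. Halving the grid spacing
# doubles the number of boxes in each horizontal direction (factor 4), and
# numerical stability (the CFL condition) then requires halving the time
# step as well (another factor 2).
refine = 2                     # e.g. 250 km -> 125 km grid boxes
horizontal_cost = refine ** 2  # 4x more grid boxes to update
timestep_cost = refine         # 2x more time steps to cover the same period
total = horizontal_cost * timestep_cost
print(total)  # 8x; added vertical levels and overhead push this toward ~10x
```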

Let's consider a concrete example. The typical resolution of the GCMs that will participate in the next (fifth) assessment report of the Intergovernmental Panel on Climate Change (IPCC, whose latest assessment report on the physical science basis of climate change is Solomon et al. 2007) will be grid boxes about 200 km wide. An important process not explicitly represented at these scales is cloud formation. Nevertheless, the model needs to answer questions such as: how large a portion of the box is covered by clouds, given the temperature, humidity, pressure and wind conditions simulated at the box scale? What kind of clouds are going to form, high or low? How does the presence of aerosols influence the water holding capacity of cloud particles? How many water droplets will form, and what is the threshold for rain? The answers to these questions at each time step of the simulation are governed by parameters in the equations whose values are best guesses informed by experiments and observations, but contain a measure of uncertainty which reverberates in space and time within the simulation. Because the parameterizations describe the large-scale effect of the clouds rather than actually resolving the processes within them, the values used in the parameterizations often need to be chosen to match some observed evidence; they do not represent real physical quantities that can be measured directly with any instrument. The effect of clouds on temperature and, of course, on precipitation behavior, and the ensuing interactions among climate variables, is extremely significant and determines the magnitude of the changes simulated in response to external forcings.
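To make this concrete, here is a minimal relative-humidity-based cloud cover scheme of the kind long used in GCMs (a Sundqvist-type diagnostic): cloud fraction in a grid box grows from 0 to 1 as the grid-mean relative humidity rises above a critical value. The critical value RH_CRIT is a tunable parameter of exactly the sort discussed above, chosen to match observed cloudiness rather than measured by any instrument; the value 0.8 below is illustrative.

```python
import math

RH_CRIT = 0.8  # critical relative humidity: a tuned, not measured, parameter

def cloud_fraction(rh):
    """Fractional cloud cover of a grid box, given grid-mean relative humidity.

    Below RH_CRIT the box is clear; at saturation it is fully overcast;
    in between, cover increases smoothly (sub-grid humidity variations are
    assumed to let some cloud form before the grid mean reaches saturation).
    """
    if rh <= RH_CRIT:
        return 0.0
    return 1.0 - math.sqrt((1.0 - min(rh, 1.0)) / (1.0 - RH_CRIT))

print(cloud_fraction(0.7))  # below the threshold: clear box -> 0.0
print(cloud_fraction(0.9))  # partially cloudy box
print(cloud_fraction(1.0))  # saturated box: overcast -> 1.0
```

One line of this kind stands in for all the unresolved turbulence and microphysics inside a 200-km box, which is exactly why its tunable constants carry so much of the model's uncertainty.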

Different GCMs are developed across the world. About 15 research groups of different nationalities have produced climate models that use different numerical integration schemes, grids of different resolution, different sets of explicitly represented processes (does the model have interactive vegetation? An interactive carbon cycle?) and, most importantly, different approximations to the unresolved processes. What results is an ensemble of models which can be thought of as a set of best guesses, and which can help address the question of structural uncertainty across models.

However, within a single model, many alternative formulations of the parameterizations of sub-grid scale processes and of their parameter values are admissible, and costly experiments that vary those settings, and thus explore within-model sources of uncertainty, are being performed as well, albeit in just a handful of modeling centers because of the resources they require. These are called perturbed-physics experiments, and probably the most famous example is climateprediction.net (Stainforth et al. 2005), whereby tens of thousands of variations of a Hadley Centre GCM (developed by the UK Met Office) are distributed to personal computers all around the world, which run the model experiment in their idle time and send results back to a group of scientists in Oxford, who then analyze them to determine which combinations of changes in parameter values the model is most sensitive to.

Because different models make different choices about which processes to model and how to model them, there is a clear need to explore climate change projections across sets of GCMs, rather than relying on a single model's results. Also important are the limitations inherent in the resolution of global models, which limits the models' ability to represent local climates accurately, especially when those climates are influenced by complex topography not well represented at the GCM resolution. The limitation in resolution also undermines the models' ability to simulate particular sets of variables. Precipitation, especially summertime precipitation caused by small-scale convective processes, is a typical example. Winds at the surface are another. As a result, there exists a cascade of confidence in the output of GCMs among climate scientists and modelers. Smooth fields of temperature at continental scales are considered fairly reliable, details of temperature at regional scales less so. General tendencies in precipitation (changes given as a function of latitude, for example) are generally agreed upon, but local features much less so. In general, large area averages are considered more reliable than spatial details, and mean values are more robustly represented than variability and trends (e.g., Raisanen 2007).

Nevertheless, impact analysis needs regional detail. In order to "translate" large scale projections to local scales, two techniques of so-called "downscaling" are used. Regional dynamical models covering a limited domain can be nested into global models. Alternatively, statistical relations between the large scales and local scales may be derived on the basis of observations and applied to the large-scale projections. A simple and common example of this approach is to use only GCM projections of changes in temperature or precipitation, rather than absolute values, and add these changes to historical weather data from local stations.
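The simple approach described in the last sentence, often called the "delta" or change-factor method, can be sketched in a few lines. The station values and GCM changes below are invented for illustration; in practice temperature changes are usually applied additively and precipitation changes multiplicatively, month by month:

```python
# Observed station climate (e.g. long-term mean June values at a local station)
obs_june_temp_c = 18.4
obs_june_precip_mm = 72.0

# GCM-projected changes for the grid box containing the station (2050 vs 1990)
gcm_temp_change_c = 2.1  # additive change for temperature (degrees C)
gcm_precip_ratio = 0.85  # multiplicative change for precipitation (-15%)

# Downscaled local scenario: perturb the observed record with the GCM changes,
# rather than using the (biased) absolute values simulated by the GCM
scen_june_temp_c = obs_june_temp_c + gcm_temp_change_c
scen_june_precip_mm = obs_june_precip_mm * gcm_precip_ratio

print(round(scen_june_temp_c, 1))    # 20.5
print(round(scen_june_precip_mm, 1)) # 61.2
```

The appeal of this method is that the local baseline climate comes entirely from observations, so GCM biases in absolute values drop out; its limitation is that it assumes the observed spatial and temporal structure of local weather is unchanged in the future.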

In both dynamical and statistical downscaling, spatial detail is added to the coarse grid-scale results, but part of that information is often just interpolation rather than additional knowledge and understanding. Regional models can be run at resolutions down to 10 or 50 km, because they cover a limited area (for example, North America, Europe, or South Africa) and are usually run for limited simulation times (for example, 20 years at the end of the twentieth century, 20 years straddling 2050 and 20 years at the end of the twenty-first century). Similarly, statistical relationships can be fitted very economically between point locations (like weather stations) and large scales and then applied to GCM output. The statistical approach also allows bias corrections or other calibrations of the model output to the statistics of observed regional weather, like variance inflation, to be imposed. Dynamical downscaling output has been shown to reproduce the statistics of extremes more accurately, thanks to the higher resolution at which simulations are conducted. However, the limitations and (most importantly) the uncertainty inherent in the results of the GCM used to drive the regional downscaling are inevitably passed down to the regional results. In order to characterize uncertainty across models, similarly to what is being accomplished by coordinated experiments at the GCM level, efforts to conduct systematic downscaling from ensembles of global models are being made, and some programs are well under way or are planned to be associated with the next IPCC report activities (e.g., the North American Regional Climate Change Assessment Program; the PRUDENCE program in Europe, described in Christensen and Christensen 2007; Vrac et al. 2007).

There are other sources of uncertainty when it comes to future projections, mainly natural variability and emission uncertainty. Natural variability is due to the chaotic nature of weather processes, which produces fast fluctuations in the time series of any given variable of interest. By definition climate is the long-term average behavior of the system, and in this respect fast fluctuations cancel out. However, when running impact models that need daily data, for example, it is important to feed in different realizations of the model simulations, started from different initial conditions, to get a measure of the natural variability at play. For some variables (e.g., seasonal mean temperatures averaged over decades) the uncertainty due to natural variability becomes secondary compared to the uncertainty due to modeling and emission scenarios. For other variables (e.g., precipitation, or extremes), natural variability maintains an important role in the overall uncertainty of future projections. Uncertainty in the magnitude of future greenhouse gas emissions is driven by uncertainty in the socio-economic, technological and political factors that will determine population growth, technological progress, energy demand and so on. Scientists have so far refused to assign probabilities to different scenarios of greenhouse gas emissions, opting instead for designing standard pathways of future emissions (Nakicenovic and Swart 2000), so that model experiments can adopt these prescribed alternative storylines and their results can be analyzed conditionally on a specific emission scenario. We show in Fig. 3.1 time series of CO2 concentration levels from three of the most commonly explored SRES scenarios: A2, A1B and B1.
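The role of initial conditions can be illustrated with a toy ensemble: synthetic realizations of a "seasonal mean temperature" series sharing a common forced trend but differing in their random year-to-year noise (all numbers invented). The spread across realizations is large for any single year but shrinks when averaging over decades, which is why natural variability matters less for long-term means:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def realization(n_years=50, trend_per_year=0.03, noise_sd=0.5):
    """One initial-condition realization: common forced trend + chaotic noise."""
    return [trend_per_year * yr + random.gauss(0.0, noise_sd)
            for yr in range(n_years)]

# An ensemble of 20 runs differing only in their "initial conditions" (noise)
ensemble = [realization() for _ in range(20)]

def spread(values):
    return max(values) - min(values)

# Single-year values differ widely between realizations...
year10 = [run[10] for run in ensemble]
# ...but 50-year means cluster tightly around the forced response.
means = [sum(run) / len(run) for run in ensemble]

print(spread(year10) > spread(means))  # True: averaging cancels the noise
```

The same logic explains the contrast drawn above: decadal means of temperature average away much of the noise, while extremes and precipitation, which depend on individual events, retain a large imprint of natural variability.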

One should be careful not to mix scenario and model uncertainty, as the two are quite different. In a sense, scenario uncertainty is a matter of choice when making decisions, whereas modeling uncertainty reflects our limited understanding, or incomplete description, of the true climate system in a numerical code. In the remainder of this chapter we therefore focus on modeling uncertainties.

As a summary of the discussion so far we list in Box 3.1 the main sources of uncertainties, their causes and their possible solutions.

Fig. 3.1 CO2 concentrations from 1850 to 2100 as prescribed by three commonly used SRES scenarios: A2 (solid line), A1B (dashed line) and B1 (dotted line). Units on the y-axis are parts per million (ppm)
