Validation of Climate Models

THE CLIMATIC SYSTEM is constituted by four intimately interconnected subsystems—atmosphere, hydrosphere, cryosphere, and biosphere—which evolve under the action of macroscopic driving and modulating agents, such as solar heating, Earth's rotation, and gravitation. The climate system features many degrees of freedom, which make it complicated, and nonlinear interactions taking place on a vast range of time and space scales, accompanied by sensitive dependence on initial conditions, which make it complex. The climate is defined as the set of statistical properties of the observable physical quantities of the climatic system.
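
The distinction between unpredictable individual trajectories and well-defined statistics can be illustrated with a minimal Python sketch, using the Lorenz-63 system as a stand-in for a chaotic climate model; the system, parameter values, and integration settings are illustrative assumptions, not part of the original text:

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 system, a classic chaotic toy model."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt=0.01):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trajectory(state, n_steps):
    out = np.empty((n_steps, 3))
    for i in range(n_steps):
        state = rk4_step(state)
        out[i] = state
    return out

# Two trajectories whose initial conditions differ by one part in 10^8.
a = trajectory(np.array([1.0, 1.0, 1.0]), 100_000)
b = trajectory(np.array([1.0, 1.0, 1.0 + 1e-8]), 100_000)

# The instantaneous states ("weather") end up completely different...
print("final-state difference:", np.abs(a[-1] - b[-1]))
# ...while the long-term statistics ("climate") of z nearly coincide.
print("mean of z:", a[:, 2].mean(), "vs", b[:, 2].mean())
print("std of z: ", a[:, 2].std(), "vs", b[:, 2].std())
```

Two trajectories started a hair apart diverge completely, yet their long-term means and variances nearly coincide; this is precisely the sense in which climate, as a statistical object, can be well defined even when weather is not.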

The evaluation of the accuracy of numerical climate models and the definition of strategies for their improvement are crucial issues in the Earth system scientific community. On one hand, climate models of various degrees of complexity constitute tools of fundamental importance for reconstructing the past state of the planet and projecting its future state, and for testing theories related to basic geophysical fluid dynamical properties of the atmosphere and of the ocean, as well as to the physical and chemical feedbacks within the various subdomains and between them. On the other hand, the outputs of climate models, and especially future climate projections, are gaining an ever-increasing relevance in several fields, such as ecology, economics, engineering, energy, and architecture, as well as for the process of policymaking at the national and international level. Regarding the influence of climate-related findings at the societal level, the effects of the fourth assessment report of the Intergovernmental Panel on Climate Change (IPCC4AR) are unprecedented.

The validation or auditing—overall evaluation of accuracy—of a set of climate models is a delicate operation that can be decomposed into two related, albeit distinct, procedures. The first procedure is inter-comparison, which aims at assessing the consistency of the models in the simulation of certain physical phenomena over a certain time frame. The second procedure is verification, the goal of which is to compare the models' outputs with corresponding observed or reconstructed quantities. Difficulties emerge because we always have to deal with three different kinds of attractor: the attractor of the real climate system, its reconstruction from observations, and the attractors of the climate models. Depending on the timescale of interest and on the problem under investigation, the relevant active degrees of freedom (mathematically corresponding to the separation between the slow and fast manifolds) needing the most careful representation change dramatically. For relatively short timescales (below 10 years), the atmospheric degrees of freedom are active, whereas the other subsystems can be considered essentially frozen. For longer timescales (100-1,000 years), the ocean dominates the dynamics of climate, whereas for even longer timescales (over 5,000 years), changes in the continental ice sheets are the most relevant factors of variability. Therefore, the scientific community has produced different families of climate models, spanning a hierarchical ladder of complexity, each formulated and structured to tackle a specific class of problems.
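
As a concrete, if highly simplified, picture of the verification step, the following Python sketch scores a model climatology against an observational one with two standard summary measures; the fields here are randomly generated placeholders, and real verification would use actual gridded data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder stand-ins for two climatologies on the same 36 x 72 lat-lon
# grid: one "observed" (e.g., from a reanalysis) and one from a model run.
obs = 288.0 + rng.normal(0.0, 5.0, size=(36, 72))
model = obs + rng.normal(0.5, 1.5, size=obs.shape)

def bias(model, obs):
    """Mean signed error: is the model systematically too warm or too cold?"""
    return float(np.mean(model - obs))

def rmse(model, obs):
    """Root-mean-square error: overall distance between the two fields."""
    return float(np.sqrt(np.mean((model - obs) ** 2)))

print(f"bias = {bias(model, obs):+.2f} K, rmse = {rmse(model, obs):.2f} K")
```

Real verification would in addition weight grid cells by area and, crucially, compare statistical properties of the attractors rather than single snapshots.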

COUPLED GLOBAL CLIMATE AND REGIONAL CLIMATE MODELS

Here, whereas most considerations are quite general, we mainly refer to the coupled global climate models (GCMs) and regional climate models (RCMs) currently used for the simulation of the present climate and for the analysis of climate variability up to centennial scales. In these models, whereas the dynamical processes of the atmosphere and of the hydrosphere are represented within a wide framework of numerical discretization techniques applied to simplified versions of the thermodynamic and Navier-Stokes equations in a rotating environment, the continental ice sheets are typically taken as fixed parameters of the system. In contrast, the so-called subscale processes, which cannot be explicitly represented within the resolution of the model, are taken care of through simplified parameterizations.
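
The division of labor between resolved dynamics and parameterized subscale processes can be caricatured in a few lines of Python; the one-dimensional tracer equation, grid, wind, and eddy diffusivity below are illustrative assumptions and bear no relation to any actual GCM component:

```python
import numpy as np

# Toy setup: a 1-D tracer advected by a resolved wind on a periodic domain,
# with unresolved subgrid mixing "parameterized" as an eddy diffusion term.
nx, dx, dt = 100, 1.0e5, 600.0        # grid cells, spacing [m], time step [s]
u = 10.0                              # resolved wind [m/s]
kappa_eddy = 1.0e4                    # eddy diffusivity for subgrid mixing

q = np.exp(-0.5 * ((np.arange(nx) - 50) / 5.0) ** 2)   # initial tracer blob

for _ in range(500):
    # Resolved dynamics: first-order upwind discretization of -u dq/dx.
    adv = -u * (q - np.roll(q, 1)) / dx
    # Parameterization: diffusion mimicking the net effect of unresolved eddies.
    diff = kappa_eddy * (np.roll(q, -1) - 2 * q + np.roll(q, 1)) / dx**2
    q = q + dt * (adv + diff)

print("tracer maximum after advection plus parameterized mixing:", q.max())
```

The split mirrors, in caricature, the architecture described above: explicit numerics for what the grid can resolve, and a closed-form surrogate for what it cannot.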

Several crucial processes, such as radiative transfer, atmospheric convection, microphysics of clouds, land-atmosphere fluxes, ice dynamics, and eddies and mixing in the ocean, as well as most of those controlling the evolution of the biosphere, undergo severe simplifications. With time, the formulation of the GCMs has developed through refinements of the spatial resolution, improvements of the numerical schemes, and better parameterizations, as well as through the inclusion of a larger and larger set of processes, such as aerosol chemistry and interactive vegetation, which are relevant for the representation of the system's feedbacks and forcings.

In addition, limited-area climate modeling faces the mathematical complication of being a time-varying boundary conditions problem, as RCMs are nested into driving GCMs. As a result, RCMs tend to be enslaved to the driving model on timescales that depend on the size (and position) of their domain, and, in principle, the balances evaluated over the limited domain are constrained at all times. Therefore, climate reconstructions and projections performed with an RCM can critically depend on the driving GCM. Other, more technical issues arise from the delicate process of matching the boundary conditions at the models' interface, where rather different spatial and time grids have to be joined.

Model results and approximate theories can be tested only against past observational data of nonuniform quality and quantity, essentially because of the space and the timescales involved. The available historical observations sometimes feature a relatively low degree of reciprocal synchronic coherence and individually present problems of diachronic coherence, as a result of changes in the strategies of data gathering with time, whereas proxy data, by definition, provide only semiquantitative information on the past state of the climate system. Extensive scientific effort is aimed at improving the quality and quantity of the climatic databases. In particular, the best guess of the atmospheric state of roughly the last 50 years has been reconstructed by two independent research initiatives, through the variational adaptation of model trajectories to all available meteorological observations, including the satellite-retrieved data, producing the so-called reanalyses.
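
The phrase "variational adaptation of model trajectories to observations" can be made concrete with a minimal sketch of the static (3D-Var-like) building block of such schemes; the state size, observation operator, covariances, and numbers below are invented for illustration, and operational reanalyses solve vastly larger, time-dependent versions of this problem:

```python
import numpy as np

# Blend a model background xb with observations y by minimizing
#   J(x) = (x - xb)^T B^-1 (x - xb) + (Hx - y)^T R^-1 (Hx - y).
n, m = 5, 3
xb = np.array([288.0, 289.5, 291.0, 290.0, 287.5])       # background state
H = np.zeros((m, n))
H[0, 0] = H[1, 2] = H[2, 4] = 1.0                        # observe 3 of 5 points
y = np.array([287.0, 292.0, 288.5])                      # observations
B = 1.0 * np.eye(n)                                      # background error cov.
R = 0.25 * np.eye(m)                                     # observation error cov.

# For quadratic J the minimizer (the "analysis") has a closed form:
#   xa = xb + K (y - H xb),  with gain K = B H^T (H B H^T + R)^-1.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)
print("analysis state:", np.round(xa, 2))
# With a diagonal B the correction stays at the observed points; operational
# systems use correlated B so that observations also update unobserved ones.
```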

Given all the above-mentioned difficulties, as well as the impossibility, because of the entropic time arrow, of repeating world experiments, the validation of GCMs is not epistemologically trivial, as the Galilean paradigmatic approach cannot be followed. Validation has to be framed in probabilistic terms, and the choice of the observables of interest is crucial for determining robust metrics able to audit the models effectively. Recently, the detailed investigation of the behavior of GCMs has been greatly fostered and facilitated, as some research initiatives have been providing open access to standardized outputs of simulations performed within a well-defined set of scenarios. Relevant examples are the PRUDENCE project (RCMs) and the PCMDI/CMIP3 initiative (GCMs included in the IPCC4AR).

One aim—from the end-user's viewpoint—is checking how realistic the modeled fields of practical interest are, such as surface temperature, pressure, and precipitation. In these terms, current GCMs feature a good degree of consistency and realism when considering present climate simulation, and they basically agree on short-term climate projections down to seasonal averages on continental scales. When decreasing the spatial or the temporal scale of interest, the signal-to-noise ratio of climatic signals—both observational and model generated—typically decreases, so that the validation of GCMs in control runs and climate change simulations becomes more difficult, even if improvements are observed over time in state-of-the-art models. Statistical and dynamical—the latter provided by nested RCMs—downscaling of climatological variables enlarges the scope of model validation. In particular, RCMs provide a better outlook on small-scale and nonlinear processes, such as surface-atmosphere coupling, precipitation, and effects of climate change on the biosphere.
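
The scale dependence of the signal-to-noise ratio is easy to demonstrate with synthetic data; in the following Python sketch, a weak common trend buried in grid-point noise is much harder to detect locally than in the large-scale average (all values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented data: 100 years of annual anomalies at 500 grid points, sharing
# one weak common trend (the "signal") on top of independent local noise.
years = np.arange(100)
trend = 0.01 * years                            # 1 K-per-century signal
noise = rng.normal(0.0, 0.8, size=(500, 100))   # grid-point variability
field = trend + noise

def snr(series):
    """Fitted trend amplitude over the record, divided by the residual std."""
    coeffs = np.polyfit(years, series, 1)
    resid = series - np.polyval(coeffs, years)
    return abs(coeffs[0]) * len(years) / resid.std()

print("SNR at one grid point     :", round(snr(field[0]), 2))
print("SNR of the spatial average:", round(snr(field.mean(axis=0)), 2))
```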

However, the above-mentioned quantities can hardly be considered climate state variables, whereas strategies for model improvement can benefit from understanding the differences in the representation of the climatic machine among GCMs. The comparison of the statistical properties of bulk quantities defining the climatic state, such as top-of-the-atmosphere energy fluxes, tropospheric average temperature, tropopause height, geopotential height at various pressure levels, tropospheric average specific humidity, and ocean water structure, allows the definition of global metrics that constitute robust diagnostic tools. Moreover, to capture the differences in the representation of specific physical processes, it is necessary to use specialized diagnostic tools—process-oriented metrics—as indexes for model reliability.
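
Even the simplest of these bulk metrics, a global average, requires some care on a latitude-longitude grid, because grid cells shrink toward the poles; the following Python sketch applies the standard cosine-latitude weighting to a toy temperature field (grid and field values are assumptions):

```python
import numpy as np

# Sketch of a bulk "global metric": an area-weighted global mean of a field
# on a regular lat-lon grid.
lats = np.linspace(-87.5, 87.5, 36)                 # cell-center latitudes
field = 288.0 - 30.0 * np.sin(np.deg2rad(lats))**2  # toy zonal-mean temperature
field2d = np.tile(field[:, None], (1, 72))          # replicate along longitude

# Grid cells shrink toward the poles, so weight each row by cos(latitude).
w = np.cos(np.deg2rad(lats))
global_mean = np.average(field2d.mean(axis=1), weights=w)
print(f"area-weighted global mean: {global_mean:.2f} K")
```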

Examples of these metrics are major features of atmospheric variability, such as tropical and extratropical cyclones; detailed balances, such as water vapor convergence over continents or river basins; teleconnection patterns, such as the El Niño-Southern Oscillation or the Madden-Julian Oscillation; or oceanic features, such as the overturning circulation and the Antarctic Circumpolar Current intensity. The latter approach may be especially helpful in clarifying the distinction between the performance of the models in reproducing diagnostic and prognostic variables. Even if improvement is ongoing and promising, in these more fundamental metrics describing the climatic machine, current GCMs do not generally feature a comparable degree of consistency and realism at a quantitative level, and further investigations of basic physical and dynamical processes are needed.

[Figure: A NASA diagram of a strong El Niño in the Pacific Ocean. The easterly trade winds weaken, allowing warm surface-water anomalies to shift toward the South American coast.]
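
As an example of such a process-oriented metric, the sketch below computes a Niño-3.4-style index, the average sea-surface-temperature anomaly over a box in the equatorial Pacific; the SST array is a random placeholder, and a real index would use observed or simulated fields and a fixed base-period climatology:

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder SST data: 120 monthly fields on a 1-degree global grid.
lats = np.linspace(-89.5, 89.5, 180)
lons = np.linspace(0.5, 359.5, 360)
sst = 300.0 + rng.normal(0.0, 0.5, size=(120, 180, 360))

# Nino-3.4 region: 5S-5N, 170W-120W (i.e., 190E-240E in 0-360 longitudes).
lat_sel = (lats >= -5.0) & (lats <= 5.0)
lon_sel = (lons >= 190.0) & (lons <= 240.0)
box = sst[:, lat_sel][:, :, lon_sel]            # (time, lat, lon) subsetting
index = box.mean(axis=(1, 2))                   # box-average SST per month
anomaly = index - index.mean()                  # anomaly about the record mean
print("index anomaly, first 5 months:", np.round(anomaly[:5], 3))
```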

Because the goal of a climate model is to reproduce the most relevant statistical properties of the climate system, the structural deficiencies, together with an unavoidably limited knowledge of the external forcings (uncertainties of the second kind), intrinsically limit the possibility of performing realistic simulations, especially affecting the possibility of representing abrupt climate change processes. The uncertainties in the initial conditions (uncertainties of the first kind), which constitute, because of the chaotic nature of the system, probably the most critical issue in weather forecasting, are in principle not so troublesome—assuming that the system is ergodic—when considering the long-term behavior, where "long" is evaluated with respect to the longest timescale of the system. Nevertheless, to avoid transient behaviors, which may induce spurious trends in the large-scale climate variables on the multidecadal and centennial scales, it is crucial to initialize efficiently the slowest dynamical component of the GCMs, namely, the ocean. The validation of GCMs requires considering such uncertainties and devising strategies for limiting their influence when control run and, especially, climate change experiments are performed.

As for taking care of possible issues related to initial conditions, an ensemble of simulations, where the same climate model is run under identical conditions from slightly different initial states, often allows a more detailed exploration of the phase space of the system, with a better sampling—in a finite time—of the attractor of the model. Some climate models have recently shown a rather encouraging ability to act as weather forecasting models, thus featuring encouraging local (in phase space) properties. Although such an ability gives evidence that short-timescale physical processes are well represented, it says little about the overall performance when statistical properties are considered.
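
A minimal Python sketch of such an initial-condition ensemble follows; the chaotic logistic map stands in for a climate model (an assumption made purely for brevity), and the point is the behavior of the ensemble spread, which grows from the tiny initial perturbations until it saturates at the attractor's intrinsic variability:

```python
import numpy as np

rng = np.random.default_rng(3)

# The chaotic logistic map stands in for a climate model, purely for brevity.
def model_step(x, r=3.9):
    return r * x * (1.0 - x)

n_members, n_steps = 20, 50
# All members start from the same state up to tiny random perturbations.
states = 0.5 + 1e-6 * rng.normal(size=n_members)

spread = []
for _ in range(n_steps):
    states = model_step(states)
    spread.append(states.std())

# The spread grows from ~1e-6 until it saturates at the attractor's intrinsic
# variability: from then on, the ensemble samples the model's "climate".
print(f"spread at steps 1, 10, 50: {spread[0]:.1e}, {spread[9]:.1e}, {spread[-1]:.1e}")
```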

The structural deficiencies of a single GCM and the stability of its statistical properties can be addressed, at least empirically, by applying Monte Carlo techniques to generate an ensemble of simulations, each obtained with different values of some key uncertain parameters controlling the global climatic properties, such as the climate sensitivity. In this case, therefore, sampling is performed by considering attractors that are parametrically deformed.
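
The following Python sketch caricatures such a perturbed-parameter ensemble with a zero-dimensional energy-balance relation, sampling an uncertain feedback parameter and summarizing the induced distribution of equilibrium warming; the relation, the sampled distribution, and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Zero-dimensional caricature: equilibrium warming = F / lambda, with the
# forcing F fixed and the feedback parameter lambda uncertain.
n_members = 1000
forcing = 3.7                                    # CO2-doubling forcing [W/m^2]
lam = rng.normal(1.2, 0.3, size=n_members)       # feedback parameter [W/(m^2 K)]
lam = lam[lam > 0.3]                             # drop unphysical draws

warming = forcing / lam                          # equilibrium response [K]
print(f"median warming {np.median(warming):.1f} K, "
      f"5-95% range {np.percentile(warming, 5):.1f}-"
      f"{np.percentile(warming, 95):.1f} K")
```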

To describe synthetically and comprehensively the outputs of a growing number of GCMs, it has recently become common to consider multimodel ensembles and to focus attention on the ensemble mean and the ensemble spread of the models, taken respectively as the (possibly weighted) first two moments of the models' outputs for the considered metric. Information from rather different attractors is then merged. Although this procedure surely has advantages, especially for GCM intercomparison, such statistical estimators should not be interpreted in the standard way—with the mean approximating the truth and the standard deviation describing the uncertainty—because such a straightforward perspective relies on the (false) assumptions that the set is a probabilistic ensemble, formed by equivalent realizations of a given process, and that the underlying probability distribution is unimodal.
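
Computing these two moments is straightforward, as the Python sketch below shows for a hypothetical metric and hypothetical skill weights; it is the interpretation, as stressed above, that requires caution:

```python
import numpy as np

# Hypothetical values of one metric across five models, with hypothetical
# skill weights; both are assumptions made purely for illustration.
metric = np.array([2.1, 3.4, 2.8, 4.1, 2.5])
weights = np.array([1.0, 0.5, 1.0, 0.5, 1.0])

mean = np.average(metric, weights=weights)                           # 1st moment
spread = np.sqrt(np.average((metric - mean) ** 2, weights=weights))  # 2nd moment
print(f"ensemble mean = {mean:.2f}, ensemble spread = {spread:.2f}")
```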

SEE ALSO: Abrupt Climate Changes; Atmospheric Component of Models; Atmospheric General Circulation Models; Biogeochemical Feedbacks; Chaos Theory; Climate; Climate Models; Climate Sensitivity and Feedbacks; Climatic Data, Historical Records; Climatic Data, Proxy Records; Climatic Data, Reanalysis; Intergovernmental Panel on Climate Change (IPCC); Modeling of Paleoclimates; Ocean Component of Models.

BIBLIOGRAPHY. Isaac M. Held, "The Gap Between Simulation and Understanding in Climate Modeling," Bulletin of the American Meteorological Society (v.86/11, 2005); Intergovernmental Panel on Climate Change, Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, 2007); Valerio Lucarini, "Towards a Definition of Climate Science," International Journal of Environment and Pollution (v.18/5, 2002); Valerio Lucarini, Sandro Calmanti, Alessandro Dell'Aquila, Paolo M. Ruti, and Antonio Speranza, "Intercomparison of the Northern Hemisphere Winter Mid-Latitude Atmospheric Variability of the IPCC Models," Climate Dynamics (v.28/7-8, 2007); José P. Peixoto and Abraham H. Oort, Physics of Climate (American Institute of Physics, 1992); Barry Saltzman, Dynamic Paleoclimatology (Academic Press, 2002).

VALERIO LUCARINI University of Bologna
