
Note: Data by Carbon Dioxide Information Analysis Center (CDIAC), Oak Ridge National Laboratory; data represent the year 2000 level. Concentrations are expressed in parts per billion (ppb).


the late nineteenth century (Figure 4, Chapter 1) may be attributed to the man-made enhanced greenhouse effect.

Many gases exhibit greenhouse properties. Some of them (water vapor, CO2, methane (CH4), and nitrous oxide (N2O)) occur in nature, while others are exclusively of anthropogenic origin. Table 9 illustrates the growth of the GHG concentrations from the pre-industrial epoch to the year 2000. Over the last 250 years, concentrations in the atmosphere have indisputably grown: CO2 by about 30%, CH4 by about 105%, and N2O by about 9%. According to the information collected by the IPCC, significant enhancement of anthropogenic GHG emissions is the main cause of this growth. CO2 emissions are closely linked with industrial activities, primarily with the combustion of fossil fuels. For the past 20 years, about three-fourths of human-produced CO2 emissions originated from burning fossil fuels. Another important source of CO2 is deforestation. Concentrations of CO2 in the atmosphere are regulated by numerous natural processes, such as plant photosynthesis; their joint activity is known as the "carbon cycle". While the carbon cycle can absorb some of the net 6.1 billion metric tons of anthropogenic CO2 emissions produced each year, an estimated 3.2 billion metric tons is added to the atmosphere annually. Such a positive imbalance between emission and absorption results in the continuous growth of GHG in the Earth's atmosphere.
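The carbon budget quoted above can be checked with simple arithmetic. The sketch below (using only the figures given in the text) computes the amount absorbed by the carbon cycle and the fraction of emissions remaining airborne:

```python
# Annual anthropogenic CO2 budget, values as quoted in the text
# (billion metric tons per year).
emitted = 6.1       # total anthropogenic CO2 emissions per year
retained = 3.2      # estimated amount added to the atmosphere per year

absorbed = emitted - retained            # taken up by the carbon cycle
airborne_fraction = retained / emitted   # share staying in the atmosphere

print(f"absorbed by carbon cycle: {absorbed:.1f} billion t/yr")
print(f"airborne fraction: {airborne_fraction:.0%}")
```

With the text's numbers, roughly half of each year's emissions remain in the atmosphere, which is the persistent imbalance driving the GHG growth.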

The increase in the GHG content together with other human activities affects processes and feedbacks5 in the climate system. However, because of their complicated nature as well as the strong natural variability of the Earth's climate, it is not easy to determine the extent of the man-made GHG influence on the climate. In computer-based models, rising concentrations of GHG generally produce an increase in the average global temperature. For example, the 18 numerical model runs using 7 independent physical models by Kattenberg et al. (1996) predicted an equilibrium temperature increase of 2.0 ± 0.6 K by the year 2100, assuming a doubling of the current level of atmospheric CO2. Increasing concentrations of GHG are likely to accelerate the rate of climate change.

5Climate feedback denotes the interaction between processes operating in the climate system, in which changes in one process initiate changes in another process that in turn influence the original one. An intensification of the original process corresponds to positive feedback, while a reduction represents negative feedback.

Scientists expect that the average global surface temperature could rise by 0.6-2.5 K in the next 50 years and by 1.4-5.8 K over the twenty-first century, with possible strong regional variations.

The most important GHG is CO2, but CH4 is also significant because, per kilogram, it has 21 times the warming effect of CO2. CH4 emissions are generally caused by bacteria-induced decay of organic material under anaerobic conditions. Natural wetlands represent the main source of CH4 emissions from decaying organic material. The decay process is also an important anthropogenic CH4 source in the digestive processes and manure of domestic animals, rice cultivation, landfills, and wastewater treatment. At present, CH4 composes 0.5% of total emissions, but it gives about 10% of the radiative forcing6 estimated to be caused by carbon dioxide (data by the Carbon Dioxide Information Analysis Center, CDIAC). It is assumed that in the future the absolute value of the CH4 forcing will increase, but less than that of CO2. N2O (0.1% of total emissions, with 296 times the per-kilogram warming effect of CO2) is naturally emitted from soils and oceans. The human contribution comes from burning fossil fuels, the use of certain fertilizers, the cultivation of soil, and certain industrial processes (such as the production of nylon).
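The per-kilogram multipliers quoted above (21 for CH4 and 296 for N2O) act as weights when emissions of different gases are combined into a single CO2-equivalent figure. A minimal sketch, with an invented emission inventory purely for illustration:

```python
# Warming effect per kilogram relative to CO2, as quoted in the text.
GWP = {"CO2": 1, "CH4": 21, "N2O": 296}

def co2_equivalent(emissions_kg):
    """Convert a dict of per-gas emissions (kg) into kg of CO2-equivalent."""
    return sum(GWP[gas] * kg for gas, kg in emissions_kg.items())

# Hypothetical emission inventory (kg) -- illustrative numbers only.
inventory = {"CO2": 1000.0, "CH4": 10.0, "N2O": 1.0}
print(co2_equivalent(inventory))  # 1000 + 210 + 296 = 1506.0
```

Even small emitted masses of CH4 and N2O contribute noticeably once weighted, which is why they matter despite their small share of total emissions.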

The most commonly considered indicator of climate change is the SAT. Global temperatures are, in fact, rising: they have increased between 0.3 and 0.6 K in the last 150 years. Even though this change has not been monotonic, it is unusual in the context of the last few centuries (see Section 1.2 of Chapter 1). An independent estimate based on the analysis of borehole temperature measurements (Section 3.2) supports the unprecedented character of the recent global warming for at least the last five centuries, a warming that has accelerated since the end of the nineteenth century. At first sight these results corroborate the hypothesis of human-induced warming. Long-term paleoclimatic studies also seem to confirm this claim. For example, there have been significant natural variations of CO2 in the geologic past, and these changes are correlated with the general course of climate variations. There is no known precedent for large increases in atmospheric CO2 without simultaneous changes in other components of the carbon cycle and climate system. An illustration of this claim is presented in Figure 106. The bottom panel shows the oscillations in the concentration of CO2 in the atmosphere (Vostok ice core, Antarctica; data by Petit et al., 1999). A comparison is shown between the observed trends in the CO2 content and the temperature changes estimated for the same location (see also Figure 1, Chapter 1). As shown, the temperature changes almost perfectly repeat the CO2 oscillations.
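The claim that the temperature changes "almost perfectly repeat" the CO2 oscillations can be quantified with a correlation coefficient. A sketch on synthetic stand-in series (the actual Petit et al. (1999) Vostok data are not reproduced here; the shared 100-kyr oscillation is an invented illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two Vostok records: a shared
# glacial-interglacial oscillation plus independent noise in each.
t = np.linspace(0, 400_000, 1000)                  # years before present
cycle = np.sin(2 * np.pi * t / 100_000)            # ~100 kyr cycle
co2 = 230 + 40 * cycle + rng.normal(0, 5, t.size)     # ppm-like values
temp = -4 + 4 * cycle + rng.normal(0, 0.8, t.size)    # K-like anomalies

# Pearson correlation between the two series.
r = np.corrcoef(co2, temp)[0, 1]
print(f"correlation: {r:.2f}")   # close to 1 for strongly coherent curves
```

A correlation near unity is what "highly coherent" means quantitatively; on the real ice-core data the coherence is what links the CO2 and temperature curves in Figure 106.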

On the other hand, on the timescale of the last few thousand years there have been even more pronounced climatic variations during periods when variations in CO2 were relatively low. For example, from the end of the last glaciation episode about 10,000 years ago until the end of the eighteenth century, the levels of GHG in the atmosphere remained fairly constant, while the climate showed significant oscillations. Clearly, atmospheric GHG concentrations are not the sole influence on global climate. There is still no exact knowledge of how the climate system varies naturally and/or responds to the GHG emissions,

6Radiative forcing represents a simple measure of the importance of a potential climate change mechanism. It denotes the change in the net vertical irradiance due to variations in the internal and/or external forcing of the climate system (e.g., a change in the output from the Sun or in the carbon dioxide concentration) and is measured in W/m2.
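For CO2, this forcing is often approximated by the simplified logarithmic expression dF = 5.35 ln(C/C0) W/m2, a standard formula from the IPCC literature that is not given in the text. The sketch below applies it to roughly the pre-industrial-to-2000 CO2 rise (the ppm values are approximate, chosen to match the ~30% increase quoted earlier):

```python
import math

def co2_forcing(c_now_ppm, c_ref_ppm):
    """Approximate CO2 radiative forcing in W/m^2 via the commonly used
    simplified expression dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_now_ppm / c_ref_ppm)

# Roughly pre-industrial (~280 ppm) to year-2000 (~370 ppm) levels.
print(f"{co2_forcing(370, 280):.2f} W/m^2")
```

Because of the logarithm, equal fractional increases in CO2 yield equal forcing increments, which is why doubling experiments are a natural benchmark in the model studies cited above.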

Fig. 106. Comparison of the trends in the surface temperature anomalies (top) and in the atmospheric concentrations of carbon dioxide (bottom) during the last 400000 years (Vostok ice core, Antarctica; data by Petit et al., 1999). Both curves are highly coherent. Increases in the atmospheric CO2 levels are accompanied by the corresponding warming.


when the equilibrium response of the nonlinear climate system depends in complex ways on various feedbacks. The principal means of understanding the climate system response to the GHG forcing is the use of computer models of the Earth's climate system based on well-established physical/chemical/biological assumptions and their comparison with observed/reconstructed paleoclimatic records. Systematic review and evaluation of the existing paleoclimatic data is thus indispensable to produce a consistent and robust paleoclimatic database that may serve as a test target for climate model studies. On the other hand, the models that have been constructed to predict future climate change are necessarily simplified representations of the climate system. The climate of the past centuries simulated by three-dimensional coupled atmosphere-ocean models may deviate from the true evolution of the past climate because of uncertainties in the external forcing, model deficiencies, and internal variability. Thus, researchers need some form of testing that clearly reveals the degree of agreement between models and paleoclimate observations. Progress in reducing uncertainties in the simulation of future climate patterns will also require better understanding of the overall behavior of the climate system as well as of the buildup of GHG in the atmosphere. In cases where observed and modeled results appear to be contradictory, robust criteria should be developed to reveal the reason for the discrepancy (e.g., shortcomings of the model or the quality of the observations used for testing).

At present, however, the uncertainties in the analysis are considerable, and the results are therefore only indicative. Local or regional effects can differ from the assessed global effects. For example, burning of high-sulfur coal and oil produces sulfate aerosols, which largely cool the climate in, and somewhat downwind of, the regions where they are emitted. Such regional cooling tends to offset some of the globally averaged warming, but primarily serves to distort the patterns of climate change relative to the regional patterns that would be produced by GHG-induced changes alone (Yang and Schneider, 1997). Simulations of climate change for specific areas are much less reliable than global ones, and it is unclear whether regional climate will become more variable. Many researchers calculate scenarios for only a few decades because of the large uncertainties that accumulate over time.

The ability to distinguish a warming trend from natural variability is crucial for understanding the climatic response to increasing GHG concentrations. Joint analysis of borehole and SAT data, as well as the complementary use of past climate reconstructions and model-generated synthetic data, makes it possible to interpret the observed twentieth century warming trend in the global warming context.

3.4.3 Was the twentieth century climate unusual? Evidence from the underground

In Sections 3.2 and 3.3 we have cited repeated claims by numerous authors about the unprecedented nature of the twentieth century warming, at least compared with that of the last 500 years. Although the Northern Hemisphere reconstructions prior to about 1400-1500 A.D. exhibit numerous uncertainties, important conclusions are still possible for more remote epochs. While the warmth early in the millennium (the Medieval Warm Period) approaches the mean twentieth century level, the late twentieth century appears quite anomalous. The 1990s were probably the warmest decade of at least the last millennium. The climatic conditions during the Medieval Warm Period seem to be more spatially variable, while the pattern of the recent warming appears to be generally global.

To establish the unusual character of the twentieth century warming, two of the most important questions should be answered:

1. Is the Earth's surface significantly warmer now than it was in the pre-industrial epoch? If so, how much warmer is it now than in pre-industrial times?

2. Is the climate course of the last millennium known well enough to serve as a basis for reliable comparison?

Further important questions that are somewhat beyond the scope of this book may be:

3. Has mankind already changed global climate?

4. Will anthropogenic global climate change in the twenty-first century surpass all Holocene variability?

With respect to the first question the situation is more or less obvious. Numerous measurements of land surface air and sea surface temperature (continuously re-examined and updated) revealed a global average rise in the range of 0.3-0.6 K between the late nineteenth century and the year 1994. Conclusions based on conventional temperature observations are supported by satellite-based data as well as by indirect evidence such as the decline in the extent and thickness of the Arctic sea-ice cover, melting of the Greenland ice sheet, recession of glaciers, etc. The situation with the GSTs is generally similar. The vast volume of available measurements/GST inversions, as well as the wide range of computations and statistical tests applied to these data in recent years, has significantly increased the confidence of the conclusions reached. The warming trend in the GST record of the last 100 years is undoubtedly real. Existing estimates fall in the same range as the values calculated from the SAT data. They also show that the GST warming trend of the last century is substantially stronger than that of the whole previous five centuries (Section 3.2).

The answer to the second question is not so certain. Is the twentieth century surface temperature warming trend really unusual? With respect to the SATs, we have mentioned above that the IPCC affirmed that most of the warming of the recent 50 years is unprecedented and can likely be attributed to anthropogenic emissions of GHG. Although this statement was based on wide-scale research in different scientific branches, it almost immediately became the target of trenchant, still ongoing debates. Numerous traces of these debates can be found, e.g., in the reports by the scientific team of the Marshall Institute, which involve a critical examination of the scientific basis for global climate change policy. The primary reason for the IPCC conclusion was a climate reconstruction by Mann et al. (1999) that produced the so-called "hockey stick" diagram (see Figure 11, Chapter 1), which shows the twentieth century as unusually warm compared to preceding times. A new evaluation of the underlying data used to create the diagram by Mann et al. (1999), presented in the work by McIntyre and McKitrick (2003), has raised serious questions about its validity (see also the original papers by McIntyre and McKitrick, 1998, 2005a). The authors examined the dataset of proxies that Mann et al. (1998, 1999) used to reconstruct the temperature record from 1400 to 1980. Their review found various errors (inappropriate collation, unjustified truncation and extrapolation, use of obsolete data, as well as calculation mistakes). Correcting these shortcomings, the above authors found that the temperature in the early fifteenth century was actually higher than that in the twentieth century. Recent work by Soon and Baliunas (2003) also contains pointed criticism of the "hockey stick". The authors conclude that the available climatologic data do not support the hypothesis that the twentieth century was the warmest and/or most unusual of the last millennium.
In the subsequent discussion (e.g., Huybers, 2005; McIntyre and McKitrick, 2005b, c; von Storch and Zorita, 2005; see also the web site globalwarming/hockey_stick/hockeystick01.html and Mann's Responses and Counterarguments), participants have clearly stated that none of the reconstructions presents the recent warming as a simple recovery from the cold conditions of the 14th-16th centuries, which would not be statistically or climatologically meaningful, and argued that the corrections suggested by McIntyre and McKitrick influence the reconstruction only in a minor way and in fact confirm the robustness of Mann et al.'s (1998) reconstruction within its own framework. A "user-friendly" assessment of the data and methods, as well as of the reliability of Mann et al.'s (1998) reconstruction, is also presented on the web site by Mann et al., "Global temperature patterns in past centuries: An interactive presentation". This interactive forum allows users to examine the reconstructed patterns directly and/or to select particular spatial regions and time periods of special interest.

To firmly answer the second question, it is essential to regard it in the context of long-term climate variability. Because of their shortness and sparseness, the instrumental temperature measurements can generally provide information only on higher-frequency (seasonal, annual, or decadal) climate change. For example, investigations of the trends in high-frequency SAT variability in different regions of the globe by Karl et al. (1995) revealed an apparent trend in interannual temperature variability supported by data from the past few decades, while the longer data series indicated that this trend is an artifact. Proxy climate indicators calibrated against temperature time series, together with borehole temperature reconstructions, can provide data for confident conclusions about low-frequency temperature variability. The certainty of these conclusions strongly depends on the accuracy, consistency, and completeness of the available paleoclimatic series. Recent developments in paleoclimatology and a number of advances in new areas (e.g., in borehole climatology) have considerably enriched the available information and allowed more meaningful conclusions about spatial and temporal patterns of climate change in the past centuries.

The long-term temperature trends over the last millennium or so are evident in many regions as well as on the global and/or hemispheric scale. Despite definite limitations and uncertainties, the combined multiproxy/borehole temperature history reconstructions can provide global-scale sampling of the general course of the temperature variations over several centuries into the past. Thus, e.g., it was concluded that temperatures in the Northern Hemisphere during recent decades are the warmest in at least six centuries. The latest studies based on global networks of multiproxy and/or borehole data have confirmed the early results and have proved their usefulness for describing global and/or hemispheric patterns of climate variability in past centuries (e.g., as described in the works by Mann et al., 1998, 1999; see the previous section). These studies have also provided better comparison of the derived climatic trends with possible physical influences and/or climate forcing (e.g., Crowley and Kim, 1996, 1999; Delworth and Mann, 2000). In contrast, the data prior to the fifteenth to sixteenth centuries proved too sparse for firm inferences and could serve only for a few regional reconstructions.

Of course, paleoclimate experts should challenge each other's conclusions and interpretations. Various alternative hypotheses have been proposed to explain the modern increase of global temperatures:

• Most predictions are wrong and the warming is within the range of the natural variability.

• Observed warming represents simple recovery from the previous cold conditions (the Little Ice Age and/or the cold period in the second half of the nineteenth century).

• Recent warming is the result of the changes in solar irradiance, etc.

It should be mentioned, however, that explanations of the recent warming include these and other possibilities, but are not limited to any single hypothesis. Climate variations include both natural and anthropogenic factors, and the recent warming reflects an integration of various forcings. As shown in the previously cited work by Huang (2004), the integrated surface temperature history calculated by merging GST and multiproxy sources coincides in its course with the curve of the radiative forcing reconstructed by Crowley (2000), which comprises the effects of solar irradiance, GHG, anthropogenic aerosols, and volcanism. This coherence represents a useful validation of the strategy of coupling GST and SAT multiproxy information. On the other hand, the good agreement between an integrated temperature reconstruction and the radiative forcing corroborates the presence of both natural and human-induced effects in the recent warming. Finally, it should be emphasized that the evidence of the unusual character and possible anthropogenic forcing based on surface air/ground temperature data is only one of a number of independent lines of paleoclimatic research indicating the strong likelihood that human influences on climate play a very important (if not dominant) role in the observed twentieth century warming of the Earth's surface.

The causes of the twentieth century temperature change can be identified more precisely using optimal detection methodology. Detection and attribution studies probably represent the strongest piece of evidence in support of the above conclusion. These studies demonstrate that the pattern of twentieth century climate change agrees well with that predicted by modern state-of-the-art numerical models of the climate system in response to the strengthening anthropogenic forcing.

3.4.4 The elements of optimal detection of the climate change and attribution of the causes

In recent years, considerable progress has been achieved in attempts to identify an anthropogenic effect on climate using optimal detection methodology. Detection of climate change is the procedure that reveals that the climate has really changed in some defined statistical sense. Detection of anthropogenic change demonstrates that the observed change is significantly different from what can be explained by natural internal variability. The detection of a change in climate does not necessarily imply that its causes are understood; detection alone does not search for the reasons of the observed change. Attribution is the process that looks for the most probable causes of a detected climate change. Previously, detection and attribution studies addressed the simple question: "Have we detected a human influence on climate?" Recently, evidence for an anthropogenic contribution to climate trends over the twentieth century has been accumulating rapidly (for a detailed review see, e.g., IDAG, 2005). The significant progress achieved in this research field in recent years has inspired a re-formulation of the problem. At present the question that should be answered is rather: "How strong is the anthropogenic change?"

The response to anthropogenic changes in climate forcing is superimposed on the natural, internal, and externally forced climate variability that can occur on similar temporal and spatial scales. Internal climate variability7 manifests itself on a wide range of

7Internal variability means the part of the climate variability that is not forced by external influences.

timescales from weeks to centuries and millennia, and the climate is capable of producing variations of noticeable magnitude without any external influences. Externally forced climate variations may be of two kinds: (1) changes in natural forcing factors, such as solar radiation or volcanic aerosols, and (2) changes in anthropogenic forcing factors, such as increasing concentrations of GHG. The presence of natural climate variability means that the detection and attribution of anthropogenic climate change is a statistical "signal-in-noise" problem. Ideally, reliable attribution of a detected climate change to anthropogenic causes would require a wide series of experiments with the climate system in which the probable mechanisms of change are systematically varied to determine the sensitivity of the system to each of them. Of course, such an approach to attribution is not possible. In practice, the attribution of observed climate change to a given combination of human activity and natural influences consists of the statistical analysis and assessment of available evidence to test the hypotheses that the observed changes:

1. cannot be attributed entirely to internal variability,

2. are consistent with the estimated responses to the given combination of the anthropogenic and natural forcing, and

3. are not consistent with alternative, physically plausible explanations of recent climate change that do not take into account important elements of the given combination of investigated forcings.

In such an approach, the detection and attribution of climate change represents a statistical problem that can be solved at a defined level of significance. Establishing that observed changes do not simply represent a manifestation of internal variability (detection) is thus one component of the more complex and demanding process of attribution.

The records that are of interest for the detection and attribution of anthropogenic climate change are approximately 50-100 years long. Clearly, the instrumental record is short relative to this interval. Paleoclimatic reconstructions (including GST histories) can provide time series that are long enough for the estimation of internal climate variability, but a number of problems (limited spatial coverage, temporal inhomogeneity, possible biases in the interpretation of the relationships between proxy indices and climatic variables) make this task difficult. Ongoing progress in the reconstruction of past temperatures, e.g., the merging of different proxy series to obtain more certain millennium-long reconstructions with annual or finer resolution described in the previous section, improves the situation. Such paleoclimate reconstructions are becoming more and more important for assessing internal climate variability. These time series can also be successfully used for the verification and/or checking of the internal variability estimates from coupled atmosphere-ocean models, to ensure that the calculations do not under- or overestimate the level of internal variability on decadal to century-long timescales.
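As an illustration of how a long record constrains internal variability on decadal timescales, the sketch below estimates the spread of decadal means from a synthetic, purely internal millennium-long annual series (the amplitude 0.25 K is an invented illustrative value, not a result from the text):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a millennium-long annual temperature series
# with purely internal (unforced) variability.
annual = rng.normal(0.0, 0.25, 1000)   # K anomalies, 1000 years

# Decadal means: average non-overlapping 10-year blocks.
decadal = annual.reshape(-1, 10).mean(axis=1)

# The standard deviation of decadal means characterizes internal
# variability on the decadal timescale -- the "noise" level against
# which a forced trend must be detected.
print(f"annual  sd: {annual.std(ddof=1):.3f} K")
print(f"decadal sd: {decadal.std(ddof=1):.3f} K")
```

A short instrumental record yields only a handful of decadal blocks, hence the value of millennium-long reconstructions for pinning down this noise level.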

The only means of quantifying and separating internal climate variability ("noise") from human-forced climate change ("signal") is the use of numerical models of the climate system. As mentioned above, modeling efforts should be combined with both empirical and statistical techniques. The question formulated above: "How strong is the anthropogenic change?"

that the attribution procedure should answer, can be extended as: "Is the magnitude of the response to greenhouse gas forcing as estimated in the observed records consistent with the response simulated by climate models?" The first IPCC Scientific Assessment in 1990 (Houghton et al., 1990) found that the global mean surface temperature had increased by 0.3-0.6 K over the last 100 years and that the magnitude of this warming appeared to be generally consistent with the predictions of climate models forced by increasing concentrations of GHG. However, the question of whether the observed warming (or at least part of it) could be attributed to the enhanced greenhouse effect remained unclear at that time. The reasons for this uncertainty were that: (1) there was only limited agreement between model predictions and observations, because climate models were still in the early stages of their development; (2) there was inadequate knowledge of natural variability and of the influence of other possible anthropogenic effects on climate; and (3) there was a scarcity of available observational data, particularly of long, reliable time series.

Since 1985-1990, climate modeling has experienced significant progress. At present there are numerous research groups working out both short- and long-scale high-performance models including many components of the climate system. Important results were achieved in the better understanding of the internal variability of the climate system through multi-century model simulations that did not take into account anthropogenic forcing. On the other hand, numerous models have incorporated the climatic effects of different human-induced changes to quantify their possible effects on the future climate. Both the theoretical principles of model construction and the results of numerical simulation can be found on numerous web sites, such as those of the Hadley Centre, the National Center for Atmospheric Research, and the National Oceanic and Atmospheric Administration (NOAA). There also exists a vast body of published work on this topic.

The simplest way to attribute observed climatic changes to the anthropogenic effect is a qualitative assessment of the consistencies and inconsistencies between the observed data and model projections of anthropogenic climate change. Such studies generally include simple descriptive analysis of climatic variables and of model simulations using different forcing schemes. Reliable qualitative detection and attribution can be obtained for climatic variables that possess high climate change signal-to-noise ratios, good spatial data coverage, and consistent signals from different model simulations. Among all climatic variables, the SAT appears to be the database best satisfying these requirements. Variations on large spatial scales and on timescales of several decades or longer are generally considered to enhance the signal-to-noise ratio.

Numerous studies have identified areas of qualitative consistency/inconsistency between observed and modeled climate change. Of course, qualitative studies provide less reliable evidence for an anthropogenic influence on climate than quantitative attribution techniques; however, they may indicate possibilities for further quantitative detection and attribution study in areas of already revealed qualitative consistency between observations and models, as well as guide efforts to improve the models in areas of detected inconsistency.

More certain quantitative attribution requires complex mathematical methodology and a vast database. Significant progress in this field has been achieved in recent years. The number of techniques for the quantitative attribution of observed climatic changes to human-induced forcing, and of corresponding attribution studies, has rapidly increased in the last decade. Quantitative studies have revealed not only the degree of agreement between observed and modeled climate change, but also the statistical significance of the obtained results and the degree to which the final conclusions are independent of the assumptions made in applying the data processing techniques (Hasselmann, 1997; North and Stevens, 1998; Allen and Tett, 1999; Tett et al., 1999, 2000, 2002; Barnett et al., 2000; Hill et al., 2001; Mitchell et al., 2001; Paeth and Hense, 2001; Jones et al., 2003). The "optimal detection" or "optimal fingerprinting" techniques are probably the most popular and are widely used for surface temperature patterns. These techniques have several slightly different representations (e.g., Hegerl and North, 1997; Jones and Hegerl, 1998; Zwiers, 1999; IDAG, 2005). Both the theoretical basis of optimal detection studies and the applied surface temperature database have been extended in the works by Hegerl et al. (2000, 2001), North and Wu (2001), and Stott et al. (2001).

Optimal detection is a technique that optimizes pattern variability and can thus provide a clearer separation of a climate change fingerprint from natural internal climate variations. It increases the detectability of forced climate changes by raising the signal-to-noise ratio, looking at the component of the response away from the direction of highest internal variability. Optimal detection represents a multiple regression between a set of signals derived from model simulations and the observations. Its mathematical formulation assumes that a field of n observations y can be represented as a linear combination of candidate signals obtained from climate models g1, ..., gm plus noise u:

y = Ga + u,

where G = (g1 | ... | gm) is the matrix composed of the signal patterns and a = (a1, ..., am)^T is the vector of the unknown amplitudes. The optimization consists of projecting the observations, signals, and noise onto the leading eigenvectors of an estimate of the noise covariance matrix, weighting down patterns of temperature change with high variability and weighting up low-variability patterns. The noise covariance matrix can be calculated, e.g., from a long-term model simulation with constant external forcing (Jones et al., 2003). Various permutations of the projected signal patterns are then regressed against the projected observations. This procedure provides sets of regression coefficients (amplitudes). Further generalizations allow, e.g., the incorporation of signal uncertainties into the analysis. The multiple regression procedure consists of: (1) the estimation of the unknown amplitudes a by generalized least squares from the observations, and (2) the testing of the null hypotheses that they are zero. If these hypotheses are rejected, the analysis can be continued with an attribution consistency test, that is, by testing the hypothesis that for some combination of the signals the amplitudes are unity.
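The regression step described above can be sketched numerically. The example below uses synthetic signal patterns and noise with a known covariance matrix; the optimal weighting is represented by its algebraic equivalent, generalized least squares, a_hat = (G^T C^-1 G)^-1 G^T C^-1 y (all data here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

n, m = 200, 2                        # n observations, m candidate signals
G = rng.normal(size=(n, m))          # signal patterns g1, ..., gm (columns)
a_true = np.array([1.0, 0.5])        # "true" amplitudes for this synthetic test

# Internal-variability noise with a known covariance matrix C
# (diagonal with unequal variances, for simplicity).
C = np.diag(rng.uniform(0.5, 2.0, n))
u = rng.multivariate_normal(np.zeros(n), C)
y = G @ a_true + u                   # observations: y = G a + u

# Generalized least squares: a_hat = (G^T C^-1 G)^-1 G^T C^-1 y.
Ci = np.linalg.inv(C)
a_hat = np.linalg.solve(G.T @ Ci @ G, G.T @ Ci @ y)
cov_a = np.linalg.inv(G.T @ Ci @ G)  # amplitude uncertainty (covariance)

print("estimated amplitudes:", np.round(a_hat, 2))
```

Testing whether the estimated amplitudes differ significantly from zero corresponds to detection; testing whether they are consistent with unity corresponds to the attribution consistency test mentioned above.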

To attribute all or part of the recent climate change to human activity, one also needs to demonstrate that alternative explanations (e.g., pure internal variability and/or purely naturally forced climate change) cannot account for the set of observed changes that can be explained by human-induced influence. It should be mentioned, however, that researchers cannot predict what alternative explanations for observed climate change may be discovered and accepted as plausible in the future. Thus, any interpretation accepted during the attribution process is never final. This problem is not unique to climatology but is present in all scientific branches that establish cause and effect using a limited base of observations. The possibility of another explanation can never be excluded completely. The hypothesis will appear progressively more valid as numerous alternative explanations are tested and found to be inadequate.

The above approach is potentially more informative than simpler regression techniques because in principle it allows one to quantify, through associated estimates of uncertainty, how much different factors have contributed to the recently observed climate changes. Such quantification, however, is possible under the assumption that important sources of model error (e.g., missing and/or incorrectly represented atmospheric feedbacks) affect primarily the amplitude and not the structure of the response to external forcing. Most of the existing studies suggest that this is the case for the relatively small-amplitude changes observed to date; however, the possibility of model errors changing both the amplitude and the structure of the response cannot be excluded.

Probably the most cited works on the optimal detection/attribution of surface temperature change were published by Tett et al. (2002) and Jones et al. (2003). The former authors simulated the climatic response to natural (solar irradiance and volcanic activity) and anthropogenic forcing from 1860 to 1997 using a coupled atmosphere-ocean GCM. The authors computed a set of model runs incorporating different anthropogenic forcings. One run, e.g., took into account the influence of the GHG alone, while another considered the coupled effects of tropospheric ozone, GHG, sulfate aerosol, etc. Using the optimal detection analysis, the authors detected the contribution of different forcings to the SAT change. As shown, while the natural forcings give a linear trend for the last century close to zero, the warming trend caused by the total anthropogenic forcings equals 0.5 ± 0.15 K/century. The analysis found that the combination of the natural forcing and the GHG increase, together with the contribution from internal variability, is the best explanation for the warming of the first half of the twentieth century. Warming in the second half of the century was probably caused by the coupled effect of the changes in GHG, sulfate aerosol, and the stratospheric aerosol due to volcanic eruptions.

Jones et al. (2003) investigated the causes of surface temperature change over the last four decades using the optimal detection methodology. As in the previous work, the authors used a coupled atmosphere-ocean GCM with different sets of forcing. The sets of signals with the most powerful influence on the climate system included well-mixed GHG, other anthropogenic forcings, as well as changes in solar irradiance8 and volcanic activity. The SAT observations were taken on latitudinal zonal scales: 90-30°N, 30°N-0°, and 0°-30°S. Because of the insufficient number of observations, the zone 30-90°S was excluded from consideration. The obtained results supported the hypothesis of an anthropogenic influence on the climate. The anthropogenic signals of the GHG and the combined response to changes in sulfate aerosol

8Total solar irradiance is the amount of electromagnetic energy emitted by the Sun over all wavelengths that is received at the top of the Earth's atmosphere. It measures the solar energy flux in W/m2.

and tropospheric and stratospheric ozone were robustly detected. As shown, the GHG influence dominated in the last 40 years and caused as much as 0.56 ± 0.15 K of global SAT warming, while the combined effect of changes in sulfate aerosol and ozone produced a cooling trend of 0.10 ± 0.01 K over the same time. The response to volcanically produced stratospheric aerosols resulted in an approximately similar cooling of 0.09 ± 0.04 K, and finally 0.12 ± 0.11 K warming of the surface temperatures occurred in response to the changes in solar irradiance.

Besides the "optimal detection" techniques, recent years have seen a growing interest in the use of Bayesian methods for climate change detection and attribution (Dempster, 1998; Hasselmann, 1998; Levine and Berliner, 1999; Berliner et al., 2000; Lee et al., 2005; Min and Hense, 2005). This interest was inspired by the ability of this formalism to account for uncertainties in various components of the detection and attribution procedure and to incorporate additional information. As shown in Section 2.3.5 (Chapter 2), where a similar method was used for the FSI inversion of borehole temperatures, the incorporation of additional information can significantly increase the reliability of the results, in the case of detection and attribution by reducing the range of alternative explanations for observed climate change. In most cases the additional information includes knowledge about uncertainty and about an anthropogenic influence on the climate. This knowledge is expressed through prior distributions that are non-committal on the climate change question. A study by Berliner et al. (2000) applied a Bayesian framework in a multivariate climate change detection setting. As shown in this work, the Bayesian approach allows greater flexibility and rigor in the treatment of different sources of uncertainty. In contrast to the strong evidence of an anthropogenic influence on the climate of the twentieth century obtained by the optimal detection approach, the evidence from the Bayesian attribution assessment is not so firm. Lee et al. (2005) explain this by the limited length of the available observation record and/or by an insufficient account of all possible sources of external forcing. The authors estimated that strong evidence from a Bayesian attribution assessment using a sufficiently stringent attribution criterion may be available by 2020.
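The Bayesian idea can be illustrated with a minimal conjugate sketch for a single signal amplitude: a broad, non-committal Gaussian prior is updated by the data into a posterior, from which the probability of a positive (present) signal follows. All numbers below are illustrative assumptions, not values from Berliner et al. (2000) or Lee et al. (2005).

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

# Hypothetical one-signal toy example: observations y = a*g + noise,
# with known noise variance s2 (e.g. estimated from a control run).
n = 40
g = rng.normal(size=n)                  # model-simulated signal pattern
s2 = 1.0                                # noise variance
y = 0.8 * g + rng.normal(0.0, np.sqrt(s2), n)

# Non-committal Gaussian prior on the amplitude: a ~ N(0, tau2) with
# broad tau2, so the prior does not prejudge the climate-change question.
tau2 = 10.0

# Conjugate update: the posterior is Gaussian with
#   var  = 1 / (g'g/s2 + 1/tau2),   mean = var * g'y / s2
post_var = 1.0 / (g @ g / s2 + 1.0 / tau2)
post_mean = post_var * (g @ y) / s2

# Posterior probability that the amplitude is positive (signal present)
p_positive = 0.5 * (1.0 + erf(post_mean / sqrt(2.0 * post_var)))
```

The strength of the evidence then depends on how far this posterior probability exceeds a chosen attribution threshold, which is one way to formalize the "stringent criterion" mentioned above.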

Owing to the subsequent development of the detection/attribution techniques, modeling procedures, and observational databases, the nature of the detected climatic change has been evaluated in more detail in further IPCC Assessment Reports (e.g., Climate Change 2001). Different lines of evidence on the causes of recent climate change were examined. Below we present a summary of the results obtained for the surface temperatures.

(1) Twentieth century climate was unusual. Palaeoclimatic reconstructions for the last 1000 years have indicated that, in spite of the large natural climate variability, the twentieth century warming is really unusual, even taking into account the uncertainties of the reconstructions. A comparison of empirical evidence with proxy reconstructions and GST histories shows that natural factors explain relatively well the general temperature trends from the last millennium to the nineteenth century. However, they can hardly account for the unusual warming at the end of the twentieth century (Jones and Mann, 2004). Reconstructions of the last two millennia found no evidence of any earlier period with warmer conditions than the post-1990 period (Moberg et al., 2005).

(2) The observed warming is inconsistent with model estimates of natural internal climate variability. The model estimates of internal climate variability on annual to decadal timescales are generally similar; in some cases they are even larger than those observed. On longer scales the model estimates of internal climate variability vary substantially. Estimates from models and observations are uncertain on the century and longer timescales required for its detection. Notwithstanding that the long-scale internal climate variability is uncertain, the detection of an anthropogenic signal is insensitive to the model used to estimate internal variability. Recent observed changes cannot be ascribed to internal variability alone even if the amplitude of the simulated internal variations is increased by a factor of two or more. The ability of climate models to simulate large-scale temperature changes during the twentieth century when they include both anthropogenic and natural forcings, and, on the contrary, their inability to simulate the warming observed over the last half century when they do not take into account increasing GHG concentrations, is generally regarded as evidence of an anthropogenic influence on global warming. On the whole, the changes in the global climate over the twentieth century can hardly be attributed to pure internal climate variability.

(3) The observed warming, especially in the second half of the twentieth century, appears to be inconsistent with natural external forcing of the climate system. While for earlier periods the common influence of solar and volcanic forcing can explain the Medieval Warm Period and the Little Ice Age, externally driven natural climate forcing cannot produce climate changes comparable in amplitude and timing to the late twentieth century warming (Bertrand et al., 2002). All existing studies reject natural climate forcing and/or internal variability alone as a possible explanation of the recent climate change. Direct measurements of the solar irradiance exist for only two decades. Longer records (e.g., those presented in the works by Hoyt and Schatten, 1993; Lean et al., 1995; Lean, 2000) are reconstructions based on different assumptions. All these reconstructions exhibit an increase in the amplitude of the total solar irradiance since the Dalton Minimum (Figure 107). The data also suggest that the sharp increase of the overall solar activity during the first half of the twentieth century practically stopped in its second half. The radiative forcing due to stratospheric aerosols from volcanic activity shows significant year-by-year variations. In the second half of the twentieth century it was particularly strong during the 1961-1965 and 1991-1995 periods (Ammann et al., 2003). The increase in volcanic activity during the past decades would, if anything, produce tropospheric cooling and stratospheric warming, the opposite of what has occurred over this period. Trends in the volcanic aerosol content, together with the small solar irradiance changes during the last two (possibly two to four) decades of the twentieth century, indicate that the net natural forcing in this period was negative and could not explain the recent rapid increase of global surface temperature.
Not only the surface temperature increase but also the changes observed in patterns of vertical atmospheric temperature are similarly inconsistent with the natural forcing trends (Chanin and Ramaswamy, 1999; Jones et al., 2003).


Fig. 107. Trends in the total solar irradiance since 1750. (Data by Lean (2000).)

(4) Human-induced factors do provide an explanation of the rapid increase of the surface temperature in the twentieth century. Results by optimal detection methods indicate a human influence on climate in the surface temperature observations. A model that gives an acceleration of the surface temperature warming during the last three to four decades can be constructed on the basis of the GHG forcing. The use of models considering a number of forced signals, as well as an investigation of different sources of uncertainty, has shown that a considerable part of the recent warming can be attributed to the GHG influence. Better agreement of the observed data with model simulations for the period since 1850 A.D. can be achieved by including the anthropogenic sulfate aerosol forcing. All models produce a response pattern to the combined GHG and sulfate aerosol forcing that is detectable in the nineteenth to twentieth centuries surface temperature record. Because the sulfate aerosol forcing is negative and thus tends to reduce the climate system response, detection of the response to the combined forcing implies the presence of a GHG signal that is at least as strong as the combined signal. For a better fit between observations and reconstructions the climate models should also include the impact of deforestation (Bertrand et al., 2002).

Results of different investigations may vary depending on the details of the analysis, especially the timescales used. For example, the influence of anthropogenic climate forcing dominates on decadal scales; the response to volcanic activity can be better detected on the annual scale, while the incorporation of seasonal information helps in the detection of weaker solar signals (Jones et al., 2003).

(5) It is unlikely that detection studies have mistaken a natural signal for an anthropogenic one. To demonstrate an anthropogenic contribution to climate, it is necessary to rule out the possibility that the detection procedure has mistaken part or all of a natural signal for an anthropogenic change. While estimates of the amplitude of a single anthropogenic signal are generally consistent between different model signals and/or different approaches, joint estimates of the amplitudes of several signals may vary between models and approaches. Precise separation of the observed warming into human-induced and naturally forced components should be done with considerable care. On physical grounds, natural forcing is unlikely to account completely for the observed warming over the last three to four decades, given that the overall trend in natural forcing over most of the twentieth century is likely small or negative. Several studies have involved three or more components - the responses to GHG, sulfate aerosols, and natural (solar, volcanic, or volcanic and solar) forcing. All performed model simulations have detected a substantial GHG contribution over the last 50 years, although in one case the estimated GHG amplitude is inconsistent with observations. Thus, it is unlikely that the researchers have misidentified the solar signal completely as a GHG response. On the other hand, the uncertainty in the amplitude of the response to the natural forcing continues to contribute to the uncertainty in the strength of the anthropogenic signal.

(6) Natural factors may have contributed to the early century warming. Most of the discussion presented in this section has concerned the evidence for a human-induced influence on the late twentieth century climate. Numerical investigations (e.g., Moberg et al., 2005) have revealed a large natural variability in the past climate that likely continues. The observed global mean surface temperature record shows two main periods of warming. Some studies detect a solar influence on the surface temperature over the first five decades of the century (Figure 107), with perhaps a small additional warming due to the increase in GHG content, while others suggest that the early warming could be due to a combination of anthropogenic effects and a highly unusual internal variation. Thus, the early century warming could have occurred due to some combination of natural internal variability, changes in solar irradiance, and anthropogenic influence. The additional temperature rise characteristic of the second half of the century can most likely be attributed to a substantial warming due to the corresponding increase in GHG, partially offset by cooling due to aerosols, and perhaps by cooling due to natural factors toward the end of the century.

In the investigations described above, detection and attribution of the causes of the recent climate change were performed by comparison of observed SAT changes with model simulations. Could a similar detection/attribution procedure be performed using borehole data? Because the ground acts as a low-pass filter attenuating the subsurface climate signal with depth and time, the resolution of borehole data decreases into the past (Section 2.4.3, Chapter 2). Thus, reconstructed GST histories cannot be directly compared with models. Recently, Beltrami et al. (2006) suggested a new approach to perform qualitative detection and attribution of the GST changes. The basic steps of the approach developed by the authors are: (1) the use of the simulated model output as a forcing function at the surface (surface boundary condition), (2) the solution of the forward problem to obtain an artificial temperature-depth profile, and (3) the comparison of the calculated perturbation with the measured borehole temperature log. Thus, the basic idea of the method by Beltrami et al. (2006) is to propagate the modeled surface temperature into the underground and to compare the expected disturbance with the actually measured perturbation. The comparison may be performed by the traditional statistical techniques used for detection and attribution of SAT signals.
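Step (2) of this scheme, the forward problem, can be sketched with the classical 1-D conductive half-space solution: a surface temperature step of amplitude dT applied t years before present perturbs the temperature at depth z by dT * erfc(z / (2*sqrt(kappa*t))). The diffusivity and the step history below are illustrative assumptions, not values from Beltrami et al. (2006).

```python
from math import erfc, sqrt

KAPPA = 1.0e-6          # thermal diffusivity, m^2/s (typical for crustal rocks)
YEAR = 3.15576e7        # seconds per year

def synthetic_anomaly(z, steps):
    """Subsurface anomaly at depth z (m) from piecewise-constant GST steps.

    Each step (t_yr, dT) - a surface change of dT kelvin applied t_yr years
    before present - contributes dT * erfc(z / (2*sqrt(kappa*t))) at depth z.
    """
    return sum(dT * erfc(z / (2.0 * sqrt(KAPPA * t_yr * YEAR)))
               for t_yr, dT in steps)

# e.g. 0.5 K of warming 100 years ago plus a further 0.5 K 30 years ago
steps = [(100.0, 0.5), (30.0, 0.5)]
profile = [synthetic_anomaly(z, steps) for z in range(0, 500, 10)]
```

The resulting anomaly equals the full surface warming at z = 0 and decays monotonically with depth, which is why recent warming appears as curvature in the uppermost one to two hundred meters of a temperature log.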

The authors tested the suggested method using three different model simulations and the temperature logs from numerous boreholes in the territory of Canada. Long-term paleoclimate simulations were performed with the ECHO-g GCM ( projects/soap/data/model/echog.htm). The model setup included coupled atmosphere-ocean interactions and simulated the evolution of the climate system for the period 1000-2000 A.D. The simulations took into account external forcing factors such as solar activity and volcanism as well as the atmospheric CO2 and CH4 concentrations (details can be found on co-author J. Fidel Gonzalez-Rouco's web site; .html). Ground temperatures, which were further used as the surface boundary condition for the calculation of artificial temperature-depth profiles, were estimated from a five-layer soil model taking into account the influence of vegetation on evapotranspiration,9 run-off, as well as the fall, accumulation, and melting of snow (Beltrami et al., 2006). The authors modeled three 1000-year long GST records, namely a control run (CTRL) that took into account only external forcing and two forced simulations (FOR1 and FOR2) with the same external forcing but different anthropogenic influences. The dataset for testing contained 210 Canadian borehole temperature logs. Because of their irregular distribution, the data were arranged into four clusters: British Columbia/Yukon, Manitoba/Saskatchewan, Quebec/Ontario, and Atlantic Canada. The measured temperature logs exhibited significant differences, so for further interpretation the authors used averaged subsurface profiles. Figure 108 shows temperature logs measured in Atlantic Canada together with their averaged version and the SAT simulations performed with the CTRL, FOR1, and FOR2 versions of the ECHO-g model. Synthetic temperature-depth profiles were calculated by the 1-D half-space purely conductive forward technique using the above simulations as the surface boundary condition.

A comparison of the synthetic and average T-z profiles for all four regions is presented in Figure 109. Even a "by-eye" examination reveals that in three of the four investigated regions the profiles that contain both external and anthropogenic forcings are closer to the observed data than the control run that takes into account only external forcing. The external forcing alone can explain neither the curvature of the measured data in the 100-200 m depth range nor the recent warming. This fact hints that the temperature anomalies measured in Canadian boreholes are unlikely to have arisen from internal variability of the climate system and that the inclusion of the human-induced forcing is indispensable for their realistic explanation. The temperature-depth distribution from British Columbia/Yukon (Figure 109, top-left) is the only one that exhibits good agreement with all three simulated profiles. As shown, the subsurface temperature anomalies and/or the amount of the recent warming in this area are somewhat lower than those in the other investigated regions.
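One simple way to quantify such a "by-eye" comparison is the root-mean-square misfit between the averaged observed anomaly profile and each synthetic run. The profiles below are hypothetical illustrative numbers, not the Canadian data of Beltrami et al. (2006).

```python
from math import sqrt

def rms_misfit(observed, synthetic):
    """Root-mean-square difference between two temperature-depth profiles."""
    return sqrt(sum((o - s) ** 2 for o, s in zip(observed, synthetic))
                / len(observed))

obs  = [0.9, 0.7, 0.4, 0.2, 0.1]      # averaged observed anomaly, K
ctrl = [0.2, 0.15, 0.1, 0.05, 0.0]    # control run (external forcing only)
forc = [0.85, 0.65, 0.45, 0.2, 0.05]  # forced run (external + anthropogenic)

# In this toy case the forced run sits much closer to the observations
# than the control run, mirroring the qualitative result in the text.
misfit_ctrl = rms_misfit(obs, ctrl)
misfit_forc = rms_misfit(obs, forc)
```

A smaller misfit for the forced simulation than for the control run supports the inclusion of anthropogenic forcing, in the same spirit as the qualitative comparison of Figure 109.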

In summary, it can be concluded that the qualitative comparison of borehole measurements with GCM output for the detection/attribution of anthropogenic changes has shown that, similarly to the SAT records, underground temperatures are sensitive to the surface forcing and thus can be used for discovering the causes of the recent warming as well as for improving the models for the simulation of temperature changes. As shown in

9Evapotranspiration represents the combined process of evaporation from the Earth's surface and transpiration from vegetation.
