The realization that small changes in Earth's orbit might have regular and predictable effects on climate originated just over a century and a half ago. It came about from a convergence of knowledge in two very different disciplines: the still-young science of geology and the somewhat older field of astronomy.
In the middle 1800s geologists first proposed that great, mile-high ice sheets had existed in large areas of the Northern Hemisphere and had disappeared in the not-too-distant past. These great masses of ice were called "sheets" because of their dimensions: thousands of miles in length and width and a mile or two in thickness, or just about the same relative proportions as the sheets on your bed.
The primary evidence for such vast covers of ice was the presence of long, curving ridges of rubble running for miles across the landscapes of northern North America and northern Europe. These ridges of jumbled debris, called moraines, contain everything from fine clay to sand, pebbles, cobbles, and boulders. For a long time, most scientists had believed them to be the result of the great biblical flood, which they imagined would have had the power to move even the largest debris across the landscape. The fresh, young-looking appearance of these ridges of rubble also seemed consistent with a biblically derived estimate that Earth was just under 6,000 years old.
But water tends to sort the material it carries by size, leaving deposits rich in sand and silt and clay in separate regions. In contrast, these piles of rubble were just that: everything from clay to boulders all heaped up together. Naturalists like Jean de Charpentier, who lived and worked near the Alps and the mountains of Norway, had noticed the same kinds of rubbly deposits lying at the edges of active glaciers in mountain valleys. Ice is nature's messiest housekeeper: it carries and pushes all sizes of sediment and dumps it in great heaps wherever it melts. Also found in northern and montane regions were large boulders called erratics that were totally unlike the underlying bedrock. It seemed obvious that these boulders had been plucked from bedrock in far-away areas, carried long distances, and dumped onto the landscape.
By the late 1830s Charpentier and others had convinced Swiss geologist Louis Agassiz that the deposits in the alpine valleys were created by mountain glaciers, as were the gouges and scars left where glaciers had ground loose rocks across the underlying bedrock. But Agassiz took a much larger step: he proposed that ice also accounted for the other bands of jumbled debris and the bedrock scars found all across northern North America and northern Europe. He claimed that great ice sheets had once covered large parts of these continents. At a time when few, if any, were even aware of the existence of the Greenland ice sheet and the massive Antarctic ice sheet had not yet been discovered, this notion seemed absurd.
Agassiz borrowed a term previously used by botanist Karl Schimper to describe this cold interval as an "ice age." Energetic and persuasive, he began a long campaign to convince the scientific community that these enormous sheets of ice had indeed existed. Although his challenge to the scientific wisdom of the time was widely resisted at first, little by little other prominent scientists went out into the field, looked at the evidence, and concluded that Agassiz must be basically right. By the 1860s Agassiz's campaign had largely succeeded.
But this important discovery immediately raised a troubling question. Even though the technique of radiocarbon dating would not be developed until the 1940s or widely used until the 1950s, geologists could tell simply from the fresh look of the sediments that the glacial deposits were not very old. Most rock deposits from Earth's long history are hard and compact, and they look ancient, even at a glance. These piles of debris were loose, crumbly, and much fresher looking, and therefore obviously much younger. One way to estimate the time since the glaciers had melted was to examine sediments in lakes within or near glaciated areas and count the number of annual layers called varves, which are alternating light-dark bands deposited each year. Records spliced together from sediments in many Scandinavian lakes suggested that the ice had melted between 12,000 and 6,000 years ago, an estimate that turned out not to be far off.
The fact that such enormous masses of ice had existed so recently posed a major challenge to geologists who were only beginning to accept the theory devised by James Hutton in the 1700s that Earth's surface features result from geologic processes that work over long intervals of time. If it really takes tens or hundreds of millions of years to build or erode a mountain range, how could such processes create or destroy ice sheets within just thousands of years and, in the case of the recent ice melt-back, within just the last few thousand years?
Meanwhile, other features that also looked very "young" and thus seemed to indicate recent changes in climate were being discovered elsewhere. In the late 1800s American explorers on horseback came into the Great Basin desert region of the Southwest. Throughout the region they found features that clearly marked the shorelines of former lakes that had once flooded these basins to levels well above those of modern lakes or basin floors. Surrounding modern-day Salt Lake City, geologist Grove Gilbert found fresh-looking beaches up to 300 meters (more than 900 feet) above the level of today's Great Salt Lake. These old shorelines were marked by notches cut into rock outcrops by wave action and by fresh-looking lakeshore sediments deposited in continuous, flat-lying terraces visible from miles away. The obvious implication was that an enormous lake must once have flooded the entire basin between the surrounding mountain ranges, covering some 50,000 square kilometers (20,000 square miles), almost the size of modern-day Lake Michigan, and inundating the site of present-day Salt Lake City. Smaller lakes, also much larger than those present today, were scattered across the basins of the American Southwest. Not very long ago, this entire region had been much wetter.
The earliest geologists and geographers who penetrated the interior of North Africa were finding similar evidence in and south of the Sahara Desert. Surrounding several basins where modern-day lakes are small or dried up entirely are the same kinds of wave-cut notches and sediment terraces, indicating that larger and deeper lakes must have existed in the recent past. Scattered remains of these older lake-bed sediments can be found in regions where the fierce Saharan winds have not yet blown them away. The full catalogue of young-looking deposits of this kind from around the world is long, but all of the examples point to the same basic conclusion: something has been causing enormous changes in climate on relatively short time scales, and that something cannot possibly be processes involved in Earth's slow tectonic changes.
Still another complication emerged. During the 1800s and early 1900s, it gradually became clear that large ice sheets and higher lake levels had existed more than once in the relatively recent past. In a few northern locations, the glacial debris heaped in moraine ridges lay on top of a layer of soil very much like the soils that typically accumulate today in temperate regions. Pollen grains found in these soils came from heat-adapted trees like oaks and hickories. This evidence indicated that a warmer (nonglacial) climate much like that today had existed before the ice sheet arrived and deposited the rubble. But then, lying underneath the older soil, scientists found another layer of glacial debris indicating another earlier glaciation in the same region. Explorers in now-arid regions also found evidence of more than one earlier interval of higher lake levels.
These discoveries meant not only that climate had been colder or wetter in the very recent past, but that it had switched back and forth more than once between the warm and cold or wet and dry states. A few geologists made brief but futile attempts to explain this evidence by invoking fluctuations in Earth's crust over intervals of a few tens of thousands of years, but these explanations convinced few.
As it turned out, the answer to this mystery was to be found not in Earth's internal geologic processes, but in its orbit in space. Centuries earlier, during the 1500s and 1600s, astronomers Nicolaus Copernicus, Johannes Kepler, and Galileo Galilei had discovered that Earth is not the center of the universe but a small planet held in an orbit around the Sun by the pull of gravity across 93 million miles of space. Similarly, Earth's gravity field holds the Moon in orbit some 242,000 miles away, just as it holds us securely right here on terra firma.
In time, other astronomers began to investigate another, more subtle effect of gravity—the influences on Earth of the combined gravitational tugs of the Sun and Moon and all the planets as Earth orbits the Sun. Gradually they came to learn that the secondary tugs of the planets, even though much smaller than the pull of the Sun, have important effects. Jupiter, 400 million miles away, is large enough to pull noticeably at Earth's orbit. Our moon, although relatively small, is close enough to tug on Earth. These gravitational pulls are aided by the fact that Earth's shape is not perfectly spherical but bulges out slightly at the equator, forming an irregularity on which the external gravitational attractions can act.
In 1842, just a few years after Agassiz's dramatic claim that ice sheets had once covered large parts of the Northern Hemisphere, astronomer Joseph Adhemar made a wonderfully imaginative mental leap that would eventually lead to a theory of the ice ages. Adhemar knew that some aspects of Earth's orbit had changed over intervals as short as a few tens of thousands of years. His brilliant "eureka" realization was this: changes in Earth's orbit should affect the amount of solar radiation reaching its surface, which in turn should have an impact on climate, including the appearance and disappearance of ice sheets. Although this wonderful insight proved correct, it was only a beginning, a conceptual basis on which to build by finding the actual mechanisms that link changes in Earth's orbit to changes in its climate.
One way to alter the Sun's effect on Earth's climate is to change its height in the sky. Summers today are warmer than winters in part because the Sun is higher in the daytime sky and provides more warmth, and winters are colder because the Sun is lower and its weak radiation provides less heat. By direct extension, anything that affects the Sun's height above the horizon over much longer time scales should affect long-term climate.
Of course, even though our Earthbound perspective makes it look as if the Sun is changing, in fact that is not the case. The real reason for the apparent changes in the Sun's elevation lies in Earth's orbit. Earth's rotational axis is tilted relative to its orbital path around the Sun. The angle of tilt remains constant at 23.5° throughout the yearly orbit, as does the direction in space toward which Earth is tilting or leaning. In Northern Hemisphere summer, Earth is in the part of its orbit where that hemisphere is tilted directly toward the Sun, and so the Sun appears high in the daytime sky. But by winter, Earth has reached the opposite side of its orbit, and its tilt is now directed away from the Sun, so the Sun appears lower in the sky. So the height of the Sun in the sky, and the amount of solar radiation it delivers, actually results from the interplay between Earth's axial tilt and its position in its annual orbit. Those early astronomers had already figured this out.
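The tilt geometry described above can be sketched with simple arithmetic. The rule of thumb that the noon Sun's elevation is roughly 90° minus the difference between the latitude and the solar declination, and the choice of 45° N as an example latitude, are illustrative assumptions added here, not details from the text:

```python
# Illustrative sketch: how Earth's 23.5-degree axial tilt sets the
# noon Sun's height, and hence seasonal heating, at a given latitude.
# Rule of thumb: noon elevation ~ 90 - |latitude - declination|,
# where the declination swings between +23.5 and -23.5 degrees
# as Earth moves around its orbit.

TILT = 23.5  # Earth's axial tilt, in degrees

def noon_elevation(latitude_deg: float, declination_deg: float) -> float:
    """Approximate elevation of the noon Sun above the horizon, in degrees."""
    return 90.0 - abs(latitude_deg - declination_deg)

# Example latitude (an assumption for illustration): 45 degrees N
summer = noon_elevation(45.0, +TILT)  # summer solstice: Sun high, strong heating
winter = noon_elevation(45.0, -TILT)  # winter solstice: Sun low, weak heating
print(summer, winter)  # 68.5 21.5
```

The 47-degree swing between the two solstice elevations is just twice the tilt angle, which is why the tilt, and any slow change in it, matters so much for climate.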