Satellite imagery forms one of the basic tools for remote sensing. The types of satellite images available to the geologist, environmental scientist, and others are expanding rapidly, and only the most common in use are discussed here.
The Earth Resources Technology Satellite (ERTS-1), the first unmanned digital imaging satellite, was launched on July 23, 1972. Four other satellites from the same series, later named Landsat, were launched at intervals of a few years. The Landsat spacecraft carried a Multi-Spectral Scanner (MSS), a Return Beam Vidicon (RBV), and later, Thematic Mapper (TM) imaging systems.
Landsat Multi-Spectral Scanners produce images representing four different bands of the electromagnetic spectrum. The four bands are designated band 4 for the green spectral region (0.5 to 0.6 microns); band 5 for the red spectral region (0.6 to 0.7 microns); band 6 for the near-infrared region (0.7 to 0.8 microns); and band 7 for another near-infrared region (0.8 to 1.1 microns).
Radiation reflectance data from the four scanner channels are converted first into electrical signals, then into digital form for transmission to receiving stations on Earth. The recorded digital data are reformatted into what we know as computer compatible tapes (CCT) and/or converted at special processing laboratories to black-and-white images. These images are recorded on four black-and-white films from which photographic prints are made in the usual manner.
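The conversion from analog channel readings into digital form for transmission can be sketched as a simple quantization step. The 6-bit depth and the 0-1 volt signal range below are illustrative assumptions, not the actual MSS electrical specifications:

```python
def quantize(signal, bits=6, v_max=1.0):
    """Map an analog sensor voltage (0..v_max) to an integer digital number (DN)."""
    levels = 2 ** bits                              # 64 levels for 6-bit data
    dn = int(signal / v_max * (levels - 1) + 0.5)   # round to the nearest level
    return max(0, min(levels - 1, dn))              # clamp to the valid DN range

# A sample of analog channel readings (volts), digitized for downlink
voltages = [0.00, 0.13, 0.50, 0.97, 1.00]
dns = [quantize(v) for v in voltages]
print(dns)  # [0, 8, 32, 61, 63]
```

The digital numbers, not the analog voltages, are what get written to the computer compatible tapes and later mapped to gray levels on film.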
The black-and-white images of each band provide different sorts of information because each of the four bands records a different range of radiation. For example, the green band (band 4) most clearly shows underwater features because light with wavelengths in the green region of the visible spectrum is able to penetrate shallow water, and is therefore useful in coastal studies. The two near-infrared bands, which measure the reflectance of the Sun's rays outside the sensitivity of the human eye (visible range), are useful in the study of vegetation cover.
When these black-and-white bands are combined, false-color images are produced. For example, in the most popular combination of bands 4, 5, and 7, red is assigned to the near-infrared band 7 (and green and blue to bands 5 and 4, respectively). Vegetation appears red because plant tissue is one of the most highly reflective materials in the infrared portion of the spectrum; thus, the healthier the vegetation, the redder it appears in the image. Because water absorbs nearly all infrared rays, clear water appears black on band 7. Therefore, one cannot use this band to study features beneath water even in very shallow coastal zones, but it is useful in delineating the contact between water bodies and land areas.
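The compositing step can be sketched in a few lines: each band is a grayscale grid of brightness values, and the false-color image simply stacks three of them into red, green, and blue channels. The pixel values, the 0-255 range, and the particular band-to-channel assignment below (NIR band 7 driving red) are illustrative:

```python
# Each MSS band arrives as a grayscale grid of digital numbers (0-255 here
# for simplicity). One common assignment: band 7 -> red, band 5 -> green,
# band 4 -> blue.
band4 = [[ 10,  40], [200,  90]]   # green-light reflectance
band5 = [[ 20,  30], [180,  80]]   # red-light reflectance
band7 = [[240, 250], [  5,  60]]   # near-infrared reflectance

def false_color(b_red, b_green, b_blue):
    """Merge three single-band grids into one grid of (R, G, B) pixels."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(b_red, b_green, b_blue)
    ]

rgb = false_color(band7, band5, band4)
# Pixel (0, 0) is NIR-bright: healthy vegetation renders as strong red
print(rgb[0][0])  # (240, 20, 10)
```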
The geologic mapping community originally was most interested in flying the RBV, since it offered better geometric accuracy and ground resolution (130 feet; 40 m) than was available from the MSS (260 feet; 80 m), with which the RBV shared space on Landsats 1, 2, and 3. The RBV system contained three cameras that operated in different spectral bands: blue-green, green-yellow, and red-infrared. Each camera contained an optical lens, a shutter, the RBV sensor, a thermoelectric cooler, deflection and focus coils, erase lamps, and the sensor electronics. The three RBV cameras were aligned in the spacecraft to view the same 70-square-mile (185-km²) ground scene as the MSS of Landsat. Although the RBV is not in operation today, images are available and can be utilized in mapping.
The TM is a sensor first carried on Landsats 4 and 5, with seven spectral bands covering the visible, near-infrared, and thermal infrared regions of the spectrum. With a ground resolution of 100 feet (30 m), the TM was designed to satisfy more demanding performance parameters, drawing on experience gained from operation of the MSS.
The seven spectral bands were selected for their band passes and radiometric resolutions. For example, band 1 of the TM coincides with the maximum transmissivity of water and demonstrates coastal water-mapping capabilities superior to those of the MSS; it is also useful for differentiating coniferous from deciduous vegetation. Bands 2-4 cover the spectral region most significant for characterizing vegetation. Band 5 readings allow estimation of vegetation and soil moisture, and thermal mapping in band 6 allows estimation of plant transpiration rates. The choice of band 7 was motivated primarily by geological applications, including the identification of rocks altered by percolating fluids during mineralization. The band profiles, which are narrower than those of the MSS, are specified with stringent tolerances, including steep slopes in spectral response and minimal out-of-band sensitivity.
Geologic studies commonly use TM band combinations of 7 (2.08-2.35 μm), 4 (0.76-0.90 μm), and 2 (0.50-0.60 μm), because this combination can discriminate features of interest such as soil moisture anomalies, lithological variations, and, to some extent, the mineralogical composition of rocks and sediments. Band 7 is typically assigned to the red channel, band 4 to green, and band 2 to blue. This procedure results in a color composite image; the color of any given pixel represents a combination of the brightness values of the three bands. With the full 8-bit dynamic range of the sensors, there are 16.77 × 10⁶ possible colors. By convention, this false-color combination is referred to as TM 742 (RGB). In addition to the TM 742 band combination, geologists sometimes use the thermal band (TM band 6; 10.4-12.5 μm) because it contains information potentially relevant to hydrogeology.
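The 16.77 × 10⁶ figure follows directly from the channel arithmetic, assuming 8-bit quantization per band (256 brightness levels per channel):

```python
bits_per_band = 8            # each TM band recorded as 8-bit digital numbers
levels = 2 ** bits_per_band  # 256 possible brightness values per channel
colors = levels ** 3         # three independent channels: R, G, and B
print(colors)                # 16777216, i.e., about 16.77 x 10^6
```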
The French Système pour l'Observation de la Terre (SPOT) obtains data from a series of satellites in a sun-synchronous 500-mile- (830-km-) high orbit with an inclination of 98.7°. The Centre National d'Études Spatiales (CNES) designed the SPOT system, and French industry, in association with partners in Belgium and Sweden, built it. Like the American Landsat, SPOT consists of remote-sensing satellites and ground receiving stations. The imaging is accomplished by two High-Resolution Visible (HRV) instruments that operate in either a panchromatic (black-and-white) mode for observation over a broad spectrum or a multispectral (color) mode for sensing in narrow spectral bands. The ground resolutions are 33 and 66 feet (10 and 20 m), respectively. For viewing directly beneath the spacecraft, the two instruments can be pointed to cover adjacent areas. By pointing a mirror that directs ground radiation to the sensors, observation of any region within 280 miles (450 km) of the nadir is possible, thus
[Figure: Part of the electromagnetic spectrum, showing the relationship between wavelength, frequency, and nomenclature for electromagnetic radiation with different characteristics, from gamma rays and X-rays through the ultraviolet, visible, infrared (near, mid, far), and radio regions (© Infobase Publishing)]
allowing the acquisition of stereo photographs for three-dimensional viewing and imaging of scenes as frequently as every four days.
Radar is an active form of remote sensing, where the system provides a source of electromagnetic energy to illuminate the terrain. The energy returned from the terrain is detected by the same system and is recorded as a digital signal that is converted into images. Radar systems can be operated independently of light conditions and can penetrate cloud cover. A special characteristic of radar is the ability to illuminate the terrain from an optimum position to enhance features of interest.
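The basic measurement behind active radar sensing can be sketched with the standard two-way travel relation: the system times the round trip of its own pulse and converts delay to distance. The delay value below is illustrative:

```python
C = 299_792_458.0  # speed of light, m/s

def slant_range(echo_delay_s):
    """Distance to the target from the round-trip time of the radar pulse."""
    return C * echo_delay_s / 2.0

# A pulse that returns 5.34 ms after transmission came from roughly 800 km away
print(round(slant_range(5.34e-3) / 1000))  # 800 (km)
```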
Airborne radar imaging has been extensively used to reveal land surface features. However, until recently it has not been suitable for use on satellites because: (1) power requirements were excessive; and (2) for real-aperture systems, the azimuth resolution at the long slant ranges of spacecraft would be too poor for imaging purposes. The development of new power systems and radar techniques has overcome the first problem and synthetic-aperture radar systems have remedied the second.
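The azimuth-resolution problem for real-aperture systems, and the synthetic-aperture remedy, can be illustrated with the standard textbook relations: a real aperture resolves roughly range × wavelength / antenna length, while SAR processing achieves roughly half the antenna length regardless of range. The antenna length and slant range below are assumed values, not those of any particular mission:

```python
wavelength  = 0.235   # m, L-band (23.5 cm)
antenna     = 10.0    # m, physical antenna length (assumed)
slant_range = 800e3   # m, typical spacecraft slant range (assumed)

# Real-aperture azimuth resolution degrades linearly with range:
real_res = slant_range * wavelength / antenna
# Synthetic-aperture processing makes it range-independent (~D/2):
sar_res = antenna / 2.0

print(round(real_res))  # 18800 m -- far too coarse for imaging from orbit
print(sar_res)          # 5.0 m
```

The several-thousand-fold gap between the two numbers is why satellite radar imaging had to wait for synthetic-aperture techniques.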
The first flight of NASA's Shuttle Imaging Radar (SIR-A) in November of 1981 acquired images of a variety of features including faults, folds, outcrops, and dunes. Among the revealed features are the sand-buried channels of ancient river and stream courses in the Western Desert of Egypt. The second flight, SIR-B, had a short life; however, the more advanced and higher-resolution SIR-C was flown in April 1994 (and was again utilized in August 1994). The SIR-C system measures both horizontal and vertical polarizations simultaneously at two wavelengths: L-band (23.5 cm) and C-band (5.8 cm). This provides dual-frequency and dual-polarization data, with a swath width between 18 and 42 miles (30 and 70 km), yielding precise data with large ground coverage.
Different combinations of polarizations are used to produce images showing much more detail about surface geometric structure and subsurface discontinuities than a single-polarization-mode image. Similarly, different wavelengths are used to produce images showing different roughness levels, since radar brightness is most strongly influenced by objects comparable in size to the radar wavelength; hence, the shorter-wavelength C-band increases the perceived roughness.
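One standard way to quantify "roughness relative to wavelength" is the Rayleigh criterion, commonly used in radar geology though not named in the text: a surface scatters like a smooth plane when its relief stays below roughly λ/(8 cos θ), where θ is the incidence angle. A quick sketch for the two SIR-C wavelengths:

```python
import math

def smooth_height_limit(wavelength_cm, incidence_deg):
    """Rayleigh criterion: maximum RMS surface height (cm) that still looks
    'smooth' (radar-dark) at the given incidence angle."""
    return wavelength_cm / (8.0 * math.cos(math.radians(incidence_deg)))

# SIR-C wavelengths at an illustrative 30-degree incidence angle
for name, wl in [("L-band", 23.5), ("C-band", 5.8)]:
    print(name, round(smooth_height_limit(wl, 30.0), 2), "cm")
# L-band 3.39 cm, C-band 0.84 cm
```

A surface with, say, 2 cm of relief thus appears smooth (dark) to L-band but rough (bright) to C-band, which is why the shorter wavelength increases perceived roughness.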
Interpretation of a radar image is not intuitive. The mechanics of imaging and the measured characteristics of the target are significantly different at microwave wavelengths than at the more familiar optical wavelengths. Hence, the possible geometric and electromagnetic interactions of the radar waves with anticipated surface types have to be assessed before the images are examined. In decreasing order of effect, these qualities are surface slope, incidence angle, surface roughness, and the dielectric constant of the surface material.
Radar is uniquely able to map the geology at the surface and, in dry desert environments, up to a maximum of 30 feet (10 m) below the surface. Radar images are most useful for mapping structural and morphological features, especially fractures and drainage patterns, as well as the texture of rock types, in addition to revealing sand-covered paleochannels. The information contained in radar images complements that in TM images and eliminates the limitations of Landsat when only sporadic measurements can be made; radar sensors have the ability to "see" at night and through thick cloud cover since they are active rather than passive sensors.
RADARSAT is an Earth observation satellite developed by Canada, designed to support research on both environmental change and resource development. It was launched in 1995 on a Delta II rocket with an expected life span of five years. RADARSAT operates with an advanced radar sensor called Synthetic Aperture Radar (SAR). The synthetic aperture increases the effective resolution of the imaged area by means of an antenna design in which the spatial resolution of a large antenna is synthesized by multiple sampling from a small antenna. RADARSAT's SAR provides its own microwave illumination and thus can operate day or night, regardless of weather conditions; the resulting images are not affected by clouds, fog, smoke, or darkness. This provides significant advantages in viewing under conditions that preclude observation by optical satellites. Using a single-frequency, horizontally polarized C-band signal with a 2-inch (5-cm) wavelength, the RADARSAT SAR can shape and steer its radar beam to image swaths between 20 and 300 miles (35 and 500 km) wide, with resolutions from 33 to 330 feet (10 to 100 m), respectively. Incidence angles can range from less than 20° to more than 50°.
The Space Shuttle orbiters can reach various altitudes, which allows selection of the required photographic coverage. A camera specifically designed for mapping the Earth from space using stereo photographs was first flown in October 1984 on the Space Shuttle Challenger Mission 41-G. It used an advanced, purpose-built system to obtain mapping-quality photographs from Earth orbit. This system consisted of the Large Format Camera (LFC) and the supporting Attitude Reference System (ARS). The LFC derives its name from the size of its individual frames, which are 26 inches (66 cm) in length and 9 inches (23 cm) in width. The 992-pound (450-kg) camera has a 12-inch (305-mm) f/6 lens with a 40° × 74° field of view. The film, which is three-fourths of a mile (1,200 m) in length, is driven by a forward-motion compensation mechanism as it is exposed on a vacuum plate, which keeps it perfectly flat. The spectral range of the LFC is 400 to 900 nanometers, and its photo-optical ground resolution ranges from 33 to 66 feet (10 to 20 m) from an altitude of 135 miles (225 km) over the 34,200-square-mile (57,000-km²) area covered by each photograph.
The ARS is composed of two cameras with normal axes that take 35-millimeter photographs of star fields at the same instant the LFC takes a photograph of the Earth's surface. The precisely known positions of the stars allow the calculation of the exact orientation of the Shuttle orbiter, and particularly of the LFC in the Shuttle cargo bay. These accurate orientation data, together with the LFC characteristics, allow the location of each frame to an accuracy of less than two-thirds of a mile (1 km) and the making of topographic maps of photographed areas at scales of up to 1:50,000.