determining the concentration of oxygen-demanding waste. Consequently, periods of low flow in the stream channel yield the highest concentrations of BOD.
The combination of unnaturally high levels of BOD inputs, high water temperature, and low stream flow creates worst-case DO levels in streams and, in turn, the most critical conditions for the survival of aquatic organisms, that is, conditions of increased oxygen demand, low oxygen solubility, and low dilution potential. Fortunately, worst-case conditions do not occur all the time. Although the BOD loading component tends to remain relatively stable over the course of a year, there are usually distinct seasonal variations in temperature and rainfall (directly related to flow). On an annual basis in the contiguous United States, the highest water temperatures and minimal flow levels usually occur from early summer to late fall. Therefore, the months of July through September are generally considered "worst-case" months for DO.
Observations of year-to-year variations in climate reveal that many areas on the earth, including the United States, experience runs of wet and dry years, a phenomenon known as persistence. The short time frame of historical record-keeping makes it difficult for scientists to predict exactly when these wet and dry year cycles will occur; however, more than 100 years of rainfall data have proven that they are not uncommon. Importantly, persistence tends to have a cumulative effect on stream conditions. Therefore, the worst-case scenario for DO in waterways from a temporal perspective can be further refined to include the months of July through September (worst-case months) during a run of dry years (worst-case persistence).
Defining the periods of years before and after the CWA that represent worst-case persistence was accomplished in three steps. In the first step, USGS flow data from approximately 5,000 gages with more than 20 years of record during the period 1951-1980 were used to calculate long-term mean summer flow for July through September. The mean flow for each gage was then normalized as runoff (cfs per square mile) over the drainage area contributing flow at the gage. The normalized runoff data for each gage were spatially interpolated to determine the long-term mean summer runoff for each catalog unit (see Figure F-1 in Appendix F). Summer mean flows for each gage were then computed for each year from 1961 through 1995 and classified as "dry," "normal," or "wet" by calculating the ratio of each summer's flow to the long-term (1951-1980) summer mean flow for that gage. Years with ratios less than 0.75 were considered dry; normal years had ratios from 0.75 to 1.5; and wet years had ratios greater than 1.5.
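The gage-level calculations in this first step can be sketched as follows. The classification thresholds (0.75 and 1.5) are taken from the text; the function names and example numbers are illustrative, not part of the original methodology.

```python
def classify_summer(yearly_mean_cfs: float, longterm_mean_cfs: float) -> str:
    """Classify one summer as dry/normal/wet by its ratio to the
    long-term (1951-1980) mean summer flow, per the report's thresholds."""
    ratio = yearly_mean_cfs / longterm_mean_cfs
    if ratio < 0.75:
        return "dry"
    elif ratio <= 1.5:
        return "normal"
    return "wet"

def normalized_runoff(mean_summer_cfs: float, drainage_area_sq_mi: float) -> float:
    """Normalize mean summer flow as runoff (cfs per square mile)
    over the drainage area contributing flow at the gage."""
    return mean_summer_cfs / drainage_area_sq_mi

# A summer at 4.5 times the long-term mean (as at St. Paul in 1993,
# where the long-term mean is 10,658 cfs) classifies as wet:
print(classify_summer(4.5 * 10658, 10658))  # -> wet
```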
Figure 3-3 illustrates how widely mean summer flow can vary over time. The figure displays USGS gage data from the Upper Mississippi River at St. Paul, Minnesota, for the years 1960 through 1995. The left vertical axis shows streamflow in thousands of cubic feet per second (cfs); the right vertical axis shows the ratio of each year's summer flow to the long-term mean (10,658 cfs). The benchmark ratio of 0.75, which distinguishes dry from normal years, is represented by the dashed horizontal line. The graph shows that dry summers with low flow occurred at St. Paul in 1961, 1970, 1976, 1980, and 1987-1989. The data from this gage also show the extremely wet conditions that occurred primarily in response to the "Great Flood of 1993"; that year, the mean summer flow was about 4.5 times the long-term mean summer flow.
For the second step, a sliding-window algorithm was used to weight and interpolate the normalized streamflow ratios of the multiple gages within a catalog unit. The outcome was a weighted streamflow ratio assigned to each catalog unit for each year from 1961 through 1995. As at the gage scale, the catalog unit-scale streamflow ratio was used to classify catalog units into dry (< 0.75), normal (0.75-1.5), and wet (> 1.5) years.
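The report does not spell out the weighting scheme used in this second step, so the sketch below assumes a simple weighted average of per-gage ratios (here weighted by drainage area) as a stand-in for the actual sliding-window interpolation; the weights, gage values, and function names are hypothetical.

```python
def catalog_unit_ratio(gage_ratios: list[float], gage_weights: list[float]) -> float:
    """Combine per-gage streamflow ratios into a single catalog-unit ratio.
    ASSUMPTION: a weighted average; the report's exact scheme is not specified."""
    total_weight = sum(gage_weights)
    return sum(r * w for r, w in zip(gage_ratios, gage_weights)) / total_weight

def classify_catalog_unit(ratio: float) -> str:
    """Apply the same dry/normal/wet thresholds used at the gage scale."""
    if ratio < 0.75:
        return "dry"
    elif ratio <= 1.5:
        return "normal"
    return "wet"

# Two hypothetical gages in one catalog unit for a single summer,
# weighted by drainage area (square miles):
ratio = catalog_unit_ratio([0.60, 0.90], [300.0, 100.0])  # -> 0.675
print(classify_catalog_unit(ratio))  # -> dry
```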
The third and final step in defining the periods of worst-case dry persistence before and after the CWA involved grouping the 35-year period from 1961 to 1995 into consecutive 5-year "time-blocks." For each catalog unit, the number of years within each time-block during which the catalog unit-scale streamflow ratio fell below 0.75 (i.e., dry) was then determined. Rather than the seemingly obvious 5-year time-block of 1966-1970, 1961-1965 was selected to represent conditions "before" the 1972 CWA, while 1986-1990 was used to characterize conditions "after" the CWA.
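This third step can be sketched as a count of dry years per 5-year block. The block boundaries and the 0.75 threshold come from the text; the per-year ratios below are invented for illustration.

```python
def dry_years_per_block(yearly_ratios: dict[int, float],
                        start: int = 1961, end: int = 1995) -> dict[int, int]:
    """For one catalog unit, return {block_start_year: count of dry years
    (ratio < 0.75)} over consecutive 5-year time-blocks from start to end."""
    counts = {}
    for block_start in range(start, end + 1, 5):
        years = range(block_start, block_start + 5)
        counts[block_start] = sum(
            1 for y in years if yearly_ratios.get(y, 1.0) < 0.75
        )
    return counts

# Hypothetical catalog unit that was dry in 1961-1963 and again in 1987-1989:
ratios = {y: 1.0 for y in range(1961, 1996)}
ratios.update({1961: 0.6, 1962: 0.7, 1963: 0.5, 1987: 0.6, 1988: 0.4, 1989: 0.7})
counts = dry_years_per_block(ratios)
print(counts[1961], counts[1986])  # -> 3 3
```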
Widespread drought conditions, a critical factor for "worst-case" water quality conditions, occurred in 1961-1966, 1987, and 1988 in the Northeast, Middle Atlantic, Midwest, and Central states. Drought conditions thus occurred during both of these "before and after" 5-year time-blocks of record (i.e., 1961-1965 and 1986-1990). The widespread extent of drought conditions during the "before and after" time-blocks is shown in Figure 3-4 with maps of normalized streamflow ratios computed for each catalog unit for 1963 and 1988.
For the 5-year time-block of 1961-1965, selected to represent before-CWA conditions, 1,923 (91 percent) of the 2,111 catalog units of the 48 contiguous states were characterized by at least one year of "dry" streamflow conditions. Similarly, for the 5-year time-block of 1986-1990 "after" the CWA, 1,776 (84 percent) of the 2,111 catalog units of the 48 contiguous states were characterized by at least one year of "dry" streamflow conditions. For the catalog units characterized as "dry," low-flow conditions occurred for a mean period of 2.5 years during 1961-1965 and 2.7 years during 1986-1990 (Figure 3-5). Hydrologic conditions for the summers of 1963 and