An Examination of Worst-Case DO in Waterways Below Point Sources Before and After the CWA

Chapter 2 discussed the evolution of the BOD measurement, the impact of BOD loadings on DO levels in natural waters, and the massive amount of public and private money invested in municipal wastewater treatment to meet the mandates of the CWA. Key conclusions from the first leg of the three-legged stool approach are:

• The nation's investment in building and upgrading POTWs significantly reduced BOD effluent loading to the nation's waterways.

• This reduction occurred in spite of a significant increase in influent BOD loading caused by an increase in population served by POTWs.

The second leg builds on the first by posing a follow-up question: Has the CWA's push to reduce BOD loading resulted in improved water quality in the nation's waterways? And, if so, to what extent? The key phrase in the question is "to what extent?" Earlier studies by Smith et al. (1987a, 1987b) and Knopman and Smith (1993) concluded that any improvements in DO conditions in the nation's waterways were detectable only at relatively local spatial scales downstream of wastewater discharges.

Perhaps the most noteworthy finding from national-level monitoring is that heavy investment in point-source pollution control has produced no statistically discernible pattern of increases in water's dissolved oxygen content during the last 15 years [1972-87]. . . . The absence of a statistically discernible pattern of increases suggests that the extent of improvement in dissolved oxygen is limited to a small percentage of the nation's total stream miles. This is notable because the major focus of pollution control expenditures under the act [CWA] has been on more complete removal of oxygen-demanding wastes from plant effluents.

—Knopman and Smith, 1993

The purpose of the second leg of this investigation is to examine evidence that may show that the CWA's municipal wastewater treatment mandates benefited water quality on a broad scale, as well as in reaches immediately downstream from POTW discharges. The systematic, peer-reviewed approach used in this investigation includes the following steps:

• Developing before- and after-CWA data sets composed of DO summary statistics derived from monitoring stations that were screened for worst-case conditions. The purpose of the screening exercise is to mine data that inherently contain a response "signal" linking point source discharges with downstream water quality.

• Calculating a worst-case DO summary statistic for each station for each before- and after-CWA time period and then aggregating station data at sequentially larger spatial scales (reaches, catalog units, and major river basins), as illustrated in the sketch following this list.

• Conducting an analysis of spatial units that have before- and after-CWA worst-case DO summary statistics and then documenting the direction (improvement or degradation) and magnitude of the changes in worst-case DO concentration.

• Assessing how the point source discharge/downstream DO signal changes over progressively larger spatial scales.
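The steps above amount to a small data-reduction pipeline. The sketch below illustrates the general idea in Python, assuming a tidy table of DO observations. The period boundaries, the use of July–September as a proxy for low-flow/high-temperature conditions, the 10th-percentile statistic, and all column and function names (station_id, do_mg_l, worst_case_do, compare_periods, and the spatial-unit columns) are illustrative assumptions, not the study's actual screening rules, which Section B describes.

```python
import pandas as pd

# Illustrative choices only; the actual screening rules are given in Section B.
SUMMER_MONTHS = {7, 8, 9}                 # proxy for high-temperature, low-flow conditions
PERIODS = {"before_CWA": (1961, 1965),    # assumed pre-CWA window
           "after_CWA": (1986, 1990)}     # assumed post-CWA window


def worst_case_do(obs: pd.DataFrame, q: float = 0.10) -> pd.Series:
    """Worst-case DO summary statistic per station: a low quantile of the
    summer-season DO observations (the 10th percentile is an assumed choice)."""
    summer = obs[obs["date"].dt.month.isin(SUMMER_MONTHS)]
    return summer.groupby("station_id")["do_mg_l"].quantile(q)


def compare_periods(obs: pd.DataFrame, spatial_unit: str) -> pd.DataFrame:
    """Aggregate station statistics to one spatial scale ('reach', 'catalog_unit',
    or 'basin'), keep only units with data in both periods, and report the
    direction and magnitude of the change in worst-case DO."""
    per_period = {}
    for label, (start, end) in PERIODS.items():
        window = obs[obs["date"].dt.year.between(start, end)]
        stats = worst_case_do(window).rename("wc_do").reset_index()
        # Attach each station's spatial unit, then average stations within it.
        lookup = obs[["station_id", spatial_unit]].drop_duplicates()
        per_period[label] = (stats.merge(lookup, on="station_id")
                                  .groupby(spatial_unit)["wc_do"].mean())
    paired = pd.DataFrame(per_period).dropna()    # require data in both periods
    paired["change_mg_l"] = paired["after_CWA"] - paired["before_CWA"]
    paired["direction"] = paired["change_mg_l"].map(
        lambda d: "improvement" if d > 0 else "degradation")
    return paired
```

Running compare_periods once per scale (reach, then catalog unit, then major river basin) mirrors the final bullet: the point source/downstream DO signal can then be examined as station data are averaged over progressively larger spatial units.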

Section A of this chapter provides background on the relationship between BOD loading and stream water quality and discusses the two key physical conditions (high temperature and low flow) that create "worst-case" conditions for DO. Section B describes the development and application of a set of screening rules to select, aggregate, and spatially assess before- and after-CWA DO data drawn from USEPA's STORET database. Section C presents the results of the comparison analysis of worst-case DO from before and after the CWA for reach, catalog unit, and major river basin scales. The chapter concludes with Section D, which provides the summary and conclusions for the second leg of this investigation.
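For readers who want the governing relationship behind Section A in compact form, the classic Streeter-Phelps oxygen-sag equation is sketched below. It is offered only as the textbook formulation of how a BOD load depresses downstream DO, not as the specific model applied in this investigation.

```latex
% Streeter-Phelps oxygen-sag equation (textbook form):
%   D(t) = DO deficit (saturation DO minus actual DO) after travel time t,
%   L_0  = ultimate BOD at the point of discharge, D_0 = initial DO deficit,
%   k_d  = deoxygenation rate, k_a = reaeration rate.
D(t) = \frac{k_d L_0}{k_a - k_d}\left(e^{-k_d t} - e^{-k_a t}\right) + D_0\, e^{-k_a t}
```

Low flow lengthens the travel time t and reduces dilution of the BOD load, while high temperature raises the deoxygenation rate and lowers the DO saturation concentration; together these effects make summer low-flow periods the worst-case conditions screened for in this chapter.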
