Degradation of water quality related to oxidation of iron disulfide minerals associated with coal is a naturally occurring process that has been observed since the late seventeenth century, many years before the commencement of commercial coal mining in the United States. Disturbing coal strata during mining operations accelerates this natural deterioration of water quality by exposing greater surface areas of reactive minerals to the weathering effects of the atmosphere, hydrosphere, and biosphere. Degraded water quality in the temperate eastern half of the United States is readily detected because of the low mineralization of natural water. Maps are presented showing areas in the eastern United States where pH and concentrations of dissolved sulfate, total iron, and total manganese in water depart from background values and indicate effects of coal mining. Areas in the East most affected by mine drainage are in western Pennsylvania, southern Ohio, western Maryland, West Virginia, southern Illinois, western Kentucky, northern Missouri, and southern Iowa. Effects of coal mining on water quality in the more arid western half of the United States are more difficult to detect because of the high degree of mineralization of natural water; normal background concentrations of constituents are therefore not useful in evaluating effects of coal mine drainage on streams in the more arid West. Three approaches to reducing the effects of coal mining on water quality are: (1) exclusion of oxygenated water from reactive minerals, (2) neutralization of the acid produced, and (3) retardation of acid-producing bacterial populations in spoil material by application of detergents that do not produce byproducts requiring disposal. These approaches can be used to help prevent further degradation of water quality in streams by future mining.
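The acid generation summarized above can be sketched with the commonly cited pyrite-oxidation reaction sequence; the equations below assume the iron disulfide mineral is pyrite (FeS2) and, for the neutralization approach, a carbonate source such as limestone (CaCO3), neither of which is specified in the abstract itself.

\begin{align*}
\mathrm{FeS_2} + \tfrac{7}{2}\,\mathrm{O_2} + \mathrm{H_2O} &\rightarrow \mathrm{Fe^{2+}} + 2\,\mathrm{SO_4^{2-}} + 2\,\mathrm{H^+} && \text{(direct oxidation of pyrite by oxygen)} \\
\mathrm{Fe^{2+}} + \tfrac{1}{4}\,\mathrm{O_2} + \mathrm{H^+} &\rightarrow \mathrm{Fe^{3+}} + \tfrac{1}{2}\,\mathrm{H_2O} && \text{(ferrous to ferric iron; bacterially catalyzed)} \\
\mathrm{Fe^{3+}} + 3\,\mathrm{H_2O} &\rightarrow \mathrm{Fe(OH)_3} + 3\,\mathrm{H^+} && \text{(hydrolysis and precipitation of ferric hydroxide)} \\
\mathrm{FeS_2} + 14\,\mathrm{Fe^{3+}} + 8\,\mathrm{H_2O} &\rightarrow 15\,\mathrm{Fe^{2+}} + 2\,\mathrm{SO_4^{2-}} + 16\,\mathrm{H^+} && \text{(oxidation of pyrite by ferric iron)} \\
\mathrm{CaCO_3} + 2\,\mathrm{H^+} &\rightarrow \mathrm{Ca^{2+}} + \mathrm{H_2O} + \mathrm{CO_2} && \text{(neutralization of acid by limestone)}
\end{align*}

Reactions of this kind indicate why the three approaches listed above target, respectively, access of oxygenated water to reactive minerals, the acid itself, and the iron-oxidizing bacteria that accelerate the ferrous-to-ferric step.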