
[Image: Virus.jpg]

How effective are water treatment processes in controlling human enteroviruses in a major water supply? In 1976, USEPA undertook a study to address this question at a location downstream from the waste discharges of Kansas City, Missouri.

One highlight of the study's results was the seasonality of the virus challenge. During the summer, the steady input of virus to the Missouri River appeared to decline en route to the study site at Lexington, Missouri. Unlike the pattern for bacterial indicators, most human enteroviruses were recovered during the winter months, when water temperatures were near freezing.

Cold-water periods are also the times when water treatment processes are least efficient. Chemicals dissolve slowly or incompletely. Due to increased viscosity, flocculation is less effective (unless paddle speeds are increased, which they rarely are). With increased water density and viscosity, sedimentation rates are greatly retarded. Filtration through granular media becomes relatively ineffective in removing planktonic cells. Even the rates of chemical disinfection are retarded.
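
As a back-of-the-envelope illustration (a sketch using Stokes' law, not a result from the study), consider why colder water retards settling. For a small, roughly spherical floc particle settling under laminar conditions, the terminal velocity is

v_s = \frac{g\,(\rho_p - \rho_w)\,d^2}{18\,\mu}

where d is the particle diameter, \rho_p and \rho_w are the particle and water densities, g is gravitational acceleration, and \mu is the dynamic viscosity of the water; the symbols here are generic, not values measured in the study. Because \mu roughly doubles between 25 °C (about 0.89 mPa·s) and 0 °C (about 1.79 mPa·s), the same floc settles at roughly half its warm-weather velocity near freezing, even before any change in floc size or density is considered.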

Finally, our conventional microbiological indicators (total coliforms, fecal coliforms, and heterotrophic plate counts) become irrelevant because they are inversely correlated with virus occurrence.

[Image: Chlorine.jpg]

Thirty years ago, USEPA was starting to assess the capabilities of various alternative drinking water treatment processes for reducing the consumer's exposure to trihalomethanes and organic carbon compounds. This pilot plant study, using Missouri River water, was one of the first to directly compare the use of chlorine versus chloramine for reducing the formation of trihalomethanes while maintaining disinfection capability. Over the ensuing generation, chloramine has become the dominant means for disinfection of surface waters throughout the United States.