Blog Post created by John O'Connor on Jul 16, 2015


In the production of municipal drinking water, nothing is monitored more assiduously than turbidity.


Why is this so? Just what is turbidity? And why is this quality parameter so critical that it rates designation as a microbiological surrogate and primary drinking water standard?


Direct microscopic examination of drinking waters reveals that turbidity is caused by stuff - stuff that absorbs or scatters light. As you might imagine, light-scattering stuff may include silt, clay, cyanobacteria, precipitated carbonates, sulfides, metal oxides (e.g., rust and corrosion products), plant fibers and organic debris, microfloc, activated carbon 'fines', paint chips, nematodes, protozoans, cysts, bacteria, viruses, ... (Oddly, while they may sometimes be present in large numbers, bacterial cells are so translucent that they contribute little to the turbidity of either natural or treated drinking waters.)


The attached report, "The Effect of Lower Turbidity on Distribution System Water Quality" (AWWARF, 1993), includes analyses of sets of operational data from a broad range of major U.S. water utilities (Kansas City, MO; St. Louis, MO; St. Louis County (MO) Water Company; New Orleans, LA; Boston, MA; Baltimore, MD; New Haven, CT; Cleveland, OH; Louisville, KY; Dallas, TX; Phoenix, AZ; Oakland, CA; Los Angeles, CA; Metropolitan Water District of Southern California). From these data, the seasonal relationship between each utility's treated ('finished') water turbidity and the frequency of recovery of total coliform and heterotrophic plate count (HPC) organisms in monitoring samples from the water distribution system could be determined.
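To make the kind of analysis described above concrete, here is a minimal sketch of one simple way a utility might examine that seasonal relationship: pair monthly finished-water turbidity with the fraction of distribution-system samples positive for total coliform, then compute a correlation. The monthly figures below are entirely hypothetical, invented for illustration; they are not data from the report or from any utility.

```python
# Illustrative sketch only: hypothetical monthly averages for one utility,
# as (finished-water turbidity in NTU, fraction of coliform-positive
# distribution-system samples), January through December.
monthly = [
    (0.08, 0.002), (0.10, 0.004), (0.12, 0.005), (0.15, 0.008),
    (0.22, 0.015), (0.30, 0.024), (0.35, 0.030), (0.33, 0.028),
    (0.25, 0.018), (0.18, 0.010), (0.12, 0.006), (0.09, 0.003),
]

def pearson(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
    sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
    return cov / (sx * sy)

r = pearson(monthly)
print(f"turbidity vs. coliform-positive rate: r = {r:.2f}")
```

A strong positive correlation in such a plot would echo the report's theme: seasons with higher finished-water turbidity tend to coincide with more frequent coliform recovery in the distribution system. A real analysis would of course use the utilities' actual operational records rather than invented numbers.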


Of special interest, a comparison of data from the various utilities indicated a very distinct advantage in maintaining control over total coliform in the distribution system when chloramine was used as the residual disinfectant rather than chlorine.