In 1972, along with a graduate student assistant, I undertook a study of the removal of arsenic from drinking water. We focussed on the adsorption of arsenic (V) on aluminum and ferric hydroxides largely because this was consistent with the treatment provided in most conventional water treatment plants.
At the time, the MCL for arsenic was set at 50 micrograms per liter, and there were no well-established, widely available chemical means for determining arsenic in drinking water at or below such concentrations. Consequently, there were also no reported violations of the arsenic drinking water standard in the U.S.
To overcome these analytical difficulties, the study employed an arsenic radiotracer. This allowed quick and reproducible measurement of arsenic removal over several orders of magnitude of concentration.
The results of the studies, published in the Journal of the American Water Works Association in 1973, showed that arsenic could be readily removed to a high degree (> 90%) by conventional dosages of both iron and aluminum coagulants. Iron precipitates were found to be particularly effective in adsorbing or co-precipitating arsenic.
With the development of advanced analytical capabilities, the issue of arsenic in drinking water reemerged on the national and world scenes in dramatic fashion by the year 2000, leading both to the modification of regulations and to a renewed search for arsenic removal methods.
The attached series on Arsenic in Drinking Water examines some of the developments that occurred after the health effects of arsenic in drinking water were assessed.