Part 4: Microbiologically Mediated Deterioration in Surface Water Supplies
Seasonal Temperature Effects
While temperature is acknowledged to be an important factor in water treatment, remarkably little research has examined the adverse influence of low temperatures on the effectiveness of physical treatment processes. An early study concluded that "there is no preventative or retarding effect on alum floc formation with low raw water temperatures."1
This conclusion prompted Camp et al. (1940)2 to further evaluate the effect of temperature on the rate of floc formation. Camp and his co-workers measured iron or aluminum directly in lieu of turbidity and concluded that temperature did not have a measurable effect on the time of floc formation.
However, pilot plant studies of aluminum sulfate-coagulated river water demonstrated the overall adverse effect of low temperatures on sedimentation and filtration.3 These researchers advised, "where raw water temperature is low, the jar tests must be run on samples held at the same temperature if results are to be used in plant control."
It was not until 1984 that a systematic evaluation of the adverse effects of low temperature on water treatment plant performance was undertaken.4 The investigators reported significant temperature effects on coagulation that accounted for observed decreases in turbidity removal efficiency, particularly when aluminum sulfate was used. Using decreases in alkalinity and measurements of the metal coagulant in solution, the researchers determined that the reduction in turbidity removal efficiency was not due to a reduced metal hydroxide precipitation rate but to retardation of floc growth. Under equivalent conditions, iron salts produced larger flocs than aluminum salts and resulted in lower residual turbidity values. The implications of these results became more significant in light of observed major temperature effects on organism removals.5
While low water temperatures have been shown to severely impair microorganism removals by physical water treatment processes,5, 6, 7, 8 microbiological violations due to coliform or heterotrophic plate count organisms are most commonly reported by U.S. water utilities during warmer (summer) months.9 At the same time, for 14 of the 21 large water distribution systems studied, finished water turbidities were lowest when water temperatures were highest. As a result, percent positive total coliform samples and finished water turbidity were either unrelated or inversely related.
Figure 1 illustrates the repetitive seasonal variation in finished water turbidity for Kansas City, Mo. While significant annual reductions in finished water turbidities from 0.8 to 0.2 ntu were achieved by the Kansas City Water Department, the recovery of positive total coliform samples from the distribution system, initially low, actually increased slightly during this five-year period.
Figure 2 compares the percent recovery of positive total coliform samples in summer and winter for several distribution systems. The marked seasonal differences indicate that the value of total coliform as an indicator of microbial contamination is severely compromised when water temperatures are low.
Those water utilities that used chloramine as a residual in their distribution systems recovered 8.6 times fewer positive total coliform samples than those using chlorine (Figure 3). Table 1 shows that percent positive coliform recoveries differed by a factor of more than 1,000 (7 percent to 0.006 percent) among the 21 distribution systems surveyed.
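The more-than-1,000-fold spread quoted from Table 1 follows directly from the two extremes given in the text; a minimal arithmetic check:

```python
# Extremes of percent-positive total coliform recoveries among the
# 21 distribution systems surveyed (values quoted in the text).
highest = 7.0     # percent positive
lowest = 0.006    # percent positive

factor = highest / lowest
print(f"spread between systems: {factor:,.0f}x")  # about 1,167x
```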
While the divergent seasonal variations in turbidity and total coliform are seemingly inconsistent, the data suggest that the penetration of increased numbers of organisms into the distribution system during cold weather is not evidenced by distribution sampling for total coliform, because this influx is not accompanied by the subsequent regrowth during distribution that occurs in summer. These utility data illustrate the fundamental weaknesses in current operational microbiological standards. They confirm the need for alternative methods to directly visualize and quantify source water microorganisms entering and migrating through the distribution system.10
Case History: Aftergrowth
A study of the Philadelphia (Pa.) Suburban Water Company (PSWC) by Donlan and Pipes (1988)11 provided an example of a distribution system experiencing aftergrowth. HPC populations (periphytic) that developed on cast iron test specimens suspended in PSWC water mains were measured and related to the distribution system (planktonic) HPC values. No statistically significant relationships were found between the population of HPC attached to the cast iron specimens and measured water quality parameters, including total organic carbon, ammonium ion, phosphate and pH.
However, temperature was found to have a major influence. Both the attached and distribution system (planktonic) HPC were most abundant at higher temperatures. The density of HPC organisms attached to the cast iron test specimens ranged from 100 cfu/cm2 at 5° C to 37,000,000 cfu/cm2 at 23° C. High attached HPC also corresponded to high planktonic HPC in the distribution system.
Direct microscopic total bacterial cell counts made of the bacteria scraped from some of the cast iron specimens were 3 to 134 times greater than the HPC enumerated. These results were consistent with studies that showed microscopic cell counts to be orders of magnitude greater than the number of HPC colonies obtained on culturing distributed drinking waters.12
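The 3- to 134-fold gap between microscopic counts and HPC implies that only a small, highly variable fraction of the attached cells formed colonies on culturing; a quick sketch using the two ratios from the text:

```python
# Direct microscopic counts were 3 to 134 times the HPC colony counts,
# so the implied culturable fraction (HPC / total cells) spans roughly
# two orders of magnitude.
for ratio in (3, 134):
    fraction_pct = 100.0 / ratio  # percent of counted cells recovered as colonies
    print(f"microscopic count = {ratio}x HPC -> ~{fraction_pct:.1f}% culturable")
```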
Within the PSWC distribution system, HPC increased markedly as water temperatures rose. Increased water temperatures (24° C) led to the more rapid depletion of chloramine. HPC was highest (8,318 cfu/ml) where the chloramine residual was almost totally depleted. In contrast, where chloramine was maintained at 1.3 g Cl/m3 (1.3 mg/l), only 10 cfu/ml were observed.
Case History: Regrowth
Although the 35° C standard pour plate (HPC) counts made in the Jefferson City, Mo., distribution system are not directly comparable to the 20° C spread plate counts measured in the PSWC system, it is evident that the smaller Jefferson City distribution system exhibited far lower and more uniform HPC populations throughout the year.5 The most significant difference between the Jefferson City and Philadelphia data was the effect of seasonal temperature change on the HPC. In the Jefferson City distribution system, HPC decreased as temperature increased, probably reflecting the significantly increased efficiency of total bacterial removals at the Jefferson City water treatment plant.
The results of the Jefferson City distribution system evaluation indicated that the HPC bacterial populations found throughout the distribution system remained close to the numbers (≈30 cfu/ml) discharged from the plant clear well. Chloramine residuals were maintained with little depletion during distribution while initially low finished water turbidities remained relatively unchanged. HPC declined at near-plant locations and increased slightly at remote locations.
From direct microscopic count, the Jefferson City study also showed that seasonally, readily measured numbers (10,000 to 1,000,000 cells/ml) of planktonic bacteria from the raw water passed through a well-operated, multi-stage, lime softening, rapid sand filtration plant meeting existing microbiological standards (turbidity, coliform). The number of bacterial cells that passed through the plant strongly correlated with the number found throughout the distribution system.
These results suggest that in a distribution system where bacteriostatic disinfectant residuals can be maintained throughout the year, HPC and bacterial cell count will vary primarily with the efficiency of total cell removal during treatment. Conversely, where the disinfectant residuals are depleted, both regrowth and aftergrowth can add to the microbial populations observed in the distribution system.
Perhaps most important from the standpoint of protection of public health, the optimization of physical treatment process performance for biotic particle removal would significantly reduce the number of microorganisms penetrating and propagating through the distribution system. Based on microscopic observations relative to bacterial removals, other larger organisms of concern in source waters would be removed with still greater efficiency.8
Naturally occurring particles in drinking water sources generally are mixtures of metal silicates, oxides, carbonates, sulfides, natural and anthropogenic organic debris, plus a range of microorganisms. Different biotic and abiotic particles, depending on size distribution, density, charge, shape and surface characteristics, may both scatter light (cause turbidity) and respond to coagulation and filtration differently.
In order to increase the removal of microorganisms, including pathogens, from surface water sources, efforts must be made to rapidly monitor and increase biotic particle removals through improved pretreatment (coagulation and sedimentation), particularly when water temperatures are low. Low temperatures decrease the rates of coagulant dissolution, precipitation, cell enmeshment and floc formation. In particular, reduced sedimentation rates impair microorganism removals. Improvements in coagulant additions (blending with warm water), mixing protocols and increased sedimentation periods should result in more complete total organism removals.
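The penalty that cold water imposes on sedimentation can be illustrated with Stokes' law, since the dynamic viscosity of water is roughly 50 percent higher at 5° C than at 20° C. The floc diameter and density below are hypothetical round numbers chosen for illustration, not values from the studies cited here:

```python
# Illustrative only: Stokes' law settling velocity for a small floc,
# showing how the higher viscosity of cold water slows sedimentation.
g = 9.81                # m/s^2
d = 100e-6              # floc diameter, m (hypothetical)
rho_p = 1050.0          # floc density, kg/m^3 (hypothetical)
rho_w = 1000.0          # water density, kg/m^3 (approx., both temperatures)
mu = {5: 1.519e-3, 20: 1.002e-3}  # dynamic viscosity of water, Pa*s

for temp_c, visc in mu.items():
    v = g * (rho_p - rho_w) * d**2 / (18.0 * visc)  # Stokes' law, m/s
    print(f"{temp_c:2d} C: settling velocity ~ {v * 1000:.3f} mm/s")

# The same floc settles about 34 percent slower at 5 C than at 20 C,
# which is one mechanism behind the impaired cold-weather removals.
```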
Recognition of the large populations of bacteria in raw and finished drinking waters ultimately should attract attention to questions regarding the fate of all biotic particles penetrating the treatment process and entering the distribution system. Their viability, their microbial ecology in the distribution system and the cycling of the microbial nutrients that they contain in their cells are key elements in understanding the deterioration of water quality in distribution systems.
In the future, when large numbers of organisms are found in any water distribution system, consideration must be given to the water source and treatment plant itself as their origin. It should not be assumed that they grew in the distribution system or were dislodged from the interior surfaces of distribution mains.