While pathogen reduction (log10) in specific sanitation technology unit processes has been the focus of the previous chapters in this section, this chapter will examine pathogen reduction and survival in complete treatment works. As noted in the original work by Feachem et al. (1983), significant health risks can be posed by the pathogens that survive treatment processes and are discharged in the final effluent. The survival of pathogens depends on the reduction characteristics of the individual unit treatment processes combined in full-scale treatment plants. For a given log10 reduction across a complete treatment works, higher influent pathogen concentrations, such as may be found in developing countries (due to higher prevalence of infection in the population and lower overall water usage), will lead to correspondingly higher effluent concentrations. Feachem et al. (1983) stated that effluent from complete treatment works using trickling filters or activated sludge processes will "still be heavily contaminated with pathogens". This section updates the discussion originally presented by Feachem et al. (1983) with recent quantitative data from complete treatment works using newer mechanized processes as well as more recent data from natural treatment systems, in combination with recent data on specific pathogen concentrations that were not available 35 years ago.
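The relationship between influent concentration, log10 reduction, and effluent concentration (survival) recurs throughout this chapter. A minimal sketch, using illustrative numbers rather than values from any cited study:

```python
import math

def effluent_concentration(influent, lrv):
    """Effluent concentration remaining after a given log10 reduction
    (same units as the influent concentration)."""
    return influent / 10 ** lrv

def log10_reduction(influent, effluent):
    """Log10 reduction value (LRV) computed from measured influent
    and effluent concentrations."""
    return math.log10(influent / effluent)

# The same 2.0 log10 reduction leaves 100 times more pathogens in the
# effluent when the influent is 100 times more concentrated
# (hypothetical values, organisms/L):
low = effluent_concentration(1e4, 2.0)   # 100 organisms/L survive
high = effluent_concentration(1e6, 2.0)  # 10,000 organisms/L survive
```

This is why identical treatment trains can pose very different public health risks in settings with different influent pathogen loads.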
The full-scale sanitation (wastewater treatment) systems examined here are separated into two sections covering the most common wastewater technologies used worldwide: mechanized systems that use activated sludge and its variants with advanced treatment, and natural treatment systems using wastewater stabilization ponds and anaerobic reactors combined with stabilization ponds in series. Only representative studies that have measured specific pathogens in full-scale wastewater treatment facilities are presented; fecal coliforms and E. coli are also included in those studies where their log10 reduction can be compared to that of specific pathogens. Finally, the reduction and survival data presented here are compared with the original results presented in Feachem et al. (1983) and those presented in the 2006 World Health Organization (WHO) guidelines for wastewater agricultural reuse (WHO, 2006).
Figure 1 shows conventional secondary treatment as it has been developed in Europe and the U.S. (Burton et al., 2014). It consists of pretreatment and primary sedimentation of organic suspended solids followed by aerobic treatment of the mostly soluble organic matter. The aerobic treatment can use suspended growth biological processes such as activated sludge, or attached growth processes such as trickling filters. All aerobic processes need secondary sedimentation to remove the biomass generated by the aerobic process. The primary and secondary sludge are typically treated (e.g., with anaerobic digestion) and dewatered before final disposal or reuse. Historically, secondary treatment plants did not use disinfection, although it is now mandatory in various countries and cities worldwide.
Figure 1. Conventional secondary treatment as developed historically and practiced today: a) with primary treatment and sludge stabilization; secondary treatment is most often activated sludge but also can be trickling filters; b) extended aeration activated sludge that does not require primary treatment and separate sludge stabilization.
Advanced (tertiary) treatment processes include various forms of filtration (cloth, media, membrane filters) and/or chemical flocculation/precipitation to improve effluent quality, followed by disinfection with chlorine, chlorine dioxide, ozone, or ultraviolet radiation (Burton et al., 2014). In some parts of the world (e.g., U.S.), advanced treatment processes are becoming more common as more stringent effluent discharge requirements are enacted (Burton et al., 2014).
The fundamental design of all secondary and secondary/advanced treatment systems to this day is based on the removal of suspended and dissolved organic matter, which in turn is based on the historical circumstances in which wastewater treatment was developed in Europe and North America. As Feachem et al. (1983, pp. 63-64) stated:
"....Conventional sewage works were originally developed in order to prevent gross organic pollution in European and North American rivers; they were never intended to achieve high removal of excreted pathogens...."
Where pathogen reduction is an important consideration in wastewater treatment, secondary and secondary-advanced treatment plants must employ disinfection or filtration/disinfection as the final unit processes in the overall treatment train. This section examines the effectiveness of the continuation of this historical approach by reviewing representative case studies of the most common treatment trains employed worldwide.
Many conventional activated sludge plants worldwide are designed and operated without disinfection. In Brazil, for example, of 56 activated sludge plants studied for a detailed reliability analysis (Oliveira and von Sperling, 2008), none practiced disinfection (von Sperling, 2015). As another typical example, the only municipal activated sludge wastewater treatment plant in Guatemala, which discharges directly into Lake Atitlan (the drinking water supply for 100,000 people), was designed and is operated without disinfection. Finally, many activated sludge plants worldwide that discharge to the ocean or coastal waters through outfalls typically do so without disinfection.
An excellent example of an activated sludge plant discharging to the ocean without disinfection, and the consequent effects of pathogens, is presented by Flannery et al. (2012). The treatment plant served a population equivalent of 91,600, with a daily flow rate of 45,000 m3/d, and discharged to the ocean through a 400-m outfall at a depth of 10 m. The study assessed the reduction of E. coli and Norovirus (Genotypes I and II) in the treatment works and the effect of the final effluent on E. coli and Norovirus concentrations in tissue taken from oysters in the proximity of the outfall (Figure 2). The results show a very low log10 reduction for E. coli (1.49), and lower still for Norovirus GI and GII (0.80 and 0.92 log10, respectively). Effluent concentrations are consequently high, both for the means and for the upper range of values (right column of Figure 2).
Of particular note (right column of Figure 2) is the concentration in oyster tissue of E. coli compared with Norovirus GI and GII, with mean values of Norovirus GI and GII concentrations 2.0 and 3.2 times greater than E. coli. The authors comment that E. coli is used as the indicator organism for the sanitary quality of shellfish in the European Union, and that the concentrations of E. coli found in the oyster tissue during the study (mean = 1,660 MPN/100 g) would have passed compliance for human consumption; in contrast, the Norovirus concentrations would have been consistent with those that have caused illness in consumers (Flannery et al., 2012). The authors conclude that E. coli is an inadequate indicator to assess the Norovirus risks associated with oysters.
Fong et al. (2010) monitored adenovirus reduction for one year in an activated sludge plant discharging to a Michigan (U.S.) river and sampled secondary effluent before disinfection with chlorine. Their data are also shown in Figure 2. They found a mean of 1.15 x 10^6 copies/L of adenovirus in raw wastewater and a mean of 2.0 x 10^4 copies/L in secondary effluent, for a mean log10 reduction of 1.77. Their results after disinfection will be discussed in detail in the next section.
Ramo et al. (2017) present a valuable study from north-eastern Spain on protozoan (oo)cyst reduction (Cryptosporidium and Giardia) in 13 of 23 conventional activated sludge plants that did not practice disinfection. The treatment plants served populations from 5,000 to over 660,000, with only 3 plants serving populations over 20,000. Most plants discharged their effluents into rivers, but three reused their effluents for irrigation of parks, residential lawns, and agriculture.
The results of that study are presented in Figure 3 for influent-effluent cyst concentrations and log10 reduction, and for cyst concentrations found in dewatered sludges. The influent (oo)cyst concentrations were higher for Giardia than for Cryptosporidium (754 − 6,606 versus 22 − 456 (oo)cysts/L, respectively), and log10 reduction ranged from 1.50 − 2.34 for Giardia and from 0.35 − 1.63 for Cryptosporidium, levels of reduction that would be considered insufficient if the effluents were directly reused for agricultural irrigation or if they were discharged into surface waters used downstream for irrigation or drinking water supply (Feachem et al., 1983).
The (oo)cyst concentrations measured in dewatered sludges were relatively high, ranging from 23 − 475 cysts/g TS for Giardia and 2 − 44 oocysts/g TS for Cryptosporidium, with cyst viability exceeding 60%. Viability of cysts in effluents ranged from 52 − 74% for Giardia and 61 − 91% for Cryptosporidium. The authors conclude that: 1) Giardia and Cryptosporidium are ubiquitous pathogens in the study area with a higher prevalence in the population than indicated by diagnostic data; 2) the high numbers of potentially viable (oo)cysts in effluents and dewatered sludge are a consequence of the limited efficacy of wastewater treatment plants; and 3) the results raise concerns regarding the environmental and public health impacts associated with disposal and reuse of treated wastewater, and highlight the need for legislation that includes pathogens in regulations for wastewater reuse (Ramo et al., 2017).
Tyagi et al. (2011) monitored two activated sludge plants, one waste stabilization pond system, and one Upflow Anaerobic Sludge Blanket (UASB)-polishing pond plant in India for fecal coliforms and helminth eggs; the results for the two activated sludge plants are presented in Figure 4. Concentrations of helminth eggs ranged from 23.4 − 71.2 eggs/L in the influent and from nondetectable to 4.3 eggs/L in the effluent, with mean log10 reductions of 1.40 and 1.55 in the two plants. Fecal coliform log10 reductions were much higher, at 2.22 and 3.10 for the two systems.
Rose et al. (2004) performed a detailed study of six activated sludge treatment plants in the U.S. by monitoring reduction of indicator bacteria and viruses, enteric viruses, and two protozoan pathogens, Cryptosporidium and Giardia, in individual unit processes. That study thus enables an analysis of pathogen reduction by unit process through the activated sludge treatment train. Rose et al. (1996) also provided some of the few available data on helminth egg reduction in an activated sludge plant. The data from these two studies for activated sludge treatment without disinfection are presented in Figure 5. Log10 reductions in secondary effluents ranged from 1.62 − 3.11 for fecal coliforms, 1.85 − 2.44 for enterovirus, 1.23 − 2.67 for Giardia, 1.11 − 1.92 for Cryptosporidium, and less than 2.40 for helminth eggs. Survival, in terms of (oo)cysts/L, was much lower in the effluents than in the influents: 0.65 − 8.34 cysts/L compared with 22 − 81 cysts/L for Giardia, and 0.28 − 0.84 oocysts/L compared with 38 − 49 oocysts/L for Cryptosporidium. These differences underscore the relationship between influent and effluent concentrations for a given log10 reduction; that is, higher influent pathogen concentrations will lead to correspondingly higher effluent concentrations (survival).
Ben Ayed et al. (2009) measured protozoan cyst (Entamoeba coli, Entamoeba histolytica, Giardia) and helminth egg (Ascaris, Enterobius vermicularis, Taenia) concentrations in 17 activated sludge plants in Tunisia; their results for four conventional systems not using disinfection are shown in Figure 6. Their findings are quite different from those of Ramo et al. (2017) and Rose et al. (2004) and show high concentrations of cysts and eggs in both influents and effluents, with total log10 reductions of only 0.73 and 0.90, respectively. The data from this study demonstrate that log10 reduction can vary greatly among different species of protozoa and helminths. In that study, the mean log10 reduction of Entamoeba coli (1.28) was more than double that of Entamoeba histolytica (0.50) and Giardia (0.57), and the log10 reductions for Ascaris (0.90) and Enterobius vermicularis (1.13) were much greater than for Taenia (0.64). The authors conclude that wastewater treatment efficiency for parasite reduction needs to be improved to protect public health in Tunisia, where the prevalence of protozoan and helminth infections is high (Ben Ayed et al., 2009). Unfortunately, the authors did not report on the detailed operation of each plant in terms of BOD and TSS removal, but since the plants were all operated by the National Sanitation Utility it is assumed they were functioning adequately in terms of organic matter removal (i.e., BOD and TSS).
Table A1 in the Appendix of this chapter summarizes the data from Ben Ayed et al. (2009), Flannery et al. (2012), Fong et al. (2010), Ramo et al. (2017), Rose et al. (1996, 2004) and Tyagi et al. (2011) in comparison to the original data presented by Feachem et al. (1981, 1983) and the more recent data presented in WHO (2006) on pathogen reduction in conventional activated sludge treatment plants without disinfection. While the reduction and survival of E. coli are similar across the comparison, the newer methods for analysis of specific viruses show higher reductions than originally cited in Feachem et al. (1983): enterovirus can be removed at over 2.0 log10, adenovirus at 1.77 log10, and norovirus at slightly less than 1.0 log10; nevertheless, the survival of these viruses ranged as high as 10^3 − 10^4/L and can pose public health risks depending on final effluent disposal or use. The newer data on Giardia and Cryptosporidium (oo)cysts that were not available to Feachem et al. (1983) show that these pathogens can be removed at 1.0 − 2.5 log10 and 0.35 − <1.6 log10, respectively. As with viruses, (oo)cyst survival in treated effluents and produced sludges is a cause for concern, as discussed by Ben Ayed et al. (2009) and Ramo et al. (2017). The newer data available on helminth egg reduction show that the effluent concentration of eggs can often be below the limit of detection (which unfortunately is often not reported), but eggs can also be found in very high concentrations in well-operated plants and thus pose a serious public health risk.
Activated sludge plants in countries where disinfection is mandated by regulations rely on final stage disinfection to remove pathogens, and typically do so without monitoring the effectiveness of pathogen reduction in upstream unit processes prior to disinfection. Worldwide mechanized wastewater treatment plants, of which activated sludge is the most common, continue to be designed principally for removal of organic suspended and soluble solids. Disinfection, if used, is incorporated as the final unit process before discharge.
In the study by Ramo et al. (2017), activated sludge plants with chlorine disinfection were also monitored: 7 of the 23 plants studied used disinfection as the final unit process. Figure 7 presents results for influent-effluent cyst concentrations and log10 reduction, and for cyst concentrations found in dewatered sludges, for these 7 systems. The influent (oo)cyst concentrations were higher for Giardia than for Cryptosporidium (22 − 6,703 versus 67 − 134 (oo)cysts/L, respectively), and the reported log10 reduction ranged from 0.62 − 2.13 for Giardia and from 0.62 − 1.80 for Cryptosporidium. The reduction for Giardia is lower than that found for treatment without disinfection (Figures 3, 5, and 6), and Cryptosporidium reduction is only slightly different, illustrating the ineffectiveness of chlorine disinfection for (oo)cyst reduction, as would be expected. The (oo)cyst concentrations measured in dewatered sludges ranged from 0 − 593 cysts/g TS for Giardia and 0 − 10 oocysts/g TS for Cryptosporidium.
Kitajima et al. (2014a, 2014b) monitored a conventional activated sludge plant with chlorine disinfection for reduction of select pathogens that included enteric viruses (noroviruses, Aichi virus, sapovirus, rotavirus, enterovirus), Giardia and Cryptosporidium. The treatment plant had a mean flow rate of 155,000 m3/d and served a population of approximately 500,000; influent and final effluent samples were collected once a month for one year.
The results for Aichi virus, norovirus GI and GII, Giardia, and Cryptosporidium are shown in Figure 8 for influent and effluent concentrations, and log10 reduction for each pathogen. Effluent E. coli concentrations are also shown, although influent concentrations were not monitored. Mean Aichi virus reduction was low, 0.94 log10, and norovirus GI reduction was reported at 1.65 log10, which is 0.85 log10 greater than the reduction reported by Flannery et al. (2012) without disinfection. Norovirus GII reduction, at 2.14 log10, is 1.22 log10 higher than the 0.92 reported by Flannery et al. (2012) without disinfection. The log10 reductions of Giardia and Cryptosporidium, 2.08 and 0.71 respectively, are within the same range as the reductions found by Ramo et al. (2017) without disinfection (Giardia: 1.50 − 2.34 log10; Cryptosporidium: 0.35 − 1.63 log10). It should be noted that while effluent concentrations of E. coli were below 150 MPN/100mL (range reported as <1 − 134 MPN/100mL) for the study period as a result of disinfection, viruses and protozoan (oo)cysts were much less influenced by disinfection, with Aichi virus and norovirus concentrations as high as 10^4/L and Giardia and Cryptosporidium as high as 150 cysts/L and 36 oocysts/L, respectively.
Kitajima et al. (2014a, 2014b) also monitored a conventional trickling filter plant with chlorine disinfection for reduction of the same select pathogens discussed in Section 2.3. The mean flow rate was 94,600 m3/d for a population of approximately 250,000. The results from this plant, shown in Figure 9, are similar to those of the activated sludge plant, although the authors reported that Giardia reduction in the activated sludge plant was statistically different from that in the trickling filter plant (2.08 versus 1.52 log10) while there was no statistically significant difference for Cryptosporidium reduction (0.71 versus 0.81 log10). Norovirus GI and GII reductions were higher in the trickling filter plant (2.57 and 2.85 log10, respectively) than in the activated sludge plant (1.65 and 2.14 log10). These differences, however, are minor compared to the likely need to remove >3.0 − 4.0 log10 of viruses and protozoan cysts if the effluent were reused for agricultural or recreational purposes.
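The reduction needed for a given reuse scenario can be framed as the log10 reduction required to bring an influent concentration down to a target effluent concentration. A minimal sketch, using hypothetical numbers rather than values from Kitajima et al.:

```python
import math

def required_log10_reduction(influent, target_effluent):
    """Log10 reduction needed to bring an influent pathogen concentration
    down to a target effluent concentration (same units for both)."""
    return math.log10(influent / target_effluent)

# Hypothetical example: an influent virus load of 1e6 genome copies/L and
# a reuse target of 100 copies/L would require a 4.0 log10 reduction --
# well above the roughly 1-3 log10 reported for the secondary plants
# discussed in this section.
lrv_needed = required_log10_reduction(1e6, 1e2)  # 4.0
```

The same calculation explains why higher influent loads (as in areas with high infection prevalence) demand correspondingly higher log10 reductions to meet the same target.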
Table A2 in the Appendix shows a comparison of virus and protozoan cyst reduction and survival for the treatment plants that practiced disinfection (Sections 2.3, 2.4) and those that did not (Section 2.2).
Tertiary treatment processes used with activated sludge include filtration (cloth, sand or membrane filters), chemical processes (precipitation) and physical-chemical processes (coagulation, flocculation) commonly followed by disinfection. Tertiary processes have been developed principally to improve effluent quality in terms of physical-chemical parameters for more stringent effluent guidelines (e.g., nitrogen and phosphorus) with less attention on specific pathogen reduction (Burton et al., 2014).
Figure 10 presents the results of Giardia and Cryptosporidium reduction in an activated sludge plant in China with a flow rate of 1,000,000 m3/day (Fu et al., 2010). The plant uses the following tertiary processes: coagulation, flocculation, sedimentation, and sand filtration. Giardia reduction was reported at 3.46 log10 and Cryptosporidium 3.34 log10, with effluent concentrations ranging from <0.2−2.1 cysts/L for Giardia and from 0.2−0.4 oocysts/L for Cryptosporidium. Figure 11 presents the log10 reduction in each unit process (activated sludge, coagulation-flocculation-sedimentation, and sand filtration). The conventional activated sludge process has the highest log10 reduction for both Giardia and Cryptosporidium (1.85 and 1.64 log10 respectively), which is approximately double the log10 reduction for each of the subsequent unit processes (coagulation/flocculation/sedimentation (C/F/S) and sand filter (SF)) for both Giardia and Cryptosporidium (Figure 11). The total log10 reduction reported in this study for both Giardia and Cryptosporidium, however, is among the highest reported in the literature.
Rose et al. (2004), as part of the study discussed previously in Section 2.2 present reduction data for six activated sludge plants employing filtration followed by disinfection. Five of the plants used sand filters for filtration and one used a cloth filter (the filter pore size was not provided); five plants used chlorine disinfection and one used UV radiation.
The results for these treatment plants are presented in Figure 12. Log10 reduction of fecal coliforms ranged from 5.94 − 7.31, enterovirus from 2.85 − 4.33, Giardia from 2.15 − 3.87, and Cryptosporidium from 1.68 − 3.00. Data from Chaoua et al. (2017) on helminth reduction in an activated sludge plant with sand filtration and UV/chlorine disinfection in Morocco are also shown in Figure 12; total helminth egg influent concentrations averaged 173.8 eggs/L (86.75% Ascaris) and no eggs were ever detected in the final effluent. Effluent concentrations of all pathogens were relatively low, as seen in Figure 12. Rose et al. (2004) concluded, however, that all monitored effluents contained measurable concentrations of pathogens (except helminths) and that exposure to these pathogens carries some risk to public health.
Figure 11. Log10 reduction of Giardia and Cryptosporidium in individual unit processes in an activated sludge treatment plant using tertiary treatment processes of coagulation, flocculation, and sedimentation with sand filtration. Data from Fu et al., 2010. AS = secondary effluent. C/F/S = coagulation/flocculation/sedimentation. SF = sand filtration.
Figure 13. Reduction of select pathogens as a function of sequential unit processes in an activated sludge treatment train that employed tertiary treatment. AS = activated sludge secondary effluent. AS/F = Activated sludge with filtration. AS/F/D = activated sludge with filtration and disinfection. Data points are mean values with range of values for six treatment plants. Adapted from Rose et al., 2004.
Figure 13 shows that although the mean values for the unit processes of secondary effluent, filtration, and filtration with disinfection each represent a >2 log10 reduction for fecal coliforms, this is not the case for enterovirus, Giardia, or Cryptosporidium: for these pathogens the conventional activated sludge process contributes the greatest log10 reduction, exceeding that of the tertiary processes. As seen in Figure 14, the mean log10 reduction in the individual unit processes for enterovirus, Giardia, and Cryptosporidium is highest in the conventional activated sludge process, approximately 2.0, 1.0, and 1.0 log10 higher than filtration, respectively. Disinfection, not surprisingly, has very little effect on the protozoan (oo)cysts; notably, (oo)cyst reduction by filtration is roughly 1.0 log10 less than by conventional activated sludge.
In summary, the principal unit process for reduction of enterovirus, Giardia, and Cryptosporidium in systems that employ tertiary treatment is conventional activated sludge, which provides at least 1.0 log10 more reduction than filtration or disinfection.
Figure 14. Pathogen reduction in individual unit process in activated sludge treatment plants with and without tertiary processes. AS = secondary effluent. F = media filtration. D = disinfection. Data points are geometric means with range of values. Adapted from Rose et al., 2004.
Figure 15 presents the results of the study of Ramo et al. (2017) for three activated sludge plants that employed tertiary treatment using microfiltration and disinfection, with two plants using UV disinfection and one using chlorine. The authors do not specify the microfilter pore size; microfilter pore sizes range from 0.05 − 2.0 μm (Burton et al., 2014), which should remove protozoan (oo)cysts if the filters are operated properly. The results in Figure 16 for pathogen reduction as a function of the sequential unit processes used show, however, that the plants with tertiary treatment processes did not perform any better, and perhaps performed worse, than those with only conventional activated sludge without disinfection. As an example, Figure 16 shows that the log10 reduction for Giardia in conventional activated sludge (AS) is as high as or higher than in the plants with disinfection (AS/D) or microfiltration/disinfection (AS/MF/D or AS/MF/UV).
Figure 16. Giardia and Cryptosporidium (oo)cyst reduction as a function of sequential treatment processes. AS = activated sludge secondary effluent. AS/D = activated sludge with disinfection. AS/MF/D = activated sludge with microfiltration and chlorine disinfection. AS/MF/UV = activated sludge with microfiltration and UV disinfection. Data show mean and range of log10 reduction for the number of plants in each category (shown in parentheses). Adapted from Ramo et al., 2017.
Figure 17 plots the results of Fong et al. (2010) on adenovirus reduction in activated sludge with disinfection/filtration; reduction without disinfection from their study was discussed previously in Section 2.1. The treatment plant had chlorine disinfection followed by rapid sand filtration, an atypical design, especially if the design intent were to remove pathogens. As seen in Figure 17, the mean concentration measured after disinfection/filtration was higher than that of the secondary effluent, with a mean of 83,000 copies/L and a range of 13,500 − 428,000 copies/L, for a log10 reduction across the complete treatment works of 1.14. The authors state that after one year of monitoring there was no statistical difference in adenovirus concentrations between the secondary effluent and the disinfected/filtered effluent, and that disinfection/filtration had no effect on adenovirus reduction. The authors detected adenoviruses in the river downstream of the wastewater effluent discharge and concluded that wastewater treatment is inadequate to remove viruses, and as a result adenovirus may represent a public health risk (Fong et al., 2010).
Figure 17. Influence of wastewater unit process treatment train on adenovirus reduction. Data are plotted as mean with range of values during study period. Adapted from Fong et al., 2010.
Fu et al. (2010) report on Giardia and Cryptosporidium reduction in an anaerobic-anoxic-oxic activated sludge plant using ultrafiltration and ozone/chlorine disinfection at another treatment plant in China. The treatment facility had a mean flow rate of 400,000 m3/day. The authors did not report the membrane pore sizes, but they can range from 0.005 − 0.1 μm for ultrafiltration (Burton et al., 2014). Figure 18 shows the results of 18 sampling groups over a two-year period at the treatment plant. Influent Giardia and Cryptosporidium concentrations ranged from 170 − 3,600 cysts/L and 33 − 430 oocysts/L respectively, and (oo)cysts were never detected in the effluent for the duration of the study; effluent concentrations for both pathogens were reported as <0.033 (oo)cysts/L, which is interpreted as the limit of detection of the analytical methods used by the authors (Fu et al., 2010). Table 2 shows the mean log10 reduction for each unit process in the treatment train. The authors concluded that the effluent from ultrafiltration can meet any reuse standard in China, and that treatment plants using this process require less use of disinfectants and produce fewer disinfection byproducts. They cautioned, however, that the high cost of operation is a major obstacle to widespread use of ultrafiltration in China (Fu et al., 2010). Ultrafiltration requires 0.2 − 0.3 kWh/m3 of wastewater treated, approximately double the energy used in a conventional activated sludge plant with chlorine disinfection (Burton et al., 2014).
Table 3 presents a summary of pathogen reduction and survival in mechanized wastewater treatment plants without disinfection, with disinfection, and with tertiary processes.
Natural wastewater treatment systems utilize natural physical and biological processes such as gravity flow, natural convection of air, photosynthesis (Crites et al., 2006), and anaerobic digestion, to treat wastewaters without external energy and chemical inputs. Natural systems are considered the best options for wastewater treatment in developing countries and small cities worldwide because of their low construction cost, simple operation and maintenance requirements, and contribution to sustainability by using natural energy-producing processes and producing effluents with nutrients that can be reused in agriculture and aquaculture (Kumar and Asolekar, 2016; Verbyla et al., 2015). Natural wastewater treatment systems include wastewater stabilization ponds, hyacinth and duckweed ponds, constructed wetlands, gravity flow anaerobic reactors such as UASBs and anaerobic filters, and gravity flow trickling filters with natural convection for aeration (Arceivala and Asolekar, 2007; Mendonça and Mendonça, 2016). This section presents representative case studies of the most common natural system technologies used worldwide.
Waste stabilization ponds (see Waste Stabilization Ponds chapter) are one of the most common wastewater treatment technologies in both developed and developing countries in small to large cities. Unlike mechanized technologies where disinfection is applied as the final unit process because the systems were designed for treatment of organic matter, pathogen reduction in facultative and maturation ponds, if desired, is incorporated into the fundamental design of the system (Ayres et al., 1992; von Sperling, 2007).
The simplest of all treatment technologies designed to remove pathogens (and organic matter) is a facultative pond followed by a maturation pond. In this configuration the facultative pond is designed to remove helminth eggs by hydraulic retention time or hydraulic overflow rate, and the maturation pond to further remove helminths and other pathogens (von Sperling, 2007). While helminth eggs can be routinely monitored throughout a pond system, fecal coliforms and E. coli are the commonly used indicators of bacterial, protozoan and viral pathogen reduction.
Figure 19 presents the reduction and survival data for helminth eggs and E. coli obtained from 10 facultative/maturation waste stabilization pond systems monitored over a two-year period in Honduras (Oakley, 2005). The mean flowrates ranged from 218 − 5,150 m3/d, with populations from 1,000 to >23,000 and theoretical hydraulic retention times ranging from 7.2 − 34.8 d. Mean log10 reduction of E. coli for the 10 systems was 2.92, although it ranged from a low of 1.87 to a high of 4.84; one system consistently had effluent concentrations <1,000 MPN/100mL during six sampling events in winter and summer, with a geometric mean of 271 MPN/100mL.
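Effluent bacterial counts such as these are conventionally summarized with geometric rather than arithmetic means, because concentrations typically vary over orders of magnitude between sampling events. A minimal sketch (the sample values below are hypothetical, not Oakley's data):

```python
import math

def geometric_mean(counts):
    """Geometric mean, the conventional summary statistic for bacterial
    counts that vary over orders of magnitude between samples."""
    return 10 ** (sum(math.log10(c) for c in counts) / len(counts))

# Hypothetical six-event effluent series, MPN/100 mL:
samples = [120, 300, 900, 150, 480, 250]
gm = geometric_mean(samples)  # ~289, versus an arithmetic mean of ~367
```

The geometric mean damps the influence of occasional high counts, so a single poor sampling event does not dominate the reported effluent quality.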
Helminth eggs, principally Ascaris, were always measured in the influent but never detected in the effluent; unfortunately, the limit of detection was not reported and log10 reduction could not be calculated. Facultative pond sludges all contained helminth eggs, ranging from 1.0 − 4,473 eggs/g TS; the concentration of viable eggs in one facultative pond sludge sampled at different locations ranged from 0.4 − 25.0 eggs/g TS (Oakley et al., 2012). The presence of Shigella was monitored throughout the study but was never detected (Oakley, 2005). Oakley (2005) concluded that although several pond systems performed well, others did not, and a major problem was the lack of adequate design and construction coupled with sustainable operation and maintenance on the part of the municipalities.
A common configuration where the effluent is to be reused for unrestricted irrigation is a facultative pond followed by two maturation ponds in series (F/M/M); the design strategy is to maximize pathogen reduction through extended hydraulic retention time for sedimentation of particles and exposure to solar radiation.
Figure 20 shows one of the few studies where bacterial pathogen reduction, in this case Vibrio cholerae O1, was monitored throughout a F/M/M system and compared with fecal coliform reduction during the 1991 cholera epidemic in Lima, Peru (Castro de Esparza et al., 1992). During the peak of the epidemic in March 1991, Vibrio cholerae O1 was detected at concentrations as high as 4.3 x 10^5 MPN/100mL in one of the principal wastewater collectors in Lima (Castro de Esparza et al., 1992). As a result, a monitoring study of the wastewater stabilization pond system in San Juan de Miraflores, where the final effluent was being reused in aquaculture and agriculture, was implemented from May to August 1991. During the monitoring period the flowrate into the system was controlled to maintain a hydraulic retention time of 50 days, and the raw wastewater influent and the effluent from each pond were monitored for fecal coliforms and Vibrio cholerae O1; the aquaculture ponds were also monitored for fecal coliforms and Vibrio cholerae O1, and tilapia tissue was examined for the presence of Vibrio cholerae.
The results in Figure 20 show a 4.26 log10 reduction of Vibrio cholerae O1 and a 4.89 log10 reduction of fecal coliforms, with geometric mean effluent concentrations of 0.1 MPN/100mL and 2.11 x 10^4 MPN/100mL respectively. (It should be noted that while the final effluent concentration of Vibrio cholerae O1 was low at 0.1 MPN/100mL, raw wastewater concentrations of 4.3 x 10^5 MPN/100mL measured during the peak of the epidemic would have produced an effluent concentration of 23.5 MPN/100mL at the measured 4.26 log10 reduction.) Vibrio cholerae O1 was found in the aquaculture ponds at very low concentrations (0.03 MPN/100mL), and Vibrio cholerae was detected in tilapia tissue (skin, gills, intestines), but the test did not distinguish between Vibrio cholerae O1 and non-O1 (Castro de Esparza et al., 1992). Figure 21 shows the serial reduction of fecal coliforms and Vibrio cholerae O1 through each pond in series during the monitoring period.
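The back-calculation in the parenthetical note above follows directly from the definition of log10 reduction. A minimal sketch (function names are illustrative; the 4.3 x 10^5 MPN/100mL influent and 4.26 log10 reduction are the study values quoted above):

```python
import math

def log10_reduction(influent, effluent):
    """Overall log10 reduction between influent and effluent concentrations."""
    return math.log10(influent / effluent)

def predicted_effluent(influent, log_red):
    """Effluent concentration implied by a given influent and log10 reduction."""
    return influent / 10 ** log_red

# Epidemic-peak raw wastewater concentration of Vibrio cholerae O1 (MPN/100mL)
peak = predicted_effluent(4.3e5, 4.26)
print(round(peak, 1))  # ~23.6 MPN/100mL, consistent with the ~23.5 noted above
```

The key point is that a fixed log10 reduction implies a proportional, not absolute, decrease: the higher the influent concentration, the higher the surviving effluent concentration.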
Figure 21. Reduction and survival of fecal coliforms and Vibrio cholerae O1 at the San Juan wastewater stabilization pond system in Lima, Peru during the cholera epidemic, June to August 1991 (Castro de Esparza et al., 1992). The influent flowrate was controlled during the monitoring period to maintain a total hydraulic retention time of 50 days (the HRT values shown in the figure are for each pond in series and add up to 50 days). Water temperature in the system averaged 17.5°C.
An excellent example of an F/M/M system with agricultural reuse that has operated for decades is the Campo Espejo system in Mendoza, Argentina (Figure 22). The system was built in 1995 and consists of 12 batteries of F/M/M ponds in series; the design flowrate is 146,620 m3/d, with a theoretical hydraulic retention time of 21.1 days and a potential irrigated area of 4,000 ha (Barbeito Anzorana, 2001; Vélez et al., 2002). The system has been continuously monitored for fecal coliforms and helminth eggs to meet the WHO guidelines for agricultural reuse. Figure 22 shows the mean results of daily monitoring during one year (January to December 2000) (Barbeito Anzorana, 2001). Mean fecal coliform concentrations were reduced from 1.1 x 10^7 MPN/100mL to 75 MPN/100mL, a 5.14 log10 reduction, and helminth eggs from a mean of 26 eggs/L to <1 egg/L, thus meeting the 1989 WHO effluent guidelines of <1,000 fecal coliforms/100mL and <1 helminth egg/L for unrestricted irrigation (Vélez et al., 2002).
Tyagi et al. (2008) reported on the reduction of E. coli, Salmonella and helminth eggs in a facultative/maturation pond system in India that was monitored for 2 years; the system had a facultative pond followed by three maturation ponds in series (Figure 23). Mean E. coli influent and effluent concentrations were 2.2 x 10^6 MPN/100mL and 1.8 x 10^3 MPN/100mL respectively, giving a log10 reduction of 3.09. Salmonella, in contrast, exhibited a much lower 1.24 log10 reduction, with means of 408 CFU/100mL in the influent and 27.5 CFU/100mL in the effluent; this is an important example of how different genera of bacteria can exhibit different log10 reduction values when monitored simultaneously. Helminth eggs averaged 16 eggs/L in the influent and were detected in the effluent at low concentrations, with a mean value of 0.18 eggs/L, giving a log10 reduction of 2.48. The system performed well in terms of BOD and TSS removal, and met the WHO guidelines (WHO, 2006) for restricted reuse. The authors noted that fecal indicator reduction was higher in summer than in winter by approximately 2.0 log10 units, and that this was best correlated with suspended solids removal. This waste stabilization pond system performed better in helminth egg reduction than the two activated sludge plants discussed by Tyagi et al. (2011) (previously shown in Figure 4).
Many existing waste stabilization pond systems in developing countries are improperly designed, lack adequate operation and maintenance, or are abandoned entirely, and Figure 24 provides an example of the performance of an improperly designed, operated and maintained F/M/M system in Bolivia (Symonds et al., 2014; Verbyla et al., 2013a,b). The system had no pretreatment and was poorly designed hydraulically, with the mean hydraulic retention time (HRT) based on tracer studies estimated to be 50% of the theoretical HRT (≈13 days versus 27 days) (Verbyla et al., 2013b). Composite samples taken on two different days in June 2012 were analyzed for enteric viruses: mean enterovirus concentrations were 4.2 x 10^4 infectious units/L (IU/L) in the influent and 37 IU/L in the final effluent, for a log10 reduction of 3.06; norovirus GI and rotavirus concentrations, however, were not reduced through treatment and were approximately 1.0 x 10^6 in both influent and effluent samples (Symonds et al., 2014). Giardia and Cryptosporidium were monitored in one composite sample in June 2012, with influent and effluent Giardia concentrations of 160 cysts/L and 23 cysts/L, for a log10 reduction of 0.84; influent and effluent Cryptosporidium oocyst concentrations were 6.0 and 4.0 oocysts/L respectively, for a 0.18 log10 reduction (Verbyla et al., 2013a). Helminth eggs were measured once each in June 2011 and June 2012; Ascaris and Taenia were detected in the influent at up to 306 and 3,006 eggs/L respectively, with Taenia detected in one effluent sample at 45 eggs/L, for a 1.82 log10 reduction (Verbyla et al., 2013b).
The authors conclude that although the system effluent was not being used for irrigation, wastewater reuse is common in Bolivia and the effluent from systems such as this one poses a health risk from viruses, protozoa and helminths if it were to be reused without further treatment or in-field management in agriculture (Symonds et al., 2014; Verbyla et al., 2013a, 2013b).
An anaerobic pond followed by facultative and maturation ponds in series (A/F/M) is a common configuration in tropical and semi-tropical climates, where the purpose of the anaerobic pond is to remove organic matter (and capture methane in some designs) in order to reduce the total area of the facultative pond (Mara, 2003). Figure 25 presents the results for the A/F/M system at the International Institute for Water and Environmental Engineering in Ouagadougou, Burkina Faso (Konaté et al., 2013). The system, which had a mean flowrate of 55 m3/d and a hydraulic retention time of 18 d, was monitored for protozoan cysts and helminth eggs twice a month for one year (2007-2008) in both the wastewater and the pond sludges.
The protozoan cysts encountered in the raw wastewater included Entamoeba coli, Entamoeba histolytica, and Giardia lamblia, with mean influent concentrations of 85.8, 18, and 7 cysts/L respectively, and a total influent cyst concentration of 111 cysts/L with a range of 4 to 327 cysts/L; no cysts were detected in the final effluent, and the log10 reduction was estimated at >1.87 (the detection limit was not reported). Mean influent helminth egg concentrations were 15.7 eggs/L, with a range of 5 to 36 eggs/L; no eggs were detected in the final effluent, and the log10 reduction was estimated at >2.89 (the detection limit was not reported). As shown in Figure 25, protozoan cysts were found in all pond sludges, decreasing from 120 to 10 to 7 cysts/g TS in the A, F, and M ponds respectively. Helminth eggs were also found in all pond sludges, decreasing from 556 to 32 to 12 eggs/g TS in the A, F, and M ponds respectively, with percent viabilities of 36%, 16%, and 0%. The authors conclude that the ponds perform well in reducing protozoan cysts and helminth eggs from wastewater with respect to potential wastewater reuse in Burkina Faso, but that great caution is needed in the management of the pond sludges because of the high concentrations of cysts and viable helminth eggs (Konaté et al., 2013).
Reinoso et al. (2011) studied the performance of an A/F/M system in Spain in terms of E. coli, Cryptosporidium oocyst, Giardia cyst, and helminth egg reduction, with the goal of improving the design, operation and maintenance of waste stabilization pond systems. The system had a mean influent flowrate of 3,200 m3/d with a theoretical hydraulic retention time of only 5.5 d, and was monitored weekly from December 2003 to September 2004, with the results divided into a cold period (influent temperature of 13.1°C) and a hot period (influent temperature of 20.1°C). The results are presented in Figure 26.
Mean E. coli influent concentrations ranged from 2.0 x 10^5 CFU/100mL in the cold period to 3.16 x 10^6 CFU/100mL in the hot period, with final effluent concentrations of 251 and 31.6 CFU/100mL in the cold and hot periods respectively. These results give a 2.90 log10 reduction for the cold period and a 5.00 log10 reduction for the hot period, and this is one of the few studies demonstrating the effect of season on E. coli reduction; the authors monitored other bacterial and viral indicators (total coliforms, faecal streptococci, and coliphages) but only found a seasonal variation with E. coli (Reinoso et al., 2011). Cryptosporidium oocysts had mean influent and effluent concentrations of 14.9 and 0.4 oocysts/L, giving a 1.57 log10 reduction, while Giardia influent and effluent concentrations averaged 67.1 and 0.7 cysts/L, for a 1.98 log10 reduction. Helminth egg influent concentrations were low, with a mean of only 1.8 eggs/L, and no eggs were detected in the final effluent.
In a similar A/F/M system in Morocco with a mean flowrate of 2,000 m3/d, Chaoua et al. (2017) found a mean influent helminth egg concentration of 95.01 eggs/L and a final effluent concentration of 5.0 eggs/L, for a 1.28 log10 reduction; unfortunately the authors reported neither the theoretical hydraulic retention time of the system nor its performance in terms of conventional water quality parameters such as BOD and TSS.
Figure 27 presents the results of two A/F/M pond systems in the Cochabamba Valley of Bolivia that were abandoned by the responsible municipality but are still in operation, with 100% of the effluents being directly reused for agricultural irrigation during the dry season (Verbyla et al., 2016). One system was an anaerobic pond followed by a facultative and two maturation ponds in series (referred to here as A/F/M1/M2), and the other an A/F/M pond system. The A/F/M1/M2 system had a mean flowrate of 750 m3/d, and because of sludge buildup in all the ponds, it was estimated the HRT was approximately 1.0 d during the period of the study; the A/F/M system had a mean flowrate of 2,730 m3/d and an estimated HRT of 2.5 d due to sludge buildup (the pond had not been desludged since its construction in 1995), poor construction, and possible modification for diversion to irrigated fields (Verbyla et al., 2016).
Grab samples were collected from the influent and final effluent of both systems over 2 months in 2012 and 3 months in 2013 and analyzed for E. coli, Giardia, Cryptosporidium, and helminth eggs (Verbyla et al., 2016). The data presented in Figure 27 show that the log10 reductions were minimal for both systems, with effluent concentrations ranging from 1.3 x 10^6 to 2.1 x 10^7 CFU/L for E. coli, 14 to 2,530 cysts/L for Giardia, 0.4 to 11 oocysts/L for Cryptosporidium, and 31 to 350 eggs/L for helminth eggs. The authors conclude that these wastewater stabilization pond systems may pose a serious public health risk to farmers using the effluents without further treatment or on-farm management. They also conclude that waste stabilization pond systems, although they are the simplest and most cost-effective technologies for developing countries, cannot produce effluents that can be safely reused in agriculture unless they receive adequate operation and maintenance or are combined with additional on-farm management techniques to reduce the risk to workers and consumers (Verbyla et al., 2016).
UASB reactors (see chapter titled Anaerobic Sludge Blanket Reactors) are one of the important natural system technologies that use anaerobic processes to degrade organic matter without energy input while producing methane as an alternative energy source, and they have been widely implemented for domestic wastewater treatment over the last 20 years in countries such as India and Brazil (Arceivala and Asolekar, 2007; Chernicharo, 2007; Mendonça and Mendonça, 2016). UASB effluents, however, require post-treatment to further reduce pathogens and organic matter, and the most common post-treatment sanitation technology is waste stabilization ponds in series (Arceivala and Asolekar, 2007; Chernicharo, 2007). The term "polishing pond" is frequently used in the literature for UASB-pond systems; to avoid confusion, in the discussion below the first pond following the UASB will be termed a secondary facultative pond (referred to here as F2) (Mara, 2003; von Sperling, 2007), which is followed by maturation ponds in series (referred to as M1, M2, etc.).
Tyagi et al. (2011) report on fecal coliform and helminth egg reduction in a UASB/F2 system in Morocco (Figure 28). The system had a mean flowrate of 38,000 m3/d and a total hydraulic retention time of only 1.3 d, and used baffles in the secondary facultative pond. Mean fecal coliform influent concentrations were 3.1 x 10^6 MPN/100mL and mean effluent concentrations 1.1 x 10^4 MPN/100mL, giving a 2.45 log10 reduction. Mean influent and effluent helminth egg concentrations were 46.4 eggs/L and 0.13 eggs/L respectively, yielding a mean log10 reduction of 2.55.
Helminth egg reduction data from two full-scale UASB/F2 systems in Brazil were also reported by von Sperling et al. (2005). The mean total HRT was 4.7 d in one system and 20.9 d in the other. Influent helminth egg data were not reported, and eggs were detected in the final effluent of only one system (a high of 1.3 eggs/L, with a mean effluent concentration of 0.2 eggs/L); helminth eggs were never detected in the final effluent of the other UASB/F2 system (von Sperling et al., 2005). The authors conclude that the mean final effluent values of both treatment plants easily met the 1989 WHO guideline for helminth eggs (<1 egg/L).
Figure 29 shows the pathogen reduction results for a UASB/F2/M system in Bolivia that was monitored for enterovirus, norovirus, rotavirus, Giardia, Cryptosporidium, and helminth eggs (Taenia, Ascaris, Trichuris, hookworm) during sampling events in June 2011 and June 2012 (Symonds et al., 2014; Verbyla et al., 2013a). The system was poorly maintained, with the measured flowrate increasing from 43 m3/d in 2007 to 124 m3/d in 2012 (Verbyla et al., 2013).
Two composite samples taken on different days were analyzed for enteric viruses: mean enterovirus concentrations were 6.2 x 10^4 IU/L in the influent and 9.1 x 10^3 IU/L in the effluent, giving a log10 reduction of 0.83; norovirus concentrations did not decrease from influent to effluent; and rotavirus concentrations were approximately 1.0 x 10^5 copies/L in the influent and 1 x 10^4 copies/L in the effluent, giving a 1.0 log10 reduction (Symonds et al., 2014). Giardia and Cryptosporidium were monitored in one composite sample in June 2012, with mean influent and effluent Giardia concentrations of 2.4 x 10^2 cysts/L and 7.3 x 10^1 cysts/L respectively, giving a 0.52 log10 reduction; mean influent and effluent Cryptosporidium oocyst concentrations were 9.5 and 6.5 oocysts/L, for a 0.16 log10 reduction. Helminth eggs were measured twice in composite samples taken in June 2011 and June 2012; Taenia, Ascaris, Trichuris and hookworm were detected in the influent with a mean of 1,809 eggs/L, while only Taenia and Ascaris were detected in one effluent sample, at approximately 1,200 eggs/L. Importantly, 236 helminth eggs/g TS were detected in the UASB sludge, with 33% viability (i.e., 78.7 viable eggs/g TS) (Verbyla et al., 2013). The authors conclude that although the system effluent was not being used for irrigation, wastewater reuse is common in Bolivia, and the effluent would pose a health risk if it were to be reused in agriculture without further treatment or on-farm practices to reduce the risk of pathogen exposure (Symonds et al., 2014; Verbyla et al., 2013a).
Días et al. (2014) report on 10 years of monitoring results for a UASB/F2/M1/M2 system in Brazil serving 250 population equivalents (Figure 30). Hydraulic retention times in the individual ponds varied from 1.4 to 6.1 days during the study. Throughout the 10-year monitoring period, geometric mean E. coli concentrations were 2.46 x 10^8 MPN/100mL in the influent and 4.50 x 10^2 MPN/100mL in the final effluent, yielding a 5.74 log10 reduction. Helminth eggs were never detected in the final effluent (the detection limit was not reported) and influent concentrations were not reported; ranges of helminth egg concentrations in raw wastewater and secondary facultative pond sludge reported at treatment plants near this site are presented in Figure 30. Assuming a detection limit of 1 egg/L, the log10 reduction of helminth eggs could range from 1.41 to 2.40 for influent concentrations of 26 and 254 eggs/L respectively. The authors conclude that this system has maintained excellent results for 10 years and meets the WHO guidelines for unrestricted irrigation (Días et al., 2014).
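The assumed-detection-limit calculation above is a simple lower-bound estimate: when no eggs are detected in the effluent, the log10 reduction is at least log10(influent / detection limit). A short check of the quoted range, assuming the 1 egg/L detection limit stated in the text (the function name is illustrative only):

```python
import math

def min_log10_reduction(influent, detection_limit=1.0):
    """Lower-bound log10 reduction when the effluent is below detection."""
    return math.log10(influent / detection_limit)

# Influent helminth egg concentrations reported at nearby plants: 26 - 254 eggs/L
low = min_log10_reduction(26)    # ~1.41 log10
high = min_log10_reduction(254)  # ~2.40 log10
print(round(low, 2), round(high, 2))
```

Because the true effluent concentration could be anywhere below the detection limit, these values are floors, not point estimates; a lower (more sensitive) detection limit would raise the bound.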
Tables 4 and 5 present a summary of pathogen reduction and survival in the natural wastewater treatment systems. Table 4 presents the data from systems that are assumed to be well designed, operated and maintained from the descriptions in the literature, while Table 5 presents the data from systems known to be improperly designed, overloaded, or lacking adequate operation and maintenance.
Tables 6, 7a and 7b summarize the log10 reductions for the cited mechanized and natural wastewater sanitation systems in comparison with the theoretical reductions presented in the WHO guidelines for wastewater use in agriculture, which cite Feachem et al. (1983) as one of the key references (WHO, 2006).
The results in Table 6 for enterovirus show that the low end of the log10 reduction range for activated sludge with the tertiary processes of sand filtration/disinfection (2.85 log10) is only slightly higher than the high end of the range reported for conventional activated sludge without disinfection (2.44 log10). Furthermore, the reduction of Aichi virus is very low (0.94 and 0.99 log10) compared to norovirus for activated sludge and trickling filter plants with disinfection, respectively. Virus reduction in all of the disinfection and tertiary processes reported does not approach the 5 to 6 log10 reduction cited by WHO (2006).
Giardia and Cryptosporidium (oo)cyst reduction is low in activated sludge and trickling filter plants with chlorine disinfection, as would be expected, and the log10 reductions are in the same range as those of activated sludge plants without disinfection. The tertiary treatment processes generally performed better, and (oo)cysts were never detected in the final effluent of the system with ultrafiltration; the low-range values for sand filtration/disinfection, however, are similar to activated sludge without disinfection (1.40 and 0.46 log10 reduction for Giardia and Cryptosporidium respectively). The protozoan (oo)cyst data also show that reduction can vary greatly among different pathogens: in the study by Ben Ayed et al. (2009) of four activated sludge plants, for example, the mean log10 reduction of Entamoeba coli was more than double that of Entamoeba histolytica and Giardia (Figure 6).
Emphasis on bacterial indicator reduction, which is a common regulatory requirement worldwide, can mask the low reduction of pathogens in treatment plants with disinfection and tertiary processes. The results from the study of Kitajima et al. (2014a) (previously summarized in Figures 8 and 9) are illustrative of this point and are shown in the following Table X:
The information above shows that the two treatment plants (employing activated sludge and trickling filters) in Arizona (U.S.) met their discharge requirements for BOD and TSS and chlorinated the final effluent to meet E. coli regulations. However, they never monitored for protozoan (oo)cyst pathogens in the final effluent (Kitajima et al., 2014a), and one can see from the information above that the log10 reduction of protozoan (oo)cyst pathogens does not approach that observed for E. coli. With the exception of the activated sludge plant with ultrafiltration in Table 8 (shown in Figure 18), every other mechanized system, including activated sludge and trickling filters with disinfection, and activated sludge with disinfection and various tertiary processes (coagulation/flocculation/sedimentation/sand filtration), had measurable concentrations of pathogens in the final effluent. While effluent concentrations in developed countries may often be low, this is not the case in developing countries, where they can be quite high even in a well-functioning plant, as shown in the study in Tunisia by Ben Ayed et al. (2009) (previously summarized in Figure 6).
Well-operated activated sludge plants meeting their bacterial effluent requirements with chlorine disinfection, however, can also contribute to serious public health risks in developed countries. The available evidence suggests that the oocyst source for the 1993 Milwaukee (U.S.) outbreak of cryptosporidiosis, which infected more than 400,000 persons, was of human origin from the Jones Island Wastewater Treatment Plant, at the time a conventional activated sludge plant with chlorine disinfection that discharged its effluent into Lake Michigan, the source water for the city's drinking water supply (Zhou et al., 2003; Eisenberg et al., 2005).
As was discussed previously and shown in Figures 14 and 16, the major unit process for pathogen reduction may often be the activated sludge process itself, upstream of the tertiary processes. For tertiary processes to have a more significant effect on pathogen reduction they must be well designed and operated, which is likely the case for the high-range log10 reduction values shown in Table 8. It is evident, however, that full-scale activated sludge plants without disinfection and tertiary processes can achieve log10 pathogen reductions similar to those of some plants using tertiary processes; under field conditions, tertiary processes are not necessarily a guarantee of increased pathogen reduction in otherwise well-functioning plants (in terms of BOD and TSS removal), as shown in the data of Kitajima et al. (2014a) above.
The summary log10 reduction results in Table 7a show that properly functioning full-scale waste stabilization pond systems and UASB/waste stabilization pond systems can reduce bacterial indicators by 2.45 up to 5.74 log10, and that one three-pond system achieved a 4.26 log10 reduction of Vibrio cholerae O1, which correlated well with a 4.89 log10 reduction of fecal coliforms (Figure 20). A four-pond system, however, removed 3.00 log10 E. coli but only reduced Salmonella by a mean of 1.24 log10 during the same monitoring period (Figure 23).
For protozoan (oo)cyst reduction, a three-pond system removed 2.90 log10 E. coli in the colder season and 5.00 log10 in the hotter season, but Cryptosporidium and Giardia were only reduced by 1.57 and 1.98 log10 respectively. While one system approached the 6 log10 reduction of bacterial pathogens/indicators needed for agricultural reuse as reported by WHO (2006), most were below it, in the range of 2.90 to 5.00; the protozoan (oo)cyst reductions reported in the cited studies were all <2.00 log10, well below the maximum of 4.0 log10 reported by WHO (2006) for agricultural reuse. Helminth egg reduction was sufficient in all systems to bring effluent concentrations below the limit of detection or, where the detection limit was reported, below 1 egg/L.
With the exception of helminth eggs and protozoan (oo)cysts in one system, almost all well-performing natural systems had measurable concentrations of pathogens in the final effluent. The problem of differential pathogen reduction relative to indicators has important implications for apparently well-performing natural systems, just as it does for mechanized ones. The table below shows the results for the reduction of indicators and pathogens in three different waste stabilization pond systems from Table 7a:
While the reductions in fecal coliforms and E. coli are relatively high, the log10 reductions for Salmonella, Giardia, and Cryptosporidium are not, and one cannot assume the effluent would be safe in terms of pathogens if it were to be reused, for example, in agriculture. Even the 4.26 log10 reduction of Vibrio cholerae O1, achieved by a well-performing F/M1/M2 pond system with a hydraulic retention time of 50 days, would have produced an effluent concentration of 23.5 MPN/100mL had the influent entered at the epidemic peak concentration of 4.3 x 10^5 MPN/100mL Vibrio cholerae O1.
Unlike mechanized wastewater treatment systems, whose processes can be continuously modified and controlled during routine operation, natural wastewater treatment systems offer very little operational control of the process once they are constructed. Table 7b shows the results for full-scale waste stabilization pond systems and UASB/waste stabilization pond systems that function poorly, with low log10 reductions and significant concentrations of measured pathogens in final effluents that, for several systems, were being used for agricultural reuse (Verbyla et al., 2016). Unfortunately, improper design, overloading, and inadequate operation and maintenance of natural wastewater treatment systems in developing countries are very common. For example, a recent study by the National Administration for Water in Peru estimates that only 15.8% of all treated wastewater in Peru meets effluent quality levels that protect public health in terms of thermotolerant coliform effluent concentrations (ANA, 2017).
Curtis (2003) has stated that "recreational and wastewater reuse standards assume that the pathogens occur at much lower densities than the indicator and that the reductions of the indicator by 3 − 4 logs results in the elimination of bacterial pathogens." This concept is commonly applied in the U.S. and has also been extended to viral and protozoan pathogens in the WHO guidelines for wastewater reuse in agriculture, which recommend using E. coli for verification monitoring in lieu of monitoring specific pathogens, a task that is much more difficult and likely not possible in much of the developing world (WHO, 2006). This strategy should be revisited in light of recent work on the reduction and survival of pathogens in various treatment processes around the world: different pathogens can exhibit vastly different log10 reductions in both natural and mechanized wastewater treatment systems. One recommended strategy would be to focus on the pathogens that are a local or regional public health concern, such as parasitic infections with protozoa and helminths rather than bacterial or viral pathogens, in the management of wastewater treatment systems where agricultural reuse is an important concern (Verbyla et al., 2015). This strategy applies to mechanized and natural wastewater treatment systems alike.