Using indicators to assess microbial treatment and disinfection efficacy


Published on:
September 13, 2019

Chapter info

Copyright:


This publication is available in Open Access under the Attribution-ShareAlike 3.0 IGO (CC-BY-SA 3.0 IGO) license (http://creativecommons.org/licenses/by-sa/3.0/igo). By using the content of this publication, the users accept to be bound by the terms of use of the UNESCO Open Access Repository (http://www.unesco.org/openaccess/terms-use-ccbysa-en).

Disclaimer:

The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. The ideas and opinions expressed in this publication are those of the authors; they are not necessarily those of UNESCO and do not commit the Organization.

Citation:

Momba, M., Ebdon, J., Kamika, I. and Verbyla, M. 2019. Using indicators to assess microbial treatment and disinfection efficacy. In: J.B. Rose and B. Jiménez-Cisneros (eds), Global Water Pathogen Project. http://www.waterpathogens.org (A. Farnleitner and A. Blanch (eds), Part 2: Indicators and Microbial Source Tracking Markers) http://www.waterpathogens.org/book/using-indicators-assess-microbial-treatment-and-disinfection-efficacy. Michigan State University, E. Lansing, MI, UNESCO.
https://doi.org/10.14321/waterpathogens.9

Acknowledgements: K.R.L. Young, Project Design editor; Website Design: Agroknow (http://www.agroknow.com)

Last published: September 13, 2019
Authors: 
Maggy Momba (Tshwane University of Technology, South Africa), James Ebdon (University of Brighton), Ilunga Kamika (University of South Africa), Matthew Verbyla (San Diego State University)

Summary

One of the primary goals of this chapter is to provide the reader with a broad understanding and appreciation of the various roles indicators play in the design, risk assessment and performance monitoring of commonly used wastewater treatment and disinfection processes. The chapter outlines how different indicators, such as faecal bacteria, bacteriophages, bacterial spores, parasites and helminth eggs, can be used to assess the efficacy of microbial treatment and disinfection in a range of natural and engineered systems. The usefulness of the various indicators will be explored by assessing their presence and behaviour in raw sewage, in primary, secondary and tertiary treated wastewater, in final effluent, in sludge and in reclaimed wastewater used for different reuse applications, such as the irrigation of crops.
Drawing on the latest international scientific research, the chapter assesses indicator performance, reliability, ease of use, safety, cost, and suitability for use in resource-limited and/or emergency settings. The content should therefore provide a state-of-the-art reference resource to help guide water managers, regulators, stakeholders and anyone interested in gaining a better understanding of the advantages and limitations of current treatment indicators.
The chapter will summarize the extensive range of methods available, from the enumeration and detection of traditional faecal indicators (such as Escherichia coli and Enterococcus) and alternative indicators (such as spores of Clostridium perfringens and coliphages), to the application of novel surface-modified “micro mimics” used for elucidating pathogen attenuation and transport in natural and engineered treatment systems. Readers should also gain an insight into the fate, behaviour and transport of various indicators and the different ways they may be applied in natural and engineered systems, e.g. as surrogates for pathogens, as regulatory parameters, and as treatment performance parameters. As such, the content should help water managers, engineers, and water and sanitation specialists to decide on the most appropriate parameter, or group of parameters, to use during microbial treatment and disinfection efficacy investigations.
Reference to international case studies by leading experts in the field should help the reader gain a broader insight into the potential application and geographical stability of the most promising indicators of treatment and disinfection efficacy. As such, the chapter offers an international perspective on current research and provides guidance on the potential utility and future use of the various microbial monitoring tools identified. It is also furnished with examples of the microbial reductions achieved using both intrinsic and ‘spiked’ indicators in large-scale and small-scale natural and engineered treatment systems found in different parts of the world.
The effectiveness and suitability of microbial indicators and traditional surrogates for predicting pathogen fate and transport will also be investigated with respect to high- and low-income urban, peri-urban and rural settings, in order to identify the most useful (repeatable, reproducible) approaches and, where possible, highlight ‘best practice’. The chapter concludes with a summary of the current state-of-the-art, before identifying existing knowledge gaps and future challenges facing practitioners tasked with the further development and application of indicators of treatment efficacy.

1.0 Definition, sources, and composition of wastewater

In brief, wastewater generally refers to liquids and waterborne solids from domestic, industrial or commercial origins, as well as other liquids arising from human activities, which are typically discharged to a sewerage system. Table 1 illustrates the three common categories of wastewater and their sources and compositions. Although sewage generally refers to water containing only sanitary wastes, it technically represents any wastewater which passes through a sewer (Aizenchtadt et al., 2008). The type and volume of wastewater generated is determined by population size, water usage and the combination of surrounding domestic, recreational and industrial activities, all of which affect discharge patterns as well as the chemical and microbial status of the treated effluent (Teklehaimanot et al., 2015).

Table 1. Municipal, industrial and agriculture wastewater sources and composition

| No. | Type of Wastewater | Sources | Composition |
|-----|--------------------|---------|-------------|
| 1 | Domestic wastewater (Municipal): blackwater and greywater | Blackwater: toilets; greywater: kitchens, laundry, washing/bathing | Blackwater (urine, feces, toilet paper) and greywater |
| | Commercial wastewater (Municipal): sanitary wastewater and commercial activity | Restaurants, workshops, etc. | Plant and food waste, oils |
| | Inflow and infiltration (Municipal): storm flows/street washing and groundwater infiltration | Combined systems; separate systems (cross-connections or illegal connections in fractured pipes and manholes) | Sand, grit, hydrocarbons, metals, animal wastes, etc. |
| 2 | Industrial wastewater | Manufacturing processes, electrolysis, heavy metals, equipment cleaning, cooling systems | Highly variable based on the industry |
| 3 | Agricultural wastewater | Agricultural activities | Animal wastes/run-off and redundant agricultural products |

Source: Based on Aizenchtadt et al., 2008

1.1 Wastewater Treatment

According to Mara (2004), in order to set up an efficient waste management system, proper identification and characterization of the influent entering a Wastewater Treatment Plant (WWTP) is essential. In addition to the physical and chemical characteristics of the wastewater, it is also important to understand the biological characteristics at the various stages through the treatment train if pathogen removal is to be optimised and impacts on the immediate and downstream environment (into which the treated wastewater is to be discharged) minimised.

Wastewater treatment can be regarded as the process of removing physical, chemical and microbiological contaminants from any kind of wastewater, with the aim of producing a final product (effluent) of a quality suitable for disposal or, increasingly, for reuse. To achieve this, different processes are applied depending on the main purpose of the treatment and the final use (see Sanitation Technologies). Indicators and index (model) organisms can be used to elucidate removal and behaviour during:

  • preliminary treatment, i.e., removal of coarse solids, grit and grease;
  • primary treatment, i.e., removal of suspended solids and particulate organic matter;
  • secondary (or biological) treatment, i.e., removal of biodegradable organic matter (in solution or suspension) and suspended solids; and finally
  • tertiary treatment, i.e., removal of specific compounds, such as nutrients, pathogens, etc.

While the goal of WWTPs is to safeguard the quality of the aquatic environment and to conserve water resources, the key priority is the removal of pathogenic microorganisms in order to safeguard human health (typically through compliance with national/international effluent discharge standards). As a general rule, proper implementation of this management strategy results in the protection of water quality, the reduction of costs associated with drinking water treatment and the control or elimination of waterborne diseases. However, uncontrolled sewage discharges and effluent from poorly managed or malfunctioning WWTPs constitute a major source of water pollution and public health risk. The potential risks can be exacerbated by increases in population, industrialisation and urban development, as well as by climatic or seasonal variations. Therefore, indicators and index organisms are needed to ensure that adequate levels of treatment are achieved and maintained, so that final effluents comply with the necessary environmental standards.

1.2 Criteria for Choosing an Appropriate Indicator

The basic criteria for selecting an appropriate indicator have been widely described elsewhere (e.g. Berg, 1978; NHMRC, 2001; Payment, 1998; WHO, 2011). Table 2 provides a brief description of the key criteria for indicator organisms used for monitoring wastewater treatment and disinfection processes.

Table 2. Ideal criteria for an appropriate indicator of wastewater treatment

1. Is it detectable using simple, rapid and low-cost laboratory techniques?
2. Is it found in high concentrations in municipal wastewaters?
3. Is it present in wastewater in higher numbers than pathogens?
4. Is it unable to multiply in the environment (and within the treatment system)?
5. Does it have similar survival characteristics to pathogens in the treatment process?
6. Is it non-pathogenic and safe for the analyst to use?


However, to date, no single indicator has been identified which fully fulfils this set of criteria. It is therefore important to understand exactly what a specific indicator is to be used for, prior to establishing what information it may or may not be able to provide regarding treatment and/or disinfection efficacy. Appropriate indicators may be further sub-grouped into (i) process indicators, (ii) fecal indicators and (iii) index and model organisms, depending on both which indicators are used and how they are applied (Table 3). More details on fecal indicators can be found in Harwood et al. (2017) and at https://www.waterpathogens.org/book/bacterial-indicators.

Table 3. Definition of indicators and index microorganisms

| Group | Definition |
|---|---|
| Process indicator | A group of organisms that demonstrates the efficacy of a process, such as coliforms for chlorine disinfection. |
| Fecal indicator | A group of organisms that indicates the presence of fecal contamination, such as E. coli. Hence, they only infer that pathogens may be present. |
| Index organisms | A microbial group or species indicative of pathogen presence, such as E. coli as an index for Salmonella. |
| Model organisms (surrogates) | A microbial group or species indicative of pathogen behaviour, such as F-RNA phages as models for the reduction of human enteric viruses in treatment and disinfection systems. |

Source: Ashbolt et al., 2001; WHO, 2001; WHO, 2002


The following sections outline many of the most widely used indicators, index organisms and model (surrogate) organisms, and should facilitate the selection of the most appropriate approach (or group of approaches) for a specific purpose and/or setting. Whilst the Sanitation Technologies section (see http://www.waterpathogens.org/node/102) can help ascertain which form of treatment is most appropriate for specific pathogens or for a given situation or location, this chapter aims to help establish (i) which indicator or index approach to use, (ii) which applications to use them for, (iii) which pathogens are of most concern and which indicators are most closely correlated with them (see Part Three: Specific Excreted Pathogens: Environmental and Epidemiology Aspects), and hence (iv) which indicator (index) approaches are most suitable for monitoring efficacy and optimizing performance of a particular treatment or disinfection process. Unless specifically stated, the term ‘indicator’ will be used hereafter to describe process, fecal, index and model organisms.

2.0 Indicator Organisms in Wastewater

Although wastewater treatment processes commonly used around the world tend to be very effective at removing organic matter and suspended solids, they are often less effective at removing pathogenic microorganisms. However, the monitoring of pathogens, including bacteria, viruses and protozoan parasites, requires costly and technologically demanding procedures supported by skilled labour. In addition, the sheer number and variety of pathogens present in wastewater can be bewildering and highly variable, often making their routine monitoring either impractical or financially unfeasible. The time required to complete analyses can also hinder their usefulness as water quality control feedback tools. Consequently, monitoring for individual pathogens can neither guarantee the removal of all pathogenic organisms nor ensure the complete safety of the waters. In contrast, indicator organisms have been shown to be present at consistently high concentrations in raw (untreated) wastewater, as shown in Table 4.

Table 4. Typical concentrations of indicator organisms in raw wastewater

| Microorganism | Typical Concentration in Raw Wastewater (per 100 mL)a,b,c |
|---|---|
| Total coliforms (TC) | 10^7 to 10^10 |
| Fecal (thermotolerant) coliforms (FC) | 10^6 to 10^9 |
| Escherichia coli (E. coli) | 10^6 to 10^9 |
| Fecal streptococci/intestinal enterococci (IE) | 10^4 to 10^7 |
| Protozoan (oo)cysts | 10^1 to 10^4 |
| Helminth ova (HO) | 1 to 10^3 |
| Viruses | 10^2 to 10^4 |
| Somatic coliphages (SC) | 10^6 to 10^7 |
| F-specific phages (F-RNA) | 10^4 to 10^5 |

aCFU: Colony Forming Units (bacteria); bPFU: Plaque Forming Units (phages); corganisms per 100 mL for protozoan (oo)cysts and helminth ova.

Source: Adapted from Jofre et al., 2016; von Sperling, 2007; Burton et al., 2014


As a result, indicators have been used to assist in the planning of wastewater treatment, by helping to determine the level of treatment necessary to ensure compliance with environmental standards (e.g. discharge consents). However, in order to achieve and maintain a particular level of wastewater treatment and disinfection, it is often necessary to first understand how efficient each of the various treatment processes is at removing pollutants (commonly referred to as removal efficacy). Indicator organisms are also useful in that they can help elucidate potential impacts on removal efficacy caused by (i) extreme fluctuations in wastewater quality, quantity and/or composition, (ii) seasonal or maintenance-related variations in WWTP performance (e.g. levels of biological activity), and (iii) plant failures (e.g. disruptions in energy/chemical supplies, or loss of membrane integrity).

Fecal indicator organisms can be used to quantify the capability of treatment systems to remove or inactivate bacterial, viral, parasitic protozoan and helminth pathogens. Removal and inactivation are typically expressed in terms of a log10 reduction of indicators (or as a percent reduction); inactivation may also be expressed as a reduction per unit of time (see the persistence section: http://www.waterpathogens.org/node/103).

2.1 Logarithmic Reduction vs. Percentage Removal

The logarithmic (log10) reduction of microorganisms is one of the most common ways of reporting removal efficacy in natural and engineered systems (Rose et al., 1996; Rose et al., 2001), where a ‘one-log10 reduction’ equates to a 90% reduction, a ‘two-log10 reduction’ equates to a 99% reduction, a ‘three-log10 reduction’ equates to a 99.9% reduction, and so on. Assuming that the initial concentration of fecal indicators is known, log10 reductions following a particular unit process (treatment), or series of processes, can help improve understanding of the likely concentrations in effluents. Harwood et al. (2005) suggest that the log reduction of microorganisms should also be supplemented with some of the routine physical and chemical measurements of treatment or disinfection processes. Processes with high removal efficiencies for suspended solids or other chemical parameters may achieve only a 1 to 2 log10 removal of microorganisms (Table 5); if the initial concentration is already high (as shown in Table 4), treated effluents will still contain significant concentrations of these organisms.
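
Expressed formally, the log10 reduction value (LRV) achieved by a treatment step relates the influent and effluent indicator concentrations, and converts to a percent reduction as follows (a standard formulation, where C_in and C_out denote the indicator concentrations entering and leaving the process):

$$\mathrm{LRV} = \log_{10}\left(\frac{C_{\mathrm{in}}}{C_{\mathrm{out}}}\right), \qquad \text{percent reduction} = \left(1 - 10^{-\mathrm{LRV}}\right) \times 100\%$$

For example, an LRV of 3 corresponds to (1 − 10^−3) × 100% = 99.9%.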

Table 5. Percentage and logarithmic removal efficiencies

| Percentage (%) | Log10 units |
|---|---|
| 90 | 1 |
| 99 | 2 |
| 99.9 | 3 |
| 99.99 | 4 |
| 99.999 | 5 |


Therefore, the selection of the most appropriate indicators for monitoring treatment efficacy will depend not only on the type of treatment system, but also on the quality of the source water, the intended use of the treated effluent and the sensitivity of the receiving waters. For example, a treatment process which results in a 2 log10 (99%) reduction in FC levels (assuming raw wastewater contains between 10^6 and 10^9 organisms/100 mL) would produce an effluent containing approximately 10^4 to 10^7 organisms/100 mL. Given that final discharge consent levels (state/national limits on water quality for the effluent discharge) for FC are typically in the region of 10^2 to 10^3 organisms/100 mL (EU, USEPA), it may be necessary to ensure that a removal efficiency of 3 log10 (99.9%), or even 4 log10 (99.99%) or more, is achieved. Indicator concentrations are also commonly expressed in terms of orders of magnitude (powers of 10) or as logarithms, due to high levels of variability, uncertainty in more precise numerical values and their log-normal pattern of distribution (von Sperling and de Lemos Chernicharo, 2005). Table 6 lists the typical indicator removal efficiencies obtained in a range of natural and engineered wastewater treatment systems. Part Four of the GWPP contains relevant information about the reduction of fecal indicators in sanitation system technologies (see the removal chapter: http://www.waterpathogens.org/node/5085).
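
The arithmetic behind this example is straightforward to script. The following minimal Python sketch (illustrative only; the influent concentrations and the 10^3 organisms/100 mL consent limit are simply the example values used above) computes the effluent concentration left by a given log10 reduction, and the LRV required to meet a discharge consent:

```python
import math

def effluent_concentration(influent, lrv):
    """Indicator concentration remaining after a given log10 reduction (LRV)."""
    return influent / 10 ** lrv

def required_lrv(influent, target):
    """Smallest log10 reduction needed to bring an influent concentration down to a target."""
    return math.log10(influent / target)

# Raw wastewater FC concentrations of 1e6 to 1e9 per 100 mL (Table 4),
# assessed against an example discharge consent of 1e3 per 100 mL.
for c_in in (1e6, 1e9):
    c_out = effluent_concentration(c_in, lrv=2)  # a 2-log10 (99%) treatment process
    print(f"influent {c_in:.0e} -> effluent {c_out:.0e} per 100 mL; "
          f"LRV needed to reach 1e3: {required_lrv(c_in, 1e3):.0f}")
```

Run as written, this reproduces the figures above: a 2 log10 process leaves roughly 10^4 to 10^7 organisms/100 mL, and the upper end of the influent range would require as much as a 6 log10 reduction to meet a 10^3 organisms/100 mL consent.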

Table 6. Typical log10 reduction values reported for fecal indicator bacteria (E. coli, enterococci, coliforms) and fecal indicator viruses (B. fragilis phages, F-specific coliphages, somatic coliphages, PRD1 phage) for a range of sanitation and wastewater treatment technologies

| Technology | Indicator Bacteria: Typical Reductiona | Indicator Bacteria: Maximum Reductionb | Indicator Viruses: Typical Reductiona | Indicator Viruses: Maximum Reductionb |
|---|---|---|---|---|
| Septic System | 2 | 4 | <1 | 2 |
| Sedimentation | <1 | 1 | <1 | 1 |
| Constructed Wetlands | 2 | 4 | 1 to 2 | 3 |
| Waste Stabilization Pondsc | 2 to 3 | 5 | 1 to 3 | 4 |
| Trickling Filter | <1 | 2 | <1 | 1 |
| Activated Sludge | 2 to 3 | 4 | 2 | 4 |
| Membrane Bioreactor | 5 to 6 | >7 | 4 | 7 |
| Sand Filtration (Tertiary) | <1 | 2 | <1 | 2 |
| Microfiltration (Tertiary) | 3 | 5 | 1 to 2 | 3 |

aunder typical conditions; systems that are overloaded or poorly maintained may experience lower reductions; bunder optimal conditions; ca well-functioning system with several ponds in series and a hydraulic retention time of 20 to 40 days

Sources: Anceno et al., 2007; Appling et al., 2013; Bosch et al., 1986; Campos et al., 2016; Chernicharo, 2006; De Luca et al., 2013; Dixo et al., 1995; Elmitwalli et al., 2004; Farahbakhsh and Smith, 2004; Francy et al., 2012; Gantzer et al., 1998; Hamaidi et al., 2014; Kabler, 1959; Nicosia et al., 2001; Nnaji, 2011; Purnell et al., 2016; Rose et al., 2004; Verbyla and Mihelcic, 2015; Zanetti et al., 2010


The absence of fecal indicators (especially FIB) does not necessarily mean an absence of pathogens (particularly enteric viruses). It is therefore advisable, where possible, to select a combination of indicators which possess a variety of physical and chemical characteristics (e.g. sizes, morphologies, surface charges, hydrophobicities) and which better represent the wide range of bacterial, viral and protozoan pathogens often present in wastewater.

Indicators can provide crucial information regarding system performance and reliability and can even help identify events, or conditions that could lead to treatment system and compliance failures. However, with numerous indicators to choose between, it is important to be aware of suitable applications and potential limitations when trying to assess wastewater treatment and disinfection efficacy.

2.2 Fecal Indicators and Index Organisms

2.2.1 Fecal indicator bacteria

The term fecal indicator bacteria (FIB) describes the range of bacteria that inhabit the gastrointestinal tract of homeothermic animals (mammals and birds). Their presence in water may indicate fecal contamination and a possible association with enteric pathogens. Generally, indicator bacteria include the total coliforms (TC), thermotolerant or fecal coliforms (FC), Escherichia coli (E. coli) and Enterococcus spp. (IE), most of which are recurrently excreted in feces (Rochelle-Newall et al., 2015). Together they comprise the most widely used and best understood group of indicators in wastewater treatment and disinfection.

2.2.2 Total coliforms (TC) and fecal coliforms (FC)

The coliform group includes a number of genera and species of bacteria described as facultative anaerobes (i.e. organisms which can survive in the absence of oxygen) and sharing common biochemical and morphological attributes: Gram-negative, non-spore-forming rods capable of fermenting lactose within 24 to 48 h at 35°C. Fecal coliforms (FC) are also often referred to as thermotolerant coliforms due to their ability to produce acid and gas from lactose at a temperature of 44°C (within 24 h). Coliforms (TC and FC) have been widely used in many countries as a monitoring tool to predict the presence of bacterial, viral and protozoan pathogens in wastewater. However, no quantifiable relationship between TC and pathogenic microorganisms appears to exist, leading some scientists to refer to them as “environmental” coliforms, given their possible occurrence in non-fecally contaminated water and soils (von Sperling and de Lemos Chernicharo, 2005).


Fecal coliforms (FC) encompass the genus Escherichia and, to a lesser degree, species of Klebsiella, Enterobacter and Citrobacter (https://www.waterpathogens.org/book/bacterial-indicators). Whilst FC are thermotolerant, the presence of free-living bacteria of non-fecal origin is also possible, though much less likely than when detecting TC. The widespread usage of FC is due to their consistent association with wastes of human and animal origin (Cabral, 2010). For many years, most international water and wastewater quality guidelines and standards have included coliforms as a measurement of microbiological water quality and for compliance reporting. However, their application has highlighted certain limitations (Table 7) and has led to a change of focus in recent years towards alternative indicators (Table 8).

Table 7. Limitations associated with the application of coliforms (TC and FC)

| # | Limitation | Source |
|---|---|---|
| 1 | Short survival in the water body; caution should be exercised when interpreting FC results | New Hampshire Department of Environmental Services, 2003; Savichtcheva and Okabe, 2006 |
| 2 | Reliable indicators of contamination, but their absence does not indicate the absence of fecal contamination, as the source of contamination can be animal excreta, wastewater, sludge, septage, or biosolids | New Hampshire Department of Environmental Services, 2003 |
| 3 | Some coliforms are non-fecal in origin | Scott et al., 2002; Simpson et al., 2002 |
| 4 | Ability to multiply after release into the water column | Desmarais et al., 2002; Solo-Gabriele et al., 2000 |
| 5 | Not always a reliable indicator of the destruction of individual species or groups of pathogens during wastewater treatment processes; highly susceptible to disinfection processes | Hurst et al., 2002; New Hampshire Department of Environmental Services, 2003 |
| 6 | Inability to identify the source of fecal contamination (point and non-point); FC occur in both human and animal sources of pollution, so detection does not reveal whether contamination is of human or animal origin | New Hampshire Department of Environmental Services, 2003; Horman et al., 2004; Winfield and Groisman, 2003; Tyagi et al., 2006 |
| 7 | Reliable indicators of the survival of most bacterial pathogens, but less reliable as indicators of the presence of viruses and parasites | New Hampshire Department of Environmental Services, 2003; McQuaig et al., 2006; Field and Samadpour, 2007; Stoeckel and Harwood, 2007 |
| 8 | Low levels of correlation with the presence of pathogens and low sensitivity of detection methods | Horman et al., 2004; Winfield and Groisman, 2003 |


Table 8 illustrates the apparent change of focus by key international bodies regarding the use of coliforms as indicators of fecal pollution.

Table 8. Worldwide change of focus regarding use of coliforms as fecal indicators

| Guidelines | Changes of Indicators | Alternative Microbial Indicators |
|---|---|---|
| European Union | Removal of TC in 1998 | Enterococci (NHMRC, 2001) |
| WHO Guidelines (Volume 2 of the 2nd edition) | Detailed discussion of the inadequacies of TC as an indicator of fecal pollution | Debates the merits of alternative indicators such as enterococci and sulphite-reducing clostridia (WHO, 1996) |
| New Zealand | Removal of FC or TC | Revision of water quality standards by the New Zealand Ministry of Health: inclusion of E. coli only as a bacterial indicator of fecal pollution |
| Australian drinking water guidelines | Decision that TC be removed as a health compliance parameter for fecal contamination | E. coli retained as the primary compliance parameter for fecal contamination (NHMRC, 1996) |


It has also been suggested that, because coliforms do not form spores, they are much less resistant to destruction by environmental conditions than bacterial, viral, and protozoan pathogens (APHA, 2001). What is clear is that there is a greater level of understanding about the behaviour (e.g. presence, fate and transport) of coliforms (and other FIB) in WWTPs found in temperate climates, compared with those in tropical climates.

2.2.2.1 Limitations of coliforms

To summarize, the principal limitations of coliforms include the facts that they: (i) are not necessarily from an exclusively fecal source (Simpson et al., 2002), (ii) are capable of multiplying in the environment (under certain conditions), and (iii) have a low (or non-existent) correlation with the presence of many waterborne pathogens (McFeters et al., 1974; Farnleitner et al., 2000; Savichtcheva and Okabe, 2006; Stedtfeld et al., 2007). For example, it has been extensively demonstrated that TC and FC bacteria do not adequately reflect the occurrence of pathogens in disinfected wastewater effluent, due to their relatively high susceptibility to chemical disinfection (Miescier et al., 1982; Tyagi et al., 2006) and their failure to correlate with protozoan parasites such as Cryptosporidium (Bonadonna et al., 2002; Harwood et al., 2005), viral indicators such as coliphages (Harwood et al., 2005) and enteric viruses (Havelaar et al., 1993). However, despite these potential limitations, many less-economically-developed countries (LEDCs) still rely upon TC and FC as the principal indicator organisms for the monitoring of surface water resources and wastewater effluents. In addition, the fact that TC and FC are far more sensitive to disinfection than enteric viruses and protozoa means that they should be absent immediately after disinfection, so their presence serves as an indication of inadequate wastewater treatment (Ashbolt, 2004).

2.2.3 Escherichia coli (E. coli)

Escherichia coli has long been used as an indicator of fecal pollution (Geldreich, 1966) and exhibits many of the characteristics of a good fecal indicator (Table 2). Although some E. coli strains are pathogenic (e.g. E. coli O157:H7) and play important roles in intestinal and urinary tract infections, the majority of E. coli strains reside harmlessly in the colon (Scheutz and Strockbine, 2005). Unlike TC and FC, E. coli is exclusively fecal in origin and is considered a highly specific indicator of fecal pollution originating from humans and warm-blooded animals (DWAF, 1996; National Health and Medical Research Council, 2003). In human and animal faeces, 90 to 100% of coliform organisms isolated have been found to be E. coli (Hurst et al., 2002). Its laboratory detection is straightforward, principally involving fluorogenic and chromogenic methods. Escherichia coli is a useful treatment parameter due to its role as a primary compliance parameter for fecal contamination for key international bodies, such as the USEPA and the European Union (EU), and in national legislation, e.g. the Australian drinking water guidelines. In addition, according to Wiedenmann et al. (2006), E. coli has also been used in epidemiological studies to consistently relate recreational water quality to health outcomes. Further examples of studies using E. coli, and the typical concentrations and removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Tables 4 and 6.

2.2.3.1 Limitations of E. coli

In spite of its support as the sole bacterial indicator of recent fecal contamination (Tallon et al., 2005), some studies have suggested that E. coli may be a less reliable indicator in tropical settings, due to concerns that it may persist or even proliferate in such environments, particularly those with high temperatures and elevated levels of nutrients and organic matter (Solo-Gabriele et al., 2000; Desmarais et al., 2002; Winfield and Groisman, 2003). What is more, care should also be taken, as E. coli is much more sensitive to inactivation than pathogenic bacteria, viruses and protozoa (Sinclair et al., 2009). For example, E. coli (and coliforms), which are the indicators often chosen for reclaimed water regulations or guidelines, have been shown to be inactivated more efficiently than any other indicators by certain wastewater treatment processes (Havelaar et al., 1993; Leclerc et al., 2001; Harwood et al., 2005). The detection and enumeration of E. coli typically involves the use of a chromogenic agar, or confirmatory steps, which are more costly than those for other commonly used FIB and may reduce its suitability in certain low-resource settings.

2.2.4 Intestinal Enterococci

The enterococci were formerly a subset of the fecal streptococci (FS) group (which included the species Strep. avium, Strep. gallinarum, Strep. bovis and Strep. equinus); Strep. faecalis and Strep. faecium, the species most frequently found in humans, were subsequently transferred to the newly formed genus Enterococcus (Schleifer and Kilpper-Bälz, 1984). Intestinal enterococci (IE) are differentiated from other streptococci by their ability to grow in 6.5% NaCl and at high pH (9.6) and temperature (45°C). Intestinal enterococci are most frequently used as FIB, or general indicators of fecal contamination, but they are also used as surrogates for pathogens and/or health effects in risk assessment and other modelling applications (Cizek et al., 2008; Ma et al., 2007; Schoen et al., 2011; Byappanahalli et al., 2012; Sinclair et al., 2012; Tseng and Jiang, 2012; Wade et al., 2006). E. faecium and E. faecalis have been shown to be the dominant species present in municipal wastewaters (Sinton and Donnison, 1994; Blanch et al., 2003; Moore et al., 2008; Ferguson et al., 2013). Intestinal enterococci have also proven to be especially reliable as indicators of health risk in fecally impacted marine environments and recreational waters (Cabelli et al., 1982; Cabelli, 1983).

The use of intestinal enterococci as indicators of fecal pollution is strongly recommended for the monitoring of wastewater quality, as they are considered a major group of fecal pollution indicators (WHO, 1996; Gleeson and Gray, 1997). This is largely because, in general, they may not grow as readily as E. coli and FC in the environment (https://waterpathogens.org/book/bacterial-indicators). Moreover, they are highly NaCl, pH and temperature tolerant and show greater survival, especially in marine waters (McFeters et al., 1974). Intestinal enterococci are shed in high numbers and can be detected and enumerated using rapid and simple methods (Pinto et al., 1999). However, Tyagi et al. (2006) suggest that it is important to combine intestinal enterococci with E. coli in order to obtain results that increase confidence in the absence or presence of fecal pollution. The behaviour of these FIB under environmental conditions is expected to reflect the presence of enteric pathogenic bacteria (Tallon et al., 2005). Although FC or TC are often used to assess the efficacy of disinfection (Harwood et al., 2005), other studies have suggested that the resistance of intestinal enterococci to disinfection makes them a better predictor of the fate of viruses than coliforms (De Luca et al., 2008; Zanetti et al., 2007). Further examples of studies using intestinal enterococci, and the typical removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Table 6. It has also been suggested that their increased survival (compared to E. coli) and greater resistance to chlorination make them suitable indicators of inefficient disinfection processes.

2.2.4.1 Limitations of intestinal enterococci

Previous studies have suggested that populations of intestinal enterococci may be endogenous in sediments and soils and not exclusively of fecal origin, which may confound accurate water quality assessments (Byappanahalli and Fujioka, 2004; Desmarais et al., 2002; Byappanahalli et al., 2012). It is therefore advisable, where possible to enumerate intestinal enterococci alongside other FIB (e.g. E. coli/FC) and vice versa.

2.2.5 Clostridium perfringens (CP)

Clostridium perfringens (CP) represents approximately 0.5% of the fecal microflora commonly found in human and animal faeces (Bitton, 2005). This anaerobe (i.e. it grows only in the absence of oxygen) has been included in the group of fecal pollution indicators primarily because it is a spore-forming bacterium. Clostridium perfringens and its spores constitute a desirable treatment indicator due to their similarity (in terms of size and persistence) to more resistant pathogens, e.g. protozoan parasites (Giardia cysts and Cryptosporidium oocysts). Its relevance as the only spore-forming indicator of fecal contamination rests on the following attributes; CP:

  1. Can resist chemical and physical treatment processes (and is less affected by predation than other FIB) (Burkhardt et al., 2000; Savichtcheva and Okabe, 2006; Wohlsen et al., 2006).
  2. Forms spores that allow detection but are too enduring to be good indicators of recent fecal contamination (Tallon et al., 2005).
  3. The recovery of vegetative cells may indicate an immediate, untreated fecal source. However, given the important role that water temperature plays in the survival of CP, detection during the warmer seasons may reflect immediate wastewater pollution to a greater degree than during the cooler seasons (Bisson and Cabelli, 1979).
  4. Concentrations have been shown to be significantly correlated with the presence of human enteric viruses and protozoa (Giardia cysts and Cryptosporidium oocysts) (Payment and Franco, 1993), as well as with other pathogen groups, such as Aeromonas sp. (Gleeson and Gray, 1997).
  5. Rarely multiplies in the environment because of the need for anaerobic conditions, and its spores are extremely resistant to environmental factors (Payment, 1998).
2.2.5.1 Spores of sulphite-reducing clostridia (SSRC)

SSRC are also attractive in that they have been shown to survive in wastewater and receiving waters much longer than bacterial indicators and can also resist disinfection. While SSRC have been most widely used as indicators of disinfection efficacy, they have also been used to assess the thermal treatment of dewatered sludge and wastewater (Mocé-Llivina et al., 2003). SSRC have also been used alongside Cryptosporidium oocysts and Giardia cysts to determine the extent of reduction during both aerobic wastewater treatment and anaerobic digestion of sludge at a full-scale wastewater treatment plant in Ottawa, Canada (Chauret et al., 1999). Further examples of studies using CP and SSRC, and the typical removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Table 6.

2.2.5.2 Limitations of CP and SSRC

Tyagi et al. (2006) suggested that CP should be used only in conjunction with E. coli and/or FC, not individually. Lucena et al. (2005) suggest that, while SSRC are always persistent in treatment processes and can consequently be useful to indicate the removal of persistent pathogens, their long residence time and resistance to inactivation in soil and sediments may compromise their use as performance indicators for waste stabilization ponds. SSRC are also too enduring to be good indicators of recent wastewater discharges.

2.2.6 Bacteriophages (phages)

Growing evidence suggesting that FIB are unable to accurately indicate the presence (and concentration) of viral pathogens, especially in WWTPs (USEPA, 2015), has led to a renewed interest in the detection and enumeration of phages. Phages (viruses which infect bacteria) are considered to be better predictors of human enteric virus removal than FIB because of similarities in composition, morphology, structure, size and site of replication (Jofre et al., 1986; Gantzer et al., 1998; Grabow, 2001; Sinton et al., 2002; Diston et al., 2012; Ebdon et al., 2012; Jofre et al., 2014). Consequently, phages, which can be detected using relatively simple, affordable, standardized laboratory techniques, have increasingly been used in a variety of different capacities for assessing wastewater treatment and disinfection processes (Tartera and Jofre, 1987; Tartera et al., 1989; Lucena et al., 1994; Grabow, 2001; Purnell et al., 2015; Purnell et al., 2016). According to Amarasiri et al. (2017), phages have been the most widely used microbial parameter for performance validation and operational monitoring with respect to virus reduction efficiency in wastewater treatment processes. In fact, phages have been used as:

  1. Fecal indicators - the occurrence and persistence of some groups of enteric phage relate to health risks associated with fecal pollution and the potential occurrence of enteric pathogens (Havelaar, 1987; IAWPRC, 1991; Leclerc et al., 2000; Morinigo et al., 1992; Lucena et al., 2006; Lucena and Jofre, 2010). As a result, phages infecting enteric bacteria are now accepted as useful indicators in water quality control, e.g. coliphages (viruses which infect E. coli) are used by regulatory agencies to establish wastewater treatment efficacy, such as in Australia (Keegan et al., 2012).
  2. Process indicators - certain phage groups have also been successfully employed as enterovirus surrogates in evaluating the effectiveness of treatment processes (e.g. filtration and disinfection) and final product quality (Stetler et al., 1984; Payment et al., 1985; Havelaar et al., 1993; Durán et al., 2003; Davies-Colley et al., 2005; Persson et al., 2005; Abbaszadegan et al., 2008; Amarasiri et al., 2017).
  3. Virus indices - Lucena and Jofre (2010) have suggested that as comprehensive pathogenic virus indices, phages are less useful, as their numbers seldom seem to correlate to numbers of pathogenic viruses in water samples (at least when conventional statistics are applied). However, the authors also suggest that the future application of advanced mathematical models to new databases may reduce uncertainty and provide better information about relationships between phage and pathogenic virus numbers.
  4. Models and tracers - phages are often used as biocolloids to estimate the fate and transport of pathogenic viruses through natural and synthetic saturated and unsaturated porous media e.g., reedbeds and in surface and subsurface aquatic environments (Mesquita et al., 2010). The use of either naturally occurring (indigenous), or more commonly ‘spiked’ (introduced) phages as surrogates for pathogen transport, facilitates the design of more efficient treatment systems (in terms of pathogen removal).
  5. Microbial source tracking (MST) - the high level of host specificity of certain phages has been harnessed using a range of genotypic and phenotypic methods described elsewhere (see the phage chapter: http://www.waterpathogens.org/book/coliphage). Phage-based MST tools have been applied to wastewaters (both raw and treated), most commonly during technique development (to ensure host specificity and geographical distribution) rather than as indicators of treatment efficacy per se. However, knowledge about specific inputs may be useful in certain situations, such as in wetland systems or reuse schemes, where it might be useful to determine the contribution of non-human inputs, such as those arising from birds.

Payment et al. (1988) suggested early on that phages are useful for understanding and assessing treatment efficacy, due largely to the fact that, like viruses, they adsorb to solids. This ability to attach to suspended solids facilitates their sedimentation and constitutes an important removal mechanism within both natural and engineered systems. Evidence of this can be seen in the increased phage densities reported in sediments (Araujo et al., 1997; Skraber et al., 2009b) and sewage sludges (Lasobras et al., 1999; Mignotte et al., 1999; Guzman et al., 2007). According to Jofre et al. (2007), adsorption also influences the effective retention of viruses and phages by microfilters used in water treatment, whose pore sizes are larger than viruses and phages (Herath et al., 1998; Farahbakhsh and Smith, 2004). In a recent review, Amarasiri et al. (2017) suggest that, even though there is no strong correlation between the log10 removal values (LRVs) of phages and human viruses in wastewater treatment unit processes, MS2 coliphages may be used as an indicator for human viruses in MBRs, given that phage LRVs have been shown to be lower than those of human viruses (e.g. norovirus GII and enterovirus). However, other bacteriophages have shown higher LRVs than human viruses, though comparisons between the two are currently scarce except for MBR and activated sludge processes (Amarasiri et al., 2017).

The most common phages used in wastewater monitoring fall into three main groups: (i) somatic coliphages – phages that infect E. coli strains; (ii) male-specific F-RNA coliphages – phages commonly used as indicators of human enteric viruses; and (iii) phages infecting strictly anaerobic Bacteroides spp., which comprise a major part of the human gastrointestinal microbiota (Grabow, 2001). Jofre et al. (2007) suggest that the numbers of all three groups of phages (provided a geographically suitable host strain of Bacteroides is used) are fairly constant in raw sewage throughout the world, as are the numbers of FIB (Table 4). Therefore, the enumeration of certain groups of phages has been proposed as a way to model the removal of enteric viruses in treatment systems (IAWPRC, 1991).

2.2.6.1 Coliphage (Somatic and F-RNA)

Coliphages have been regarded as useful microorganisms for evaluating wastewater treatment efficacy (Duran et al., 2003; Lucena et al., 2004; Bitton, 2005; USEPA, 2015). An extensive review of over 2,500 articles recently conducted by the USEPA suggested that this is because coliphages and human enteric viruses have similar morphological and structural characteristics, often co-occur in feces, and often share similar fate and transport characteristics (USEPA, 2015). Therefore, the reduction of coliphages and human enteric viruses may follow similar patterns during wastewater treatment, depending on the method of pathogen removal (Havelaar et al., 1993; Turner and Lewis, 1995; Rose et al., 2004).

The behaviour of coliphages through WWTP processes has been extensively documented in the international scientific literature, including frequent comparisons with FIB and viral pathogens (most notably norovirus) (Lodder and de Roda Husman, 2005; Haramoto et al., 2006; Ottoson et al., 2006; Aw and Gin, 2010; Francy et al., 2011; Francy et al., 2012; Keegan et al., 2012; Flannery et al., 2012; Flannery et al., 2013; Carducci and Verani, 2013; Grøndahl-Rosado et al., 2014; Kauppinen et al., 2014). Whilst coliphages appear to be eliminated in a similar way during both primary and secondary wastewater treatment (Jofre et al., 2016), this does not appear to be the case when additional treatment steps, such as sand filtration or MBR technologies, and either chemical or physical disinfection are involved. Further examples of studies using coliphages, and the typical removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Table 6. Methods for the simultaneous detection and enumeration of F-specific (F-RNA) and somatic coliphages (SC) have also been developed in recent years (Guzmán et al., 2007; Agulló-Barceló et al., 2016).

2.2.6.1.1 Somatic coliphage (SC)
SC are a heterogeneous group of phages whose members infect host cells (E. coli and other Enterobacteriaceae) by attaching to receptors located in the bacterial cell wall (Mesquita et al., 2010). The presence of SC in wastewater means that these viruses can serve as general fecal pollution indicators and may be associated with the presence of other enteric viruses (Havelaar et al., 1990; Calci et al., 1998). SC are the most abundant indicator phages in raw wastewater, with values usually less than one order of magnitude lower than the number of FC (Nieuwstad et al., 1988; Grabow et al., 1993; Chung et al., 1998; Contreras-Coll et al., 2002; Lucena et al., 2003; Lodder and de Roda Husman, 2005; Jofre et al., 2007). SC have been reported to generally outnumber F-RNA phages in wastewater and raw water sources by a factor of about 5, and infectious human viruses by a factor of about 500 (Grabow et al., 1993; Gantzer et al., 1998; Grabow, 2001; Aw and Gin, 2010). Their numbers are reported to be low in human feces (often <10 per g), despite being highly abundant and widespread in untreated municipal wastewaters (Havelaar et al., 1986; Jofre et al., 2016).


Specific SC used as viral surrogates to assess different fate and transport mechanisms include ΦX174, PRD-1, T-2, T-4, and T-7 (WHO, 2004; Lucena and Jofre, 2010). According to Mesquita et al. (2010), phage PRD-1 in particular has emerged as an important viral model, due to its similarity to human adenoviruses in size (~62 nm) and morphology (icosahedral), its relative stability over a range of temperatures and its low degree of attachment to sediments (Harvey and Ryan, 2004; Ferguson et al., 2007). Further examples of studies using SC, and the typical removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Table 6.

2.2.6.1.2 F-specific RNA phage (F-RNA)
F-RNA phages are the most extensively studied phage group, due to their similarity to many pathogenic human enteric viruses, such as enteroviruses, caliciviruses, astroviruses and hepatitis A and E viruses (Cramer et al., 1976; Jofre et al., 2011). Importantly, they are capable of surviving many sewage treatment processes (Ayres, 1977; Grabow et al., 1978; Grabow et al., 2001). According to Jofre et al. (2007), F-RNA phages rank second in abundance in municipal and hospital raw sewage and in raw wastewater from abattoirs, with values usually about one order of magnitude lower than SC (Havelaar and Hogeboom, 1984; Nieuwstad et al., 1988; Grabow et al., 1993; Chung et al., 1998; Contreras-Coll et al., 2002; Lucena et al., 2003; Blanch et al., 2004; Lodder and de Roda Husman, 2005).

The presence of F-RNA phages in high numbers in wastewater and their resistance to chlorination contribute to their usefulness as process indicators and indices of sewage pollution (Havelaar et al., 1993; Love and Sobsey, 2007). They are also promising MST tools, since they can be subdivided into four distinct genogroups (GII and GIII associated with humans; GI and GIV with animals). The results of a study recently conducted in Japan revealed that GI F-RNA coliphages were the most abundant genogroup present in secondary-treated sewage and, as such, may be used as an appropriate indicator of virus reduction during wastewater treatment (Haramoto et al., 2015). F-RNA phages used as viral surrogates to assess different fate and transport mechanisms include f2, MS2, and Qß (WHO, 2004; Lucena and Jofre, 2010). MS2 and f2 are morphologically similar to enteroviruses and, consequently, are frequently used to study viral resistance to environmental stressors, disinfection and other treatment processes (Havelaar, 1986; Havelaar et al., 1993; WHO, 2004). The USEPA guidelines for water reuse (2012) suggest the use of MS2 coliphage for on-site validation of treatment processes. What is more, MS2 is also considered to be the best surrogate for studying sunlight disinfection in wastewater treatment ponds. Further examples of studies using F-RNA phages, and the typical removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Table 6.

2.2.6.1.3 Limitations of SC and F-RNA phages
The heterogeneous nature of both SC and F-RNA phages means that care should be taken when selecting specific phages as indicators, as their behaviour may not be indicative of the whole group. Fortunately, the high levels of SC and F-RNA phages present in matrices such as primary and secondary effluents and raw sludge mean that it is often possible, and more desirable, to elucidate the effect of treatment using both groups of phages as a whole. Costan-Longares et al. (2008) observed differential removal for various indicators and pathogens through tertiary treatment, though not through secondary treatment. Phage reductions tend to be significantly lower than those observed for bacteria (Harwood et al., 2005; Mandilara et al., 2006; Costan-Longares et al., 2008; Ebdon et al., 2012). The lower density and highly variable occurrence of indigenous F-RNA phages in influent has been suggested as a potential limitation of their use as an indicator. Reported removal rates of phages in stabilization ponds are also erratic (Ohgaki et al., 1986; Lewis, 1994; Davies-Colley et al., 1997; Hill and Sobsey, 1998; Campos et al., 2002; Verbyla and Mihelcic, 2015).

2.2.6.1.4 Use of coliphage as regulatory tools
Largely as a result of the shortcomings associated with traditional FIB, coliphages are increasingly being used by regulatory agencies as indicators for establishing wastewater treatment efficacy. For example, as Keegan et al. (2012) note, when evaluating a WWTP, the South Australian and Victorian Departments of Health use minimum removal values as defaults for each treatment process, unless it has been demonstrated that a greater inactivation is achievable in the system. Table 9 shows the log10 reductions for wastewater treatments used in Australia. Interestingly, in this instance the coliphage removals are more similar to human virus (adenovirus, rotavirus and enterovirus) removals than to E. coli or bacterial pathogen removals for many treatments.

Table 9. Log10 removals of enteric viruses and indicator organisms by various wastewater treatment and disinfection processes

Indicative Log10 Removalsa

| Treatment | Coliphages | E. coli | Viruses (Including Adenoviruses, Rotaviruses and Enteroviruses) | Bacterial Pathogens |
|---|---|---|---|---|
| Primary | N/Ab | 0 to 0.5 | 0 to 0.1 | 0 to 0.5 |
| Secondary | 0.5 to 2.5 | 1.0 to 3.0 | 0.5 to 2.0 | 1.0 to 3.0 |
| Dual media filtration with coagulation | 1.0 to 4.0 | 0 to 1.0 | 0.5 to 3.0 | 0 to 1.0 |
| Membrane filtration | 3.0 to >6.0 | 3.5 to >6.0 | 2.5 to >6.0 | 3.5 to >6.0 |
| Reverse osmosis | >6.0 | >6.0 | >6.0 | >6.0 |
| Lagoon storage (waste stabilization ponds) | 1.0 to 4.0 | 1.0 to 5.0 | 1.0 to 4.0 | 1.0 to 5.0 |
| Chlorination | 0 to 2.5 | 2.0 to 6.0 | 1.0 to 3.0 | 2.0 to 6.0 |
| Ozonation | 2.0 to 6.0 | 2.0 to 6.0 | 3.0 to 6.0 | 2.0 to 6.0 |
| UVC light | 3.0 to 6.0 | 2.0 to >4.0 | >1.0 adenovirus; >3.0 enterovirus, hepatitis A virus | 2.0 to >4.0 |

aReductions depend on specific features of the process, including detention times, pore size, filter depths, and disinfectant. Each row shows only the reduction for that treatment step; bN/A: not applicable

Sources: Australian Guidelines for Water Recycling, 2008; Keegan et al., 2012

2.2.6.2 Bacteroides phage

Phages infecting Bacteroides rank third in abundance in raw wastewater after SC and F-RNA phages (Tartera and Jofre, 1987; Tartera et al., 1989; Lucena et al., 1994; Ebdon et al., 2007; Ebdon et al., 2012), and their ratio with respect to SC and F-RNA phages has been shown to be remarkably constant (Puig et al., 1999; Contreras-Coll et al., 2002; Lucena et al., 2003; Blanch et al., 2004; Jofre et al., 2007). Most Bacteroides phages have a very narrow host range (Lucena and Jofre, 2010). Some strains, such as B. fragilis HSP-40 and GB-124, and B. thetaiotaomicron GA-17, appear to be restricted to human fecal sources (Tartera et al., 1989; Payan et al., 2005; Ebdon et al., 2007), whilst others, such as B. fragilis RYC-2056, are not (Lucena and Jofre, 2010). Bacteroides phages are more resistant than SC and F-RNA phages to most inactivating factors and treatments, and they do not replicate outside the gut. The inability of these phages to multiply in the environment counts in their favour with regard to their utilisation as models/surrogates to assess the survival of enteric viruses in water treatment and disinfection processes (Grabow, 2001).

Data from WWTPs indicate that the survival of naturally occurring Bacteroides phages in primary and secondary treatment stages is similar to that of other phages (Lucena et al., 2004), but that their survival in tertiary treated wastewaters (including UV irradiation and/or chemical disinfection) more closely reflects that of more persistent microorganisms (Costan-Longares et al., 2008). They have also been shown to accumulate in sewage sludge and are quite resistant to sludge treatments (Guzmán et al., 2007). The Bacteroides phages most commonly used in environmental and treatment resistance spiking studies are B40-8 and B56-3 (Lucena and Jofre, 2010). Tartera et al. (1988) found B. fragilis phage B40-8 to be more resistant to inactivation by chlorine than poliovirus type 1, simian rotavirus SA11, coliphage f2, Escherichia coli and E. faecalis. Purnell et al. (2015) used a range of indigenous and ‘spiked’ phages (MS2 and B124-14) of known size and morphology to demonstrate how their enumeration may offer a practical and conservative way of assessing the ability of MBRs to remove enteric viruses of human health significance. Further examples of studies using Bacteroides phages, and the typical removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Table 6.

2.2.6.2.1 Limitations of Bacteroides phages
The main drawback associated with Bacteroides phages is the requirement for geographically specific hosts. However, hosts such as RYC-2056, GA-17, GB-124 and ARABA84 have been successfully identified and applied in many parts of the world, including Europe and North and South America (Puig et al., 1999; Contreras-Coll et al., 2002; Payan et al., 2005; Ebdon et al., 2007; McMinn et al., 2014; Diston and Wicki, 2015). Whilst Bacteroides phages may be present in low numbers in treated wastewaters and require anaerobic conditions for growth, effective and standardised methods now exist for their detection and enumeration.

3.0 Pathogen Monitoring as Indicators

3.1 Enteric Viruses

According to Rose et al. (1996), enteric virus removal from wastewater continues to receive attention due to the epidemiological significance of viruses as waterborne pathogens and because of the high diversity of viruses excreted in human waste.

3.1.1 Enteroviruses (EntV)

EntV are members of the Picornaviridae family, are spherical, non-enveloped viruses, and are amongst the smallest RNA viruses (approximately 27 to 32 nm in diameter). EntV have been considered as potential indicators of human enteric viruses in a range of different wastewaters and waters. Jofre et al. (2007) suggest that one reason for this is that, of all the enteric viruses, they are the most easily detectable by cell culture. Replication of EntV in Buffalo Green Monkey (BGM) cells is the only way in which the infectious nature of the virus can be fully determined. Molecular approaches, such as reverse transcription-polymerase chain reaction (RT-PCR), are also commonly used for the sensitive, specific, and more rapid (24 to 48 h) detection of the EntV genome.

EntV are a regulatory parameter in certain parts of the world, such as the US, where they are used to help determine the safe unrestricted use of sludge in agriculture (USEPA, 1992). Numerous studies have been conducted, which have involved the detection of RNA/DNA viruses such as EntV and human adenoviruses (HAdV) in wastewaters using real-time PCR (Bofill-Mas et al., 2006; Katayama et al., 2008; Rodriguez et al., 2008; Laverick et al., 2004; Carducci et al., 2008; da Silva et al., 2007) and in some cases also the accompanying viral infectivity by cell culture (Aulicino et al., 1995; Petrinca et al., 2009; Rodriguez et al., 2008; Sedmak et al., 2005).

For example, Simmons and Xagoraraki (2011) looked at the removal of EntV (and HAdV, HAV and Norovirus GI/GII) across four points (influent, pre-disinfection, post-disinfection and biosolids) at five full-scale WWTPs in the US. The study sought to compare the removal efficiency between MBR and conventional treatment processes (including activated sludge, oxidative ditch, UV, chlorination) to elucidate how enteric viruses are removed and inactivated during treatment. Their findings revealed that EntV were detected in 100% of influent samples (average concentration 2.1×10^5 viruses/L) and 67% of samples overall (using real-time PCR). The authors observed a significant log10 reduction (1.9 to 5.0; average 4.2 log10) in infectious viruses throughout the treatment process, and the log10 removal values for EntV were similar for both treatment types (3.6 log10 for MBR and 2.9 log10 for conventional). This study illustrates the utility of directly detecting (either by cell culture or real-time PCR) aetiological agents of waterborne disease (EntV), which may not only provide information on the likely presence of other groups of enteric viruses but may also help elucidate the effect of treatment and disinfection processes on human health risk.
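Log10 reduction values (LRVs) such as those reported above are calculated from paired influent and effluent concentrations. The following minimal sketch illustrates the arithmetic; the effluent value is a hypothetical round number chosen so that the result matches the study's average LRV, not a figure taken from their dataset.

```python
import math

def log10_reduction(c_in: float, c_out: float) -> float:
    """Log10 reduction value (LRV) from paired influent/effluent concentrations.

    Both concentrations must be in the same units (e.g. viruses/L or GC/L)
    and above the assay's detection limit.
    """
    if c_in <= 0 or c_out <= 0:
        raise ValueError("concentrations must be positive")
    return math.log10(c_in / c_out)

# Influent of 2.1e5 viruses/L (as reported) with a hypothetical effluent of 13 viruses/L
print(f"LRV = {log10_reduction(2.1e5, 13):.1f} log10")  # ~4.2 log10
```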

3.1.1.1 Limitations of EntV

Detection by cell culture is time-consuming (taking 1 to 2 weeks) and difficult to perform, making it unsuitable for application in many low-income settings (e.g. LEDCs). In addition, not all viral serotypes can be detected, and low concentrations of EntV can be a problem in certain situations. Molecular-based detection of EntV, like many genomic methods, is limited in that it does not allow, without the introduction of additional steps, the distinction between infectious and non-infectious viruses (Nuanualsuwan and Cliver, 2002).

3.1.2 Human adenovirus (HAdV)

Human adenoviruses (HAdV) consist of at least 51 serotypes that have been defined and classified into six different species (A–F) under the Mastadenovirus genus of the Adenoviridae family (Kuo et al., 2010). HAdV have been proposed as indicators of fecal viral contamination because of their consistent prevalence and stability in human sewage (Bofill-Mas et al., 2006) and resistance to certain environmental stressors and disinfectants (Enriquez et al., 1995; Gerba et al., 2002). In fact, Ogorzaly et al. (2010) suggest that the higher persistence of HAdV during wastewater treatment compared to other enteric viruses may be due to the high stability of their double-stranded DNA genome compared to that of RNA viruses. They also tend to be more abundant and stable in the environment than EntV. The literature shows that approximately 10^5–10^9 and 10^2–10^5 viral genomic copies/L of HAdV have been detected in untreated and treated wastewaters, respectively (He and Jiang, 2005; Bofill-Mas et al., 2006; Simmons and Xagoraraki, 2011). Multiple serotypes of adenoviruses may be present in sewage, but it has been suggested that the enteric serotypes 40 and 41 dominate among HAdV serotypes overall (Haramoto et al., 2006). A potential advantage of detecting DNA viruses such as HAdV is that nucleic acid amplification does not require the RT step, making it slightly less complex, costly and time-consuming when compared with the molecular detection of EntV.

Rodriguez-Manzano et al. (2012) concluded that HAdV quantified by qPCR represents a useful pathogen/indicator tool in water and may be used as an indicator of the removal efficiency of pathogens by WWTPs. Their results revealed that HAdV were detected in reclaimed wastewater following disinfection processes and exhibited similar log10 removal behaviour to that of Giardia cysts and Cryptosporidium oocysts (detected using immunofluorescence assays), showing higher concentrations than the other pathogens or fecal indicators, even in tertiary reclaimed water that complies with current regulations in Spain (Real Decreto 1620/2007). Further examples of studies using HAdV and typical removal efficiencies obtained in a range of natural and engineered treatment systems can be found in Tables 6 and 9.

3.1.2.1 Limitations of HAdV

Although molecular detection methods for HAdV are highly specific, sensitive and expeditious, they are unable to distinguish between infectious and non-infectious particles. Whilst cell culture techniques are available for the detection of adenovirus, the costs and time associated with this approach may prevent its regular use in certain parts of the world.

3.1.3 JC polyomaviruses (JCPyV)

According to Bofill-Mas et al. (2000), JCPyV were first described to occur in sewage in 2000 and have subsequently been reported in wastewater from all over the world at concentrations as high as 10^8 GC/L, as well as in water matrices impacted by sewage discharges, albeit at lower concentrations due to dilution and inactivation. In sludge and/or biosolids produced in WWTPs, JCPyV is present in high concentrations of up to 10^3 GC/g (Bofill-Mas et al., 2006). JCPyV has been proposed as a human fecal viral indicator due to its high prevalence in sewage from all geographical areas where it has been tested thus far and due to persistent shedding by the human population. Approximately one-quarter of the human population sheds JCPyV DNA in urine at variable concentrations, and excretion may increase as immunosuppressed populations increase (Yogo et al., 1990; McQuaig et al., 2009).

According to Nims and Plavsic (2012), polyomaviruses appear to be more resistant to UV radiation than other small non-enveloped viruses such as the parvoviruses and caliciviruses. Relatively high temperatures (>70°C) are also required to effect thermal inactivation of the polyomaviruses. The chemical inactivants that are effective are those that have displayed efficacy for other small non-enveloped viruses (i.e. ethanol, sodium hydroxide, formaldehyde) (Nims and Plavsic, 2012). According to Bofill-Mas and co-workers, data reported on the tertiary treatment of secondary wastewater effluents containing JCPyV are diverse: JCPyV reductions of up to 1 log10 after application of chlorination, filtration and coagulation and UV disinfection were reported by Rusiñol et al. (2015), while no JCPyV reduction was seen when equivalent treatment was applied in another plant (Fernandez-Cassi et al., 2016).

3.1.3.1 Limitations of JCPyV

When present in tertiary treated wastewater disinfected with UV radiation, it is difficult to know whether JCPyV remain infectious due to difficulties in growing them in cell culture. Several studies have suggested that mean concentrations observed before and after secondary treatments such as lagooning (Fernández-Cassi et al., 2016) or polishing ponds (Jurzic et al., 2015) were not significantly reduced, and in some instances actually increased. Conversely, Rodriguez-Manzano et al. (2012) undertook a five-month study in Spain to determine the suitability of JCPyV as an indicator across two full-scale sewage treatment plants, in order to define the most useful indicators for the microbiological control of reclaimed water. Whilst the authors detected JCPyV (by qPCR) in 100% of raw sewage samples (average concentrations of 5.44×10^4 to 9.11×10^4 GC/100 mL), they were absent (unlike HAdV) from tertiary reclaimed water (post UV). The authors conclude that these results indicate that HAdV quantified by qPCR represents a more suitable pathogen/indicator tool (compared to JCPyV) in such waters and, unlike JCPyV, may be used as an indicator of the removal efficiency of pathogens by WWTPs. These often contradictory findings suggest that further research is needed in order to better understand the behaviour and potential utility of JCPyV.

3.1.4 Aichi virus (AiV)

AiV RNA concentrations in influent and effluent wastewater have been determined to be up to 2.2×10^7 and 1.8×10^4 GC/L, respectively (Kitajima et al., 2013). A recent study by Schmitz et al. (2016) compared the removal of 11 different virus types (pepper mild mottle virus (PMMoV); Aichi virus (AiV); Norovirus GI, GII and GIV; EntV; sapovirus (SaV); group-A rotavirus (ARV); adenovirus (AdV); and JC polyomavirus (JCPyV)) by two full-scale WWTPs utilizing advanced Bardenpho technology (primarily for nitrogen reduction) and compared the results with previously monitored conventional treatment processes at the same WWTPs in Southern Arizona (prior to upgrade) (Kitajima et al., 2014). The results revealed which wastewater treatment processes were most proficient at minimizing the incidence of pathogenic viruses in effluent waters intended for reclamation and recycling, and which viral markers correlated best with viral pathogens. The results led these authors to suggest that AiV could be used as a conservative viral marker with which to determine adequate wastewater treatment, as it, firstly, most often showed the best correlation coefficients with viral pathogens and, secondly, was always detected at higher concentrations than the other viruses. The viral markers were also able to help demonstrate that secondary treatment in the form of an advanced Bardenpho process reduced pathogenic viruses more effectively than WWTPs using conventional processes such as trickling filters and activated sludge. The most likely reasons for this were virus sorption to solids and improved removal of nutrients and suspended matter, which may have led to enhanced virus reduction downstream due to more efficient disinfection.

Rachmadi et al. (2016) determined the occurrence of enteric viruses (EntV, AdV, AiV1, Norovirus GI/II) and their reduction in two surface flow constructed wetlands receiving treated (biological treatment) wastewater in Arizona. In addition, the potential of pepper mild mottle virus (PMMoV) and JC and BK polyomaviruses as indicators of treatment performance was also assessed. The findings revealed that the most abundant enteric viruses detected in treated wastewater (inlet of the wetlands) were AdV and AiV1. These viruses were detected at concentrations of 10^2–10^5 GC/L in the inlet of one of the wetlands, where virus removal efficiencies of up to 2.5 log10 were observed. AiV1 was detected in 51% (14/27) of samples from one of the wetland sites, leading the authors to suggest that the high abundance and persistence of AiV1 in the wetlands was probably due to its constant presence in wastewater effluent in Arizona (Rachmadi et al., 2016; Kitajima et al., 2014).

3.1.4.1 Limitations of AiV

AiV detection using qPCR may overestimate virus concentrations (and hence risk) because the assay detects total nucleic acids, including free RNA and DNA as well as both infectious and non-infectious viruses.

3.2 The Plant Pepper Mild Mottle Virus (PMMoV)

As a conservative indicator, PMMoV has potential due to its high abundance in treated wastewater and persistence during wetland treatment (Rachmadi et al., 2016; Hamza et al., 2011). Rachmadi et al. (2016) studied two surface flow constructed wetlands receiving treated (biological treatment) wastewater in Arizona, and their findings suggest that PMMoV was more persistent than JC and BK polyomaviruses. PMMoV was detected in all wetland samples (inlet, outlet, and intermediate), ranging from 10^2–10^7 GC/L, with less than 1 log10 reduction by wetland treatment. These results suggest that PMMoV is very stable and a potential conservative tracer of wetland treatment performance with respect to virus occurrence and reduction. No correlation was observed between log10 PMMoV concentration and temperature (R = −0.1492) or turbidity (R = −0.222). However, the authors reported a weak correlation between log10 PMMoV concentration and pH (R = 0.4355, P value = 0.0334). Hamza et al. (2011) also showed that PMMoV was abundant in all raw and treated wastewater samples from 20 conventional activated sludge plants situated in Northern Germany. The detection of PMMoV in 100% of raw human sewage and final effluent samples is consistent with the findings of Rosario et al. (2009) and supports the concept of using this virus as an indicator of fecal contamination and possibly as an indicator of treatment efficacy.
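Correlations of this kind are typically computed between log10-transformed virus concentrations and the environmental variable of interest. A minimal sketch of that calculation, using made-up paired measurements (not data from Rachmadi et al.):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements: PMMoV concentrations (GC/L) and pH
# at the same sampling points (illustrative values only).
pmmov_gc_per_l = np.array([3.2e4, 1.1e5, 6.8e5, 2.4e6, 9.5e6, 4.0e5])
ph = np.array([7.1, 7.3, 7.6, 7.8, 8.1, 7.4])

# Correlate on the log10 scale, as in the studies cited above
r, p_value = pearsonr(np.log10(pmmov_gc_per_l), ph)
print(f"R = {r:.4f}, P = {p_value:.4f}")
```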

3.2.1 Limitations of PMMoV

It has been suggested that PMMoV may be considered too conservative, as PMMoV particles appear to be more stable than virions of HAdV and HPyV due to their capsid structure (Fauquet et al., 2005); however, its absence may ensure a very low probability of the presence of human viruses in the treated water. It is also important to bear in mind that virus detection by qPCR does not necessarily represent the presence of infectious viruses; however, failure to detect a virus by qPCR (in the absence of extraction/qPCR inhibition) does indicate the absence of the virus and may be considered a conservative performance target for a treatment system.

3.3 Helminth Ova (HO)

Most helminths (parasitic worms) are excreted in a non-infectious state and must mature in the environment; although they can survive in wastewater or sludge, they do not pose a health risk until the infective eggs (ova) of the helminths (nematodes) have formed. Most helminths are not specifically monitored in sewage; however, Ascaris, Trichuris and Necator americanus can be used as indicators of other helminths (e.g. cestodes, trematodes and other nematodes), which are removed in wastewater treatment by the same mechanism (e.g. sedimentation) (von Sperling and Chernicharo, 2005). HO can be regarded as indicators of treatment efficiency, especially for the solid portion of the waste stream.

According to the WHO (2004), HO are one of the main health risks associated with the reuse of wastewater and sludge due to their long latency periods (the period of time needed to mature), long persistence in the environment, and low infective dose. For unrestricted irrigation, the WHO recommends water containing less than one nematode egg per litre (WHO, 1989). Helminth ova are particularly useful when assessing treated wastewater and sludge for reuse (e.g. irrigation, aquaculture), where there is a possibility of either direct contact with contaminated waters or exposure via consumption of uncooked produce. In less economically developed countries (LEDCs), levels as high as 3,000 HO per litre have been reported in municipal wastewater and 735 HO per litre in sludge (Jimenez-Cisneros, 2007).

Helminth ova are removed from wastewater by the same processes used to remove suspended solids, namely sedimentation, coagulation-flocculation and, if used as a tertiary treatment, filtration. The viability of HO may also be used to establish the efficacy of specific disinfection processes. However, Jimenez-Cisneros (2007) suggests that, in contrast to FC, HO cannot be inactivated with chlorine, UV radiation or ozone (in the latter case at least not at economical doses, because >36 mg O3 per litre with a 1 hour contact time is needed). For example, eggs of parasitic worms such as Ascaris lumbricoides (large intestinal roundworm) are extremely resistant to chemical treatments such as lime stabilization but can be inactivated by high temperatures. Consequently, FC are an inadequate indicator of helminths and viruses in anaerobically digested biosolids (New Hampshire Department of Environmental Services, 2003; Savichtcheva and Okabe, 2006).

Sossou et al. (2014) successfully used viable HO (Ascaris lumbricoides) and protozoan cysts (Entamoeba histolytica cysts) to assess the removal and deactivation of intestinal parasites in urine diverting composting toilets (UDCT) in Burkina Faso. The authors assessed the removal of HO and protozoan cysts over 60 composting days and compared them with the removal of indicator bacteria (coliforms, E. coli) and pathogenic bacteria (Salmonella sp.). Compared to the pathogenic bacteria, which were totally eliminated after 30 composting days, HO were progressively reduced in number during the composting process and totally eliminated after 35 days, while protozoan cysts were still present after 60 days. At high concentrations, protozoan cysts were shown to be more persistent than HO in the UDCT and may constitute a sanitary risk when the compost is used as fertilizer. Because of the persistence of intestinal parasites in UDCT, both HO and protozoan cysts are good indicators for the removal of intestinal parasites.

Another recent application by Amoah et al. (2016) involved the investigation of soil-transmitted helminth (STH) ova concentrations in both wastewater used for irrigation and the soil, as well as the STH ova load in the stool of farmers and their family members in Kumasi, Ghana. Their findings indicated that farmers and family members exposed to irrigation water were three times more likely to be infected with Ascaris and hookworm than the control group of non-farmers. Further examples of studies using HO and typical removal efficiencies obtained in a range of natural and engineered treatment systems can be found in Table 6.

Gyawali et al. (2017) recently reported the development of a novel quantitative PCR (qPCR) assay to quantify Ancylostoma caninum ova in wastewater (natural and artificially seeded) and sludge. The newly developed qPCR assay estimated an average of 3.7×10^3 gene copies per ovum, which was validated by seeding known numbers of hookworm ova into treated wastewater. Further quantification of A. caninum ova in non-seeded wastewater matrices indicated that 50%, 90% and 67% of treated wastewater (1 L), raw wastewater (1 L) and sludge (~4 g) samples, respectively, had variable numbers of A. caninum gene copies present. After conversion of the qPCR-estimated gene copy numbers to ova numbers for unseeded wastewater samples, treated wastewater, raw wastewater, and sludge samples had an average of 0.02, 1.24 and 67 ova/L, respectively.
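The conversion from gene copies to ova follows directly from the assay's calibration factor: dividing the qPCR-derived gene copy count by the average number of gene copies per ovum gives an estimated ova concentration. A minimal sketch of that arithmetic (the sample value below is illustrative, not taken from the study):

```python
GENE_COPIES_PER_OVUM = 3.7e3  # average calibration reported by Gyawali et al. (2017)

def ova_per_litre(gene_copies: float, sample_volume_l: float) -> float:
    """Estimate hookworm ova concentration from a qPCR gene copy count.

    Assumes the average gene-copies-per-ovum calibration holds; in reality
    copy number varies with the stage of ovum development (see Section 3.3.1).
    """
    return gene_copies / GENE_COPIES_PER_OVUM / sample_volume_l

# Illustrative example: 4.6e3 gene copies detected in a 1 L raw wastewater sample
print(f"{ova_per_litre(4.6e3, 1.0):.2f} ova/L")  # ~1.24 ova/L
```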

3.3.1 Limitations of HO

Not all wastewater and sludge contain significant amounts of HO; as such, they are not universal parameters for monitoring wastewater and sludge treatment. HO are also less useful in applications involving high-temperature treatment, due to inactivation. Recent findings suggest that whilst qPCR can be used for the quantification of HO in wastewater and sludge samples, caution should be exercised when interpreting such qPCR data for health risk assessment, because the number of gene copies in an ovum varies with the stage of ovum development.

4.0 Molecular-based Detection of Indicators and Markers

4.1 Polymerase Chain Reaction (PCR)

Molecular-based detection methods based on PCR for the detection and quantification of microorganisms in raw and treated wastewaters are rapidly gaining ground on traditional culture-based approaches (at least in more economically developed regions of the world). This is largely in response to improved reliability, sensitivity, greater standardization, significantly lower costs, more rapid assays and more effective targets (particularly enteric viruses) for detection (greater specificity). The most widely used approaches for detecting fecal indicators (and pathogens) include quantitative real-time PCR (qPCR), and reverse transcription qPCR (RT-qPCR) for RNA targets, involving fluorogenic probes (TaqMan) or intercalating dyes (SYBR Green) to measure the accumulation of amplicons (DNA/RNA) in real time during each cycle of the PCR (Douterelo et al., 2014). These approaches are quantitative, highly sensitive and offer fast and accurate results. However, while this technology does not, for example, allow an assessment of the infectivity of viruses, it has proven useful in providing information on the physical removal of viruses and the loss of their genomes through degradation processes (Kitajima et al., 2014).
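In practice, qPCR quantification relies on a standard curve: quantification cycle (Cq) values measured for serial dilutions of a quantified standard are regressed against log10 gene copies, and unknown samples are read off the resulting line. The sketch below illustrates this; the Cq values, slope and efficiency are illustrative, not from any cited study.

```python
import numpy as np

# Standard curve: Cq values for 10-fold serial dilutions of a quantified standard
log10_copies_std = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
cq_std = np.array([18.1, 21.5, 24.8, 28.2, 31.6])

# Linear fit: Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(log10_copies_std, cq_std, 1)

# Amplification efficiency; an ideal reaction has slope ~ -3.32 (100% efficiency)
efficiency = 10 ** (-1 / slope) - 1

def copies_from_cq(cq: float) -> float:
    """Estimate gene copies in an unknown sample from its Cq value."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"Cq 26.0 -> {copies_from_cq(26.0):.0f} gene copies")
```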

Hata et al. (2013) demonstrated by reverse transcription quantitative PCR (RT-qPCR) that the reduction of F-RNA coliphage genogroups at a wastewater treatment plant (WWTP) employing an activated sludge process and sand filtration was lowest for F-RNA coliphages of GI (0.49±0.29 log10), followed by GII (2.04±0.50 log10) and GIII (3.39±0.55 log10). Considering that the reduction ratios of various viruses, such as noroviruses and sapoviruses, fell between the reduction ratios for GI and GIII, these genogroups have been suggested as indicators that predict the magnitude of virus reduction at a WWTP (Hata et al., 2013). However, the results did not provide any information about the reduction of the infectivity of F-RNA coliphage genogroups (Hata et al., 2013). Flannery et al. (2013) determined the reduction of GII F-RNA coliphages at a WWTP using both plaque assay and RT-qPCR: the reduction determined by plaque assay (mean, 2.30 log10) was significantly higher than that determined by RT-qPCR (mean, 0.54 log10).

Another study by Haramoto et al. (2015) evaluated the applicability of using indigenous infectious F-RNA coliphage genogroups as an indicator of virus reduction at four points (raw influent, aeration tank effluent, return activated sludge, and secondary effluent) within a WWTP in Japan. Their findings revealed that GI F-RNA coliphages became the most abundant group after treatment and had much lower reduction ratios than the other phage groups. As such, the authors suggest that indigenous infective GI F-RNA coliphages may be used as an appropriate indicator of virus reduction during wastewater treatment, though they recommend that further studies are needed to evaluate the wider applicability of this novel indicator of virus reduction at other WWTPs.

4.1.1 Limitations of molecular-based detection of indicators and markers

Wery et al. (2008) concluded that qPCR enabled the accurate and reproducible quantification of non-dominant bacteria, such as Salmonella spp. and C. perfringens, in wastewater and treated water when this was not possible using culture techniques. However, Wery et al. (2008) also suggested that detection limits still have to be improved in order to use this technique to detect non-dominant bacteria in solid matrices such as sludge and compost. The presence of inhibitory compounds in wastewaters (e.g. humic substances, urea, bile salts) and the expense associated with the routine application of molecular approaches mean that the detection and enumeration of culturable indicators are likely to have a continued role in studies of wastewater treatment and disinfection efficacy, particularly within resource-limited settings. Results of previous studies (e.g. Flannery et al., 2013) have also indicated that the use of RT-qPCR alone underestimates the reduction of infectious F-RNA coliphages during wastewater treatment. However, molecular approaches (RT-qPCR) can be used in conjunction with culture-based approaches (e.g. Haramoto et al., 2015). Further examples of studies using a range of molecular-based approaches, along with typical concentrations of gene copies (GC), limitations and removal efficiencies obtained in a range of natural and engineered treatment systems, can be found in Tables 4 and 6.

In addition, nucleic acid extraction efficiencies vary considerably between different methods, and the final nucleic acid yield depends on the methods used and the type of environmental sample. According to Girones et al. (2010), this makes direct comparison of absolute gene numbers between studies extremely problematic. The efficiency of the reverse transcription step may also be variable and, in general, RT-qPCR is considered to be more sensitive to inhibitors than qPCR. Nevertheless, qPCR methodology facilitates the evaluation of the removal efficiency of indicators (or selected pathogens), including viruses, in water treatment plants.

4.2 High Throughput Sequencing

Molecular methods such as high-throughput sequencing techniques (e.g. Roche 454 FLX, Illumina/Solexa Genome Analyzer), in which DNA fragment libraries are amplified and sequenced using massively parallel platforms, should allow improved understanding of microbial diversity and community structure in water, wastewater and biofilms in the not too distant future. According to Douterelo et al. (2014), these methods are already faster and less expensive than traditional Sanger sequencing and have the advantage of allowing multiple samples to be combined in a run. However, these methods are not quantitative, have high costs, and are time-consuming. The need for extensive data processing means that they are currently unsuitable for routine analyses of treatment and disinfection efficacy.

4.3 DNA-chip Array/Microarrays

DNA-chip arrays/microarrays involve the detection of fluorescent PCR amplicons (DNA/RNA), which are hybridized to known molecular probes attached to the chip/microarray, allowing the detection of fecal indicators and pathogens. Advantages include the fact that DNA-chip arrays are rapid and that the intensity of the hybridization signal is proportional to the abundance of the target organisms present. Limitations associated with this approach are that it is currently very costly and requires highly trained personnel to analyse and interpret the data.

5.0 Emerging Indicators and Index (Model) Organisms

5.1 Micro Mimics as Pathogen Surrogates

Micro mimics such as fluorescent polystyrene microspheres have been proposed as surrogates for the transport, attenuation and removal of certain pathogenic microorganisms (including viruses, bacteria and protozoa) in natural and engineered systems (Bates et al., 1995; Auckenthaler et al., 2002; Harvey et al., 1989; Harvey et al.,1993; Harvey et al., 2011; Sinreich et al., 2009; Becker et al., 2003; Champ and Schroeter, 1988). Whilst fluorescent polystyrene microspheres are nongenotoxic (Behrens et al., 2001) and have often been used as safe surrogates for microbial particles in field experiments (Harvey et al., 2008; Mohanram et al.,2010), their use has been limited, because their surface properties (e.g., surface charge) are very different from certain pathogens (e.g. C. parvum oocysts), resulting in different attachment and filtration characteristics (Bradford and Bettahar, 2005; Tufenkji and Elimelech, 2005; Harvey et al., 2008).
However, recent advances in the form of modified protein-coated microspheres and DNA-labelled, protein-coated silica nanoparticles have been developed, whose behaviour appears to more effectively mimic pathogens of human health significance such as Cryptosporidium parvum and adenovirus/rotavirus, respectively (Pang et al., 2012; Pang et al., 2014). A recent study of filtration and transport of rotavirus and adenovirus in porous media (sand columns) suggested that the protein-coated silica nanoparticles predicted virus removal and attachment kinetics better than phage MS2 and that the surrogates remained stable in size, charge and DNA signal for over a year (Pang et al., 2014). Provided such approaches can be successfully up-scaled, they may offer a cost-effective tool for studying virus retention and transport in treatment processes. As such, they may be particularly useful for assessing the removal efficacy of filters in water and wastewater treatment systems. For field and transport studies involving pathogens, there has been considerable interest in using fluorescent carboxylated microspheres (FCM) as surrogates, because they are chemically inert, negatively charged, easy to detect, available in a wide variety of sizes, and have been found to be non-hazardous in tracer applications (Behrens et al., 2001).

5.1.1 Limitations of micro mimics

Previous use of microspheres has been limited because their surface properties (e.g., surface charge) are very different from those of pathogens such as C. parvum oocysts, resulting in different attachment and filtration characteristics. Although some microspheres have proven to be less-than-ideal analogs for capturing the abiotic transport behavior of viruses and bacteria, there is encouraging recent evidence regarding the use of FCM as surrogates for C. parvum oocysts. Although the costs associated with the application of micro mimics continue to decrease, they may still be prohibitively costly and technologically unsuitable for use in low-resource settings.

5.2 Phages of Aeromonas, Enterobacter and Klebsiella

Among the less well understood approaches is the use of phages capable of infecting other groups of enteric bacteria, such as strains of Aeromonas, Enterobacter, and Klebsiella.

5.2.1 Limitations of phages of Aeromonas, Enterobacter and Klebsiella

The application of these promising approaches currently remains limited (Wangkahad et al., 2014), and they have yet to be more widely applied to assess treatment and disinfection efficacy.

5.3 Genetic Bacteroidetes Fecal Markers (GeBaM)

Human-associated genetic Bacteroidetes fecal markers (GeBaM) have recently been successfully applied to raw and biologically treated wastewater at 13 well-characterized wastewater systems and treatment plants in Austria and Germany (Mayer et al., 2016). The authors used volume-proportional automated 24-h sampling to monitor the occurrence, dynamics and removal of FIB (E. coli and IE), Clostridium perfringens spores, human-associated genetic Bacteroidetes fecal markers (GeBaM), human-specific viral fecal markers (HAdV, JCPyV) and human-associated Bacteroides phage (GA-17) to determine the effect of WWTP size, type and seasonality. The findings provided strong empirical evidence of the ubiquitous and abundant occurrence of GeBaM in raw and biologically treated wastewater, regardless of whether it was derived from single households or larger settlements. As such, these results provide the first comprehensive information on the occurrence and dynamics of GeBaM in raw and biologically treated wastewater from several well-characterized wastewater systems and treatment plants. The authors suggest that GeBaM qPCR quantification offers a potentially new means of complementing routine water quality testing, as it can be performed with at least equal precision to traditional ISO-based cultivation techniques (Stapleton et al., 2009; Mayer et al., 2016; Betancourt and Fujioka, 2006; McQuaig et al., 2012; Molina et al., 2014). These findings are particularly promising given that the low abundance of many indicators and markers in human fecal matter or wastewater from small-scale decentralised treatment systems has limited their application in such situations.

5.3.1 Limitations of GeBaM

This method is still in its infancy and further research is needed to determine the geographical stability and suitability for use in a range of different matrices (e.g. biosolids, sludge, influent, effluent etc).

5.4 Biosensors

Biosensors allow the direct detection of microorganisms using a combination of immunoassay techniques, integrated optics and surface chemistry. Biosensors are suitable for the rapid detection of fecal indicators, but their application to monitoring the performance of WWTPs has yet to be fully realised. Such approaches are currently limited by their sensitivity and by their inability to discriminate between live and dead microorganisms; detection may also depend on prior cultivation of the microorganisms (Douterelo et al., 2014).

6.0 Application of Treatment and Disinfection Indicators in Natural and Engineered Systems

6.1 Examples of Use of Indicators and Index (Model) Organisms to Assess Treatment Efficacy

6.1.1 Pit toilets

Graham and Polizzotto (2013) systematically reviewed the impact of pit toilets on groundwater quality, reporting that studies of pit toilets and groundwater contamination have been limited and have only focused on a few indicator contaminants, including E. coli, faecal coliforms, fecal streptococci, enteric viruses (e.g., adenovirus, rotavirus), and basic water chemistry parameters (e.g., ammonia, nitrate, dissolved solids, metals, chloride, sulfate, potassium, conductivity). The reported lateral travel distances from pit toilets (until contaminants could no longer be detected) ranged from 1 m to 27 m for bacterial indicators and chemical tracers (e.g., nitrate, chloride, salts), but as far as 50 m for viruses (Graham and Polizzotto, 2013, and references therein). This demonstrates that enteric viruses may travel further from pit toilets than fecal indicator bacteria or chemical tracers.

6.1.2 Composting toilets

Redlinger et al. (2001) studied the survival of fecal coliforms and investigated the most appropriate method for treating the contents of 90 dry composting toilets in Ciudad Juárez, Mexico. After classifying the compost into class A (<1,000 MPN/g) and class B (<2×10^6 MPN/g), they reported that class B was the most abundant classification in both the 3- and 6-month samples, at 70.6% and 60.5%, respectively, with the greatest percentage occurring at 3 months. On the other hand, class A samples, although not the dominant class, significantly (P = 0.043) increased at 6 months (35.8%) from the 3-month period (19.4%). This demonstrated that, with respect to fecal coliform reduction, one-third of the prefabricated composting toilets (Sistema Integral de Reciclamiento de Desechos Orgánicos, SIRDO) produced a high-grade end product. The authors concluded that desiccation, rather than biodegradation, was the primary mechanism for fecal coliform reduction (Redlinger et al., 2001). Hill et al. (2013) showed that higher pH and NH3-N concentrations significantly reduced the concentration of E. coli in composting toilets.

6.1.3 Septic systems

Septic systems consist of septic tanks and a system designed to leach the liquid fraction leaving the septic tank into the soil (e.g. a leach pit or leach field). Richards et al. (2016) studied the effluent from 32 septic tanks in a residential area in the northeast of Scotland and reported that tanks serving more than two people had significantly greater concentrations of coliforms than those serving only one or two people. Leach fields provide additional reduction of pathogens leaving the septic tanks, and viruses can travel much further and survive much longer than bacterial pathogens and fecal indicators. A modelling study of virus transport in septic system leach fields suggested that horizontal setback distances between leach fields and drinking water wells should be 39–144 m in sand aquifers, 66–289 m in gravel aquifers, and 1–2.5 km in coarse gravel aquifers. Parasites such as helminth eggs and protozoan (oo)cysts may be partially removed in the septic tank and are removed much more quickly in the leach field than viruses. For example, Piranha et al. (2006) collected samples from 15 different wells located on rural properties in the region of São José do Rio Preto, São Paulo, Brazil, and found that human adenoviruses were detected in more than half of the samples, fecal coliforms were present in only 27.5% of samples, and Cryptosporidium oocysts were not detected in any of the samples. Bacteriophages have been used to assess virus removal efficiency in septic system leach fields, for example, to determine that the efficiency of pathogen reduction in the leach system depends on a range of factors including soil type, hydraulic conductivity, and the distance travelled between the leach pit/field and the groundwater source. For example, Van Cuyk and Siegrist (2007) used unsaturated soil column experiments to show that the reduction of MS2 and PRD-1 bacteriophages in the unsaturated portion of leach systems ranges from <1 log10 unit to nearly 3 log10 units. The authors found that removal of the phages improved over time, and removal efficiencies were influenced by hydraulic loading rates (higher rates improved removal efficiency), soil type (sandy loam was better than medium sand), and dosing rate (removal was more efficient when phages were added to soil columns 24 times per day rather than only 4 times per day).

6.1.4 Waste stabilization ponds and constructed wetlands

Helminth eggs and protozoan (oo)cysts are removed via sedimentation more efficiently than bacteria and viruses in waste stabilization ponds and constructed wetlands. Bacteria in pond systems die off more rapidly than viruses, and high temperatures, high algal activity (high pH and DO), and efficient hydrodynamics are among the most important factors for the reduction of fecal indicator bacteria in waste stabilization ponds. A 10-year study of three waste stabilization ponds in series treating the effluent of a UASB reactor showed that the ponds achieved >4 log10 reduction during all seasons (Dias et al., 2014). The reduction of bacterial pathogens in waste stabilization ponds also appears to be similar to the reduction of fecal indicator bacteria such as E. coli (see chapter on Waste Stabilization Ponds in Part Four of the GWPP). Virus reduction is less efficient in pond systems. Coliphages have been used as indicators to assess virus removal efficiency in waste stabilization ponds, and results from a review of 71 different waste stabilization pond systems did not indicate any statistically significant difference between the efficiencies of the reduction of bacteriophages compared to the reduction of enteric viruses (Verbyla and Mihelcic, 2015).

Studies on wetland systems have traditionally focused on their ability to remove organic and inorganic contaminants. However, more recently, attention has started to be paid to their capacity to eliminate fecal indicators (e.g. Gersberg et al., 1989; Green et al., 1997; García et al., 2010; Headley et al., 2013; Wu et al., 2016). Indicators used to elucidate the removal mechanisms and efficacy of wetland systems typically include bacterial indicators, most commonly total coliforms, fecal coliforms, E. coli, intestinal enterococci (fecal streptococci), staphylococci, Salmonella, Bdellovibrio and Clostridium perfringens (and its spores), and coliphages (somatic and F-RNA specific), and these may be either naturally occurring (i.e. indigenous) or introduced (i.e. 'spiked'). As well as from wastewater inputs, naturally occurring indicators present in wetland systems may also arise from the presence of wildlife (e.g. avian sources), or result from bacterial regrowth within the system.

According to a recent review conducted by Wu et al. (2016), knowledge of the fate and removal of fecal indicator bacteria in constructed wetlands is still insufficient due to the complexity of removal mechanisms and influencing factors. What is more certain is that bacterial indicators such as fecal coliforms (E. coli) are poor indicators for the removal of enteric viruses in wastewater treatment ponds (Maynard et al., 1999; Verbyla and Mihelcic, 2015). Therefore, a range of fecal indicators is likely to be required if the influence of operational parameters such as hydraulic retention time, vegetation, seasonal fluctuation, and water composition is to be elucidated. Only then can treatment systems and regimes be optimized to ensure maximum pathogen removal. It has been suggested that there are limited data on the removal of viruses from treated wastewater by constructed wetlands (Gerba et al., 2013). Almost all of the existing data are on coliphages; in addition, data on enteric viruses are limited to enteroviruses and reovirus (Harwood et al., 2005; Lodder and de Roda Husman, 2005; Rachmadi et al., 2016). Further information about the removal of viruses and other pathogens of human health significance can be found in chapter 60.

Gerba et al. (1999) evaluated the removal of indicator bacteria (coliforms), coliphage, and enteric pathogens (Giardia and Cryptosporidium) using three different wetland systems in Arizona (US). These included a duckweed-covered pond, a multi-species subsurface flow (SSF) wetland and a multi-species surface flow (SF) wetland. The findings revealed that the larger microorganisms (Giardia and Cryptosporidium) were removed most effectively by the duckweed pond (1.7 and 0.96 log10 removals, respectively). The lowest removal occurred in the SF wetland: 0.57 log10 for Giardia and 0.38 log10 for Cryptosporidium. In contrast, the greatest removal of coliphage, total coliforms and fecal coliforms occurred in the SSF wetland (1.3, 2.0, and 1.7 log10, respectively), whereas the pond had the lowest removals (0.22, 0.42, and 0.41 log10, respectively). The authors suggested that sedimentation was likely to be the primary removal mechanism within the duckweed pond, since removal was related to size, and that the optimum removal of the smaller microorganisms observed in the SSF wetland may be related to the large surface area available for adsorption and filtration. These findings suggest that, in order to achieve the highest treatment level of secondary unchlorinated wastewater, a combination of aquatic ponds and subsurface flow wetlands may be necessary.

More recently, Reinoso et al. (2008) studied a combined constructed wetland formed by a facultative pond (FP), a surface flow wetland (SF) and a subsurface flow wetland (SSF) in north-western Spain in order to evaluate the removal of indicators and pathogens and to determine their relationships. Microbial removal was shown to range from 0.66 log10 for coliphages to over 2 log10 for helminth eggs, depending on the treatment system. The highest removal of indicator bacteria (total coliforms, E. coli, fecal streptococci and Clostridium perfringens) occurred in the stabilization pond, reaching 0.8 log10, 1.4 log10, 0.96 log10 and 0.66 log10, respectively. However, the greatest removal of protozoan pathogens (Cryptosporidium and Giardia) and coliphages was found in the SSF wetland: 1.7 log10, 1.5 log10 and 1.22 log10, respectively. In contrast, the SF wetland was most efficient in the removal of pathogenic parasites when considering areal (surface-based) removal rates. Seasonal differences in removal were not found to be statistically significant during the study period. First-order removal rate constants ranged from 0.0027 to 0.71 m/d, depending on the microorganism and type of wetland. Significant correlations were found between pathogenic parasites and fecal indicators in the influent of the treatment system but not at the other sampling points, suggesting that such relationships varied along the system due to the different survival rates of the microorganisms.
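First-order rate constants of this kind can be translated into expected log10 reductions using a simple plug-flow, first-order areal model, in which removal depends on the ratio of the rate constant k (m/d) to the hydraulic loading rate q (m/d). This is a common simplification for ponds and wetlands that ignores dispersion and non-ideal hydraulics; the loading rate below is illustrative, not a value from Reinoso et al. (2008).

```python
import math

def lrv_first_order(k_m_per_d: float, q_m_per_d: float) -> float:
    """Expected log10 reduction for plug flow with first-order areal removal.

    C_out / C_in = exp(-k / q), so LRV = (k / q) / ln(10).
    k: first-order areal rate constant (m/d); q: hydraulic loading rate (m/d).
    """
    return (k_m_per_d / q_m_per_d) / math.log(10)

# Upper rate constant reported by Reinoso et al. (2008) with an assumed
# hydraulic loading rate of 0.1 m/d
print(f"LRV = {lrv_first_order(0.71, 0.1):.1f} log10")  # ~3.1 log10
```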

Abreu-Acosta and Vera (2011) used somatic coliphages in a study comparing the efficiencies of two decentralized natural reclamation systems in Tenerife, Canary Islands (Spain) at removing FIB and enteric pathogens. The natural systems consisted of a combination of anaerobic treatment, a small (12 to 80 population equivalents) horizontal subsurface flow constructed wetland filled with volcanic ash, and a final pond as a water reservoir. Data from both systems confirmed that somatic coliphage resistance was higher than that of the bacterial indicators studied, which is in accordance with previous results obtained from other wastewater treatment systems where their removal has also been found to be lower (Thurston et al., 2001; Lucena et al., 2003; Lucena et al., 2004). The authors concluded that the advantages of somatic coliphages as an indicator are multiple: their abundance, their direct relationship to Giardia sp., their higher resistance than bacterial indicators, their simple and economical determination, and the fact that count results can be obtained in 4 h. Removal of bacterial indicators was found to be significant (ANOVA; p < 0.05), reaching 2 log10 units, and the overall reduction of somatic coliphages, at 1.5 log10 units, was also significant (ANOVA; p < 0.05).

The results suggest that removal of fecal indicators and pathogens in wetlands may occur via a combination of physical factors (e.g. filtration, sedimentation and sorption to organic matter and/or growth matrix), chemical factors (e.g. oxidation and biocidal activity of plants), and biological factors (e.g. predation by nematodes and protists, antimicrobial activity of rhizome exudates, activity of lytic viruses or bacteria, entrapment within biofilms, die-off, and nutrient limitation) (Axelrood et al., 1996; Brix, 1997; García and Bécares, 1997; Ottová et al., 1997; Green et al., 1997; Decamp and Warren, 1998; Decamp et al., 1999; Boutilier et al., 2009; Wu et al., 2016). However, Wu et al. (2016) suggest that the most significant removal mechanisms of fecal bacteria (and pathogens) in constructed wetlands may vary depending on the particular design of the treatment unit, the hydraulic regime, wastewater characteristics, and even the local climate.

6.1.5 Activated sludge

Hamaidi et al. (2014) investigated the removal efficacy of a range of different indicators (fecal coliforms, intestinal enterococci and spores of sulphite-reducing clostridia (SSRC)) and bacterial pathogens (Pseudomonas aeruginosa and Staphylococcus aureus) in a full-scale activated sludge system situated in Algeria, North Africa. Their results (Table 10) showed that removal rates for the different indicators were similar to those for the bacterial pathogens.

Table 10. Removal efficiencies for a range of different bacterial indicators in full-scale activated sludge systems in Algeria and the United States

| Location | Microorganism | Log10 Reduction | Reference |
|----------|---------------|-----------------|-----------|
| Algeria | Fecal coliforms | 1.4 | Hamaidi et al., 2014 |
| Algeria | Enterococci | 1.4 | Hamaidi et al., 2014 |
| Algeria | Spores of sulphite-reducing clostridia | 1.1 | Hamaidi et al., 2014 |
| Algeria | P. aeruginosa | 1.4 | Hamaidi et al., 2014 |
| Algeria | S. aureus | 1.2 | Hamaidi et al., 2014 |
| Ohio, USA | E. coli | 3.0 | Francy et al., 2012 |
| Ohio, USA | Enterococci | 3.0 | Francy et al., 2012 |
| Ohio, USA | Fecal coliforms | 2.8 | Francy et al., 2012 |

6.1.6 Trickling media filters

Edokpayi et al. (2015) recently monitored indicator concentrations (E. coli and IE) at five points through a WWTP in the Limpopo Province of South Africa over a 6-month period (Jan–June). The system consisted of preliminary treatment (screening and grit removal), primary sedimentation, trickling filters, secondary clarification, tertiary treatment with a maturation pond, and chlorine disinfection. Their findings revealed that the plant was not only running beyond its design capacity (receiving 13 megalitres of wastewater per day instead of 6 megalitres per day), but that levels of E. coli in the influent and effluent ranged from 6×10^3 to 2×10^6 CFU/100 mL and 8×10^3 to 1×10^6 CFU/100 mL, respectively, while intestinal enterococci ranged from 4×10^3 to 8×10^5 CFU/100 mL and 2.6×10^3 to 1.1×10^5 CFU/100 mL, respectively. The indicator data also showed that during the months of January to April the E. coli effluent counts were higher than those of the influent, though reduction efficiencies of 25% and 15% were achieved in May and June, respectively. Similarly, intestinal enterococci levels were higher in the effluent than in the influent during February and March, although reduction efficiencies of 40%, 86%, 38% and 20% were recorded in January, April, May, and June, respectively. The observed indicator reductions were insufficient to meet the recommended Department of Water Affairs (DWA) discharge guidelines. As such, the indicators were successfully used to demonstrate that the Thohoyandou WWTP is increasingly unable to cope with the quantity of wastewater it currently receives and that effluent from this plant poses an elevated potential risk to downstream water users. Another study (Momba et al., 2006) in the Eastern Cape Province of South Africa also used indicators to highlight inadequate wastewater treatment at WWTPs in the Buffalo City and Nkonkobe Municipalities, again showing non-compliance with the DWA effluent discharge guidelines.
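Percentage reduction figures like those above correspond to very small log10 reductions, which is why the plant fell short of the discharge guidelines; the relationship is LRV = -log10(1 - R) for a fractional reduction R. A quick illustration:

```python
import math

def percent_to_lrv(percent_reduction: float) -> float:
    """Convert a percentage reduction (0 <= value < 100) to a log10 reduction."""
    return -math.log10(1 - percent_reduction / 100)

for pct in (25, 40, 86, 99, 99.9):
    print(f"{pct:5.1f}% reduction = {percent_to_lrv(pct):.2f} log10")
# 25% -> 0.12 log10; 86% -> 0.85 log10; 99.9% -> 3.00 log10
```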

6.1.7 Membrane bioreactor (MBR) systems

Membrane bioreactors are a relatively new wastewater treatment technology that combines a permselective membrane with a biological process (Judd, 2010; De Luca et al., 2013). Solids are therefore removed by the membrane, rather than by a secondary settling process. Further detail of MBR technologies can be found in the Sanitation Technologies chapters. In brief, MBR membranes have relatively small pore sizes (0.03–0.40 µm), resulting in the physical exclusion of a wide variety of microorganisms (Ottoson et al., 2006; Simmons et al., 2011). Although most viruses are smaller than the membrane pore sizes presently used in many MBR systems, recent studies have reported high removal values for viruses (Kuo et al., 2010; Simmons et al., 2011; Hai et al., 2014; Chaudhry et al., 2015; Miura et al., 2015; Purnell et al., 2015; Purnell et al., 2016). Consequently, there is still some disagreement as to the most important mechanisms for virus removal in MBR systems, but removal is thought to be primarily influenced by the development of a biofilm on the membrane and by virus adsorption to this biomass (Da Silva et al., 2007; Shang et al., 2009; Hirani et al., 2014; van den Akker et al., 2014).

Indicators used to improve understanding of MBR removal mechanisms include traditional FIB, but more recent studies have also involved the use of phage-based indicators. For instance, several studies have consistently demonstrated that microbial removal in MBR systems is more effective than in conventional activated sludge treatment (Arraj et al., 2005; Ottoson et al., 2006; Francy et al., 2012; De Luca et al., 2013; Purnell et al., 2015; Purnell et al., 2016). Importantly, the application of phages in such studies is helping to elucidate the relative contributions of the different physical (pore size reduction), chemical (viral adsorption) and biological (predation) removal mechanisms. Phage-based indicators can also be used to monitor the state of biofilms and to assess the integrity of membranes. Table 11 summarizes the efficiency of MBR systems in removing microorganisms from wastewater.

Table 11. Microbial removal efficiency of MBR systems

| Microbes | Log10 Removal | Reference |
|----------|---------------|-----------|
| Phages | | |
| Somatic coliphage (SC) | 5.6 | Purnell et al. (2016) |
| F-specific (F-RNA) | 3.9 | Purnell et al. (2016) |
| GB-124 | 4.0 | Purnell et al. (2016) |
| Coliphage (F-RNA & SC) | 5.8 | Farahbakhsh and Zhang (2007) |
| Enteric viral pathogens | | |
| Norovirus GII | 2.3a | Purnell et al. (2016) |
| Norovirus GII | 0.2 to 4.7 | Chaudhry et al. (2015); Kuo et al. (2010); Miura et al. (2015); Ottoson et al. (2006); Sima et al. (2011); Simmons et al. (2011) |
| Sapovirus (SaV) | 1.3 to 4.1 | Chaudhry et al. (2015); Kuo et al. (2010); Miura et al. (2015); Ottoson et al. (2006); Sima et al. (2011); Simmons et al. (2011) |
| Calicivirus (CaV) | 3.3 to 6.8 | Chaudhry et al. (2015); Kuo et al. (2010); Miura et al. (2015); Ottoson et al. (2006); Sima et al. (2011); Simmons et al. (2011) |
| Adenovirus (AdV) | 3.9 to 5.5 | Chaudhry et al. (2015); Kuo et al. (2010); Miura et al. (2015); Ottoson et al. (2006); Sima et al. (2011); Simmons et al. (2011) |
| Adenovirus (AdV) | 4.4a | Purnell et al. (2016) |
| Fecal indicator bacteria | | |
| Fecal coliforms (FC) | 5.8 | Farahbakhsh and Zhang (2007) |

aLog10 gene copies


For example, Purnell et al. (2018) used FIB, phages (SC, F-RNA and B. fragilis phage GB-124) and enteric viral pathogens (AdV, HAV, NoV GI/II) to elucidate the potential risks associated with the reuse of MBR effluents for the augmentation of potable water supplies at a full-scale wastewater recycling plant in London, UK. Of the three phage groups, SC demonstrated the strongest positive correlation with AdV and NoV across all seasons. As can be seen in Table 11, significant mean reductions in SC numbers were observed following MBR treatment (5.6 log10). GB-124 and F-RNA phages also demonstrated notable reductions following MBR treatment, with mean reductions of 4.0 and 3.9 log10, respectively, and were undetected (<1 PFU/100 mL) in all post-MBR treatment samples. SC were the only phage group observed in the MBR product (effluent). Log10 reduction values post-MBR for NoV and AdV were 2.3 and 4.4, respectively.

Other similar full-scale systems have demonstrated viral pathogen removal rates of greater than 4 log10 (Chaudhry et al., 2015; Kuo et al., 2010; Simmons et al., 2011). It is important to note that both AdV and NoV were detected (using qPCR) after the chlorination stage of the treatment system in single samples. However, despite evidence of very occasional viral "breakthrough", removal rates suggested that the system is capable of acting as an effective physical barrier to the transmission of human viral pathogens. Given the limitations of qPCR-based approaches in distinguishing viable organisms, and the fact that the log reductions of the phages did not differ significantly from those of the viral pathogens (P > 0.05, Kruskal-Wallis), Purnell et al. (2018) conclude that simple phage-based approaches, such as the enumeration of SC, constitute a useful model-organism approach for better understanding the removal of waterborne pathogenic viruses in such systems. Monitoring MBR systems using phages in this way may not only be useful for elucidating the physical (e.g. pore size reduction), chemical (e.g. viral adsorption) and biological (e.g. predation) removal mechanisms associated with biofilm development, but could also provide early warning of breaches to membrane integrity that might pose a risk to human health.

The efficacy of MBR technology versus a conventional activated sludge (AS) wastewater treatment process in removing total and fecal coliforms and somatic and F-specific coliphages was determined by Farahbakhsh and Zhang (2007) at a WWTP in Guelph, Canada. Their findings demonstrated how MBR systems can achieve effective microbial removal in far fewer steps than conventional AS processes with advanced tertiary treatment (coagulation). Their findings also showed 5.7 and 5.5 log10 removal of fecal coliforms and coliphages (F-specific and somatic), respectively, for the conventional treatment (with advanced tertiary treatment), versus complete removal of fecal coliforms and up to 5.8 log10 removal of coliphages for the MBR system. The authors concluded that the final effluent from either treatment was of a sufficiently high quality to be suitable for reuse applications.

Other workers have monitored fecal indicators alongside pathogenic viruses in MBRs directly using qPCR methodologies, which are based on the detection of nucleic acids rather than complete, infectious particles (virions). In a range of studies, MBR treatment systems have recorded removal rates of between 3.9 and 5.5 log10 units for adenovirus (AdV), 1.3 and 4.1 log10 units for sapovirus (SaV), 0.2 and 5.7 log10 units for norovirus (NoV GII), 0.3 and 3.6 log10 units for enterovirus, and 3.3 and 6.8 log10 units for calicivirus (CaV) (Chaudhry et al., 2015; Kuo et al., 2010; Miura et al., 2015; Ottoson et al., 2006; Sima et al., 2011; Simmons et al., 2011). Whilst qPCR allows for the detection of unculturable pathogens, such as NoV, the detection of nucleic acids from damaged particles in the treated product may lead to over-estimates of the potential risk to human health from reuse water. MBR technologies appear to be particularly suitable for water reuse (Hai et al., 2014).

6.1.8 Moving-bed biofilm reactor

Skraber et al. (2007, 2009a) investigated the presence and persistence of phages (F-RNA phage GI, GII, GIII and GIV) and viral pathogens (EntV and NoV GI/GII) in natural wastewater biofilms in a full-scale moving-bed biofilm reactor (MBBR) situated in Luxembourg. Their findings demonstrated that enteric phages can transfer from wastewater to biofilms and that the viral genomes of phages and NoV are very stable in biofilms, with no significant decrease over at least 49 days. The 7-month (Jan–July) study also revealed that concentrations of both viral genomes and infectious F-specific phages (quantified using both culture and molecular techniques (real-time RT-PCR)) persisted longer in the biofilm than in the wastewater, suggesting that wastewater biofilms may contribute to the persistence and dispersal of pathogenic viruses outside epidemic periods. The F-specific genogroups also provided information on the origin of wastewater inputs, such as animal inputs associated with storm run-off. The authors concluded that failing to consider the potential role of biofilms in the fate of enteric pathogens may lead to false assumptions in risk assessment, modelling research or epidemiological investigations.

6.1.9 Sludge composting

Validation of treatment processes and assurance of the microbiological quality of the effluent and the treated sludge are not easy to perform, as methods of pathogen identification, detection and enumeration are complicated, costly and time-consuming (Mandilara et al., 2006). Consequently, surrogate indicators (typically FC, E. coli and IE) have been used for the routine evaluation of WWTP performance and effluent/sludge quality.

Wery et al. (2008) monitored the indicator bacteria C. perfringens and E. coli along with two enteric pathogens (Salmonella spp. and Campylobacter jejuni), using both quantitative real-time PCR and traditional culture-based detection, at a full-scale WWTP and sludge composting facility in France. The findings demonstrated that, whilst a reduction of all bacteria was observed during wastewater treatment and during the thermophilic phase of composting, the bacterial groups behaved differently during the process, with the main differences observed during biological (activated sludge) treatment. Interestingly, the results for C. perfringens obtained using real-time PCR and traditional culture-based detection were very similar, though this was not the case for E. coli and Salmonella spp., for which PCR-based estimates were up to five orders of magnitude higher than culture-based counts.

In particular, Salmonella spp. and C. jejuni survived activated sludge treatment better than E. coli, and C. jejuni was the most resistant to wastewater treatment of the four bacterial groups. Overall, differences in survival were observed for all bacteria studied when subjected to the same environmental pressures; this holds both for differences between indicators and pathogenic bacteria and for differences among the pathogenic bacteria themselves. These results illustrate the difficulty of defining reliable indicators. The authors concluded that qPCR enabled the accurate and reproducible quantification of non-dominant bacteria, such as Salmonella spp. and C. perfringens, in wastewater and treated water where this was not possible using culture techniques. However, Wery et al. (2008) also suggested that detection limits still have to be improved in order to use this technique to detect non-dominant bacteria in solid matrices such as sludge and compost.

Somatic coliphages (SC) appear to be a promising indicator for evaluating the thermal treatment of sludges. For example, Astals et al. (2012) successfully used SC and F-RNA phages to assess the impact of mesophilic and thermophilic anaerobic digestion, using the same type of reactor and the same raw sewage sludge, on process performance and sludge hygienisation. The authors assessed the degree of hygienisation achieved using E. coli as a bacterial indicator and SC and F-RNA phages as viral indicators. The results showed that the reduction in SC numbers under mesophilic digestion (1.0 log10 units) was significantly lower than the reduction in E. coli (P=0.002), an observation supported by the scientific literature (Lasobras et al., 1999; Aitken et al., 2005; Mandilara et al., 2006; Guzman et al., 2007). The reduction of SC attained by thermophilic digestion, which averaged 2.3 log10 units, was significantly (P=0.005) higher than that achieved by mesophilic digestion; however, as under mesophilic conditions, it was significantly lower than the reduction in E. coli (4.2 log10 units). Poor removal of SC by mesophilic digestion has also been reported by Aitken et al. (2005). Somatic coliphage and F-specific RNA phage reductions also suggested that, from the point of view of sanitation (hygienisation), the ST-STAD under the conditions tested does not contribute a substantial improvement.

6.1.10 Denitrifying woodchip bioreactor

Low-cost, simple denitrifying bioreactors using woodchips or other slow-release carbon sources have been shown to be effective at removing nitrate (NO3) from wastewater (Robertson et al., 2005; Robertson et al., 2008; Schipper et al., 2010; Christianson et al., 2012). Rambags et al. (2016) recently explored the use of fecal indicator bacteria (Escherichia coli) and viruses (F-specific phages) to assess the removal of fecal microbes within a full-scale denitrifying woodchip bioreactor in New Zealand receiving secondary-treated septic tank effluent. Samples were analysed monthly (between February and June 2015) from 9 points through the bioreactor and demonstrated consistent and substantial (P<0.01) reductions of E. coli (2.9 log10) and F-RNA phages (3.9 log10), despite highly fluctuating inflow concentrations (up to 3.5×10^5 MPN/100 mL and 1.1×10^5 PFU/100 mL, respectively). The bioreactor was also shown to be efficient at removing NO3 (>3 log10 reduction) and TSS (0.96 log10 reduction). Bacterial and viral indicators such as those applied by Rambags et al. (2016) are therefore helping to determine whether such bioreactors may have broader versatility for wastewater treatment, beyond nitrate removal. However, their results also suggest that further research into the removal mechanisms is needed in order to determine how long such indicators remain active, and how they are affected by factors such as seasonality, loading rate and inflow concentration.

6.1.11 Wastewater reclamation facilities

Harwood et al. (2005) tested the validity of using indicator organisms (TC, FC, IE, Clostridium perfringens and F-RNA phages) to predict the presence or absence of pathogens (infectious enteric viruses, Cryptosporidium and Giardia) at six wastewater reclamation facilities in the U.S. over a one-year period. Larger sample volumes for indicators (0.2 to 0.4 liters) and pathogens (30 to 100 liters) resulted in more sensitive detection limits than are typical of routine monitoring. Microorganisms were detected in disinfected effluent samples at the following frequencies: TC 63%; FC 27%; IE 27%; C. perfringens 61%; F-RNA phages 40%; and enteric viruses 31%. Cryptosporidium oocysts and Giardia cysts were detected in 70% and 80%, respectively, of reclaimed water samples. Viable Cryptosporidium (based on cell culture infectivity assays) was detected in 20% of the reclaimed water samples. No strong correlation was found for any indicator-pathogen combination. However, when data for all indicators were tested using discriminant analysis, the presence/absence patterns for Giardia cysts, Cryptosporidium oocysts, infectious Cryptosporidium and infectious EntV were predicted for over 71% of disinfected effluents. The failure of single-indicator measurements to correlate with pathogens suggests that public health may not be adequately protected by simple monitoring schemes based on the detection of a single indicator, particularly at the detection limits routinely employed. The authors recommend monitoring a suite of indicators in reclaimed effluent, as these are more likely to be predictive of the presence of certain pathogens in such waters. The following section identifies some of the most promising approaches and technologies used to recycle wastewater, such as the application of membrane bioreactors (MBR) for irrigation or for the augmentation of potable water sources (Harwood et al., 2005).
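
As a sketch of the kind of multi-indicator discriminant analysis described by Harwood et al. (2005), the example below uses scikit-learn's linear discriminant analysis to predict pathogen presence/absence from a suite of indicator concentrations. All values and the choice of indicator columns are hypothetical placeholders, not data from the study.

```python
# Sketch (not the authors' code) of discriminant analysis for predicting
# pathogen presence/absence from a suite of indicator measurements.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Columns: TC, C. perfringens, F-RNA phages (log10 per 100 mL; hypothetical)
X = np.array([[3.1, 2.5, 1.2],
              [0.5, 1.0, 0.0],
              [2.8, 2.2, 0.9],
              [0.2, 0.4, 0.0],
              [3.4, 2.6, 1.5],
              [0.8, 0.9, 0.1]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = pathogen detected, 0 = not detected

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[2.5, 2.0, 0.7]]))  # predicted class for a new sample
```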

6.2 Use of Indicators and Index (Model) Organisms to Assess Disinfection Efficacy

6.2.1 Chlorination

Disinfection of wastewater has been practised since the beginning of the previous century and has subsequently reduced the risks of human exposure to pathogenic microorganisms. To date, the most widely used disinfection technique is chlorination (Solsona and Pearson, 1995; Carvajal et al., 2017; Gagnon et al., 2005; Li et al., 2017a; Li et al., 2017b; Ofori et al., 2017). Chlorination is the process of adding a chlorine-based chemical oxidant, such as chlorine or hypochlorite, to water in order to inactivate pathogens. Oxidants typically used for disinfection include chlorine (Cl2 or hypochlorite, HOCl), chlorine dioxide (ClO2) and chloramine. The effectiveness of chlorination is a function of dose, contact time, pH and temperature (Carvajal et al., 2017). Chlorination can be placed at any position in the treatment process; however, it is important to note that organisms entrapped in particles may be shielded from the action of the chemicals. Over more than 100 years, chlorination has proved effective at inactivating indicator and index organisms, and had helped to eliminate outbreaks of cholera, typhoid and other waterborne diseases in the developed world by the 1940s (Sossou et al., 2016).
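
Because dose and contact time act together, chlorination practice is often expressed as a CT value (residual concentration multiplied by contact time). The minimal sketch below illustrates the arithmetic; the target CT and residual values are hypothetical examples, not regulatory figures.

```python
# Minimal sketch of the CT concept:
# CT (mg·min/L) = disinfectant residual (mg/L) x contact time (min).

def ct_value(residual_mg_l, contact_min):
    """CT achieved at a given residual and contact time."""
    return residual_mg_l * contact_min

def required_contact_time(ct_target, residual_mg_l):
    """Contact time (min) needed to reach a target CT at a given residual."""
    return ct_target / residual_mg_l

print(ct_value(0.5, 30))                # 0.5 mg/L for 30 min -> CT = 15.0
print(required_contact_time(15, 0.25))  # halving the residual doubles t -> 60.0
```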

Chlorine - Chlorine inactivates microbes largely by damaging the cell membrane and disrupting cell permeability (Sossou et al., 2016). Membrane damage is not the only key event in the inactivation of bacteria by chlorine; uncoupling of the electron transport chain, enzyme inactivation (either in the membrane or in the cell interior), damage to nucleic acids and repression of gene transcription are also involved in the bactericidal mechanism of chlorine (Virto et al., 2005; Bitton, 2014).

Of the targeted organisms, viruses and protozoa have been reported to be more resistant to chlorination than bacteria. Studies have reported that protozoan (oo)cysts, such as those of Cryptosporidium and Giardia, are particularly resistant to chlorine disinfection (Chauret et al., 2001; Stanfield et al., 2003). Cryptosporidium is one of the organisms most resistant to chlorine in wastewater, which may pose a public health risk to immunocompromised as well as immunocompetent individuals when the treated effluent is used for unrestricted irrigation or recreation (Taran-Benshoshan et al., 2015).

The inactivation efficacy of chlorine on seeded bacterial (E. coli and Enterococcus faecalis) and viral (MS2) indicators was compared with that on their indigenous counterparts (E. coli, enterococci, F-RNA bacteriophages and enteroviruses) in primary treated sewage effluent by Tree et al. (2003). The results revealed that chlorination was effective, particularly against viruses, with inactivation rates for indigenous enteroviruses similar to those for F-RNA bacteriophages at lower chlorine doses. Furthermore, laboratory-grown poliovirus was inactivated much more rapidly than the naturally occurring indigenous enteroviruses (P<0.001). To increase the inactivation efficacy of chlorination, Li et al. (2017a) proposed a two-step chlorination. A comparison of the disinfection efficiencies of two-step chlorination and the commonly used one-step chlorination of a primary sewage effluent revealed that two-step chlorination enhanced disinfection efficiency by up to 0.81, and in some cases 1.02, log10 units at a time interval of 19 s and a dosage ratio of 5:1.
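
Inactivation data of this kind are commonly summarised with the Chick-Watson model, in which the log10 reduction is proportional to the product of disinfectant concentration and contact time. A minimal sketch follows; the lethality coefficient is hypothetical, chosen only to illustrate the calculation.

```python
# Chick-Watson sketch: log10(N0/N) = k * C**n * t, where k is a lethality
# coefficient, C the residual (mg/L), t the contact time (min) and n the
# dilution coefficient (often taken as 1).
def chick_watson_log10(k, C, t, n=1.0):
    return k * C**n * t

# Illustrative only: k = 0.4 L/(mg·min), C = 1.5 mg/L, t = 10 min
print(chick_watson_log10(0.4, 1.5, 10))  # -> 6.0 log10 predicted
```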

Chlorine dioxide - It has been pointed out that chlorine dioxide has several advantages and can be considered an alternative to chlorine. Murphy et al. (2014) investigated the ability of chlorine dioxide (ClO2) to achieve a 3 log10 inactivation of Cryptosporidium in water used for recreational purposes. This study revealed that Cryptosporidium can be rapidly inactivated by ClO2 and chlorine-free mixtures (5 or 1.4 mg/L ClO2) in the presence of dichlor. This is encouraging, since dichlor and other stabilized chlorine products are routinely used in chlorinated recreational water venues throughout the United States. Ofori et al. (2017) showed that the kinetics and mechanism of ClO2 inactivation of the Gram-negative bacterium Escherichia coli (ATCC 35218) in oxidant-demand-free (ODF) water are a function of disinfectant concentration (0.5–5.0 mg/L), water pH (6.5–8.5), temperature (4–37°C) and bacterial density (10^5–10^7 cfu/mL). Increasing temperature and disinfectant concentration increased the rate of cell killing, but efficacy was significantly reduced at 0.5 mg/L and was less dependent on bacterial density. Bactericidal efficiency was higher at an alkaline pH of 8 or above than at the neutral and slightly acidic pH values of 7 and 6.5. The disinfection kinetic curves followed a biphasic pattern of rapid inactivation within the initial 2 min, followed by tailing even in the presence of residual biocide. The authors suggested that inactivation of the Gram-negative bacteria was due to disruption of the cytoplasmic membrane and subsequent efflux of intracellular components.
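
The biphasic (rapid inactivation followed by tailing) pattern reported by Ofori et al. (2017) is often described with a two-population model, in which a sensitive majority is inactivated quickly and a small resistant fraction decays more slowly. The sketch below fits such a model with SciPy to synthetic survival data; neither the data nor the fitted parameters come from the study.

```python
# Sketch: fitting a biphasic (two-population) survival curve to
# synthetic data. f = sensitive fraction; k1, k2 = decay rates (/min).
import numpy as np
from scipy.optimize import curve_fit

def biphasic(t, f, k1, k2):
    # Log10 survival of a mixed population with two first-order rates.
    return np.log10(f * np.exp(-k1 * t) + (1 - f) * np.exp(-k2 * t))

t = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])                 # minutes
log_survival = np.array([0.0, -1.2, -2.1, -3.0, -3.4, -3.6])  # synthetic

(f, k1, k2), _ = curve_fit(biphasic, t, log_survival,
                           p0=[0.999, 5.0, 0.1], bounds=(0, [1, 50, 5]))
print(f"sensitive fraction={f:.4f}, k1={k1:.2f}/min, k2={k2:.3f}/min")
```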

Although chlorine dioxide has relatively good disinfection properties and has received considerable attention over the years, its chemical instability and concerns about its efficacy against resistant pathogens such as Cryptosporidium, as well as about its safety, cost and potentially adverse health effects, have led to the consideration of alternative disinfectants and disinfection strategies (Casson et al., 2006). Several studies have highlighted disadvantages of chlorination, such as the formation of disinfection by-products (DBPs), which result from the reaction of disinfectants with natural organic matter in water. It is therefore crucial to strike a balance between the chronic risk posed by lifetime exposure to DBPs and the benefits of disinfection (Li and Mitch, 2018; Costet et al., 2011). Nevertheless, almost all surface water treatment plants in the United States of America (USA) still use chlorine-based disinfectants as part of their treatment process, and almost 98% of Western Europe's water is also chlorinated (Ngwenya et al., 2013). At the same time, alternative disinfection processes are being actively promoted.

Chloramine - Chloramine is used especially when disinfection by-product formation with free chlorine is a concern, as it provides a more effective (longer-lasting) disinfecting residual in the distribution system. In treated wastewater, inorganic chloramines (mono-, di- and trichloramine) are much stronger disinfectants than organic chloramines (Fayyad and al-Sheikh, 2001). Monochloramine is the most widely used, stable and dominant form of inorganic combined chlorine in chloramination operations. Although it is not as effective as free chlorine for the disinfection of planktonic organisms, it provides a relatively stable disinfecting residual that can be maintained over a long period during and after chlorination (Donnermair and Blatchley III, 2003). It has also been documented that chloramine can inactivate coliform bacteria, heterotrophic bacteria and Legionella (Olaolu et al., 2014). The advantages of chloramination are its ability to improve the taste and odour of water and the fact that it remains active for a very long period.

6.2.2 Ozonation

Ozonation was first used for drinking water disinfection in France almost 100 years ago, with very limited application to wastewater. In the United States, the use of ozonation for wastewater has only been encouraged since around the 1980s, resulting in approximately 20 operating wastewater ozone facilities by 1986. Despite this long history of use, its inactivation mechanism against pathogens is still not well understood, though several studies indicate that ozone in aqueous solution reacts with microbes via radical species formed during ozone decomposition or by direct contact with molecular ozone (Martínez et al., 2011). Ozone is considered a powerful disinfectant for critical microorganisms such as norovirus, poliovirus and Escherichia coli, as well as the chlorine-resistant C. parvum (Amoueyan et al., 2017; Carvajal et al., 2017). According to USEPA (1999), ozone is the only chemical oxidant able to effectively inactivate both Giardia (Ct99 of 0.53 mg·min/L at 5°C) and Cryptosporidium (Ct99 of 2.4 mg·min/L at 22°C) at concentrations much lower than those used routinely for water treatment. In temperate climates, a very low dose of this disinfectant (a Ct of 0.3 to 1 mg·min/L) can effectively inactivate up to 99% of cysts and oocysts. Nevertheless, microbes containing carotenoid and flavonoid pigments can resist the bactericidal effects of ozone (USEPA, 1999).
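
Using the USEPA (1999) Ct99 values quoted above, the contact time required for 99% (2 log10) inactivation at a given dissolved ozone residual follows directly from t = Ct99 / C. The residual chosen below is illustrative only.

```python
# Back-of-the-envelope use of the Ct99 values quoted above (USEPA, 1999).
CT99_MG_MIN_PER_L = {"Giardia (5°C)": 0.53, "Cryptosporidium (22°C)": 2.4}

residual = 0.3  # mg/L dissolved ozone residual (illustrative)
for organism, ct99 in CT99_MG_MIN_PER_L.items():
    print(f"{organism}: {ct99 / residual:.1f} min at {residual} mg/L")
# -> Giardia: 1.8 min; Cryptosporidium: 8.0 min
```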

Kinetic analyses show that the inactivation by ozone of pathogenic microbes such as bacteria, amoebic cysts and viruses in wastewater requires only low ozone concentrations and short contact times. Alam (1993) investigated the use of ozone at the Middlesex plant in the United States of America and found that, at concentrations of 18 to 28 mg/L, ozone could reduce fecal coliforms to 200 MPN/100 mL at a contact time of 45 minutes, and that as the dose was increased (25 mg/L) a 3 log10 reduction (a 99.9 percent kill) was achieved. This study also indicated that the effectiveness of ozonation was not affected by variable effluent quality conditions. Nonetheless, investigating the inactivation kinetics of Legionella in wastewater using ozone, Li et al. (2017c) revealed that the relationship between the initial O3 dose and the Legionella inactivation rate was not linear, and thus the Ct value required for a 99.99% reduction was not constant. The authors indicated that contact time was less important than the initial O3 dose, and that the latter cannot be compensated for by increasing the contact time. A higher initial O3 concentration led to a higher inflection-point value in the ln(N/N0) versus C0t curve. According to Tyrrel et al. (1995), ozone contact time and residual concentration, alone or in concert, could not enhance the inactivation rates of vegetative bacterial populations. Researchers have suggested that ozone is able to diffuse through the cell membrane to react with biomolecules and damage chromosomal DNA, which might be one of the reasons for the inactivation of E. coli. Other authors have stated that with prolonged ozonation, lysis of the E. coli cell, rather than direct reaction with the cell membrane, predominates; this led to the conclusion that ozone disinfection is a direct result of cell wall disintegration and cell lysis (White, 1999). Compared with chlorine, ozone exhibited the highest bactericidal effect, reducing the total activated sludge microbiome through oxidation and cell lysis (Caravelli et al., 2006). In addition, a total ozone dose of 66.0 mg O3/g VSS (an ozone dose rate of 3.3 mg O3/g VSS per min for a contact time of 20 min) was found to be the most suitable condition for controlling filamentous bulking. Czekalski et al. (2016) revealed that, owing to its strong oxidant and disinfectant effects, ozone can be seen as an ideal way to face emerging challenges in water treatment, namely multiresistant bacteria (MRB) and even intracellular antibiotic resistance genes (ARGs). In their experiments, E. coli inactivation and ARG disruption were achieved at specific ozone doses feasible for full-scale application. However, the presence of flocs appeared to interfere with the efficacy of ozonation, leading to microbial regrowth, although it did not affect the presence of ARGs. To disrupt ARGs and inactivate MRB during ozonation, implementation of a step to remove flocs > 10 μm from secondary clarifier effluent prior to ozonation is important. Table 12 illustrates the disinfection efficiency of ozone.

Table 12. Efficiency of ozone in removing protozoan parasites and indicator bacteria from wastewater

Microbes | Log10 Removal | Dose (mg·min/L) | Reference
Giardia | 2 | 0.53 | USEPA, 1999
Cryptosporidium | 2 | 2.4 | USEPA, 1999
Fecal coliform | 3 | 25 | Alam, 1993
E. coli | 4 | 0.2 | Blatchley III et al., 2012

6.2.3 UV radiation

UV radiation is the most commonly used physical disinfection method for wastewater. Though this technology has been available for most of the 20th century, its acceptance for wastewater treatment did not become widespread until the mid-1980s. UV radiation has been reported to efficiently inactivate pathogens (bacteria, protozoa and some viruses) by directly damaging microbial nucleic acids (Gross et al., 2015). This disinfection method is an attractive and eco-friendly technology, as it does not generate by-products during the wastewater treatment process (Winward et al., 2008). Despite its advantages, UV inactivation also shows limitations, such as low or limited effectiveness when treating grey water with larger particle sizes and high concentrations of dispersed microorganisms (Taghipour, 2004). Of the three UV bands, the UV-C band (200 to 280 nm) has been reported to be the germicidal one, because nucleic acids absorb light at 260 to 280 nm (Shoults and Ashbolt, 2018). Chahal et al. (2016) reported the efficient inactivation by UV disinfection of pathogenic protozoa such as C. parvum (3 log10 reduction) and G. duodenalis (4 log10 reduction) at UV doses of 25 mJ/cm2 and 40 mJ/cm2, respectively. It has also been shown that at low-pressure UV doses of 30 to 40 mJ/cm2, up to 4 log10 of pathogenic viruses can be inactivated, with the exception of adenoviruses, which require a UV dose of around 200 mJ/cm2 (Eischeid et al., 2009). Table 13 illustrates the efficiency of UV irradiation in removing pathogenic organisms and indicator organisms.

Table 13. Efficiency of UV irradiation in removing protozoan parasites and indicator bacteria from wastewater

Microbes | Log10 Removal | Dose (mJ/cm2) | Reference
Cryptosporidium parvum | 3 | 25 | Eischeid et al., 2009
Giardia duodenalis | 4 | 40 | Eischeid et al., 2009
Pathogenic viruses | 4 | 200 | Eischeid et al., 2009
Escherichia coli | 4 to 5 | 4.1 | Shoults and Ashbolt, 2018
Enterococcus faecalis | 4 to 5 | 0.8 to 2.8 | Shoults and Ashbolt, 2018
Staphylococcus | 4 to 5 | 1.4 | Shoults and Ashbolt, 2018
Enterococcus casseliflavus | 4 to 5 | 0.8 to 3.6 | Shoults and Ashbolt, 2018
MS2 coliphage | 4 to 5 | 0.8 to 1.2 | Shoults and Ashbolt, 2018
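
The doses in Table 13 are UV fluences: the product of irradiance and exposure time. Assuming approximately log-linear (first-order) inactivation kinetics, the fluence needed for a target log10 reduction scales linearly with an organism-specific dose-per-log. The sketch below illustrates both relationships; all numerical values are hypothetical.

```python
# Minimal sketch of the UV fluence (dose) relationships behind Table 13.
def uv_dose(irradiance_mw_cm2, time_s):
    return irradiance_mw_cm2 * time_s  # mW·s/cm2 is identical to mJ/cm2

def dose_for_target(log10_target, dose_per_log_mj_cm2):
    # dose_per_log is organism-specific; the value used below is hypothetical
    return log10_target * dose_per_log_mj_cm2

print(f"{uv_dose(0.1, 41):.1f} mJ/cm2")          # 0.1 mW/cm2 for 41 s -> 4.1
print(f"{dose_for_target(5, 0.82):.1f} mJ/cm2")  # 5 log10 at 0.82 per log -> 4.1
```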


Shoults and Ashbolt (2018) also investigated the efficiency of UV radiation against various microbes, including E. coli, Enterococcus faecalis, E. faecium, E. casseliflavus, Staphylococcus aureus, S. epidermidis and Staphylococcus spp. They found that the UV doses required to achieve 4 to 5 log10 reductions were 1.4 mJ/cm2 for Staphylococcus spp., 4.1 mJ/cm2 for E. coli, 0.8 to 2.8 mJ/cm2 for E. faecalis, 0.8 to 3.6 mJ/cm2 for E. casseliflavus and 0.8 to 1.2 mJ/cm2 for MS2 coliphage. To compare the photo-reactivation and dark repair of microbes after UV radiation, Li et al. (2017c) used Escherichia coli as the target microbe and compared UV-LEDs with low-pressure (LP) UV disinfection, using four UV-LED configurations: 265 nm, 280 nm, and the combinations 265 + 280 nm (50%) and 265 + 280 nm (75%). It was noted that 280 nm LEDs and LP UV lamps appeared to be less effective than 265 nm LEDs in the inactivation of E. coli. The reactivation (photo-reactivation and dark repair) of E. coli after exposure to 280 nm LEDs was significantly repressed at a low UV dose of 6.9 mJ/cm2, possibly because of impaired protein activities. Blatchley III et al. (2007) reported that the inactivation of indicator bacteria using UV light does not guarantee the total disinfection of all waterborne microorganisms, as some can recover after treatment. Recently, King et al. (2017) investigated the removal and inactivation of oocysts at different stages of five wastewater treatment plants. Under conditions of two distinct high-oocyst-challenge events, infective oocysts were detected in the effluent (at densities greater than 3.0 log10) even after treatment with UV radiation, for which an inactivation value of 2.24 log10 was noted. As a result, the use of a multi-disinfection approach is suggested. Considering the emerging issue of antibiotic resistance genes (ARGs) in the environment, it is also suggested that increased attention be paid to gradually mitigating the concerns raised by their presence. Investigating the effect of three disinfection processes (ultraviolet, chlorination and ozone) on ARGs, Zheng et al. (2017) indicated that ARG concentrations decreased exponentially as the UV dose increased; UV disinfection resulted in apoptosis, leading to bacterial DNA being released into the environment (Zheng et al., 2017). Table 14 summarises the advantages and disadvantages of common disinfection technologies used in the treatment of water and wastewater.

Table 14. Conventional disinfection technologies against different microbial groups (Source: Collivignarelli et al., 2018)

Technology | Advantages | Disadvantages | Application
Chlorine | Easy to handle and economical; provides a residual; consolidated technology | High contact time; by-product formation; residual toxicity of effluent; very corrosive | Drinking water; wastewater
Chlorine dioxide | More effective than chlorine over short contact times; long-lasting residual | Residual toxicity of effluent; by-product formation; onsite generation; medium-high management costs; increases the concentration of solids in the effluent | Drinking water; wastewater
Ozone | Short contact time | No disinfectant residual; by-product formation; onsite generation; high energy demand; high management cost | Wastewater
UV radiation | No by-product formation; short contact time; inactivation of viruses | No residual; high energy demand; unsuitable for wastewater with high levels of suspended solids, turbidity, colour or soluble organic matter | Wastewater
