August 10, 2017
The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. The ideas and opinions expressed in this publication are those of the authors; they are not necessarily those of UNESCO and do not commit the Organization.
Yates, M. (2017). Persistence of Pathogenic Microorganisms in Fecal Wastes and Wastewater Matrices: An Introduction and Overview of Data Considerations. In: J.B. Rose and B. Jiménez-Cisneros (eds), Water and Sanitation for the 21st Century: Health and Microbiological Aspects of Excreta and Wastewater Management (Global Water Pathogen Project). (M. Yates (eds), Part 4: Management of Risk from Excreta and Wastewater - Section: Persistence), Michigan State University, E. Lansing, MI, UNESCO.
Determining the length of time that pathogens can persist in wastewater or in wastewater-impacted environments (e.g., potable water sources, bathing water, irrigation water, food crops), as well as in fecal wastes, biosolids, and soil, is essential when assessing the potential public health impact associated with the downstream use of that water or food crop. It is also a critical factor when examining fecal waste treatment, including septic tanks, lagoons, wetlands and other land-based methods. Persistence is a major component of transport models and affects the risk of waterborne disease associated with surface and ground waters. To minimize the risk of illness from exposure to untreated or inadequately treated wastes, it is essential to know how long pathogens can persist in solid fecal wastes as well as wastewater. It is also essential to understand how the data were obtained, so that they can be interpreted correctly.
Pathogen persistence varies substantially based on environmental factors and on the specific pathogen of interest. The factors that affect pathogen persistence, particularly of viruses, have been the subject of several reviews with a focus on ground water (Yates and Yates, 1988; John and Rose, 2005). In a review of viral survival data in fresh surface waters covering temperatures from 4 to 37ºC, temperature was identified as one of the key factors influencing pathogen inactivation rates, explaining a wide range (3 to 99%) of the variation observed (Kutz and Gerba, 1988).
Most studies of microbial persistence have been conducted using indicator organisms, rather than pathogens. Those that have used pathogens were typically conducted in water (marine, fresh surface, or ground water), rather than in wastewater. Nevertheless, useful information regarding the effects of different environmental factors can likely be obtained from these studies. Studies in which pathogen behavior was compared to indicator behavior have typically found that the indicators do not provide accurate information on pathogen persistence (Locas et al., 2010).
Many pathogens in waste are adapted to the human or animal gut, and thus persist for only a limited period of time in the environment. For example, enteric bacteria require organic carbon and other nutrients, as well as suitable temperature and pH conditions; these conditions rarely exist in the environment. Additionally, the native bacteria are adapted to the environment and thus better able to compete for the scarce nutrients available. Viruses are not metabolically active outside of a susceptible host, so their persistence is typically determined by the length of time their external layer (e.g., protein capsid) remains intact. Protists and helminths typically have a non-vegetative life stage, such as a cyst or ovum, that is relatively resistant to environmental stressors. Regardless of the pathogen, the concentration decreases over time; it is the rate of decrease that varies.
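This idea can be summarized with the first-order (Chick's law) decay model that persistence studies most often assume: the rate constant alone determines how quickly an organism disappears. The sketch below uses purely illustrative decay rates for two hypothetical organisms; the values are not drawn from any study.

```python
import math

def remaining_fraction(k_per_day: float, t_days: float) -> float:
    """First-order (Chick's law) survival fraction: N(t)/N0 = exp(-k*t)."""
    return math.exp(-k_per_day * t_days)

# Hypothetical decay rates: a fragile enteric bacterium vs. a hardy helminth ovum
fast_k, slow_k = 0.5, 0.01  # per day (illustrative values only)
print(remaining_fraction(fast_k, 10))  # ~0.0067, largely inactivated after 10 days
print(remaining_fraction(slow_k, 10))  # ~0.905, mostly still present
```

Both curves decline monotonically; only the rate constant differs, which is why reported persistence times span days to years depending on the organism and matrix.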
Studies have examined many factors that affect pathogen persistence, including temperature, organic material, biotic activity, dissolved oxygen, pH, the concentrations of different salts and minerals, and the presence and wavelength of light. With the exception of temperature, these factors have been found to have different effects depending on the pathogen studied. A general description of how these factors affect pathogens is provided in Table 1.
Over the last few decades, numerous studies have been conducted to assess the length of time that pathogens can persist in wastewater or wastewater-impacted environments. However, in order to use the resulting data appropriately, it is important to understand the assumptions inherent in the methods used to conduct the experiments, including the sample design, collection, processing, and pathogen detection methods. This is especially critical because there are numerous options available, and often no standard method has been developed for the situation of interest. There are a few notable exceptions, such as the EPA method for monitoring the occurrence of enterovirus and norovirus in water (Fout et al., 2014) and Standard Methods for the Examination of Water and Wastewater (APHA et al., 2012). However, these methods often do not reflect the universe of options that may be needed for the specific situation to be addressed.
Most pathogen persistence experiments are conducted in the laboratory, using conditions that may or may not mimic the environmental conditions of interest. Other experiments, relatively fewer in number, are conducted in the field. Each of these types of experiments has benefits and drawbacks.
In general, pathogen detection methods can be categorized into two groups: those that use culture techniques and those that use some other method. Over the last several years, the most common non-culture techniques have been those that employ some type of molecular detection method, such as the polymerase chain reaction (PCR). Each of these methods has advantages and shortcomings that must be recognized to enable the appropriate use and interpretation of the data derived from the studies.
This section of the chapter will discuss various aspects of experiments in which persistence of pathogens is studied, and the considerations that should be given when using data from these experiments.
Relatively few studies of pathogen persistence have been conducted in the field, due to the many issues associated with such studies. One issue is that it can be difficult to completely characterize a site so that the study can be designed to collect a sufficient number of samples to represent the entire site. This can be especially difficult for subsurface studies: soils and aquifers are rarely homogeneous, so determining the paths through which water and microorganisms flow, so that samples can be collected appropriately, can be a challenge.
Another issue is that the number of pathogens present at a given site is typically too low to allow a meaningful study of persistence. An exception is a site receiving raw sewage or feces, such as a primary treatment lagoon receiving domestic wastewater (e.g., Locas et al., 2010), in which the concentrations of pathogenic microorganisms may be high enough to study persistence over several days or weeks. Otherwise, pathogens typically must be added to the environment at a concentration high enough to allow detection over a meaningful period of time. In most areas, addition of pathogens to an environmental water would not be allowed, due to concerns over exposure and consequent potential adverse health effects. Therefore, the studies must either be conducted in a contained environment, or indicator organisms must be used instead.
One of the earliest devices constructed to allow in situ studies of pathogen persistence was developed by McFeters and Stuart (1972). This device was constructed using a dialysis chamber with side walls made of membrane filters (Figure 1). The pathogens of interest are placed inside the chamber, and the nutrients and other constituents in the water are able to flow through the chamber, allowing the organisms to be exposed to actual environmental conditions. Because the membrane pores are smaller than the pathogens, the pathogens cannot contaminate the surrounding environment.
Figure 1. Chamber used for field studies of microbial persistence in water (from: Francy et al., 1972)
McFeters and Stuart (1972) used this device to study the persistence of fecal coliform bacteria in mountain streams. This device, and similar ones, have since been used by numerous other investigators (e.g., Altherr and Kasweck, 1982; Anderson et al., 1983; Bollmann et al., 2007; Davies et al., 1995; Moriñigo et al., 1990) to study microbial persistence in various types of environmental waters.
The other approach to studying pathogen persistence in field studies is to use fecal indicator organisms, such as E. coli or enterococci, rather than the pathogens themselves. An advantage of this is that the indicator organisms are typically present in much higher concentrations than the pathogens, so their persistence over time can be measured easily. In addition, the detection methods used for indicator organisms are typically simpler, more rapid, and less expensive, and do not require highly trained personnel (Santiago-Rodriguez et al., 2016). However, it is well documented that there is no consistent correlation between persistence of bacterial indicators and persistence of many pathogenic microorganisms, particularly viruses and parasites (see, e.g., McClellan and Eren, 2014).
Many researchers choose to conduct persistence studies in the laboratory to avoid many of the challenges associated with field studies. Typically, these studies are conducted in small test tubes or other small containers of water (either natural or artificial) to which the microorganisms of interest are added. The containers are then incubated, and samples collected over time and analyzed to determine the number of organisms remaining. Some benefits of laboratory persistence studies include:
However, laboratory studies also have drawbacks, as it is much more difficult to replicate the exact environmental conditions in the laboratory. Thus, the combined effects of several variables are more difficult to establish.
The methods chosen for sample collection will depend on the specific organisms of interest, as well as on the type of water being studied. Many pathogenic microorganisms are present in very low numbers in environmental waters, even wastewaters, so large samples may be required. On the other hand, some pathogens, such as some enteric bacteria, may be present in sufficiently high numbers, especially in raw wastewaters, that relatively small volumes of water (<500 ml) may be sufficient to detect them.
Direct sampling can be used for laboratory studies in which large numbers of pathogens have been added to water so that their persistence can be studied. In addition, some pathogenic bacteria, such as pathogenic E. coli or Salmonella, may be present in wastewaters at sufficiently high concentrations that direct sampling and analysis of the water can be done. In some field situations, it may be desirable to take several samples and composite them before analysis, especially if there are concerns over variability in the spatial distribution of the organisms (Hill, 2016).
Many pathogenic microorganisms, especially viruses and parasites, are present in extremely low numbers in environmental waters, even raw domestic wastewater. In these cases, it is typically necessary to collect a very large volume of water (e.g., tens to hundreds of liters), then concentrate the sample to a volume that can be more readily analyzed in the laboratory. This is typically accomplished by filtering the water through some type of filter, then removing the pathogens from the filter using some type of eluting solution. In many cases, that volume of solution is still too large to permit ready analysis, so a secondary concentration step is performed. While these large-volume sampling techniques provide increased detection sensitivity for these pathogens, there are a number of issues that one faces when using these methods. For example, some methods have been optimized to collect a single organism or group of organisms, while others are broader in scope.
There can be considerable variation in the efficiency of recovery of the organisms, depending on the specific methods used. A thorough review of the methods used to concentrate and recover viruses from water was compiled by Ikner et al. (2012), and a more general discussion of these issues is provided by Hill (2016). The efficiency of the methods, variability in recovery efficiencies, and other issues that must be considered when concentrating large volumes of water are also discussed. Depending on the methods used, recovery of the microorganisms can range from greater than 90% to less than 10%. Additionally, the recovery efficiency can vary for different organisms, even when the same methods are used.
It is critical to know the recovery efficiency of the method so that the number of microorganisms in the original water sample can be calculated, and the health impacts accurately assessed. Petterson et al. (2015), using mengovirus as a process control, documented the difficulties associated with accurately calculating virus concentrations in water due to the extreme variability in virus recovery efficiency. They found that the magnitude and variability of virus concentrations, when corrected for the variable recovery efficiency, were several orders of magnitude higher than the uncorrected values. They recommended the development of a sample-specific spiking control that closely mirrors the behavior of human viruses in environmental samples, so that better estimates of recovery efficiency can be made.
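As a worked illustration of why recovery efficiency matters, the sketch below back-calculates the concentration in the source water from what is detected after concentration. The counts, volume, and assumed 10% recovery are hypothetical, chosen only to show the arithmetic.

```python
def corrected_concentration(count_detected: float,
                            volume_sampled_l: float,
                            recovery_efficiency: float) -> float:
    """Estimate the concentration (organisms/L) in the source water,
    correcting the measured value for the method's recovery efficiency."""
    if not 0 < recovery_efficiency <= 1:
        raise ValueError("recovery efficiency must be in (0, 1]")
    measured = count_detected / volume_sampled_l   # apparent concentration
    return measured / recovery_efficiency          # corrected concentration

# 12 viruses detected from a 100 L sample, assuming 10% recovery:
# apparent concentration is 0.12 viruses/L, corrected is 1.2 viruses/L
print(corrected_concentration(12, 100, 0.10))
```

A tenfold correction of this kind changes any downstream risk estimate proportionally, which is why uncharacterized recovery efficiency is such a large source of error.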
There are numerous types of methods that can be used to detect pathogens; typically they can be grouped into culture methods, molecular methods, and other methods. If the persistence data are to be used to assess public health impact, or for a QMRA, it is important to know the number of infective organisms present. Historically, this has meant that culture methods were used to analyze the samples from the persistence studies. However, as new advances are made using other methods, these are being used as well.
Whatever method is used, it is important to take into consideration the different sources of variation associated with the method (MacArthur and Tuckfield, 2016). For example, it is common to analyze only a portion of the sample that is collected. While this may be necessary for practical reasons (e.g., the volume is too large to analyze the entire sample), non-homogeneity among subsamples can introduce significant errors into the estimate of the number of microorganisms present in the original sample.
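A minimal sketch of this subsampling issue, assuming a well-mixed sample and simple Poisson counting statistics (an idealization the paragraph above warns may not hold), shows how wide the uncertainty on the scaled-up estimate can be when only a few organisms are counted:

```python
import math

def estimate_total(count: int, fraction_analyzed: float):
    """Scale a subsample count up to the full sample, with an approximate
    95% counting interval (normal approximation to the Poisson).
    Assumes the subsample is representative of a homogeneous sample."""
    scale = 1.0 / fraction_analyzed
    estimate = count * scale
    half_width = 1.96 * math.sqrt(count) * scale
    return estimate, max(0.0, estimate - half_width), estimate + half_width

# 5 organisms counted after assaying 10% of the sample
est, lo, hi = estimate_total(5, 0.10)
print(est, lo, hi)  # point estimate 50, but the interval spans roughly 6 to 94
```

Even before any non-homogeneity is considered, counting statistics alone make small-count estimates highly uncertain; non-uniform mixing only widens the true interval further.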
Historically, culture methods were the only ones available for detecting microorganisms in water. Culture methods involve the addition of required nutrients and incubation under appropriate environmental conditions over a period of time to allow the microorganism to grow. In some cases, culture conditions can be made selective for very specific strains; in other cases, the media are more general.
A major disadvantage of culture methods is that methods have not been developed to allow the growth of all pathogens of interest. For example, noroviruses, one of the most common water- and food-borne pathogens, cannot easily or reliably be grown in culture (Straub et al., 2013).
Another disadvantage of culture methods is the inability to detect viable but non-culturable (VBNC) cells. These are bacterial cells that are alive, but cannot be grown on their regular culture medium. The presence of VBNC cells in a sample thus leads to an underestimation of the number of infective cells in the sample, and an underestimation of the potential health effects associated with exposure to that water. In a recent review, Li et al. (2014) list 85 different bacteria that have been documented to exist in a VBNC state, more than 50 of which are pathogens. Many of these bacteria are water- and wastewater-borne, including Campylobacter, E. coli, Salmonella, Shigella, and various species of Vibrio.
Other disadvantages of culture methods compared to molecular methods include:
Over the last few decades, methods that detect the nucleic acid of microorganisms have been developed; a history is provided by Metcalf et al. (1995). Initially, the methods involved the direct detection of DNA, but over time, methods that amplify the nucleic acid using the polymerase chain reaction (PCR) were developed. This resulted in a revolution in our ability to detect many microorganisms that were previously non-detectable, and to detect them with a higher degree of sensitivity than had previously been possible.
Some of the advantages of molecular methods compared to culture methods include:
Molecular methods also suffer from some disadvantages relative to culture methods. Some of these include:
While culture and molecular methods remain the most commonly used for persistence studies, other methods are being developed that may become more frequently used as the technology matures. For example, there has been an increase in the development of biosensors for the detection of pathogens in environmental waters. In a review of more than 2,500 articles on pathogen detection from 1987 to 2007, Laczka et al. (2007) found that biosensors were the fourth most commonly used method, and the most rapidly increasing technology for pathogen detection.
Biosensors are analytical devices composed of two elements: a biological element that is used to recognize a microorganism, and a transducer, which translates the biological response that occurs upon recognition into an electrical or optical signal that can be detected and measured. There are three main classes of biological recognition elements used in biosensor applications, including enzymes, antibodies, and nucleic acids (Laczka et al., 2007).
There are two main classes of sensors: optical and electrochemical. In general, optical sensors have been found to be more sensitive, but they are more expensive and complex. Electrochemical sensors, on the other hand, are easier to use but tend to provide inconsistent results (Laczka et al., 2007).
While significant advancements have been made in these sensors over the last few years, a great deal of improvement is still needed before they can be routinely used for detecting pathogens in water. For example, Shirale et al. (2010) developed a nanowire-based immunosensor to detect viruses in water; however, the sensor did not distinguish between infective and non-infective viruses. In addition, improvements in detection limits are needed so that they are comparable to traditional methods. Finally, the cost of many biosensors is still too high to permit routine use.
The difference in detection of Legionella using culture and qPCR was reviewed by Whiley and Taylor (2016). They used studies conducted between 2003 and 2013 in which both qPCR and culture were used to quantify Legionella from environmental sources. They found that 26 of the 28 studies detected Legionella at higher levels using qPCR than culture; in one study both methods provided similar results, and in a single study, culture detected more Legionella than qPCR. When data from all 28 studies were aggregated, they found that 72% of the samples tested positive for Legionella when analyzed using qPCR, while only 34% tested positive using culture. While this is not a study on persistence, it dramatically illustrates the differences in results obtained using these two different methods, hence the need to carefully assess the source of the data when using it to make public health impact assessments.
A long-term study of virus persistence in water was conducted by de Roda Husman et al. (2009). In this study, three enteroviruses, poliovirus 1, poliovirus 2, and coxsackievirus B4, were added to artificial ground water or artificial surface water and stored at 4 or 22ºC for up to 606 days. Samples were taken at pre-determined time points and analyzed for viruses using either cell culture or RT-PCR. Virus decay rates, as well as the times for the concentration to be reduced by 50%, were calculated for each virus under each condition; the times for a 50% reduction in virus concentration are shown in Table 1. Clearly, the results obtained using culture methods are very different from those obtained using RT-PCR. In some cases, the rate of decay of the virus RNA was so slow that no reduction was detected, even though the number of infective viruses decreased significantly. This has important implications if the data are being used to assess public health impacts, for example in a quantitative microbial risk assessment (QMRA).
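Under the first-order decay model commonly assumed in such studies, a 50%-reduction time relates to the decay rate as t50 = ln(2)/k. The sketch below, with an illustrative rate constant (not a value from de Roda Husman et al.), shows the conversion in both directions.

```python
import math

def half_reduction_time(k_per_day: float) -> float:
    """Time for a 50% reduction under first-order decay: t50 = ln(2)/k."""
    return math.log(2) / k_per_day

def decay_rate_from_t50(t50_days: float) -> float:
    """Inverse relation, recovering k from a reported 50%-reduction time."""
    return math.log(2) / t50_days

# Illustrative: a virus losing infectivity at k = 0.01 per day
print(round(half_reduction_time(0.01), 1))  # ~69.3 days to lose half the titer
```

This conversion is what allows decay rates reported in one study to be compared with reduction times reported in another, provided both assume first-order kinetics.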
These investigators also examined the ratio of PCR-detectable units to infectious virus particles. At the beginning of the experiment, all of the viruses had ratios of approximately 100 PCR-detectable units per infectious virus particle. By the end of the experiment, the ratios varied considerably. In artificial ground water the ratios were 117.9, 164.0, and 99.5 for poliovirus 1, poliovirus 2, and coxsackievirus B4, respectively. No consistency for a given virus was seen, as the ratios in artificial surface water were 39.7, 62.8, and 80.6 for poliovirus 1, poliovirus 2, and coxsackievirus B4, respectively. These results illustrate the difficulties associated with relying on molecular data for public health impact assessments.
Heather Murphy presents an overview of the literature on the persistence/survival of pathogens and indicator organisms in sewage, surface water, groundwater and marine waters. The chapter is based on a scoping review of the literature and includes a summary of the survival of bacteria, viruses, protozoa and indicator organisms under various temperature and light conditions in each of the four water matrices. The data presented herein can be used to understand the survival dynamics of these organisms in aquatic environments and can subsequently be used to inform risk assessment models.
Organism survival/die-off data are presented and reported as T90, T99, T99.9 or T99.99 values. These values represent the time in days that it takes for a 1 log10 (T90), 2 log10 (T99), 3 log10 (T99.9) or 4 log10 (T99.99) reduction of the microorganism to be observed.
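Under first-order decay these Tx values are simple multiples of one another (T99 = 2 × T90, and so on). A short sketch, using a hypothetical decay rate, shows the conversion:

```python
import math

def t_log_reduction(k_per_day: float, n_log10: float) -> float:
    """Days for an n-log10 reduction under first-order decay:
    T = n * ln(10) / k, so T99 = 2*T90, T99.9 = 3*T90, etc."""
    return n_log10 * math.log(10) / k_per_day

k = 0.23  # hypothetical decay rate, per day (illustrative only)
t90 = t_log_reduction(k, 1)
t99 = t_log_reduction(k, 2)
print(round(t90, 1), round(t99, 1))  # ~10.0 and ~20.0 days
```

Note that these neat multiples hold only for log-linear die-off; for the non-linear curves discussed later in the chapter, each Tx must be read from the fitted curve itself.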
For example, in sewage, bacterial pathogens such as Salmonella typhimurium, Enterobacter spp. and Streptococcus faecalis can survive for over 100 days before a 1 log10 reduction is observed. Adenoviruses in secondary and primary effluent have been found to have T99 values of up to 58 and 48 days, respectively, under dark conditions at cold temperature (4ºC). These survival times decrease as temperature increases and when the organisms are exposed to a light source.
Temperature, sunlight, DO, DOC, availability of nutrients, and salinity were found to be important environmental conditions to consider when evaluating the persistence of microorganisms in environmental waters. In general, very few data are available on the persistence of pathogens in aquatic environments. Significant gaps remain, particularly on the persistence of protozoa and of pathogens found in developing regions of the world.
The persistence of pathogens in fecal wastes and other solids, including biosolids, is important because these are often reused in agriculture or disposed of to soil. Exploring these pathogens' persistence is valuable for understanding the length of time needed to render the risk of human infection low enough to be acceptable. Parameters such as temperature and soil moisture strongly influence how long pathogens are able to persist. The regulations and treatment methods for biosolids of the United States Environmental Protection Agency (US EPA) and the WHO guidelines focus on surrogate and pathogenic organisms including Ascaris, Salmonella and enteric viruses.
Persistence modeling facilitates the accurate simulation of different stages of growth, survival, and death of microorganisms in environmental matrices by describing the changes in population size of microorganisms over time. The most commonly used model for simulating persistence patterns of pathogens is the first-order exponential one-parameter model. However, as predictive microbial modeling has developed as a field over many years, persistence curves have been observed for microorganisms in a number of environments that do not follow this classic log-linear pattern. Therefore, both linear and non-linear curves (models) must be evaluated in order to provide an accurate description of both pathogen- and matrix-specific persistence (or inactivation).
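To illustrate why the classic log-linear model can fail, the sketch below fits it by ordinary least squares to hypothetical biphasic (tailing) data, in which a resistant subpopulation persists after rapid initial die-off. The data values are invented for illustration only; the large residuals reveal the model mismatch the text describes.

```python
def fit_loglinear(times, log10_n):
    """Ordinary least-squares fit of log10 N(t) = a + b*t, i.e. the
    classic one-parameter exponential decay model in log space."""
    n = len(times)
    mt = sum(times) / n
    my = sum(log10_n) / n
    b = sum((t - mt) * (y - my) for t, y in zip(times, log10_n)) / \
        sum((t - mt) ** 2 for t in times)
    a = my - b * mt
    return a, b

# Hypothetical biphasic data: fast initial die-off, then a persistent tail
times = [0, 2, 4, 8, 16, 32]          # days
log10_n = [6.0, 4.2, 3.1, 2.4, 2.1, 2.0]  # log10 concentration
a, b = fit_loglinear(times, log10_n)
residuals = [y - (a + b * t) for t, y in zip(times, log10_n)]
print(b)                                # fitted slope (log10 per day, negative)
print(max(abs(r) for r in residuals))   # worst residual exceeds 1 log10
```

A straight line through these data misses the initial concentration by well over a log unit, so any Tx value extrapolated from the fitted slope would be badly biased; this is the motivation for evaluating non-linear models.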
Seventeen linear and nonlinear persistence models were used to find the best models for describing the persistence of water microbes (bacteria, viruses, bacteriophages, Bacteroidales, and protozoa) in human urine, wastewater, freshwater, marine water, groundwater, biosolids and manure matrices. A total of 30 datasets, containing 180 different pathogen (or indicator)/matrix combinations, were used in this study to find the best-fitting models, through regression techniques, for describing persistence under various conditions. Like the exponential decay model, these models contain general parameter(s) that mathematically describe the relationship between reductions in microbial populations and time. The models do not contain explanatory variables to isolate the effects of environmental conditions (e.g., temperature, UV exposure) on inactivation. Overall, three models (JM2, JM1, and Gamma) were found to be the best-fitting models across the entire data set, representing 59%, 34% and 25% of the datasets, respectively. JM2 fit the persistence data best across environmental matrices, except in human urine and groundwater, in which JM1 performed best at describing the persistence patterns. Across pathogens, JM2 was the best model for bacteria, bacteriophages, and Bacteroidales; however, viruses were best fit by JM1. The models that best describe the persistence pattern of each pathogen or indicator in a matrix under different treatments, and their corresponding parameters, are presented in this chapter. In addition, T90 and T99 values, which are commonly used to specify the time required for a pathogen concentration to decrease by one and two log units, respectively, are reported for all the datasets and compared between matrices and microorganism types.
While these metrics are often used to describe pathogen persistence, they are only relevant in the linear region of the persistence curve and will be misleading if an incorrect model is assumed, or if a model other than the best-fitting model is used to estimate them. Therefore, the results in this chapter, which contain the best-fitting models and parameters along with the associated calculations of T90 and T99 for various pathogen/matrix combinations, can reduce uncertainty in estimates of pathogen population size in water environments over time.
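The point can be illustrated with a hypothetical two-population (biphasic) survival curve: solving numerically for T90 and T99 shows that, away from the linear region, T99 is no longer simply twice T90. All parameters below are illustrative, not fitted values from the chapter.

```python
import math

def biphasic_fraction(t, f=0.99, k1=1.0, k2=0.05):
    """Two-population survival model: a sensitive majority (fraction f)
    decaying at rate k1 and a resistant minority decaying at k2 (per day).
    Parameter values are illustrative only."""
    return f * math.exp(-k1 * t) + (1 - f) * math.exp(-k2 * t)

def t_reduction(target_fraction, model, t_hi=1000.0):
    """Bisection search for the time at which survival falls to
    target_fraction, valid for any monotonically decreasing model."""
    lo, hi = 0.0, t_hi
    for _ in range(200):
        mid = (lo + hi) / 2
        if model(mid) > target_fraction:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

t90 = t_reduction(0.10, biphasic_fraction)  # 1-log10 reduction
t99 = t_reduction(0.01, biphasic_fraction)  # 2-log10 reduction
print(t90, t99)  # T99 exceeds 2x T90 because of the resistant tail
```

Reading Tx values off the fitted curve in this way, rather than doubling or tripling T90, is what keeps the reported times consistent with a non-linear best-fitting model.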