A QMRA Framework for Sanitation Treatment Decisions

Chapter info


This publication is available in Open Access under the Attribution-ShareAlike 3.0 IGO (CC-BY-SA 3.0 IGO) license (http://creativecommons.org/licenses/by-sa/3.0/igo). By using the content of this publication, the users accept to be bound by the terms of use of the UNESCO Open Access Repository (http://www.unesco.org/openaccess/terms-use-ccbysa-en).


The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. The ideas and opinions expressed in this publication are those of the authors; they are not necessarily those of UNESCO and do not commit the Organization.


Sano, D., Haas, C.N. and Rose, J.B. 2019. A QMRA Framework for Sanitation Treatment Decisions. In: J.B. Rose and B. Jiménez-Cisneros (eds), Global Water Pathogen Project (http://www.waterpathogens.org), Part 1: The Health Hazards of Excreta: Theory and Control. http://www.waterpathogens.org/book/a-QMRA-framework-for-sanitation-treatment-decisions. Michigan State University, E. Lansing, MI, UNESCO.


Acknowledgements: K.R.L. Young, Project Design editor; Website Design: Agroknow (http://www.agroknow.com)

Daisuke Sano (Tohoku University), Charles Haas (Drexel University), Joan Rose (Michigan State University)


Quantitative microbial risk assessment (QMRA) represents the bridge between the level of pathogens in untreated wastes and the determination of the treatment required to reduce risk and assure acceptable levels of public health safety for particular end-point uses. This process can and should be used to address the log reductions of pathogens needed from wastewater treatment prior to discharge in order to protect public health for downstream users. The framework and models needed are available, and the approach is compatible with the WHO Sanitation Safety Planning (SSP) approach, in which it is briefly mentioned that a formal QMRA could be undertaken. The acceptable risk level needs to be established and can be found in some international (WHO) and domestic guidelines. Several steps are needed to calculate the difference between the acceptable pathogen concentrations in treated wastewater and their concentrations in the untreated wastewater using a reverse QMRA approach, which will guide the levels of pathogen reduction required during wastewater treatment (including disinfection and die-off). The log reductions that have been estimated via various approaches range from 2 to 7 log10, depending on the use of the water impacted by the treated sewage. An example for wastewater used for irrigation (wastewater reclamation) is presented.

1.0 Introduction: Sanitation and Water Quality Grand Challenges

It is often said that "water is life," but it must also be said that "water quality is health." Of all the global struggles to protect and restore water quality, addressing fecal waste, wastewater and sanitation in order to control waterborne pathogens, chemicals of emerging concern and the overloading of nutrients to water systems may be the most significant challenge we face in the Anthropocene.

The Blue Planet depends upon water, one of the most critical of all the world's life support systems. In the 2030 Agenda for Sustainable Development, adopted by U.N. members in 2015, the sustainable development goal (SDG) on water, SDG 6, is integral to all other SDGs (United Nations, 2015). Water quantity and quality (access and management) are interlinked with the global biohealth servicing a sustainable plant, animal and human network. The understanding of water quality at larger scales, and of ground and surface water interactions impacted by land and climate, is essential to our future investments in protection and restoration. During the last 60 years, there has been an acceleration of population growth (in both people and animals), land-use changes, use of fertilizers, and demand for water. This has led us into the Anthropocene, where continued water quality degradation, demonstrated by widespread recalcitrant chemical contamination, increased eutrophication, hazardous algal blooms and fecal contamination associated with microbial hazards and antibiotic resistance, is a global phenomenon. These environmental impacts are exacerbated by climate change and extreme weather or precipitation events, which directly affect health through disease, malnutrition and loss of economic development opportunities. There is a need to improve investment in innovative wastewater treatment and infrastructure, resource recovery and environmental protection policies in order to improve and secure water quality and promote reuse. It will be more important than ever to formally implement risk-based and evidence-based approaches in order to effectively and efficiently mitigate the impacts of fecal wastes and sewage on water systems and health.
While public health is the goal, investment in sanitation and wastewater infrastructure will also have dramatic economic benefits, and in the face of these global changes it will be critical if the BioHealth of the planet is to be improved and protected in the future.

Along with risk assessment and risk management, there is a need to develop the human capacity to implement any stated water quality goals. Startling data present the dramatic and grave situation in regard to the human resources needed to fill technically trained personnel positions to meet the intentions for water and sanitation service delivery (the 'Mind the Gap' study (2009) and "Human resource capacity gaps in water and sanitation: Main findings and the way forward" (International Water Association, 2014)). Academic institutions will need to provide these educational opportunities, graduating hundreds to thousands of water scientists, engineers and technicians to remedy the severe state of affairs. These trained individuals will also have to be trained as trainers of others with varying educational backgrounds.

Thus the challenges to improve and provide water quality can be summarized as:

  • How can we better define the opportunities for improving global water quality and health as we address fecal wastes, wastewater collection, treatment, and reclamation through an integrated and adaptive risk analysis framework?
  • How do we fund water quality monitoring programs to acquire the data needed and best return on the investment to meet sanitation and public health goals?
  • How do we build the human resources needed to address these challenges?

A quantitative and formal risk assessment for sanitation is needed. The integration of sanitation safety plans with knowledge on pathogens must be done with quantitative approaches. Finally, scientists, engineers and water resource planners must be trained to use quantitative microbial risk assessment and to develop the data needed for these assessments to aid in decision making.

This book chapter is directed toward government officials who are responsible for decision making with regard to the implementation of wastewater treatment, reclamation and reuse, and toward wastewater engineers who are going to manage the human health risks in wastewater reuse.

2.0 The Risk Framework

Risk by definition at its simplest form includes the likelihood of harm associated with exposure to a particular hazard. People undertake risk assessments constantly in their daily lives by answering several questions. What is the hazard? How will it affect me? What can I do to avoid the hazard or minimize the risk of harm? Take crossing the street as a pedestrian where cars are the hazard and getting hit has serious consequences of injury or death. We examine the information we have by knowledge of the particular street crossing, e.g., how many cars, how busy the road is. We know we can avoid the cars more readily by using a cross-walk with a stop sign or traffic light. The stop sign or traffic light is a community approach to reducing the risk, and many risks are controlled or reduced in this manner.

Formal risk assessment as a process was designed to bring data and facts together to evaluate the hazards and to provide this information to stakeholders and decision makers for policy purposes. It is often used to examine quantitative probabilities of risk. What is the chance of the hazard causing harm in a certain time frame? Quantitative risk assessment developed for chemicals in particular was an approach for producing guidelines and standards to control and minimize the impact on the environment and to protect public health (U.S. Environmental Protection Agency, 2002).

Risk assessment has been applied as a 4-step process that, for microbial hazards found in feces and sewage, can elicit a probability of infection and focus attention on disease impacts.

1. Hazard Identification
This step identifies the pathogens of concern including key representatives of the bacterial, helminth, protozoan and viral groups.

2. Dose-response
Dose-response assessment uses laboratory feeding studies or outbreak data to determine a mathematical relationship (model selection and parameter estimation) between pathogen doses and the probability of infection, disease or death in the exposed population.

3. Exposure assessment
Exposure assessment compiles data on pathogen concentrations at their sources and on fate and transport from source to point of exposure, and estimates doses from the final concentrations and the amounts consumed by key populations in single or multiple exposures.

4. Risk Characterization
Characterization brings the exposure and dose-response together and elicits a probability of infection, disease and/or death for the pathogen of concern, addressing assumptions, variability/uncertainty.
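As a small illustration of how steps 2 and 4 combine, the approximate beta-Poisson model shown below is a standard dose-response form in QMRA; the rotavirus parameters (alpha ≈ 0.253, N50 ≈ 6.17) are commonly cited literature values used here only as an example, not values prescribed by this chapter.

```python
def beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson dose-response model: probability of
    infection for a given mean ingested dose."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# Illustrative rotavirus parameters, commonly cited in the QMRA literature
ALPHA, N50 = 0.253, 6.17

# Risk characterization: combine an exposure dose (here, 1 virus ingested)
# with the dose-response model
p_infection = beta_poisson(dose=1.0, alpha=ALPHA, n50=N50)
```

With these parameters, a dose equal to N50 yields an infection probability of exactly 0.5 (by definition of N50), and even a single ingested virus gives a probability above 0.2, illustrating the high potency attributed to rotavirus later in this chapter.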

The 4-step process is now included in a framework, shown in Figure 1, which is a 7-step process that adds a re-iteration step and risk management strategies (NRC, 2009). This process is recommended to establish a scenario-specific best means of action.

  1. Problem Formulation
  2. Hazard Identification
  3. Dose-response
  4. Exposure assessment
  5. Risk Characterization
  6. Risk management
  7. Re-evaluation of the problem and exposure

In the cases of "safely managed" fecal wastes and wastewater, as well as promoting reuse as mentioned in SDG 6, quantitative microbial risk assessment can be used to integrate science and policy and promote the translation of science into action. The risk framework indicated in Figure 1 involves defining the sanitation problem at the appropriate scale. In the case of sewage, the hazards should be better identified by testing for and quantifying pathogens found in feces and sewage. Exposures involve the pathogen flow from wastewater to surface or groundwaters, from local scales to larger watersheds/basins. The exposure assessment can address primarily three pathways from pathogen-laden sewage: 1) to source waters for drinking purposes; 2) to surface waters, rivers and lakes used for hygiene, fishing and bathing (recreation); and 3) to irrigation waters used for crops. Exposures, once linked to dose-response functions, provide the probabilities of infection and disease in the risk characterization step. All of these data and analyses then guide risk management options (Figure 1).

Figure 1. QMRA framework

The safety of drinking water and wastewater reuse goals have been advanced through the use of the quantitative microbial risk assessment (QMRA) framework and the use of advanced diagnostic technology for monitoring pollution sources and specific hazards. With these data, management strategies with stakeholder involvement are more likely to move forward. However, the use of this framework and tools for risk assessments for wastewater and sanitation have not been universally adopted. QMRA using pathogen data has not been used for recreational guidelines although there are some academic papers on QMRA for recreational waters. In addition, fecal indicator bacteria are still used by the states in the US and elsewhere for evaluating ambient water quality criteria for irrigation waters and food production, which underestimates risk from key pathogen groups. Thus it is appropriate that QMRA be used to address goals for pathogen control by wastewater treatment.

Further reading on QMRA as used for drinking water and reclaimed water can be found in Appendix A.

2.1 Sanitation Safety Planning (SSP) and QMRA

The World Health Organization describes their approach (known as “sanitation safety planning” (SSP)) for addressing fecal wastes and wastewater treatment as “a risk based management tool for sanitation systems” (WHO, 2016).

The goals are to address human fecal waste and wastewater treatment as viewed through what is known as the sanitation value chain (Figure 2). This value chain is a combination of management methods from cradle (source) to grave (final disposal) and addresses containment, emptying (in the case of onsite systems), transport, treatment, disposal and reuse. A large focus is on developing a team who can evaluate the various types of management systems (from sewers and treatment to solid fecal waste) and assign relative risk categories (e.g., high, medium, low) to these systems, along with addressing various scenarios or events (i.e., errors or failures) that would result in human exposure and produce some type of health risk.

Figure 3 shows the steps to be undertaken when implementing an SSP according to WHO.

  1. Prepare for the SSP. Form the team and address the data needed.
  2. Describe the sanitation systems. Gathering information on and describing what the value chain looks like, how many people are served by these systems, and what volumes are generated is part of this step.
  3. Identify hazardous events, assess existing control measures and exposure risks. This step focuses on who is at risk (including occupational workers), where problems might exist in the system in terms of increasing exposures, and approaches for examining relative risks.
  4. Develop and implement an incremental improvement plan. This step formalizes and seeks to compare various management strategies and has suggested log reductions of E. coli and helminths for reuse in agriculture (WHO, 2006).
  5. Monitor control measures and verify performance. This involves occupational methods for avoiding exposure, as well as, for reuse and irrigation waters, the monitoring of E. coli and helminth eggs (suggested agricultural application goals of <1 ova/L and 10³ E. coli/100 mL).
  6. Develop supporting programmes and review plans. This addresses human resources, operations and updates.

Figure 2. The sanitation value chain

(Source: This image is a derivative of "Sanitation Value Chain" by SuSanA Secretariat, which is licensed under CC BY 2.0)
Figure 3. Sanitation safety plan modules

Source: WHO, 2015

2.2 Comparison and Interface Between SSP and QMRA

The SSP and QMRA are quite compatible and synergistic despite some differences (Table 1). In fact, it is briefly mentioned in the SSP manual that a formal QMRA can be undertaken. As mentioned previously, the SSP is considered a tool, and the manual is a step-by-step process to address options for improving sanitation in a community. Some of the limitations of the SSP are that it treats all microbes as equal in their concentrations and risks, its exposure pathways are primarily focused along the sanitation value chain (although irrigation is included), and quantitative approaches are only briefly mentioned. SSP also includes occupational risks and communication plans. QMRA is more data intensive and may be more difficult to undertake, and its results more difficult to communicate. However, the ability to examine health outcomes associated with specific pathogens and a broader array of exposure pathways may provide the information needed for political decision making. In addition, if the goal is truly focused on reducing specific pathogen concentrations and loads, public health may be better protected by focusing on barrier technologies across all exposure pathways associated with disposal and reuse, which can be facilitated by QMRA.

Table 1. Comparison of SSP and QMRA frameworks and approaches

SSP: Prepare for the SSP | QMRA: Problem formulation
  Key differences: QMRA addresses explicitly the problem that needs to be addressed; other stakeholders, potentially those at greatest risk, are included in the QMRA.
  Key similarities: The goals of the process are described, interdisciplinary teams are assembled, and data needs are identified.

SSP: Describe the system | QMRA: Hazard identification
  Key differences: QMRA identifies specific microbes or groups of concern (a bacterium, virus, protozoan and helminth), while SSP does not address quantifying specific pathogens of concern.
  Key similarities: This SSP step overlaps with the problem formulation and exposure assessment steps of QMRA.

SSP: Identify events and controls | QMRA: Exposure assessment
  Key differences: QMRA looks at exposure pathways in a quantitative manner; SSP is explicit in regard to occupational risks and event scenarios.
  Key similarities: Describing the exposure pathways is central to these steps; here SSP also overlaps with the risk management step of QMRA.

QMRA: Dose-response
  Key differences: QMRA examines explicitly the probability of infection.

QMRA: Risk characterization
  Key differences: QMRA examines variability, assumptions, data gaps and sensitivity analysis (e.g., what is driving the risk).

SSP: Develop and implement an improvement plan | QMRA: Risk management
  Key differences: QMRA examines quantitatively the log reductions of pathogens by the various management approaches; SSP examines how to improve risk reduction, often through expert judgement.
  Key similarities: Both can examine and include the reliability of the management approaches.

SSP: Monitor control measures
  Key differences: SSP suggests how to monitor the control measures to better ensure reliability.

SSP: Develop supporting programmes and review | QMRA: Re-assess the problem and exposure through iteration
  Key differences: SSP is explicit about including a communications plan.
  Key similarities: Both can be used with an adaptive management strategy.

3.0 A Decision Framework for Sanitation using QMRA

3.1 Gathering the Quantitative Data

The use of QMRA depends on having data on pathogen concentrations in feces and sewage and data on removal by various sanitation technologies. Thus it is the goal of this online GWPP to provide the best data currently available on the helminths, protozoa, bacteria and viruses and to make the data accessible. Not all pathogens are created equal in their ability to cause harm: they are excreted differently, their concentrations are different, their persistence is different and their removals through treatment are different. Many technologies use time and natural die-off of pathogens to render the effluents or biosolids "safe" to reuse. Again, the most appropriate models for persistence, supported by data from latrines, digesters or lagoons for example, are often not used. The dose-response relationships (potency) of these pathogens differ, as do the associated disease severities, and these need to be addressed via the QMRA framework to understand the differences in risk. It is important that the statistical distributions of the various pathogens' concentrations be compared into and out of containment and treatment systems, yet often the data are limited.

One of the problems that needs to be defined is how much risk occurs to downstream users from sewage discharges to surface waters. To address this, rotavirus was used as the example hazard because it is an important cause of diarrhea in children and there is a sufficient amount of related data (e.g., WHO, 2006); however, the same approach could be used for pathogens from any taxon. Figure 4 illustrates the quantitative data which might be needed for assessing risks associated with viruses in untreated sewage. Here three exposure pathways are considered: (i) recreational use of the river system (swimming); (ii) use of the river system for hygiene (e.g., washing clothes, or bathing, seen as similar to recreational use); and (iii) use of the river as a potable source. Sewage is discharged with and without treatment. In this example, dilution in the river was assumed to be 10%; however, the exact dilutions for any given river over the seasons could be ascertained and used. It was assumed that there was no viral inactivation in water environments in Figure 4, but it is possible to incorporate die-off. The exposure process entails assessing the concentration of the viruses in the volume of water that is ingested. Exposure for swimming or bathing was estimated at 30 mL; for hygiene, where virus transfers from the water to the hand and from the hand to the mouth, the volume was estimated at 1 mL. The ingested volume is multiplied by the concentration in the water to determine the exposure dose for a single exposure. Drinking water exposure is assumed to be 1 to 2 L per day; this could be examined where individuals drink the river water without treatment or with further treatment.
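The arithmetic of this exposure step is simple enough to sketch in a few lines. The sewage concentration below is purely illustrative; the 10% dilution and the per-pathway ingestion volumes are the assumptions stated in the text.

```python
# Illustrative exposure assessment for the three river-use pathways
c_sewage = 1.0e5                # viruses per litre in untreated sewage (illustrative)
dilution = 0.10                 # assumed sewage fraction in the river
c_river = c_sewage * dilution   # viruses per litre at the point of exposure

ingested_volume_l = {
    "swimming/bathing": 0.030,         # 30 mL per event
    "hygiene (hand-to-mouth)": 0.001,  # 1 mL per event
    "drinking (untreated)": 2.0,       # 2 L per day (upper bound)
}

# Dose for a single exposure = concentration x ingested volume
doses = {use: c_river * v for use, v in ingested_volume_l.items()}
```

Each dose would then be passed into a dose-response equation to obtain a per-exposure probability of infection.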

Risk per exposure dose can be computed by using the calculated dose in dose-response equations (Haas, Rose and Gerba, 2014). Additionally, risk from multiple exposures, for example drinking water every day for a single year, can be estimated under simplifying assumptions of independence in order to determine an annual risk (Haas, 1996). One source of dose-response equations is the QMRAwiki (http://qmrawiki.org).
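Under the independence assumption cited above (Haas, 1996), the single-exposure risk compounds into an annual risk as follows; the per-exposure risk used in the example line is arbitrary.

```python
def annual_risk(p_single, n_exposures=365):
    """Annual probability of at least one infection, assuming
    independent, identically distributed daily exposures."""
    return 1.0 - (1.0 - p_single) ** n_exposures

# e.g., a per-exposure infection risk of 1e-4, incurred every day for a year
p_year = annual_risk(1.0e-4)
```

Note how quickly small daily risks accumulate: a 10⁻⁴ risk per day becomes roughly 3.6×10⁻² per year.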

Because rotavirus is excreted at very high levels, is highly potent and survives treatment, in this example the risks without sewage treatment are high (>10⁻²) even with drinking water treatment, using the values assumed in Figure 4. This example demonstrates why sewage treatment, as part of the multiple-barrier approach, is of extreme importance and needs to be recognized.

There has also been a set of assumptions that certain sewage treatment systems will achieve specific log reductions of pathogens, yet the data are often not available and have not been generated for some types of treatment systems. Le-Thi et al. (2017) in Vietnam found high levels of E. coli, Cryptosporidium and Giardia in effluents from anaerobic digesters: while energy recovery was being achieved, the reuse of effluents was posing a public health risk.

Much of the literature suggests large ranges of pathogen removals (e.g., 2 to 7 log10) for any given technology, and these assumptions are repeated across many references. There is little information on how design and operations can influence pathogen removal in various climatic regions of the world. It may be necessary to wait for more monitoring data to appropriately express the degree of uncertainty in pathogen removal efficiency in treatment systems, which may be influenced by operational conditions and season. At the moment, one practical option is to employ the minimum reported value of the removal efficiency in QMRA from the standpoint of being protective of public health, using a precautionary principle.

An effort to gather pathogen occurrence data through funded research and investigations using new diagnostic tools in developing regions of the world is needed. While GWPP provides some data, these may not be sufficient.

There are new approaches which map pathogen loads and in combination with hydrological data can map concentrations in waterways which can be used at numerous scales (city to country to global). These maps use population densities and disease incidence. Linking these loadings to hydrology of the model estimates concentrations used for estimates of health risk associated with potential end-uses of water for river systems receiving sewage. These methods have been used to examine rotavirus and Cryptosporidium (Hofstra et al., 2019; Kiulia et al., 2015; Vermeulen et al., 2019). The future development of such maps that include the ability to examine sanitation technologies with various scenarios would allow for improved prioritization and decision making (Hofstra et al., 2019).
Figure 4. Illustration of data flow for a QMRA analysis. While viruses are specified, this approach is applicable for all pathogens

3.2 Acceptable Risk and Acceptable Pathogen Concentrations

Policy and management decisions often involve not just reducing pathogen exposures but meeting a specific target of acceptability. Defining acceptable risks involves many factors not covered in this article (e.g., analysis of values, cost-benefit analysis). Some key questions asked are: Are the risk reduction targets primarily achievable via community approaches (thus often seen as involuntary risks)? How easily or cost-effectively can the risks be managed? How serious are the consequences associated with harm? How well is the risk understood? How easily is the management strategy implemented and monitored?

Infectious disease agents in water have historically been viewed as having a zero tolerance, or a goal of zero, particularly for drinking water. It was assumed that treatment and use of E. coli as an indicator could achieve this. It is now well understood that this is not the case. Early in the history of QMRA, an approach for examining low risk levels for drinking water and the use of performance criteria was developed for viruses and protozoa (Regli et al., 1991; Rose et al., 1991). What has emerged is a goal of 10⁻⁴ annual risk for drinking water (equal to or less than 1 infection per 10,000 population in one year acquired from drinking water). The USA articulated this as a goal in the Surface Water Treatment Rule, but it is also stated as a standard by the Dutch government and as a criterion or guideline by WHO (which is analogous or very similar to 10⁻⁶ DALY) (WHO, 2011; Schijven et al., 2011; Dutch Ministry of Infrastructure and the Environment of the Netherlands, 2011). Guidance on the performance of drinking water treatment suggested that to achieve this acceptable level of risk, the equivalent of 5.89, 5.96 and 5.96 log10 removals of protozoa, viruses and bacteria, respectively, was needed. This was based on the general levels of these microbes found in source waters from surface water supplies such as rivers, lakes and reservoirs (WHO, 2011).
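The logic behind such performance targets can be sketched as follows. This is not the WHO calculation itself: the exponential dose-response parameter and the source-water concentration below are hypothetical placeholders used only to show the back-calculation.

```python
import math

def required_log_removal(c_source, annual_risk_target, r, volume_l=1.0, n_days=365):
    """Log10 removal needed so that treated drinking water meets an annual
    infection-risk target, using an exponential dose-response model
    P = 1 - exp(-r * dose). All parameter values here are illustrative."""
    # convert the annual target to an equivalent per-day risk
    p_daily = 1.0 - (1.0 - annual_risk_target) ** (1.0 / n_days)
    # invert the dose-response model for the acceptable daily dose
    dose_ok = -math.log(1.0 - p_daily) / r
    c_ok = dose_ok / volume_l   # acceptable finished-water concentration
    return math.log10(c_source / c_ok)

# Hypothetical example: 10 organisms/L in source water, 10^-4 annual target,
# 1 L consumed per day, dose-response parameter r = 0.004
lr = required_log_removal(c_source=10.0, annual_risk_target=1.0e-4, r=0.004)
```

The required removal scales one-for-one with the log of the source-water concentration, which is why performance targets are tied to the general levels of microbes found in surface water supplies.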

The VIROCLIME project (Rusinol et al., 2014) studied the transport of viruses in river catchments in Europe where, in some cases, over 40% of the flow was dominated by sewage effluents (due to approximately 50 wastewater treatment plants (WWTP) in a particular basin). Virus monitoring was accomplished using qPCR, and 29% and 18% of river water and seawater samples were positive, respectively. The analysis of risk using a QMRA approach associated with the contamination by viral pathogens demonstrated the disease potential. This group concluded that viruses originate from wastewater and that management of these sources is necessary. It was recommended that 2 to 3 log10 reductions were needed to prevent environmental dispersion of these human fecal pathogens. In addition, acceptable water quality levels may be guaranteed only if wastewater containment and treatment are fully operational when floods or extreme rainfalls occur; thus treatment must be resilient to high flows and climate extremes.

A risk of 8×10⁻³ for recreation in the US has often been suggested as acceptable, based on epidemiological studies of swimmers' diarrhea at beaches impacted by sewage, yet this is not a universal goal (USEPA, 2012). In addition, this has not been readily translated into performance criteria for pathogen log reduction by sewage treatment.

3.3 The Decision Framework for Addressing Log Reductions

An overall decision framework is shown in Figure 5. The key steps are as follows:

  • The key pathogens of concern are identified (this would include several with at least one from each taxonomic group – i.e. a bacterium, a virus, a protozoan and possibly a helminth). These can be chosen as representative pathogens or can be chosen in regard to pathogens of concern for the region.
  • The levels of the key pathogens in the untreated fecal waste are estimated (hopefully using guidance in the various chapters of this book, levels in untreated sewage can be ascertained).
  • The nature of the end use is specified (e.g., potable source, water for food crop irrigation, water for bathing, etc.)
  • A reverse QMRA (USEPA, 2010) is performed, starting with the risk goal (e.g., the desired probability of infection, such as 1/10,000) and working backwards to calculate the concentration of each pathogen that would need to be achieved to make the end-use risk acceptable. Note that the level of acceptable risk itself requires external specification, including social and economic factors.
  • The difference between the acceptable pathogen concentrations and their concentrations in the untreated waste define the levels of treatment (possibly including disinfection) and dieoff that need to occur before use.
  • Indicator methods (coliform, enterococci, coliphage, Clostridium etc.) are often used to operationally assess the performance of treatment to achieve the public health objectives.
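The reverse QMRA and log-reduction steps above can be sketched as follows. The beta-Poisson parameters for rotavirus are commonly cited literature values; the raw-sewage concentration, per-event risk target and ingested volume are illustrative assumptions only, not values from this chapter.

```python
import math

def acceptable_concentration(p_target, alpha, n50, volume_l):
    """Invert the approximate beta-Poisson dose-response model to find the
    pathogen concentration that keeps a single exposure at or below p_target."""
    k = 2.0 ** (1.0 / alpha) - 1.0
    dose_ok = (n50 / k) * ((1.0 - p_target) ** (-1.0 / alpha) - 1.0)
    return dose_ok / volume_l

def required_log_reduction(c_raw, c_ok):
    """Treatment (and die-off) performance needed to bridge the raw and
    acceptable concentrations."""
    return math.log10(c_raw / c_ok)

# Illustrative irrigation scenario: rotavirus at 1e5 viruses/L in raw sewage,
# 1 mL of residual water ingested with produce, per-event risk target of 1e-4
c_ok = acceptable_concentration(p_target=1.0e-4, alpha=0.253, n50=6.17,
                                volume_l=0.001)
lr = required_log_reduction(c_raw=1.0e5, c_ok=c_ok)
```

Under these assumptions the result lands near 6 log10, consistent with the 6 to 7 log10 range discussed for unrestricted irrigation in Section 4.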

Figure 5. Decision framework for specifying treatment requirements

4.0 An example QMRA for Wastewater Reclamation

4.1 Introduction to Reuse

SDG 6 has begun to set goals for treating wastewater and for reuse. There is also a promotion of resource recovery facilities to obtain energy, nutrients and water for use. Nations and communities in Africa and Asia are now strategizing on how to build their infrastructure to meet the new goals for access to sanitation and wastewater treatment, and reuse is often thought to be one of the best economic strategies in the face of climate change and other socioeconomic changes. The reuse of reclaimed wastewater has been widely employed to compensate for increasing water demand in arid and semi-arid regions (Verbyla et al., 2016), but wastewater contains hundreds of waterborne pathogens, most notably enteric viruses (Kitajima et al., 2014), and thus must be treated in order to be used safely.

As mentioned above, the wastewater reclamation process needs to be designed to achieve a target health goal, implemented via performance targets that lead to a total log10 reduction (LR) across the various treatment unit processes (Sano et al., 2016). These target LR values can be calculated using reverse QMRA (USEPA, 2010), as described in Section 3.3.

4.1.1 World Health Organization (WHO) reuse guidelines

According to the World Health Organization (WHO) guidelines (2006), a wastewater reclamation process is designed to achieve a target value of LR by combining different types of wastewater treatment units and health protection measures. The LR target values are stipulated in international and domestic guidelines for wastewater reclamation and reuse, usually with respect to each intended end-usage of reclaimed wastewater. The WHO guidelines (2006) require a total of 6 to 7 LR for viruses by wastewater reclamation processes if reclaimed wastewater is used for unrestricted irrigation (i.e., irrigation of all crops, including vegetables eaten raw). In Australia, the states of Queensland and Victoria have also proposed that water reclamation plants achieve virus reductions of 6.5 and 7 log10, respectively, for agricultural irrigation (Environment Protection Agency Victoria, 2005, 2003; The State of Queensland, 2008). These LR target values recommended by established guidelines can be employed in other countries and regions where wastewater reclamation guidelines have not been prepared. However, several reports have argued that target LR values should be determined based on the context of the countries/regions where water reuse is implemented (The Victoria Department of Health, 2013). It is advantageous for wastewater engineers to prepare guidelines that instruct how to calculate LR values based on the virus types of concern/prevalence, virus monitoring data in untreated wastewater, the water reuse scheme, and the hygiene practices implemented in specific countries/regions (Environment Protection Agency Victoria, 2005, 2003; The State of Queensland, 2008).

4.2 Health Goals Established

Health goals are often described as a “tolerable disease burden”. Two distinct values for the tolerable annual disease burden are 10⁻⁴ and 10⁻⁶ disability adjusted life years per person per year (DALYpppy). The tolerable annual disease burden of 10⁻⁶ DALYpppy for waterborne infections due to certain pathogens is recommended by WHO (WHO, 2006). Meanwhile, a higher risk (10⁻⁴ DALYpppy) has been proposed as a mitigated tolerable burden for waterborne infectious diseases, mainly for developing countries/regions, because an overly small tolerable annual disease burden may raise other issues owing to the excess costs of precautionary regulations (Mara, 2011). In fact, the actual annual disease burden of diarrhoeal disease in developing countries is estimated to be ~10⁻² DALYpppy (Mara and Sleigh, 2010); therefore, the mitigated annual disease burden of 10⁻⁴ DALYpppy is a more reasonable target reflecting current local conditions. The 2011 WHO guidelines for drinking water quality stipulate that waterborne disease should have little impact on the overall disease burden compared with all other exposure routes, such as foodborne or hospital-acquired infections (WHO, 2011).

Below, we show how to calculate the virus LR target values required for wastewater reclamation under two hypothetical exposure scenarios of water reuse for agricultural irrigation.

4.3 Problem Formulation and Hazard Identification

Vegetables irrigated with contaminated water can retain waterborne pathogens and thus pose a significant infectious disease burden. Wastewater engineers therefore need to sufficiently reduce pathogens in untreated wastewater intended for agricultural reuse. Among waterborne pathogens, enteric viruses are usually assigned the highest LR target values because of their higher resistance to wastewater treatment processes and greater persistence in the environment compared with bacterial and protozoan pathogens (see the Persistence section of the GWPP, http://www.waterpathogens.org/node/103).

Norovirus genogroup II (NoV GII) was selected as the model virus for this example, because NoV causes the most waterborne disease cases in developed countries (Murphy et al., 2016; Gibney et al., 2017) and has been the most significant cause of gastroenteritis outbreaks among enteric viruses worldwide (Katayama and Vinje, 2017). However, any model virus can be selected at this step, provided its dose-response relationship has been established and datasets of its concentrations in untreated wastewater are available.

4.4 Exposure Pathways

Two exposure scenarios were used for this QMRA example: Scenario I assumed accidental ingestion of reclaimed wastewater by farmers; while Scenario II assumed daily consumption of vegetables grown with reclaimed wastewater irrigation.

In Scenario I, accidental ingestion of 1 mL of reclaimed wastewater per person per day was assumed for farmers (Ottoson and Stenström, 2003). In this scenario, reclaimed wastewater was assumed to be sprayed or sprinkled on agricultural land. Typical work was assumed to be at least 5 to 6 days a week per person in intensive farming, and a total of 300 days of irrigation per year was assumed, as in other QMRA studies (Mara et al., 2007).

In Scenario II, 10 mL of reclaimed wastewater was theoretically ingested by consuming 100 g of fresh vegetables grown with reclaimed wastewater irrigation (Asano et al., 1992). To assess the microbial risks of fresh vegetable consumption, 10 mL of irrigation water was assumed to remain on 100 g of fresh vegetables (e.g., lettuce), as in another agriculturally-based QMRA study (Asano et al., 1992). It has been experimentally demonstrated that most crops hold less water than leafy vegetables (Shuval et al., 1997); however, 10 mL per 100 g was adopted for every vegetable as a conservative assumption in this example. It was assumed that prepacked fresh vegetables were sold at wholesale and retail markets, and that 38.9 g per person per day of the prepacked fresh vegetables were consumed every day, based on the mean daily consumption of fresh vegetables in the Netherlands (Voorrips et al., 2000).

Pathogens may die off after being applied to vegetables. The WHO guidelines suggest that 2.0 log10 of natural die-off may occur after the last irrigation; however, the die-off conditions (temperature and number of days) are not well described, so this assumption is poorly supported but is used in this example (WHO, 2006). Better persistence data should be obtained in the future. An additional 1.19 log10 reduction of norovirus was also factored into this exposure, due to washing vegetables with tap water containing disinfectants (Dawson et al., 2005). Therefore, in this example, a total of 3.19 (= 2.0 + 1.19) log10 reduction of norovirus was assumed between the last irrigation and consumption.
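To make the exposure arithmetic concrete, the two daily-dose expressions that appear later in Table 2 can be sketched in Python. The function names are ours, and the default values (1 mL accidental ingestion for Scenario I; 10 mL per 100 g, 38.9 g per day, and the 3.19 log10 post-irrigation reduction for Scenario II) are taken from this section; this is an illustrative sketch, not code from the original study.

```python
# Daily NoV GII dose for the two exposure scenarios described above.
# c_log10 is the log10 virus concentration (gene copies/mL) in the
# reclaimed water used for irrigation.

def dose_scenario_i(c_log10, volume_ml=1.0):
    """Scenario I (farmers): lambda = 10^c x V, with V = 1 mL ingested."""
    return 10 ** c_log10 * volume_ml

def dose_scenario_ii(c_log10, ml_per_100g=10.0, intake_g=38.9, r_log10=3.19):
    """Scenario II (consumers): water carried on the daily vegetable
    portion, reduced by 10^R for die-off plus washing."""
    water_ml = ml_per_100g * intake_g / 100.0
    return 10 ** c_log10 * water_ml / 10 ** r_log10
```

For the same water quality, the vegetable-consumption dose comes out smaller than the farmer's accidental-ingestion dose, because the 3.19 log10 die-off plus washing reduction outweighs the slightly larger water volume carried on the daily portion.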

4.4.1 Concentrations of norovirus in sewage

One of the most important pieces of quantitative data needed is the pathogen concentration in wastewater. This can be taken from the literature, but where possible, monitoring the facility directly over a year-long time frame provides greater assurance that the data apply to the specific wastewater treatment plant.

For this example, norovirus genogroup II (NoV GII) was monitored in untreated wastewater. Samples collected from a pilot-scale wastewater treatment plant were quantified using reverse transcription-microfluidic quantitative PCR (RT-MF-qPCR), as explained elsewhere (Ito et al., 2017).

In this example, the NoV GII gene was detected in 46% (12/26) of the untreated wastewater samples, and a previously published Bayesian model (Ito et al., 2015; Kato et al., 2013) was employed to estimate the NoV GII concentration distribution from a dataset containing non-detects. The logarithmic mean and logarithmic standard deviation (SD) of the influent NoV GII concentrations were 7×10² and 1.7×10³ gene copies/L, respectively (translated to 0.7 and 1.7 log10 copies/mL, respectively, for the risk assessment exposure estimate).
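The Bayesian model of Kato et al. (2013) is not reproduced here, but its central idea, fitting a log10-normal concentration distribution while letting non-detects contribute only their probability of falling below the detection limit, can be illustrated with a deliberately simple maximum-likelihood grid search. Everything below (the function, the grids, and the simulated data) is an illustrative stand-in for the published model, not the published model itself.

```python
import math
from statistics import NormalDist

def censored_lognormal_mle(detects_log10, n_nondetect, lod_log10,
                           mu_grid, sigma_grid):
    """Grid-search MLE for a log10-normal concentration distribution.

    detects_log10: log10 concentrations of quantified samples.
    n_nondetect:   number of samples below the detection limit.
    lod_log10:     log10 detection limit.
    Detects contribute the normal density; each non-detect contributes
    the cumulative probability below the detection limit.
    """
    best_mu, best_sigma, best_ll = None, None, -math.inf
    for mu in mu_grid:
        for sigma in sigma_grid:
            nd = NormalDist(mu, sigma)
            ll = sum(math.log(nd.pdf(x)) for x in detects_log10)
            if n_nondetect:
                p_below = nd.cdf(lod_log10)
                if p_below <= 0.0:
                    continue  # non-detects impossible under this (mu, sigma)
                ll += n_nondetect * math.log(p_below)
            if ll > best_ll:
                best_mu, best_sigma, best_ll = mu, sigma, ll
    return best_mu, best_sigma
```

A Bayesian treatment additionally propagates the parameter uncertainty instead of returning a single point estimate, which is why it is preferred when the detection frequency is low (here 46%).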

4.5 Dose-response Models Used

The dose-response model is a mathematical equation describing the relationship between the dose of an agent (pathogen) to which a person is exposed and the probability of an adverse health effect in humans. Two dose-response models for NoV GI and GII, the hypergeometric model and the fractional Poisson model, are available for QMRA (Messner et al., 2014; Schmidt, 2015; Teunis et al., 2008; Van Abel et al., 2016). Both models follow single-hit theory, which assumes that a single viral particle can initiate infection in the host if it escapes the host's defenses. The hypergeometric model is an exact beta-Poisson model, modified to account for aggregation of viral particles, and implicitly assumes no protective immunity in the studied population (Schmidt, 2015). Meanwhile, the fractional Poisson model was developed as a special case of the hypergeometric model, in which the population is assumed to be divided into two subpopulations that are either fully immune or fully susceptible to infection (Schmidt, 2015). Since there was no difference between the LR values calculated using the two models, only the results obtained with the fractional Poisson model are presented in the next section.
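As a concrete illustration of this probability chain, the fractional Poisson infection model and the dose-dependent illness model can be written out with the parameter values listed in Table 2 (P = 0.722 and μ = 1,106 from Messner et al., 2014; η = 0.00255 and r = 0.086 from Teunis et al., 2008). The helper names are ours; this is a sketch of the published equations, not the authors' R code.

```python
import math

# Published parameter values (Messner et al., 2014; Teunis et al., 2008).
P_SECRETOR = 0.722     # fraction of secretor-positive (susceptible) people
MU_AGGREGATE = 1106.0  # mean virus aggregate size
ETA, R_PARAM = 0.00255, 0.086

def p_inf(dose):
    """Fractional Poisson: P(inf | exp) = P * (1 - exp(-dose/mu))."""
    return P_SECRETOR * (1.0 - math.exp(-dose / MU_AGGREGATE))

def p_ill_given_inf(dose):
    """Conditional illness model: 1 - (1 + eta*dose)^(-r)."""
    return 1.0 - (1.0 + ETA * dose) ** (-R_PARAM)

def p_ill(dose):
    """Daily probability of illness given exposure."""
    return p_inf(dose) * p_ill_given_inf(dose)
```

Note that the infection probability saturates at P = 0.722 for very large doses, which is the defining feature of the fractional Poisson model: the secretor-negative fraction of the population is never infected.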

4.6 Calculation of the Target LR Values

The relationship among virus concentration distribution, tolerable virus concentration in reclaimed wastewater and tolerable annual disease burden is shown in Figure A1. All parameter values used in the calculation are indicated in Table 2.

Table 2. Parameter values used for the quantitative microbial risk assessment

- Concentration of norovirus genogroup II (NoV GII) in influent, c (D): logarithmic mean = 0.7; logarithmic SD = 1.7 (log10 copies/mL) (Ito et al., 2017)

Exposure parameters (Scenario I: intensive farming)
- Volume of water ingested per day, V1 (F): 1 mL (Ottoson and Stenström, 2003)
- Days of exposure per year, n (F): 300 (Mara et al., 2007)
- Daily dose of NoV GII in intensive farming, λ (C): λ = 10^c × V1

Exposure parameters (Scenario II: raw vegetable intake)
- Volume of water remaining on vegetables, V1 (F): 10 mL per 100 g (Asano et al., 1992; Shuval et al., 1997)
- Daily intake of raw vegetables, W (F): 38.9 g (Voorrips et al., 2000)
- Days of intake per year, n (F): 365 (Voorrips et al., 2000)

NoV GII reduction by various health protection measures
- NoV GII reduction between last irrigation and intake, R (C): 3.19 log10 (= 2.0 + 1.19)
- Die-off after last irrigation (F): 2.0 log10 (WHO, 2006)
- Reduction with washing using water (F) (Dawson et al., 2005)
- Reduction with washing using chlorinated water (F): 1.19 log10 (Dawson et al., 2005)
- Daily dose of NoV GII in raw vegetable intake, λ (C): λ = 10^c × V1 × W / 10^R

Dose-response parameters
- Daily probability of infection (fractional Poisson model), pfp(inf | exp) (C): pfp(inf | exp) = P × (1 − exp(−λ/μ)), with P = 0.722 and μ = 1,106 (Messner et al., 2014)
- Illness rate among the infected, p(ill | inf, exp) (C): p(ill | inf, exp) = 1 − (1 + η×λ)^−r, with η = 0.00255 and r = 0.086 (Teunis et al., 2008)
- Daily probability of illness, p(ill | exp) (C): p(ill | exp) = p(inf | exp) × p(ill | inf, exp)
- Annual probability of illness, Pill (C): Pill = 1 − (1 − p(ill | exp))^n

Disability adjusted life year (DALY) calculation
- DALY per person per year (C): DALY = Pill × DB
- Disease burden per case of illness, DB (F): 6.23×10⁻³ (maximum) (Mok et al., 2014)

C: Calculation; F: Fixed; D: Distribution.

Note: The calculation steps for LR target values stated in the 2006 WHO guidelines for reclaimed wastewater irrigation of leafy vegetable crops were as follows: (1) 5,000 rotavirus per liter of untreated wastewater; (2) 10 mL of treated wastewater remaining on 100 g of lettuce after irrigation; (3) 100 g of lettuce consumed per person every second day throughout the year (WHO, 2006). The only parameter similar to the variables in Table 2 above is the concentration of the virus.

The LR target values of NoV GII were calculated based on its concentration distribution in untreated wastewater and the tolerable concentration in reclaimed wastewater. In Scenario I, the LR target values of NoV GII corresponding to the annual disease burden of 10⁻⁶ DALYpppy were 3.4, 4.7, and 6.4 log10 at 95, 99, and 99.9% reliability, respectively, for both dose-response models (Table 3). The LR target values of NoV GII were also calculated for the annual disease burden of 10⁻⁴ DALYpppy: 2.4, 3.6, and 5.4 log10 at 95, 99, and 99.9% reliability, respectively (Table 3). In Scenario II, the LR target values of NoV GII calculated for the annual disease burden of 10⁻⁶ DALYpppy were 2.3, 3.6, and 5.3 log10 at 95, 99, and 99.9% reliability, respectively. When the annual disease burden of 10⁻⁴ DALYpppy was employed, the 95, 99, and 99.9% reliability values corresponded to 1.3, 2.6, and 4.3 log10, respectively. An approximately 3.0 log10 difference in the LR target values of NoV GII was observed between the 95 and 99.9% reliability in both Scenario I and Scenario II.

Table 3. Log reduction target values

Scenario I
- Tolerable disease burden of 10⁻⁶ DALYpppy: 3.4, 4.7, and 6.4 log10 at 95, 99, and 99.9% reliability,ᵃ respectively
- Tolerable disease burden of 10⁻⁴ DALYpppy: 2.4, 3.6, and 5.4 log10 at 95, 99, and 99.9% reliability, respectively

Scenario II
- Tolerable disease burden of 10⁻⁶ DALYpppy: 2.3, 3.6, and 5.3 log10 at 95, 99, and 99.9% reliability, respectively
- Tolerable disease burden of 10⁻⁴ DALYpppy: 1.3, 2.6, and 4.3 log10 at 95, 99, and 99.9% reliability, respectively

ᵃ95, 99 and 99.9% represent the cumulative probability (percentile) of the NoV GII concentration distribution in untreated wastewater for which the tolerable annual disease burden of 10⁻⁴ or 10⁻⁶ disability-adjusted life years per person per year (DALYpppy) is met.

4.7 Final Conclusions

The reliability of meeting a given tolerable annual disease burden was defined in this example by considering the variability of virus concentrations in untreated wastewater and of the virus reduction efficiency. The consideration of uncertainty and variability in risk assessment is critical for decision-making (Morgan et al., 1985). Reliability values of 95, 99, and 99.9% were employed for calculating the LR target values of NoV GII. As a result, an approximately 3.0 log10 difference in the calculated LR target values was observed between the 95% and 99.9% reliability values (Table 3). This difference in LR values changes with the mean and SD of the probabilistic distribution of virus concentrations in untreated wastewater, which emphasizes the importance of accurate parameter estimation for the virus concentration distribution used in the LR calculation. The factors driving fluctuations in virus concentration, including seasonal prevalence, need to be carefully elaborated when determining the distribution parameters used in the LR calculation.

Optimally, operators of wastewater reclamation facilities would monitor the target virus in the influent during operation. If the peak loading of the target virus is found to be substantially higher than expected from the estimated probabilistic distribution used in the LR calculation, it could be necessary to implement additional protection measures in the wastewater reclamation process, such as increased disinfectant doses. Options for emergency operation may also need to be included in water reuse guidelines.

However, if data are not available, one option for characterizing the pathogen concentration in untreated wastewater is non-parametric (bootstrap) generation of a concentration dataset based on a systematic literature review, such as the GWPP, via meta-analysis (Eftim et al., 2017), although the left-censored data issue (low values and non-detects) cannot be overcome when the number of detected samples is small. The other option is to use the maximum virus concentration, which gives conservative LR values (Gerba et al., 2017).
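As a sketch of the bootstrap option, suppose a handful of log10 concentrations have been pulled from a literature review (the numbers below are invented purely for illustration). Resampling with replacement gives an empirical distribution from which a design percentile can be read; the full meta-analytic treatment this stands in for is described by Eftim et al. (2017).

```python
import random

def bootstrap_percentile(log10_concs, pct=0.95, n_boot=2000, seed=0):
    """Mean bootstrap estimate of the pct percentile of log10 concentrations."""
    rng = random.Random(seed)
    n = len(log10_concs)
    estimates = []
    for _ in range(n_boot):
        resample = sorted(rng.choices(log10_concs, k=n))  # sample with replacement
        estimates.append(resample[min(n - 1, int(pct * n))])
    return sum(estimates) / n_boot

# Invented literature values (log10 copies/mL), for illustration only:
literature = [0.2, 0.8, 1.1, 1.5, 1.9, 2.3, 2.6, 3.0, 3.4, 3.9]
```

Note the limitation stated above: the bootstrap can never generate values outside the observed range, so heavily censored datasets still bias the upper percentiles downward.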

LR target values of 6-7 log10 for pathogens including viruses are indicated in the 2006 WHO guidelines (covering wastewater treatment plus various health protection measures) for unrestricted irrigation, in order to achieve the tolerable annual disease burden of 10⁻⁶ DALYpppy (WHO, 2006). The guideline of Queensland, Australia (The State of Queensland, 2008) states that a 6.5 log10 reduction is needed for the highest quality reclaimed water (Class A+), which can be used for agricultural irrigation. In the state of Victoria, reclaimed wastewater used for irrigation of vegetables eaten raw is classified as the highest quality reclaimed water (Class A); the Class A standard in Victoria requires a 7 log10 reduction from untreated wastewater (Environment Protection Agency Victoria, 2005, 2003).

These LR values, including the WHO-recommended values, are approximately 1-2 log10 higher than the LR target values calculated in this example at 99.9% reliability. The differences arise from differences in the virus type of concern, its concentration in untreated wastewater, and the exposure parameters.

As mentioned above, the virus type of concern, its concentration in untreated wastewater, and the water reuse scheme should be based on the epidemiological background of the countries/regions where water reuse is implemented. Many countries or regions do not have the resources to monitor the pathogens of concern, so the GWPP, with its literature review of data on various pathogen concentrations, will hopefully be useful. It is recommended that any QMRA for wastewater treatment or reclamation disclose the calculation steps of the LR target values, along with the virus monitoring procedures and the quantitative data used. This would feed into the decision framework described above in Section 3.0.

This means that decision makers can decide on the type of hazard and the level of reliability, which would provide feasible target log reductions as a first step forward.

Appendix A. Calculation Steps for The Target LR Values

The QMRA model was developed based on a previous study (Symonds et al., 2014) and used to calculate the tolerable concentration of NoV in reclaimed wastewater and the LR target value of wastewater reclamation systems, given a tolerable annual disease burden associated with NoV infections of 10⁻⁴ or 10⁻⁶ DALYpppy under each exposure scenario (Mara et al., 2010; WHO, 2006). In this Appendix, the step-by-step calculation of the target LR values is presented.

R codes for these calculations are available in the supplementary data of Ito et al. (2017). URL: https://www.sciencedirect.com/science/article/pii/S0043135417307194

STEP 1: Determination of virus concentration in influent

Quantified data on NoV in wastewater influent at the target facility are preferable for calculating the required LR values, because they reflect site-specific conditions. There is clear seasonality in NoV epidemics, with larger numbers of infections usually observed during winter, and the NoV concentration in wastewater influent follows this periodic fluctuation (Kazama et al., 2016). It is therefore highly recommended that NoV in wastewater influent be quantified frequently during the epidemic season (once or twice a week), and preferable if NoV concentration datasets covering a multi-year period are available.

NoV genome quantities in wastewater influent are measured by RT-qPCR. Prior to NoV genome quantification using RT-qPCR, a wastewater sample is usually concentrated by a two-phase separation technique using polyethylene glycol (Lewis and Metcalf, 1988). To ensure the accuracy of the quantified values, it is necessary to include a series of control materials, including a process control, a molecular process control, and an RT-qPCR control, to detect inhibition in the recovery, RNA extraction, and genome quantification steps (Haramoto et al., 2018). When the recovery of these controls is too low (e.g., less than 1%), the virus quantification process needs to be repeated; otherwise, the quantified genome copy numbers are not reliable.

The calculation process requires datasets that follow a log-normal distribution (Kato et al., 2013). The lognormality of a dataset should be checked with statistical tests such as the Kolmogorov-Smirnov or Shapiro-Wilk test.
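A rough screening version of such a check can be written with the Python standard library alone: fit a normal distribution to the log10 values by moments and compute the one-sample Kolmogorov-Smirnov statistic. Because the parameters are estimated from the same data, the classical 1.36/sqrt(n) critical value used below is only approximate (the Lilliefors correction would be stricter), so treat this as a screen rather than a formal test; a statistics package implementing Shapiro-Wilk is preferable in practice.

```python
import math
from statistics import NormalDist, mean, stdev

def ks_statistic_normal(log10_values):
    """One-sample KS statistic of log10 data against a moment-fitted normal."""
    xs = sorted(log10_values)
    n = len(xs)
    fitted = NormalDist(mean(xs), stdev(xs))
    d = 0.0
    for i, x in enumerate(xs):
        cdf = fitted.cdf(x)
        # Compare the fitted CDF against the empirical CDF on both
        # sides of the step at x.
        d = max(d, abs((i + 1) / n - cdf), abs(cdf - i / n))
    return d

def looks_lognormal(log10_values):
    """Approximate 5% screen using the asymptotic KS critical value."""
    n = len(log10_values)
    return ks_statistic_normal(log10_values) < 1.36 / math.sqrt(n)
```

A dataset that fails this screen (e.g., strongly bimodal log10 concentrations) should not be summarized by a single log-normal distribution in the LR calculation.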

STEP 2: Determination of exposure parameter values

The default values of the exposure parameters used in the calculation are shown in Table 2. There are two exposure scenarios: Scenario I assumes accidental ingestion of reclaimed wastewater by farmers, while Scenario II assumes daily consumption of vegetables grown with reclaimed wastewater irrigation. Each parameter value can be replaced with revised values reported in the scientific literature.

When other exposure scenarios are employed, exposure parameter values need to be taken from the scientific literature. The QMRA Wiki provides parameter values for ingestion through drinking water and recreational water (http://qmrawiki.canr.msu.edu/index.php?title=Human_Environment_Exposure_Parameters).

STEP 3: LR target value calculation

All calculations were performed in R version 3.1.0. R is a free software environment for statistical computing and graphics that compiles and runs on a wide variety of UNIX platforms, Windows, and MacOS; it can be downloaded and installed from the R Project for Statistical Computing (https://www.r-project.org). The R codes used in this calculation are found in the supplementary data of Ito et al. (2017): https://www.sciencedirect.com/science/article/pii/S0043135417307194

The R codes provided by the first author of Ito et al. (2017) include the default exposure parameter values indicated in Table 2.

The mathematics of the target LR calculation is as follows:

The DALYpppy is the tolerable annual disease burden per person per year derived from NoV GII infection-related illness. DALYpppy can be estimated as the product of the annual probability of illness and the estimated burden of disease per case of illness (i.e., DALYs per case), denoted hereinafter by Pill and DB, respectively. Namely, DALYpppy is expressed as:

$DALY= P_{ill}\times DB$ (1)

Although the parameter DB is given as a uniform distribution in the literature (Mok et al., 2014), the maximum value in the range of the disease burden per case of illness (6.23×10⁻³) was used in the calculation to take the worst case into account.

Let us denote the event of illness and exposure by ill and exp, respectively. For example, the notation p(ill | exp) represents the probability of illness under the condition of an exposure event. Letting n be the number of exposure days per person per year, the annual probability of illness can be written as

$P_{ill}= 1-(1- p(\mathbf{ill} |\mathbf {exp}))^n$ (2)

Let inf denote the event of infection. The illness probability under exposure is derived as

$p(\mathbf{ill}|\mathbf{exp})= p(\mathbf{ill}|\mathbf{inf},\mathbf{exp})\times p(\mathbf{inf}|\mathbf{exp})$ (3)

In this study, the following model for the dose-dependent conditional probability of illness was employed (Teunis et al., 2008):

$p(\mathbf{ill}|\mathbf{inf},\mathbf{exp})=1 - (1+\eta\lambda)^{-r}$ (4)

where η and r are the pre-defined parameters of this model (Messner et al., 2014), and the parameter λ is a daily dose of norovirus. The parameter λ is used again in the model of infection p(inf | exp).

Two models, the hypergeometric model and the fractional Poisson model -- denoted by phg(inf | exp) and pfp(inf | exp), respectively -- were compared as the model of infection p(inf | exp) in this study. The hypergeometric model phg(inf | exp) was introduced by Teunis et al. 2008. Their model has three parameters α, β, and a (0 ≤ a ≤ 1) and its probability function is given by:

$p_{hg}(\mathbf{inf}|\mathbf{exp})=1-{}_{2}F_{1}\left(\beta,\frac{\lambda(1-a)}{a};\alpha+\beta;a\right)\times\left(\frac{1}{1-a}\right)^{-\frac{\lambda(1-a)}{a}}$ (5)

This equation is the Pfaff transformation of the hypergeometric model (Barker et al., 2013). In this study, the values determined by Messner et al. (2014) were used for the three parameters α, β, and a. These dose-response parameters for the hypergeometric model were obtained by fitting to multiple-dose data for NoV GI genotype 1 (NoV GI.1) and NoV GII genotype 4 (NoV GII.4) from human challenge studies (Atmar et al., 2014; Frenck et al., 2012; Seitz et al., 2011; Teunis et al., 2008). The Pfaff transformation of the model was used here as a close approximation, assuming all doses ≤33,323, because the fitted value of the parameter a provided by Teunis et al. (2008) exceeds a constraint of the Gauss hypergeometric function used in Eq. (5) (Barker et al., 2013).

The fractional Poisson model pfp(inf | exp) was proposed by Messner et al. (2014). It reflects the secretor status for histo-blood group antigens (HBGA), which has been suggested as a host susceptibility factor for human noroviruses (Lindesmith et al., 2003). In the fractional Poisson model, it is assumed that secretor-positive individuals (Se+) are perfectly susceptible and secretor-negative individuals (Se−) are protected from norovirus infection. The model is given as:

$p_{fp}(\mathbf{inf}|\mathbf{exp})=P\times\left(1-e^{-\lambda/\mu}\right)$ (6)

where λ is the daily dose of norovirus described above, µ is the mean aggregate size, and P is the fraction of secretor-positive individuals (Messner et al., 2014). These dose-response parameters for the fractional Poisson model were obtained by fitting to the multiple-dose data for NoV GI.1 and NoV GII.4 from human challenge studies (Atmar et al., 2014; Frenck et al., 2012; Seitz et al., 2011; Teunis et al., 2008).

The daily dose (λ) of NoV was calculated using Eqs. (7) and (8) for Scenarios I and II, respectively:

$\lambda= 10^{c}\times V$ (7)

$\lambda=10^{c}\times ((W_{L}\times W_{Ca})\times V_{L}+(W_{T}\times W_{Cu})\times V_{NL})/10^{R}$ (8)

where c is the logarithmic tolerable concentration in reclaimed wastewater, V is the volume of water ingested per person per day, W is the daily consumption of fresh vegetables, and R is the total NoV log10 reduction from the last irrigation to consumption.

The tolerable concentration of NoV in reclaimed wastewater was calculated iteratively, following Symonds et al. (2014). Briefly, the value of the tolerable concentration was increased from zero in increments of 0.1 for at most 10,000 iterations. Once the tolerable concentration, combined with the maximum DB (6.23×10⁻³), produced a DALY value above the tolerable annual disease burden of 10⁻⁴ or 10⁻⁶ DALYpppy, the loop was terminated. The tolerable concentration was then decreased by 0.1 so that the tolerable annual disease burden of 10⁻⁴ or 10⁻⁶ DALYpppy was met.
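Under the Scenario I assumptions (1 mL ingested on 300 days per year), the chain from daily dose through annual burden, together with this iterative search, can be sketched as follows. This is our stdlib re-implementation of the idea, not the published R code; the search starts well below zero rather than at zero so that it also works for parameter choices where the tolerable concentration is negative.

```python
import math

# Parameter values from Table 2 (Messner et al., 2014; Teunis et al.,
# 2008; Mok et al., 2014); Scenario I exposure (1 mL, 300 days/year).
P, MU = 0.722, 1106.0
ETA, R_ILL = 0.00255, 0.086
DB = 6.23e-3  # worst-case DALYs per case of illness

def annual_daly(c_log10, volume_ml=1.0, days=300):
    """DALYpppy for Scenario I at reclaimed-water concentration c (log10/mL)."""
    lam = 10 ** c_log10 * volume_ml                   # daily dose, Eq. (7)
    p_inf = P * (1.0 - math.exp(-lam / MU))           # fractional Poisson
    p_ill_inf = 1.0 - (1.0 + ETA * lam) ** (-R_ILL)   # Eq. (4)
    p_daily = p_inf * p_ill_inf                       # Eq. (3)
    p_annual = 1.0 - (1.0 - p_daily) ** days          # Eq. (2)
    return p_annual * DB                              # Eq. (1)

def tolerable_concentration(target_daly, step=0.1, start=-10.0):
    """Raise c in 0.1-log10 steps; return the last value whose annual
    burden still meets the target (the 'step back' in the published loop)."""
    c = start
    while annual_daly(c + step) <= target_daly:
        c += step
    return c
```

With these parameter values, tolerable_concentration(1e-6) lands near 0.2 log10 copies/mL; subtracting such a value from a chosen percentile of the influent distribution gives the LR target.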

The virus LR target values were then calculated to ensure that the annual disease burden of 10⁻⁴ or 10⁻⁶ DALYpppy is met at the 95, 99, and 99.9% reliability values. Briefly, the corresponding percentile (95th, 99th, or 99.9th) of the NoV concentration was obtained from the influent concentration distribution. The required LR target value was then calculated as the difference between that percentile of the NoV concentration in influent and the tolerable concentration of NoV in reclaimed water. This process is depicted in Figure A1.
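This final subtraction can be written directly with the Python standard library's NormalDist. The influent parameters (log10 mean 0.7, SD 1.7) are those of this example; the tolerable concentration of 0.2 log10 copies/mL passed in below is an illustrative placeholder for whatever value the iterative calculation produced.

```python
from statistics import NormalDist

def lr_target(influent_mu, influent_sd, tolerable_log10, reliability=0.95):
    """LR target = reliability percentile of the influent log10 concentration
    distribution minus the tolerable log10 concentration in reclaimed water."""
    influent = NormalDist(influent_mu, influent_sd)
    return influent.inv_cdf(reliability) - tolerable_log10

# LR targets at the three reliability levels used in this example,
# with an illustrative tolerable concentration of 0.2 log10 copies/mL:
for rel in (0.95, 0.99, 0.999):
    print(rel, round(lr_target(0.7, 1.7, 0.2, rel), 1))
```

Because the influent SD (1.7 log10) is large, moving from 95% to 99.9% reliability adds roughly 1.45 standard normal quantiles times 1.7, which is the ~2.5 to 3 log10 spread seen in Table 3.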

Figure A1. The relationship among virus concentration distribution, tolerable virus concentration in reclaimed wastewater and tolerable annual disease burden