Cryptosporidium spp.


Chapter info

Copyright:


This publication is available in Open Access under the Attribution-ShareAlike 3.0 IGO (CC-BY-SA 3.0 IGO) license (http://creativecommons.org/licenses/by-sa/3.0/igo). By using the content of this publication, the users accept to be bound by the terms of use of the UNESCO Open Access Repository (http://www.unesco.org/openaccess/terms-use-ccbysa-en).

Disclaimer:

The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. The ideas and opinions expressed in this publication are those of the authors; they are not necessarily those of UNESCO and do not commit the Organization.

Citation:

Betancourt, W. 2019. Cryptosporidium spp. In: J.B. Rose and B. Jiménez-Cisneros (eds), Global Water Pathogen Project (R. Fayer and W. Jakubowski (eds), Part 3: Protists). http://www.waterpathogens.org/book/cryptosporidium. Michigan State University, E. Lansing, MI, UNESCO. http://www.waterpathogens.org

Acknowledgements: K.R.L. Young, Project Design editor; Website Design (http://www.agroknow.com)

Authors: 
Walter Betancourt (University of Arizona)

Summary

Cryptosporidium is a genus of protists recognised as a major cause of diarrhoeal illness, contributing significantly to the global burden of gastroenteritis, especially in young children. Cryptosporidium is an apicomplexan traditionally considered a coccidian but is more closely related genetically to the gregarines. Cryptosporidium occurs worldwide but infection is especially prevalent where drinking water quality and sanitation are poor, and is most significant clinically in young children, malnourished people and immunocompromised patients.

The oocyst stage of the life cycle is shed in faeces of humans and animals and survives many environmental conditions and disinfectants. Oocysts have been detected in surface and ground waters, drinking water, wastewaters, treated and untreated recreational waters, soil and biofilm, and food and beverages including fruit and vegetables, juice, milk and shellfish. These can be transport vehicles from infected to susceptible hosts, in addition to direct transmission through person-to-person and animal contact.

Cryptosporidium oocysts can be detected by microscopy, immunological or molecular techniques. The numerous species can only be differentiated by molecular techniques as their oocysts are morphologically similar. Although many are host-adapted to animals, the most common species infecting humans are C. hominis, C. parvum and C. meleagridis, with geographic differences in species and subtype distribution. Despite description of several genotyping methods, there is no standardised method of investigating subtypes within species. The increasing availability of sequence data and whole genome sequences will help the discovery of suitable markers for multilocus genotyping. The limited ability to propagate Cryptosporidium spp. in culture and the lack of quantitative molecular methods for assessment of the human infectivity potential (species; viability) of isolates in food, water and environmental samples hamper risk assessments.

Prevention and control measures include personal hygiene, effective sanitation and drinking water protection and treatment. There is a lack of effective specific therapy and no vaccine.

Members of the genus Cryptosporidium are eukaryotic protozoan parasites that infect epithelial cells in the microvillus border of the gastrointestinal tract of all classes of vertebrates. Cryptosporidium has been recognized as one of the most common etiological agents of waterborne disease.

1.0 Epidemiology of the Disease and Pathogen(s)

1.1 Global Burden of Disease

Since the emergence of Cryptosporidium as a human pathogen in 1976, its global distribution has been recognized, but we are still learning about the severity of disease (i.e., cryptosporidiosis), its true prevalence, and its impact on human and animal health. In 2004 this parasite was included in the World Health Organization’s Neglected Diseases Initiative (Savioli et al., 2006), acknowledging its links to poverty, its disproportionate effect on young and malnourished children, impairing growth and development, and its effect on immunocompromised people. Studies undertaken using improved diagnostic assays have revealed that the contribution of Cryptosporidium to childhood morbidity and mortality is significant and have established Cryptosporidium among the most important infectious causes of moderate-to-severe diarrhoea, with long-term clinical impacts, cognitive deficit and socio-economic impact (Shirley et al., 2012; Kotloff et al., 2013; Khalil et al., 2018).

For example, in India, positivity rates for Cryptosporidium diagnostic tests ranging from 1.1% to 18.9% have been reported among people in communities and attending healthcare settings. To generate annual national estimates of the burden of diarrhoea, hospitalization, and mortality in children under two years of age (the age group with the highest prevalence), data from studies in Vellore have been extrapolated to the whole population (Sarkar et al., 2013). These estimates suggest that each year, due to cryptosporidiosis, one in every 6 to 11 children will have an episode of diarrhoea (3.9 to 7.1 million episodes nationally), 1 in every 169 to 633 children will be hospitalized (66.4 to 249.0 thousand hospitalisations nationally) and 1 in every 2,890 to 7,247 children will die (5.8 to 14.6 thousand deaths nationally) (Sarkar et al., 2013).
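
As an illustration of how such per-child rates translate into national totals, the sketch below reproduces the arithmetic in Python. The rates are those quoted above from Sarkar et al. (2013); the size of India's under-two population (~42 million) is an assumption back-calculated here so that the rates and the quoted national totals agree, and the code is purely illustrative.

```python
# Illustrative back-calculation of national burden estimates from per-child rates.
# Assumption (not stated in the source): India's under-two population is ~42 million,
# inferred so that the quoted rates and national totals are mutually consistent.
UNDER_TWO_POPULATION = 42_000_000

def national_range(one_in_low, one_in_high, population=UNDER_TWO_POPULATION):
    """Convert '1 in every X to Y children' into a (low, high) national estimate."""
    return population / one_in_high, population / one_in_low

estimates = {
    "Diarrhoea episodes/year": national_range(6, 11),
    "Hospitalisations/year": national_range(169, 633),
    "Deaths/year": national_range(2_890, 7_247),
}
for label, (low, high) in estimates.items():
    print(f"{label}: {low:,.0f} to {high:,.0f}")
```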

The Global Enteric Multicenter Study (GEMS) is the largest prospective study of diarrhoeal morbidity and mortality in children to date (Kotloff et al., 2013). It was designed as a longitudinal case–control study during the first 3 years of the lives of 9,439 children with moderate to severe diarrhoea, and 13,129 control children without diarrhoea, at seven sites in sub-Saharan Africa and South Asia. Cryptosporidium was found to be one of the leading causes of, and was significantly associated with, moderate-to-severe diarrhoea; it was the only gastrointestinal pathogen associated with an increased risk of death in children aged 12 to 23 months.

By extrapolating the GEMS data, an enormous burden of Cryptosporidium disease has been identified: the estimated annual number of Cryptosporidium-attributable diarrhoea cases in children aged <24 months was 2.9 million in sub-Saharan Africa and 4.7 million in the India/Pakistan/Bangladesh/Nepal/Afghanistan regions, leading to a combined ~202,000 Cryptosporidium-attributable deaths (Sow et al., 2016).

The global significance of cryptosporidiosis is therefore widespread and far-reaching. It is also now understood that diarrhoea in low-income countries is frequently caused by multiple pathogens simultaneously, with Cryptosporidium among the most important of these enteric pathogens (Shirley et al., 2012). One implication is that the challenges this parasite presents for control in drinking water (see below) need to be considered when general control measures for diarrhoeal illness are put in place.

1.1.1 Symptoms

Diarrhoea is the most common symptom of cryptosporidiosis, but its manifestation varies according to the clinical history of the person affected, and different syndromes tend to be seen in three broad populations (Chappell et al., 1999; Chappell et al., 2006; Chalmers and Davies, 2010):

  1. People who are otherwise healthy and immune competent experience prolonged, watery diarrhoea of sudden onset, abdominal pain, nausea and vomiting, and low-grade fever. These symptoms usually last up to two weeks but can persist for up to four weeks and are ultimately self-limiting. During this time, symptoms may stop only to re-start again a day or so later. Significant weight loss can occur. This is the syndrome most commonly seen in children and adults in middle- and high-income countries.
  2. Malnourished young children (<2 years of age) experience moderate to severe diarrhoea that can be persistent and result in increased morbidity and mortality. This is the syndrome often seen in low-income countries.
  3. Severely immunocompromised individuals may suffer chronic severe diarrhoea of large volume (sometimes > 2L per day) that is hard to control, and is often associated with profound weight loss and prostration, and increased morbidity and mortality. At most risk are people with acquired T-cell immune deficiency (e.g. advanced HIV infection) or congenital T-cell immune deficiency, and those with hematologic malignancies (particularly children). Complications include pancreatobiliary cryptosporidiosis, which may rarely lead to sclerosing cholangitis and liver cirrhosis observed in patients from industrialized countries. Pulmonary cryptosporidiosis is also a rare complication.

In addition to the gastrointestinal syndrome usually seen, pulmonary cryptosporidiosis may be an under-diagnosed disease in children with diarrhoea and unexplained cough (Mor et al., 2010).

1.1.2 Asymptomatic carriage

Asymptomatic carriage occurs in both immunocompetent and immunocompromised patients (Clayton et al., 1994) but has been mainly investigated and demonstrated among young children, in low- and high-income countries. Asymptomatic carriage may present a risk of onward transmission where hygiene is poor. Carriage rates of 1.3% to 3.8% have been found in cross-sectional studies of immunocompetent children in childcare settings in France, Spain, the United Kingdom, and the United States (Cordell and Addiss, 1994; Davies et al., 2009). A birth cohort study of children in the first two years of life in a shanty town in Lima, Peru identified 36/207 (18%) recruits with asymptomatic Cryptosporidium infections, which were found to be associated with poor growth (Checkley et al., 1998), indicating that asymptomatic infection also causes an important burden of disease.

1.1.3 Sequelae

The consequences of infection with Cryptosporidium are only just beginning to be understood. Associations between Cryptosporidium, malnutrition, and deficits in the growth and cognitive development of young children have been identified (Mølbak et al., 1997; Agnew et al., 1998; Checkley et al., 1998; Guerrant et al., 1999; Mondal et al., 2009), and a cycle of infection and impairment becomes established. It can be difficult to tell from some studies whether malnutrition is part of the sequelae or is a causative factor in infection; in fact, both may be true. One study of a birth cohort in Bangladesh found that malnutrition-linked stunting at birth was a risk factor for subsequent infection (Mondal et al., 2012). A study in Peru found that stunting in early childhood was associated with persistent growth deficits post-infection compared with non-stunted children in whom post-infection growth deficits were transient (Checkley et al., 1998). In a study of children in a favela in Brazil, C. hominis was associated with heavier infections and greater growth shortfalls, even in the absence of symptoms, compared to C. parvum (Bushen et al., 2007). In addition, a recent study in Brazil has identified that diarrhoea in early childhood can cause cognitive delays independently of malnutrition (Pinkerton et al., 2016).

There is also growing evidence of longer-term health effects after resolution of acute cryptosporidiosis in high-income countries. There are reports of a sero-negative reactive arthritis in adults (Hay et al., 1987; Ozgul et al., 1999; Hunter et al., 2004) and children (Shepherd et al., 1989), including one report of Reiter’s syndrome (arthritis, conjunctivitis and urethritis) (Cron and Sherry, 1995). In the UK, a case-control study found that cases of cryptosporidiosis caused by C. hominis (but not C. parvum) were more likely to experience joint pain, eye pain, headaches and fatigue in the two months following infection (Hunter et al., 2004). A case-control study in Sweden (Rehn et al., 2015) after two waterborne outbreaks of C. hominis found that cases were more likely than controls to report diarrhoea, abdominal pain and joint pain several months after the outbreak. It has also been suggested that Cryptosporidium infection may cause relapses in Crohn’s disease and ulcerative colitis (Manthey et al., 1997). Several gastrointestinal pathogens lead to an increased risk of developing irritable bowel syndrome (IBS), and in experimental infection of rats, C. parvum triggered long-term pathological changes similar to those seen in patients with IBS (Marion et al., 2006; Khaldi et al., 2009).

Associations between cryptosporidiosis and cancer have been reported in immunocompromised populations. In one study of human immunodeficiency virus (HIV)-infected people, the incidence of colorectal cancer was reported to be higher than in the general population (Patel et al., 2008) and a study of people with acquired immunodeficiency syndrome (AIDS) found a significantly higher risk of developing a colon carcinoma among those with cryptosporidiosis (Shebl et al., 2012). It has been suggested that children with X-linked hyper-IgM syndrome and cryptosporidiosis may be at greater risk of developing liver cancer (Tomizawa et al., 2004). In two studies of patients with colorectal cancer in Poland prior to chemotherapy, the prevalence of Cryptosporidium spp. was comparable to that reported for patients with immune deficiency (Sulzyc-Bielicka et al., 2007; Sulzyc-Bielicka et al., 2012). Gene expression studies and induction of neoplastic changes in the digestive epithelium of the mouse model that contribute to colonic carcinogenesis lend support to an association of cryptosporidiosis with cancer in humans and justify further epidemiologic and experimental studies (Benamrouz et al., 2014).

1.1.4 Economic impact, cost-benefit

Although there has been an increasing focus on understanding the clinical impact of infection with Cryptosporidium, the economic impact has been little studied, and hence there are no Cryptosporidium-specific studies of the economic value of interventions aimed at improving water supplies, although the need has been highlighted for both developed and developing countries (Bridge et al., 2010). In developing countries, there is a need to increase awareness of water contamination by Cryptosporidium as a public health issue, and even in developed countries, greater consideration needs to be given to small or community supplies or mains supplies where Cryptosporidium is not considered adequately in risk assessments (Bridge et al., 2010). In one study, Hunter and colleagues identified the median annual risk of infection with Cryptosporidium from very small supplies to be 25–28%, which is substantially greater than for public supplies and well above the level considered tolerable (Hunter et al., 2011). One difficulty is that the reported rate of illness is much lower, especially in adults, perhaps reflecting increased immunity in those who drink water from private supplies, but this does not acknowledge the risk to young children from these supplies. Hunter and colleagues also undertook a cost-benefit analysis of interventions for the prevention of acute diarrhoeal disease and post-infectious IBS, although not specifically for Cryptosporidium (Hunter et al., 2009). They concluded that the cost-benefit ratio was positive and increased when IBS was included. The impact of acute disease and the impact of chronic disease need to be taken into account, as well as the costs of investigating outbreaks. For example, the city of Milwaukee in the USA was affected by a Cryptosporidium outbreak in 1993, resulting from a drinking-water filter failure. The outbreak infected an estimated 403,000 people and led to the death of 69 people, at a total cost of US $96.2 million (Corso et al., 2003).
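
Annual infection risks of the kind quoted above (25–28% for very small supplies) are typically derived in quantitative microbial risk assessment by compounding a per-day infection probability over a year of exposures. The sketch below shows that standard conversion; the daily risk values are hypothetical illustrations, not the inputs used by Hunter et al. (2011).

```python
# Standard QMRA aggregation of a per-day infection risk into an annual risk,
# assuming independent daily exposures. Daily risk values below are hypothetical.
def annual_risk(daily_risk: float, days: int = 365) -> float:
    """P(at least one infection over 'days' exposures)."""
    return 1.0 - (1.0 - daily_risk) ** days

for p_day in (1e-4, 5e-4, 1e-3):
    print(f"daily risk {p_day:.0e} -> annual risk {annual_risk(p_day):.1%}")
```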

1.2 Taxonomic Classification of the Agent(s)

Traditionally, the genus Cryptosporidium was classified based on similarities in life cycle, structure, and ultrastructure of the parasite. This originally placed Cryptosporidium in a group known as the coccidia in the phylum Apicomplexa, class Sporozoasida, subclass Coccidiasina, order Eucoccidiorida, suborder Eimeriorina, family Cryptosporidiidae (Fayer and Ungar, 1986). The coccidia are obligate, intracellular parasites, which means they must live and reproduce within a host animal cell. Other human-infective apicomplexans include Toxoplasma gondii and Cyclospora cayetanensis. However, Cryptosporidium sat uneasily in this classification as many of its other features are not typical of the coccidia: for example, because Cryptosporidium oocysts “mature” within the host and are excreted sporulated, reinfection or autoinfection can occur; the location of the parasite in the host cell is different (intracellular but extracytoplasmic); and, perhaps most importantly, Cryptosporidium is resistant to treatment with anti-coccidial drugs (Barta and Thompson, 2006; Garcia-Aljaro et al., 2018).

In many aspects, Cryptosporidium is more like the group of organisms known as the gregarines in the class Gregarinomorphea, such as in the way it derives nutrients from the host through a feeder organelle (Barta and Thompson, 2006). The evidence for reclassification from morphological, biological and biochemical data has been further strengthened by whole genome comparisons (Ryan et al., 2016). The class Gregarinomorphea has now been revised to include a new subclass Cryptogregaria, containing a single, new order Cryptogregarida into which the family Cryptosporidiidae has now been placed (Cavalier-Smith, 2014; Ruggiero et al., 2015). Cryptosporidium is currently the only genus in the family, although analysis of isolates from fish provides some evidence for a new genus Piscicryptosporidium, but further genetic and biological characterisation is needed (Palenzuela et al., 2010).

The recently confirmed ability of Cryptosporidium to reproduce and multiply in culture free of animal host cells has so far been reported only under optimal laboratory conditions, and the implications for growth, transmission, public health, and water quality management are not yet known (Ryan et al., 2016).

Historically, many Cryptosporidium species were named based on the assumption that they were host-specific (Fayer et al., 2010). However, while there may be some degree of host adaptation for most species, strict host specificity is rare. At the time of writing, there are at least 37 species with formally described biological and genetic characteristics, of which 17 have been reported in humans worldwide (Ryan et al., 2014; Zahedi et al., 2017; Condlova et al., 2018; Kvac et al., 2018; Zahedi et al., 2018b). Not all species pose a human health threat (Table 1), and it is useful in source water testing to know which are present to inform risk assessments. Cryptosporidium hominis and Cryptosporidium parvum are the major human-pathogenic species of Cryptosporidium and have worldwide distribution (Xiao, 2010). The transmission of C. parvum in humans is mostly anthroponotic in developing countries, with zoonotic infections playing an important role in developed countries (Xiao, 2010). Large drinking water-associated cryptosporidiosis outbreaks have been recognized in Europe, Australia and North America (Efstratiou et al., 2017). In addition, C. hominis predominates in most developing countries, whereas C. parvum is more prevalent in the Middle East. Where C. parvum has been found in developing countries it is frequently an anthroponotic genotype (Sow et al., 2016), indicating that controlling and eliminating human contamination are important protection measures aimed at reducing the public health risk of cryptosporidiosis. The life cycle stage sought in routine testing is the oocyst, but oocysts do not define species because many of the oocyst sizes are similar (Table 1), and they cannot be differentiated from each other by antigen detection. Molecular methods detecting DNA sequence variations are therefore necessary to differentiate species. Variants detected by sequencing the small subunit ribosomal RNA (SSU rRNA) gene from isolates for which further biological data are lacking are referred to as genotypes, not species, and are often named after the host in which they were first detected (Fayer et al., 2010). The word genotype is also used to describe variants detected at other loci within defined species.

Table 1. Formally described Cryptosporidium species, their oocyst sizes, major hosts and reported human infections

Cryptosporidium species | Mean Oocyst Dimensions (µm) | Major Host(s) | Reported Human Infection
C. andersoni | 7.4 x 5.5 | Cattle | Yes, but rare
C. avium (previously avian genotype V) | 5.3 x 6.9 | Birds | No
C. apodemi | 4.2 x 4.0 | Rodents | No
C. baileyi | 6.2 x 4.6 | Poultry | No
C. bovis (previously bovine B genotype) | 4.9 x 4.6 | Cattle | Yes, but rare
C. canis (previously dog genotype) | 5.0 x 4.7 | Dog | Yes, occasional
C. cuniculus (previously rabbit genotype) | 5.6 x 5.4 | Rabbit, humans | Yes, occasional. One drinking water outbreak reported
C. ditrichi | 4.7 x 4.2 | Rodents | No
C. ducismarci | NR | Tortoises | No
C. erinacei | 4.9 x 4.4 | Hedgehog | Yes, but rare
C. fayeri (previously marsupial genotype I) | 4.9 x 4.3 | Marsupials | Yes, but rare
C. felis | 4.6 x 4.0 | Cat | Yes, occasional
C. fragile | 6.2 x 5.5 | Black spined toad | No
C. galli | 8.3 x 6.3 | Chicken | No
C. homai | Not observed | Guinea pig | No
C. hominis (previously C. parvum human genotype, genotype 1, and genotype H) | 4.9 x 5.2 | Humans | Yes, common. Outbreaks are reported frequently
C. huwi (previously piscine genotype 1) | 4.6 x 4.4 | Guppy | No
C. macropodum (previously marsupial genotype II) | 5.4 x 4.9 | Eastern grey kangaroo | No
C. meleagridis | 5.2 x 4.6 | Birds, mammals | Yes, rare to common, depending on the setting. One farm-related and one school-related outbreak reported
C. molnari | 4.7 x 4.5 | Sea bream | No
C. muris | 7.0 x 5.0 | Rodents | Yes, but only rarely
C. occultus | 5.20 x 4.94 | Rodents | No
C. parvum (previously bovine genotype, genotype II, and genotype B) | 5.0 x 4.5 | Humans, pre-weaned mammalian livestock | Yes, common. Outbreaks are reported frequently
C. proliferans | 7.7 x 5.3 | Rodents | No
C. ryanae (previously deer-like genotype) | 3.7 x 3.2 | Cattle | No
C. rubeyi | 4.7 x 4.3 | Ground squirrels | No
C. scrofarum (previously pig genotype II) | 5.2 x 4.8 | Pig | Yes, but rare
C. serpentis | 6.2 x 5.3 | Reptiles | No
C. suis (previously pig genotype I) | 4.6 x 4.2 | Pig | Yes, but rare
C. tyzzeri (previously mouse genotype I) | 4.6 x 4.2 | Mice | Yes, but rare
C. ubiquitum (previously cervine genotype, cervid genotype, W4 genotype, or genotype 3) | 5.0 x 4.7 | Various mammals | Yes, occasional
C. viatorum | 5.4 x 4.7 | Humans | Yes, occasional
C. varanii (syn. C. saurophilum) | 4.8 x 4.7 | Reptiles | No
C. wrairi | 5.4 x 4.6 | Guinea pig | No
C. xiaoi (previously C. bovis-like genotype or C. bovis from sheep or C. agni) | 3.9 x 3.4 | Sheep, goat | No

NR: Not Reported


Different types of molecular diagnostic tools have been used in the identification of Cryptosporidium spp. and have been reviewed in detail elsewhere (Ryan et al., 2014). These tools can be categorized into genotyping, subtyping, multilocus typing/population genetics, and comparative genomics, depending on the approaches and usages. These highly discriminatory molecular techniques allow intra-specific differentiation of C. parvum and C. hominis, which has led to the identification of many subtype families and many subtypes within each species. Subtyping of Cryptosporidium species, specifically C. hominis and C. parvum, can provide clarity on the mode of transmission in addition to being an important epidemiological tool, especially in outbreak situations (King et al., 2019).

1.3 Transmission

People become infected with Cryptosporidium either by direct contact with infected people (anthroponotic transmission) or animals (zoonotic transmission), or by ingestion of contaminated food or water (foodborne or waterborne transmission) (Fayer, 2004; Xiao and Feng, 2008; Chalmers, 2012; Ryan et al., 2014; Zahedi et al., 2016; Shrivastava et al., 2017; Pumipuntu and Piratae, 2018; Ryan et al., 2018). Studies in the United States and Europe have indicated that direct person-to-person or anthroponotic transmission of cryptosporidiosis is the more common route (Xiao, 2010). The role of zoonotic transmission in the acquisition of cryptosporidiosis in humans has been examined by a few case-control studies (Hunter et al., 2004; Lake et al., 2007). Contact with farm animals, especially cattle, has been identified as a major risk factor for sporadic cryptosporidiosis in industrialized countries (Hunter et al., 2003; Pollock et al., 2008; Brankston et al., 2018).

The infectious dose varies between isolates of the same species and between different species but is low for both C. parvum and C. hominis; there is a chance of infection from a single oocyst (Chappell et al., 1999; Benamrouz et al., 2012). In human volunteer studies, the median infectious dose for C. hominis ranged from 10 to 83 oocysts and for C. parvum from below 10 to over 1,000 oocysts (Okhuysen et al., 1999; Chappell et al., 2006). Human challenge studies have also established susceptibility to infection and illness from non-C. parvum species (Chappell et al., 2011). Volunteers who took part in the study received an oocyst dose of C. meleagridis that would likely be much higher than encountered in a community setting. Nevertheless, the diarrheal illness that they experienced was similar to the illness reported in naturally acquired C. meleagridis infection, which, in turn, was indistinguishable from illness caused by C. parvum or C. hominis.
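
Dose-response data of this kind are often summarised in risk assessments with a single-parameter exponential model, P(infection) = 1 − exp(−r × dose), where the median infective dose N50 = ln(2)/r. The sketch below applies that model; the N50 values are taken from the ranges quoted above and are illustrative rather than fitted to any specific isolate.

```python
import math

# Exponential dose-response model commonly used for Cryptosporidium in QMRA:
# P(infection) = 1 - exp(-r * dose), with r the per-oocyst infection probability.
def r_from_n50(n50: float) -> float:
    """Back-calculate r from a median infective dose (N50 = ln(2) / r)."""
    return math.log(2) / n50

def p_infection(dose: float, r: float) -> float:
    return 1.0 - math.exp(-r * dose)

# Illustrative N50 values spanning the ranges quoted above (10 to 83 oocysts for
# C. hominis; <10 to >1,000 for different C. parvum isolates).
for n50 in (10, 83, 1000):
    r = r_from_n50(n50)
    print(f"N50={n50:>4}: r={r:.4f}  P(1 oocyst)={p_infection(1, r):.3f}  "
          f"P(10 oocysts)={p_infection(10, r):.3f}")
```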

In addition, epidemiological studies have shown that the geographical distributions of Cryptosporidium spp. vary around the world. C. hominis is more prevalent in North and South America, Australia, and Africa, while C. parvum is localized more in Europe, especially in the UK (Caccio, 2005).

1.3.1 Routes of transmission

Cryptosporidium mainly infects the gastrointestinal tract, and the oocyst stage is shed in the feces. These thick-walled, hardy oocysts contain four infectious sporozoites which emerge (excyst) in the gut after oocysts are ingested. The sporozoites infect the cells lining the gut before transforming into life cycle stages enabling asexual and sexual reproduction, the latter resulting in the formation of oocysts containing infectious sporozoites. Thus, when oocysts are shed in feces they are already sporulated. Oocysts in feces, slurry, farm wastes, or sewage may contaminate food (e.g. raw milk, vegetables), water (e.g. surface waters, recreational waters) and fomites (e.g. dirty farm gates, boots, utensils) (Fayer and Ungar, 1986; Fayer et al., 2000). The hardy oocysts can survive drinking water treatment processes: they may not be removed by filtration if it has not been designed for their removal and they are not killed by regular chlorination (Betancourt and Rose, 2004). They can also survive wastewater treatment (Gennaccaro et al., 2003; Quintero-Betancourt et al., 2003; King et al., 2017).

Transmission of Cryptosporidium is either direct animal-to-person (especially when people have contact with livestock) or person-to-person (such as in households or day care settings) through feces, often from hand-to-mouth contact especially in young children, or indirect via a contaminated vehicle such as water, or food such as raw milk and leafy green vegetables (Fayer et al., 2000; Bourque and Vinetz, 2018; Ryan et al., 2018). In addition to the more recognised faecal-oral route, inhalation of oocysts in droplets from feces or aspiration of gastrointestinal contents during vomiting may cause respiratory cryptosporidiosis (Sponseller et al., 2014).

Of the two Cryptosporidium species that cause most human cases of cryptosporidiosis, C. hominis is transmitted between people (anthroponotic) whereas C. parvum is transmitted from livestock animals (zoonotic) or people (Xiao, 2010; Ryan et al., 2014). Although C. parvum is sometimes found in companion animals, they generally harbour other Cryptosporidium species (Table 1) and occasionally transmission is reported from cats (C. felis) and dogs (C. canis) (Lucio-Forster et al., 2010). Similarly, wildlife harbours a range of Cryptosporidium species and genotypes many of which have not been identified in human infections (Table 1) (Appelbee et al., 2005; Xiao and Fayer, 2008), although C. cuniculus (from rabbits) has caused a waterborne outbreak in the UK (Chalmers et al., 2009).

A global review of published waterborne outbreaks caused by parasitic protozoa between January 2011 and December 2016 reported that 63% were caused by Cryptosporidium (Baldursson and Karanis, 2011; Efstratiou et al., 2017). There was a reporting bias towards Australasia (Australia, New Zealand, and some neighbouring islands), North America and Europe where surveillance systems are established. Although most waterborne outbreaks are caused by C. parvum or C. hominis, it is likely that any human-infective Cryptosporidium species could be transmitted through drinking water if it becomes contaminated. For example, an outbreak in the UK in 2008 was caused by C. cuniculus when a rabbit entered a chlorine contact tank at a water treatment works (Puleston et al., 2014).

Historically, there has been an emphasis on Cryptosporidium as a waterborne pathogen, perhaps to the detriment of focus on other probably important routes of transmission, particularly food. Eighteen outbreaks have been linked to food (Robertson and Chalmers, 2013), including raw vegetables and salad items, herbs, raw meat, unpasteurised milk and dairy products, and apple juice. There is also a bias towards countries with enhanced surveillance and investigation of cases in the identification of foodborne outbreaks of cryptosporidiosis.

Food contamination may be from feces or from water used during food production, processing, or preparation. Because Cryptosporidium oocysts are hardy, they are able to survive some processing treatments, including the chlorine baths used for washing fresh produce and blast freezing (Duhain et al., 2012). A standard method for the detection of Cryptosporidium in food, covering leafy green vegetables and berry fruits, was only published in 2016 (ISO, 2016). Natural contamination of food has been identified using methods similar to this standard, essentially involving a three-stage process: (i) elution of oocysts from the surface, (ii) concentration and removal from debris by immunomagnetic separation, and (iii) detection of oocysts by immunofluorescence microscopy (IFA).

Sample surveys using methods based on this outline have reported Cryptosporidium oocysts in produce from commercial distributors in Norway, where 4% of lettuce and 9% of mung bean sprouts were positive for Cryptosporidium (Robertson and Gjerde, 2001). Other produce found positive for oocysts has included leafy greens (33% of Chinese cabbage, 75% of Lollo rosso lettuce and 78% of Romaine lettuce) from a grower in Spain where irrigation water at the site was contaminated with Cryptosporidium (Amoros et al., 2010), and locally grown produce from farmers' markets in Poland, where one leek sample, one celery sample, and four cabbage samples were positive (Rzezutka et al., 2010). Industry washing procedures may not remove contaminating oocysts, not only because they adhere to surfaces but also because they may be embedded in the stomatal openings of leaves (Macarisin et al., 2010).

1.3.2 Reservoirs

Cryptosporidium spp. infect a wide range of animal hosts of different classes, but the main reservoirs of human-infective Cryptosporidium spp. are mammals (Table 1). Although C. hominis and one of the anthroponotic subtypes of C. parvum (IIc) have been found in animals, this is not common and the reservoirs for human infection appear to be solely humans in developed countries. The main reservoirs of zoonotic C. parvum are young ruminants (cattle, sheep, goats) and the prevalence of infection is often high (Santin, 2013). Additionally, abundant epidemiological and genotyping data support the transmission of C. parvum from animals to humans (Feltus et al., 2006; Xiao, 2010; Ryan et al., 2014). Camelids can also be infected with C. parvum (Zahedi et al., 2018a). Many country-wide studies have shown the occurrence of C. parvum on most if not all farms, where many animals within the herd or flock become infected. Infection may not be symptomatic, and even healthy lambs have been shown to shed numbers of oocysts of the order of magnitude seen in symptomatic animals (Pritchard et al., 2007). There is huge potential for environmental contamination; it has been estimated that an infected calf might shed around 6 × 10^11 oocysts during an infection (Uga et al., 2000). Moreover, in disease-endemic areas clinical manifestations of cryptosporidiosis in healthy populations can be attributed to different species of Cryptosporidium (e.g., C. canis, C. felis, C. meleagridis) and subtype families of C. hominis (Cama et al., 2008).
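
To put the per-calf estimate above in context, the sketch below multiplies it across a hypothetical group of animals; the herd size and proportion infected are assumptions chosen only to illustrate the scale of potential environmental loading.

```python
# Rough environmental loading estimate. Only the per-infection total (~6 x 10^11
# oocysts per infected calf, Uga et al., 2000) comes from the text; the herd size
# and proportion infected below are hypothetical.
OOCYSTS_PER_INFECTED_CALF = 6e11

def herd_oocyst_load(n_calves: int, proportion_infected: float) -> float:
    return n_calves * proportion_infected * OOCYSTS_PER_INFECTED_CALF

print(f"{herd_oocyst_load(100, 0.5):.1e} oocysts shed by a 100-calf group, 50% infected")
```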

However, there are exceptions: in some settings, Cryptosporidium is rare. For example, a large-scale study of dairy calves in Tanzania failed to confirm the presence of Cryptosporidium by IFA and PCR (Chang'a et al., 2011). Interestingly, one study in Scotland, UK, not only found a high prevalence in calves but also in adult cattle when large volumes of feces were screened by PCR (Wells et al., 2015). It is therefore important that risk assessments are supported by local information. In addition, point prevalence and longitudinal studies have documented age-related patterns of Cryptosporidium species in dairy cattle, suggesting that parasite fitness may play an important role in transmission of cryptosporidiosis among cattle and in zoonotic infections (Santin et al., 2004; Fayer et al., 2006; Xiao et al., 2007). For instance, C. parvum has been found in high numbers in young, monogastric (pre-weaned) calves but in relatively few calves after weaning, when they convert to ruminal nutrition (Fayer et al., 2010). In contrast, the number of cattle found infected with C. bovis, C. ryanae, and C. andersoni increases immediately after weaning and then decreases steadily as cattle approach maturity.

1.3.3 Incubation period

The human incubation period between ingestion of oocysts and onset of symptoms is usually 3 to 14 days with 5 to 7 days most common, and is influenced by the number of oocysts ingested, host immune responses, and other host factors (Chappell et al., 1999). Oocysts are usually detectable in stools 1 to 3 days ahead of onset of symptoms (Chalmers and Davies, 2010).

1.3.4 Period of communicability

Infectious sporulated Cryptosporidium oocysts are excreted with feces, and therefore are immediately infective, allowing the spread of infection to other susceptible hosts (Bouzid et al., 2013).

The symptoms of acute cryptosporidiosis can be prolonged, usually lasting about two weeks but sometimes a month. The symptoms may relapse and remit in about one third or more of cases (MacKenzie et al., 1995; Hunter et al., 2004). Oocysts are usually detected in all stools during the diarrhoea phase but may not be detectable in all sequential stools during recuperation or asymptomatic infection (Chappell et al., 1999; Chalmers and Caccio, 2016). Even the sensitive immunofluorescent microscopy test has a reported threshold of detection of >10^3 oocysts per g of stool, with <5 × 10^3 oocysts per g of stool unlikely to be detected (Weber et al., 1991). Similar to young calves, symptomatic children shed high numbers of oocysts. In a study in a favela in Brazil, children with C. hominis shed significantly more oocysts per ml of stool (3.5 × 10^6 oocysts/ml; range 2.5 × 10^4 to 1.2 × 10^7 oocysts/ml) than children with C. parvum (1.7 × 10^6 oocysts/ml; range 8 × 10^2 to 1.4 × 10^7 oocysts/ml; P = 0.001) (Bushen et al., 2007).

In some patients, oocyst shedding has been reported to continue for weeks after recovery (Jokipii and Jokipii, 1986). Symptomatic patients pose more of an infection risk because diarrhoeic stools are less easy to contain, but even asymptomatic shedding may cause contamination, for example of swimming pools, and will contribute to oocysts in human waste and wastewater. Cryptosporidium oocysts are hardy and resistant to many environmental pressures; survival is largely dependent on moisture conditions, temperature, and exposure to UV light.

1.3.5 Population susceptibility

Cryptosporidium is recognized as a leading cause of childhood diarrhea around the world, as revealed by the large Global Enteric Multicenter Study (GEMS) (Mmbaga and Houpt, 2017). In addition to children younger than 2 years old, Cryptosporidium is particularly problematic in the immunocompromised (Mmbaga and Houpt, 2017). Studies have shown that healthy adults are susceptible to infection with small numbers of Cryptosporidium oocysts, resulting in self-limited infection (Okhuysen et al., 1998). Animals and humans with intact immune systems are typically capable of clearing the parasite within 1 to 3 weeks after infection (Chappell et al., 1999). In developing countries, cryptosporidiosis is most prevalent during early childhood, with as many as 45% of children experiencing the disease before the age of 2 years (Valentiner-Branth et al., 2003). Three main epidemiological scenarios for Cryptosporidium have therefore been described: (i) sporadic, often water-related, outbreaks of self-limiting diarrhea in otherwise healthy persons; (ii) chronic, life-threatening illness in immunocompromised patients, most notably those with HIV/AIDS; and (iii) diarrhea and malnutrition in young children in developing countries (Mor and Tzipori, 2008). The highest burden of Cryptosporidium infection is in developing countries, and recent studies concluded that the most frequently reported risk factors in these settings include overcrowding, household diarrhoea, poor drinking water quality, animal contact, open defecation/lack of a toilet and breastfeeding (Bouzid et al., 2018).

1.4 Population and Individual Control Measures

1.4.1 Hygiene measures

Three main categories of infection prevention and control measures to prevent the spread of infectious diseases have been identified: (i) those that decrease host susceptibility, (ii) those that increase host resistance, and (iii) those that decrease host exposure to infectious agents. The most effective and often the most practical approach is decreasing exposure. If a host does not encounter a particular pathogen, or if exposure can be limited to a level below the infectious dose, then disease simply cannot occur (Anderson, 2015).

Cryptosporidium is highly contagious and can be found in water, food, soil, hands and materials or surfaces that have been contaminated with the excreted faecal matter of humans and animals infected with the parasite. Practicing good hand hygiene is an effective method of preventing the spread of cryptosporidiosis, since transmission of the disease typically occurs through the faecal-oral route. Hand washing with soap and hot water before preparing and eating food, after defecation or changing diapers, after cleaning up after others with diarrhoea, and after contact with domestic or farm animals is the simplest, most basic sanitation intervention for prevention of the disease. Bleach solutions and hand sanitizers that contain ethanol or isopropanol as the active microbicide are not effective against Cryptosporidium (Barbee et al., 1999; McDonnell and Russell, 1999).

During food preparation and consumption, hand hygiene, food safety and hand washing techniques are key factors in preventing the spread of cryptosporidiosis and other enteric diseases. Washing or peeling raw fruit and vegetables thoroughly before eating and avoiding unpasteurised milk and fruit juices constitute important measures in the prevention of cryptosporidiosis. Food handling employees experiencing diarrhoea or other gastrointestinal symptoms should avoid working in food services, schools, and other group settings.

In treated recreational water venues, such as swimming pools, developing and applying robust administrative and engineering controls are imperative to reducing the likelihood of a Cryptosporidium outbreak. Concentrations of chlorine currently recommended for pool water (1-5 mg/L) do not inactivate Cryptosporidium oocysts in a timeframe that prevents ingestion exposures (Murphy et al., 2014; Suppes et al., 2016). It is highly recommended (Suppes et al., 2016) that pool facilities develop or adopt policies excluding lifeguards and swimmers experiencing diarrhea from entering pool water. In addition, pools with high use by children should develop or adopt protocols that improve faecal release identification techniques. Swimmers can be exposed to Cryptosporidium oocysts before contaminated water is filtered and disinfected if staff are unaware of a faecal release. Therefore, pool facilities should train staff to recognize faecal releases and educate patrons on the importance of reporting releases.

Recreational access to drinking water catchments is a serious public health risk, and therefore policies limiting activities to the outer catchment should be supported to avoid contamination of drinking water supplies (Loganathan et al., 2012). In addition, agricultural communities active in cattle ranching, including beef cow-calf operations where animals tend to be more confined, represent a serious public health concern. Cryptosporidiosis is a significant manure-related disease; therefore, the potential for contamination of water and food supplies by commercially relevant manure is large. Runoff from agricultural land that has been treated with manure has the potential to contaminate local surface water and wells that supply water for human consumption. The resource management of the agricultural waste stream is important to reduce environmental effects, enhance agricultural yields, and create viable best management plans. Best management practices to alleviate both nutrient and microbial contamination must be followed to reduce public health effects. For instance, residuals from agricultural activities must be processed adequately to ensure the protection of the animals producing the manure, the public that may be exposed to these wastes, and the environment.

Although the impact of climate on the transmission of diarrheal pathogens remains uncertain, several studies have shown that high temperatures and rainfall increase the risk of waterborne enteric diseases (Onozuka et al., 2010; Guzman Herrador et al., 2015; Levy et al., 2016; Ghazani et al., 2018). For instance, studies have demonstrated that contamination of drinking water represented the greatest health risk and that interventions designed to increase drinking water treatment were therefore necessary following periods of heavy rainfall (Carlton et al., 2014). Studies have also shown that the risk of infectious diseases after weather- or flood-related natural disasters, although specific to the event itself, is highly dependent on a number of factors. These factors include the endemicity of specific pathogens in the affected region before the disaster, the type of disaster itself, the impact of the disaster on water and sanitation systems, and the functionality of the surviving public health infrastructure (Ivers and Ryan, 2006).

1.4.2 Drug therapy

Cryptosporidiosis is self-limiting in immunocompetent patients and treatment is not normally required other than measures to prevent dehydration, such as oral rehydration solution. If symptoms persist, treatment modalities are limited; only nitazoxanide is licensed by the USFDA for use in immunocompetent patients >1 year of age (Fox and Saravolatz, 2005). Patients at high risk of severe cryptosporidiosis include those with HIV infection, leukemia, and lymphoma (particularly children), or those with primary T-cell immune deficiency (Hunter and Nichols, 2002). Usually only improvement of the underlying immune condition results in significant clinical improvement; the primary treatment for immunosuppressed patients is therefore reduction or tapering of the immunosuppression where possible. In patients with HIV infection, highly active antiretroviral therapy (HAART) is the treatment of choice: not only are CD4 cell levels improved, restoring a degree of immunity, but protease inhibitors also reduce host-cell invasion and parasite development (Pozio, 2004). Asymptomatic carriage occurs in both immunocompetent and immunocompromised patients but treatment is not required. More recent studies have shown that the pyrazolopyridine KDU731 might be a promising anti-cryptosporidial drug candidate with activity against both C. parvum and C. hominis. Unlike nitazoxanide, KDU731 demonstrated in vivo efficacy in immunocompromised mice. Additionally, treatment of neonatal calves, which closely match the pathophysiological and pharmacological challenges faced in the treatment of young malnourished children, led to a significant decrease in parasite shedding and rapid resolution of diarrhoea and dehydration (Manjunatha et al., 2017).

2.0 Environmental Occurrence and Persistence

There are numerous approaches for the detection of Cryptosporidium. Comparison of studies and understanding their significance to human health risk is often challenging due to the variety of methods used. Frequently, microscopy-based methods are used, which are unable to discriminate human-pathogenic from animal-associated Cryptosporidium oocysts, or to determine viability or infectivity (USEPA, 2005; USEPA, 2012). However, microscopy methods are suitable for quantifying reductions in Cryptosporidium concentrations. Recently developed methods integrating cell culture, microscopy and PCR are capable of detecting and quantifying low levels of total and infectious human-pathogenic Cryptosporidium oocysts (Lalancette et al., 2010; Lalancette et al., 2012; King et al., 2015), but few field studies have been performed to date. Therefore, the strengths and limitations of the methodologies used in occurrence and persistence studies must be taken into consideration.

2.1 Detection Methods

Standard methods for the detection and enumeration of Cryptosporidium oocysts in water and environmental samples are described in ISO 15553 (ISO, 2006). These are similar to those described by the United States Environmental Protection Agency (USEPA) in Methods 1623 and 1623.1 (USEPA, 2005; USEPA, 2012) and by the Environment Agency in the UK (UK Environment Agency, 2010). These methods are all based on the following key steps: (i) filtration, usually of water volumes between 10 and 1,000 L; (ii) elution from the filter and concentration by centrifugation; (iii) immunomagnetic separation (IMS) using paramagnetic beads coated with specific antibodies to capture oocysts and facilitate separation from the sample matrix; (iv) staining with an immunofluorescent antibody; and (v) detection and enumeration by epifluorescence microscopy. Studies have shown that the physicochemical and organic properties of the sample, such as turbidity, pH, and the presence of interfering particles and inorganic compounds, influence the recovery efficiency of oocysts (Quintero-Betancourt et al., 2002; Hallier-Soulier and Guillot, 2003). Since the introduction of Method 1623 in 1996, a number of revisions have been proposed to improve the efficiency of oocyst recovery (Rochelle and Di Giovanni, 2014). Detection by microscopy is limited to identification of the genus, as the morphology is not sufficiently different to identify species. In contrast, PCR methods for the detection of total and viable human-pathogenic Cryptosporidium, such as propidium monoazide PCR, are available (Brescia et al., 2009), but even quantitative PCR methods cannot accurately quantify low concentrations of oocysts (Staggs et al., 2013). DNA can be extracted from material captured by IMS or from material scraped from slides after microscopy and tested by molecular methods (LeChevallier et al., 2003; Nichols et al., 2006). The benchmark method is polymerase chain reaction (PCR) and sequencing of the SSU rRNA gene (Xiao et al., 2001). A variety of other PCR assays have been described for the detection of certain species, including conventional PCR, real-time PCR (qPCR) and multiplex PCR (mPCR) (Xiao et al., 2004; Staggs et al., 2013; Li et al., 2015). These methods usually have high sensitivity and specificity, but their performance on environmental samples is influenced by the presence of inhibitors that can suppress or reduce amplification efficiencies; thus, strategies to remove or minimize PCR inhibitors during nucleic acid preparation steps are essential to ensure optimal method performance (Pavli et al., 2016).
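
Counts obtained with these filtration/IMS/microscopy methods are usually corrected for the recovery efficiency measured with spiked controls and expressed per unit volume before being used in exposure or risk calculations. A minimal sketch of that calculation follows; the count, volume and recovery values are hypothetical.

```python
# Converting a raw microscopy count from a Method 1623-style analysis into a
# recovery-corrected concentration. All values are hypothetical illustrations.
def oocysts_per_litre(count: int, volume_filtered_l: float, recovery_fraction: float) -> float:
    """Recovery-corrected concentration in oocysts/L."""
    if not 0.0 < recovery_fraction <= 1.0:
        raise ValueError("recovery_fraction must be in (0, 1]")
    return count / (volume_filtered_l * recovery_fraction)

# e.g. 3 oocysts counted after filtering 10 L, with 40% recovery in the matrix spike
print(f"{oocysts_per_litre(3, 10.0, 0.40):.2f} oocysts/L")
```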

Most studies on the occurrence and removal of Cryptosporidium have utilized microscopy and PCR assays, which do not determine viability or infectivity. Cryptosporidium infectivity assays have typically been used to assess the efficacy of disinfection processes, but they can also be used to inform risk assessment models by determining the infectivity of oocysts in environmental water samples (Slifko et al., 2002). The infectivity of oocysts recovered by Method 1623 can be tested in cell culture assays by modifying the oocyst-magnetic bead disassociation step at the end of the IMS procedure and omitting microscopic examination of oocysts (LeChevallier et al., 2003). A variety of methods have been developed for detecting infection in cell culture, and assays have been used to detect infectious Cryptosporidium oocysts in raw wastewater, disinfected reclaimed effluent, raw source water, treatment plant filter backwash water, and finished drinking water (Di Giovanni et al., 1999; Gennaccaro et al., 2003; Quintero-Betancourt et al., 2003; Aboytes et al., 2004; Johnson et al., 2012).

2.2 Data on Occurrence

This section briefly summarizes the global occurrence of Cryptosporidium oocysts in urban sewage, surface water, ground water, drinking water, irrigation waters, and seawater, including shellfish harvesting waters. Data on the environmental persistence of Cryptosporidium are also presented, as persistence is a major feature of this parasite, allowing survival for months in water, soil, vegetables, and molluscs, which can then become major vehicles of human infection.

2.2.1 Raw sewage and sludge

Endemic levels of cryptosporidiosis in most populations lead to the frequent worldwide occurrence of Cryptosporidium oocysts in raw sewage. Table 2 provides the most current comprehensive data on the global occurrence of Cryptosporidium oocysts in raw sewage. In general, the occurrence of Cryptosporidium oocysts in urban sewage is largely associated with the incidence of cryptosporidiosis in the population and the level of sanitation services available. Prevalence rates based on oocyst excretion vary from approximately 1 to 3% in industrialized countries up to 10% or higher in developing countries. The higher prevalence in developing countries is likely associated with lack of clean water, poor sanitation, crowded housing conditions, and closer contact with domestic animals (Dixon, 2015). The concentrations of Cryptosporidium in sewage and the corresponding removal by wastewater treatment processes have been reviewed (Nasser, 2016; Hamilton et al., 2018). Nasser presented 25 studies from 13 countries spanning five continents, indicating that oocyst prevalence ranged from 0% (all samples below the detection limit) to 100% (Cryptosporidium detected in all samples). The average prevalence of Cryptosporidium oocysts in raw sewage was 61%, with concentrations ranging from 0 (below the detection limit) to a maximum of 60,000 oocysts/L. The highest concentration of oocysts found in sewage corresponded to a study conducted in Brazil (Neto et al., 2006). The dominant values for the concentration of Cryptosporidium in raw sewage ranged from 10 to 200 oocysts/L. More recent studies have reported maximum concentrations of 21,335 oocysts/L (King et al., 2017). Hamilton et al. (2018) reported frequent detection of Cryptosporidium in sewage and treated effluents in different countries. In untreated sewage, across all studies reporting the number of positive samples, 763 of 1,801 (42%) were positive for Cryptosporidium (Hamilton et al., 2018). These data indicate that Cryptosporidium can be found in raw sewage year-round, although individual studies have reported different seasonal peaks in occurrence.

Table 2. Frequency of detection and concentration of Cryptosporidium in untreated sewage

Country | Sample Type | Percent Positive Samples | Concentration: Average (Range), Oocysts/L | Reference
Australia | Sewage | NR | (8 to 25,675) | King et al., 2017
Brazil | Raw sewage | 44.4% | 135 (0 to 180) | Santos and Daniel, 2017
Brazil | Raw sewage | 6.4% | 6.0E+04 | Neto et al., 2006
Bulgaria | Sewage | 100% | 12 to 480 | Karanis et al., 2006
China | Sewage | 100% | 142.31 (40 to 420) | Xiao et al., 2018a
Germany | Sewage | 31.1% | 0 to 1,745 | Gallas-Lindemann et al., 2013
Hungary | Sewage | 20% | 13 | Plutzer et al., 2008
Italy | Sewage | 100% | 4.5 ±0.8 (3.8 to 5.4) | Carraro et al., 2000
Norway | Sewage | 29% | 100 to 2.4E+04 | Robertson et al., 2006
South Africa | Sewage | 77.4% | 3 (0 to 50) | Kfir et al., 1995
Spain | Sewage | 58.0% | 18.8 (1 to 80) | Castro-Hermida et al., 2010
Spain | Sewage/influent | 92.4% | 96 (2 to 1041) | Ramo et al., 2017
United Kingdom | Sewage | NR | <1 to 6E+03 | Robertson et al., 2000
United States | Sewage | 78% | <5.6 to 2.63E+04 | Gennaccaro et al., 2003

NR: Not Reported


Studies have also indicated that up to 1.6 × 10^10 oocysts may be discharged from a treatment plant daily, of which as many as 5 × 10^9 may be considered viable (Whitmore and Robertson, 1995). To prevent the environmental transmission of Cryptosporidium from sewage, treatment needs to be applied to ensure the removal/inactivation of oocysts in effluent before discharge. The four core stages of treatment include: preliminary treatment, primary treatment, secondary treatment, and tertiary or advanced treatment. The main objective of conventional wastewater treatment is to reduce nutrient and contaminant loads to the environment, thereby maintaining the health of aquatic ecosystems. A section describing the removal efficiency of Cryptosporidium oocysts by wastewater treatment processes is included below in this chapter. In water reuse applications, conventional treatment can be supplemented with additional processes to achieve a quality that is consistent with the intended use.
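
Treatment performance for oocysts is commonly reported as a log10 reduction value (LRV), and the load discharged to the environment is the effluent concentration multiplied by the plant flow. The sketch below shows both calculations; the influent and effluent concentrations and the flow are hypothetical, not taken from the studies cited.

```python
import math

# Log10 reduction value (LRV) across treatment, and the resulting daily discharged
# oocyst load. All numbers below are hypothetical illustrations of the arithmetic.
def log_removal(influent_per_l: float, effluent_per_l: float) -> float:
    return math.log10(influent_per_l / effluent_per_l)

def daily_discharged_load(effluent_per_l: float, flow_m3_per_day: float) -> float:
    return effluent_per_l * flow_m3_per_day * 1_000  # 1 m^3 = 1,000 L

influent, effluent, flow = 100.0, 1.0, 50_000.0  # oocysts/L, oocysts/L, m^3/day
print(f"LRV = {log_removal(influent, effluent):.1f} log10")
print(f"Discharged load = {daily_discharged_load(effluent, flow):.1e} oocysts/day")
```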

Sludge is a by-product produced during the sewage treatment process that when treated becomes biosolids that can be used as soil conditioner in agriculture. The occurrence of oocysts has been studied in sludge from wastewater treatment plants before and after different stabilization treatments. A study conducted in Spain found Cryptosporidium in 26 of the 30 raw sludge samples (86.6%) at a concentration ranging from 1 to 498 oocysts/g (Amorós et al., 2016). Another study conducted in Ireland indicated that all sewage sludge samples were positive for C. parvum and C. hominis, with maximum concentrations of 20 oocysts/g in primary sludge indicating the need for further sludge sanitization treatments (Cheng et al., 2009).

Another study was conducted to quantitatively determine and compare the concentrations of viable C. parvum and C. hominis oocysts in sewage sludge during the activated sludge secondary treatment process and in the corresponding sewage sludge end products (Graczyk et al., 2007). The highest concentration of Cryptosporidium was 650 oocysts/L in the activated sewage sludge, and the lowest was 43 oocysts/L. The concentration of these two human pathogens in dewatered and biologically stabilized sewage sludge cake was on average 88% lower than in the same wastewater-derived sludge during its activation process. In another study, the levels of and relationships between Cryptosporidium and other pathogens were investigated in biosolids from nine wastewater treatment plants throughout the United States. Cryptosporidium oocysts were detected sporadically (38% of samples positive), at concentrations ranging from 0 to 1.9 × 10^3 oocysts per g dry weight (Rhodes et al., 2015).

2.2.2 Surface water - including recreational water

The estimated global mean prevalence of Cryptosporidium in untreated surface waters is about 46%. The concentration of Cryptosporidium in surface waters is a determinant of the probability of exposure and the risk of disease. Total global Cryptosporidium emissions to surface water for 2010 have been estimated at 1.6 × 10^17 oocysts per year, with hotspots in the most urbanised parts of the world (Hofstra and Vermeulen, 2016). However, surface water concentrations are expected to change with population growth, urbanisation and changes in sanitation (Table 3). A much higher total global oocyst load to land from animal manure, 3.2 × 10^23 oocysts/year, has recently been reported (Vermeulen et al., 2017). Due to highly variable oocyst occurrence, variations in method recoveries, and the effects of different water matrices on method detection limits, the actual occurrence of human-pathogenic Cryptosporidium oocysts in source waters is still unclear. It is also important to note that oocysts in source waters may be derived from multiple sources, including humans, livestock and wildlife. Major sources of protozoan contamination of surface water include the intrusion of animal feces or wastewater due to heavy rains. Manure management practices are more important in terms of watershed contamination than infection rates determined in farm animals (Budu-Amoako et al., 2012). Since many of the Cryptosporidium species and genotypes associated with animals are not zoonotic, human health implications are difficult to estimate for studies utilizing IFA for the detection of Cryptosporidium oocysts in surface waters. Moreover, IFA does not provide information on oocyst viability.
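
Emission estimates of this kind are built up from population size, per-capita oocyst excretion and the fraction of excreted oocysts that reaches surface water. The sketch below shows the basic structure of such a calculation; it is not the Hofstra and Vermeulen (2016) model, and every input value is hypothetical.

```python
# Skeleton of a catchment-scale human oocyst emission estimate (illustrative only;
# not the published model, and all inputs are hypothetical).
def human_emission_per_year(population: int,
                            infections_per_person_year: float,
                            oocysts_per_infection: float,
                            fraction_reaching_surface_water: float) -> float:
    """Oocysts emitted to surface water per year from the human population."""
    return (population * infections_per_person_year *
            oocysts_per_infection * fraction_reaching_surface_water)

print(f"{human_emission_per_year(1_000_000, 0.02, 1e9, 0.1):.1e} oocysts/year")
```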

Table 3. Occurrence of Cryptosporidium spp. in surface water and recreational waters

Country | Sample Type | Percent Positive Samples | Concentration Average (Range), Oocysts/L | Reference
Australia | Surface source water (recreational/non-recreational) | 8.3 to 52.4 | (0.2 to 80) | Loganthan et al., 2012
Belgium | Sub-catchments | 92 | 17 (0.01 to 17.7) | Burnet et al., 2014
Belgium | River | 95 | 5.2 (0.05 to 31.3) | Burnet et al., 2014
Belgium | Reservoir | 67 | 0.4 (0.01 to 0.34) | Burnet et al., 2014
Belgium | DWTP inlet | 68 | 0.04 (0.01 to 0.34) | Burnet et al., 2014
Brazil | Surface source water | 9.22 | 0.2 (0.3 to 6) | Sato et al., 2013
Bulgaria | River water | 92 | (4 to 92) | Karanis et al., 2006
Canada | River water | 4.5 | 0.05 | Wallis et al., 1996
Canada | River water | 93 | (0.05 to 14.6) | Budu-Amoako et al., 2012
Canada | Three watersheds | 76 | NR | Ruecker et al., 2013
China | Surface source water | 32 | (0.18 to 2.2) | Feng et al., 2011
China | Recreational lakes | 82.7 | (0.365 to 1.5) | Xiao et al., 2018b
Finland | Rivers and lakes | 7.4 | NR | Rimhanen-Finne et al., 2002
France | Recreational surface waters | 26 | (0.1 to 0.5) | Coupe et al., 2006
Germany | Surface source water | 9.5 | 240 | Gallas-Lindemann et al., 2013
Hungary | Rivers | 60 | (0.1 to 0.4) | Plutzer et al., 2008
Hungary | Reservoirs | 10 | 0.5 | Plutzer et al., 2008
Hungary | Canals | 71 | (0.1 to 0.7) | Plutzer et al., 2008
Hungary | Recreational sites | 100 | (0.4 to 7) | Plutzer et al., 2008
Iran | Surface source water | NRᵃ | 0.005 (0.020 to 0.080) | Hadi et al., 2016
Italy | Lagoon for shellfish farming | 36 | (3.60 to 5.8345E+04) | Giangaspero et al., 2009
Italy | Surface water: river upstream | 100 | 0.21 (0.1 to 0.33) | Carraro et al., 2000
Italy | Surface water: downstream of effluent discharge | 100 | 0.17 (0.08 to 0.42) | Carraro et al., 2000
Nepal | River water | NAᵇ | 140 | Haramoto et al., 2011
Netherlands | Canals and recreational lakes (recreational) | 25 | (0.01 to 1.2) | Schets et al., 2008
Netherlands | Canals and recreational lakes (non-recreational) | 56 | (0.04 to 2.9) | Schets et al., 2008
Peru | River water (location 1) | 30 | 5.25 (15 to 37.5) | Bautista et al., 2018
Peru | River water (location 2) | 50 | 194.6 (176 to 348) | Bautista et al., 2018
Portugal | River water | 47 | NR | Lobo et al., 2009
Portugal | River water | 15 | NR | Alves et al., 2006
Portugal | River water | 82 | (0.1 to 5.3) | Julio et al., 2012
Romania | River water | 7.5 | (0.17 to 48) | Imre et al., 2017
Russia | River water | 100 | (3 to 113) | Karanis et al., 2006
South Africa | Surface source water | 74.5 | (0.6 to 25) | Kfir et al., 1995
Spain | Surface source water | 16.6 | 1 (1 to 1) | Castro-Hermida et al., 2011
Spain | Surface water source/recreational | 40.4 | 1.8 (1 to 13) | Castro-Hermida et al., 2010
Spain | Surface source water | 35 | 6.7 (0.3 to 16) | Ramo et al., 2017
United States | DWTP inlet | 7 | 0.067 | Messner and Wolpert, 2003
United States | River watershed (rural and urban) | 52 (rural); 37 (urban) | (3 to 5.99E+03) | Dreelin et al., 2014
United States | Surface source water | 25 | 0.74±0.27 | King et al., 2016
Venezuela | DWTP inlet | 75 | 0.15 (0.1 to 10) | Betancourt and Mena, 2012
Vietnam | River | 41.6 | 1,500 (100 to 8.4E+03) | Nguyen et al., 2016
Vietnam | Water canals | 39.1 | 200 (100 to 2E+03) | Nguyen et al., 2016

ᵃNR: Not reported; ᵇNA: Not available


In the first survey of oocyst occurrence in surface waters, samples that were impacted by domestic and agricultural waste had Cryptosporidium concentrations as high as 5,800 oocysts per liter (Madore et al., 1987). A large survey of North America spanning 1988 to 1993 reported that 209 (60%) of 347 surface water samples were positive for Cryptosporidium oocysts (LeChevallier and Norton, 1995), while a study conducted between 1991 and 1995 in Canada found oocysts in only 53 (4.5%) of 1,173 surface water samples (Wallis et al., 1996). A study of six US watersheds reported 60 of 593 samples (10.1%) positive for oocysts by Method 1623 and 22 of 560 samples (3.9%) positive for infectious oocysts by cell culture PCR (CC-PCR) (LeChevallier et al., 2003). A study of Portuguese river beaches reported oocysts in 82% of samples, with concentrations up to 5.3 oocysts/L (Julio et al., 2012). In addition, a study of raw water samples in Canada using USEPA Method 1623 (approximately 20 L grab samples) reported that >45% of 1,296 water samples contained Cryptosporidium oocysts (Ruecker et al., 2013).

Regulatory monitoring programs typically demonstrate lower prevalence than most research studies. The first regulatory monitoring of Cryptosporidium in US waters was conducted under the USEPA's Information Collection Rule (ICR) (Obolensky and Hotaling, 2013; Ongerth, 2013). This survey of 5,838 untreated source waters throughout the US reported an average occurrence of 6.8% with a mean concentration of 0.067 oocysts/L (Messner and Wolpert, 2003). In follow-up supplementary surveys of fewer facilities, 14% of samples were positive, with an average concentration of 0.053 oocysts/L. Starting in 2006, monitoring for Cryptosporidium was mandated in the US by the Long Term 2 Enhanced Surface Water Treatment Rule (LT2) (USEPA, 2000; Obolensky and Hotaling, 2013; Ongerth, 2013). Drinking water facilities serving over 10,000 people were required to monitor for oocysts in at least 24 consecutive monthly samples of raw water using USEPA Method 1623. Of almost 40,000 samples (mostly 10 L raw water grab samples) collected under the first round of monitoring at 1,670 sampling sites from 1,376 facilities across the US, seven percent contained at least one Cryptosporidium oocyst (Obolensky and Hotaling, 2013; Ongerth, 2013). Over half of the facilities (51%) reported no positive samples at all throughout the entire monitoring period. In oocyst-positive samples (1,720 of 2,895), concentrations ranged from 0.1 oocysts/L to a high of 16 oocysts/L. A total of 57 samples (0.14%) had concentrations greater than one oocyst/L. The locations with the highest prevalence of oocysts were on the Mississippi River downstream of the Ohio River and Missouri River confluences. The first round of LT2 monitoring data indicated that at least 80 treatment plants, representing 7% of large systems in the US, were required to implement additional treatment to increase oocyst removal or inactivation. Small-system estimates were unreliable because the data represented only 3% of applicable systems. The data also indicated that Cryptosporidium occurrence was higher in flowing stream-type surface water sources than in reservoir/lake-type surface water sources or groundwater under the direct influence of surface water (GWUDI) supplies (Obolensky and Hotaling, 2013).

Although much of the early research and regulatory focus was on drinking water, Cryptosporidium has become the leading cause of recreational water outbreaks in the US (Hlavsa et al., 2015). The spatial and temporal proximity of contaminating and "recipient" individuals in recreational water, particularly swimming pools, means that outbreaks linked to recreational water probably result from recent contamination, so oocyst survival in recreational water is not a primary issue. In the 30 years spanning 1971 to 2000, Cryptosporidium caused 15% of outbreaks associated with recreational water in the United States (Craun et al., 2005). Between 1988 and 2008, 136 documented outbreaks of cryptosporidiosis linked to recreational use of water in nine countries sickened a total of 19,271 people (Beach, 2008). The average number of people infected in these outbreaks was 142, but there were over 5,000 infections in one outbreak that occurred in 1995 in a water park in Georgia. There were 5,697 cases of illness in a statewide outbreak linked to swimming pools in Utah in 2007 (Hlavsa et al., 2015), leading to restrictions on the use of public pools by young children. Of the UK outbreaks linked to recreational water in the period 1983 to 2005, the number of cases ranged from 3 to 152, and the sources of infection were public and private swimming pools, rivers, interactive water features (e.g., splash zones), and water fountains (Beach, 2008). During a one-year monitoring period of seven swimming pools in the Netherlands, 4.6% of samples contained Cryptosporidium oocysts. Additional information on outbreaks associated with recreational water has been reviewed elsewhere (Beach, 2008; Chalmers, 2012).

2.2.3 Ground water

Surveys of occurrence of Cryptosporidium in numerous countries indicate that oocysts occur in some groundwater systems. Previous and more recent studies have evaluated the potential for the transport of Cryptosporidium through soil to land drains and groundwater systems using simulated rainfall and intact soil columns, to which raw slurry or separated liquid slurry was applied (Mawdsley et al., 1996; Petersen et al., 2012). These studies indicated that leaching of oocysts down through the soil profile did occur, although the extent was affected by soil type. For instance, transport of C. parvum oocysts through soil and into leachate was greater in a silty loam and a clay loam soil than in a loamy sand soil.

In a study conducted in the United States, Cryptosporidium oocysts were found in 11% (21/199) of the sites examined, including 5% (7/149) of vertical wells, 20% (7/35) of springs, 50% (2/4) of infiltration galleries, and 45% (5/11) of horizontal wells (Hancock et al., 1998). Densities ranged from 0.2 to 528 oocysts/100 L, with a mean of 19 oocysts/100 L. Another study, conducted in the Lower Rhine area of Germany, reported that 5 of 66 (7.5%) groundwater samples tested positive for Cryptosporidium oocysts by IFA, with a mean of 2.4 oocysts/100 L (Gallas-Lindemann et al., 2013). Additional studies on the occurrence of Cryptosporidium spp. in groundwater systems from different countries are listed in Table 4.

Table 4. Occurrence of Cryptosporidium oocysts in groundwater systems

Country | Sample Type | Percent Positive Samples | Concentration Average (Range), Oocysts/L | Reference
Bulgaria | Groundwater well | 67 | 0.02 | Karanis et al., 2006
Canada | Groundwater | 15 | (0.1 to 7.2) | Budu-Amoako et al., 2012
Germany | Radial and vertical wells | 7.57 | 0.24 (0.004 to 0.066) | Gallas-Lindemann et al., 2013
India | Deep wells | 14 | 0.7 to 6 | Daniels et al., 2018
India | Shallow wells | 5 | 5 to 6 | Daniels et al., 2018
Nepal | Groundwater wells | 11 | 2.2 (0 to 2.2) | Haramoto et al., 2011
Norway | Groundwater wells | 15 | 0.1 | Gaut et al., 2008
Portugal | Treated groundwater for consumption | 59 | NR | Lobo et al., 2009
United States | Vertical wells; springs; infiltration galleries; horizontal wells | 5; 20; 50; 45 | (0.002 to 0.45) | Hancock et al., 1998
United States | Shallow wells | 23 | NR | Borchardt et al., 2017

NR: Not reported


In the United States, drinking-water systems that obtain water from a raw water supply that is surface water, or groundwater under the direct influence of surface water, must have a treatment process capable of producing water of equal or better quality than would be provided by a combination of well-operated, chemically assisted filtration and disinfection processes. This treatment process must achieve an overall performance of at least 2 log10 (99%) removal or inactivation of Cryptosporidium oocysts.
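The arithmetic linking log10 credits to percent removal is straightforward; the following minimal Python sketch (illustrative only, not part of any regulation) converts between the two forms, e.g., 2 log10 corresponds to 99% and 4 log10 to 99.99%.

```python
import math

def log10_removal_to_percent(log_removal: float) -> float:
    """Convert a log10 removal/inactivation credit to percent removal (2.0 -> 99.0)."""
    return (1.0 - 10.0 ** (-log_removal)) * 100.0

def percent_to_log10_removal(percent: float) -> float:
    """Convert percent removal to a log10 credit (99.99 -> 4.0)."""
    return -math.log10(1.0 - percent / 100.0)

if __name__ == "__main__":
    print(log10_removal_to_percent(2.0))    # 99.0
    print(percent_to_log10_removal(99.99))  # ~4.0
```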

In 2015, it was estimated that 663 million people worldwide still used unimproved drinking water sources, including unprotected wells and springs (http://washdata.org/). A recent study examined the contribution of Cryptosporidium in groundwater sources used for drinking to the total burden of diarrheal disease among children <5 years old in rural India (Daniels et al., 2018). Cryptosporidium oocysts were detected in 14% of deep tubewells (n = 110) at maximum concentrations ranging from 13 to 110 oocysts/20 L, and in 5% of shallow tubewells (n = 96) at concentrations ranging from 94 to 115 oocysts/20 L. The study found that the mean daily risk of Cryptosporidium infection (0.06 to 1.53%) far exceeded the tolerable daily risk of infection from drinking water in the US (<0.0001%).

2.2.4 Drinking water

Similarly, studies conducted worldwide have demonstrated the occurrence of Cryptosporidium oocysts in treated drinking waters (Table 5). Oocysts are resistant to chlorine disinfection at the concentrations typically applied during drinking water treatment, but correctly operating treatment plants that utilize filtration usually remove oocysts from source water (Rochelle and Di Giovanni, 2014). Nevertheless, studies conducted by Rose (1997) in the United States reported oocysts in 3.8 to 40% of treated drinking water samples at concentrations up to 0.72 oocysts/L. In addition, multiple treatment plants monitored in Wisconsin detected oocysts in 4.2% of finished water samples (Archer et al., 1995), and a subsequent survey detected oocysts in 3.5% of treated drinking waters in Ontario, Canada (Wallis et al., 1996). High oocyst concentrations detected in treated water, together with multiple detections in source water, led to a series of on-again, off-again boil water advisories in Sydney, Australia, over several weeks (mid-August to early September 1998). The incident had long-ranging political, technical, operational, and managerial consequences in Australia (McClellan, 1998; Clancy, 2000). During the subsequent six years of monitoring of Sydney's water supply precipitated by this incident, oocysts were detected in 0.04% of 4,961 treated drinking water samples (O'Keefe, 2010). The monitoring period was characterized by an extreme multi-year drought, so the low oocyst occurrence may not have been reflective of "normal" conditions.

Table 5. Occurrence of Cryptosporidium oocysts in treated drinking waters

Country | Sample Type | Percent Positive Samples | Concentration Average (Range), Oocysts/L | Reference
Brazil | Treated drinking water | 26.6 | (0.06 to 0.15) | Nishi et al., 2009
Brazil | Treated drinking water | 25.0 | (0 to 0.01) | Razzolini et al., 2010
Canada | Treated drinking water | 7.6 | (0.03 to 0.95) | Isaac-Renton et al., 1999
Germany | Treated drinking water | 12 | 0.13 (0 to 0.16) | Gallas-Lindemann et al., 2013
Hungary | Treated drinking water | 13.3 | 0.3 | Plutzer et al., 2007
Japan | Filtered water | 35 | (0.002 to 0.6) | Hashimoto et al., 2002
Russia | Tap water; well water | 8.3 | 0.01 | Karanis et al., 2006
South Africa | Treated drinking water | 1.1 | 1 | Kfir et al., 1995
Spain | Treated drinking water | 16.6 | 1 (1 to 1) | Castro-Hermida et al., 2011
Spain | Treated drinking water | 15 | 8.8 (4 to 16.6) | Ramo et al., 2017
Spain | Treated drinking water | 32.7 | 1.2 (1 to 4) | Castro-Hermida et al., 2010
United Kingdom | Treated drinking water | 5.5 | 0.0002 | Smeets et al., 2007
United Kingdom | Treated drinking water | 7 | 1.6 | Wilson et al., 2008
United States | Treated drinking water | 3.8 to 40 | 0.001 to 0.72 | Rose, 1997
United States | Treated drinking water | 26.8 | 0.13 to 4.8 | LeChevallier et al., 1991
Venezuela | Treated drinking water | 20 | 0.5 (0.2 to 0.7) | Betancourt and Mena, 2012


In Latin America, information on the prevalence and detection of waterborne parasitic protozoa is more limited (Rosado-Garcia et al., 2017). Moreover, most studies that have documented the presence of Cryptosporidium oocysts in drinking water, with some exceptions, did not provide reliable quantitative data. A study of a large drinking water treatment plant in Venezuela reported that 10 of 11 (90%) finished drinking water samples contained oocysts, with a geometric mean of 4.4 oocysts/100 L as determined by immunofluorescence assay microscopy (Quintero-Betancourt and De Ledesma, 2000).

In Asia, studies on Cryptosporidium in treated drinking water have been documented in Japan. Monitoring of finished water at a treatment plant detected Cryptosporidium oocysts in 35% (9/26) of filtered water samples, with a geometric mean concentration of 1.2 oocysts/1,000 L (Hashimoto et al., 2002).

Regulatory or systematic and widespread monitoring of Cryptosporidium in either raw or treated water has been conducted in only a few countries (see Section 2.2.2). The UK drinking water regulations included the most intensive Cryptosporidium monitoring program ever undertaken. The regulations required continuous monitoring of Cryptosporidium oocysts in finished drinking water for at least 23 hours per day at a flow rate of at least 40 L per hour (DWI, 1999). A decade-long monitoring program revealed that Cryptosporidium oocysts were occasionally detected in finished drinking water. During the period 2000 to 2002, a total of 97,999 samples were analyzed (total volume = 115,303,050 L), 5.5% were positive, and the average oocyst concentration was 0.0002 oocysts/L (Smeets et al., 2007). Continuous monitoring during subsequent years demonstrated the presence of Cryptosporidium oocysts in 1.9% (2002) and 1.1% (2003) of samples, with rates of detection from sample sites corresponding to 68% and 54%, respectively (DWI, 2006). Extensive monitoring of 204 plants during 2008 indicated that none of them exceeded the treatment standard of <1 oocyst/L (DWI, 2008). Increased awareness of the occurrence and impact of Cryptosporidium oocysts in drinking water might have led to the application of rigorous risk management practices to prevent the introduction of oocysts into the drinking water supply.
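As an illustration of how the regulatory figures above relate to one another, the short Python sketch below assumes only the regulatory minimums cited (23 h/day at 40 L/h) and a hypothetical single-oocyst count; it computes the minimum volume represented by one daily sample and the concentration that would be reported for a given count. It is not part of the DWI method.

```python
# Illustrative sketch only; assumes the regulatory minimums cited above (23 h/day, 40 L/h).
MIN_HOURS_PER_DAY = 23
MIN_FLOW_L_PER_H = 40

def min_daily_sample_volume_l() -> int:
    """Smallest volume a compliant daily sample can represent (23 h x 40 L/h = 920 L)."""
    return MIN_HOURS_PER_DAY * MIN_FLOW_L_PER_H

def oocysts_per_litre(oocysts_counted: int, volume_l: float) -> float:
    """The reported concentration is simply the count divided by the filtered volume."""
    return oocysts_counted / volume_l

if __name__ == "__main__":
    v = min_daily_sample_volume_l()
    print(v)                        # 920 L
    print(oocysts_per_litre(1, v))  # ~0.0011 oocysts/L for a single oocyst detected
```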

In Scotland, continuous monitoring of finished water over 22 months revealed the presence of oocysts in 7% (1,417 of 20,249) of treated water samples from 304 drinking water supplies, with a maximum reported concentration of 1.6 oocysts/L (Wilson et al., 2008). From a public health perspective, the riskiest source water catchments were those that were drier than average but had occasional high rainfall events, which flushed oocysts into the water. The application of a molecular approach during the monitoring period demonstrated the presence of multiple species and genotypes in different proportions: C. andersoni (8%, corresponding to a cervine genotype), C. parvum (5%), and C. baileyi and C. bovis (0.3%). Similarly, C. andersoni, C. parvum, and C. ubiquitum were identified on 4.1%, 4.3%, and 12.6% of regulatory finished water monitoring slides in Scotland (Nichols et al., 2010). During 2015, 9,483 finished water samples were analyzed under the regulatory directive in Scotland. Of these, 84 samples (0.89%) from 26 of 238 (11%) treatment plants tested positive for oocysts, the lowest percentage of positive samples since 2007 (DWQR, 2017). Again, better watershed management and operational improvements might have led to a decline in Cryptosporidium occurrence in the drinking water supply.

Optimized cell culture assays that enable detection of low levels of oocysts were used to assess the prevalence of infectious oocysts in finished drinking water in the US (Aboytes et al., 2004). The study reported that 26.8% of surface water treatment plants (N = 82) were releasing infectious oocysts in their finished water. Overall, 1.4% of treated drinking water samples (N = 1,690) contained infectious Cryptosporidium oocysts, but in all cases the follow-up repeat samples were negative. This detection rate translated into a calculated annual risk of 52 infections per 10,000 people, far exceeding the US EPA's 1 in 10,000 risk goal. In a study with contrasting results, treated drinking water samples from 14 treatment plants across the US were analyzed using a modified version of Method 1623 coupled with a cell culture infectivity assay and IFA detection of infection. Sample volumes were 83.5 to 2,282 L, with an average of 943 L (N = 370). None of the 370 finished water samples produced infections (Aboytes et al., 2004). This lack of infectious oocysts in a total volume of 349,053 L translated to an annual risk of less than one infection per 10,000 people. More research and more extensive sampling are needed to accurately determine the prevalence of human-infectious oocysts in treated drinking water and thus improve the accuracy of risk assessments (Rochelle and Di Giovanni, 2014).
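For context, screening-level estimates of this kind typically combine an exposure estimate with a dose-response model and then convert daily risk to annual risk. The sketch below is a generic illustration only, not the calculation used by Aboytes et al. (2004): the single-hit exponential dose-response form is one common choice for Cryptosporidium, but the parameter r, the finished-water concentration, and the assumed daily consumption of unboiled tap water are placeholder assumptions.

```python
import math

def daily_infection_risk(oocysts_per_litre: float, litres_per_day: float = 1.0,
                         r: float = 0.004) -> float:
    """Single-hit exponential dose-response: P_daily = 1 - exp(-r * dose).

    r and litres_per_day are illustrative assumptions, not values from the cited study.
    """
    dose = oocysts_per_litre * litres_per_day
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily: float, days: int = 365) -> float:
    """Annualize a constant daily risk assuming independent daily exposures."""
    return 1.0 - (1.0 - p_daily) ** days

if __name__ == "__main__":
    p_day = daily_infection_risk(oocysts_per_litre=0.001)  # hypothetical finished-water level
    p_year = annual_risk(p_day)
    print(p_day, p_year)  # compare p_year with the 1-in-10,000 (1e-4) annual benchmark
```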

2.2.5 Irrigation water

Sources of irrigation water such as surface water and groundwater can become contaminated with parasites through many routes, including agricultural runoff, sewage discharge, storm water discharge, and direct access to water sources by domestic and wild animals (Dixon, 2015). In agricultural settings, fruit and vegetables can become contaminated with protozoan parasites by contact with soil or improperly composted manure and by irrigation or postharvest washing with contaminated water (Steele and Odumeru, 2005). Several studies have documented the occurrence of Cryptosporidium oocysts in waters used for irrigation of agricultural crops (Thurston-Enriquez et al., 2002; Chaidez et al., 2005; Amorós et al., 2010). In a study conducted in Mexico, Cryptosporidium oocysts were found in 48% of surface waters used for irrigation, washing, and disinfecting applications, with oocyst concentrations ranging from 17 to 200/100 L (Chaidez et al., 2005). A multi-country study reported that 36% of irrigation waters tested positive for Cryptosporidium oocysts; oocysts occurred frequently in irrigation waters of Central American countries at levels of 227 oocysts/100 L, whereas irrigation waters for crop production in the United States did not contain oocysts (Thurston-Enriquez et al., 2002). In another study, conducted in Spain, Cryptosporidium oocysts were found in water samples collected from canals used for irrigation of vegetables; the mean concentration was 475 oocysts/100 L, ranging from 100 to 700 oocysts/100 L (Amorós et al., 2010). A study conducted in Africa reported an overall prevalence of 66.7% (48/72) for Cryptosporidium oocysts in waters used for irrigation of farm products in the Kumasi Metropolis of the Ashanti Region of Ghana, with levels between 52 and 105 oocysts/100 L (Sampson et al., 2017). A more recent study conducted in Wuzhi County, Henan Province, China, reported concentrations of 0.6 oocysts/100 L in the Qinhe River, the main source of agricultural irrigation water for the local farmland (Xiao et al., 2017).

High quality effluents derived from wastewater treatment and reclamation technologies have been used as alternative water sources for agricultural and landscape irrigation in many countries (Asano and Cotruvo, 2004). The health risks associated with the widespread practice of wastewater irrigation for crop production are higher in developing countries where wastewater receives little or no treatment before use.

A study reported the occurrence of Cryptosporidium oocysts in recycled wastewater used for unrestricted irrigation in Florida (USA) (Quintero-Betancourt et al., 2003). The water reclamation facilities evaluated in the study provided conventional activated sludge treatment followed by filtration and chemical disinfection with chlorine gas. The levels of oocysts reported were based on IFA and the focus detection method most-probable-number (FDM-MPN) assay to confirm infectious potential of oocysts. The percentage of samples positive for infectious oocysts was 50% (6/12), and the numbers of oocysts by IFA ranged between 2 and 209/100 L. The level of infectious oocysts found in reclaimed effluents ranged between 17 and 27 MPN/100 L. In most cases, these levels were below the numeric pathogen standard (maximum limit of 22 viable oocysts/100 L) proposed for reclaimed effluents in the State of Florida. However, adjustment of oocyst levels based on recovery efficiency data indicated that the concentrations found in the study were underestimations. The study revealed the importance of the application of advanced Cryptosporidium detection methods required for the establishment of a more useful risk assessment approach for reclaimed water and the importance of pathogen-specific monitoring in recycled waters for irrigation. Due to the recalcitrant nature of oocysts to disinfection, ultrafiltration (pore size of 0.002 to 0.1 µm) is recommended for complete removal of Cryptosporidium oocysts for reuse of treated wastewater in irrigation (Lonigro et al., 2006).

2.2.6 Seawater and shellfish

The shallow coastal ocean impacted by human wastes and pollution-laden runoff may harbour numerous pathogenic microorganisms that are epidemiologically associated with diseases in the human populations living along the coastline (Shuval, 2003). Cryptosporidium oocysts have been detected in marine waters worldwide. Bathing beaches in Hawaii impacted by marine sewage discharge were reported to contain oocysts at levels ranging from 1 to 2.5/100 L (Johnson et al., 1995). Another study reported low concentrations of oocysts in tropical recreational marine waters of Venezuela contaminated with sewage, with concentrations ranging from 2 to 20 oocysts/100 L (Betancourt et al., 2014). The highest concentrations reported in any study were in marine recreational waters of Sinaloa, Mexico, where the overall concentration of oocysts ranged from 1,500 to 20,050/100 L, with an average of 5,810 oocysts/100 L (Magana-Ordorica et al., 2010).

In addition, bivalve molluscan shellfish (mussels, oysters, clams, and cockles) can accumulate and concentrate Cryptosporidium from sewage-polluted coastal waters through their filter feeding (Willis et al., 2013). Studies have shown that shellfish filter approximately 20 to 100 L of seawater in 24 h (depending on species) and can accumulate very high (oo)cyst loads in their digestive gland, intestinal tract, and gills (Robertson and Gjerde, 2008). One study demonstrated that Cryptosporidium persisted longer (up to 33 days) than Giardia in oysters and water inoculated with (oo)cysts (Graczyk et al., 2006). Consequently, commercial shellfish species harvested from sewage-polluted waters can act as transmission vehicles of infectious oocysts, especially within 24 to 72 h of contamination, as demonstrated in recent studies (Sutthikornchai et al., 2016). The detection of Cryptosporidium and other protozoan parasites such as Giardia and Toxoplasma in shellfish destined for human consumption is therefore necessary because of the public health risk that these parasites pose (Willis et al., 2013). The two main species of Cryptosporidium reported in oysters are C. parvum and C. hominis, but other species including C. baileyi, C. meleagridis, C. andersoni, and C. felis have also been reported (Robertson, 2007). In a recent study, Cryptosporidium oocysts were isolated from four species of edible bivalves using a combination of sucrose flotation and immunomagnetic separation, and oocysts were found in 67 of 144 samples collected (Pagoso and Rivera, 2017). DNA sequence analysis of the 18S rRNA gene revealed the presence of C. parvum, C. hominis, and C. meleagridis. Studies have also been conducted to identify subtypes of C. parvum and C. hominis in edible shellfish using the 60 kDa glycoprotein (gp60) gene. Nucleotide sequencing of amplicons showed that 60% of mussels contained C. parvum subtypes belonging to family IIa (IIaA15G2R1, IIaA15G2, and IIaA14G3R1) (Giangaspero et al., 2014).

The presence of infectious Cryptosporidium oocysts in shellfish from France was demonstrated using a mouse infectivity assay (Li et al., 2006). The study found oocysts in all samples from all sites and seasons, and the flesh was the most contaminated part. The rate of detection appeared to be related to seasonal variations in rainfall. Molecular analysis revealed the predominance of C. parvum, likely of cattle-breeding origin, in cultured edible mussels, confirming the persistence of oocysts in marine environments and underlining the potential risk of food-borne infection.

Commercial and non-commercial oysters and oyster culture water from the Oosterschelde, the Netherlands, were examined for the presence of Cryptosporidium oocysts and Giardia cysts. Nine of 133 (6.7%) oysters from two non-commercial harvesting sites contained Cryptosporidium, Giardia, or both. Six of 46 (13.0%) commercial oysters harboured Cryptosporidium or Giardia in their intestines. The study indicated that infectious Cryptosporidium oocysts entering the Oosterschelde could be accumulated by oysters destined for human consumption (Schets et al., 2007).

C. parvum oocysts were also recovered from zebra mussels at four locations throughout the Shannon River drainage area in Ireland. The study reported a mean concentration of 16 oocysts/g of mussel tissue, which was the highest concentration among all of the other enteric pathogens tested (Giardia lamblia, Encephalitozoon intestinalis, E. hellem, and Enterocytozoon bieneusi) (Graczyk et al., 2004).

The presence of Cryptosporidium in bivalve shellfish has also been demonstrated in other European Union countries such as Italy, the United Kingdom, Spain, and Portugal (Gomez-Couso et al., 2003). The study revealed that there was no relation between the presence of Cryptosporidium oocysts and the microbiological contamination detected in the samples expressed as Most-Probable-Number (MPN) of fecal coliforms. One important finding was that the depuration process was ineffective in totally removing oocyst contamination. Moreover, the study revealed the existence of viable oocysts in samples with microbiological contamination levels lower than 300 fecal coliforms/100 g, which in accordance with European Union legislation are considered suitable for human consumption. The study highlighted the importance of including parasitological analyses in the quality control for these molluscs.

In another study, gill washings from 37 commercial harvesting sites in 13 Atlantic coast states from Maine to Florida and one site in New Brunswick, Canada were examined for Cryptosporidium oocysts by immunofluorescence microscopy (IFA) and PCR. In total, Cryptosporidium was detected in 35 (3.7%) of 925 shellfish. In contrast, PCR examination of 110 pools of gill washings from oysters and 75 pools from clams detected Cryptosporidium in 21 (19.1%) and 12 (16.0%) pools, respectively. In total, PCR detected Cryptosporidium in 27 (14.6%) of 185 pools of gill washings from shellfish. Gene sequencing of PCR products identified Cryptosporidium species at 14 sites. C. parvum genotype 2 (zoonotic) was identified at 11 sites and C. parvum genotype 1 (C. hominis, anthroponotic) at three additional sites. At four of the 14 sites, C. meleagridis was also identified. The study covered the largest geographic area of any survey for Cryptosporidium and indicated widespread fecal contamination from human and possible animal sources (Fayer et al., 2003).

Since bivalve shellfish can efficiently concentrate and retain environmentally derived pathogens for long periods, they have been recognized worldwide as bioindicators of aquatic environments with fecal origin organisms. For instance, sentinel clam outplanting used to assess the distribution and magnitude of fecal contamination in three riverine systems in California demonstrated the presence of C. parvum in clams from riverine ecosystems (Miller et al., 2005). In addition, indigenous blue mussels (Mytilus spp.) used as biosentinels to monitor for the presence of parasites along the central California shoreline revealed the presence of Cryptosporidium and Toxoplasma gondii in areas not previously reported to be contaminated with these pathogens (Staggs et al., 2015).

2.3 Persistence

Data on the persistence of Cryptosporidium oocysts in the environment outside the host organism are crucial for understanding the ecology of the parasite and for determining the risk to the human population. The environmental transmission of Cryptosporidium is governed by the physicochemical properties of the oocysts, which allow their transport, retention, and survival for months in water, soil, vegetables, and mollusks, which are major vehicles for human infection (Dumetre et al., 2012).

Cryptosporidium oocysts are excreted fully sporulated with feces, protected by a rigid bilayer waxy coat of lipids that is thought to be responsible for their survival in the environment and for their transit through the stomach and small intestine (Bushkin et al., 2013). Current models of the oocyst's surface chemistry suggest that the charge and hydrophobic characteristics of the parasite surface can generate and modulate electrostatic attractive and/or repulsive interactions with surrounding particles, which critically affect the behavior of this pathogen and its distribution in terrestrial and aquatic environments. Characterizing these interactions may therefore be a crucial step in managing environmental matrices at risk of microbial pollution (Dumetre et al., 2012).
The persistence of Cryptosporidium oocysts in terrestrial and aquatic environments is influenced by a combination of structural features (oocyst wall and oocyst surface chemistry), extrinsic physical factors (temperature, relative humidity, desiccation, ultraviolet radiation) and chemical factors (pH, organic matter, salinity, ammonia), as well as by the matrix or substrate in which the oocysts are present (e.g., feces, soil, water, inanimate surfaces or fomites). Biological antagonism and potential predation of Cryptosporidium may enhance the die-off of oocysts (Stott et al., 2001; Stott et al., 2003; King and Monis, 2007; Peng et al., 2008; Reinoso et al., 2008a). Moreover, aged oocysts are more susceptible to disruption by environmental changes and disinfectants (King and Monis, 2007).

Studies on oocyst persistence generally report viability and infectivity; however, these two measures do not always correlate, and oocysts that remain viable may nevertheless have reduced infectivity. Table 6 summarizes selected studies that have tested the survival of C. parvum using viability and infectivity assays (Carey et al., 2004). Temperature, desiccation, and extremes of pH have the most detrimental effects on oocyst survival in the environment (Jenkins et al., 1999; Olson et al., 1999; Carey et al., 2004; Peng et al., 2008). In fact, temperature is one of the most critical factors governing the fate of oocysts in the environment (King and Monis, 2007; Lucio-Forster et al., 2010). Other studies have indicated that desiccation is perhaps the most lethal factor, with 100% of oocysts being inactivated after 3 hours (Olson et al., 1999; King and Monis, 2007).

Table 6. Environmental factors influencing the viability and infectivity of C. parvum oocysts (Carey et al., 2004)

Environmental Media | Temperature (°C) | Time, Days (Unless Otherwise Noted) | Reduction in Oocyst Viability/Infectivity | Assessment Method | Reference
Laboratory water | 10 to 30 | 14 | Retained infectivityᵃ | BALB/c mouse infectivity | Fayer, 1994
Laboratory water | 59.7 | 5 minutes | Retained infectivityᵃ | BALB/c mouse infectivity | Fayer, 1994
Laboratory water | 64.2 | 2 minutes | Non-infectious | BALB/c mouse infectivity | Fayer, 1994
Laboratory water | >72 | 1 minute | Non-infectious | BALB/c mouse infectivity | Fayer, 1994
Laboratory water | -22 | 0.88 (21 hours) | 0.48 log10 | DAPI/PI | Robertson et al., 1992
Laboratory water | -22 | 6.3 | >1 log10 | DAPI/PI | Robertson et al., 1992
Laboratory water | NR | Snap freezing | Non-infectious | DAPI/PI | Robertson et al., 1992
On surfaces (desiccation) | Room temperature | 4 hours | >2 log10 | DAPI/PI | Robertson et al., 1992
Sterile water | -4 | >84 | Retained viability and infectivity | PI exclusion and mouse infectivity | Olson et al., 1999
Tap water (laboratory flow-through system) | Room temperature | 176 | 1.4 log10 | DAPI/PI | Robertson et al., 1992
River water | Ambient temperature | 176 | 1.2 log10 | DAPI/PI | Robertson et al., 1992
River water | -20 | 7 | Non-infectious | Cell culture, FDM | Pokorny et al., 2002
River water | 4 to 10 | 98 | Retained infectivity | Cell culture, FDM | Pokorny et al., 2002
River water | 21 to 23 | 84 | >5 log10 | Cell culture, FDM | Pokorny et al., 2002
Seawater (static laboratory conditions) | 4 | 35 | 0.21 log10 | DAPI/PI | Robertson et al., 1992
Seawater with salinities of 10, 20, and 30 ppt | 10 | 84 | Retained infectivity | BALB/c mouse infectivity | Fayer et al., 1998a; Fayer et al., 1998b
Artificial seawater with salinities of 10, 20, and 30 ppt | 20 | 84 | Retained infectivity | BALB/c mouse infectivity | Fayer et al., 1998a; Fayer et al., 1998b
Artificial seawater with salinity of 20 ppt | 20 | 56 | Retained infectivity | BALB/c mouse infectivity | Fayer et al., 1998a; Fayer et al., 1998b
Artificial seawater with salinity of 30 ppt | 20 | 28 | Retained infectivity | BALB/c mouse infectivity | Fayer et al., 1998a; Fayer et al., 1998b
Cow feces (submerged in semi-solid feces) | Ambient temperature | 176 | 0.47 log10 | DAPI/PI | Robertson et al., 1992
Human feces | 4 | 178 | 0.66 log10 | DAPI/PI | Robertson et al., 1992

ᵃRetained infectivity in the mouse model, but no quantitative levels of loss reported.


Laboratory experiments, for instance, indicated that the duration of oocyst infectivity in water decreased as the temperature increased from 4°C to 23°C (King et al., 2005). Oocysts maintain high levels of infectivity for periods of up to 24 weeks at temperatures below 15°C. Inactivation is more rapid at slightly higher temperatures of 20°C and 25°C, with a 3 log10 reduction in infectivity after 8 weeks of incubation. Exposure of oocysts to 30°C and 37°C results in complete inactivation (i.e., 4 log10 reduction) within 500 h and 72 h, respectively (King et al., 2005). Exposure of oocysts to temperatures of 72.4°C for 1 min or 64.2°C for more than 2 min leads to complete loss of infectivity (Fayer et al., 1998). Temperatures below freezing are also detrimental to oocyst survival due to physical damage; however, studies have shown that oocysts remain infectious for >12 weeks in water, soil, and feces at temperatures just below freezing (-4°C) (Peng et al., 2008). The predicted increase in global temperature may have dramatic consequences for oocyst longevity in the environment, with small increases in temperature above 15°C increasing inactivation. Warmer temperatures, on the other hand, may have the opposite effect in areas prone to soil subsurface freezing or lake ice cover, where substantial numbers of oocysts may remain infective after winter where previously they would have been inactivated (King and Monis, 2007).

First-order exponential models have been used to calculate die-off rates of Cryptosporidium oocysts in water, soil, and feces under different environmental conditions. Studies have shown that, for a given percentage of inactivated oocysts, K (i.e., the die-off rate coefficient over an entire incubation period) is inversely proportional to the incubation time. For instance, if K is 0.01 day⁻¹, the inactivation of 99.9% of oocysts requires 690 days, compared to 138 days when K is 0.05 day⁻¹ (Peng et al., 2008). The relationship between K and temperature in the environment can be used to evaluate the risk of oocyst contamination for public health.
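The cited figures are consistent with a base-e (natural log) first-order model, N(t)/N0 = exp(-K t), for which the time to inactivate a fraction f of the oocysts is t = -ln(1 - f)/K; under that assumption, 99.9% inactivation takes about 690 days at K = 0.01 day⁻¹ and about 138 days at K = 0.05 day⁻¹. The following minimal Python sketch simply reproduces that arithmetic.

```python
import math

def surviving_fraction(k_per_day: float, days: float) -> float:
    """First-order die-off: N(t)/N0 = exp(-K * t)."""
    return math.exp(-k_per_day * days)

def days_to_inactivate(fraction_inactivated: float, k_per_day: float) -> float:
    """Time to inactivate a given fraction of oocysts: t = -ln(1 - f) / K."""
    return -math.log(1.0 - fraction_inactivated) / k_per_day

if __name__ == "__main__":
    print(round(days_to_inactivate(0.999, 0.01), 1))  # ~690.8 days
    print(round(days_to_inactivate(0.999, 0.05), 1))  # ~138.2 days
```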

In natural waters, including river and lake water, groundwater, seawater, and tap water, the natural die-off of Cryptosporidium oocysts is likely to be affected by a combination of abiotic and biotic stresses. Thus, the combined effects of chemical, physical, and biological water properties are reflected in the calculated K values. Die-off rate coefficient values for river water, tap water, and seawater taken directly from published data indicate that, over a range of 4 to 30°C, K values are not significantly different. The means ± standard deviations of K values reported for river water, tap water, and seawater at 4°C are 0.193 ± 0.011, 0.0194 ± 0.006, and 0.009 ± 0.017 day⁻¹, respectively (Robertson et al., 1992; Alum et al., 2014).

When present on surfaces or in solids (e.g., soil or sludge), Cryptosporidium may respond differently to variations in environmental conditions, including temperature, relative humidity (RH), porosity, and organic matter (Alum et al., 2014). Oocysts in soil, for instance, are less sensitive to air temperature and solar radiation because of their association with soil particles. Studies have also found that oocysts in manure and soil can survive for more than 12 weeks to a year frozen at -4°C and for eight weeks at 4 to 6°C, but only 4 weeks at 20 to 30°C (Olson et al., 1999). K values determined for different soils vary widely with temperature, with average values of 0.0055, 0.011, and 0.076 day⁻¹ at 4, 20, and 30°C, respectively (Peng et al., 2008). The levels of ammonia, pH, and temperature in soil and manure storages have significant effects on oocyst survival. Soils typically contain ammonia levels between 1 and 5 ppm, which are not high enough to affect oocysts. However, freshly fertilized soils containing ammonia levels as high as 3,000 ppm may show increased rates of inactivation. Therefore, ammonia-induced inactivation during storage of animal waste products is an effective strategy to reduce oocyst numbers in livestock wastes before they are spread onto land (Hutchison et al., 2005). At low concentrations of ammonia (5 and 50 ppm), the reduction in C. parvum oocyst survival is more pronounced over prolonged periods of time (up to 4 days), with a 0.73 log10 (81%) reduction in viability as determined by the differential uptake of DAPI/PI (DAPI+/PI+) (Reinoso et al., 2008a). Moreover, soils show greater spatial and temporal variability than water, and oocysts in soil are subjected to stresses driven by the interaction of water content and texture. Incubation of oocysts for 10 days in dry loamy soil at 32°C resulted in a 3 log10 reduction in oocyst infectivity, whereas incubation in saturated soil at the same temperature caused only a 1 log10 reduction (Nasser et al., 2007). In feces, oocyst degradation is faster than in water, with ammonia and fecal organisms, as well as temperature, playing an important role as inactivation agents (Peng et al., 2008).

Recent studies evaluating the persistence of C. parvum oocysts exposed to chemical disinfectants (e.g., ethanol, denatured ethanol, sodium hypochlorite, and hydrogen peroxide) indicated that longer exposure times than those used in previous studies are considerably more effective at inactivating oocysts. For instance, 6% NaOCl applied for at least 12 hours achieved a 2 log10 inactivation of oocysts, whereas the same concentration of sodium hypochlorite (6%) with an exposure time of 33 min had a poor effect (<1 log10). Similarly, a 2 log10 inactivation was observed using 10% H2O2 with an exposure time of over 2 h. Ethanol was less effective at inactivating Cryptosporidium oocysts (Delling et al., 2016).

A study examining the effects of heat treatment (60 or 75°C) on the viability of C. parvum oocysts inoculated onto the surface of beef muscle observed that at 60°C viability decreased from 100% at T0 to 64.2% at T60, while at 75°C viability decreased from 100% at T0 to 53.7% at T15 and finally to 11.2% at T60 (Moriarty et al., 2005). The infectivity of the oocysts was assessed against monolayers of HCT-8 cells following treatments of 60°C for 45 seconds and 75°C for 20 seconds. The study concluded that washing of carcasses with hot water and standard thermal treatments are sufficient to kill C. parvum on beef muscle.

3.0 Reduction by Sanitation Management

Wastewater treatment represents the first barrier in protecting water supplies from contamination with parasites (Stadterman et al., 1995). Table 7 summarizes results from studies conducted worldwide at wastewater treatment plants that provide different levels of treatment for removal/inactivation of Cryptosporidium from sewage.
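The log10 removal values in Table 7 (below) are conventionally derived from paired influent and effluent concentrations as log10(influent) minus log10(effluent); the minimal Python sketch below illustrates this using two of the reported plant averages. Small differences from the published values are expected because the studies generally average over paired samples rather than over plant means.

```python
import math

def log10_removal(influent_oocysts_per_l: float, effluent_oocysts_per_l: float) -> float:
    """Log10 removal across a treatment train: log10(influent / effluent)."""
    return math.log10(influent_oocysts_per_l / effluent_oocysts_per_l)

if __name__ == "__main__":
    # Influent/effluent averages reported for two plants in Table 7:
    print(round(log10_removal(240.0, 0.11), 2))  # ~3.34 (reported: 3.35)
    print(round(log10_removal(74.0, 13.0), 2))   # ~0.76 (reported: 0.71)
```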

Table 7. A Summary of studies on the removal efficiency of Cryptosporidium oocysts by different wastewater treatment processes

Country | Plant | Population Served | Primary Treatment | Secondary Treatment | Tertiary Treatment | Disinfection | Oocysts/L Influent (Average) | Oocysts/L Effluent (Average) | Log10 Removal | Reference
Brazil | 1 | NR | Screening, aeration | Activated sludge, secondary clarification | None | UV | 6E+04 (±2.8) | 2E+02 | 2.5 | Neto et al., 2006
Brazil | 1 | NR | NR | Activated sludge | Filter screen, sand-anthracite filter, membrane filtration | Chlorine | 32 | 0.81 | 1.59 | Hachich et al., 2013
Brazil | 2 | NR | NR | Activated sludge | Sand filtration | Chlorine | 12 | 0.67 | 1.25 | Hachich et al., 2013
Brazil | 3 | NR | NR | Anaerobic and facultative pond | Maturation pond and trickling filter | None | 12 | 0.09 | 2.12 | Hachich et al., 2013
China | WTP-G | NR | Screening and grit removal | Activated sludge | Sand filtration | None | 2.4E+02 | 0.11 | 3.35 | Fu et al., 2010
China | WTP-Q | NR | Screening and grit removal | Anaerobic-anoxic-oxic process | Membrane ultrafiltration | Ozone/chlorine | 1.4E+02 | 1.5 | 1.97 | Fu et al., 2010
Ireland | Plant A | 1.9E+03 | Screening and grit separation | Sludge activation in oxidation ditch | None | None | 5.92E+02 (±22.6) | 4±2 | 2.17 | Cheng et al., 2009
Ireland | Plant B | 1.0E+03 | None | Sludge activation in extended aeration tanks | None | None | 2.8E+02 (±33.9) | 8±3.6 | 1.54 | Cheng et al., 2009
Ireland | Plant C | 2.5E+03 | Screening and grit separation | Sludge activation in extended aeration tanks | None | None | 11±4.5 | 4±1.5 | 0.43 | Cheng et al., 2009
Ireland | Plant D | 2.1E+03 | Screening and grit separation | Biofilm-coated percolating filter | None | None | 1±1 | 4±2 | 0 | Cheng et al., 2009
Israel | Plant A | 2.4E+05 | Primary settling | Activated sludge | Slow sand filtration | Chlorination | 17.3 | 9.94 | 0.24 | Taran-Benshoshan et al., 2015
Spain | Plant 1 | 2.5E+05 | Screening and grit separation, sedimentation | Anaerobic digestion, sedimentation | None | UV | NR | 0.4 to 0.8 | 2.27 | Rodriguez-Manzano et al., 2012
Spain | Plant 2 | 2.5E+05 | Screening and grit separation, sedimentation | Anaerobic digestion, sedimentation | Sand filtration | UV | NR | 0.4 to 3.6 | 1.75 | Rodriguez-Manzano et al., 2012
USA | Plant A | 5.0E+05 | NR | Activated sludge | None | Chlorination | 7.4E+01 | 1.3E+01 | 0.71 | Kitajima et al., 2014
USA | Plant B | 2.5E+05 | NR | Activated sludge | None | Chlorination | 1E+02 | 1.2E+01 | 0.81 | Kitajima et al., 2014
USA | Plant A | NR | Primary sedimentation | Activated sludge, secondary sedimentation | NR | Chlorination/dechlorination | 7.4E+01 | 1.3E+01 | 0.76±0.22 | Schmitz et al., 2018
USA | Plant B | NR | Primary sedimentation | Trickling filters, secondary sedimentation | NR | Chlorination/dechlorination | 1.0E+02 | 1.2E+01 | 0.96±0.38 | Schmitz et al., 2018
USA | Plant C | 1.5E+06 | Dissolved air flotation | 5-stage Bardenpho, secondary sedimentation | Disc filtration | Chlorination/dechlorination | 3.0E+02 | 8.0E+00 | 1.67±0.39 | Schmitz et al., 2018
USA | Plant D | NR | Primary sedimentation | Pseudo-Bardenpho/5-stage Bardenpho, secondary sedimentation | NR | Chlorination/dechlorination | 2.4E+02 | 4.0E+00 | 1.52±0.54 | Schmitz et al., 2018
USA | Plant 1 | NR | NR | Activated sludge, lime treatment | Sand filtration, upflow carbon adsorption | Chlorination | 1.1E+01 | 3.7E-04 | 4.47 | Rose et al., 2001

NR: Not reported

3.1 Excreta and Wastewater Treatment

3.1.1 On-Site sanitation
3.1.1.1 Dry on-site sanitation systems

Composting-based sanitation systems, also known as composting toilets, dry toilets, biological toilets, or waterless toilets, are ecological sanitation technologies that require neither water nor sewerage infrastructure for their operation. Although they are primarily used in the developed world, waterless sanitation systems represent a viable solution for areas where poor sanitation exists and water is scarce (Graham et al., 2003; Anand and Apul, 2014). Composting toilets also fit with today's understanding of sustainable construction since they reduce water and wastewater flows within a building. These dry sanitation systems can be very cost-efficient and, furthermore, produce black soil and water that can be reused [terra preta sanitation (TPS)] (Otterpohl and Buzie, 2011). There are two basic types of waterless toilets, based on whether they treat biosolid waste by biodegradation or dehydration. Biodegrading toilets promote pathogen reduction by increasing the temperature of the composting pile to as high as 70°C through the action of thermophilic aerobic bacterial growth. Dehydrating toilets rely on desiccation and high pH (>10), a result of low moisture content (<25% moisture by weight) and the addition of an alkaline agent.
One study evaluated the effectiveness of biodegrading and dehydrating waterless toilets in reducing Cryptosporidium in human feces over six months using IFA. The study found a statistically significant difference over time in Cryptosporidium detected in the dehydrating system compared with the biodegrading system. According to this study, the dramatic decrease achieved by the dehydrating toilets was the result of the added lime and low moisture levels (7%), which increased the pH in the system to 10 and produced desiccating conditions (Graham et al., 2003). Both environmental factors, high pH and desiccation, are known to promote inactivation and killing of pathogens.

3.1.1.2 Inactivation by storage

Composting is increasingly considered a good way to recycle surplus manure as a stabilized and sanitized end product for agriculture (Bernal et al., 2009). Composting of organic wastes is a bio-oxidative process involving the mineralization and partial humification of organic matter, leading to a stabilised final product, free of phytotoxicity and pathogens and with certain humic properties. In places where dry sanitation is used, a common treatment method is the addition of ash or lime to raise the pH and dry the surface of the faecal matter after defecation, combined with long-term storage.

Thermal composting and ammonia treatment with storage are sanitation methods for producing feces and manure that are safe to use as fertilizer on arable land. During anaerobic digestion, up to 100% reduction in Cryptosporidium viability can occur within a short period of time (four days) when heat is used in the system. Mesophilic anaerobic digestion leads to a 2 log10 reduction in oocyst survival, while thermophilic anaerobic digestion leads to a 5 log10 reduction (Vermeulen et al., 2017).

The survival of C. parvum in animal manures and manure slurries has often been studied under controlled laboratory conditions. These studies have demonstrated that temperatures between 35 and 50°C, which normally occur in composting animal waste piles, may inactivate 4 log10 (99.99%) of viable oocysts in approximately 82 days and 3 days, respectively (Bean et al., 2007). A study based on the dye permeability assay reported coefficients of inactivation, or die-off rates (K), of 0.140±0.041 at 35°C and 3.840±0.653 at 50°C (Jenkins et al., 1999). In another study, a bench-scale model of lime-stabilized biosolids was designed to evaluate the persistence of viral, bacterial, and parasitic pathogens. Cryptosporidium oocysts remained viable following 72 hours of liming. The persistence of oocysts after liming suggested that C. parvum would be a better choice than Salmonella as an indicator for evaluating biosolids intended for land application (Bean et al., 2007).

A recent study calculated livestock Cryptosporidium spp. loads to land on a global scale using spatially explicit, process-based modelling (Vermeulen et al., 2017). The study estimated a total global Cryptosporidium oocyst load from livestock manure of 3.2 × 10²³ oocysts per year. Cattle, especially calves, were the largest contributors, followed by chickens and pigs. Spatial differences were linked to animal spatial distributions. North America, Europe, and Oceania together accounted for nearly a quarter of the total oocyst load, meaning that the developing world accounted for the largest share. The study found that although manure storage halved oocyst loads, manure treatment, especially of cattle manure and particularly at elevated temperatures, had a larger load reduction potential (up to 4.6 log10 units) than manure storage. Regions with high reduction potential include India, Bangladesh, Western Europe, China, several countries in Africa, and New Zealand.
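To make the combined effect of these reductions concrete, the short Python sketch below (illustrative only, using a hypothetical starting load) chains a fractional reduction from storage with a log10 reduction from treatment.

```python
def remaining_oocysts(initial_load: float,
                      fraction_remaining_after_storage: float = 0.5,
                      treatment_log10_reduction: float = 4.6) -> float:
    """Apply a fractional reduction (storage) followed by a log10 reduction (treatment)."""
    return initial_load * fraction_remaining_after_storage * 10.0 ** (-treatment_log10_reduction)

if __name__ == "__main__":
    # Hypothetical load of 1e6 oocysts: storage halves it, treatment removes up to 4.6 log10.
    print(remaining_oocysts(1e6))  # ~12.6 oocysts remaining
```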

3.1.1.3 Water-based onsite sanitation (septic tanks)

Septic tanks and pit latrines are two types of on-site sanitation systems in which excreta and wastewater are collected, stored, or treated at the same location where they are generated. These on-site systems, which do not require sewerage infrastructure for their operation, represent a viable solution for areas where poor sanitation exists and water is scarce (Graham et al., 2003).

Pit latrines usually lack a physical barrier, such as concrete, between stored excrement and the soil and/or groundwater. Therefore, enteric pathogens from pit latrines can enter groundwater and thus increase the risk of human exposure to these pathogens. Studies have estimated that approximately 1.77 billion people use pit latrines as their primary means of sanitation (Graham and Polizzotto, 2013). However, a limited number of studies have explicitly examined links between groundwater pollution and contamination from pit latrines. No relevant data were found on the effect of septic tanks on the survival or occurrence of Cryptosporidium oocysts. Nevertheless, pit latrines remain an important strategy for improving the management of human excreta despite the potential for groundwater contamination. This system is considered the most basic option for low-income countries to reduce the level of open defecation and expand access to improved sanitation (Omarova et al., 2018).

3.1.2 Waste stabilization ponds

Waste stabilization ponds (WSPs) employ natural processes to treat domestic wastewater, septage, and sludge, as well as animal or industrial wastes, and are frequently used in combination with other sanitation technologies. The most common types of WSPs are anaerobic ponds, facultative ponds, maturation ponds, aerated ponds, and high-rate algal ponds (HRAPs). Pathogen removal from wastewater in WSPs and waste stabilization pond systems was recently reviewed (Verbyla et al., 2017). According to the review, under optimal conditions, removal efficiencies in full-scale WSP systems with several units in series can be as high as 4 log10 for (oo)cysts; however, the efficiency of pathogen removal in full-scale systems is highly variable, and in practice many WSP systems achieve only 2 to 3 log10 removal. Another study investigated the removal of Cryptosporidium oocysts in a WSP system formed by two anaerobic ponds, a facultative pond, and a maturation pond; Cryptosporidium oocysts were reduced by an average of 1.4 log10 (Reinoso et al., 2011). The anaerobic ponds showed significantly higher surface removal rates (4.6 to 5.2 log10 m⁻² day⁻¹) than the facultative and maturation ponds. The study also demonstrated that sunlight and the physicochemical conditions of the water were the main factors influencing C. parvum oocyst removal in both the anaerobic and maturation ponds, whereas other factors such as predation or natural mortality were more important in the facultative pond.

Inactivation of oocysts by high-rate algal ponds (HRAP) was higher than with conventional wastewater treatment systems (Araki et al., 2001). High pH (9.02) and sunlight in the HRAP were the two major physicochemical conditions responsible for the inactivation of more than 97% (1.6 log10) of the C. parvum oocysts achieved in less than 3 days.

Microbial removal rates were studied in maturation ponds at four WSPs, with and without baffles, in rural and remote communities in Australia. Cryptosporidium spp. oocysts in maturation ponds were measured at the inlet and outlet. Low numbers of Cryptosporidium oocysts were detected in the inlet samples of all ponds, but the recovery rates were also low. Despite these limitations, the study demonstrated a significant removal of oocysts in the WSP with baffled maturation ponds (Sheludchenko et al., 2016). This system comprised a primary facultative pond, a baffled maturation pond with retention times of 12 to 20 days, two constructed wetlands, and three reef beds. On the other hand, one of the non-baffled systems was deeper than expected, with marked stratification and high turbidity. These factors, together with a short (40 min) retention time, caused reduced light penetration and reduced mixing, which led to poor reduction of oocysts. No reduction in the number of Cryptosporidium oocysts was observed in another non-baffled system consisting of a primary facultative pond, three consecutive maturation ponds of similar size with an estimated retention time of 3.3 days each, and a final evaporation pond. The high log10 removal of microorganisms, including protozoan (oo)cysts, observed in the baffled system was attributed to the longer retention time of its maturation ponds. Verbyla et al. (2017) indicated that some of the most important factors influencing pathogen removal efficiency in WSPs include hydraulic retention time and efficiency, water clarity, pond depth, sunlight exposure and penetration, temperature, and pH.

3.1.2.1 Aerated lagoons

Aerated lagoons are commonly used for biotreatment of municipal and industrial wastewaters such as pulp and paper mill effluents. Such lagoons harbor complex microbial communities, which are selected by the physicochemical properties of the wastewater, the design and operation of the lagoon, and the ambient environmental conditions (Yu and Mohn, 2001). One study evaluated the reduction of Cryptosporidium oocysts in two sewage treatment plants in Malaysia, one employing extended aeration (EA, plant A) and the other an aerated lagoon (AL, plant B) (Lim et al., 2007). The concentration of Cryptosporidium oocysts in raw sewage ranged from 1 to 80 oocysts/L. In treated sewage, Cryptosporidium was found in 3 of 12 (25%) samples at both plants, with concentrations ranging from 20 to 40 oocysts/L for plant A and 40 to 80 oocysts/L for plant B. However, the statistical analysis showed that the extended aeration process provided significantly greater reduction of Cryptosporidium oocysts (0.57 log10) than the aerated lagoon process (0.17 log10). In the extended aeration system, air is introduced as fine bubbles through submerged diffusers, which promotes higher oxygen transfer efficiency, whereas the aerated lagoon system relies on surface aerators to provide aeration.

3.1.3 Constructed wetlands

Constructed wetlands are a sustainable tertiary treatment alternative that uses natural processes involving wetland vegetation, soils, and their associated microbial assemblages to improve water quality (https://www.epa.gov/wetlands/constructed-wetlands). Wetland behavior and treatment efficiency are mainly linked to macrophyte composition, substrate, hydrology, surface loading rate, influent feeding mode, microorganism availability, and temperature (Almuktar et al., 2018). The main mechanisms by which microorganisms and other pollutants are removed in constructed wetlands have been described in previous studies (Faulwetter et al., 2009; García et al., 2010; Saeed and Sun, 2012). Filtration and adsorption to root-substrate complexes and the associated biofilm have been shown to be the main removal mechanisms for intestinal parasites in natural wastewater treatment systems such as wetlands (Quinonez-Diaz et al., 2001; Reinoso et al., 2011). Other removal mechanisms may include sedimentation and predation (Quinonez-Diaz et al., 2001; Stott et al., 2003; Reinoso et al., 2008b). The two major types of constructed wetlands are subsurface flow (SSF) and free water surface (FWS) or surface flow systems. FWS systems have plants grown in soil or another medium with visible water, typically 0.15 to 0.6 m in depth. In SSF configurations, the wastewater flows through the media below the surface level and there is little, if any, visible water (Quinonez-Diaz et al., 2001). A pilot-scale constructed wetland consisting of two cells, one planted with bulrush and the other unplanted bare sand, was used to compare their efficiency in removing Cryptosporidium oocysts from raw sewage (Quinonez-Diaz et al., 2001). Overall, >2 log10 removal was accomplished by both the planted cell (surface flow) and the unplanted cell. However, the study reported greater removal in the surface flow of the planted cell than in the bare sand, which was attributed to enhanced sedimentation or adsorption due to the presence of plants (bulrush) in the system.

In another study, two multi-species wetlands, one receiving unchlorinated secondary effluent and the other potable (disinfected) groundwater, were used to assess the removal of Cryptosporidium oocysts (Thurston et al., 2001). The unchlorinated secondary-treated wastewater had previously been treated by passage through a duckweed-covered pond. Each wetland had a retention time of 4 days. No Cryptosporidium oocysts were detected in the influent or effluent of the potable water-supplied wetland. In the wastewater-supplied wetland, the level of Cryptosporidium oocysts ranged from 0.09 to 10.8/L in the influent and from <0.1 to 4.99/L in the effluent, corresponding to reductions of 0.17 to 1.7 log10.

A combined constructed wetland formed by a facultative pond (FP), an FWS wetland and an SSF wetland was studied to evaluate its efficiency in removing Cryptosporidium and indicator microorganisms and to determine the relationships between them (Reinoso et al., 2008b). Overall, the wetland system effectively removed all microorganisms studied. The SSF wetland was significantly more efficient in removing Cryptosporidium oocysts (1.98 log10) than the FWS wetland (0.28 log10) and the facultative pond (0.62 log10). No Cryptosporidium oocysts were detected in the final effluent, indicating a cumulative removal of >2.9 log10 across the treatment system. Significant correlations were found between Cryptosporidium and faecal indicators in the influent of the treatment system, but not at the other sampling points, suggesting that such relationships varied along the system because of the different survival rates of the microorganisms. The study demonstrated that the combined use of different natural wastewater treatment systems removed significant numbers of pathogenic and indicator microorganisms; however, these reductions were not sufficient for safe reuse of the final effluent.

3.1.4 Combined sewer overflows – treatment of fecally polluted stormwater

Combined sewer systems collect rainwater runoff, domestic sewage, and industrial wastewater in a single pipe. Under normal conditions, such a system transports all of the wastewater it collects to a sewage treatment plant for treatment before discharge to a water body. However, when the volume of wastewater exceeds the capacity of the collection system or treatment plant (e.g., during heavy rainfall or snowmelt), the excess flow is discharged directly to nearby streams, rivers, and other water bodies as a combined sewer overflow (CSO), causing adverse environmental impacts. CSOs contain untreated or partially treated human and industrial waste, toxic materials, and debris as well as stormwater. The USEPA has identified CSOs as a priority water pollution concern for the nearly 860 municipalities across the U.S. that have combined sewer systems (https://www.epa.gov/npdes/combined-sewer-overflows-csos).

One study investigated the occurrence of Cryptosporidium in an urban drainage during dry weather compared with its occurrence in a CSO end-of-pipe discharge, in order to determine the loading potential and the potential human health impacts (Gibson et al., 1998). Cryptosporidium oocysts were commonly observed in the urban stream during dry weather conditions, at concentrations of 5 to 105 oocysts/100 mL. During wet weather, the CSO end-of-pipe discharge contained high levels of Cryptosporidium, ranging from 250 to 40,000 oocysts/100 L. The study demonstrated that CSOs can contribute significantly to the load of Cryptosporidium in ambient waters used for recreation and potable water supply.

Another study investigated the impact of CSOs on the microbial quality of the Chicago Area Waterway System (CAWS) (Rijal et al., 2011). Dry and wet weather samples were collected upstream and downstream of three water reclamation plants (WRPs) that discharge secondary-treated wastewater effluent into the CAWS. Cryptosporidium enumeration included assessment of infectious oocysts by cell culture. The concentrations of oocysts across all samples were generally very low (0.5 to 0.6 oocysts/L), with as few as two oocysts, if any, detected in each sample analyzed. No infectious oocysts were detected in dry weather CAWS samples. Overall, the combined wet and dry weather percentage of samples with infectious foci was approximately 2.4% (3 of 125 samples [75 dry weather and 50 wet weather] contained infectious foci). The study found that pathogen loads in the CAWS, and the associated risks of gastrointestinal illness to recreational users, were primarily associated with wet weather inputs. Because wet weather inputs were the largest source of microbial pathogen load to the CAWS, the study concluded that disinfecting the effluents of the three major WRPs discharging to the CAWS would result in an extremely small reduction in the aggregate recreation-season risk to incidental-contact recreators.

3.1.5 Wastewater treatment and resource recovery facilities

Table 7 summarizes the results from studies conducted worldwide at different wastewater treatment plants that provide different levels of treatment for removal/inactivation of Cryptosporidium from sewage. The removal efficiency of Cryptosporidium oocysts during sewage treatment processes is variable and depends upon the concentration of oocysts in the influent and the level of treatment applied to sewage (Carraro et al., 2000).

The treatment of municipal wastewater for reclamation typically includes biological treatment followed by filtration and disinfection. However, differences in treatment operations, variations in filter designs, and disinfection approaches can produce effluents of varying quality. The effectiveness of full-scale biological treatment, filtration, and disinfection for removal of Cryptosporidium oocysts and other pathogens was compared at six water reclamation facilities that produce reclaimed water for nonpotable urban applications (Scott et al., 2004). The relative impacts of loading conditions, process design, and operating parameters on the removal/inactivation of oocysts were evaluated. In the influent, Cryptosporidium was detected in 74% of samples, at concentrations ranging from 1 to 100/L. In secondary effluents, the concentration of oocysts ranged from 0.1 to 10/L across all facilities, with an overall detection rate of 84%. Cryptosporidium oocysts were detected in 39% of the final reclaimed effluents at levels between 0.46 and 11.4/L. Overall oocyst removal achieved through wastewater treatment ranged from 0 (no removal) to 2 log10. Oocyst removal was associated with enhanced nitrification and deep-bed filtration. In addition, operation of biological treatment at higher mixed liquor suspended solids concentrations and longer mean cell residence times (MCRTs) tended to result in increased removal of Cryptosporidium and other pathogens. In contrast, facilities with the shortest MCRTs, anthracite monomedia filtration, and low chlorine residuals had the poorest finished water quality with respect to Cryptosporidium. Infectivity assays suggested that the proportion of infective/viable oocysts remained unchanged throughout treatment; however, the concentration of infective oocysts did decrease with increasing degree of treatment. The study demonstrated the importance of integrating microbiological monitoring with control factors associated with process design and operations as a more robust approach for ensuring the safety of reclaimed water.

3.1.5.1 Primary/preliminary treatment

Primary treatment is inefficient for removal of Cryptosporidium from sewage (Whitmore and Robertson, 1995; Carraro et al., 2000; Omarova et al., 2018), and studies have reported reductions of only 0.12 to 0.18 log10 (Payment et al., 2001; Zhang et al., 2008; Fu et al., 2010). Primary treatment is not intended for removal of oocysts; it is essentially designed for removal of large solids and grit by screening (preliminary treatment) combined with settling and sedimentation tanks for removal of suspended solids and some organic matter (Gerba and Pepper, 2015).

Dissolved air flotation (DAF) is a more recent innovation for removing suspended solids from sewage, which is now being introduced into new wastewater treatment plants as an alternative to conventional primary sedimentation processes. DAF clarifiers remove suspended solids more rapidly than does conventional primary sedimentation and are cost effective from an engineering standpoint (Gerba and Pepper, 2015). However, recent studies have demonstrated that the use of DAF instead of sedimentation tanks did not result in more efficient removal of oocysts from sewage (Schmitz et al., 2018).

3.1.5.2 Secondary treatment

The majority of protozoan removal occurs during secondary treatment, as indicated by earlier and more recent studies (Stadterman et al., 1995; Rodriguez-Manzano et al., 2012; Schmitz et al., 2018). Secondary treatment processes (e.g., activated sludge, trickling filters, waste stabilization ponds) commonly involve biological treatment of primary effluent in a trickling filter bed, an aeration tank or a sewage lagoon (Gerba and Pepper, 2015). The activated sludge process and anaerobic digestion play a major role in the removal and inactivation of oocysts during secondary treatment (Stadterman et al., 1995). However, secondary treatment performance has been reported to be highly variable and, by some accounts, ineffective in removing Cryptosporidium (Chauret et al., 1999). This is largely attributed to the low sedimentation velocities of protozoan oocysts relative to the settling conditions of the process (Carraro et al., 2000). Removal of oocysts during aerobic wastewater treatment and anaerobic digestion of sludge at the wastewater treatment plant in Ottawa was relatively inefficient (1.40 log10) (Chauret et al., 1999). King et al. (2017) reported highly variable oocyst removals, ranging from 0.21 to 3.27 log10, for a variety of treatment processes at 5 sewage treatment plants in Australia. Removal across secondary treatment processes was seasonal, with poorer removals associated with inflow variability. Lagoon systems (waste stabilization ponds) incorporated into the treatment train demonstrated significant reduction of oocysts, ranging from 1.0 to 3.5 log10. Unlike the other secondary treatment processes evaluated in the study (sedimentation plus activated sludge, or intermittently decanted extended aeration reactor tanks), which facilitated only physical removal of oocysts, the lagoon systems achieved substantial removal and inactivation of oocysts. Similarly, Fu et al. (2010) reported increasing removal of Cryptosporidium oocysts for conventional activated sludge (AS), anaerobic-anoxic-oxic (A-A-O) treatment and oxidation ditch processes, corresponding to 1.52 (AS), 1.79 (A-A-O) and 2.17 log10, respectively. Additional studies have reported average removal efficiencies of Cryptosporidium spp. between 0.08 and 0.87 log10 across combined primary and secondary treatment (Omarova et al., 2018). Rodriguez-Manzano et al. (2012) reported removal of oocysts between 1.28 and 1.37 log10 for primary and secondary treatment involving screening and grit removal, primary sedimentation, anaerobic digestion, and secondary sedimentation.
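
To put the sedimentation argument in perspective, Stokes' law gives a rough estimate of how slowly a free (unattached) oocyst settles. The sketch below uses assumed, representative values for oocyst size and density; these figures are illustrative and are not taken from the cited studies.

    # Stokes' law settling velocity of a free (unattached) oocyst.
    # Diameter and density are assumed, representative values, not data from the cited studies.
    g = 9.81        # gravitational acceleration, m/s^2
    d = 5e-6        # oocyst diameter, m (~5 micrometres)
    rho_p = 1050.0  # assumed oocyst density, kg/m^3
    rho_w = 1000.0  # water density, kg/m^3
    mu = 1.0e-3     # dynamic viscosity of water at ~20 degrees C, Pa.s

    v = g * (rho_p - rho_w) * d ** 2 / (18 * mu)  # settling velocity, m/s
    print(f"{v:.1e} m/s (~{v * 86400 * 100:.0f} cm/day)")
    # Roughly 7e-7 m/s, i.e. only a few centimetres per day, so unattached oocysts
    # are removed mainly when they become associated with larger, denser sludge flocs.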

The variation in the occurrence and removal of faecal indicators, Cryptosporidium oocysts, Giardia cysts and enteric viruses was investigated in four Swedish secondary wastewater treatment plants representing different climatic zones (Ottoson et al., 2006). All four WWTPs were operated with chemical precipitation and activated sludge treatment; some facilities included additional nitrogen removal and others rapid sand filtration. Cryptosporidium oocysts were present in 5 of 19 untreated sewage samples, with a mean of 20 oocysts/L. The number of Cryptosporidium oocysts was significantly lower than that of Giardia cysts in wastewater, reflecting a difference in disease incidence. The overall removal of oocysts ranged from 0.85 to >1.52 log10, with a mean of 1.18±0.32 log10. No correlations were found between Cryptosporidium oocysts and faecal indicators, and no significant difference in removal capacity between plants could be demonstrated.

The occurrence of Cryptosporidium oocysts in the effluents of 13 municipal wastewater treatment plants (MWTPs) located in the Lublin region of eastern Poland was investigated (Sroka et al., 2013). Secondary treatment consisted of activated sludge, or activated sludge combined with enhanced removal of nitrogen and phosphorus. Cryptosporidium oocysts were found in 61% of treated effluents at concentrations ranging from 2.2 to 154.1 oocysts/L. The overall oocyst removal reported in this study was 0.9 log10, demonstrating that the conventional wastewater treatment processes applied at the different MWTPs were not effective for removal of oocysts.

Trickling filters are also part of the secondary treatment of sewage. During the process, the primary effluent is pumped through an overhead sprayer onto the filter bed, where bacteria and other microorganisms have formed a biofilm on the filter surfaces. These microorganisms intercept the organic material as it trickles past and decompose it aerobically (Gerba and Pepper, 2015). Trickling filters have been reported to achieve oocyst removal of 0.2 log10 or 0.3 log10 after sedimentation (Stadterman et al., 1995). Moreover, no differences in oocyst removals have been found between wastewater facilities using trickling filters and activated sludge (Kitajima et al., 2014).

New technologies such as the Bardenpho process have exhibited higher removals of oocysts than facilities utilizing activated sludge or trickling filters (Schmitz et al., 2018). The five-stage Bardenpho (anaerobic, anoxic, oxic, anoxic, oxic) process is an advanced modification of the activated sludge process, which results in nutrient removal of nitrogen and phosphorus via microbial processes in a multistage biological reactor. Bardenpho systems are designed similarly to activated sludge by recycling mixed liquor, but incorporate two aerobic (oxic) stages for nitrification and two anoxic stages for enhanced denitrification. By incorporating an initial anaerobic stage, phosphorus removal is enhanced as microbes release the nutrient, then take it up for cell functions in subsequent aerobic/oxic conditions. Log removals of Cryptosporidium corresponding to 1.67±0.39 log10 and 1.52±0.54 log10 were found in wastewater treatment facilities utilizing the Bardenpho process while 0.96±0.38 log10 and 0.76±0.22 log10 were found in facilities utilizing trickling filters and activated sludge, respectively (Schmitz et al., 2018).

Membrane bioreactors (MBRs) are among the relatively new treatment technologies suitable for wastewater reclamation and reuse (Lefebvre et al., 2013; Sari Erkan et al., 2018). The MBR process consists of a biological reactor integrated with microfiltration or loose ultrafiltration membranes (pore sizes ranging from 0.4 to 0.02 µm) that combine the clarification and filtration steps of an activated sludge process into a simplified, single-step process (Hai et al., 2014; Zhang et al., 2019). Because the membrane is highly selective, blocking the passage of activated sludge flocs, living bacteria, and particles, the quality of MBR-treated water is comparable to that of water from a tertiary wastewater treatment plant (Sari Erkan et al., 2018). MBRs offer many advantages over conventional treatment technologies, including a reduced footprint and lower sludge production through maintaining a high biomass concentration in the bioreactor. The system is also capable of handling wide fluctuations in influent quality, and the effluent can be reused directly for nonpotable purposes because filtration efficacy is such that a high-quality product water is generated (Chang et al., 2002). Studies have shown that MBRs can consistently achieve efficient removals of suspended solids, protozoa, viruses and bacteria (Pellegrin et al., 2018). However, protozoan pathogen removal efficiencies in MBRs have been reported only for Giardia, with log10 removal values of >3.3 (Katz et al., 2017). Studies have indicated that a 4 log10 Cryptosporidium removal credit for MBRs is required for indirect potable reuse applications, while other studies have indicated that MBRs would need to be granted at least 2.5 log10 credits for Cryptosporidium unless an alternative disinfection process such as UV is used downstream of the MBR process (Hirani and Jacangelo, 2017).

3.1.5.3 Tertiary treatment

Tertiary treatment includes additional processes to further remove pathogens, residual suspended solids and dissolved constituents. Most processes involve some type of physicochemical treatment, such as coagulation, filtration, activated carbon adsorption of organics, reverse osmosis and additional disinfection (Gerba and Pepper, 2015). Among the wide range of tertiary and advanced treatment technologies for effluent, the most widespread are tertiary filtration and disinfection processes. Tertiary filtration using shallow or deep-bed filters may contribute up to about 2 log10 removal of oocysts; however, the level of reduction depends largely on the combination and type of primary, secondary and tertiary treatment applied, as demonstrated in previous studies (Gennaccaro et al., 2003; Quintero-Betancourt et al., 2003; Harwood et al., 2005; Rodriguez-Manzano et al., 2012; Schmitz et al., 2018). Moreover, infectious oocysts have been found in tertiary-treated effluents that had undergone sand filtration and chlorination (Gennaccaro et al., 2003; Quintero-Betancourt et al., 2003; Harwood et al., 2005).

3.1.5.4 Coagulation

Coagulation and flocculation constitute an important step in several wastewater treatment processes. A common example is chemical phosphorus removal; another, in overloaded wastewater treatment plants, is chemically enhanced primary treatment to reduce suspended solids and organic loads on primary clarifiers (https://www.iwapublishing.com/news/coagulation-and-flocculation-water-and-wastewater-treatment).

In general, coagulation and flocculation followed by dissolved air flotation (DAF) for clarification have achieved average removals of 2.1 log10 for Cryptosporidium. Optimum coagulation conditions appear to be governed by turbidity and natural organic matter removal requirements rather than by pathogen removal. Previous studies have reported up to 3.41 log10 reduction of oocysts after chemical lime treatment preceded by primary treatment, activated sludge, and second-stage carbonation (Rose et al., 2001). Removal of Cryptosporidium by clarification (DAF and lamella sedimentation) combined with dual-media filtration was also evaluated under challenge conditions of high oocyst levels; DAF and filtration together achieved average removals of >5 log10, comparable to those achieved by sedimentation and filtration (Edzwald et al., 2000; Edzwald, 2010).

3.1.5.5 Membrane technologies

Membrane processes have been widely applied in wastewater treatment to reclaim water, recover resources or even produce energy. Membrane processes can be driven by pressure (microfiltration, ultrafiltration, nanofiltration, and reverse osmosis), electricity (electrodialysis or other electro-membrane processes), a concentration gradient (forward osmosis) or a thermal gradient (membrane distillation and membrane evaporation) (Zhang et al., 2019).

Studies have indicated that physical straining is the primary mechanism for removal of protozoan oocysts from the feed water during microfiltration and ultrafiltration (Jacangelo et al., 1995). A wastewater tertiary treatment based on membrane ultrafiltration fed with secondary-treated municipal wastewater was evaluated for its efficiency in removing Cryptosporidium oocysts (Lonigro et al., 2006). The results showed that the membrane filtration system was useful for removing Cryptosporidium from wastewater; however, the measurable removals were limited by the low oocyst numbers in the feed and the permeate. The presence of oocysts in the permeate suggested the need for a complete examination of the filtration plant and the membrane module for possible failures, since the nominal pore size of the membrane (0.03 µm) should guarantee complete oocyst removal.

3.1.5.6 Sludge management

Various treatments are used to remove water and reduce viable pathogen loads in sludge. Treatments for pathogen reduction range in complexity and may include air drying, composting, aerobic or anaerobic digestion, and various thermal and chemical treatments. The resulting biosolids are rich in nutrients and organic matter, and are therefore frequently applied to agricultural land as fertilizer or otherwise disposed of. However, concerns exist over the land application of biosolids because of the potential spread of pathogens and other contaminants. A survey of nine wastewater treatment plants in the US found a Cryptosporidium prevalence of 38% in minimally treated Class B biosolids, with concentrations ranging between 35 and 1,867 oocysts per dry gram based on immunofluorescence assay microscopy (Rhodes et al., 2015). In Spain, investigators found Cryptosporidium oocysts in treated biosolids from all five wastewater treatment plants studied, with average concentrations ranging between 7 and 192 oocysts per gram based on immunofluorescence assay microscopy (Amorós et al., 2016). A study in Finland examined the persistence of Cryptosporidium in biosolids produced using a variety of composting, quicklime and peat treatments at 22 wastewater treatment plants (Rimhanen-Finne et al., 2004). In that study, Cryptosporidium oocysts were found by direct or immunofluorescence assay microscopy in 37% of 10-week-old compost samples and in 10% of 30-week-old compost samples; the latter had a maximum concentration of 1 oocyst/450 mg of biosolids. Additional studies have demonstrated reductions of the Cryptosporidium load associated with the sludge activation process, with removal efficiencies ranging from 0.46 to 1 log10 (Graczyk et al., 2007), yet viable oocysts were still detected in sewage sludge end products. These studies indicate that sludge treatments such as anaerobic and aerobic digestion, lime stabilization and heat drying do not eliminate Cryptosporidium oocysts. Composting of wastewater treatment sludge appears to be an attractive alternative to traditional disposal methods, since it is the only stabilization treatment reported to be capable of eliminating Cryptosporidium oocysts (Amorós et al., 2016).

3.2 Disinfection

3.2.1 Chemical and physical disinfection (chlorine, ozone, ultraviolet radiation, advanced oxidation processes)

Chlorine is the most widely used disinfectant for municipal wastewater because it destroys microorganisms by oxidizing cellular material. However, Cryptosporidium oocysts are extremely resistant to chlorine, and the standard chlorine-based disinfection typically applied in wastewater treatment plants (0.5 to 1.5 mg/L) does not inactivate oocysts. For instance, viability was not affected by exposure to 1.05% and 3% chlorine (as sodium hypochlorite) for up to 18 h (Korich et al., 1990). For 2 log10 (99%) inactivation of C. parvum oocysts, the required chlorine concentration × contact time (CT) product can be as high as 7,200 mg·min/L (Korich et al., 1990). In addition, a CT of 15,300 mg·min/L (20 mg/L chlorine for 12.75 h at pH 7.5) is required for inactivating Cryptosporidium in pools (Suppes et al., 2016).
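
The CT figures quoted above are simple products of residual concentration and contact time. The short sketch below (illustrative only) reproduces that arithmetic and shows why the chlorine residuals typically maintained in wastewater treatment cannot realistically achieve oocyst inactivation.

    # CT = disinfectant residual (mg/L) x contact time (min)
    def ct(residual_mg_per_L, minutes):
        return residual_mg_per_L * minutes

    print(ct(20, 12.75 * 60))   # pool hyperchlorination: 20 mg/L for 12.75 h -> 15,300 mg.min/L
    # Contact time needed to reach a 7,200 mg.min/L CT at a 1 mg/L residual:
    print(7200 / 1 / 60, "h")   # 120 h, far beyond any practical chlorine contact basin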

Alternative disinfection systems, such as ultraviolet (UV) radiation and ozonation (O3), have been used worldwide to disinfect wastewater (Wojtenko et al., 2001; Paraskeva and Graham, 2002; Turtoi, 2013). Studies have indicated that oocysts are irreversibly inactivated by UV radiation (Rochelle et al., 2005). Three log10 inactivation of C. parvum oocysts has been demonstrated at germicidal UV doses greater than approximately 25 mJ/cm2 for both medium-pressure and conventional low-pressure lamps (Craik et al., 2001). In another study, a 15-month survey of Cryptosporidium oocyst occurrence was conducted at ten US wastewater treatment plants, and Cryptosporidium oocysts were found in all wastewater matrices from raw sewage to tertiary effluents (Clancy et al., 2004). However, the study demonstrated that low doses of UV radiation (3 mJ/cm2) from either low- or medium-pressure UV lamps (250 to 270 nm range) achieved greater than 3 log10 inactivation of Cryptosporidium oocysts in wastewater effluent, as measured by cell culture, and no evidence of oocyst DNA repair was observed. The study indicated that, while the occurrence of Cryptosporidium oocysts in wastewater effluents was common, UV was highly effective for oocyst inactivation in wastewater.
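
For context, a UV dose (fluence) is the product of the germicidal irradiance delivered to the water and the exposure time. The irradiance and exposure values in the sketch below are hypothetical, chosen only to illustrate the units behind the ~3 mJ/cm2 and ~25 mJ/cm2 doses cited above; they are not drawn from the cited studies.

    # UV dose (fluence) = germicidal irradiance x exposure time:
    #   dose [mJ/cm^2] = irradiance [mW/cm^2] x time [s]
    def uv_dose(irradiance_mW_cm2, seconds):
        return irradiance_mW_cm2 * seconds

    print(uv_dose(0.1, 30))   # e.g., 0.1 mW/cm^2 for 30 s ->  3 mJ/cm^2
    print(uv_dose(0.5, 50))   # e.g., 0.5 mW/cm^2 for 50 s -> 25 mJ/cm^2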

Ozone is one of the strongest and fastest-reacting disinfectants for treating wastewater (Wojtenko et al., 2001). A study evaluated the efficacy of UV and ozone treatment for reduction of Cryptosporidium in CSOs; the results indicated that, at the doses used, disinfection with ozone was more effective than UV irradiation. Log10 reductions of 1.1±1.9 and 1.3±1.8 were observed with ozone disinfection, whereas a disinfection effect of UV radiation could not be demonstrated in this study (Tondera et al., 2015).

In addition, advanced oxidation processes (AOPs), based on highly reactive oxidants, have been proposed and applied to reduce the health risks posed by Cryptosporidium in water (Nasser, 2019). AOPs are based on the in situ generation of strong oxidants for the oxidation of organic compounds. These include processes based on hydroxyl radicals (·OH), which constitute the majority of available AOPs, as well as processes based on other oxidizing species such as sulfate or chlorine radicals. AOPs involving ozonation and UV radiation are already well established and operated at full scale in drinking water treatment and water reuse facilities, and studies of numerous emerging AOPs for water treatment (e.g., electrochemical AOPs, plasma, electron beam, and ultrasound- or microwave-based AOPs) are constantly being reported (Miklos et al., 2018). Recent studies have reported 1.89 log10 and 2 log10 inactivation of Cryptosporidium by photocatalysis with titanium dioxide (TiO2) alone or in combination with UV, respectively (Sunnotel et al., 2010; Abeledo-Lameiro et al., 2016). Although these processes are mainly designed for attenuation of trace organic chemicals (e.g., pharmaceuticals, consumer products, and industrial chemicals) during water and wastewater treatment, reduction of microbial pathogens is also feasible.
