Chapter 5

Microbiological methods are used in all steps of the risk assessment process, and while there has been a dramatic evolution from traditional cultivation methods to genomic approaches [1], the application of some of the new methods within QMRA has not seen as rapid a transition as in other fields. The purpose of this chapter is not to provide a comprehensive review of such methods but to provide information on how, when, and why certain methods might be used within the risk framework.

All methods have specific characteristics that are important for QMRA, and many of these criteria are interrelated. In addition, the ability of a method to provide a quantitative measure (count) along with standard errors, rather than a qualitative measure (presence/absence (PA)), is important for QMRA, as this may be one of the most important variables influencing the risk estimates. Finally, viability methods, those that measure an organism's ability to infect, remain of great interest, particularly in the risk management of microorganisms where disinfection or inactivation is a primary approach used for control (Table 5.1).

Viability of an organism has traditionally been based on its ability to grow or reproduce. Bacterial methods typically assess the increase in numbers of the organisms by the formation of colonies on agar media or by growth in a broth. Virus viability has been, and continues to be, addressed by replication in living cells (cell culture) [2], and the viability of protozoa by their ability to excyst (a necessary step in replication), infect animal models, or grow in cell culture [5]. All of these methods require time, which becomes significant particularly for viruses, since it may take several weeks by cell culture methods to establish viability.

Table 5.1 Method Attributes, Usefulness, and Limitations

The speed at which a pathogen is identified can be important for certain applications.
For risk assessment itself, the speed of the method is not critical; however, speed is significant during extreme events (e.g., floods or hurricanes) for water or food, and within treatment plant processing, when assessment of exposure is critical for operational malfunctions or public health decisions. The health impact of microbial pathogens is immediate, unlike that of many chemical contaminants, which may require long-term or lifetime exposure before there is an impact on health (e.g., trihalomethanes produced by chlorination). Ideally, in these situations, methods are needed that can produce results in real time (immediately or within 1–2 h) so that corrective action can be taken before exposure occurs. In reality, even the most rapid methods require at least several hours to 24 h to complete. In these situations, the sensitivity of the method is not necessarily as important as its speed. For example, if it is determined that 1000 Cryptosporidium oocysts per 10 l in the raw intake water of a drinking water treatment plant will exceed the plant's ability to prevent an outbreak or will produce an unacceptable risk to the population consuming the water, a method of greater sensitivity is not needed. Speed becomes critical in this situation, since it allows an operator of the plant to take corrective action to reduce the level of oocysts or to notify public health officials that a "boil water" order may be needed.

The level of sensitivity needed for any given method depends on the expected concentration of organisms at the sampling site and the expected exposure. Highly sensitive methods are not necessary for the detection of enteric pathogens in sewage; only a few liters need to be sampled, and direct exposure to sewage would be expected to be incidental and accidental. In treated drinking water, by contrast, methods capable of detecting pathogens in hundreds or thousands of liters are needed, because exposure to drinking water is direct and occurs daily.
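The link between expected concentration and required sample volume can be made explicit. Assuming organisms are randomly (Poisson) distributed in the water, a minimal sketch of the sample volume needed to capture at least one organism with a given probability might look like the following (the function name and the 95% default are illustrative assumptions, not a standard):

```python
import math

def volume_for_detection(conc_per_liter, p_detect=0.95):
    """Smallest sample volume (liters) giving at least p_detect probability
    of capturing one or more organisms, assuming Poisson-distributed counts:
    P(>=1) = 1 - exp(-c * V)  =>  V = -ln(1 - p_detect) / c."""
    return -math.log(1.0 - p_detect) / conc_per_liter

# Sewage-like concentrations need only a few liters...
high = volume_for_detection(1.0)      # ~3 l at 1 organism per liter
# ...while low pathogen levels demand very large volumes.
low = volume_for_detection(0.001)     # ~3000 l at 1 organism per 1000 l
```

This is why a few liters suffice for sewage while hundreds or thousands of liters must be processed for treated drinking water.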
Because the health effects are pathogen specific, it is necessary to have methods that are both selective and specific. Methods are required that can identify the genus, species, serological type, strain, and virulence of pathogens. Only certain strains of E. coli are capable of producing human illness, and these strains represent only a small proportion of all the E. coli found in the environment. Over 140 enteric viruses are excreted in human feces, but some have never been associated with illness (e.g., reovirus), while others cause very serious illness (hepatitis). Not all strains of waterborne protozoan parasites may be capable of producing illness in humans (e.g., Giardia).

Each of the steps in the QMRA process has specific needs. For the HAZ ID, specificity and sensitivity are of primary concern. In addition, the recognition of new and multiple hazards is ongoing due to emerging infectious diseases; this is being addressed using genomic approaches, including metagenomics and techniques like microarrays [8]. Clinical methods that identify hazards in specimens from infected individuals (e.g., feces, sputum, urine, and mucus) are often used for this portion of the QMRA. For dose–response, highly precise and accurate methods are needed that ideally measure infectivity. Unfortunately, in many dose–response experiments, the various doses are crude approximations with high variability, with the lower doses simply diluted out and estimated. In dosing experiments for many of the parasites, for example, microscopic counts were used with no assessment of infectivity; in other cases, such as for viruses, the methods underestimated the number of virion particles in the dose. In general, it is preferable to have the same methods used for dose–response as for exposure assessment. Exposure assessment is where the method needs have been most apparent, and the exposure data often contribute the greatest uncertainty to the QMRA.
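Why dose quantification matters can be seen in the simplest dose–response model used in QMRA, the exponential model: the probability of infection depends directly on the measured dose, so any systematic undercount of the dose in the original experiment is absorbed into the fitted parameter and carried through to later risk estimates. A minimal sketch (the k value below is illustrative only, not a fitted literature value):

```python
import math

def exponential_dose_response(dose, k):
    """Exponential dose-response model: P(infection) = 1 - exp(-k * dose).
    Assumes each ingested organism acts independently, each with
    probability k of surviving to initiate infection."""
    return 1.0 - math.exp(-k * dose)

# If counts underestimate the true dose 10-fold, a k fitted to those counts
# is ~10-fold too high when later applied to accurately measured exposures.
p = exponential_dose_response(100, 0.004)   # illustrative k
```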
In addition, recovery is a major issue, as often it can be as little as 1%; even in clean waters, at best, recoveries of less than 50% are reported. For example, early methods for the detection of some of the enteric viruses from tap water were capable of 30% recoveries [10]. Standard methods continue to be used, such as those published in Standard Methods for the Examination of Water and Wastewater [12]. However, newer methods using a combination of various techniques, including immunomagnetic capture systems, cell culture, and molecular techniques, are now being applied to air, soil, water, and foods.

Much of the past microbial occurrence data are nonquantitative (quantal methods), reported as PA, and developed with very different protocols and monitoring approaches. These data have limited application for quantitative risk assessment. However, quantal data can be used with statistical approaches that address concentrations through dilutions, called a most probable number (MPN), which deals well with censored data and environmental monitoring [9, 13]. The MPN method is a dilution-to-extinction approach in which the sample and dilutions of the sample are tested in a series of replicates; the dilutions/replicates at which the results turn from positive to negative can be used, along with the volumes tested, to provide a quantifiable MPN result. MPN methods have less precision than direct quantitative methods.

New methods need to be developed, tested, and applied for QMRA and for quantitative assessment of exposure pathways, as well as for control of exposure. Sampling protocols and schemes and the interpretation of the data should also be considered. It is now recognized that quantitative, statistically evaluated databases must be developed, as this is often the major data gap for adequate risk assessments. These must be combined with models for prediction of the transport and fate of microorganisms through the environment and treatment processes.
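The MPN calculation itself is a maximum-likelihood estimate: assuming Poisson-distributed organisms, a tube inoculated with volume v is positive with probability 1 − exp(−cv), and the concentration c that best explains the observed pattern of positives and negatives is found numerically. A sketch of that calculation (the function and search bounds are illustrative; published MPN tables embody the same arithmetic):

```python
import math

def mpn_estimate(volumes_ml, n_tubes, n_positive, lo=1e-6, hi=1e3):
    """Maximum-likelihood MPN (organisms per ml) for a dilution series.
    Assumes Poisson-distributed organisms, so a tube inoculated with
    volume v is positive with probability 1 - exp(-c*v)."""
    def neg_log_lik(c):
        total = 0.0
        for v, n, p in zip(volumes_ml, n_tubes, n_positive):
            if p:                                   # positive tubes
                total -= p * math.log(1.0 - math.exp(-c * v))
            total += (n - p) * c * v                # negatives: -log(exp(-c*v))
        return total
    # the negative log-likelihood is convex in c, so a ternary search suffices
    for _ in range(200):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if neg_log_lik(m1) < neg_log_lik(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

# Classic 5-tube series at 10, 1, and 0.1 ml with 5, 3, and 0 positive tubes:
mpn = mpn_estimate([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 0])
# mpn * 100 is close to the tabulated value of 79 per 100 ml
```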
Predictive microbiology, a rapidly developing field, will also be able to fill some of the data gaps on exposure. Quantitative microbial risk assessment has primarily focused on bacterial, protozoan, and viral pathogens; however, other hazards are emerging, including algal toxins, fungi, and noncultivatable microbes, particularly viruses. Critical to addressing these new hazards is an understanding of which existing and new methods are to be used. Overall, methods are available to isolate and identify bacteria, fungi, protozoa, and viruses, as well as microbiological toxins, from environmental samples [1, 14]. Table 5.2 describes the multitude of methodological approaches that could be used for various microbes and the units of measurement. It should be recognized that currently many pathogens are being detected, identified, and quantified using molecular techniques, in particular PCR and more specifically qPCR. The units in these cases are gene copies per equivalent volume, or the gene copies can be translated to cells or virions per equivalent volume; this translation also depends on the number of gene targets per cell.

Table 5.2 Microorganisms, Methods, and Units for Measurement

aViable but nonculturable

Exposure assessment is part of a complete process in which data are obtained by monitoring the sources of the microbial hazards over time, the transport of each hazard from the source to the exposure point, and the air, water, or food at the point of exposure: from farm to fork and from source water to tap water, including the final food product prior to consumption, the glass of water from the tap, or the aerosol that is inhaled. The changes in concentration and viability along the way are of great interest, but it is difficult, and often impossible, to assess all the various concentrations of the microbial hazards along the transport chain.
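The translation from a qPCR signal to organism numbers is simple arithmetic, but every factor in it must be known or assumed. A minimal sketch (the parameter names and example values, including the assumed seven gene targets per cell, are illustrative; method recovery losses are not included here):

```python
def qpcr_to_cell_equivalents(copies_per_reaction, template_ul, extract_ul,
                             sample_volume_l, targets_per_cell):
    """Convert a qPCR result to cell (or virion) equivalents per liter.
    Scales the per-reaction result up to the whole nucleic acid extract,
    then back to the original sample volume, and divides by the number
    of gene targets carried by each cell or virion."""
    copies_in_extract = copies_per_reaction * (extract_ul / template_ul)
    copies_per_liter = copies_in_extract / sample_volume_l
    return copies_per_liter / targets_per_cell

# 200 copies detected in a reaction using 5 ul of a 100 ul extract from
# 1 l of water, assuming 7 gene targets per cell:
cells = qpcr_to_cell_equivalents(200, 5, 100, 1.0, 7)   # ~571 cell equivalents/l
```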
Unlike chemicals, microorganisms act as particles, and their concentrations in water, soil, air, and food and on surfaces are neither normally nor homogeneously distributed. Microorganisms can change in concentration through die-off or regrowth over time. The sources of the microorganisms (e.g., animal wastes or sewage) also vary in concentration over time (e.g., seasonal and climatic influences). Finally, many controls have already been implemented (e.g., disinfection) to reduce concentrations and exposure. Therefore, other strategies have been developed for assessing exposure and developing occurrence databases for microorganisms, their sources, their transport and fate, and their reduction through treatment/process controls. These include the monitoring of indicators as well as pathogens. These approaches draw on field data (surveys) as well as laboratory-based experimental data and the use of models for evaluating transport (e.g., subsurface migration) and fate (e.g., inactivation rates). Ecosystem studies are necessary for most microorganisms (e.g., Legionella in biofilms and its release during aerosolization), and more ecosystem modeling is needed.

In the area of food safety, the concept of farm to table is being used to follow microbial contaminants from their source on the farm through harvest and production to the final packing of the food product. For drinking water, a similar system based on watershed assessment, drinking water treatment efficacy, and distribution system integrity is being promoted. In most of these cases, an understanding of the transport patterns, survival, and regrowth of the microorganisms must be gained, and monitoring data must be developed to support the likelihood of exposure through the various pathways. Therefore, the evaluation of exposure will require extensive development of a variety of databases and models.
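Fate models of the kind mentioned above are often as simple as first-order (Chick's law) kinetics, in which a constant fraction of the population is inactivated per unit time. A sketch (the rate constant used in the example is illustrative, not a literature value):

```python
import math

def first_order_survival(n0, k_per_day, t_days):
    """First-order die-off (Chick's law): N(t) = N0 * exp(-k * t)."""
    return n0 * math.exp(-k_per_day * t_days)

def log10_reduction(k_per_day, t_days):
    """Log10 reduction achieved after t days at first-order rate k."""
    return k_per_day * t_days / math.log(10)

# With an illustrative k of 0.5/day, 1e6 organisms decline to ~1.4e5 in 4 days,
n4 = first_order_survival(1e6, 0.5, 4)
# which corresponds to roughly a 0.87-log10 reduction.
lr = log10_reduction(0.5, 4)
```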
Table 5.3 gives an example of the data needed for assessing exposure through drinking water.

Table 5.3 Types of Data Needed for Exposure Assessment in Drinking Water

Methods for the detection of bacterial pathogens in clinical specimens were developed near the turn of the last century using liquid broth or solid agar to grow and isolate the bacteria. Selective media and biochemical methods were used to differentiate the specific pathogen, allowing for identification. However, development of methods for the detection of specific microbial hazards in environmental samples, such as food and water, proved to be more problematic: pathogens are present in much lower numbers (usually orders of magnitude less) and not in as robust a state of growth as that found in clinical specimens from infected persons. Historically, environmental methods focused on identifying the pathogens of greatest significance, those causing large community disease outbreaks, namely V. cholerae and S. typhi. Because these were spread in many cases through the fecal contamination of water, the goal was to find a way to assess this pollution and manage it. By the 1900s, environmental microbial quality assessment focused on fecal indicator systems, and this sanitary quality approach extended to food, water, and surfaces. While E. coli was the indicator of choice early on, there was no easy method for its detection, and so the larger umbrella group of coliform bacteria, which were always present in the feces of warm-blooded animals, was adopted (members of the Enterobacteriaceae that ferment lactose to acid and gas). Coliform bacteria could be detected in water within 24 h with a simple test. Application of this indicator played a major role in the reduction of waterborne disease (particularly that caused by enteric bacteria) around the world.
However, as methods developed, it became clear that the specificity of the coliform group was not adequate, and by the 1940s, thermotolerant coliforms were being used (which gained specificity through a higher incubation temperature of 44.5°C). Finally, methods evolved in the 1970s for identifying E. coli specifically. All countries and the WHO now use E. coli for drinking water. While the United States also still uses the coliform bacteria, this is only for addressing distribution system integrity and disinfection efficacy; the more specific E. coli methods are used as the health risk indicator. For recreational waters such as beaches, both E. coli and enterococci are used as indicators to assess safety. The rapid development of simple tests thus focused on assessing and controlling exposure to waterborne and foodborne pathogens using indicator organisms.

Efforts to develop new methods for the detection of enteric pathogens themselves in the environment began in the 1960s, particularly for the enteric viruses, which were known to have very different transport and fate, resistance to disinfection, and health outcomes compared to bacteria. Waterborne outbreaks of Giardia in the 1970s and Cryptosporidium in the 1980s, and later foodborne outbreaks of Salmonella, E. coli O157:H7, and Cyclospora, emphasized the need for new and better methods for direct pathogen detection in water and food.

Traditionally, microorganisms have been isolated by cultivation in the laboratory. While many pathogenic bacteria transmitted through the environment can be cultivated on solid agar media, they do move into a viable-but-nonculturable (VBNC) state; this includes emerging hazards like Helicobacter. In contrast, viruses and protozoan parasites usually require laboratory animals or cell culture to ascertain viability.
A number of environmental conditions, such as low levels of nutrients or toxic substances (e.g., disinfectants, metals, heat, and ultraviolet light), place stresses on many persistent bacteria, creating this nonculturable state. This makes it difficult to isolate the cells using cultivation techniques on selective or other media [15], with only a small percentage of the total viable organisms detected using standard procedures. In the case of viruses, not all of the virions observed under an electron microscope appear capable of infecting the cell cultures used in the laboratory. The ratio of virions to tissue culture infectious units may range from 50,000:1 in the case of rotavirus in children's stools to 100:1 in laboratory-adapted strains of poliovirus [16, 17]. This may be due to several factors: (1) virions containing only part, or none, of the viral nucleic acid, (2) lack of receptors on the host cell for the virus, and (3) the inability of all the virions to find receptors during exposure to the cells. The work of Ward et al. [18] suggests that virions present in the stools of infected persons (which are released into the environment via sewage) are likely to be underestimated by current cell culture methods. Not all pathogens have even been cultivated in the laboratory, and other methods have to be used; this includes many protozoan parasites and viruses (e.g., norovirus). Nonculture methods use specific antibodies labeled with fluorescent probes, which react with pathogen proteins (antigens), or molecular probes, which react with the pathogen genome. More often, however, PCR is used, which targets and amplifies specific nucleic acid sequences. All existing methods for pathogen detection in the environment will underestimate the true exposure because of the inability to recover and detect the microorganisms efficiently; losses occur at each step of the method, from collection to final quantification.
The methods used to detect pathogenic microorganisms in environmental samples involve a series of steps, shown in Figure 5.1. The initial sampling strategy often shapes the overall occurrence database that is developed; the number of samples and their spatial and temporal distribution need to be considered. Microorganisms in water and air are most often concentrated from large volumes by filtration, whereas in solid matrices, including sediments, soils, and foods, microbes are eluted (washed from) or separated from the solids (grams) using buffers, antibody–antigen complexes, or density gradients. Fomites are sampled by increasing the number of locations and the surface area (cm2). After sampling and concentration, some type of purification may be needed, for example, immunomagnetic separation, centrifugation and washing, or further filtration. Ultimately, identification, quantification, and other types of methods for assessing viability (through culture in animals, in cells, or on media) are undertaken. Isolation and further characterization are usually accomplished by cultivation; particularly for bacteria, the presence of virulence genes and antibiotic resistance is of interest, and often, final identification cannot take place without isolation. Losses occur at each step and are influenced by the microbe and the matrix.

Water is most often sampled through filtration. It is desirable in most cases to sample large volumes, up to hundreds of liters, and such sampling is often done on site by passing the water through a device that concentrates the organisms. The microorganisms are most commonly concentrated onto a filter by size exclusion (i.e., a pore size smaller than the organism) or by adsorption [14].
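Because losses accumulate multiplicatively across sampling, concentration, purification, and assay, an observed count can be corrected by the product of the per-step recoveries. A sketch (the step recoveries in the example are illustrative values, not measured ones):

```python
def adjust_for_recovery(observed_count, step_recoveries):
    """Correct an observed count for losses at each method step.
    Overall recovery is the product of the per-step recovery fractions,
    so the corrected estimate is observed / overall recovery."""
    overall = 1.0
    for r in step_recoveries:
        overall *= r
    return observed_count / overall, overall

# e.g., 50% recovery at filtration/elution and 60% at the assay step:
estimate, overall = adjust_for_recovery(12, [0.5, 0.6])
# overall recovery is 0.3, so 12 observed organisms imply ~40 in the sample
```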
Thus, very specific methods have been developed for bacteria, parasites, and viruses, most often tied to drinking water and to the development of rules and regulations for producing safe water by the United States, the EU, and the WHO ([19], http://www.microrisk.com/publish/cat_index_6.shtml). The Information Collection Rule in the United States led the EPA to develop standard methods for monitoring the water entering drinking water facilities for Cryptosporidium, Giardia, and viruses, to be used for QMRA, cost–benefit analysis, and the development of surface water treatment regulations (http://water.epa.gov/lawsregs/rulesregs/sdwa/swtr/). Temporal changes in water quality are well known to be influenced by environmental conditions, particularly rainfall and temperature, but sampling during extreme events and flooding has proven to be difficult.

In wastewater reclamation, sampling is often designed to reflect all possible pathogens, and databases have been developed for fecal indicator bacteria, Cryptosporidium and Giardia, and cultivatable enteric viruses using more conventional cell culture methods [22]. This allows assessment of removal by various water treatment processes, including activated sludge, lime treatment, filtration, and disinfection at full scale. However, wastewater discharges to surface water are not routinely sampled for pathogens, as indicators are generally the basis of the legal permits. Research studies, however, have focused on the collection of small volumes of untreated raw sewage in order to characterize the pathogens being excreted by the community, thus addressing HAZ ID [24]. Biosolids are also examined but are treated as solids (see following text). In recreational waters, sampling has been venue specific: fecal indicators, rather than pathogens, are generally used for natural waters and beaches, although on occasion pathogen-specific sampling has been undertaken to support predictive modeling [28].
Bacteria like the pseudomonads and Legionella have been examined in spas and hot tubs. Swimming pools have generally been deemed safe given chlorine residuals and good maintenance; however, now that Cryptosporidium is of concern, due to its resistance to chlorine and its widespread occurrence in swimming pool outbreaks, there is more interest in sampling these venues. The free-living amoebae have also been monitored in recreational waters, and as of 2011, there has been growing interest in examining drinking water distribution systems, where they are associated with biofilms.

The quality of the water changes the efficiency of concentration and detection of microorganisms. Across tap water, groundwater, river water, lake water, stormwater, and wastewater, a wide range of physical and chemical characteristics of the water affect the methods. Highly turbid, organically laden river water, for example, can interfere with microbial adsorption, cultivation, and microscopic techniques. Even given these limitations, methods have been developed that are capable of detecting organisms in thousands of liters of water. On a weight-to-weight basis with water, the methods for viruses have a sensitivity of detection of 10⁻¹⁸. This extreme sensitivity [most chemical methods are only capable of detecting parts per million (10⁻⁶) or parts per billion (10⁻⁹)] is also another probable reason why variability in the efficiency of detection is observed with these methods.

In the case of solids such as food, soil, biosolids [29], and clothing, the organisms are either assayed directly (most common with bacteria) or extracted. Because of the difficulty of working with large amounts of solids, most methods cannot process more than 100 g. Biosolids have been sampled in particular for viruses and Salmonella via elution followed by culture and PCR [28].
Sampling of inanimate surfaces (fomites) uses a variety of instruments to recover the organisms from the fomite (e.g., a small swab or sponge that is wetted and then placed in contact with the surface), with microbe size and the eluent influencing the recovery [31]. The swab or sponge is then placed in a fluid for extraction of the organisms. Interest in fomites was heightened by the bioterror incident in 2001, when B. anthracis was released through mail envelopes, contaminating government buildings and potentially a post office. The need for QMRA became evident for assessing the decontamination process, yet the data on detection of spores from fomites yielded little information on whether the detection methods could effectively determine whether the environment was clean. Herzog et al. [32] undertook a meta-analysis of the limits of detection of methods for B. anthracis, including for soil, water, and fomites, and the influence this had on understanding risk.

Sampling fomites for QMRA is not done in isolation but often includes sampling of food as well as hands (see Fig. 2.6). Although both approaches could be expressed as CFU/cm2, for example, more chickens and more surface area would be sampled with approach 1. Collection and isolation of bacteria from the wash water could be more efficient than with a swab method; on the other hand, regrowth of bacteria in the wash water is a possibility. The use of approach 1 gave much higher counts, indicating much greater levels of contamination.

Most microorganisms can become airborne through both natural (sneezing) and man-made activities (cooling towers). Respiratory viruses such as influenza and Coxsackie A21 can be effectively transmitted via aerosols generated from infected individuals during sneezing or talking. Aerosols containing L. pneumophila can be generated from a number of sources, including shower heads, air humidifiers, cooling towers, and hot tubs.
These bioaerosols may be short lived or may persist for prolonged periods, but generally they are diluted rather quickly, especially in outdoor environments. Loss of viability for most non-spore-forming human pathogenic bacteria and for viruses may be fairly rapid (minutes to hours), whereas the acid-fast mycobacteria and bacterial and fungal spores survive for long periods. Currently, there are no standard methods for the sampling of aerosols, and no one method is suitable for the collection of all types of microorganisms; the basic methods trap microorganisms in a fluid by bubbling air through the liquid or by impaction directly onto solid agar. Because of this, data generated from different studies are often difficult to compare, which complicates the use of sampling results in risk assessment [33]. Sampling of bioaerosols is more often done to demonstrate the presence of an organism than to quantify the numbers present, and because of limitations in current sampling methods, failure to detect the target organism does not guarantee its absence [33]. A great deal of uncertainty exists, particularly due to the small volume of air sampled over a limited period of time; the location of the sampler and the time of sampling may not coincide with the release of the aerosol. There are four general types of methods used for aerosol sampling:
Analytical Methods and the QMRA Framework: Developing Occurrence and Exposure Databases
Introduction
Attribute | Useful for | Limitation
Quantitative | Exposure assessment, but also needed for regrowth or inactivation data | Temporal and spatial distributions are variable; sample matrix affects precision
Speed | Recreational waters, extreme events, treatment failure, corrective actions | There are no rapid (real-time and sensitive) methods available; currently, qPCR could provide results within 4–6 h
Specificity | Improves HAZ ID | False negatives are a bigger problem
Sensitivity | Needed for the low levels often found in the final product (tap water) | Time and effort are needed to assay large volumes of sample
Viability | Estimating inactivation | May not be possible in the field, so indicators, models, and experimental data may be needed
Microbe Group | Common Methods | Units | Notes
Algae | Microscopic (experts can identify the types by size, shape, and features) | Cells | Indirect: chlorophyll a can be measured, which relates to the amount of algae present
Blue-green toxic algae | Microscopic (experts can identify the types by size, shape, and features) | Cells, but interest is in toxin concentrations (nanograms (ng) or micrograms (µg)) | An enzyme-linked immunoassay produces a color if the toxin is present; read on a spectrophotometer
Bacteria | Cultivation using biochemical tests | CFU or MPN (e.g., Colilert) | Some bacteria are difficult to culture, specific media do not exist for all types, and in the environment many bacteria move into a "VBNC"a state
Parasites | Microscopic (does not determine viability) | Counts of specific life stages, for example, eggs, cysts, oocysts, and larvae | Some parasites can be cultured; many are obligate parasites and may require cell culture or animal models
Viruses | Assays in mammalian cell culture (mosquito-borne viruses can replicate in mosquito cell lines) | PFU or MPN of CPE (infected cell cultures undergo observable morphological changes called CPEs) | Scanning electron microscopy shows there may be 100–1000 virus particles for every culturable unit; many viruses cannot be cultured
Any pathogen | Assay that targets specific genes | Gene copies or cell equivalents | qPCR
Approaches for Developing Occurrence and Exposure Databases
Locations | Examples
The sources in the watershed | Concentrations in sewage discharges
Seasonal changes | Temperature affecting viable levels
Hydrologic changes | Peaks associated with rain
The intake | The levels that enter the plant
Postfiltration | Used to estimate removals
Postdisinfection | Concentrations of viable organisms
Distribution system | Levels associated with tap water and aerosols
The method recovery | Adjust levels based on recoveries
Overview of Methodological Issues
Sampling Water
Sampling Surfaces and Food
Sampling Aerosols