Health Canada

Guidelines for Canadian Drinking Water Quality: Guideline Technical Document - Radiological Parameters

May 2009
ISBN: 978-1-100-16767-1
Cat. No.: H128-1/10-614E-PDF



Part I: Overview and Application

1.0 Guidelines

Maximum acceptable concentrations (MACs) have been established for the most commonly detected natural and artificial radionuclides in Canadian drinking water sources and are listed in the table below. A guideline for radon in drinking water is not deemed necessary and has not been established.

The MACs are based on exposure solely to a specific radionuclide. The radiological effects of two or more radionuclides in the same drinking water source are considered to be additive. Thus, the sum of the ratios of the observed concentration to the MAC for each contributing radionuclide should not exceed 1.
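
As an illustration of this additivity rule, the following minimal sketch (in Python) sums the concentration-to-MAC ratios for a hypothetical sample, using the MACs from the table below; the measured concentrations shown are invented for illustration only.

```python
# Sketch of the additivity rule: the sum of the ratios of each observed
# concentration to its MAC should not exceed 1.
# The sample concentrations below are hypothetical, not survey data.

MACS = {                      # MACs from the table below
    "total uranium": 0.02,    # mg/L
    "Pb-210": 0.2,            # Bq/L
    "Ra-226": 0.5,            # Bq/L
    "tritium": 7000.0,        # Bq/L
    "Sr-90": 5.0,             # Bq/L
    "I-131": 6.0,             # Bq/L
    "Cs-137": 10.0,           # Bq/L
}

def sum_of_ratios(observed):
    """Sum of observed/MAC ratios; each concentration must use the same unit as its MAC."""
    return sum(conc / MACS[nuclide] for nuclide, conc in observed.items())

sample = {"total uranium": 0.006, "Ra-226": 0.15, "Pb-210": 0.04}
total = sum_of_ratios(sample)
print(f"Sum of ratios = {total:.2f}; acceptable: {total <= 1}")   # 0.80 -> acceptable
```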

MAC for Natural and Artificial Radionuclides

Natural radionuclides        MAC                     Artificial radionuclides   MAC
Total uranium (footnote 1)   0.02 mg/L               Tritium (3H)               7000 Bq/L
Lead-210 (210Pb)             0.2 Bq/L (footnote 2)   Strontium-90 (90Sr)        5 Bq/L
Radium-226 (226Ra)           0.5 Bq/L                Iodine-131 (131I)          6 Bq/L
                                                     Cesium-137 (137Cs)         10 Bq/L

Table footnote 1: For information on the chemical aspects of uranium toxicity, the reader should refer to the Guideline Technical Document on uranium (Health Canada, 1999).
Table footnote 2: See Appendix B for an explanation of units.

2.0 Executive summary

Radionuclides are naturally present in the environment; they may also enter the environment as a result of human activities. Natural sources of radiation are responsible for the large majority of radiation exposure (greater than 98%), excluding medical exposure. Additional exposure can result from human activities associated with radioactive materials. This document focuses on routine operational conditions of existing and new water supplies and does not apply in the event of contamination during an emergency involving a large release of radionuclides into the environment.

This Guideline Technical Document assesses the human health risks of radionuclides in drinking water, taking into account new studies and approaches, including dosimetric information released by the International Commission on Radiological Protection (ICRP) in 1996 (ICRP, 1996). Maximum acceptable concentrations in drinking water have been established for three natural (210Pb, 226Ra, and total uranium in chemical form) and four artificial (tritium, 90Sr, 131I, and 137Cs) radionuclides. These represent the natural and artificial radionuclides that are most commonly detected in Canadian water supplies. The MACs are derived using internationally accepted equations and principles and are based solely on health considerations. They are calculated using a reference dose level of 0.1 mSv for 1 year's consumption of drinking water, assuming a consumption of 2 L/day at the MAC.
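
The derivation described above can be written out explicitly as MAC = reference dose level / (dose coefficient × annual water intake). The sketch below reproduces this arithmetic in Python; the ingestion dose coefficients are assumed adult values of the kind published by the ICRP and are included only to illustrate the calculation, not quoted from this document.

```python
# Illustrative MAC derivation: reference dose level (RDL) of 0.1 mSv for one
# year's consumption of drinking water at 2 L/day (730 L/year).
# The dose coefficients (Sv per Bq ingested) are assumed adult values and are
# not quoted in this guideline document.

RDL_SV = 1.0e-4           # 0.1 mSv expressed in Sv
ANNUAL_LITRES = 2 * 365   # 2 L/day for one year

dose_coefficients = {     # Sv/Bq (assumed, for illustration)
    "tritium": 1.8e-11,
    "Sr-90": 2.8e-8,
    "I-131": 2.2e-8,
    "Cs-137": 1.3e-8,
    "Pb-210": 6.9e-7,
    "Ra-226": 2.8e-7,
}

for nuclide, dc in dose_coefficients.items():
    mac = RDL_SV / (dc * ANNUAL_LITRES)          # Bq/L before rounding
    print(f"{nuclide}: unrounded MAC = {mac:.2f} Bq/L")

# Rounding these values gives MACs of the same magnitude as those in the table
# (e.g., ~10.5 -> 10 Bq/L for 137Cs, ~7600 -> 7000 Bq/L for tritium). The MAC
# for total uranium (0.02 mg/L) is not derived this way; see the table footnote
# on the chemical aspects of uranium toxicity.
```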

A guideline for radon is not considered necessary. The health risk from ingesting radon-contaminated drinking water is considered negligible, because most of the radon escapes at the faucet or water outlet, leaving only minimal amounts in the water itself. However, it should be noted that radon levels in drinking water, if sufficiently elevated, can significantly affect airborne radon concentrations.

2.1 Health effects

Various mechanisms are responsible for radiation damage. Exposure to radiation from all sources can result in changes to sensitive biological structures, either directly through the transfer of energy to the atoms within the tissue or indirectly by the formation of free radicals. Since the most sensitive structure in the cell is the deoxyribonucleic acid (DNA) molecule, exposure to radiation may damage the DNA, causing the cells to die or to fail to reproduce. This can result in the loss of tissue or organ function or the development of cancer. The likelihood of these events occurring increases with the amount of radiation received. Types of cancer most frequently associated with radiation exposure include leukaemia and tumours of the lung, breast, thyroid, bone, digestive organs, and skin. These cancers can develop between five years and several decades after exposure. This latency period depends on several factors, including individual sensitivity to radiation exposure, the type of radionuclides to which an individual has been exposed, and the level of the dose and the dose rate. The MACs are based on a reference dose of 0.1 mSv/year, which represents a lifetime excess risk (i.e., above background levels) of both fatal and non-fatal cancers of 7.3 × 10⁻⁶ if an individual is exposed to the MAC for one year.
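
The stated risk figure follows directly from multiplying the reference dose by a nominal cancer risk coefficient; the coefficient used below (7.3 × 10⁻² per sievert for fatal and non-fatal cancers combined) is inferred from the numbers quoted above rather than taken from this document.

```python
# Reference dose for one year at the MAC times a nominal risk coefficient
# (fatal plus non-fatal cancers) gives the stated lifetime excess risk.
reference_dose_sv = 1.0e-4      # 0.1 mSv
risk_per_sievert = 7.3e-2       # nominal coefficient, inferred from the text
print(reference_dose_sv * risk_per_sievert)   # 7.3e-06
```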

2.2 Exposure

The occurrence of natural radionuclides in drinking water is associated most commonly with groundwater. Natural radionuclides are present at low concentrations in all rocks and soils. In the cases where groundwater has been in contact with rock over hundreds or thousands of years, significant concentrations may build up in the water. These concentrations are highly variable and are determined by the composition of the underlying bedrock as well as the physical and chemical conditions prevailing in the aquifer. Although rare, natural radionuclides have also been known to occur in shallow wells.

Increased levels of radionuclides in surface waters may be linked to industrial processes, particularly uranium mining and milling operations, fallout from nuclear weapons testing (mostly before 1963), emissions from nuclear reactors, as well as cosmogenic and other artificial radionuclides. Surface waters in close proximity to point sources may contain higher levels of radionuclides; however, levels in groundwaters are less likely to be influenced by point sources.

2.3 Treatment

Although the establishment of drinking water guidelines for a contaminant usually takes into consideration the ability to measure the contaminant and remove it from drinking water, the MACs for radionuclides are based solely on health effects. However, most radionuclides can be reliably measured to levels below the established MACs.

Most radionuclides, with the exception of tritium, can be effectively treated in municipal-scale treatment facilities, with removal efficiencies ranging from 70 to 99%, depending on the type of treatment. However, for artificial radionuclides such as tritium, the strategy should be to prevent contamination of the source water.

At the residential scale, treatment devices are available for the removal of radionuclides with the exception of tritium, with efficiencies generally similar to those of municipal-scale treatment. However, they cannot necessarily be certified to recognized standards, as standards are not available for all radionuclides. In addition, appropriate authorities should be consulted for the disposal of liquid and solid waste from the treatment of drinking water containing radionuclides.

3.0 Application of the guideline

Note: Specific guidance related to the implementation of drinking water guidelines should be obtained from the appropriate drinking water authority in the affected jurisdiction.

MACs have been established for three natural (210Pb, 226Ra, and total uranium in its chemical form) and four artificial (tritium, 90Sr, 131I, and 137Cs) radionuclides. These represent the most commonly detected radionuclides in Canadian drinking water supplies. Every effort should be made to maintain radionuclide levels in drinking water as low as reasonably achievable. The levels of radionuclides normally encountered in drinking water are far below the threshold for acute effects of radiation. In virtually every case, the MACs are based on chronic or cumulative exposure over a period of one year.

3.1 Monitoring

The sampling and analyses for individual radionuclides should be carried out often enough to accurately characterize the annual exposure. If the source of the radioactivity is known or expected to be changing rapidly with time, then the sampling frequency should reflect this factor. If there is no reason to expect concentrations to vary with time, then sampling may be carried out seasonally, semi-annually or annually. If measured concentrations are consistent and well below the MACs, this would be an argument for reducing the sampling frequency. In contrast, the sampling frequency should be maintained, or even increased, if concentrations are approaching individual MACs or if the sum of ratios of the observed concentration to the MAC for each contributing radionuclide approaches 1.

Jurisdictions with facilities where environmental releases of radionuclides are likely to affect drinking water sources may wish to establish monitoring programs to ensure that drinking water treatment plant operators are made aware of these releases so that appropriate action can be taken. If a situation exists where ongoing exposure to radiological parameters is likely, a jurisdiction may choose to apply additional measures based on the toxicity, the expected level in the source water, and the frequency of occurrence, in order to mitigate risk.

3.2 Gross alpha and beta measurements

Water samples may be initially analysed for the presence of radioactivity using techniques for gross alpha and gross beta determinations rather than measurements of individual radionuclides. Alpha emissions are generally associated with naturally occurring radionuclides, whereas beta emissions are generally associated with artificial radionuclides. Although facilitating routine examination of large numbers of samples, these procedures do not allow for confirmation of the identities of the contributing radionuclides. These measurements are generally suitable either as a preliminary screening procedure to determine if further radioisotope-specific analysis is necessary or, if radionuclide analyses have been carried out previously, for detecting changes in the radiological characteristics of the drinking water source. Gross alpha and gross beta screening is also useful in determining if the activities from specific radioisotopes account for all of the activity found in the screening test.

Water samples may be initially screened for radioactivity using techniques for gross alpha and gross beta activity determinations, subject to the limitations of the method. Compliance with the guidelines may be inferred if the measurements are less than 0.5 Bq/L for gross alpha activity and less than 1 Bq/L for gross beta activity. These screening levels are consistent with those established by the World Health Organization (WHO, 2008). Specifically, the screening level for gross alpha activity is based on the strictest MAC (226Ra) for alpha activity, whereas the screening level for gross beta activity will be protective of all beta-emitting species that can be expected to be found in drinking water, including iodine species and 90Sr.

If either screening level is exceeded, then the specific radionuclides should be identified and individual activity concentrations measured. When the sum of ratios of the observed concentration to the MAC for each contributing radionuclide is below 1, no further action is required, and the water is acceptable for human consumption from a radiological perspective. Where the sum exceeds unity for a single sample, the reference dose level would be exceeded only if exposure to the same measured concentration were continued for a full year. Hence, an exceedance from a single sample does not in itself imply that the water is unsuitable for consumption and should be regarded only as a level at which further investigation, including additional sampling, is needed.
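
The screening-then-speciation logic described in this section can be summarized as a short decision sketch; the thresholds are the screening levels given above, and the function name and example values are illustrative only.

```python
# Rough sketch of the gross activity screening logic described above.
from typing import Optional

GROSS_ALPHA_LEVEL = 0.5   # Bq/L
GROSS_BETA_LEVEL = 1.0    # Bq/L

def assess_sample(gross_alpha: float, gross_beta: float,
                  sum_of_mac_ratios: Optional[float] = None) -> str:
    """Return a rough disposition for a single sample.

    sum_of_mac_ratios is the sum of observed-concentration/MAC ratios from
    radionuclide-specific analysis, needed only when a screening level is exceeded.
    """
    if gross_alpha < GROSS_ALPHA_LEVEL and gross_beta < GROSS_BETA_LEVEL:
        return "compliance with the guidelines may be inferred from screening"
    if sum_of_mac_ratios is None:
        return "screening level exceeded: identify radionuclides and measure activities"
    if sum_of_mac_ratios <= 1:
        return "acceptable: sum of concentration/MAC ratios does not exceed 1"
    return "investigate further (additional sampling); a single exceedance does not itself make the water unsuitable"

print(assess_sample(0.2, 0.4))
print(assess_sample(0.8, 0.4))
print(assess_sample(0.8, 0.4, sum_of_mac_ratios=0.6))
```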

Radionuclides emitting low-energy beta activity, such as tritium, and some gaseous or volatile radionuclides, such as iodine, will not be detected by standard gross activity measurements. If their presence is suspected, radionuclide-specific sampling and measurement techniques should be used.

3.2.1 International considerations

WHO has established screening levels for drinking water at 0.5 Bq/L for gross alpha activity and 1 Bq/L for gross beta activity (WHO, 2008). The gross alpha screening level reflects values near WHO's radionuclide-specific guidance reference dose level. The gross beta activity screening level, in the worst case, would lead to a dose close to the guidance reference dose level of 0.1 mSv/year. The rationale for these screening levels is currently under review by WHO and the International Atomic Energy Agency.

The U.S. Environmental Protection Agency (EPA) has established a gross alpha maximum contaminant level of 15 pCi/L (0.56 Bq/L), which includes 226Ra but excludes radon and uranium. This screening level accounts for the risk from 226Ra at 5 pCi/L (0.19 Bq/L) (the 226Ra maximum contaminant level) plus the risk from 210Po, the next most radiotoxic alpha emitter in the uranium decay chain (U.S. EPA, 2000a). The U.S. EPA's gross beta screening level is set at a fixed dose of 4 mrem/year (0.04 mSv/year) and involves two limits. For normal water supplies, a beta screening level of 50 pCi/L (1.85 Bq/L) has been set, above which speciation would be required to determine which beta emitters are present; for water supplies known to contain radionuclides, the beta screening level has been set at 15 pCi/L (0.56 Bq/L).
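
For readers comparing the U.S. EPA values with the Canadian and WHO levels, the conversion is simply 1 pCi = 0.037 Bq; the short sketch below reproduces the conversions quoted above.

```python
# Converting the U.S. EPA levels quoted above from pCi/L to Bq/L (1 pCi = 0.037 Bq).
PCI_TO_BQ = 0.037

def pci_per_litre_to_bq_per_litre(pci_per_l: float) -> float:
    return pci_per_l * PCI_TO_BQ

for pci in (15, 5, 50):
    print(f"{pci} pCi/L = {pci_per_litre_to_bq_per_litre(pci):.3f} Bq/L")
# 0.555, 0.185 and 1.850 Bq/L, which round to the 0.56, 0.19 and 1.85 Bq/L
# values given in the text.
```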

The screening level set by Australia for either gross alpha or gross beta activity is 0.5 Bq/L. The gross beta measurement includes a contribution from 40K, which is a natural beta emitter. Water meeting these screening guidelines is expected to result, at worst, in an annual dose of approximately one-third of the minimum dose at which intervention should be considered. If the screening level for gross alpha or gross beta activity is exceeded, specific radionuclides should be identified and their activity concentrations determined.

3.3 Radon-specific considerations

The health risk from ingesting radon-contaminated drinking water is considered negligible, because most of the radon escapes at the faucet or water outlet, leaving only minimal amounts in the water itself. However, it should be noted that radon levels in drinking water, if sufficiently elevated, can significantly affect airborne radon concentrations. Where indoor air radon concentrations exceed 200 Bq/m3 as an annual average concentration in the normal living area (Government of Canada, 2007), the source of the radon should be investigated, including through the monitoring of concentrations in drinking water. If radon concentrations in drinking water exceed 2000 Bq/L, it is recommended that actions be taken to reduce the release of radon from the drinking water into ambient air.

Part II: Science and Technical Considerations

4.0 Identity, use, and sources in the environment

Radionuclides may be naturally present in the environment, or they may enter the environment as a result of human activities. The contribution of drinking water to total exposure to radionuclides is typically very small, as shown in Figure 1. According to WHO (2008), exposure to radiation from both food and water represents only 8% of the total radiation exposure for the world population in general.

Figure 1: Sources and distribution of average radiation exposure for the world population (WHO, 2008).


4.1 Natural radionuclides

The greatest contribution to radiation exposure of the general public comes from naturally occurring radioactive elements in the Earth's crust and from cosmic radiation of extraterrestrial origin. Natural sources contribute more than 98% of the global human radiation dose, excluding medical exposures. The global average individual dose from natural sources is estimated to be about 2.4 mSv/year (UNSCEAR, 2000), which is comparable to 2.6 mSv/year for Canada (NCRP, 1987). About one-third of this dose is due to external radiation (terrestrial plus cosmic); the other two-thirds is due to the inhalation and ingestion of radionuclides in air, water, and food.

Natural radionuclides are either primordial (having half-lives comparable to the age of the Earth) or secondary (produced by the decay of primordial radionuclides). The natural radionuclides belong principally to the uranium series, the thorium series, and the actinium series, originating from 238U, 232Th, and 235U, respectively. These radionuclides are present at low concentrations in all rocks and soils. In addition, the primordial radionuclide 40K, comprising 0.012% of natural potassium, is present in the environment at significant concentrations. Because potassium is widespread in the environment and is taken into the body through ingestion as an essential nutrient, 40K is a major contributor to both internal and external exposure, providing about 180 µSv internally and 150 µSv externally to the average annual background dose (UNSCEAR, 1988). However, the absorption of elemental potassium by the body is under strict homeostatic control and is therefore not influenced by variations in environmental levels. For this reason, the dose from 40K in the body is constant and is not considered further in these guidelines.

The occurrence of natural radionuclides in drinking water is associated mainly with deep wells drilled into aquifers containing elevated mineralizations of radioactive elements, and is not necessarily correlated with surface geological features. Dissolution of these minerals takes place very slowly; however, in cases where groundwater has been in contact with the rock over hundreds or thousands of years, significant concentrations may build up in the water. These concentrations are highly variable. Concentrations are determined not only by the composition of the underlying bedrock, but also by the particular physical and chemical conditions prevailing in the aquifer. Radionuclide concentrations can vary significantly in wells just a few metres apart. Even concentrations from the same well can vary seasonally or annually, depending on groundwater flow patterns. Nor can one assume that radionuclides in groundwater will be in secular equilibrium, even if equilibrium prevails in the bedrock.

The radionuclides most frequently reported in Canadian groundwater sources are 226Ra, 222Rn, and 210Pb from the uranium series. There are many instances in Canada where elevated uranium concentrations are found in groundwater but none of its decay products are detectable. Likewise, one may see 226Ra without 238U or 210Pb without its 226Ra precursor. The radionuclides most frequently reported in Canadian surface water sources are 226Ra, tritium, 90Sr, and 137Cs.

The occurrence of natural radionuclides in shallow wells is less frequent, although it cannot be ruled out. Elevated concentrations may occur if the overburden contains a significant amount of radioactive minerals or if the overburden is being fed from deep groundwater sources containing radionuclides.

4.1.1 Radon

Radon-222 is a chemically inert gas formed through the radioactive decay of 226Ra. Both are members of the 238U decay series. Radon (Footnote 1) has a half-life of 3.82 days. Its decay products form a series of short-lived radionuclides (all solid elements) that decay within hours to 210Pb. Because of their short half-lives, the radon daughters rapidly approach radioactive equilibrium with their radon parent. Radon is soluble in water, with its solubility decreasing rapidly with increasing temperature (51.0, 22.4, and 13.0 mL/100 mL at 0°C, 25°C, and 50°C, respectively) (IARC, 1988). Radon is extremely volatile and is readily released from water (NCRP, 1988).

Although most of the radon produced in soil from radium is retained in the earth, where it decays to 210Po and ultimately to 206Pb, a small portion diffuses out of soil pores and enters the atmosphere. One square metre of typical soil containing radium at 0.03 Bq/g will release between 1000 and 2000 Bq of radon to the atmosphere each day (UNSCEAR, 1988). Other sources of radon include groundwater that passes through radium-bearing rocks and soils, traditional building materials such as wallboard and concrete blocks, uranium tailings, coal residues, and fossil fuel combustion.

4.1.2 Natural radionuclides from human activities

Environmental levels of natural radionuclides may be enhanced by industrial processes, particularly uranium mining and milling operations. This source of radionuclides is particularly important in northern Saskatchewan, where most of the uranium mining activities in Canada are now located. Earlier uranium mining activities were centred in the Elliot Lake area, with drainage through the Serpent River into Lake Huron. An additional source of natural radionuclides is the uranium refining industry, with facilities at Port Hope on Lake Ontario and Blind River on Lake Huron.

The radionuclides observed are similar to those described above for groundwater sources. However, the water bodies most likely to be affected are surface water sources such as streams and lakes. At the back end of the nuclear fuel cycle, there is the possibility of natural radionuclides being leached from nuclear wastes in shallow burial sites. These radionuclides could potentially contaminate nearby shallow wells, although there are no known instances in Canada of drinking water supplies being affected by this source of radioactivity.

Other sources include fossil fuel combustion and the production and use of phosphate rock products (such as fertilizers). The combustion of fossil fuels, such as coal, for electric power generation releases 238U and 232Th decay series radionuclides and 40K in fly ash (Tracy and Prantl, 1985). Tracy and Prantl (1985) did not report any significant pathways from this source to drinking water supplies.

4.1.3 Cosmogenic radionuclides

Cosmogenic radionuclides, which are produced naturally by continuous cosmic ray bombardment of gases in the Earth's atmosphere, provide a small additional exposure to radiation. These radionuclides are transported to the Earth's surface and enter surface drinking water supplies by the same processes as for nuclear weapons fallout, described in Section 4.2.1. The four important cosmogenic radionuclides, 14C, tritium, 22Na, and 7Be, together contribute a total dose to humans of about 15 µSv/year (UNSCEAR, 1988). They also provide a small natural background to the artificial radionuclides described below, particularly to the tritium and 14C concentrations in surface water.

4.2 Artificial Radionuclides

4.2.1 Fallout from nuclear weapons testing

Over the past 50 years, nuclear technologies have introduced significant quantities of artificial radionuclides into the global environment. These radionuclides contribute an additional radiation exposure over and above the natural background. The majority of these radionuclides resulted from atmospheric nuclear weapons tests conducted prior to the limited ban on atmospheric testing in 1963; additional tests conducted since that time have contributed only about 6% of the total global inventory of fallout radionuclides.

The fallout radionuclides receiving the greatest attention in environmental monitoring programs have been tritium (half-life = 12 years), 14C (half-life = 5730 years), 90Sr (half-life = 29 years), and 137Cs (half-life = 30 years), owing to their persistence in the environment and their entry into food chains leading to humans. Other fission products with short and intermediate half-lives that have been routinely detected include 95Zr (half-life = 64 days), 95Nb (half-life = 35 days), 106Ru (half-life = 368 days), 131I (half-life = 8 days), and 144Ce (half-life = 284 days). The transuranic elements 239Pu, 240Pu, 241Pu, and 241Am have also been detected in global nuclear weapons fallout. The total dose received by individuals in the North Temperate Zone (40-50°N latitude), accumulated to the year 2000, for all atmospheric weapons tests conducted between 1945 and 1980 is estimated to be about 2.1 mSv (UNSCEAR, 1982).

Radionuclides from nuclear weapons testing enter the environment mainly through the atmospheric pathway. Lower-yield tests (less than 100 kt trinitrotoluene [TNT] equivalent) inject material primarily into the troposphere, where it is transported on a time scale of days to weeks around the globe. In contrast, the enormous heat generated by high-yield thermonuclear tests can lift radioactive material into the stratosphere (greater than 10 km altitude), where it may reside for months or years before returning to the lower levels of the atmosphere. Radioactive material in the atmosphere eventually settles to the ground as fallout. This may occur as a result of dry deposition (gravitational settling) or wet deposition (rainout). As a result of vertical and horizontal air circulation and mixing, traces of fallout radionuclides can be found virtually everywhere on Earth, in water, soils, and vegetation.

Fallout radionuclides may enter drinking water supplies either by direct deposition on the surfaces of rivers and lakes or by runoff of material previously deposited on land. The impact is mainly on surface water supplies, in contrast to the situation with natural radionuclides, as described above. Fallout radionuclides percolating into soils or sediments will tend to bind to particles near the surface and thus do not reach deep-lying groundwater supplies.

4.2.2 Emissions from nuclear reactors

In addition to global fallout from nuclear weapons testing, emissions from nuclear reactors are a potential source of artificial radionuclides in the environment. Although widespread contamination of the environment could occur in the unlikely event of a major accident, such emergency situations are beyond the scope of this document.

There are seven nuclear power generating stations in Canada: the Bruce Nuclear Generating Stations A and B, located in Kincardine, Ontario, on Lake Huron; the Pickering Nuclear Generating Stations A and B, located in Pickering, Ontario, on Lake Ontario; the Darlington Nuclear Generating Station, located in Bowmanville, Ontario, on Lake Ontario; the Gentilly-2 Nuclear Generating Station, located in Gentilly, Québec, on the St. Lawrence River; and the Point Lepreau Nuclear Generating Station, located in Point Lepreau, New Brunswick, on the Bay of Fundy. There are also several non-power nuclear reactors, used for scientific research and for the production of certain radioisotopes for medical uses. These include the Chalk River Laboratories, located in Chalk River, Ontario, on the Ottawa River, whose activities include non-power reactors, isotope production, fuel fabrication and research, tritium processing, and waste management and treatment (CNSC, 2010). An additional 12 nuclear generating sites on the U.S. side of the Great Lakes basin could also affect Great Lakes water quality.

In nuclear power reactors, large quantities of fission products are formed within the fuel rods, and large quantities of activation products are found in the structural materials and cooling circuits. Under normal conditions, virtually all of these fission products are contained until they undergo radioactive decay and become stable. Low levels of radionuclides are released routinely to the environment under controlled and monitored conditions, in quantities dependent on the reactor type and design. Atmospheric releases include tritium, radioiodine, fission product noble gases (88Kr, 133Xe), activation gases (14C, 16N, 35S, 41Ar), and particulates such as 60Co, 90Sr, and 137Cs. Radionuclides released into the aquatic environment include tritium and other fission products and activated corrosion products (UNSCEAR, 1988). Tritium in aqueous and gaseous emissions is the principal radionuclide released from Canadian Deuterium Uranium (CANDU) reactors.

Aquatic releases into streams and lakes can affect surface water supplies. Many of these radionuclides are readily adsorbed onto the surfaces of suspended particulates as a result of their low water solubilities and are removed from the water column by sedimentation. Examples of such radionuclides are the isotopes of cesium, manganese, iron, cobalt, and the actinides (including thorium and uranium). Elements that tend to remain in solution in water include strontium, chromium, and antimony.

The Canadian Nuclear Safety Commission (CNSC) regulates all nuclear activities in Canada. Regulations under the Nuclear Safety and Control Act address the development, production, and use of nuclear energy in Canada; the production, possession, use, and transport of nuclear substances; and the production, possession, and use of prescribed equipment and prescribed information. The regulations also establish dose limits for the public and nuclear energy workers, with the latter partially based on the length of exposure (CNSC, 2010).

4.2.3 Other sources of artificial radionuclides

Artificial radionuclides may also be released into the environment from non-nuclear fuel cycle activities in industry and research and from use in diagnostic and therapeutic medicine. Canadian facilities employing radionuclides are licensed by the CNSC for radionuclide use, and their emissions into the environment are usually insignificant. The low activities and short half-lives of the radionuclides employed generally permit disposal through dilution and discharge into municipal sewer systems. Studies conducted to assess the importance of these sources show that the majority of radionuclides contained in sewer discharge are of natural or fallout origin (Durham and Joshi, 1981).

5.0 Exposure

This section summarizes the results of federal programs that monitor the levels of radionuclides in water supplies across Canada. It illustrates which radionuclides occur most frequently in Canadian waters, at what concentrations, and in what geological or geographical settings. The results of provincial/territorial monitoring surveys are presented in Appendix C.

From 1973 to 1983, the Radiation Protection Bureau of Health Canada monitored the levels of the natural radionuclides 226Ra, 210Pb, and total uranium in the drinking water supplies of 17 communities across Canada (Health Canada, 2000a). Most of these communities utilized surface water supplies, and the radionuclide concentrations were consistently low or non-detectable. The concentrations measured were as follows: uranium, < 0.1-1 µg/L; 226Ra, < 0.005-0.02 Bq/L; and 210Pb, < 0.005-0.02 Bq/L.

After 1983, the program was reduced to just three municipalities: Elliot Lake and Port Hope (because of uranium mining and processing activities possibly impacting surface water supplies) and Regina (because of elevated uranium levels in the groundwater supply). During the later period, only 226Ra and total uranium were monitored; the levels of 210Pb had been shown to be consistently low or non-detectable in surface waters. The results for a 14-year period, from 1983 to 1996, are summarized in Table C-1 in Appendix C. Elliot Lake showed a slight elevation in 226Ra at the beginning of this period, although there appears to have been a gradual decline in both 226Ra and total uranium levels towards the end of the period. It is likely that detectable levels of 226Ra and total uranium in the Elliot Lake results were due to higher levels of natural radionuclides in a uranium-rich region rather than to uranium mining operations. Radium levels in Port Hope remained non-detectable throughout this period, and uranium concentrations were within the normal range for surface water supplies.

Drinking water for the city of Regina consistently showed uranium levels that were elevated above national averages, but still well below 20 µg/L. Radium-226 levels remained below the detection limit of 0.005 Bq/L during the period 1983-1996. Historically, Regina derived its drinking water from both surface water and groundwater sources and adjusted the blend according to water quality and availability. Consequently, the uranium concentrations varied from year to year and from season to season.

The National Water Research Institute (Canada Centre for Inland Waters) of Environment Canada monitored radioactivity in Canadian surface waters from 1973 to the mid-1980s. Much of this work was focused on open waters of the Great Lakes (Durham and Joshi, 1981). A special project was carried out from 1981 to 1984 to characterize the levels of 226Ra, 137Cs, 125Sb, tritium, and uranium at 13 sites scattered across Canada (Baweja et al., 1987). Concentrations measured were as follows: 226Ra, 0.001-0.013 Bq/L; 137Cs, 0.0007-0.006 Bq/L; 125Sb, 0.001-0.016 Bq/L; tritium, 5-12 Bq/L; and uranium, 0.16-4.7 µg/L.

From 1962 to 1994, Health Canada's Radiation Protection Bureau monitored 90Sr and 137Cs routinely in the drinking water supplies of communities located near nuclear reactor facilities (Health Canada, 2000a). Table C-2 in Appendix C compares the open lake results for 90Sr and 137Cs with results from locations near the Pickering and Bruce nuclear generating stations. The results near the nuclear generating stations are quite similar to the open lake values, confirming the conclusion that all 90Sr and 137Cs levels are due to global fallout rather than to emissions from nuclear reactors.

The results also show that concentrations of these radionuclides have varied with time and from one lake to another. Radionuclides are gradually removed from Great Lakes waters by radioactive decay, sedimentation, and flushing. Tracy and Prantl (1983) analysed the earlier data from Lakes Superior and Huron and found that 90Sr was removed from the water only slowly, with half-times of 20 years for Lake Superior and 10 years for Lake Huron. The 137Cs concentrations showed two removal components: one fast, with the bulk of the activity being removed in much less than 1 year, and the other slower and more persistent, with a half-time of several years. They attributed this slow component to re-entry from lake sediments.
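
A two-component removal process of the kind described for 137Cs can be sketched as the sum of two exponential terms; the parameter values below are hypothetical and are not the fitted values reported by Tracy and Prantl (1983).

```python
# Illustrative two-component removal model: a fast pool removed with a short
# half-time plus a slower, more persistent pool. Parameters are hypothetical.
import math

def two_component_removal(c_fast, t_half_fast, c_slow, t_half_slow, t_years):
    """Concentration remaining after t_years from two exponentially removed pools."""
    remaining = lambda c0, t_half: c0 * math.exp(-math.log(2) * t_years / t_half)
    return remaining(c_fast, t_half_fast) + remaining(c_slow, t_half_slow)

for t in (0, 1, 5, 10):
    c = two_component_removal(c_fast=0.008, t_half_fast=0.5,
                              c_slow=0.002, t_half_slow=5.0, t_years=t)
    print(f"t = {t:2d} years: {c:.4f} Bq/L")
```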

Monitoring of drinking water intakes downstream from the Gentilly nuclear reactor in Quebec has not shown any evidence of contamination by fission or activation products. Liquid releases from the Point Lepreau nuclear generating station in New Brunswick enter directly into the Bay of Fundy and thus do not impact drinking water supplies.

Average tritium concentrations in surface waters across Canada are on the order of 5-12 Bq/L (Baweja et al., 1987). Great Lakes open water values ranged from 7 to 10 Bq/L during 1982-1984. Chant et al. (1993) reported average tritium concentrations in Lake Ontario of 9-11 Bq/L. They concluded that only about 10% of this amount could be due to reactor releases from Pickering.

Occasionally, upsets at nuclear facilities have given rise to brief increases in tritium levels at nearby drinking water intakes. These increases have always been of short duration, not lasting more than a few days. In June 1991, Health Canada monitored the levels of tritium in the Ottawa River following a spill from Chalk River Nuclear Laboratories. The highest concentration reached in drinking water was about 400 Bq/L at Petawawa. At Ottawa (200 km downstream), the levels had been diluted to about 150 Bq/L. Minute traces of tritium from the release were detected at Montreal, past the point of confluence of the Ottawa and St. Lawrence rivers. In September 1983, a release of 222 TBq of tritium from the Douglas Point reactor on the Bruce Peninsula, Ontario, provided an opportunity to model the dispersal of tritium in Lake Huron (Veska and Tracy, 1986). A prevailing counter-clockwise circulation pattern in the lake carried the tritium plume northeastward to Port Elgin, where drinking water levels reached 1600 Bq/L during a 2-day period.

The Newfoundland and Labrador Department of the Environment (Guzwell, 2002) initiated a program of screening public water supplies for natural radionuclides by testing for uranium. Of the 128 public groundwater supplies tested, only one contained uranium at a concentration above 20 µg/L (79 µg/L). Retesting this one supply for 210Pb and 226Ra showed these parameters to be well below the current Canadian drinking water guidelines. One private well, which has since been abandoned, had a uranium concentration of 160 µg/L. Testing was also carried out at public schools utilizing groundwater sources. Two water supplies out of 68 tested had uranium levels above 20 µg/L (51 and 78 µg/L). In both cases, the water fountains were turned off, and other sources of drinking water were provided.

In Nova Scotia, extensive uranium mineralizations have led to elevated concentrations of radionuclides at a number of locations. In a 2002 survey (Drage et al., 2005), 52 public school water wells were tested for total uranium and 14 naturally occurring radionuclides, most of which are daughter products of the uranium and thorium decay series. The results are summarized in Table 1.

Table 1. Radionuclide results for 52 school water wells in Nova Scotia
Radionuclide Maximum level Number of guideline exceedances
7Be < 5 Bq/L -
210Bi 0.44 Bq/L -
210Pb 0.24 Bq/L (footnote 1) 1 of 52 (2%) (footnote 1)
210Po 0.12 Bq/L -
224Ra < 0.01 Bq/L -
226Ra 0.04 Bq/L -
228Ra 0.14 Bq/L -
228Th < 0.01 Bq/L -
230Th 0.01 Bq/L -
232Th < 0.01 Bq/L -
234Th < 4 Bq/L -
234U 1 Bq/L -
235U 0.05 Bq/L -
238U 1 Bq/L -
Total uranium 0.081 mg/L 2 of 52 (4%)

Table 1 footnote 1: The original sampling results from this survey showed that 12 water supplies exceeded the 210Pb guideline; however, it was subsequently determined that radon decay in the sample bottles had affected the sample results. Follow-up sampling, which corrected for radon decay effects, showed that only one of these water supplies exceeded the 210Pb guideline.

A follow-up survey (Drage et al., 2005) tested for total uranium, 210Pb, and 226Ra at all public schools in Nova Scotia with drilled wells (178 wells). The results in Table 2 show that the maximum 210Pb level was 0.24 Bq/L, and the maximum total uranium concentration was 0.12 mg/L.

Table 2. Radionuclide results for 178 school water wells in Nova Scotia
Radionuclide Maximum level Number of guideline exceedances
210Pb 0.24 Bq/L 1 of 178 (< 1%)
226Ra 0.08 Bq/L -
Total uranium 0.12 mg/L 3 of 178 (2%)

New Brunswick has geological formations similar to those of Nova Scotia; thus, there is the potential for entry of natural radionuclides into groundwater. In the summer of 1983, Health Canada carried out a survey of natural radionuclides from 53 community wells across New Brunswick. The results are summarized in Table 3. These levels are typical of background levels across Canada; however, in a separate survey, wells from at least one community, Harvey Station, had uranium concentrations of 0.01-0.4 mg/L.

Table 3. 1983 survey of New Brunswick wells
Radionuclide Mean level Range
Uranium 0.013 mg/L 0.0001-0.0039 mg/L
226Ra 0.0032 Bq/L 0.0001-0.012 Bq/L
222Rn 13.4 Bq/L 0.2-39 Bq/L

In Ontario, the average concentration of tritium in over 3000 drinking water samples taken between 2000 and 2006 was 5-10 Bq/L. Single values of 120 Bq/L and 24 Bq/L, respectively, were observed in raw water at Southampton and Port Elgin, near the outflow of the Bruce Nuclear Power Development. One value of 18 Bq/L was observed at the R.C. Harris water treatment plant in Toronto. Earlier, from 1989 to 1992, a subset of about 530 Ministry of Labour samples was collected from surface waters known to have been impacted by uranium mining and milling operations (MOEE, 1993). The ranges of measured 224Ra and uranium concentrations are shown in Table 4. The highest 224Ra concentration was 12 Bq/L, and the highest uranium concentration was 2.9 mg/L. Note that these water bodies were not being used as sources of drinking water.

Table 4. Concentrations of 224Ra and uranium in Ontario surface waters near uranium mining and milling operations from 1989 to 1992
Range of 224Ra concentrations (Bq/L) Number of samples within the range Range of uranium concentrations (mg/L) Number of samples within the range
0-0.05 385 0-0.02 413
> 0.05-0.1 88 > 0.02-0.1 89
> 0.1-0.5 51 > 0.1-1 27
> 0.5 4 > 1 4
Total number of samples 528 Total number of samples 533

More recent data (2000-2006) from the Ontario Ministry of the Environment's Drinking Water Surveillance Program indicated that the highest level of gross alpha activity in drinking water was 0.95 Bq/L, the highest level of gross beta activity was 0.38 Bq/L, and the highest level of tritium was 16 Bq/L. The corresponding levels in the raw water are similar.

In Saskatchewan, approximately 16 communities have uranium levels in drinking water that fairly consistently exceed 0.020 mg/L. The maximum concentration of uranium detected in a Saskatchewan municipal water supply is 0.039 mg/L; however, water from this system is reportedly not consumed.

5.1 Radon

Radon is the most important source of naturally occurring radiation exposure for humans. For the world population, radon exposure represents 43% of the total exposure to natural background radiation, followed by terrestrial gamma radiation at 15%, cosmic radiation at 13%, and natural radiation from food and water at 8% (WHO, 2008). Ninety-five percent of exposure to radon is from indoor air, with about 1% coming from drinking water (WHO, 2004).

There are few data on radon concentrations in Canadian drinking water supplies. Water drawn from surface water supplies does not generally contain appreciable levels of radon, which are expected to be on the order of 0.01 Bq/L. One survey of Canadian groundwater sources containing elevated levels of radon found radon concentrations in the range 1700-13 700 Bq/L in Halifax County, Nova Scotia. A second survey detected radon at concentrations as high as 3000 Bq/L in well water in Harvey, New Brunswick, with 80% of the wells containing radon concentrations below 740 Bq/L. In a more recent study, as part of a province-wide radionuclide testing program, levels of radon were measured in the drinking water from 16 schools in Nova Scotia; radon levels reportedly ranged from 120 to 1400 Bq/L, with an average of approximately 600 Bq/L (Drage et al., 2005). Levels of radon this high in drinking water can lead to significant levels of 210Pb in only a few days (Drage et al., 2005).

The major concern for radon gas as a hazard is its occurrence in the air inside buildings and underground work areas. In air, the principal effect is on the lung, because the short-lived decay products attach to the inert dust particles normally present in the atmosphere and, once inhaled, accumulate within the respiratory system. Although radon may be ingested through drinking water, radon contained in water is, to some extent, transferred to air, and therefore the total radiation exposure can result from both ingestion and inhalation.

The relationship between the concentration of radon in the water supply and the concentration of radon in indoor air depends on several factors, including the rate and type of usage of the water (e.g., drinking water, showers, laundry), the loss or transfer of radon from the water to the air, and the characteristic ventilation of the house. The rate of release of radon from water depends on such factors as agitation, surface area, and temperature. Numerous authors (Cothern, 1987; UNSCEAR, 1988; Life Systems Inc., 1991; U.S. EPA, 1991) have reported a water-to-air transfer factor of 10⁻⁴ for a typical residential dwelling, which would mean that a radon concentration of 1000 Bq/L in drinking water would on average increase the indoor air radon concentration by 100 Bq/m3, with the highest concentration being expected in the rooms where radon is released (UNSCEAR, 1988). Nazaroff et al. (1987) estimated that, based on measurements in U.S. homes and water supplies, public supplies derived from groundwater serving 1000 or more persons contribute about 2% to the mean indoor radon concentration for houses using these sources. The average doses from radon in drinking water have been calculated as being as low as 0.025 mSv/year via inhalation and 0.002 mSv/year via ingestion, compared with the background inhalation dose of 1.1 mSv/year from air (UNSCEAR, 2000).
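
The transfer factor cited above can be turned into a quick estimate of the indoor air increment from radon in household water; the sketch below is an approximation only, since the actual increment depends on water use and ventilation as noted above.

```python
# Approximate indoor air radon increment implied by a water-to-air transfer
# factor of about 1e-4 (1 Bq/L of water = 1000 Bq/m3 of water).
TRANSFER_FACTOR = 1e-4

def indoor_air_increment_bq_per_m3(radon_in_water_bq_per_l: float) -> float:
    """Approximate increase in indoor air radon (Bq/m3) from household water use."""
    return radon_in_water_bq_per_l * 1000.0 * TRANSFER_FACTOR

print(indoor_air_increment_bq_per_m3(1000))   # ~100 Bq/m3, as stated above
print(indoor_air_increment_bq_per_m3(2000))   # ~200 Bq/m3, comparable to the indoor
                                              # air guideline cited in Section 3.3
```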

6.0 Analytical methods

A brief summary of analytical methods is given below for the radionuclides most commonly encountered in Canadian drinking water supplies and for which MACs are established. U.S. EPA-approved methods are listed in this document. There are also a number of U.S. EPA-approved analytical methods that were not developed by the U.S. EPA but are listed on their website (U.S. EPA, 2008). Where relevant, methods used or developed by Health Canada are also noted, although these are generally used in a research context.

6.1 Lead-210

Analytical methods for 210Pb generally involve an initial purification of the lead by precipitation (Chiu and Dean, 1986). After allowing sufficient time for ingrowth of 5-day 210Bi, the bismuth is isolated by solvent extraction and subsequently precipitated (e.g., as bismuth oxychloride). The precipitate is collected on filter paper, and the high-energy beta particles are detected in a low-background gas-flow proportional counting system (Chiu and Dean, 1986). Detection limits of 0.005-0.02 Bq/L are routinely achievable for a 1-L water sample.

Since 210Pb is a decay product of 222Rn, the presence of dissolved radon in groundwater can alter the measurement of 210Pb, leading to an artificially high value. This is likely to occur if the sample has been allowed to sit in a sealed airtight container or where high concentrations of radon are present. When groundwater sources are being monitored for radionuclides, either the 210Pb should be extracted from freshly collected samples or dissolved radon should be removed before significant decay to 210Pb has occurred. In the widely used Pylon technique (Pylon, 1989), radon in drinking water is detected using a water degassing unit and Lucas scintillation chambers. Dissolved radon can be removed either by boiling the sample or by aerating the sample using a blender. Water that has been left to stand will have reduced radon activity.
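
The effect of holding time on dissolved radon can be illustrated with the 3.82-day half-life given in Section 4.1.1; the sketch below shows only how much of the radon initially present has decayed (and is therefore available to feed 210Pb ingrowth), not the full ingrowth correction.

```python
# Fraction of dissolved 222Rn that has decayed after a given holding time,
# using the 3.82-day half-life from Section 4.1.1. This illustrates why fresh
# extraction or prompt degassing is recommended; it is not a full 210Pb
# ingrowth correction.
import math

RN222_HALF_LIFE_DAYS = 3.82

def fraction_decayed(holding_time_days: float) -> float:
    return 1.0 - math.exp(-math.log(2) * holding_time_days / RN222_HALF_LIFE_DAYS)

for days in (1, 3.82, 7, 14):
    print(f"{days:5.2f} days: {fraction_decayed(days):.0%} of the radon has decayed")
```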

No U.S. EPA-approved analytical method was identified for 210Pb.

6.2 Radium-226

The conventional method for detecting 226Ra is through co-precipitation with barium sulphate followed by alpha counting on a gas-flow proportional counter or alpha spectrometer (Chiu and Dean, 1986). A routinely achievable detection limit is 0.005 Bq/L.

A more rapid procedure has been developed at Health Canada that involves multiple ion exchange column separations followed by liquid scintillation counting of the ingrowth of 222Rn. This gives a detection limit of 0.002 Bq/L.

The U.S. EPA-approved analytical methods for 226Ra include EPA Methods 903.0 and Ra-03 using radiochemical methodology, and EPA Methods 903.1 and Ra-04 using the radon emanation technique. The U.S. EPA-approved analytical methods for 228Ra include EPA Methods 904.0 and Ra-05 using radiochemical methodology (U.S. EPA, 2008).

6.3 Total uranium

The traditional method for analysing total uranium in water involves fusion of the uranium with a sodium fluoride pellet. The uranium is then excited by ultraviolet light, and the resulting fluorescence is measured by a photomultiplier. A detection limit of 0.1 µg/L has been achieved by Health Canada laboratories using this method (Health Canada, 2000a). In an alternative procedure, uranium in water is complexed with phosphoric acid (Bring and Miller, 1992). The uranyl complex will then phosphoresce after being excited by laser light, and a measurement of the light output gives the concentration of uranium in the sample. The detection limit routinely achievable by this method is 0.5 µg/L.

If the isotopic composition of the uranium is required, then the methods of alpha spectrometry (EML, 1983) and inductively coupled plasma mass spectrometry (Igarashi et al., 1989) are available.

The U.S. EPA-approved analytical methods for uranium include EPA Method 908.0 using radiochemical methodology, EPA Method 908.1 using fluorometric methodology and EPA Method 00-07 using alpha spectrometry (U.S. EPA, 2008).

6.4 Tritium

Tritium decays by the emission of a low-energy (18.6 keV) beta particle, with no associated gamma rays or X-rays. The best method to measure tritium in water is liquid scintillation counting, which avoids self-absorption losses of the beta radiation. A detection limit of about 10 Bq/L is routinely achievable at Health Canada laboratories (Health Canada, 2000a).

U.S. EPA-approved analytical methods for tritium include EPA Method 906.0 using liquid scintillation (U.S. EPA, 2008).

6.5 Strontium-90

Strontium-90 decays by pure beta emission with a maximum energy of 546 keV. Since there is no associated gamma ray for easy identification, it is usually necessary to chemically separate the strontium before measuring the radiation with a beta detector. The conventional method employed by Health Canada (Department of National Health and Welfare, 1977) involves carbonate precipitation of the alkaline earth group, followed by dissolution in nitric acid, removal of the barium and radium by chromate precipitation, and final precipitation of the strontium as a carbonate. The strontium beta particles are counted with a gas-flow proportional counter. A typical detection limit is 0.001 Bq/L.

Recently, a more rapid method has been developed at Health Canada to allow processing of large numbers of samples during a nuclear emergency (Larivière et al., 2009). This procedure involves separation of strontium from multiple water samples on a cation exchange manifold followed by cleanup with high-pressure ion exchange chromatography. The strontium beta particles are counted either immediately by liquid scintillation or later with a Cerenkov counter. The liquid scintillation method is more rapid but gives a detection limit of only 0.2 Bq in a 1-L water sample. The off-line method must allow for the ingrowth of 2.7-day 90Y and gives a detection limit of 0.02 Bq/L.

The U.S. EPA-approved analytical methods for both 89Sr and 90Sr include EPA Method 905.0 using radiochemical methodology (U.S. EPA, 2008).

6.6 Iodine-131

Radioactive isotopes of iodine with mass numbers 131 through 135 are easily detected and measured by a gamma spectrometry system. Detection limits are comparable to those achievable for radiocesium.

The U.S. EPA-approved analytical methods for radioiodine include EPA Method 902.0 using radiochemical methodology and EPA Method 901.1 using gamma ray spectrometry (U.S. EPA, 2008).

6.7 Cesium-137

Cesium-137 can be readily detected by a gamma spectrometry system through its characteristic 661.6-keV gamma ray (ASTM, 2006). A detection limit of 0.001 Bq/L can be achieved at Health Canada by evaporating down 60 L of water before counting.

The U.S. EPA-approved analytical methods for cesium are EPA Method 901.0 using radiochemical methodology and EPA Method 901.1 using gamma ray spectrometry (U.S. EPA, 2008).

6.8 Radon

Two methods are approved and one alternative method is recommended by the U.S. EPA (1999) for the measurement of radon in drinking water, all of which use scintillation counting. The approved methods are Standard Method 7500-Rn B - Liquid Scintillation (APHA et al., 2005) and the EPA de-emanation method (U.S. EPA, 1987). The alternative method recommended is ASTM Method D-5072-92 (ASTM, 1998). All methods require careful sampling because of the rapid loss of radon from the water when it is agitated and opened to the atmosphere.

In Standard Method 7500-Rn B, the water is injected directly into a scintillation solution and counted in an automated liquid scintillation device; a minimum detectable concentration of 18 pCi/L (0.67 Bq/L) is listed (APHA et al., 2005). In the de-emanation method, radon is degassed from the water and transferred into a Lucas scintillation cell, with a detection limit of approximately 0.05 Bq/L (Crawford-Brown, 1989). Radon in drinking water can be measured at concentrations above 0.04 Bq/L by ASTM Method D-5072-92 (ASTM, 1998).

6.9 Gross alpha and gross beta activity measurements

Analysing drinking water for gross alpha and gross beta activities (excluding radon) may be done by evaporating a known volume of the sample until dry and then measuring the activity of the residue. Since alpha radiation is easily absorbed within a thin layer of solid material, the reliability and sensitivity of the method for alpha determination may be reduced in samples with a high total dissolved solids (TDS) content.

Where possible, standardized methods should be used to determine concentrations of gross alpha and gross beta activities. Standardized methods for evaporation include ISO 9696:2007 for gross alpha determination (ISO, 2007) and ISO 9697:2008 for gross beta determination (ISO, 2008). Determining gross beta activity using the evaporation method must take into account the contribution from 40K. An additional analysis of total potassium should be done if the gross beta screening value is exceeded. The evaporation method is used for groundwater with a TDS content greater than 0.1 g/L. The detection limit for this method ranges from 0.02 to 0.1 Bq/L.
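
The 40K correction works by subtracting the beta activity attributable to the measured total potassium; the specific beta activity of natural potassium used in the sketch below (about 27.6 Bq per gram of potassium) is a commonly used value assumed here for illustration and is not quoted in this document.

```python
# Sketch of the 40K correction for gross beta results obtained by evaporation.
# The specific beta activity of natural potassium (about 27.6 Bq per gram of
# potassium) is an assumed, commonly used value, not a figure from this document.
BETA_BQ_PER_GRAM_K = 27.6

def net_gross_beta(gross_beta_bq_per_l: float, potassium_g_per_l: float) -> float:
    """Gross beta activity with the natural 40K contribution subtracted."""
    return gross_beta_bq_per_l - potassium_g_per_l * BETA_BQ_PER_GRAM_K

# Example: 1.2 Bq/L gross beta in water containing 5 mg/L total potassium
print(f"{net_gross_beta(1.2, 0.005):.2f} Bq/L attributable to other beta emitters")
```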

Another standardized method, the co-precipitation technique (APHA et al., 1998), excludes the contribution from 40K and therefore does not require the determination of total potassium. This method cannot be used in assessing water samples containing certain fission products, such as 137Cs; however, concentrations of fission products in drinking-water supplies generally are extremely low. TDS is not a concern with this method. The detection limit of this method is 0.02 Bq/L (APHA et al., 1998).

Although screening for gross alpha and gross beta activities reduces the number of costly analyses for specific radionuclides, it is a measurement tool that has a number of drawbacks. These include false-positive detections, particularly for gross alpha measurements when dissolved radon is present; gross alpha results of tens of becquerels per litre are not uncommon in such cases, yet detailed analyses usually show that no individual radionuclide exceeds its MAC. False negatives may also occur if there is a large amount of TDS in the water sample: when the sample is evaporated to dryness, self-absorption of the particles may lead to a significant loss in count rate. Laboratories that carry out gross measurements routinely report wide fluctuations in count rates, even for samples taken from the same source. If detectors used for gross measurements are operated in the alpha-and-beta mode to allow simultaneous detection, crosstalk or spillover between the alpha and beta channels can increase the analytical errors in an unpredictable manner. Despite these drawbacks, gross alpha and gross beta screening is useful in detecting changes in a drinking water supply whose composition has been well characterized by previous radionuclide measurements. In order to reduce costly and repetitive analyses for specific radionuclides, an agency may wish to define its own operational screening levels based on gross radioactivity measurements.

If a drinking water sample exceeds the screening value for gross alpha activity (0.5 Bq/L) or gross beta activity (1 Bq/L), it is recommended that the analysis be repeated to check the validity of the result. If the initial result is confirmed, then the sample should be analysed for specific radionuclides whose presence might be suspected, based on the type and location of the drinking water supply. For example, a groundwater supply far removed from any nuclear facility is not likely to contain any artificial radionuclides. It should be analysed initially for the naturally occurring radionuclides 210Pb, 226Ra, and total uranium. If these radionuclides are not present in sufficient abundance to explain the gross alpha or beta count, then analyses should be extended to additional members of the natural decay series, particularly 228Ra and dissolved 222Rn. The latter is not considered to be a drinking water hazard, but its presence may explain a high gross alpha or beta count and should prompt an analysis of indoor air radon levels.

On the other hand, if the sample is taken from a surface water supply located near a nuclear facility, then it should be analyzed for radionuclides that might be suspected from that facility. For a nuclear reactor, the analyses should begin with the four artificial radionuclides - tritium, 90Sr, 131I, and 137Cs - most likely to be found in emissions from a reactor. If their abundances are not sufficient to explain the gross particle count, then the presence of other fission products should be considered. In most cases this can be done quite simply, without the need for further analyses. If a gamma ray measurement was carried out with sufficient sensitivity to detect 131I and 137Cs at the beta screening level of 1 Bq/L, then any other gamma-emitting fission products present in the sample will also show up at this level.
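
The screening logic described above can be summarized in a short sketch. The sketch below is illustrative only: the screening values (0.5 Bq/L gross alpha, 1 Bq/L gross beta) and the suggested radionuclide panels are taken from this section, while the function and variable names are assumptions made for the example.

    # Minimal sketch of the gross alpha/beta screening workflow described above.
    # Screening values and radionuclide panels follow this guideline; names are
    # illustrative only.
    GROSS_ALPHA_SCREEN_BQ_L = 0.5
    GROSS_BETA_SCREEN_BQ_L = 1.0

    def screening_decision(gross_alpha, gross_beta, confirmed=False):
        """Suggest a next step for a pair of gross activity measurements (Bq/L)."""
        exceeds = (gross_alpha > GROSS_ALPHA_SCREEN_BQ_L
                   or gross_beta > GROSS_BETA_SCREEN_BQ_L)
        if not exceeds:
            return "No radionuclide-specific analysis required."
        if not confirmed:
            return "Repeat the gross measurement to confirm the result."
        # Confirmed exceedance: choose the radionuclide panel by supply type.
        return ("Analyse for suspected radionuclides: 210Pb, 226Ra and total "
                "uranium for groundwater far from nuclear facilities; tritium, "
                "90Sr, 131I and 137Cs for surface water near a reactor.")

    print(screening_decision(0.7, 0.4, confirmed=True))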

7.0 Treatment technology

Several technologies exist to remove radiological contaminants from drinking water to levels considered to be protective of human health. It should be noted that the chemical characteristics of tritium prevent its removal from water, emphasizing the need to prevent contamination of the source water.

Because of the many factors influencing the removal of radionuclides, such as the chemical characteristics of the specific water and the nature of the contaminants, any potential treatment process should be tested with the actual water to be treated before the final selection of a treatment technology. Handling and disposal of the waste produced by treatment need to be carefully addressed when removing radionuclides from drinking water.

7.1 Municipal scale

Most radionuclides can be effectively treated in municipal-scale treatment facilities. Generally, the technologies recognized for the removal of radionuclides at municipal-scale treatment plants are ion exchange, reverse osmosis and lime softening (U.S. EPA, 2000b). Removal efficiency is influenced by the specific chemical characteristics of the water. Reported removal efficiencies by reverse osmosis varied from 70 to 99% (Annanmäki, 2000; Health Canada, 2004). The ion exchange process is also dependent on the specific ion exchangers utilized, and the removal efficiency was reported as high as 95% (U.S. EPA, 2000a). Reported efficiency for lime softening treatment for the removal of radium varied from 80% to 95% (U.S. EPA, 2000a). For artificial radionuclides such as tritium, the strategy should be to prevent contamination of the source water.

Technologies recognized specifically for the removal of both 226Ra and 228Ra, in addition to ion exchange, reverse osmosis and lime softening, are greensand filtration, precipitation with barium sulphate, electrodialysis/electrodialysis reversal and hydrous manganese oxide filtration (U.S. EPA, 2000b). The ion exchange process to remove radium uses a cation exchange medium rejuvenated by an acid solution. The acid solution is prepared with common salt (sodium chloride) or, alternatively, with a potassium salt (Annanmäki, 2000).

In addition, technologies recognized specifically for the removal of uranium are activated alumina (reported removal efficiency up to 99%) and enhanced coagulation and filtration (U.S. EPA, 2000b). The ion exchange process to remove uranium requires ion exchange media rejuvenated by a strong base solution. A strong base solution with sodium hydroxide is often preferred (Annanmäki, 2000).

The U.S. EPA lists high-performance aeration as the best available technology for the removal of radon in groundwater supplies. High-performance aeration methods include packed tower aeration and multistage bubble aeration and can achieve up to 99.9% removal. However, these methods may create a large source of airborne radon.

Adsorption using granular activated carbon (GAC), with or without ion exchange, can also achieve high radon removal, but it is less efficient than aeration and requires large amounts of GAC, making it less suitable for large systems. GAC treatment, including point-of-entry GAC units, may be appropriate for very small systems under some circumstances (U.S. EPA, 1999). Two potential concerns associated with this technology are the elevated gamma radiation fields that develop close to the column and the waste disposal difficulties associated with treatment residuals.

Wastewater and solid waste produced by drinking water treatment need to be evaluated for their radiation level and disposed of as per the applicable regulation.

7.2 Residential scale

Municipal treatment of drinking water is designed to reduce contaminants to levels at or below their guideline values. As a result, the use of residential-scale treatment devices on municipally treated water is generally not necessary, but is primarily based on individual choice. In cases where an individual household obtains its drinking water from a private well, residential drinking water treatment devices may be an option for removing radionuclides from the water.

Residential treatment devices are available that are affordable and can remove some radionuclides from drinking water to make it compliant with the applicable guidelines. Periodic testing by an accredited laboratory should be conducted on both the water entering the treatment device and the water it produces to verify that the treatment device is effective. Devices can lose removal capacity through usage and time and need to be maintained and/or replaced. Consumers should verify the expected longevity of the components in their treatment device as per the manufacturer's recommendations.

The most common types of treatment devices available for the removal of radionuclides from drinking water in a residential setting are ion exchange and reverse osmosis systems (Health Canada, 2004). Efficiencies for point-of-use and point-of-entry treatment technologies are generally similar to those for municipal-scale treatment. Water softeners are an ion exchange technology that can remove radionuclides (but note that individuals on sodium-restricted diets should consult their physician before drinking artificially softened water). Similar to the municipal-scale ion exchange, the rejuvenating solution required could vary for different contaminants. For example, uranium removal requires a rejuvenating solution with a strong base, whereas radium removal requires an acid rejuvenating solution. Water softeners rejuvenated with common salt (sodium chloride) solution could add a significant amount of sodium to the water. This factor needs to be considered when selecting the most appropriate treatment process.

There is no certified treatment process for the removal of 210Pb. However, since 210Pb behaves chemically like stable lead, a treatment process certified for the removal of stable lead would also be expected to remove the radioactive isotope. The treatment systems certified for lead removal are adsorption (i.e., carbon/charcoal), reverse osmosis, and distillation (NSF International, 2005a). Although there are currently no certified products for the reduction of uranium, reverse osmosis, distillation, or ion exchange resins should be able to remove uranium from drinking water.

A number of residential treatment devices are available to remove radon from drinking water to concentrations below 300 pCi/L (11 Bq/L). Filtration systems may be installed at the faucet (point-of-use) or at the location where water enters the home (point-of-entry). Point-of-entry systems are preferred for radon because they provide treated water for bathing and laundry as well as for cooking and drinking. In the case where certified point-of-entry treatment devices are not available for purchase, systems can be designed and constructed from certified materials.

Health Canada does not recommend specific brands of treatment devices, but it strongly recommends that consumers look for a mark or label indicating that the device has been certified by an accredited certification body as meeting the appropriate NSF International (NSF)/American National Standards Institute (ANSI) standard. These standards have been designed to safeguard drinking water by helping to ensure the material safety and performance of products that come into contact with drinking water. Certification organizations provide assurance that a product or service conforms to applicable standards and must be accredited by the Standards Council of Canada (SCC). In Canada, several organizations have been accredited by the SCC to certify drinking water devices and materials as meeting NSF/ANSI standards.

An up-to-date list of accredited certification organizations can be obtained from the Standards Council of Canada.

Drinking water treatment devices certified to remove 226Ra from untreated water (such as from a private well) use ion exchange and reverse osmosis. Consumers should use only certified treatment devices; these are certified to reduce levels from an average influent (challenge) concentration of 25 pCi/L (925 mBq/L) to a maximum finished effluent concentration of 5 pCi/L (185 mBq/L) or less (NSF International, 2005b).

The treatment devices certified to remove radon are generally based on activated carbon adsorption technology. Certification for the removal of radon requires that the device be capable of reducing the concentration of radon from an influent (challenge) concentration of 4000 pCi/L (148 Bq/L) to a maximum final (effluent) concentration of less than 300 pCi/L (11 Bq/L) (NSF International, 2005b).

In treating drinking water with naturally occurring levels of radionuclides, the liquid and solid wastes from point-of-use/point-of-entry treatment may generally be disposed of through sewer or septic systems for the liquids and in municipal landfills for the solids; however, consumers should consult with the appropriate authority before doing so.

8.0 Health effects

8.1 Dose concepts and units

Radiological protection requires the establishment of a link between exposure to radiation and biological outcomes. This link is provided by the absorbed dose or amount of energy imparted by ionizing radiation to a unit mass of tissue. The International System of Units (SI) unit for absorbed dose is the gray (Gy), where 1 gray is equal to 1 joule of energy absorbed per kilogram of tissue.

The extent of biological damage depends not only on the absorbed dose but also on the type or quality of the radiation. A given dose of alpha radiation, because of its higher ionization density, will produce much more damage than the same dose of X-rays or gamma rays. To put all ionizing radiations on an equal footing in terms of biological harm, the ICRP (1996) has introduced a set of radiation weighting factors as follows:

  • 1 for beta rays, gamma rays, and X-rays;
  • 10 for neutrons; and
  • 20 for alpha particles.

An absorbed dose in grays multiplied by a radiation weighting factor gives an equivalent dose in sieverts (Sv). The sievert is actually a very large unit. Normal background radiation exposure in Canada is about 2-3 mSv/year. A single chest X-ray would give an exposure of about 0.01 mSv. For simplicity, equivalent dose will be referred to as dose throughout the rest of this document.
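
As a minimal worked illustration of this conversion, using only the weighting factors listed above (the function and variable names are assumptions of the example):

    # Equivalent dose (Sv) = absorbed dose (Gy) x radiation weighting factor.
    # Weighting factors are those listed above (beta/gamma/X-rays 1, neutrons 10,
    # alpha particles 20).
    WEIGHTING = {"beta": 1, "gamma": 1, "x-ray": 1, "neutron": 10, "alpha": 20}

    def equivalent_dose_sv(absorbed_dose_gy, radiation_type):
        """Convert an absorbed dose in gray to an equivalent dose in sievert."""
        return absorbed_dose_gy * WEIGHTING[radiation_type]

    # The same 0.001 Gy absorbed dose gives 0.001 Sv for gamma rays but 0.02 Sv
    # for alpha particles.
    print(equivalent_dose_sv(0.001, "gamma"), equivalent_dose_sv(0.001, "alpha"))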

The radiation dose resulting from the ingestion of a radionuclide through food or drinking water depends on a number of factors, including:

  • the amount of activity taken into the body;
  • the energy and the weighting factor of the radiation;
  • the percentage uptake by the gastrointestinal tract;
  • the distribution of the radionuclide to the various organs of the body; and
  • the period of time during which the radionuclide remains in the body.

Depending on the chemical and biological properties of the radionuclide, it may persist in the body for times varying from days to years. Internal exposures are therefore measured in terms of the integrated or committed dose. Standard periods of integration are 50 years for the adult population and 70 years for a lifetime exposure. The ICRP has incorporated all of these factors into a set of dose coefficients for all the common radionuclides, which give the committed dose from the ingestion or inhalation of one unit of radioactivity.
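
As an illustration of how dose coefficients are applied, the minimal sketch below multiplies one year's intake of a radionuclide by its adult dose coefficient to obtain the committed effective dose. The tritium (HTO) dose coefficient is the value listed in Appendix A; the function and variable names are assumptions of the example.

    # Illustrative calculation of committed effective dose from one year's
    # ingestion of drinking water. The tritium (HTO) dose coefficient is the
    # adult value listed in Appendix A of this document.
    DC_TRITIUM_HTO_SV_PER_BQ = 1.8e-11   # Sv per Bq ingested (adult)
    ANNUAL_WATER_L = 730                 # 2 L/day for one year

    def committed_dose_msv(concentration_bq_per_l, dose_coefficient_sv_per_bq):
        """Committed effective dose (mSv) from one year's drinking water intake."""
        intake_bq = concentration_bq_per_l * ANNUAL_WATER_L
        return intake_bq * dose_coefficient_sv_per_bq * 1000.0  # Sv to mSv

    # Water at the tritium MAC of 7000 Bq/L gives roughly 0.09 mSv, close to the
    # 0.1 mSv reference dose level used throughout this document.
    print(committed_dose_msv(7000, DC_TRITIUM_HTO_SV_PER_BQ))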

8.2 Health effects from radiation exposure

Radioactivity refers to the particles and energy emitted from nuclei as a result of nuclear instability. The most common types of radiation are alpha, beta, and gamma radiation. Alpha particles consist of two protons and two neutrons bound together into a particle identical to a helium nucleus; they are emitted by radioactive nuclei such as uranium or radium in a process known as alpha decay. Beta particles are high-energy, high-speed electrons or positrons emitted by certain radioactive nuclei, such as 40K; their production is called beta decay. Alpha and beta emitters differ in the magnitude of their biological effects. Alpha particles interact very strongly with human tissue, transferring their energy over a very short distance; beta particles interact less strongly, which allows them to travel farther through tissue before their energy is deposited. The difference between alpha and beta particle effects therefore lies in the concentration of tissue damage: alpha particles may damage many molecules within a short distance, whereas beta particles spread the damage over a greater distance. The extent of damage also depends on the energy of the individual alpha or beta particles. Gamma radiation is electromagnetic radiation emitted as a result of nuclear processes such as radioactive decay; it occupies the highest-frequency, highest-energy, shortest-wavelength region of the electromagnetic spectrum. Because of its high energy, gamma radiation can cause serious damage when absorbed by living cells.

When ionizing radiation passes through matter, neutral atoms and molecules acquire an electric charge as a result of interactions in which small amounts of energy are transferred to the atoms of the material. If the material is body tissue, this can result in alterations of sensitive biological structures.

Radiation causes damage to human tissue or any other material through the ionization of atoms. Ionizing radiation absorbed by human tissue has enough energy to remove electrons from the atoms that make up molecules of the tissue. In very simple terms, when an electron shared by atoms forming a molecular bond is dislodged, the bond is broken, and the molecule falls apart. This process may occur by a direct "hit" to these atoms, or it may result indirectly by free radical formation due to irradiation of adjacent molecules. The most sensitive structure in the cell is the DNA molecule, which carries the genetic blueprint for the cell and, indeed, for the whole organism. If radiation damage to the DNA is not repaired, the cells may fail to survive or reproduce. If insufficient cells survive, then loss of tissue or organ function may occur. Alternatively, the damage may be incompletely or improperly repaired so that the cells continue to divide, but become transformed or cancerous.

There are basically two broad classes of radiation effects on the human body. The first class involves deterministic effects, which do not occur until the dose reaches a certain threshold level. Above this level, the effect will definitely occur, and the severity of harm will increase with dose. Deterministic effects include nausea, vomiting, diarrhoea, hair loss, haemorrhage, immune function loss, nervous system damage, and death. The threshold for these effects in humans is about 500 mSv delivered over a short period of time (hours to days). Fortunately, such doses are extremely rare and do not arise from environmental exposures, such as ingestion of radionuclides in drinking water.

The second class of effects is termed stochastic effects, which means that the likelihood of occurrence increases with the amount of radiation received. These effects may occur at doses well below the threshold for deterministic effects. The main stochastic effects in humans are cancer in the exposed individual and possible genetic effects in the offspring. The types of cancer most frequently associated with radiation exposure are leukaemia and solid tumours of the lung, breast, thyroid, bone, digestive organs, and skin. The latency period between exposure and recognition of a cancer can range from 5 years to several decades (ICRP, 1991). No conclusive evidence exists for hereditary radiation effects in humans, although experimental studies on plants and animals suggest that such effects occur.

Radiation-induced cancers are indistinguishable from those that occur from other causes. The correlation between radiation and cancer induction can be shown only in large populations of irradiated individuals as an increase of cancers over the background incidence. The main sources of epidemiological information on radiation risks and effects have come from studies of individuals or groups who have had relatively high exposures, such as:

  • atomic bomb survivors at Hiroshima and Nagasaki;
  • patients who received high radiation doses for diagnostic or therapeutic purposes; and
  • occupationally exposed workers, including uranium miners and radium dial painters.

Since it is impossible to establish with any certainty the shape of the dose-response relationship for stochastic effects, particularly at low doses and low dose rates, it is usually assumed that the frequency of their occurrence is linear with dose and without threshold (linear no-threshold hypothesis). The absence of a threshold implies that there is no dose, however small, that may be considered absolutely safe. This assumption is simple, and there is considerable evidence that it is conservative (i.e., it is more likely to overestimate than to underestimate the risk). In 1996, ICRP published a revised set of dose coefficients for individuals. This revision is based on more recent measurements and more accurate metabolic models for the uptake and retention of radionuclides by the various organs of the body. The revised dose coefficients used to derive the MACs in this technical document assume the linear no-threshold hypothesis.

The ICRP (1991) has determined the lifetime probability of fatal cancer induction following a single low-dose, low dose rate exposure to be 5% per sievert. If allowance is made for hereditary risks and for non-fatal cancers weighted for severity, then the lifetime risk rises to 7.3% per sievert. The CNSC has set a dose limit of 1 mSv/year for members of the public (for artificial sources excluding natural background radiation and medical treatments). The linear no-threshold hypothesis implies that exposure to 1 mSv/year of radiation would give a lifetime excess cancer risk of 7.3 × 10-5 per year of exposure. Radionuclides in drinking water are assessed based on a reference dose level that is one-tenth of this dose limit. Even if the linear no-threshold hypothesis is valid, any health effects produced at this low level of exposure would be lost in the statistical background of spontaneous occurrences.

8.2.1 Radon

No experimental or epidemiological studies have linked ingested radon with health impacts in humans, and it has generally been concluded from experimental animal studies that the risk from ingestion is insignificant compared with the risk from inhalation. The U.S. National Research Council (U.S. NRC, 1999a) estimates the risk of death from stomach cancer following lifetime exposure to 1 Bq/L of radon by ingestion to be 2.0 × 10-12 based on calculations with risk projection models for specific cancer sites. Radon consumed in water appears to rapidly enter the bloodstream from the stomach (Crawford-Brown, 1989), perfusing all the cells of the body (Gosink et al., 1990). As it is lipid soluble (IRC, 1928; von Dobeln and Lindell, 1965), it does not distribute evenly throughout the body (Hursh et al., 1965). Clearance of radon from the bloodstream is relatively rapid, with a half-time on the order of minutes (Underwood and Diaz, 1941; Lindell, 1968).

Most radon inhaled is exhaled and remains in the lungs for only a short period of time. The radon daughter 218Po is very reactive and electrostatically attracted to tiny particulates in air. These particulates are inhaled and deposited in the lung. Radon's daughters then decay sequentially, releasing damaging alpha and beta particles. Therefore, it is the radon progeny, not radon itself, that actually cause damage to the bronchial epithelium, because only the progeny remain in the lungs long enough to decay significantly.

Radon is classified as a human carcinogen (IARC, 1988). This classification is based on the strong evidence of lung cancers in underground miners exposed to high levels of radon (Lubin et al., 1994; U.S. NRC, 1999b). A combined analysis of 11 cohorts of 65 000 underground miners conducted by Lubin et al. (1994) with an update by the U.S. NRC (1999b) provides a thorough assessment of lung cancer risks associated with radon. These cohort studies show that about 40% of the greater than 2700 lung cancer deaths that occurred among the 65 000 miners were due to radon (Lubin et al., 1995). Extrapolating the miner cohort data to lower household radon levels results in an odds ratio of 1.12 (95% confidence interval = 1.02-1.25) per 100 Bq/m3.

More recent studies on residential exposure to lower levels of radon in indoor air provide more evidence of an association between lung cancer and radon exposure. Darby et al. (2005) published a combined analysis of 13 European studies involving 7148 cases of lung cancer with 14 208 matched controls. Their results show an odds ratio of 1.08 (95% confidence interval = 1.03-1.16) for every 100 Bq/m3 of radon in air. If these radon measurements are statistically adjusted for the effects of high outliers, then the odds ratio increases to 1.16 (95% confidence interval = 1.05-1.31) per 100 Bq/m3 of radon.

In a report by Krewski et al. (2005), seven North American studies involving 3663 lung cancer cases and 4966 matched controls were analysed; results also showed a significant association between household radon and lung cancer, with an odds ratio of 1.11 (95% confidence interval = 1.00-1.28) per 100 Bq/m3 of radon. If these data are restricted to subjects who had resided in only one or two houses in the 5- to 30-year period before recruitment and with at least 20 years of alpha-track monitoring data, then the odds ratio increases to 1.18 (95% confidence interval = 1.02-1.43) for every 100 Bq/m3 of radon.

The combined analyses by Darby et al. (2005) and Krewski et al. (2005), along with the downward extrapolation from the miner studies, indicate an excess relative risk of about 10% for every 100 Bq/m3 of radon in air.

9.0 Classification and assessment

The approach used to assess radionuclides in drinking water is based on an annual dose and annual risk of cancer and is consistent with the approach adopted internationally. This is appropriate, since exceedances are expected to be investigated and addressed immediately, thereby preventing lifetime exposure at such levels. Drinking water guidelines for radionuclides are established based on a reference dose level of 0.1 mSv/year, which is one-tenth of the dose limit for a member of the public (i.e., an individual not exposed to radiation in the workplace) recommended by the ICRP and embodied in Canadian regulations by the CNSC. This is consistent with the reference level of 0.1 mSv/year set by WHO (2008) in deriving its guidelines for drinking water quality. In setting its reference level, WHO conservatively assumed that only 10% of the total ingestion dose arises from drinking water, with the remaining 90% originating from food. Furthermore, the drinking water limit is meant to include all radionuclides in drinking water, whether naturally present or introduced by human activities. A level of 0.1 mSv represents less than 5% of the annual dose from natural background radiation.

It can be estimated that the lifetime risk of fatal cancer or other health detriment from 0.1 mSv is less than 1 in 100 000 (< 10-5). At this low level of risk, it can be concluded that no further actions are necessary to reduce the amount of radioactivity in the drinking water source.
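
The order of magnitude quoted here follows directly from the ICRP detriment factor cited in Section 8.2; a minimal check of the arithmetic is sketched below.

    # Rough check of the lifetime risk at the reference dose level, using the
    # ICRP (1991) detriment factor of 7.3% per sievert cited in Section 8.2.
    risk_per_sievert = 0.073
    reference_dose_sv = 0.1e-3                    # 0.1 mSv expressed in Sv
    print(risk_per_sievert * reference_dose_sv)   # about 7.3e-06, below 1 in 100 000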

The MAC for a given radionuclide in drinking water is derived using the following formula:

MAC (Bq/L) = 0.1 mSv/year / [730 L/year × DC (Sv/Bq) × 1000 mSv/Sv]

where:

  • 0.1 mSv/year is the reference dose level from 1 year's consumption of drinking water.
  • 730 L/year is the yearly drinking water consumption for an adult, which corresponds to a daily consumption of 2 L/day. This value is consistent with the approach used by WHO and the U.S. EPA, but is somewhat higher than the average intake of 1.5 L/day used by Health Canada to derive guidelines for the chemical parameters.
  • DC is the dose coefficient, based on the 50-year committed dose from ICRP (1996). This provides an estimate of the 50-year committed effective dose for adults resulting from a single intake of 1 Bq of a given radionuclide. Specific information is provided in Appendix D for radionuclides for which a MAC has been established.
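
The derivation above can be reproduced in a few lines. The sketch below applies the formula to the adult dose coefficients listed in Appendix A; the published MACs are rounded forms of the resulting values, and the variable names are assumptions of the example.

    # Minimal sketch of the MAC derivation: MAC = 0.1 mSv / (730 L x DC).
    # Dose coefficients (Sv/Bq, adult, 50-year committed dose) are those listed
    # in Appendix A of this document.
    REFERENCE_DOSE_SV = 0.1e-3   # 0.1 mSv/year
    ANNUAL_WATER_L = 730         # 2 L/day for one year

    dose_coefficients_sv_per_bq = {
        "tritium (HTO)": 1.8e-11,
        "90Sr": 2.8e-8,
        "131I": 2.2e-8,
        "137Cs": 1.3e-8,
        "226Ra": 2.8e-7,
        "210Pb": 6.9e-7,
    }

    for nuclide, dc in dose_coefficients_sv_per_bq.items():
        mac = REFERENCE_DOSE_SV / (ANNUAL_WATER_L * dc)
        print(f"{nuclide}: {mac:.3g} Bq/L before rounding")
    # After rounding, these correspond to the MACs established in this document:
    # 7000, 5, 6, 10, 0.5 and 0.2 Bq/L, respectively.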

The adult dose coefficient is assumed to be adequately protective of both children and adults for the following reasons:

  • the MACs are based on a dose of only 0.1 mSv/year, which is a small fraction of the annual dose received from the natural background; and
  • the higher age-dependent dose coefficients calculated for children do not lead to significantly higher doses because of the lower mean volume of drinking water consumed by infants and children and their higher metabolic rates.

Although MACs have been established for the radionuclides most commonly detected in Canadian drinking water sources, guidance is also provided in Appendix A for an additional 78 radiological parameters for which ICRP dose coefficients have been established but which are not expected to be found in Canadian drinking water sources. It is not recommended that monitoring be carried out for these additional radionuclides. Concentrations have been calculated, for information purposes only, using their dose coefficients and the same formula and assumptions as for the MACs. They represent the theoretical level at which potential health effects could occur following long-term exposure to an individual radionuclide in drinking water.

9.1 Summation formula

The radiological effects of two or more radionuclides in the same drinking water source are assumed to be additive. Thus, the following summation formula should be satisfied in order to demonstrate compliance with the guidelines:

Summation formula:

(C1 / MAC1) + (C2 / MAC2) + ... + (Cn / MACn) ≤ 1

where Ci and MACi are the observed and maximum acceptable concentrations, respectively, for each contributing radionuclide. Only those radionuclides that are detected with at least 95% confidence should be included in the summation. Detection limits of undetected radionuclides should not be substituted for the concentrations Ci. Otherwise, a situation could arise where a sample fails the summation criterion even though no radionuclides are present.
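
A compliance check can be written directly from this definition. In the sketch below, only radionuclides actually detected (with at least 95% confidence) appear in the input, as required above; the MAC values are those established in this document, and the function name and structure are assumptions of the example.

    # Minimal compliance check for the summation formula: sum(Ci / MACi) <= 1.
    # Only radionuclides detected with at least 95% confidence should be included.
    MAC_BQ_L = {"210Pb": 0.2, "226Ra": 0.5, "90Sr": 5, "131I": 6, "137Cs": 10,
                "tritium": 7000}

    def complies(detected_bq_per_l):
        """Return (pass/fail, sum of ratios) for the detected concentrations."""
        total = sum(conc / MAC_BQ_L[nuclide]
                    for nuclide, conc in detected_bq_per_l.items())
        return total <= 1.0, total

    # Example: well water with detected 226Ra and 210Pb.
    ok, ratio_sum = complies({"226Ra": 0.2, "210Pb": 0.05})
    print(ok, round(ratio_sum, 2))   # True 0.65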

9.2 Lower limit of detection

A corollary of the above discussion is that the lower limit of detection (LLD) for a given radionuclide should always be less than its MAC. The specification of an LLD for a given procedure (Currie, 1968) means that if one cannot detect the radionuclide, one can be 95% confident that it is not present at a concentration greater than its LLD. If there were only one radionuclide of concern, then in theory it would be acceptable to have its LLD equal to its MAC. A problem arises if more than one radionuclide is present, and this often occurs for natural radionuclides in drinking water. Here it is assumed that no more than five radionuclides are likely to be present at concentrations approaching their respective MACs. Therefore, any testing procedure should aim to achieve an LLD not greater than 20% of the MAC of any radionuclide likely to be present, so that even five undetected radionuclides, each just below its detection limit, could not together exceed the summation criterion. In the experience of most laboratories involved in the radiological analysis of drinking water samples, these detection limits are readily achievable.

9.3 Chemical limits

Chemical limits should not be included in the radiological summation formula. This applies particularly to isotopes of uranium,Footnote 2 where both chemical and radiological limits may apply. MACs for chemical carcinogens are derived using different assumptions, and it is not appropriate to combine them with radiological MACs in the summation formula.

10.0 Rationale

MACs in drinking water have been established for three natural (210Pb, 226Ra, and total uranium in chemical form) and four artificial (tritium, 137Cs, 90Sr, and 131I) radionuclides. These represent the natural and artificial radionuclides that are most commonly detected in Canadian water supplies. The MACs are derived using internationally accepted equations and principles and are based solely on health considerations. They are calculated using a reference dose level for 1 year's consumption of drinking water, assuming a consumption of 2 L/day at the MAC.

MACs for radionuclides do not take into consideration treatment or analytical limitations. Treatment of water supplies for radionuclides should be governed by the principle of keeping exposures as low as reasonably achievable, social and economic considerations being taken into account. MACs apply to routine operational conditions of existing or new water supplies, but they do not apply in the event of contamination during an emergency involving a large release of radionuclides into the environment.

The health risk from ingesting radon-contaminated drinking water is considered negligible, because most of the radon escapes at the faucet or water outlet, leaving only minimal amounts in the water itself. However, it should be noted that radon levels in drinking water, if sufficiently elevated, can significantly affect airborne radon concentrations. Where indoor air radon concentrations exceed 200 Bq/m3 as an annual average concentration in the normal living area, then the source of the radon should be investigated, including through the monitoring of concentrations in drinking water. If radon concentrations in drinking water exceed 2000 Bq/L, it is recommended that actions be taken to reduce the release of radon from the drinking water into indoor air.

11.0 References

Annanmäki, M. (ed.) (2000) Treatment techniques for removing natural radionuclides from drinking water. Final report of the TENAWA Project, STUK-A169. Radiation and Nuclear Safety Authority, Helsinki.

APHA, AWWA, and WEF (1998) Standard methods for the examination of water and wastewater. 20th edition. American Public Health Association, American Water Works Association, and Water Environment Federation, Washington, DC.

APHA, AWWA, and WEF (2005) Standard methods for the examination of water and wastewater. 21st edition. American Public Health Association, American Water Works Association, and Water Environment Federation, Washington, DC.

ASTM (1998) ASTM annual book of standards. Vol. 11.02. American Society for Testing and Materials, Philadelphia, PA.

ASTM (2006) ASTM annual book of standards. Vol. 11.02. ASTM D3649 - 06 Standard Practice for High- Resolution Gamma-Ray Spectrometry of Water. American Society for Testing and Materials, Philadelphia, PA.

ATSDR (1999) Toxicological profile for uranium. Agency for Toxic Substances and Disease Registry, Public Health Service, U.S. Department of Health and Human Services, Atlanta, GA.

Baweja, A.S., Joshi, S.R., and Demayo, A. (1987) Radionuclide content of some Canadian surface waters: A report on the National Radionuclides Monitoring Program, 1981-1984. Water Quality Branch, Environment Canada, Ottawa.

Brina, R. and Miller, A.G. (1992) Direct detection of trace levels of uranium by laser-induced kinetic phosphorimetry. Anal. Chem., 64: 1413-1418.

Chant, L.A., Workman, W.J.G., King, K.J., and Cornett, R.J. (1993) Tritium concentrations in Lake Ontario. Atomic Energy of Canada Ltd., Chalk River (RC-1149, COG-93-484).

Chiu, N.W. and Dean, J.R. (1986) Radioanalytical methods manual. National Uranium Tailings Program, Canadian Centre for Mineral and Energy Technology, Canadian Government Publishing Centre (CANMET Report 78-22).

CNSC (2010) Canadian Nuclear Safety Commission.

Cothern, C.R. (1987) Estimating the health risks of radon in drinking water. J. Am. Water Works Assoc., 79(4): 153-158.

Crawford-Brown, D.J. (1989) The biokinetics and dosimetry of radon-222 in the human body following ingestion of groundwater. Environ. Geochem. Health, 11: 10-17.

Currie, L.A. (1968) Limits for qualitative detection and quantitative determination. Anal. Chem., 40(3): 586-593.

Darby, S., Hill, D., Auvinen, A., Barros-Dios, J.M., Baysson, H., Bochicchio, F., Deo, H., Falk, R., Forastiere, F., Hakama, M., Heid, I., Kreienbrock, L., Kreuzer, M., Lagarde, F., Mäkeläinen, I., Muirhead, C., Oberaigner, W., Pershagen, G., Ruano-Ravina, A., Ruosteenoja, E., Rosario, A.S., Tirmarche, M., Tomášek, L., Whitley, E., Wichmann, H.E., and Doll, R. (2005) Radon in homes and risk of lung cancer: Collaborative analysis of individual data from 13 European case-control studies. Br. Med. J., 330: 223.

Department of National Health and Welfare (1977) Chemical procedures for the determination of 89Sr, 90Sr, and 137Cs in surface waters, fresh-water algae and fresh-water fish. Ottawa (Report 77-EHD-14).

Drage, J., Baweja, A., and Wall, P. (2005) Naturally occurring radionuclides in Nova Scotia ground water. Canadian Radiation Protection Association Bulletin, Vol. 26, No. 3 (Part 1 of 2) and No. 4 (Part 2 of 2).

Durham, R.W. and Joshi, S.R. (1981) Concentrations of radionuclides in Lake Ontario water from measurements on water treatment plant sludges. Water Res., 15: 83-86.

EML (1983) EML procedures manual. 26th edition. H.L. Volchok and G. de Planque (eds.). Environmental Measurements Laboratory, U.S. Department of Energy, New York, NY (HASL-300).

Gosink, T.A., Baskaran, M., and Holleman, D.F. (1990) Radon in the human body from drinking water. Health Phys., 59(6): 919.

Government of Canada (2007) Indoor air quality guideline for radon. Canada Gazette, Vol. 141, No. 23, June 9.

Guzwell, K. (2002) Personal communication. Water Resources Management Division, Newfoundland and Labrador Department of the Environment, St. John's, Newfoundland and Labrador, December.

Health Canada (1999) Uranium. In: Guidelines for Canadian Drinking Water Quality supporting documentation. Health Canada, Ottawa.

Health Canada (2000) Environmental radioactivity in Canada 1989-1996. Available from Environmental Radiation Hazards Division, Radiation Protection Bureau, Health Canada, Ottawa [see also earlier editions of Environmental radioactivity in Canada].

Health Canada (2004) Point-of-use and point-of-entry treatment technologies for the removal of lead-210 and uranium from drinking water. Senes Consultants Ltd., Richmond Hill.

Hursh, J.B., Morken, D.A., Davis, R.P., and Lovass, A. (1965) The fate of radon ingested by man. Health Phys., 11: 465-476.

IARC (1988) Monographs on the Evaluation of the Carcinogenic Risk of Chemicals to Man. Man-made mineral fibres and radon. Geneva: World Health Organization, International Agency for Research on Cancer. ISBN 92 832 1243 6. Volume 43.

ICRP (1991) 1990 recommendations of the International Commission on Radiological Protection. Pergamon Press, Oxford. Ann. ICRP, 21(1-3) (ICRP Publication 60).

ICRP (1996) Age-dependent doses to members of the public from intake of radionuclides: Part 5. Compilation of ingestion and inhalation dose coefficients. International Commission on Radiological Protection. Pergamon Press, Oxford. Ann. ICRP, 26(1-3) (ICRP Publication 72).

Igarashi, Y., Kawamura, H., and Shiraishi, K. (1989) Determination of thorium and uranium in biological samples by inductively coupled plasma mass spectrometry using internal standardization. J. Anal. Atom. Spectrom., 4: 571-576.

IRC (1928) International critical tables of numerical data, physics, chemistry and technology. Vol. 3. International Research Council of the National Academy of Sciences, Washington, DC.

ISO (2007) Water quality - Measurement of gross alpha activity in non-saline water - Thick source method. International Organization for Standardization, Geneva (ISO 9696:2007).

ISO (2008) Water quality - Measurement of gross beta activity in non-saline water - Thick source method. International Organization for Standardization, Geneva (ISO 9697:2008).

Krewski, D., Lubin, J.H., Zielinski, J.M., Alavanja, M., Catalan, V.S., Field, R.W., Klotz, J.B., Létourneau, E.G., Lynch, C.F., Lyon, J.I., Sandler, D.P., Schoenberg, J.B., Steck, D.J., Stolwijk, J.A., Weinberg, C., and Wilcox, H.B. (2005) Residential radon and risk of lung cancer: A combined analysis of 7 North American case-control studies. Epidemiology, 16: 137-145.

Larivière, D., Whyte, J.C., Zhang, W., Hoffman, J., Ungar, R.K., Johnson, S., and Cornett, R.J. (2009) Rapid and automated analytical technologies for radiological/nuclear emergency preparedness. In: Nuclear chemistry: New research. A.N. Koskinen (ed.). Nova Science Publishers. pp. 99-154.

Life Systems, Inc. (1991) Radon in drinking water: Assessment of exposure pathways. Prepared for Office of Water, U.S. Environmental Protection Agency, June (TR-1242-87).

Lindell, B. (1968) Ingested radon as a source of human radiation exposure. In: Proceedings of the First International Congress of Radiation Protection. Pergamon Press, New York, NY. p. 719.

Lubin, J.H., Boice, J.D., Jr., Edling, C., Hornung, R.W., Howe, G.R., Kunz, E., Kusiak, R.A., Morrison, H.I., Radford, E.P., Samet, J.M., Tirmarche, M., Woodward, A., Xiang, Y.S., and Pierce, D.A. (1994) Radon and lung cancer risk: A joint analysis of 11 underground miners studies. National Institutes of Health, U.S. Department of Health and Human Services, Washington, DC (NIH Publication No. 94-3644).

Lubin, J.H., Boice, J.D., Edling, C., Hornung, R.W., Howe, G.R., Kunz, E., Kusiak, R.A., Morrison, H.I., Radford, E.P., Samet, J.M., Tirmarche, M., Woodward, A., Yao, S.X., and Pierce, D.A. (1995) Lung cancer in radon-exposed miners and estimation of risk from indoor exposure. J. Natl. Cancer Inst., 87(11): 817-827.

MOEE (1993) Radiological data 1992. Ontario Ministry of Environment and Energy, Toronto.

Nazaroff, W.W., Doyle, S.M., Nero, A.V., and Sexton, R.G. (1987) Potable water as a source of airborne 222Rn in U.S. dwellings: A review and assessment. Health Phys., 52: 281-295 [cited in Gosink et al., 1990].

NCRP (1987) Exposure of the population in the United States and Canada from natural background radiation. National Council on Radiation Protection and Measurements, Bethesda, MD (NCRP Report No. 94).

NCRP (1988) Measurement of radon and radon daughters in air. National Council on Radiation Protection and Measurements, Bethesda, MD (NCRP Report No. 97).

NSF International (2005a) Contaminant guide.

NSF International (2005b) Contaminant testing protocols.

Pylon (1989) Instruction manual for using Pylon Model 110A and 300A Lucas cells with the Pylon Model AB-5. Pylon Electronic Development Company Ltd., Ottawa. 43 pp.

Tracy, B.L. and Prantl, F.A. (1983) 25 years of fission product input to Lakes Superior and Huron. Water Air Soil Pollut., 19: 15-27.

Tracy, B.L. and Prantl, F.A. (1985) Radiological impact of coal-fired power generation. J. Environ. Radioactivity, 2: 145-160.

Underwood, N. and Diaz, J. (1941) A study of the gaseous exchange between the circulating system and the lungs. Am. J. Physiol., 13: 88.

UNSCEAR (1982) Ionizing radiation: Sources and biological effects. United Nations Scientific Committee on the Effects of Atomic Radiation, United Nations, New York, NY.

UNSCEAR (1988) Sources, effects and risks of ionizing radiation. United Nations Scientific Committee on the Effects of Atomic Radiation, United Nations, New York, NY.

UNSCEAR (2000) Sources, effects and risks of ionizing radiation. United Nations Scientific Committee on the Effects of Atomic Radiation, United Nations, New York, NY.

U.S. EPA (1987) Two test procedures for radon in drinking water. Appendix D. Analytical test procedure. U.S. Environmental Protection Agency, Washington, DC, March. p. 22 (EPA/600/2-87/082).

U.S. EPA (1991) National primary drinking water regulations, radionuclides; proposed rules. U.S. Environmental Protection Agency, Washington, DC. Fed. Regist., 56(138): 33050.

U.S. EPA (1999) National primary drinking water regulations; radon-222. U.S. Environmental Protection Agency, Washington, DC. Fed. Regist., 64(211).

U.S. EPA (2000a) Radionuclides notice of data availability technical support document. Prepared by Office of Groundwater and Drinking Water, U.S. Environmental Protection Agency, in collaboration with Office of Indoor Air and Radiation, U.S. EPA, and United States Geological Survey. March.

U.S. EPA (2000b) National primary drinking water regulations; radionuclides; final rule. U.S. Environmental Protection Agency, Washington, DC. 40 Code of Federal Regulations Parts 9, 141, and 142.

U.S. EPA (2008) Analytical methods approved for drinking water compliance monitoring of radionuclides. U.S. Environmental Protection Agency, Washington, DC.

U.S. NRC (1999a) Risk assessment of radon in drinking water. Prepared by the Committee on Risk Assessment of Exposure to Radon in Drinking Water, National Research Council. National Academy Press, Washington, DC.

U.S. NRC (1999b) Health effects of exposure to radon (BEIR VI). Prepared by the Committee on the Health Risks of Exposure to Radon (BEIR VI), Board of Radiation Effects Research, Commission on Life Sciences, National Research Council. National Academy Press, Washington, DC. p. 1048.

Veska, E. and Tracy, B.L. (1986) The migration of reactor produced tritium in Lake Huron. J. Environ. Radioactivity, 4: 31-38.

von Dobeln, W. and Lindell, B. (1965) Some aspects of radon contamination following ingestion. Ark. Fys., 27: 531-572.

WHO (2004) Radon and health information sheet. World Health Organization, Geneva. March.

WHO (2008) Guidelines for drinking-water quality, incorporating first and second addenda to third edition. Vol. 1. Recommendations. World Health Organization, Geneva.

Appendix A: Calculated Concentrations for selected radionuclides

(based on adults consuming 2 L of water per day; see Section 9.0 for calculations) Note: Radionuclides in bold represent those most commonly detected in Canadian drinking water supplies

Calculated Concentrations for selected Natural and Artificial radionuclides
Radionuclide Symbol Decay mode Half-life Adult dose coefficient (Sv/Bq) Calculated concentration (Bq/L)

Table b footnotes

Table b footnote 1: The activity concentration of natural uranium corresponding to the chemical guideline (MAC) of 0.02 mg/L is about 0.5 Bq/L.

Table b footnote 2: 14C is also produced naturally in the atmosphere in significant quantities.

Table b footnote 3: Tritium is also produced naturally in the atmosphere in significant quantities. HTO = tritiated water; OBT = organically bound tritium.

Natural radionuclides
Beryllium-7 7Be Electron capture 53.3 days 2.8 × 10−11 5000
Bismuth-210 210Bi Beta 5.01 days 1.3 × 10−9 100
Lead-210 210Pb Beta and gamma 22.3 years 6.9 × 10−7 0.2
Polonium-210 210Po Alpha 138 days 1.2 × 10−6 0.1
Radium-224 224Ra Alpha 3.66 days 6.5 × 10−8 2
Radium-226 226Ra Alpha 1600 years 2.8 × 10−7 0.5
Radium-228 228Ra Beta 5.76 years 6.9 × 10−7 0.2
Thorium-228 228Th Alpha 1.91 years 7.2 × 10-8 2
Thorium-230 230Th Alpha 75 400 years 2.1 × 10−7 0.6
Thorium-232 232Th Alpha 14 billion years 2.3 × 10−7 0.6
Thorium-234 234Th Beta 24.1 days 3.4 × 10−9 40
Uranium-234Table b footnote 1 234U Alpha 245 000 years 4.9 × 10-8 3
Uranium-235Table b footnote 1 235U Alpha 704 million years 4.7 × 10−8 3
Uranium-238Table b footnote 1 238U Alpha 4.47 billion years 4.5 × 10−8 3
Artificial radionuclides
Americium-241 241Am Alpha 432 years 2.0 × 10−7 0.7
Antimony-122 122Sb Beta 2.71 days 1.7 × 10−9 70
Antimony-124 124Sb Beta 60.2 days 2.5 × 10−9 50
Antimony-125 125Sb Beta 2.76 years 1.1 × 10−9 100
Barium-140 140Ba Beta 12.8 days 2.6 × 10−9 50
Bromine-82 82Br Beta 35.3 hours 5.4 × 10−10 300
Calcium-45 45Ca Beta 165 days 7.1 × 10-10 200
Calcium-47 47Ca Beta 4.54 days 1.6 × 10-9 100
Carbon-14Table b footnote 2 14C Beta 5730 years 5.8 × 10-10 200
Cerium-141 141Ce Beta 32.5 days 7.1 × 10-10 200
Cerium-144 144Ce Beta 284 days 5.2 × 10−9 30
Cesium-131 131Cs Electron capture 9.69 days 5.8 × 10−11 2000
Cesium-134 134Cs Electron capture / beta 2.07 years 1.9 × 10−8 7
Cesium-136 136Cs Beta 13.1 days 3.0 × 10−9 50
Cesium-137 137Cs Beta 30.2 years 1.3 × 10−8 10
Chromium-51 51Cr Electron capture 27.7 days 3.8 × 10−11 4000
Cobalt-57 57Co Electron capture 272 days 2.1 × 10−10 700
Cobalt-58 58Co Electron capture 70.9 days 7.4 × 10−10 200
Cobalt-60 60Co Beta 5.27 years 3.4 × 10-9 40
Gallium-67 67Ga Electron capture 78.3 hours 1.9 × 10-10 700
Gold-198 198Au Beta 2.69 days 1.0 × 10-9 100
Indium-111 111In Electron capture 2.81 days 2.9 × 10-10 500
Iodine-125 125I Electron capture 59.9 days 1.5 × 10−8 10
Iodine-129 129I Beta 16.0 million years 1.1 × 10-7 1
Iodine-131 131I Beta 8.04 days 2.2 × 10-8 6
Iron-55 55Fe Electron capture 2.68 years 3.3 × 10−10 400
Iron-59 59Fe Beta 44.5 days 1.8 × 10-9 70
Manganese-54 54Mn Electron capture 312 days 7.1 × 10−10 200
Mercury-197, methyl 197Hg Electron capture 64.1 hours 9.9 × 10-11 1000
Mercury-197, organic 197Hg Electron capture 64.1 hours 1.7 × 10-11 700
Mercury-197, inorganic 197Hg Electron capture 64.1 hours 2.3 × 10-10 600
Mercury-203, methyl 203Hg Beta 46.6 days 1.9 × 10-9 70
Mercury-203, organic 203Hg Beta 46.6 days 1.1 × 10-9 100
Mercury-203, inorganic 203Hg Beta 46.6 days 5.4 × 10-10 300
Molybdenum-99 99Mo Beta 65.9 hours 6.0 × 10-10 200
Neptunium-239 239Np Beta 2.35 days 8.0 × 10-10 200
Niobium-95 95Nb Beta 35.0 days 5.8 × 10−10 200
Phosphorus-32 32P Beta 14.3 days 2.4 × 10-9 60
Plutonium-238 238Pu Alpha 87.7 years 2.3 × 10-7 0.6
Plutonium-239 239Pu Alpha 24 100 years 2.5 × 10−7 0.6
Plutonium-240 240Pu Alpha 6560 years 2.5 × 10-7 0.6
Plutonium-241 241Pu Beta 14.4 years 4.8 × 10-9 30
Rhodium-105 105Rh Beta; 35.4 hours 3.7 × 10-10 400
Rubidium-81 81Rb Electron capture 4.58 hours 5.4 × 10-11 3000
Rubidium-86 86Rb Beta 18.6 days 2.8 × 10−9 50
Ruthenium-103 103Ru Beta 39.2 days 7.3 × 10−10 200
Ruthenium-106 106Ru Beta 373 days 7.0 × 10−9 20
Selenium-75 75Se Electron capture 120 days 2.6 × 10−9 50
Silver-108m 108mAg Electron capture / isomeric transition 127 years 2.3 × 10−9 60
Silver-110m 110mAg Beta / isomeric transition 250 days 2.8 × 10-9 50
Silver-111 111Ag Beta 7.47 days 1.3 × 10−9 100
Sodium-22 22Na Electron capture 2.61 years 3.2 × 10−9 40
Strontium-85 85Sr Electron capture 64.8 days 5.6 × 10−10 200
Strontium-89 89Sr Beta 50.5 days 2.6 × 10−9 50
Strontium-90 90Sr Beta 29 years 2.8 × 10−8 5
Sulphur-35, organic 35S Beta 87.2 days 7.7 × 10−10 200
Sulphur-35, inorganic 35S Beta 87.2 days 1.3 × 10−10 1000
Technetium-99 99Tc Beta 213 000 years 6.4 × 10−10 200
Technetium-99m 99mTc Isomeric transition / beta 6.01 hours 2.2 × 10−11 6000
Tellurium-129m 129mTe Isomeric transition / beta 33.4 days 3.0 × 10−9 50
Tellurium-131m 131mTe Beta / isomeric transition 32.4 hours 1.9 × 10−9 70
Tellurium-132 132Te Beta 78.2 hours 3.8 × 10−9 40
Thallium-201 201Tl Electron capture 3.04 days 9.5 × 10−11 1000
Tritium, HTOTable b footnote 3 3H Beta 12.3 years 1.8 × 10−11 7000
Tritium, OBTTable b footnote 3 3H Beta 12.3 years 4.2 × 10-11 3000
Ytterbium-169 169Yb Electron capture 32.0 days 7.1 × 10−10 200
Yttrium-90 90Y Beta 64 hours 2.7 × 10−9 50
Yttrium-91 91Y Beta 58.5 days 2.4 × 10−9 60
Zinc-65 65Zn Electron capture 244 days 3.9 × 10−9 40
Zirconium-95 95Zr Beta 64.0 days 9.5 × 10−10 100

Appendix B: Glossary of terms and units, conversion factors, and acronyms

Glossary

Absorbed dose:
Quantity of energy imparted by ionizing radiation to unit mass of matter such as tissue. Unit gray, symbol Gy. 1 Gy = 1 joule per kilogram. See also Table B-1 below.
Actinides:
A group of 15 elements with atomic numbers from 89 to 103 inclusive. All are radioactive, and they include thorium, uranium, plutonium, and americium.
Activity:
The rate at which transformations occur in a radioactive substance. Unit becquerel, symbol Bq. 1 Bq = 1 transformation or disintegration per second. See also Table B-1 below.
Alpha particle:
A particle consisting of two protons and two neutrons.
Becquerel:
See Activity.
Beta particle:
An electron emitted by the nucleus of a radionuclide. The electric charge may be positive, in which case the beta particle is called a positron.
Committed effective dose:
The effective dose that will be accumulated over a period of time following a single intake of radioactive material into the body. Standard periods of integration are 50 years for adults and 70 years for a lifetime exposure.
Decay:
The process of spontaneous transformation of a radionuclide. The decrease in the activity of a radioactive substance.
Decay product:
A nuclide or radionuclide produced by decay. It may be formed directly from a radionuclide or as a result of a series of successive decays through several radionuclides.
Dose:
General term for quantity of radiation. See Absorbed dose, Committed effective dose, Effective dose, Equivalent dose.
Dose coefficient:
The committed effective dose resulting from the inhalation or ingestion of 1 Bq of a given radionuclide. Unit sievert per becquerel, symbol Sv/Bq.
Effective dose:
The quantity obtained by multiplying the equivalent doses to various tissues and organs by the tissue weighting factor appropriate to each and summing the products. Unit sievert, symbol Sv.
Electron capture:
Nuclear decay in which a proton in the nucleus acquires an electron from the outer cloud of the atom's electrons. This converts the proton to a neutron, reducing the number of protons in the nucleus by one and the atomic number of the original element by one. Atomic mass number remains constant because the total number of protons and neutrons is unchanged.
Equivalent dose:
The quantity obtained by multiplying the absorbed dose by the appropriate radiation weighting factor to allow for the differing effectiveness of the various ionizing radiations in causing harm to tissue. Unit sievert, symbol Sv. See also Table B-1 below.
Gamma ray:
A discrete quantity of electromagnetic energy, without mass or charge.
Gray:
See Absorbed dose.
Half-life:
The time taken for the activity of a radionuclide to lose half its value by decay. Symbol t½.
Ionization:
The process by which a neutral atom or molecule acquires or loses an electric charge. The production of ions.
Ionizing radiation:
Radiation that produces ionization in matter.
Latency period:
The time between the actual exposure to a carcinogen and the development of cancer.
Nuclear fission:
The process in which a nucleus splits into two or more nuclei and energy is released.
Radionuclide:
An unstable nuclide that emits ionizing radiation.
Risk factor:
The probability of fatal cancer or leukaemia per unit effective dose.
Secular equilibrium:
A situation in which the quantity of a radioactive isotope remains constant because its production rate (due, e.g., to decay of a parent isotope) is equal to its decay rate.
Sievert:
See Effective dose.
Table B-1. Relationship between old and new radiation units
Quantity Old unit Symbol New unit Symbol Relationship
Activity curie Ci becquerel Bq 1 Ci = 3.7 × 10^10 Bq
Absorbed dose rad rad gray Gy 1 rad = 0.01 Gy
Equivalent dose rem rem sievert Sv 1 rem = 0.01 Sv

Table B-2: Conversion factors
Multiple Prefix Symbol
10^12 tera T
10^9 giga G
10^6 mega M
10^3 kilo k
10^-3 milli m
10^-6 micro µ
10^-9 nano n
10^-12 pico p

Acronyms

ANSI
American National Standards Institute
CANDU
Canadian Deuterium Uranium class of fission reactor
CNSC
Canadian Nuclear Safety Commission
DC
dose coefficient
DNA
deoxyribonucleic acid
GAC
granular activated carbon
HTO
tritiated water
ICRP
International Commission on Radiological Protection
ISO
International Organization for Standardization
LLD
lower limit of detection
MAC
maximum acceptable concentration
NSF
NSF International
OBT
organically bound tritium
SCC
Standards Council of Canada
SI
International System of Units
TDS
total dissolved solids
TNT
trinitrotoluene
U.S. EPA
United States Environmental Protection Agency
WHO
World Health Organization

Appendix C: Provincial/territorial monitoring surveys

Table C-1. Summary of the results of 14 years of monitoring for natural radionuclides at Elliot Lake, Port Hope, and Regina
Year Elliot Lake Port Hope Regina
226Ra (Bq/L) Uranium (µg/L) 226Ra (Bq/L) Uranium (µg/L) 226Ra (Bq/L) Uranium (µg/L)

Table C-1 footnotes

Table C-1 footnote 1: ND = less than the detection limit of 0.005 Bq/L.

1983 0.015 2.0 NDTable C-1 footnote 1 0.7 ND 7.1
1984 0.016 1.7 ND 0.8 ND 7.7
1985 0.019 1.1 ND 0.6 ND 5.3
1986 0.017 1.1 ND 0.8 ND 4.0
1987 0.016 0.6 ND 0.6 ND 2.7
1988 0.014 1.2 ND 1.0 ND 2.2
1989 0.014 0.7 ND 0.6 ND 3.0
1990 0.018 0.9 ND 0.7 ND 4.7
1991 0.008 0.5 ND 0.5 ND 2.0
1992 0.012 0.4 ND 0.5 ND 4.4
1993 0.01 0.5 ND 0.4 ND 2.8
1994 0.009 0.5 ND 0.4 ND 2.7
1995 0.007 0.6 ND 0.4 ND 1.9
1996 0.007 0.6 ND 0.4 ND 1.3

Table C-2. Summary of 137Cs and 90Sr measurements in the waters of the Great Lakes from 1973 to 1981
Year Measurements in the open lakes (mBq/L) Measurements near reactors (mBq/L)
Superior Michigan Huron Erie Ontario Toronto Pickering Ajax


137Cs
1973 2.8 1.8 1.5 0.8 1.8 9.3 11.1 4.8
1974 2.8   1.6     2.2 13.7 4.1
1975     1.6     1.1 3.0 1.1
1976     0.9   0.7 1.9 1.5 3.3
1977     1.4 0.8 0.8      
1978 1.8   1.3 0.7 0.9      
1979 1.7 1.3 1.0 0.6 0.7      
1980 2.0 1.6 1.6 0.6 1.0      
1981 1.7 1.2 1.2 0.6 0.8 1.2   1.4
90Sr
1973 19 31 32 41 47 35 31 34
1974           32 35 57
1975           34 30 35
1976           31 30 30
1977     31 30 35      
1978     37          
1979 19 31 29 31 31      
1980 12 23 25 15 42      
1981 16 17 23 23 25 20   15

Appendix D: Summary paragraphs for radionuclides with MACs

The radionuclides most commonly found in Canadian drinking water and with potential long-term health consequences are described below. The incorporation of new information has caused the MACs for some radionuclides to increase, whereas others have decreased or remained the same.

Lead-210
Lead-210 is a bone-seeking radionuclide that often occurs together with its alpha-emitting decay product, 210Po. The MAC of 210Pb is 0.2 Bq/L, up from 0.1 Bq/L, its previous guideline. This change is due mainly to a reduction from 20% to 10% in the reported uptake factor for lead from the gastrointestinal tract.

The radiological MAC for 210Pb should not be confused with the chemical MAC for stable lead of 0.01 mg/L. A 210Pb concentration at the radiological MAC of 0.2 Bq/L would correspond to a total lead concentration of only about 7 × 10^-14 g/L. The presence or absence of stable lead in drinking water has no bearing on the presence or absence of 210Pb.
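
The mass equivalent quoted above can be checked from the specific activity of 210Pb (half-life 22.3 years, as listed in Appendix A). The calculation below is an illustrative sketch, not part of the guideline derivation.

    import math

    # Illustrative check of the mass of 210Pb corresponding to its radiological
    # MAC of 0.2 Bq/L, using the 22.3-year half-life listed in Appendix A.
    AVOGADRO = 6.022e23
    YEAR_S = 3.156e7
    half_life_s = 22.3 * YEAR_S
    molar_mass_g = 210.0

    specific_activity_bq_per_g = math.log(2) / half_life_s * AVOGADRO / molar_mass_g
    mass_at_mac_g_per_l = 0.2 / specific_activity_bq_per_g
    print(f"{specific_activity_bq_per_g:.2e} Bq/g, {mass_at_mac_g_per_l:.1e} g/L")
    # Roughly 2.8e12 Bq/g, so 0.2 Bq/L corresponds to about 7e-14 g/L of 210Pb.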

Radium isotopes
Radium, like strontium, is a bone-seeking element of the alkaline earth family. About 20% of ingested radium is absorbed from the gastrointestinal tract. Radium-226 (half-life = 1600 years) and 224Ra (half-life = 3.66 days) decay by alpha particle emission. Radium-228 (half-life = 5.75 years) decays by beta emission, via short-lived 228Ac, to 228Th, which is an alpha emitter. In the current guidelines, the MAC for 224Ra remains unchanged at 2 Bq/L; that for 226Ra is reduced slightly, from 0.6 to 0.5 Bq/L; and that for 228Ra is reduced by a larger amount, from 0.5 to 0.2 Bq/L. These changes reflect more refined models of the internal metabolism of radium that have become available.

Uranium isotopes
Natural uranium consists of three isotopes (234U, 235U, and 238U), with half-lives ranging from hundreds of thousands of years (234U) to billions of years (238U). As a result of these long half-lives, the specific activities of the uranium isotopes are very low. The MAC for uranium in drinking water is based on its chemical toxicity rather than its radiological properties. However, radiological MACs of about 3 Bq/L for each uranium isotope have been included in Appendix A so that they can be included in the radiological summation formula. At equilibrium, a total uranium concentration at its chemical MAC of 20 µg/L would correspond to 0.25 Bq/L of 238U, 0.01 Bq/L of 235U, and 0.25 Bq/L of 234U. Note, however, that for uranium dissolved in groundwater, the 238U and 234U isotopes may be in disequilibrium by up to a factor of 2.
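
The activity figures above can be reproduced from the isotopic half-lives and natural abundances. The Python sketch below is illustrative only; it assumes standard reference values for the abundances and half-lives, and secular equilibrium between 238U and 234U.

```python
# Activity concentration of each uranium isotope when total uranium is at 20 ug/L.
import math

N_AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7
TOTAL_URANIUM_G_PER_L = 20e-6            # 20 ug/L = 0.02 mg/L

# (mass abundance, half-life in years, molar mass) -- assumed reference values
isotopes = {
    "U-238": (0.9927, 4.47e9, 238.0),
    "U-235": (0.0072, 7.04e8, 235.0),
}

activities = {}
for name, (abundance, half_life_y, molar_mass) in isotopes.items():
    decay_constant = math.log(2) / (half_life_y * SECONDS_PER_YEAR)    # 1/s
    atoms_per_litre = TOTAL_URANIUM_G_PER_L * abundance * N_AVOGADRO / molar_mass
    activities[name] = decay_constant * atoms_per_litre                # Bq/L

# At secular equilibrium, U-234 carries the same activity as its parent U-238.
activities["U-234"] = activities["U-238"]

for name, activity in activities.items():
    print(f"{name}: {activity:.2f} Bq/L")      # ~0.25, 0.01 and 0.25 Bq/L

# Contribution to the radiological summation, using ~3 Bq/L screening values:
print(f"sum of ratios: {sum(a / 3.0 for a in activities.values()):.2f}")   # ~0.17
```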

Experimental studies in adult humans consistently show that absorption of uranium by the oral route is less than 5% (Health Canada, 1999), with the average being 1-2% (ATSDR, 1999). Once in the blood, approximately 67% of the uranium is filtered by the kidneys and leaves the body in urine within 24 hours; the remainder is distributed to tissues. This may explain why no radiation-related cancers have been identified in humans exposed to naturally occurring levels of uranium, or even to highly enriched uranium (ATSDR, 1999).

For information on the chemical aspects of uranium toxicity, the reader should refer to the Guideline Technical Document on uranium (Health Canada, 1999).

Tritium
Tritium, with a half-life of 12.3 years, exists in the environment mainly as tritiated water (HTO), but it may also occur in vegetation as organically bound tritium (OBT). Ingested HTO is quickly absorbed into the bloodstream and remains in the body for 2-18 days. Tritium gives off low-energy (18.6 keV) beta particles, with no associated gamma rays or X-rays. Consequently, the dose coefficient for tritium is quite low, and the resulting MAC is 7000 Bq/L, unchanged from its previous value.

Cesium-137
Cesium-137, an alkali metal with properties similar to those of potassium, is one of the more important fission products because of its relatively high yield and its ability to bioconcentrate in some food chains. With a half-life of 30.17 years, it has been released into the environment from nuclear weapons tests and from reactor emissions (e.g., the Chernobyl accident in 1986). Fixation by sediments in aquatic environments reduces its concentration in water bodies. Ingested 137Cs is readily absorbed into soft tissues, but is eliminated relatively quickly. Consequently, the MAC is set at 10 Bq/L, which is somewhat less restrictive than that for 90Sr and is unchanged from its previous value.

Strontium-90
Strontium-90 has been monitored extensively in the environment since the beginning of nuclear weapons testing. It decays by pure beta emission with a half-life of 29 years. As an alkaline earth element, strontium is similar to calcium and follows calcium through the food chain to the human body, where it is retained largely in teeth and bone, with half-times of 3-7 years. Because of this long residence time in bone and because of the relatively high energy (546 keV) of its beta particles, the MAC of 90Sr has been set at 5 Bq/L, unchanged from its previous value.

Radioactive strontium (90Sr) should not be confused with stable strontium. The two species of strontium have quite different origins, and their concentrations in drinking water are not correlated.

Radioiodine
Radioactive isotopes of iodine are produced by nuclear fission and activation processes. They have received extensive study in view of their mobility and their selective irradiation of the thyroid gland when taken into the body. A number of iodine isotopes are routinely used in nuclear medicine procedures and thus have the potential to be released into water bodies through sewage effluent. Fortunately, all of the iodine isotopes except 129I are short-lived. The different iodine isotopes have MACs ranging from 1 to 10 Bq/L, which remain unchanged from the previous values.

Footnotes

Footnote 1

The term "radon" is used in this Guideline Technical Document to refer to the isotope 222Rn. Other radon isotopes are also ubiquitous in nature but generally present a much smaller risk than 222Rn.

Footnote 2

Based on chemical criteria, uranium is not considered to be a human carcinogen and is classified in Group V (inadequate data for evaluation of carcinogenicity) (Health Canada, 1999).
