Abbreviations: AI, adequate intake; AMDR, acceptable macronutrient distribution range; DRI, dietary reference intake; EAR, estimated average requirement; EER, estimated energy requirement; FNB, Food and Nutrition Board; IOM, Institute of Medicine; LOAEL, lowest-observed-adverse-effect level; NAS, National Academy of Sciences; NOAEL, no-observed-adverse-effect level; RDA, recommended dietary allowance; SEBR, systematic evidence-based review; UL, tolerable upper intake level.
A discussion of dietary reference intakes (DRIs) gives rise to several questions: What are the DRIs? How are they derived? How should they be used? DRIs include several types of nutrient reference values and are essentially tools for assessing and planning diets. Like any tool, their use is best accomplished through an understanding of their underpinnings and intended purpose.
Nutrient reference values were first developed more than 70 years ago to address the population-oriented tasks of planning food supplies and food programs for large groups and assessing the adequacy of a population’s overall diet. As they have evolved, their application as guidance for individuals has required new considerations, given that a particular person’s true nutrient need is unknown. Further, although nutrient reference values historically addressed concerns about deficiencies and, later, interest in intakes that promote health, newer research has also raised concerns about too much of a nutrient. Understandably, nutrient reference values can change in response to the evolution of the science surrounding their development.
This chapter describes the DRIs and the factors that underpin their development and application. Two emerging issues are also highlighted. It is important to note that the DRIs are focused on specifying requirements for constituents of food that are essential or important to health; they do not address the larger picture of desirable dietary patterns or food guidance. However, the Dietary Guidelines for Americans (1) makes use of the DRIs, as does the Canadian Food Guide (2).
BACKGROUND
In the early 1940s, in a time of war, there were concerns about the nutritional health of US military recruits as well as about the adequacy of the food supply to ensure a healthy and fit population (3). Work began at the US National Academy of Sciences (NAS) to develop national nutrient reference values that could be used to assess the adequacy of the US food supply and to plan diets containing the essential nutrients for the military, schools, and other feeding programs.
The first values, known as the recommended dietary allowances (RDAs), were issued in April 1941 by an entity that would become the Food and Nutrition Board (FNB) of the NAS (4). Revisions occurred approximately every 5 years through 1989. The FNB nutrient reference values were reconfigured in 1994 as DRIs in response to advances in scientific and statistical understandings as well as recognition that, for many nutrients, a single reference value did not meet the expanding users’ needs (5). Canada adopted the RDAs in 1944 (6) and then, in 1948, began issuing its own reference values known as recommended nutrient intakes. In the 1990s, Canada joined the US government in supporting the development of DRIs by the FNB (7).
There are several important changes to the development of nutrient reference values, as outlined in 1994 (5) and incorporated into the DRIs. These are the following:
Clarification that development of DRIs is based on probability and risk, and that the statistical concept of a distribution underpins development and application
Specification of an average requirement in addition to a recommended level of intake adequate to meet the known nutritional needs of practically all healthy persons (approximately 97.5%)
Identification of tolerable upper levels of intake, given changes in the food supply, including increased use of dietary supplements
Greater consideration of chronic disease endpoints as well as more traditional endpoints in determining nutrient adequacy or adverse health effects
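The statistical concepts in the list above can be sketched in a few lines of code. In the DRI reports, when nutrient requirements are assumed to be normally distributed, the RDA is set 2 standard deviations above the EAR; when the standard deviation of the requirement is unknown, a 10% coefficient of variation is commonly assumed, giving RDA = 1.2 × EAR. The EAR value below is invented purely for illustration.

```python
from statistics import NormalDist

def rda_from_ear(ear, sd=None, cv=0.10):
    """RDA = EAR + 2 SD of the requirement distribution.
    When SD is unknown, DRI reports commonly assume a 10% CV."""
    if sd is None:
        sd = cv * ear
    return ear + 2 * sd

# Hypothetical nutrient with an EAR of 100 units/day:
ear = 100.0
rda = rda_from_ear(ear)  # 120.0 with the default 10% CV

# Fraction of a normally distributed population whose
# requirement falls at or below the RDA (~97.5%):
coverage = NormalDist(ear, 0.10 * ear).cdf(rda)
print(rda, round(coverage, 3))  # 120.0 0.977
```

This is why the RDA is described as meeting the needs of "practically all" (approximately 97.5%) of healthy persons: 2 standard deviations above the median of a normal distribution covers about 97.7% of the population.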
Additionally, the original set of vitamins, minerals, protein, and calories specified in 1941 expanded over time to include a range of nutrient substances, including food constituents such as total fiber and water. Table 106.1 shows the increasing number of nutritional substances for which reference values have been developed.
The DRIs now serve as a science standard for federal nutrition guidance and are either the statutory or de facto standard for virtually all national nutrition assistance programs. Over time, they have been applied in everyday life—such as by bankruptcy courts to determine income for food expenses—and they have increasingly been used by dietetic practitioners in settings very different from those imagined in the 1940s, when the concern was an adequate food supply and military preparedness. They are now also used as a basis for nutritional standards in a number of other countries.
KEY COMPONENTS
In the 1990s, the DRIs were envisioned as a set of values that would include more than an RDA; specifically, additions were outlined that became known as the estimated average requirement (EAR) and the tolerable upper intake level (UL). Activities in 1997 and beyond added other types of nutrient reference values to the DRIs. The types of values that comprise the DRIs at this time are shown in Table 106.2 and are discussed in the next section.
DRIs, along with descriptive text, are contained in six volumes published by NAS between 1997 and 2004 (http://www.iom.edu/dris). These publications represent the first generation of DRIs. To help users understand the application of the DRIs, two publications were created to provide general guidance. One is focused on applications related to dietary assessment (8) and the other on applications for dietary planning (9). In 2006, NAS issued Dietary Reference Intakes: The Essential Guide to Nutrient Requirements (10), which is also available in French. It provides an overall summary of the DRIs through 2004. After a 2007 workshop that focused on lessons learned during the development of the initial set of DRIs (11), a new report initiated the second generation of DRIs. Specifically, in 2009 to 2010, the DRIs for calcium and vitamin D were reviewed and updated, and a report was published in 2011 (12). Tables containing all the current DRI values can be viewed at the website http://www.iom.edu/dris. A range of US and Canadian government agencies have provided support for DRI development.
The current life stage groups associated with the DRIs were determined at the same time the DRIs were developed, using the new 1994 approach for nutrient reference values. The life stage groupings took into account developmental and gender differences as well as additional factors such as the age at which young children enter institutional feeding settings, the onset of menarche, and the age at which retirement generally occurs, which potentially affect energy requirements (13). Additionally, these life stage differences mean that the outcomes of interest in setting a nutrient requirement or an upper intake level differ for different life stage groups. The specific life stage groups for the DRIs can be viewed by accessing the website containing the DRI tables.
TABLE 106.1 NUTRIENTS WITH ESTABLISHED REFERENCE VALUES: 1941-2010
a Values for chromium, copper, fluoride, pantothenic acid, biotin, manganese, and molybdenum were expressed in 1989 as estimated safe and adequate daily dietary intakes. Potassium was expressed as an estimated minimum requirement.
b Vitamin D and calcium were reviewed in 1997 and again in 2010. All other nutrients were reviewed once during the 1997-2004 time period. Carotenoids were reviewed, but no dietary reference intake (DRI) was established.
Adapted with permission from Yates AA. Dietary reference intakes: rationale and application. In: Shils ME, Shike M, Ross AC et al. Modern Nutrition in Health and Disease. 10th ed. Baltimore: Lippincott Williams & Wilkins, 2006:1672-7.
TABLE 106.2 NUTRIENT REFERENCE VALUES THAT COMPRISE DIETARY REFERENCE INTAKESa
NUTRIENT REFERENCE VALUE
DESCRIPTION
Estimated average requirement (EAR)
Reflects the estimated median daily requirement and is particularly appropriate for applications related to planning and assessing intakes for groups of persons
Recommended dietary allowance (RDA)
Derived from the EAR and intended to cover the requirements for 97%-98% of the population
Tolerable upper intake level (UL)
Highest average daily intake level that is likely to pose no risk of adverse health effects for nearly all persons in the general population
Adequate intake (AI)
Used when an EAR/RDA cannot be developed; average recommended daily intake level based on observed or experimental intakes
Acceptable macronutrient distribution range (AMDR)
An intake range for an energy source associated with reduced risk of chronic disease
Estimated energy requirement (EER)
Average daily dietary energy intake predicted to maintain energy balance in a healthy adult of defined age, gender, weight, height, and level of physical activity that is consistent with good health
aItalics denote DRI reference values developed after the initial DRI plan.
Adapted with permission from Food and Nutrition Board, Institute of Medicine. How Should the RDAs Be Revised? Washington, DC: National Academy Press, 1994.
FRAMEWORK FOR DEVELOPMENT
The basic task for developing DRIs appears simple: Identify the health outcome that the nutrient affects, and then determine how much of the nutrient causes that effect, so that a requirement for the nutrient can be specified. In the case of tolerable upper levels, the adverse effect must be identified and likewise the level that causes such an effect must be determined. For example, the question may be how much nutrient X ensures healthy bones or how much nutrient Y reduces the risk of coronary heart disease. In practice, the process is complicated and detailed. Data are often limited for key topic areas and for some life stage groups, and scientific judgment is required.
Fig. 106.1. Effect of nutrient intake on health outcome: direct versus indirect measurement.
The health outcomes of interest for DRIs—sometimes referred to as the “indicators” or “endpoints”—are considered in relation to the direct effect of intake on the health outcome, as shown by the upper line in Figure 106.1. More commonly, the indicator used relates to an indirect effect. That is, the effect of intake on a marker for the health outcome (bottom line in Fig. 106.1) is considered rather than a measure of the outcome itself. For example, the number and type of polyps are measured rather than the onset of a cancer.
The challenge is to ensure that the marker is a valid reflection of the health outcome of interest. With few exceptions, the approach to identifying and validating such markers is not as well developed for nutrients as for other substances such as pharmaceuticals and toxins. There is often confusion and inadvertent overlap regarding markers of intake and markers of effect, and guiding principles for ensuring the appropriateness of nutrient-related markers are not well articulated. Further, studies often fail to examine a range of intakes—that is, they are not designed to identify a dose-response relationship—but instead use a single intake or dose level. As described below, the ability to identify a dose-response relationship is critical to DRI development. There are often calls for more studies on dose-response relationships and on the relationship between a marker of a health outcome and the outcome it purports to reflect (11, 14).
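As a concrete, much simplified illustration of why dose-response data matter: given observations of the proportion of a study group meeting an adequacy criterion at several intake levels, the median requirement (the quantity the EAR is meant to capture) is the intake at which half the group is adequate. The intake levels and proportions below are hypothetical, and simple linear interpolation stands in for the formal curve-fitting a DRI committee would use.

```python
def ear_from_dose_response(intakes, prop_adequate, target=0.5):
    """Interpolate the intake at which the target proportion of the
    group meets the criterion (50% = median requirement = EAR)."""
    pairs = sorted(zip(intakes, prop_adequate))
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if y0 <= target <= y1:
            return x0 + (target - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("target proportion not bracketed by the data")

# Hypothetical trial: proportion meeting the criterion at each intake.
intakes = [40, 60, 80, 100, 120]          # units/day
adequate = [0.05, 0.20, 0.55, 0.85, 0.98]
est_ear = ear_from_dose_response(intakes, adequate)
print(round(est_ear, 1))  # intake at which ~50% are adequate
```

A study conducted at a single dose level cannot support this kind of estimate, which is why calls for dose-response designs recur in the DRI literature.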
The selection of health outcomes for DRI development usually relates to the following:
Evidence demonstrates a causal relationship
Selection of health outcome supports protection of public health
Adequacy: preference for an outcome associated with a relatively high intake requirement, not necessarily the outcome with the most data or even the strongest data
Upper level: preference for an adverse outcome occurring at a relatively low intake, not necessarily the outcome that is most “severe”
Selection may differ by life stage group
Some examples of indicators (i.e., the health outcomes and markers) used for DRI development for adequate intakes are shown in Table 106.3. Indicators can be described and labeled in various ways, for example, as “clinical,” “biochemical,” and “functional” measures. Factorial models and balance studies also are used.
TABLE 106.3 EXAMPLES OF INDICATORS (OUTCOMES AND MARKERS) USED FOR DIETARY REFERENCE INTAKE DEVELOPMENT FOR ADEQUACY REFERENCE VALUES
NUTRIENT
VALUE
INDICATOR
TYPE OF INDICATOR
Thiamin
EAR
Urinary thiamin excretion
Biochemical
Vitamin C
EAR
Antioxidant functions in leukocytes
Functional
Vitamin A
EAR
Amount of dietary vitamin A required to maintain a given body-pool size in well-nourished adults
Factorial model
Magnesium
EAR
Magnesium balance studies
Nutrient balance
Fluoride
AI
Prevention of dental caries
Clinical
Pantothenic acid
AI
Pantothenic acid intakes
Intake data (direct estimation)
AI, adequate intake; EAR, estimated average requirement.
Adapted with permission from a background paper developed by Dr. Margaret Cheney for a 2007 workshop. Food and Nutrition Board, Institute of Medicine. The Development of DRIs 1994-2004: Lessons Learned and New Challenges: Workshop Summary. Washington, DC: National Academies Press, 2008.