Approaches To Preventing Micronutrient Deficiencies





Lindsay H. Allen





HISTORICAL PERSPECTIVE

Since the 1930s, there has been a major evolution in our understanding of appropriate nutritional interventions for undernourished populations. The term “undernourished” encompasses signs of protein-energy malnutrition as well as growth stunting and evidence of specific nutrient deficiencies. In the 1930s, the main global nutrition problem was perceived to be lack of protein. Opinion shifted gradually in the 1960s to the general assumption that protein-energy malnutrition was the underlying problem, and in the 1970s to the recognition that, except in situations of severe food shortage (famine and hunger) or where staple foods are low in protein (e.g., cassava), protein deficiency is not the problem. In the late 1970s, the focus shifted to preventing undernutrition by promoting breast-feeding and improving complementary feeding by adding foods such as legumes. Lack of energy (from lack of food) was investigated in the 1980s as the possible cause of chronic undernutrition in the Nutrition Collaborative Research Support Program, but that research revealed that poor dietary quality and lack of specific micronutrients were the strongest predictors of growth stunting, delays in child development, and many other adverse outcomes (1, 2).

In the 1980s, the need and opportunity for micronutrient interventions began to receive major attention. It had been known for decades that severe deficiencies of vitamin A, iron, iodine, and other micronutrients increased mortality and morbidity and impaired child development, but there was little awareness before 1980 that marginal micronutrient deficiencies could adversely affect human function, or that many more functions were affected than was evident from the clinical symptoms of severe deficiency. Once this reality was recognized, along with the widespread prevalence of multiple micronutrient deficiencies resulting from poor-quality diets, the scientific community, agencies, governments, and others involved in improving nutrition developed and tested a wide range of interventions, including single and, later, multiple micronutrient supplements; fortification with single or multiple micronutrients; and food-based improvements.

This chapter describes options for delivering the micronutrients of most importance to public health. More detail on micronutrient assessment and function is available in the chapters on specific nutrients.


VITAMIN A

The World Health Organization (WHO) estimates that 5.2 million preschool children and 7 million pregnant women show clinical signs of vitamin A deficiency (most commonly night blindness), and that 190 million more are deficient without clinical symptoms. Most of these people live in South and Southeast Asia and sub-Saharan Africa.

The need for large-scale interventions to prevent vitamin A deficiency was recognized in the mid-1980s, when preschooler mortality in Sumatra, where deficiency was prevalent, was reduced by 34% with high-dose (200,000 IU) capsules given 6 months apart (3). This finding was later confirmed by a meta-analysis of additional studies (4). Currently, more than 70% of children 6 to 59 months old receive the recommended twice-yearly high-dose supplements (100,000 IU for those 6 to 11 months old and 200,000 IU for those 12 to 59 months old). Effective distribution of the supplements is often supported by public health campaigns promoting “vitamin A days” or combined vaccination and vitamin A promotions.
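The age-banded schedule above can be written as a small lookup. This is only an illustrative sketch; the function name and error handling are my own, not taken from any WHO program tool:

```python
def vitamin_a_dose_iu(age_months: int) -> int:
    """Return the twice-yearly high-dose vitamin A supplement (IU) for a child,
    following the age bands quoted in the text (6-11 and 12-59 months)."""
    if 6 <= age_months <= 11:
        return 100_000
    if 12 <= age_months <= 59:
        return 200_000
    raise ValueError("child is outside the 6-59 month target group")

print(vitamin_a_dose_iu(9))   # infant band: 100000
print(vitamin_a_dose_iu(24))  # older-child band: 200000
```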

A meta-analysis of 21 studies showed that neonatal high-dose supplementation reduced all-cause mortality by 12%, whereas supplementation given during the first 6 months of age had no effect (5). In children 6 to 59 months old, mortality was reduced by 25% and diarrhea by 30%. The reduction in deaths from measles or meningitis, although almost 30%, was not statistically significant.

The benefits of giving lower doses of vitamin A to women, starting in the periconceptional period, were investigated in two studies adequately powered to detect an effect on maternal and infant mortality. The supplements provided the recommended daily intake as a single weekly dose. The first study, in rural Nepal, showed a 40% reduction in pregnancy-related mortality with retinol as the supplement and a 49% reduction with β-carotene (6). However, when the study was replicated in rural Bangladesh, these supplements had no effect on pregnancy-related mortality, which the investigators attributed to the lower mortality rates (better delivery care) and better vitamin A status of the Bangladeshi women (7). Providing high-dose supplements to infants within a few days of birth has produced inconsistent effects on infant mortality (8). Additional trials addressing this question are ongoing in India, Ghana, and Tanzania.

For women of reproductive age, supplementation with high-dose preformed vitamin A (200,000 IU) is restricted to the first 6 weeks postpartum, when the risk of becoming pregnant is low. This restriction reflects the concern that a high dose could have teratogenic effects on the embryo; for the same reason, the tolerable upper intake level (UL) for preformed vitamin A is 3,000 µg retinol equivalents (RE) daily for women of reproductive age. β-Carotene does not present this risk. Supplementing breast-feeding mothers during early lactation increases the secretion of retinol in breast milk and improves infant vitamin A status. In fact, breast milk vitamin A concentration is a good indicator of the effectiveness of vitamin A intervention programs for women and infants (9). After the first 6 weeks postpartum, when high-dose supplements can no longer be used, maternal intake can be increased through low-dose supplements or foods rich in preformed vitamin A or β-carotene.

Food sources vary widely in their content of vitamin A and its precursor carotenoids. Animal-source foods, including milk, eggs, and liver, are good sources of retinol. Some fruits and vegetables contain β-carotene and other provitamin A carotenoids that can be converted to vitamin A and improve vitamin A status (10). One of the most concentrated natural sources of carotenoids is red palm oil, in which provitamin A carotenoids are present unless they are removed by processing (11). Biofortification to improve vitamin A status can be achieved effectively with orange-fleshed sweet potatoes and golden rice. For example, in Mozambique, Helen Keller International's Reaching Agents of Change project is providing orange-fleshed sweet potatoes to 600,000 households. Vitamin A-rich cassava is also being investigated.


IRON

Iron is often cited as being the most common nutrient deficiency in the world. It is certainly prevalent, especially in menstruating women and in infants and children, although “neglected” nutrients such as riboflavin and vitamin B12 may in fact be more common. Pending more definitive data, the WHO states that approximately 50% of anemia worldwide is the result of iron deficiency. Causes of the remainder are incompletely understood but include malaria, thalassemias, vitamin A deficiency, and parasitic infections such as hookworm and schistosomiasis. No strategy to control iron deficiency can be complete unless infections that cause iron deficiency are controlled.

Interventions to prevent and treat iron deficiency are justified because iron deficiency increases the risk of anemia, reduces work capacity and performance, increases the risk of depression, and impairs the cognitive development of children (12). Iron deficiency anemia in infancy may affect major dopamine pathways, leading to persistently poor inhibitory control and executive function later in childhood and adult life (13).

Iron can be provided as ferrous sulfate or other iron salts in tablets, or in syrup for infants and young children. These preparations can treat anemia within about 2 to 3 months or prevent the development of iron deficiency in young children and pregnant women. The recommended daily dose is 3 mg/kg for children up to 5 years of age and 60 mg for adults; 60 mg is the recommended upper limit because of the risk of intestinal distress at higher doses. Higher doses are often given in the mistaken belief that they will raise hemoglobin more effectively, but iron absorption is down-regulated as intake increases, and in some trials 20 mg/day was as effective as 60 mg for pregnant women.
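As a worked example of the dosing rule above, the following sketch is hypothetical: the function name is mine, and capping the pediatric dose at 60 mg is my added assumption (the text gives 60 mg only as the adult dose and upper limit):

```python
def daily_iron_mg(age_years: float, weight_kg: float) -> float:
    """Daily therapeutic iron dose from the rule quoted in the text:
    3 mg/kg for children up to 5 years of age, 60 mg for adults.
    Capping the child dose at 60 mg is an assumption, not stated in the text."""
    if age_years < 5:
        return min(3.0 * weight_kg, 60.0)
    return 60.0

print(daily_iron_mg(2, 12))   # 12-kg toddler -> 36.0 mg/day
print(daily_iron_mg(30, 70))  # adult -> 60.0 mg/day
```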

Iron supplements do not have to be consumed every day to be effective; taking a supplement once per week can reduce the risk of iron deficiency anemia. The WHO recommends once-weekly iron (60 mg of elemental iron, as 300 mg ferrous sulfate, 180 mg ferrous fumarate, or 500 mg ferrous gluconate) and folic acid (2,800 µg) supplementation for menstruating adolescent girls and adult women living in areas where the prevalence of anemia in this group is 20% or higher (14). Supplementation can be stopped for 3 months and then restarted for another 3 months. Efficacy for treating anemia depends predominantly on the total amount of iron delivered, not on the frequency of delivery; in Bangladesh, most of the hemoglobin response to 60 mg/day supplements occurred within the first 20 tablets, and the response plateaued after 40 tablets (15). In pregnant women, the demands for iron are particularly high, so daily, rather than weekly, supplementation is recommended (60 mg/day with 400 µg folic acid) (14). In areas where the prevalence of anemia is higher than 40%, the supplements should be continued for 3 months postpartum.
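The tablet sizes in the weekly regimen all deliver roughly the same elemental iron. The elemental-iron fractions used below (about 20%, 33%, and 12%) are standard chemistry values for these salts, not figures taken from the chapter:

```python
# Tablet size (mg) and approximate elemental-iron fraction for each salt
# named in the WHO weekly regimen; each works out to ~60 mg elemental iron.
SALTS = {
    "ferrous sulfate":   (300, 0.20),
    "ferrous fumarate":  (180, 0.33),
    "ferrous gluconate": (500, 0.12),
}

for name, (tablet_mg, iron_fraction) in SALTS.items():
    elemental_mg = tablet_mg * iron_fraction
    print(f"{name}: {tablet_mg} mg tablet ~ {elemental_mg:.1f} mg elemental iron")
```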

It is possible that supplemental iron exacerbates the adverse effects of malaria in infants and young children, especially in those who are not iron deficient initially and in areas where malaria prophylaxis and health care are poor (16). Research is ongoing to determine the underlying mechanisms and whether adverse effects may be avoided by providing the iron with meals or in fortified foods.

Iron fortification of food is a common strategy for preventing iron deficiency. This approach avoids problems common in supplementation programs, such as unreliable pill distribution and poor participant compliance. One review examined why iron deficiency anemia remains so prevalent even though the first flour fortification programs (in Canada, the United States, and the United Kingdom) started in the 1940s (17). The reasons include concerns about the safety of supplementation and fortification, technical constraints on adding iron to foods, the complexity of assessing iron status, and lack of knowledge about the adverse consequences of iron deficiency.

One review of the effectiveness of iron fortification of wheat flour concluded that only 7 of 78 countries evaluated would be likely to detect a positive effect even if the program were implemented effectively (18). The main reason is that millers have used less bioavailable iron compounds, such as atomized or hydrogen-reduced iron powders, because they cost less and do not cause adverse taste and color reactions with food; however, these compounds are poorly absorbed. Less reactive yet bioavailable forms of iron, such as ferrous fumarate, sodium iron ethylenediaminetetraacetic acid (NaFeEDTA), and micronized ferric pyrophosphate, are increasingly being used in flours, condiments (salt, curry powder, fish sauce, soy sauce), and complementary foods for infants. Where 150 to 300 g of wheat flour is consumed daily, the recommendation is to add 20 ppm iron as NaFeEDTA or 30 ppm as dried ferrous fumarate or ferrous sulfate (18). If sensory problems occur, or to reduce cost, 60 ppm of electrolytic iron can be used. Only NaFeEDTA is recommended for high-extraction wheat flour.
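The ppm figures translate directly into daily fortificant iron, because 1 ppm is 1 mg of iron per kilogram of flour. A quick sketch (the function name is mine, for illustration only):

```python
def fortificant_iron_mg_per_day(flour_g_per_day: float, iron_ppm: float) -> float:
    """Daily iron delivered by fortified flour.
    ppm = mg iron per kg flour, so daily mg = ppm * (g eaten / 1000)."""
    return iron_ppm * flour_g_per_day / 1000.0

# A person eating 200 g/day of flour fortified at 30 ppm (e.g., ferrous sulfate):
print(fortificant_iron_mg_per_day(200, 30))  # 6.0 mg iron/day
```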

In general, food-based strategies to improve iron status have been less effective than supplementation or fortification. Meat and meat products are often expensive, and even feeding 70 g daily for 9 months did not improve the iron status of young Guatemalan children (Allen et al, unpublished data). Increasing intake of food high in ascorbic acid did not improve iron status of Mexican women, even though their diet was high in poorly available iron (19). In many plant-based diets, iron bioavailability is poor because of the high content of phytates and/or tannins. Although absorption from such foods can be improved by soaking or pretreatment with phytases, this is not a common or popular practice.

Biofortification of staple foods shows some promise for improving the iron status of populations. For example, rice bred for a high iron content, which added only 1.4 mg/day to the usual dietary intake, slightly improved iron stores in Filipina women (20). Biofortified beans and pearl millet are being explored by HarvestPlus.

Jul 27, 2016 | Posted in Public Health and Epidemiology
