Chapter 1
Nutrients
History and Definitions
Discovery of the Nutrients
Early Observations
The belief that foods contained a single nutritional principle persisted for more than two millennia, until the nineteenth century, and impeded progress in understanding nutrition. Some observations recorded during the seventeenth and eighteenth centuries hinted at the scientific progress to come. For example, during the 1670s Thomas Sydenham, a British physician, observed that a tonic of iron filings in wine produced a clinical response in patients with chlorosis, a condition now recognized as hypochromic or iron-deficiency anemia (McCollum, 1957). In 1747 James Lind of Scotland, while serving as a naval surgeon, conducted a clinical trial of various proposed treatments for scurvy in sailors who were ill with the disease. He observed that consumption of citrus fruits (oranges and lemons), but not of other typical foods and medicines, cured the disease (Carpenter, 1986). Nevertheless, chlorosis and scurvy were not viewed as nutritional diseases, and the concept that a disease might be caused by a deficit of a nutritionally essential substance did not exist. Before 1900, toxins, heredity, and infections, but not yet nutrition, were recognized as causes of disease.
Recognition that Food is a Source of Specific Nutrients
By the early 1800s the elements carbon, nitrogen, hydrogen, and oxygen were recognized as the primary components of food, as was the need for the carbon-containing components of food as a substrate for combustion (heat production) (Carpenter, 2003a). Protein was identified as a specific essential dietary component through the observations of François Magendie in Paris in 1816. Magendie showed that dogs fed only carbohydrate or fat lost considerable body protein and weight within a few weeks, whereas dogs fed foods that contained nitrogen (protein) remained healthy. In 1827 William Prout, a physician and scientist in London, proposed that the nutrition of higher animals could be explained by their need for proteins, carbohydrates, and fats, and this explanation was widely accepted. During the next two decades, the need of animals for several mineral elements was demonstrated, and at least six mineral elements (Ca, P, Na, K, Cl, and Fe) had been established as essential for higher animals by 1850 (Harper, 1999; Carpenter et al., 1997).
The nineteenth-century German chemist Justus von Liebig postulated that energy-yielding substances (carbohydrates and fats) and proteins, together with a few minerals, represented the essentials of a nutritionally adequate diet, and he proposed that the nutritive value of foods and feeds could be predicted from knowledge of their gross chemical composition. Liebig prepared tables of food values based on this concept, an effort facilitated by the work of Wilhelm Henneberg, who devised the Weende system, known as proximate analysis, for analyzing foods and feeds for protein, fat, fiber, nitrogen-free extract (carbohydrate), and ash (McCollum, 1957). Throughout the remainder of the nineteenth century, nutritional thinking continued to be dominated by the belief that sources of energy, protein, and a few minerals were the sole principles of a nutritionally adequate diet.
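To make the bookkeeping of the Weende proximate scheme concrete, the short Python sketch below estimates the nitrogen-free extract (the carbohydrate fraction) by difference from the other proximate fractions; this is a minimal illustration, and the function name and example percentages are hypothetical rather than values taken from this chapter.

```python
# Minimal sketch of the Weende (proximate analysis) bookkeeping described above.
# All quantities are percentages of the feed as fed; the example values are
# hypothetical and are not data from this chapter.

def nitrogen_free_extract(moisture, crude_protein, crude_fat, crude_fiber, ash):
    """Estimate the nitrogen-free extract (NFE), the carbohydrate fraction,
    by difference from the other proximate fractions."""
    return 100.0 - (moisture + crude_protein + crude_fat + crude_fiber + ash)

if __name__ == "__main__":
    # Illustrative values for a hypothetical cereal grain (percent as fed).
    nfe = nitrogen_free_extract(moisture=12.0, crude_protein=10.5,
                                crude_fat=4.0, crude_fiber=2.5, ash=1.5)
    print(f"Nitrogen-free extract (carbohydrate by difference): {nfe:.1f}%")
```

Because the nitrogen-free extract is obtained as a residual rather than measured directly, it also absorbs the analytical errors of the other four fractions.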
Nevertheless, the validity of Liebig’s assumptions was challenged during the nineteenth century (McCollum, 1957). In 1843 Jonathan Pereira in England stated that diets containing a wide variety of foods were essential for human well-being, whereas diets containing only a few foods were associated with the acquisition of diseases such as scurvy. Jean-Baptiste Dumas, based on his observation that an artificial milk formula containing all of the known dietary constituents failed to prevent the deterioration of children’s health during the siege of Paris (1870-1871), also questioned the validity of Liebig’s assumptions. In addition, Nikolai Lunin (~1881), working in Gustav von Bunge’s laboratory in Dorpat, Estonia, conducted studies with mice in an effort to identify inadequacies in the mineral component of purified diets. He demonstrated that mice fed a diet composed of purified proteins, fats, carbohydrates, and a mineral mixture survived less than 5 weeks, whereas mice that received milk or egg yolk in addition to the purified components remained healthy throughout the experiment. Lunin concluded that milk must contain small quantities of other unknown substances essential to life, but von Bunge apparently did not encourage him, or subsequent students in his laboratory who made similar observations, to investigate what the active factor in milk might be. The Liebig–von Bunge view that nutritional requirements consisted only of protein, energy-yielding substances, and a few minerals still had such a hold on scientific thought that, rather than considering that these observations might point to the presence of other essential nutrients in foods, scientists attributed the inadequacies of the purified diets to mineral imbalances, failure to supply minerals as organic complexes, or lack of palatability (Wolf and Carpenter, 1997).
Also of significance were the studies of beriberi conducted during the nineteenth century. During the 1880s Kanehiro Takaki was concerned with the devastating effects of beriberi on sailors in the Japanese navy. Because of the superior health of British sailors, he compared the food supplies of the two navies and was struck by the higher protein content of the British rations. He therefore included evaporated milk and meat in the diet and substituted barley for part of the polished rice in the Japanese rations. These changes eradicated beriberi, which he attributed to the additional protein. In retrospect, we know that this conclusion was incorrect (beriberi is caused by thiamin deficiency), but it does imply that he correctly considered beriberi to be a disease caused by a nutritional inadequacy (Takaki, 1906). Christiaan Eijkman, an army physician in the Dutch East Indies, began his investigations of beriberi in the 1890s (Jansen, 1956). He had observed a high incidence of beriberi in the prisons in Java in which polished rice was a staple, and he assumed the disease was caused by chronic consumption of a diet consisting largely of polished rice. During his experimental studies he noted that chickens fed a military hospital diet composed mainly of polished rice developed a neurological disease resembling beriberi, whereas birds fed rice with the pericarp intact remained healthy. He concluded that ingestion of the high-starch diet resulted in the formation in the intestine of a substance that acted as a nerve poison and that rice hulls contained an antidote. Eijkman’s conclusion illustrates the fact that a connection between nutrient deficiency and disease was still a foreign concept at the end of the nineteenth century.
Recognition of the Connection of Diet and Disease
Resistance to the notion of nutritional deficiency diseases continued into the early twentieth century. As the number of recognized diet-associated diseases grew, however, the concept that a disease might be caused by a deficit of an essential nutrient slowly gained acceptance (Carpenter, 2003b).
In 1901 Gerrit Grijns, who took over Eijkman’s research in the Dutch East Indies in 1896, showed through feeding trials that Eijkman’s active substance was present in other foods (Jansen, 1956; Carpenter, 1995). After demonstrating that beriberi could be prevented by including rice polishings, beans, or water extracts of these foods in the diet, he proposed that beriberi was a dietary deficiency disease caused by the absence of an essential nutrient present in rice hulls. Grijns thus interpreted Eijkman’s results correctly and provided for the first time a clear concept of a dietary deficiency disease. The broad implications of Grijns’ interpretation of his investigation of beriberi were not appreciated for some years, however.
In 1907 Axel Holst and Theodor Frølich in Norway reported that guinea pigs fed dry diets with no fresh vegetables developed a disease resembling scurvy; supplying them with fresh vegetables cured the disease, providing a second example of a dietary deficiency disease (Carpenter, 1986). Interestingly, Holst and Frølich had been looking for a mammal in which to test a diet that had earlier produced beriberi in pigeons; they were surprised that scurvy resulted instead because, up until that time, scurvy had not been considered to occur in any species other than humans. This was a fortuitous occurrence because the guinea pig allowed assessment of the antiscorbutic value of different foodstuffs, eventually leading to the isolation and identification of vitamin C.
In 1914 Joseph Goldberger was appointed by the Surgeon General of the United States to study pellagra, which was prevalent in the southern United States. At the time, pellagra was thought to be an infectious disease, but Goldberger correctly theorized that the disease was caused by malnutrition (Carpenter, 2003c). He observed that those who treated the sick never developed the disease and noticed that people with restricted diets (mainly corn bread, molasses, and a little pork fat) were more likely to develop pellagra. Goldberger, however, had difficulty convincing others of this theory. Eventually, Goldberger’s group found that dogs developed a condition called “blacktongue” when fed mixtures consisting mostly of cornmeal, with no meat or milk powder, which allowed dogs to be used to “assay” fractions from various foods for anti-blacktongue potency. The dogs responded rapidly to yeast, and yeast was also found to cure pellagra in humans. After Goldberger’s death, Conrad Elvehjem at the University of Wisconsin went on to show in 1937 that nicotinic acid, which had been discovered to be a bacterial growth factor, was extremely potent in curing blacktongue and also prevented and healed pellagra.
The iodization of salt in the 1920s, the fortification of milk with vitamin D in the 1930s (even before vitamin D had been purified and synthesized), and the addition of niacin, thiamin, and iron to cereal flours and products in the 1930s were successful efforts to reduce the incidence of goiter, rickets, and pellagra, respectively (Bishai and Nalubola, 2002). The concept of nutritional deficiency disease was firmly established.
Discovery of the First Small Organic Molecule Essential for Growth
The first evidence of the essentiality of a specific small organic molecule was the discovery by Edith Willcock and Frederick G. Hopkins (1906) that a supplement of the amino acid tryptophan, which had been discovered in 1900, prolonged survival of mice fed a zein-based diet. Zein is the major storage protein in corn endosperm and contains only a small proportion of tryptophan. It was also recognized at this time that enzyme hydrolysates of protein supported adequate growth rates, whereas acid hydrolysates of protein failed to support growth (Carpenter, 2003b). This difference, too, was attributed to a deficiency of tryptophan, which is destroyed during acid hydrolysis (Henriques and Hansen, 1905). The growth rate of rats fed semipurified diets was not satisfactory, however, so further work on amino acid requirements was delayed until this problem was solved.
Disproving Liebig’s Hypothesis
The validity of Liebig’s hypothesis—that the nutritive value of foods and feeds could be predicted from measurements of their gross composition—was directly tested at the University of Wisconsin from 1907 to 1911 in what has become known as the Wisconsin single-grain experiment (Carpenter et al., 1997; Hart et al., 1911). This study was suggested to E. B. Hart by his predecessor at the University of Wisconsin, Stephen M. Babcock, who had observed that milk production by cows consuming rations composed of different feedstuffs differed considerably, even when the rations were formulated to have the same gross composition and energy content. Hart and colleagues compared the performance of four groups of heifers fed rations composed entirely of corn (cornmeal, corn gluten, and corn stover), entirely of wheat (ground wheat, wheat gluten, and wheat straw), entirely of oats (oat meal and oat straw), or of a mixture of equal parts of the three plants, all formulated to be closely similar in gross composition and energy content. Six-month-old heifers were fed the assigned rations to maturity and through two reproductive periods. Differences between the performance of the corn and wheat groups were marked, with the other groups being intermediate. Calves born to cows consuming the corn ration were strong and vigorous and all lived, whereas cows consuming the wheat ration all delivered 3 to 4 weeks prematurely and none of their calves lived beyond 12 days. Cows fed the corn ration produced almost double the amount of milk produced by those fed the wheat ration. Thus Hart and colleagues demonstrated that the nutritive value of a ration could not be predicted solely from measurements of its content of protein, energy, and a few minerals. In hindsight, the signs of inadequacy in the wheat and oat groups resembled those of vitamin A deficiency, which was probably prevented in the corn group by the carotene in that ration.