Chapter 1

Nutrients: History and Definitions


Martha H. Stipanuk, PhD

Nutrients are defined as chemical substances found in foods that are necessary for human life and growth, maintenance, and repair of body tissues. It is now commonly accepted that proteins, fats, carbohydrates, vitamins, minerals, and water are the major nutritional constituents of foods.



Discovery of the Nutrients


Before the chemical nature of food was understood, food was believed to be made up of nutriment, medicines, and poisons. In ancient Greece (~500-300 BC), differences in the physical properties of foods and in their content of medicinal and toxic substances were recognized. The role of diet in the causation and treatment of disease was recognized, as evidenced by the use of liver to treat night blindness. However, the physicians of this era had no understanding of the chemical nature of foods and believed that foods contained only a single nutritional principle that was called aliment. In some ways this ancient understanding is still appropriate in that foods do contain nutrients, substances beneficial to health, and substances that have adverse effects on health, although we now know that each of the three principles in fact includes many different chemical compounds.



Early Observations


The belief that foods contained a single nutritional principle persisted for more than two millennia, up until the nineteenth century, and impeded progress in understanding nutrition. Some recorded observations made during the seventeenth and eighteenth centuries hinted at scientific progress to come. For example, during the 1670s Thomas Sydenham, a British physician, observed that a tonic of iron filings in wine produced a clinical response in patients with chlorosis, a condition now recognized as hypochromic or iron-deficiency anemia (McCollum, 1957). In 1747 James Lind of Scotland, while serving as a naval surgeon, conducted a clinical trial of various proposed treatments on sailors who were ill with scurvy. He observed that consumption of citrus fruits (oranges and lemons), but not other typical foods and medicines, cured the disease (Carpenter, 1986). Nevertheless, chlorosis and scurvy were not viewed as nutritional diseases, and the concept that a disease might be caused by a deficit of a nutritionally essential substance did not exist. Before 1900, toxins, heredity, and infections, but not yet nutrition, were recognized as causes of disease.



Recognition That Food Is a Source of Specific Nutrients


By the early 1800s the elements carbon, nitrogen, hydrogen, and oxygen were recognized as the primary components of food, and the need for the carbon-containing components of food as a substrate for combustion (heat production) was recognized (Carpenter, 2003a). Protein was identified as a specific essential dietary component by the observations of François Magendie in Paris in 1816. Magendie showed that dogs fed only carbohydrate or fat lost considerable body protein and weight within a few weeks, whereas dogs fed foods that contained nitrogen (protein) remained healthy. In 1827 William Prout, a physician and scientist in London, proposed that the nutrition of higher animals could be explained by their need for proteins, carbohydrates, and fats, and this explanation was widely accepted. During the next two decades, the need of animals for several mineral elements was demonstrated, and at least six mineral elements (Ca, P, Na, K, Cl, and Fe) had been established as essential for higher animals by 1850 (Harper, 1999; Carpenter et al., 1997).


The nineteenth-century German chemist Justus von Liebig postulated that energy-yielding substances (carbohydrates and fats) and proteins, together with a few minerals, represented the essentials of a nutritionally adequate diet, and he proposed that the nutritive value of foods and feeds could be predicted from knowledge of their gross chemical composition. Liebig prepared tables of food values based on this concept, an effort facilitated by the work of Wilhelm Henneberg, who devised the Weende system of proximate analysis for analyzing foods and feeds for protein, fat, fiber, nitrogen-free extract (carbohydrate), and ash (McCollum, 1957). Throughout the remainder of the nineteenth century, nutritional thinking continued to be dominated by the belief that sources of energy, protein, and a few minerals were the sole principles of a nutritionally adequate diet.
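As a brief illustration of the Weende system (a sketch in modern terms, not Henneberg's original presentation), crude protein is estimated from total nitrogen, and the carbohydrate fraction, the nitrogen-free extract (NFE), is obtained by difference:

\[
\text{Crude protein (\%)} \approx 6.25 \times \text{N (\%)}
\]
\[
\text{NFE (\%)} = 100 - \left(\text{moisture} + \text{crude protein} + \text{crude fat} + \text{crude fiber} + \text{ash}\right)
\]

All quantities are percentages of the sample weight; the factor 6.25 is the conventional nitrogen-to-protein conversion factor.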





HISTORICAL TIDBIT 1-1   The Connection between Combustion and Respiration


The Experiments of Antoine Lavoisier


[Drawing: Lavoisier and Laplace's ice-calorimeter apparatus, referenced in the text below]


Nearly 300 years before Lavoisier, during the sixteenth century, the artist and scientist Leonardo da Vinci noted the part played by air in combustion. The ancients realized that air was necessary for burning but did not understand the nature of the combustion process. Leonardo arranged deliberate experiments on enclosed combustion and arrived at the correct answer to a problem that continued to worry experimenters for years afterward. In manuscripts deposited as the Codex Leicester, Leonardo noted that "air is consumed by the introduction of the fire." He also noted, in the Codice Atlantico, that "where flame cannot live, no animal that draws breath can live," clearly correlating the phenomenon of combustion with that of animal respiration. Like Leonardo, Robert Fludd and John Mayow came to their own correct interpretations of combustion in the seventeenth century. However, despite the work of these early insightful scientists, the phlogiston theory dominated the view of combustion from the late seventeenth century through much of the eighteenth century. The phlogiston theory posited the existence of a substance called phlogiston in combustible materials; combustion was thought to involve the release of phlogiston into the air. Because substances in a sealed container were observed to burn for only a limited period of time, air was thought to have a limited capacity to accept phlogiston.


This phlogiston theory of combustion was widely accepted until it was refuted by Antoine Lavoisier's experiments showing that respiration was essentially a slow combustion of organic material, using inhaled oxygen and producing carbon dioxide and water (Wilkinson, 2004). Lavoisier and the mathematician Pierre-Simon Laplace performed experiments in 1780 with guinea pigs in which they quantified the oxygen consumed and carbon dioxide produced by metabolism. They also developed an ice-calorimeter apparatus to measure the amount of heat given off during combustion or respiration (see drawing). They measured the quantity of carbon dioxide and heat produced by a live guinea pig confined in this apparatus and then determined the amount of heat produced when sufficient carbon was burned in the ice calorimeter to produce the same amount of carbon dioxide as had been exhaled by the guinea pig. They found the same ratio of heat to carbon dioxide for both processes, leading to the conclusion that animals produced energy by a type of combustion reaction (this inference is summarized in the equation below). Lavoisier further showed that combustion involved the reaction of the combustible substance with oxygen and that heat and light were released as weightless by-products. Lavoisier and his colleagues viewed respiration as a very slow combustion that took place inside the lungs.


About 50 years later, in 1837, the German physiologist Heinrich Gustav Magnus performed his famous experiments showing that both carbon dioxide and oxygen existed in arterial and venous blood, with oxygen higher and carbon dioxide lower in arterial blood than in venous blood. Magnus correctly concluded that combustion (oxygen uptake; carbon dioxide, water, and heat production) must occur throughout the body, not just in the lungs, and other scientists subsequently showed that oxidation occurs in the tissues, not in the blood plasma.
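Lavoisier and Laplace's key inference can be written compactly in modern notation (an illustrative formalization in present-day symbols, not their own): the heat released per unit of carbon dioxide formed was the same whether the carbon dioxide came from the respiring guinea pig or from burning carbon,

\[
\frac{Q_{\text{guinea pig}}}{m_{\mathrm{CO_2},\,\text{guinea pig}}} \;\approx\; \frac{Q_{\text{burning carbon}}}{m_{\mathrm{CO_2},\,\text{carbon}}}
\]

where \(Q\) is the heat measured with the ice calorimeter (from the mass of ice melted) and \(m_{\mathrm{CO_2}}\) is the mass of carbon dioxide produced. The equality of the two ratios implied that the animal's heat arose from the same oxidation of carbon that occurs in ordinary combustion.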


With Armand Séguin, Lavoisier pushed his studies further to investigate the influence of work, food, and environmental temperature on metabolism. By measuring the amount of carbon dioxide exhaled, they showed that respiration (oxygen consumption or carbon dioxide production) increased by about 10% in a cold environment, by 50% due solely to food intake, and by 200% with exercise. They also showed a direct correlation between the heart rate (pulse) and the amount of work performed (the sum of weights lifted to a predetermined height), and between the heart rate and the quantity of oxygen consumed. These studies, along with some knowledge of the chemical composition of plant and animal foods, allowed Lavoisier to draw the fundamental conclusion that the oxidation of carbon compounds was the source of energy for activity and other bodily functions of animals.


Although Lavoisier's experiments were cut short by the French Revolution and his execution during the Reign of Terror (because of his service as a tax collector), his seminal contributions to modern chemistry, metabolism, nutrition, and exercise physiology were enormous. He is often called the "Father of Modern Chemistry."


Despite the dominance of Liebig's views during the mid to late nineteenth century, the validity of his assumptions was challenged during that period (McCollum, 1957). In 1843 Jonathan Pereira in England stated that diets containing a wide variety of foods were essential for human well-being, whereas diets containing only a few foods were associated with the acquisition of diseases such as scurvy. Jean-Baptiste Dumas also questioned the validity of Liebig's assumptions, based on his observation that an artificial milk formula containing all of the known dietary constituents failed to prevent the deterioration of children's health during the siege of Paris (1870-1871).


In addition, Nikolai Lunin (~1881), working in Gustav von Bunge's laboratory in Dorpat, Estonia, conducted studies with mice in an effort to identify inadequacies in the mineral component of purified diets. He demonstrated that mice fed a diet composed of purified proteins, fats, carbohydrates, and a mineral mixture survived less than 5 weeks, whereas mice that received milk or egg yolk in addition to the purified components remained healthy throughout the experiment. Lunin concluded that milk must contain small quantities of other unknown substances essential to life, but von Bunge apparently did not encourage him, or subsequent students in his laboratory who made similar observations, to investigate what the active factor in milk might be. The Liebig–von Bunge view that nutritional requirements consisted only of protein, energy-yielding substances, and a few minerals still had such a hold on scientific thought that, rather than considering that these observations might point to the presence of other essential nutrients in foods, the inadequacies of the purified diets were attributed to mineral imbalances, failure to supply minerals as organic complexes, or lack of palatability (Wolf and Carpenter, 1997).


Also of significance were the studies of beriberi conducted during the nineteenth century. Kanehiro Takaki was concerned during the 1880s with the devastating effects of beriberi on sailors in the Japanese navy. Because of the superior health of British sailors, he compared the food supplies of the two navies and was struck by the higher protein content of the British rations. He therefore included evaporated milk and meat in the diet and substituted barley for part of the polished rice in the Japanese rations. These changes eradicated beriberi, and Takaki attributed the improvement to the additional protein. In retrospect, we know that this was incorrect (i.e., beriberi is caused by thiamin deficiency), but his conclusion does imply that he correctly considered beriberi to be a disease caused by a nutritional inadequacy (Takaki, 1906). Christiaan Eijkman, an army physician in the Dutch East Indies, began his investigations of beriberi in the 1890s (Jansen, 1956). He had observed a high incidence of beriberi in the prisons in Java in which polished rice was a staple, and he assumed it was caused by chronic consumption of a diet consisting largely of polished rice. He noted during his experimental studies that chickens fed a military hospital diet composed mainly of polished rice developed a neurological disease resembling beriberi, whereas birds fed rice with the pericarp intact remained healthy. He concluded that ingestion of the high-starch diet resulted in the formation in the intestine of a substance that acted as a nerve poison and that rice hulls contained an antidote. Eijkman's conclusion illustrates the fact that a connection between nutrient deficiency and disease was still a foreign concept at the end of the nineteenth century.



Recognition of the Connection of Diet and Disease


Resistance to the notion of nutritional deficiency diseases continued into the early twentieth century. As reports of diet-associated diseases accumulated, however, the concept that a disease might be caused by a deficit of an essential nutrient slowly gained acceptance (Carpenter, 2003b).


In 1901 Gerrit Grijns, who took over Eijkman’s research in the Dutch East Indies in 1896, showed through feeding trials that Eijkman’s active substance was present in other foods (Jansen, 1956; Carpenter, 1995). After demonstrating that beriberi could be prevented by including rice polishings, beans, or water extracts of these foods in the diet, he proposed that beriberi was a dietary deficiency disease caused by the absence of an essential nutrient present in rice hulls. Grijns thus interpreted Eijkman’s results correctly and provided for the first time a clear concept of a dietary deficiency disease. The broad implications of Grijns’ interpretation of his investigation of beriberi were not appreciated for some years, however.


In 1907 Axel Holst and Theodor Frölich in Norway reported that guinea pigs fed dry diets with no fresh vegetables developed a disease resembling scurvy, and that supplying them with fresh vegetables cured the disease, providing a second example of a dietary deficiency disease (Carpenter, 1986). Interestingly, Holst and Frölich had been looking for a mammal in which to test a diet that had earlier produced beriberi in pigeons; they were surprised that scurvy resulted instead because, up until that time, scurvy had not been considered to occur in any species other than humans. This was a fortuitous occurrence because the guinea pig allowed assessment of the antiscorbutic value of different foodstuffs, leading to the isolation and identification of vitamin C.


In 1914 Joseph Goldberger was appointed by the Surgeon General of the United States to study the disease pellagra, which was prevalent in the southern United States. At the time, pellagra was thought to be an infectious disease, but Goldberger correctly theorized that the disease was caused by malnutrition (Carpenter, 2003c). He observed that those who treated the sick never developed the disease and noticed that people with restricted diets (mainly corn bread, molasses, and a little pork fat) were more likely to develop pellagra. Goldberger, however, had difficulty convincing others of this theory. Eventually, Goldberger's group found that dogs developed a condition called "blacktongue" when fed mixtures consisting mostly of cornmeal with no meat or milk powder, allowing dogs to be used to "assay" fractions from various foods for anti-blacktongue potency. The dogs responded rapidly to yeast, and yeast was also found to cure pellagra in humans. After Goldberger's death, Conrad Elvehjem at the University of Wisconsin went on to show in 1937 that nicotinic acid, which had been discovered to be a bacterial growth factor, was extremely potent in curing blacktongue and also prevented and healed pellagra.


The iodization of salt in the 1920s, the fortification of milk with vitamin D in the 1930s (even before vitamin D had been purified and synthesized), and the addition of niacin, thiamin, and iron to cereal flours and products in the 1930s were successful efforts to reduce the incidence of goiter, rickets, and pellagra, respectively (Bishai and Nalubola, 2002). The concept of nutritional deficiency disease was firmly established.



Discovery of the First Small Organic Molecule Essential for Growth


The first evidence of the essentiality of a specific small organic molecule was the discovery by Edith Willcock and Frederick G. Hopkins (1906) that a supplement of the amino acid tryptophan, which had been discovered in 1900, prolonged the survival of mice fed a zein-based diet. Zein, the major storage protein in corn endosperm, contains only a small proportion of tryptophan. It was also recognized at this time that enzymatic hydrolysates of protein supported adequate growth rates, whereas acid hydrolysates of protein failed to support growth (Carpenter, 2003b). This difference, too, was attributed to a deficiency of tryptophan, which is destroyed by acid hydrolysis (Henriques and Hansen, 1905). However, the growth rate of rats fed semipurified diets was not satisfactory, so further work on amino acid requirements was delayed until this problem was solved.



Disproving Liebig’s Hypothesis


The validity of Liebig's hypothesis, that the nutritive value of foods and feeds could be predicted from measurements of their gross composition, was directly tested at the University of Wisconsin from 1907 to 1911 in what has become known as the Wisconsin single-grain experiment (Carpenter et al., 1997; Hart et al., 1911). This study was suggested to E. B. Hart by his predecessor at the University of Wisconsin, Stephen M. Babcock, who had observed that milk production by cows consuming rations composed of different feedstuffs differed considerably, even when the rations were formulated to have the same gross composition and energy content. Hart and colleagues compared the performance of four groups of heifers fed rations composed entirely of corn (cornmeal, corn gluten, and corn stover), entirely of wheat (ground wheat, wheat gluten, and wheat straw), entirely of oats (oat meal and oat straw), or of a mixture of equal parts of the three plants; all rations were formulated to be closely similar in gross composition and energy content. Six-month-old heifers were fed the assigned rations to maturity and through two reproductive periods. Differences between the performance of the corn and wheat groups were marked, with the other groups being intermediate. Calves born to cows consuming the corn ration were strong and vigorous and all lived, whereas cows consuming the wheat ration all delivered 3 to 4 weeks prematurely and none of their calves lived beyond 12 days. Cows fed the corn ration produced almost double the amount of milk produced by those fed the wheat ration. Thus Hart and colleagues demonstrated that the nutritive value of a ration could not be predicted solely from measurements of its content of protein, energy, and a few minerals. In hindsight, the signs of inadequacy in the wheat and oat groups resembled those of vitamin A deficiency, which was probably prevented by the carotene in the ration that contained corn.
