

Malnutrition and behavioral development: the nutrition variable1


Diagnosing malnutrition and identifying populations at risk
Associations and interactions among dietary components
Acknowledgment
Literature cited


BEAT SCHÜRCH2

1Prepared for the International Dietary Energy Consultative Group (IDECG) Task Force workshop on malnutrition and behavior at the University of California, Davis, CA, December 6-10, 1993. This workshop was supported by IDECG, the Nestle Foundation, Kraft Foods and the International Union of Nutritional Sciences. Guest editor for this supplement publication was Ernesto Pollitt, Department of Pediatrics, University of California, Davis, CA 95616.

2To whom correspondence should be addressed: Executive Secretary, International Dietary Energy Consultative Group, c/o Nestle Foundation, P. O. Box 851, 1001 Lausanne, Switzerland.

International Dietary Energy Consultative Group, c/o Nestle Foundation, 1001 Lausanne, Switzerland

ABSTRACT During the last 50 years, the perception of nutrition variables that affect behavioral development has shifted, as have the scientific hypotheses that were addressed, the design of the studies that were conducted, the nature and composition of the dietary supplements that were given and compared and the interpretation of outcomes. Methods for diagnosing malnutrition and identifying the populations at risk of becoming malnourished are reviewed in relation to nutrition interventions. Even in dietary supplementation studies it can be difficult to isolate specific nutrient effects because of associations and interactions among dietary components. These and other problems associated with the study of possible effects of dietary energy, protein and micronutrients, and of breast vs. formula feeding on human development are examined. Where dietary intake data, biochemical indicators or clinical symptoms strongly suggest the presence of a single nutrient deficiency, the appropriate action may continue to be food supplementation or fortification, until the corresponding nutrient deficiency has been rectified in the habitual diet; where the nature of the deficiency is less clear, supplementation trials and programs aimed at improving dietary diversity and quality in general are more likely to show effects on indicators of behavioral development. J. Nutr. 125: 2255S-2262S, 1995.

INDEXING KEY WORDS:

• malnutrition • behavioral development • protein deficiency

During the last 50 years, the focus of scientists and policy makers concerned with malnutrition in infants and children in the developing world has shifted repeatedly. In 1949, when the Food and Agriculture Organization (FAO) of the United Nations and the World Health Organization (WHO) convened nutrition experts to a meeting to assess the importance of malnutrition worldwide, the main concern was vitamin deficiencies; in spite of Williams' pioneering work and description of kwashiorkor in the 1930s (Williams 1933, Williams 1935), protein deficiency did not yet figure on the agenda (Brock 1974). This changed shortly thereafter when Brock and Autret (1952) in their report on "Kwashiorkor in Africa" concluded that kwashiorkor was "the most serious and widespread nutritional disorder known to medical and nutritional science." The predominant idea during the following two decades was that a lack of protein was the most important nutritional problem in the Third World. This position came under attack in the early 1970s, when nutritionists, like McLaren (1974), who worked outside of Africa and saw many more children suffering from marasmus than from kwashiorkor, began to argue that the main cause of malnutrition in the world was a lack of dietary energy, not protein.

According to this view, giving undernourished children protein without sufficient energy would be futile because the dietary protein would be metabolized to supply energy rather than essential amino acids. More recently, micronutrients (particularly iron, vitamin A and iodine) have received most of the attention (ACC/SCN 1993). In each period, the prevalent view determined to a large extent the scientific questions that were asked, the design of the studies that were conducted, the nature and composition of the dietary supplements that were given and compared, and the interpretation of outcomes.

In many of the earlier studies, the development of malnourished children was compared with that of children matched on as many apparently relevant characteristics as possible but with no history of malnutrition. Results of such studies always remain open to other interpretations: the very fact that one group ended up malnourished whereas the other did not is itself more convincing evidence that the two groups were growing up under different conditions than any array of matching variables can provide to the contrary. Prospective, randomized, double-blind supplementation trials are needed to argue convincingly that inadequate amounts of energy, of a single nutrient, or of a combination of nutrients are affecting behavioral development. In this paper it is, therefore, primarily this type of study that will serve as a basis for discussion and illustration.

Until about 10 years ago, it was frequently argued that most children who show delays or deficiencies in various aspects of their development grow up under conditions entailing many disadvantages and that it is therefore exceedingly difficult, if not impossible, to isolate the effects of nutritional factors. At present, one can claim that there is convincing evidence from several nutritional supplementation trials showing that inadequate dietary intake, and even a lack of specific nutrients, can affect the behavioral development of children. Some of these studies and their comparative analysis also showed that the probable effects of early nutritional deficiencies depend on the environmental conditions under which they occur: additional risk factors, on the one hand, and compensatory factors, on the other, can modify the effect of a nutritional deficiency on behavioral development (see the companion paper by Wachs in this issue).

The risk of malnutrition is particularly high during the first 2 y of life. At that early age, the child is completely dependent on care providers. The high rate of growth entails a relatively high energy and nutrient demand, and the small child, therefore, depends on a frequent supply of meals with an energy and nutrient density that may be higher than that of the food consumed by the rest of the family. The high prevalence of malnutrition in this age group, however, does not necessarily mean that older children are less sensitive to damaging effects of malnutrition.

Diagnosing malnutrition and identifying populations at risk


To diagnose malnutrition or to identify individuals at risk of becoming malnourished, one can assess dietary intakes, look for clinical symptoms and signs or rely on anthropometric, biochemical or functional indicators of nutritional status. The intent here is not to go into methodological details of dietary intake and nutritional status assessment, which can be found elsewhere (e.g., Cameron and van Staveren 1988, Fidanza 1991, Gibson 1990), but to briefly review the advantages and limitations of the different approaches in relation to nutrition interventions (see also Sahn et al. 1984).

If the aim is to test the hypothesis that protein-energy or general undernutrition during early phases of life has undesirable effects on behavioral development, as was the case in several large-scale supplementation studies in the 1970s (Chavez and Martinez 1982, Joos et al. 1983, McKay et al. 1978, Pollitt et al. 1993, Rush et al. 1980, Waber et al. 1981), one will attempt to intervene in a target population in which the incidence of undernutrition is high. Various combinations of low socioeconomic status, low anthropometric indices in children and low energy and protein intakes in pregnant mothers were used to identify target groups in these studies.

In the case of nutrient deficiencies, the choice of approach depends to a large extent on the type of nutrient in which one is primarily interested. Golden (1988, 1991) proposed to make a distinction between two types of nutrient deficiencies. In a deficiency of a Type I nutrient, growth remains normal initially, but the concentration of the nutrient and its metabolites is reduced in tissues, and specific clinical signs can be observed. Iodine, iron, copper, calcium, thiamin, riboflavin, ascorbic acid and retinol are examples of nutrients of this type. A deficiency of a Type II nutrient manifests itself initially as a general growth failure, but concentrations of the nutrient and its metabolites remain normal in tissues, and no specific signs can be observed. Nitrogen, essential amino acids, zinc, potassium, sodium, phosphorus and energy are examples belonging to this category. Both types of nutrients can be further subdivided into those with and without identified stores in specific body tissues. Type I nutrient deficiencies are amenable to diagnosis by clinical symptoms and signs and biochemical tests. The diagnosis of Type II nutrient deficiencies is much more difficult and relies primarily on a combination of dietary intake assessments and anthropometric indicators.

Assessment of habitual dietary intakes. The accuracy of methods for assessing dietary intakes is approximately proportional to their complexity. The minimal accuracy that is required will depend on the nutrient under consideration and on the dependent variable and effect size one is interested in.

Information on habitual intakes appears to be of interest primarily for energy, macronutrients, and Golden's Type II nutrients; it is less important for nutrients with measurable stores and metabolites (like iodine and iron). Because we know little about the dose:response relationship between energy, macro- and micronutrient intakes on the one hand and various aspects of behavioral development on the other, it is difficult to specify the accuracy of the intake measure that is required; it appears, however, that it is a degree of accuracy that can only be obtained by weighing the food intake of individuals.

It is possible, but difficult, time-consuming and rather intrusive to assess dietary intakes by this method. It requires highly qualified and very motivated personnel to collect this kind of information, and food intakes have to be assessed for several days to justify the claim that one has measured habitual intakes. Such data can, therefore, only be collected from small numbers of individuals. In infants, this is further complicated by the fact that, while they are still being breast-fed, it is very difficult to assess their intake of breast milk. Additional potential sources of error are the translation of food intake into nutrient intakes and the estimates one has to make for proportions of nutrients that are not only ingested, but actually absorbed.

In Bogotá, Colombia (Waber et al. 1981), Guatemala (Klein et al. 1976) and Taiwan (Joos et al. 1983), pregnant mothers received food supplements after their habitual energy intake had been assessed. Even considering the low weight and height of these women, estimates of average habitual energy intakes were very low: 1600, 1400 and 1200 kcal/d, respectively. Recent studies of total energy expenditure with doubly-labeled water suggest that habitual energy intakes like these may often have been underestimated (Schoeller and Fjeld 1991).

Intakes usually end up being compared with requirements, which in their turn have often been established on an insecure basis and using criteria that are not related to behavioral development. Protein requirements of children, for instance, are based on amounts required for satisfactory weight gains, provided the energy supply is adequate. We do not know if the requirements are the same for normal longitudinal growth and behavioral development.

Requirements may be increased by frequent episodes of infectious disease and parasitic infestation and modified by nutrient interactions. The latter are relatively well known in the case of iron (Joint FAO/WHO Consultation 1988, Mejia and Arroyave 1982, Monsen et al. 1987). Most diets contain the physiologic iron requirement of 1.0-1.5 mg/d, but the absorption of nonheme iron, which is the predominant dietary form, depends so much on other enhancing (ascorbic and other organic acids, vitamin A, etc.) and depressing (phytates, tannins, oxalates, etc.) constituents of the diet that it is mainly the influence of these interactions that determines whether a diet is adequate to meet requirements. Similarly, zinc deficiency was first described in populations whose diets contained amounts of zinc that would be considered adequate in Western diets but were not, as in Iranian adolescents consuming large amounts of zinc antagonists (phytate and possibly fiber) (Ronaghy et al. 1969).
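The arithmetic behind this point can be sketched in a few lines. The total intake figure and the absorption fractions below are illustrative assumptions (loosely patterned on low-, intermediate- and high-bioavailability diets), not values from the studies cited:

```python
# Illustrative sketch: whether a diet meets the physiologic iron
# requirement of ~1.0-1.5 mg/d absorbed depends far more on
# bioavailability than on total iron content.

def absorbed_iron(total_intake_mg: float, absorption_fraction: float) -> float:
    """Absorbed iron (mg/d) = total dietary iron x fraction absorbed."""
    return total_intake_mg * absorption_fraction

REQUIREMENT_MG = 1.0  # lower bound of the physiologic requirement (mg/d)

# Hypothetical diet supplying 12 mg iron/d, mostly nonheme iron.
# The fractions are assumed, roughly spanning phytate/tannin-rich
# versus ascorbate-rich meal patterns.
for label, fraction in [("low (5%)", 0.05),
                        ("intermediate (10%)", 0.10),
                        ("high (15%)", 0.15)]:
    mg = absorbed_iron(12.0, fraction)
    verdict = "adequate" if mg >= REQUIREMENT_MG else "inadequate"
    print(f"{label}: {mg:.2f} mg/d absorbed -> {verdict}")
```

The same total intake comes out adequate or inadequate depending solely on the assumed absorption fraction, which is the sense in which dietary interactions, rather than iron content itself, determine adequacy.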

Assessment of supplement intakes. In supplementation studies in which subjects receive additional food, it is important to assess average intakes both before and during the intervention. Only then is it possible to partition the supplement into the portion that replaces habitual intake and the portion that constitutes a true addition to it. Most studies do not meet these requirements. Coming closest are perhaps a recent study in Indonesia (Husaini et al. 1991), in which dietary intakes before and during the intervention were estimated from a combination of two 24-h dietary recalls and weighing of available foods, the Guatemala-Orient study (Pollitt et al. 1993) and a study in Taiwan (Joos et al. 1983), in which habitual dietary intakes were estimated by recall and supplement intakes were recorded.

Assessment of nutritional status: clinical symptoms and signs. Clinical symptoms and signs, usually in combination with anthropometric data, were used primarily to diagnose severe protein-energy malnutrition as kwashiorkor, marasmus or a combination of the two. There is still a considerable amount of uncertainty about the pathogenesis of these clinical syndromes, but most of the affected children suffer from multiple deficiencies. A long time ago, Spanish-speaking scientists used the term distrofia pluricarencial, and more recently, people again argued that energy-nutrient deficiency would be a better term than protein-energy malnutrition.

Clinical symptoms and signs are usually observed and recorded in hospital when the child is acutely ill and shows a number of behavioral abnormalities, but many of these behavioral changes tend to disappear quite rapidly with recovery. There are only a few studies (e.g., Galler et al. 1987a, b) relating clinical signs or syndromes to developmental delays and abnormalities, and their results are difficult to interpret (see Grantham-McGregor's paper in this issue).

Anthropometric indicators. The most frequently used anthropometric indicators are weight-for-age, height-for-age and weight-for-height. They can be collected relatively easily and quickly in relatively large numbers of children. They are so frequently used as indicators of nutritional status that low weight-for-height (wasting) and low height-for-age (stunting) are often equated quite uncritically with acute and chronic undernutrition, respectively. Even though some children appear to be genetically thin, using wasting as an indicator of undernutrition has a high degree of face validity. A child can become wasted rapidly during an episode of acute diarrhea or because of anorexia due to infectious disease and, when refed appropriately, can also regain its original weight rather rapidly.

In the case of stunting, the situation is more complex and ambiguous. In support of the notion that stunting reflects chronic undernutrition, one can argue that increased growth velocities can be observed in stunted children receiving appropriate food supplements. There are, however, other observations that fail to support this hypothesis. In Chile and other Latin American countries stunting is very prevalent, yet from 1 mo of age the average stunted child has a positive weight-for-length Z score when compared with reference values of the US National Center for Health Statistics (NCHS), and it remains overweight for its length throughout the whole period during which its length-for-age declines. There is also evidence from Mexico (Garcia et al. 1990) and Peru (Brown 1991) that children do not eat all the food that is offered to them in the period during which they become stunted. Such children may be chronically malnourished, in the sense of a nutrient imbalance in their habitual diet, but it seems odd to call them chronically undernourished, and feeding them more of their usual diet is not likely to make them grow and develop faster or behave differently.
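The cutoff convention behind the terms wasting and stunting (falling below -2 Z of the reference median) can be sketched as follows; the reference median and SD values used here are invented placeholders for illustration, not actual NCHS data:

```python
# Minimal Z-score sketch. The reference median/SD pairs below are
# invented placeholders, not actual NCHS/WHO reference values.

def z_score(measured: float, ref_median: float, ref_sd: float) -> float:
    """Standard deviation score relative to a reference population."""
    return (measured - ref_median) / ref_sd

def classify(haz: float, whz: float, cutoff: float = -2.0) -> str:
    """Below -2 Z is the conventional cutoff for stunting
    (height-for-age) and wasting (weight-for-height)."""
    flags = []
    if haz < cutoff:
        flags.append("stunted")
    if whz < cutoff:
        flags.append("wasted")
    return ", ".join(flags) or "within reference range"

# Hypothetical 24-mo-old child, 79 cm and 11.5 kg, against invented references:
haz = z_score(79.0, ref_median=86.0, ref_sd=3.0)   # height-for-age
whz = z_score(11.5, ref_median=10.5, ref_sd=0.9)   # weight-for-height at 79 cm
print(f"HAZ = {haz:.1f}, WHZ = {whz:.1f}: {classify(haz, whz)}")
```

With these invented numbers the hypothetical child is classified as stunted while carrying a positive weight-for-length Z score, the pattern described above for the average stunted Chilean child.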

Quite a lot is known about undesirable behavioral and other correlates, but relatively little is known about the causes and mechanisms of linear growth retardation and what it means or indicates in nutritional terms. Stunting (in the sense of falling behind in length-for-age) occurs mostly during the first 24 mo; after that, most stunted children do not show the expected tendency to catch up but follow more or less a low percentile line for height-for-age. Stunting results from a delay in the growth of long bones, which occurs in three phases. During the infancy phase, which begins in utero, reaches its apex around birth and then declines till 3-4 y of age, the growth of long bones is believed to depend on several factors, among them insulin and insulin-like growth factors (Gluckman 1989, Milner and Hill 1987). Growth hormone (GH) is in the circulation in relatively large amounts (Hill et al. 1988), but linear growth of the fetus is almost independent of GH (Gluckman 1989, Milner and Hill 1987), perhaps because the GH-specific receptors in the growth plate are still immature (Barnard et al. 1988). Around 6 mo after birth, in a normally growing child, the GH receptors begin to function, and GH becomes the most important determinant of bone growth: this is the beginning of the childhood growth phase. Evidence is accumulating that what characterizes and determines stunting is a delay in the onset of the childhood growth phase (Karlberg et al. 1994). However, we do not yet know what determines the time of onset of the childhood growth phase. It also remains unsatisfactory to have sets of anthropometric data indicating that some groups of children are smaller and lighter than reference children, as long as we are uncertain about how important these differences per se are in functional terms. We still need more data of a more varied nature to gain a clearer idea of the relationship of impairment of function to anthropometric indexes of malnutrition and/or dietary intake measures (see below).

Biochemical indicators. Biochemical indicators are useful for the assessment of most of Golden's Type I nutrients. They allow us to diagnose and monitor deficiencies already at a subclinical stage. One of the main problems is the definition of cutoffs, which should be based on an extensive knowledge of the functional consequences of various degrees of deficiency, and this type of information is still mostly unavailable.

Functional indicators. Dietary intake data and indicators of nutritional status are not very meaningful per se; they get most of their meaning from known relationships with functional criteria and tests, such as physical fitness and activity, other aspects of behavior, immune defense, morbidity and mortality. Unfortunately, information on such relationships is relatively sparse. The argument becomes circular when supplementation trials are used to establish the fact that a nutritional deficiency existed in the first place. It is also often quite difficult to infer the degree of handicap in real-life situations from deficits in tests of certain physiological (e.g., delays in evoked potentials) or behavioral (e.g., test scores) functions, even when those deficits are statistically significant.

Associations and interactions among dietary components


Protein, energy and general undernutrition. In the 1960s the research community and policy makers became increasingly concerned with the idea that early malnutrition could result in permanent impairment of children's intellectual development. This was the period when the predominant notion was that protein was the most limiting component of the diet of poor populations nutritionally at risk. A series of major supplementation studies was undertaken subsequently in the 1970s to elucidate effects of protein- (and to a lesser extent energy-) malnutrition, and the results they produced continue to constitute the majority of the evidence on which we base our current views concerning the effects of nutritional deficits on behavioral development.

The mothers and/or infants participating in these studies were assigned either to an experimental condition in which they received a dietary supplement rich in protein (and to a lesser extent in energy), or to a control condition in which they either received no supplement at all or a supplement containing little or no protein and energy. In some instances, multivitamin and mineral supplements were also provided, on the assumption (or to test the hypothesis) that they had no effect.

Studies that were undertaken to elucidate effects of protein-energy malnutrition (PEM) on behavioral development (and their prevention or reversal) allow us to conclude that providing dietary supplements to nutritionally at-risk pregnant mothers, infants and young children can have a positive effect on indicators of behavioral development (Chavez et al. 1982, Husaini et al. 1991, Joos et al. 1983, McKay et al. 1978, Pollitt et al. 1993, Rush et al. 1980, Waber et al. 1981); it unfortunately remains unclear what element or what combination of elements in the supplements had the observed effect. Whenever food intake is inadequate, the intake of some macro- and micronutrients is also low, and whenever food is provided as a supplement, the intake of some macro- or micronutrients is likely to be increased. In situations where food is so scarce that energy intake is inadequate, dietary quality is also likely to be poor. Poor quality diets consist primarily of staples such as cereals, legumes or other plants; they typically contain few animal products, fruits and vegetables and are therefore associated with low intakes of several vitamins and minerals, high intakes of phytates and fiber, and, hence, poor bioavailability. Analogously, it can be argued that low-protein intakes are frequently accompanied by inadequate amounts of important micronutrients, including iron, zinc, calcium and vitamin A, that are contained in dietary protein sources. This was recently corroborated by dietary intake data from observational studies in Kenya, Mexico and Egypt (Allen 1993). The same data provide examples of associations between both indicators of diet quantity and quality on one hand and indicators of cognitive performance and related measures on the other.

In underprivileged population groups and individuals, poor dietary quality is probably much more common than actual food shortage. Under the same circumstances, the incidence of morbidity is usually high, and this is likely to cause depletion of several nutrients simultaneously through anorexia, malabsorption and/or diarrhea. Population groups and individuals at risk of malnutrition are also likely to be at a disadvantage from a socioeconomic point of view and have less access to medical and social services. Data from earlier supplementation studies, therefore, have to be reexamined to see if, and to what extent, dietary energy and micronutrients that were intentionally or unintentionally part of the protein-rich supplements could have been responsible for observed effects. Pollitt (1994), for instance, examined the extent to which iron deficiency and supplementation could be a confounder in the postulated effects of PEM in these earlier studies.

Micronutrient deficiencies. In recent years, research and policy interest shifted from the study of the effects of protein and energy to those of iron, iodine and vitamin A (ACC/SCN 1993). Previously, deficiency of iron caused concern mainly as the cause of iron deficiency anemia, iodine as a cause of goiter and cretinism and vitamin A as a cause of xerophthalmia. Now it is well recognized that these micronutrients have broader systemic effects, resulting in multiple impediments to a child's optimal health and development.

In part probably because iron deficiency is the most prevalent nutritional problem worldwide, but also because iron is one of Golden's Type I nutrients with identified stores, and therefore comparatively easy to investigate, and because appropriate treatment can reverse the effects of deficiency within a few weeks, we know more about the effects of iron deficiency on behavioral development than about those of any other single-nutrient deficiency.

Infants and toddlers suffering from iron deficiency anemia show delays in their motor and mental development (Lozoff 1990, Walter 1993). Hemoglobin concentration and duration of anemia are correlated with the permanency of these delays, and no deficits were found in iron-deficient but nonanemic infants and toddlers (Lozoff 1990, Walter 1993). Supplements of iron-fortified foods can prevent the anemia and its behavioral correlates (Walter et al. 1993). In toddlers suffering from iron deficiency anemia, short-term iron supplementation corrects the hematological signs of anemia. Supplementation for a few months has also been shown to reverse developmental delays in a study in Indonesia (Idjradinata and Pollitt 1993), whereas follow-up studies in Chile and Costa Rica suggest that moderately severe iron deficiency anemia in early childhood can entail a long-term developmental disadvantage (Lozoff et al. 1991, Walter 1993).

Older children suffering from iron deficiency anemia score lower in cognitive tests (Soewondo et al. 1989), and their school performance is poorer (Pollitt et al. 1989). Iron supplementation generally leads to improvements in cognitive performance (Seshadri and Gopaldas 1989). The availability of all this information on behavioral consequences of iron deficiency anemia should, however, not mislead us to believe that all problems are resolved, even in this specific area. A paper by Pollitt and Metallinos-Katsaras (1990) provides an up-to-date summary of achievements and unresolved problems.

Iodine and thyroxin deficiency in early pregnancy impair the development of the fetal central nervous system and can result in cretinism of the child. Trials in Papua New Guinea and Ecuador during the late 1960s prevented cretinism by giving iodine supplements to the mother before conception and in early pregnancy. Once established, however, frank cretinism and its behavioral consequences cannot be fully reversed.

Lesser degrees of iodine deficiency, later in fetal life and after birth, are likely to be associated with psychomotor and cognitive deficits (Ma et al. 1989, Muzzo et al. 1987). The extent to which they can be reversed by iodine supplementation remains an open question; two double-blind intervention studies in primary school children (Bautista et al. 1982, Shrestha et al. 1995) yielded contradictory results.

For conceptual and historical reasons, there is a great temptation to pursue this effort of concentrating on the study of single-nutrient deficiencies, their pathophysiological and behavioral effects and possibilities of their remediation. This is the approach that nutritionists and physicians traditionally used to study and to treat vitamin deficiencies. Attempts to identify additional single-nutrient deficiencies and their syndromes, however, met with more problems than expected, especially with Type II nutrients, in which determining requirements, diagnosing mild-to-moderate deficiencies and assessing the degree of success of treatment remain very difficult. Zinc, one of Golden's Type II nutrients without any identified stores in the human body, may serve as an illustration.

Zinc intake is not predictive of zinc status because its absorption depends so much on the amount of other nutrients in the diet. Circulating and tissue levels of zinc also do not reflect zinc balance or status. This makes the diagnosis of zinc deficiency extremely difficult.

Zinc supplementation can improve the immune response of zinc-deficient children (Castillo-Duran et al. 1987), and it has been speculated that the growth and behavioral development of the supplemented children could benefit from their being ill less frequently. This indirect argumentation may be misleading. Schlesinger et al. (1992), for instance, found that zinc-fortified formula improved physical growth and the immune response in Chilean infants (average age 7 mo) but had no effect on the number or duration of infectious episodes, suggesting that the beneficial effect on growth cannot be explained by lower morbidity.

Stunting has been shown to be associated with delays in behavioral development, and one could argue similarly that, since zinc supplementation has accelerated growth in some zinc-deficient infants and young children (Chen et al. 1985, Schlesinger et al. 1992), it might have an effect on their behavioral development too. This argument is as indirect and remains as tenuous as the previous one, unless it can be shown that stunting is accompanied by delayed motor development and that this, in turn, affects other aspects of behavioral development.

In laboratory animals (rats), severe zinc deficiency disrupts brain growth and morphology and leads to long-term behavioral changes that are qualitatively similar in many respects to those produced by general malnutrition (Golub et al. 1995). It is important to note, however, that severe zinc deficiency produces anorexia and that it is difficult to separate effects of low zinc intake from an overall decrease in nutrient intake. In stunted school-age children, there were no differences in scores of standardized tests reflecting attention between groups differing in zinc status or in response to zinc supplements (Gibson et al. 1989). It seems therefore that, at this point in time, there is not enough evidence to assert that zinc deficiency affects the behavioral development of infants and small children.

Recent randomized trials comparing formulas and human milk. A large prospective trial is currently under way in the United Kingdom, in which preterm babies with a birth weight of < 1850 g were randomly assigned at birth to receive, for about 30 d, a preterm milk formula or unfortified donor breast milk from a milk bank, either as the sole food or as a supplement to milk expressed from their mothers' breasts. When survivors were given Knobloch et al.'s developmental screening inventory (consisting of selected items from the Gesell developmental schedules) at 9 mo, babies on preterm formula outperformed babies on banked breast milk (Lucas et al. 1989).

At 18 mo of age, children who had received a nutrient-enriched preterm formula had higher Bayley mental and psychomotor development indexes and Vineland social maturity quotients than children who had received a standard formula. Differences were greatest in former small for gestational age (SGA) infants and in motor development. This suggests that the preterm formula, which contained 2.0 g protein and 80 kcal/dL and was also enriched in Na, Ca, P, Cu, Zn, vitamins D, E and K, water-soluble vitamins, carnitine and taurine, was more effective than a standard formula containing 1.45 g protein and 68 kcal/dL in promoting growth and behavioral development in these premature babies. What is remarkable is that the infants were on this randomly assigned diet only for an average of 4 wk (Lucas et al. 1990).
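Taking the cited densities at face value, the relative enrichment of the preterm formula is simple arithmetic; this is a per-dL comparison only, since intake volumes are not reported here:

```python
# Composition per dL as cited in the text (Lucas et al. 1990);
# the percentage calculations are plain arithmetic, not study results.

preterm = {"protein_g": 2.0, "energy_kcal": 80.0}
standard = {"protein_g": 1.45, "energy_kcal": 68.0}

for nutrient in preterm:
    extra = (preterm[nutrient] - standard[nutrient]) / standard[nutrient]
    print(f"{nutrient}: +{extra:.0%} in the preterm formula")
```

The protein enrichment (about +38%) is roughly twice the energy enrichment (about +18%), so the preterm formula also has a higher protein:energy ratio, not just more of everything.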

When retested at the age of 7.5-8 y with a Wechsler Intelligence Scale for Children, children who had formerly received their own mothers' milk performed significantly better than their peers who had not (Lucas et al. 1992). This could be because the nutrient composition of formulas differs from that of human breast milk, particularly with regard to long-chain polyunsaturated fatty acids, but it is also open to other interpretations. For ethical and practical reasons it is not possible to randomly assign infants to breast- or formula-feeding, and there is ample evidence that parents who choose to breast-feed differ from those who choose not to. In the study of Lucas et al. (1992), breast-feeding mothers were in general of a higher social class and better educated, but an 8.3-point difference in IQ (a little more than half a standard deviation) in the children remained even after controlling for differences in the mothers' education and social class. The study also showed a positive dose-response relationship between the proportion of mother's milk in the diet and subsequent IQ, and children whose mothers had chosen to breast-feed but failed to do so had IQs similar to those of the bottle-fed group. Jacobson and Jacobson (1992) in Michigan also found higher scores on McCarthy scales in formerly breast-fed than in bottle-fed 4-y-old children, but after maternal IQ and measures of parenting skills were controlled statistically, breast-feeding no longer appeared to affect IQ. It has also been argued that the act of breast-feeding itself has profound effects on the behavior and physiology of mothers and infants that cannot be obtained by bottle-feeding, regardless of the composition of the formula and the dedication of the mother to feeding her child well. This argument, however, is not applicable to preterm babies, who are all tube-fed in the beginning.
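The characterization of the 8.3-point IQ gap as "a little more than half a standard deviation" can be verified directly; a minimal sketch, assuming the conventional Wechsler standard score scale (mean 100, SD 15):

```python
# Standardized effect size for the IQ difference reported by Lucas et al. (1992).
# Assumption: the Wechsler scales' population standard deviation of 15 points.
iq_difference = 8.3
wechsler_sd = 15.0

# Cohen's d-style standardization: raw difference divided by population SD.
effect_size = iq_difference / wechsler_sd
print(round(effect_size, 2))  # 0.55, i.e., a little more than half a standard deviation
```

Expressing group differences in standard-deviation units in this way makes effect sizes comparable across the different test instruments (Bayley, McCarthy, Wechsler) used in these studies.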

Results of a recent study in Chile suggest that a certain amount of omega-3 fatty acids (particularly docosahexaenoic acid) in the diet is needed for optimal development of visual function in premature babies (Birch et al. 1992). The fatty acid composition of human milk differs in certain respects from that of currently used formulas. Indexes of visual acuity reflected a higher degree of maturity in 4-mo-old exclusively breast-fed than in formula-fed infants (Birch et al. 1993). At 3 y of age, the breast-fed group still had better stereo acuity (assessed by operant preferential looking techniques) and better visual recognition scores than the formula-fed group (Birch et al. 1993). We do not yet know what the implications are for behavioral development, but the effects of certain dietary lipids on behavioral development have certainly become a topic of great interest.

Vitamins. Guilarte (1993) reviewed several studies suggesting that pregnant and lactating women may have dietary intakes of vitamin B-6 that are well below the recommended dietary allowance, and that this may affect the vitamin B-6 status of their offspring. Vitamin B-6 appears to be an essential cofactor in the developing central nervous system and may influence brain development and cognitive function in animals. Recent work in animal models suggests that vitamin B-6 deficiency during gestation and lactation alters the function of N-methyl-D-aspartate receptors, a subtype of receptors of the glutamatergic neurotransmitter system thought to play an important role in learning and memory. McCullough et al. (1990) reported that, in Egypt, vitamin B-6 concentrations in the mothers' breast milk were related to infant behavior and mother-infant interactions.

Multi-vitamin and -mineral supplements were reported to increase certain subscores of schoolchildren on intelligence tests (e.g., Benton and Roberts 1988), but it is not clear what nutrients may have been deficient in the habitual diets of these children and what constituent(s) of the supplements may have benefited them.

Where dietary intake data, biochemical indicators or clinical symptoms and signs strongly suggest the presence of a single nutrient deficiency, the appropriate action may continue to be food supplementation or fortification in the short and medium term, until the corresponding nutrient deficit has been rectified in the habitual diet of the individuals concerned.

Where the nature of the deficiency is less clear and appears to concern nutrients that are frequently associated with one another and with a poor quality diet, it remains a challenge to understand the pathogenesis and pathophysiological processes and to utilize this information in intervention strategies. However, supplementation with a single nutrient is unlikely to be completely effective, because other nutrients will rapidly become limiting. In such situations, supplementation trials and programs aimed at improving dietary diversity and quality in general are more likely to show effects on indicators of behavioral development.

Acknowledgment


The author gratefully acknowledges L.H. Allen's helpful comments on a draft of this paper.

