Ethical issues related to nutrition field trials
Joan P. Porter
This paper reviews several fundamental ethical principles, provisions of the US federal regulations for the protection of human subjects, and past deliberations of the Subcommittee on Vitamin A Deficiency Prevention and Control regarding ethical concerns in vitamin-A studies. The fundamental ethical principles were set forth by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in its Belmont Report [1]. These principles were the basis for the regulations of the Department of Health and Human Services (DHHS) for the "Protection of Human Subjects" [2], and they guide the design, review, and conduct of any research involving human subjects - they are the foundation for our rules and norms.
Ethical principles
The first ethical principle is respect for persons. Two corollaries of this principle are that individuals should be treated as autonomous agents and that persons with diminished autonomy are entitled to protection. Kant stated the principle in these words: "So act as to treat humanity, whether in thine own person or in that of any other, in every case as an end withal, never as a means only" [3]. Obtaining informed consent is an important activity in research that derives, in part, from this principle.
The second principle is beneficence. Persons are treated in an ethical way by making efforts to serve their well-being. Beneficent actions involve two rules: "do not harm," and maximize possible benefits while minimizing possible harms. Claude Bernard indicated that one should not injure even one person involved in research regardless of the benefit that might come to others [3]. This is not always an easy principle to apply, because avoiding what is harmful requires learning what is harmful, and that sometimes means exposing people to risks of harm. Also, researchers have to act to benefit their subjects based on their best judgement - and at times there is no ready consensus or scientific information about what is in the best interest of human beings. These imperatives, then, require us to weigh possible risks and possible benefits and to determine what is too great a risk in pursuit of a possible benefit. Research designers and groups who review research seek to minimize the risks and to maximize the benefits by using procedures, processes, and designs that clearly weight the benefits side of the scale - for the individuals involved and also for individuals to follow and for society in general.
Some ethicists who have considered beneficence have said there are obligations that derive from this principle. In Frankena's schema [cited in ref. 3, p. 111], for example, these are, in order of strength: (1) One ought not to inflict evil or harm. (2) One ought to prevent evil or harm. (3) One ought to remove evil. (4) One ought to do or promote good. I will return to the balancing of risks and benefits below.
The third principle is justice. Justice requires that we treat persons fairly and that we give each person what he or she is due. The National Commission [1] was concerned, in large part, with "distributive" justice, which involves the distribution of scarce benefits when there is competition for them. It was also concerned with the distribution of burdens, particularly with regard to imposing burdens on fewer than all members of a class of persons. Just distribution of burdens and benefits is an important concept underlying the selection of research subjects. As the National Commission notes, researchers have to decide who should receive the benefits and who should bear the burdens. The selection of subjects must be carefully reviewed to determine whether some classes are being systematically selected primarily because of their easy availability or manipulability. If the research pays off, one concept of justice requires that those who are disadvantaged have access to the benefits of research and that those who have borne the burdens be considered with high priority in distributing the benefits [1].
In writing about research and treatment of acquired immunodeficiency syndrome (AIDS), Leroy Walters [4] of the Joseph and Rose Kennedy Institute of Ethics has suggested a still nebulous, fourth ethical principle: "community," or "mutuality," or "solidarity." This principle acknowledges that, in view of the complexities of major health epidemics or problems, we must include this value with others to guide actions. Some of the vitamin-A studies in question here must accommodate the importance of local authority and of community reassurance and participation in decision making. These values appear to be related to Dr. Walters's emergent principle.
Risks and benefits
Risks and benefits can be physical, psychological, social, and economic. In this regard, we tend to think of the vitamin-A studies as trade-offs of physical risks and benefits, but the culture of the research subjects may sometimes create subtle benefits or risks - for example, community pressure to participate or not to participate in a study, or the risk of raising excessive expectations. Possible psychological risks might derive from one's having improved health by virtue of receiving an intervention or accompanying medical attention and then having that benefit withdrawn, sometimes rather rapidly, after the completion of a trial. The mere inconvenience of an intrusion into one's daily activities may also be a risk.
In the best of situations, those who are designing the research would provide quantitative estimates of the probability and magnitude of risks and benefits based on empirical data, but that is not an easy task. There may not be agreement, for example, on how to weigh a risk or benefit, or agreement on the relevance or accuracy of previous findings.
Several points about benefits should be considered when vitamin-A studies are contemplated. First, anticipated benefits really have to be weighed in terms of expected duration. Also, research designers must consider what happens if the benefit of an experimental health intervention is provided in a community that would not otherwise have access to it. Is there any possibility of continuation through other governmental or private support once an efficacious research intervention ceases?
Second, in areas of the world where individuals have no access to basic health care, one sometimes hears the argument that persons in the control group of a randomized trial, who receive no intervention or who receive a placebo, are "no worse off" for the research. They did not have access to a vitamin or other nutrient before the trials, so why should we be concerned if they have none now in the trial? Are the risks any greater, considering probability and magnitude, than those they ordinarily encounter in daily life [2]? Some may argue that this rationale seems to contradict the obligation to maximize benefits - especially if there are relatively basic medical interventions that can be provided to study subjects.
Third, there exists a dilemma if agreement is lacking on what works and what does not or on what previous research means or requires us to do to avoid harm and to promote benefit. Widespread application of tentative or unproven interventions is not ethical - it prevents us from finding the best treatment, raises hopes inappropriately, and exposes those who receive the unproven intervention to unknown risks. Further, unsound health interventions waste resources in a world where they are scarce.
Fourth, the concept of benefit is also closely linked to selection of subjects, as noted above. Those who stand to gain the most from the research results are those who might first be asked to assume the risks. In discussing maximization of benefits, Robert Levine [3] of Yale University suggests that ethical codes and regulations forcefully prohibit causing death or injury, but obligations to promote good are based on good scientific design and good balance of harms and benefits. If the benefit of promoting health is based on the obligation to avoid harm, then every subject - even those in a control group - should have the best proven diagnostic and therapeutic method. Withholding an effective therapy for a disease that if untreated may produce death or disability is not acceptable according to Levine [3, p. 45].
A corollary to justifying risks in terms of benefits is the obligation to minimize risks. Sometimes minimization takes the form of eliminating non-essential procedures, such as drawing extra blood. Another approach is monitoring, through data and safety monitoring boards, in blinded trials to detect unanticipated statistical trends that reveal problems, or results so dramatic that continuing the controlled trial is no longer justified. Setting clear end points and building consensus on the meaning of the data flow from the monitoring concept.
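As a rough illustration of the kind of interim check a data and safety monitoring board might run, the sketch below compares event rates in two arms of a blinded trial and flags the trial for early review if the difference crosses a deliberately conservative boundary. The arm sizes, event counts, and the |z| > 3 threshold (one common conservative convention) are illustrative assumptions, not figures or rules taken from the vitamin-A studies discussed here.

from math import sqrt

def interim_check(events_a, n_a, events_b, n_b, z_boundary=3.0):
    # Two-proportion z statistic at an interim look; the boundary is an
    # assumed, deliberately conservative stopping threshold.
    p_a = events_a / n_a                       # event rate, intervention arm
    p_b = events_b / n_b                       # event rate, control arm
    p = (events_a + events_b) / (n_a + n_b)    # pooled event rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return z, abs(z) > z_boundary

# Hypothetical interim data: 12 deaths among 1,000 children on treatment
# versus 35 deaths among 1,000 controls.
z, flag = interim_check(12, 1000, 35, 1000)
print(f"z = {z:.2f}; recommend early review: {flag}")

In practice a monitoring board would pre-specify such boundaries, the end points, and the schedule of interim looks before the trial begins, which is what "setting clear end points" amounts to operationally.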
Prescreening is another way to minimize risk. For example, in the vitamin-A studies, children with clear ophthalmological symptoms were not entered into the trials or were withdrawn from the test groups and treated. In selecting subjects, research designers should consider involving the least vulnerable, least at-risk persons to obtain the data needed. Some may say that this takes the form of involving persons with marginal deficiencies or only mild illnesses rather than those at later stages of deprivation or illness, if possible. Other things being equal, one looks for the least vulnerable representatives of populations as subjects.
Some of the studies the subcommittee reviewed in 1986 involved the randomized clinical trial, in which one group receives the intervention and a control group receives a placebo or nothing [5]. Many ethical problems are peculiar to this type of study design. Some researchers believe that we have relied on randomized clinical trials too much. D. L. Sackett [cited in ref. 3, p. 136] maintains that the objectives of clinical trials are validity, generalizability, and efficiency - the first objective, validity, being the mandatory one. Sackett believes that problems arise when the three objectives are out of balance or given the wrong priority. For example, efficiency might require that high-risk persons be enrolled so that dramatic results can be achieved, but validity and generalizability may be compromised.
Use of historical controls is one alternative to randomized clinical trials. But, in the face of many confounding factors, this design is often criticized. Richard Feachem has suggested that it is unethical to follow prospectively children with any signs or symptoms of vitamin-A deficiency without providing full vitamin-A therapy. He also suggested use of the case-control method in diarrhoeal and vitamin-A studies as an alternative to randomized clinical trials [6, p. 13].
US federal regulations
The ethical principles that have been set forth are not always easy to apply. Nor is it always evident how to apply them. The DHHS regulations for the protection of human subjects [2] provide some rules and processes for application of the principles. Two processes required by the regulations provide a means to put the principles into practice: use of an institutional review board (IRB) and obtaining informed consent. Although some ethical codes, such as the Declaration of Helsinki or those of the Council for International Organizations of Medical Sciences of the World Health Organization, recommend general courses of action for the conduct of ethical research, the DHHS regulations are more explicit and prescribe procedures.
(Not addressed here is subpart D of the regulations, "Additional Protections for Children Involved as Subjects in Research." Note that the regulations provide several additional risk/benefit categories for an IRB to examine. For example, if, in a research protocol intended for support, the risks are relatively high and the knowledge to be gained and benefits are quite important but far removed from the children who are subjects, the Secretary of Health and Human Services would convene a second review to consider the research and make recommendations about whether it can go forward.)
Institutional review boards
The regulations call for a local committee of specific composition, an institutional review board, with sufficient authority and an independent perspective to assess the balance of the possible risks and possible benefits of research. The IRB assesses the minimization of risks, the adequacy of the informed-consent process, and protection of vulnerable subjects. Along with research experts, the IRB must include persons unaffiliated with the institution sponsoring the research, persons who can represent community attitudes and values, and persons who are not scientists [2]. Although it is important to involve government officials in the planning of field studies, the perspective of those whose agendas are not political can sometimes help sort out which risks and benefits are acceptable.
In negotiating assurances of compliance with the regulations for the protection of human subjects for research that is sponsored by a domestic institution in a foreign country, the Office for Protection from Research Risks, with few exceptions, asks for at least two IRB reviews - one by the IRB in the domestic institution and one in the country where the research is to be conducted. Local review, i.e., in the community where the research will take place, is sought. Whereas most IRBs in the United States are attached to academic or research organizations, there are many variations abroad. Quite often an existing government review committee, supplemented to meet the requirements of the regulations, is utilized. Because these government bodies often do not themselves conduct research, their reviews can be complicated by other goals. In most of the trials, the national governments are involved at some point, and they must be for the success of the effort and for ensuring continued commitment if the results of the research are favourable.
Obtaining informed consent
The other process required by the regulations is obtaining informed consent. The basic elements of informed consent include considerations a reasonable person would want to know. The informed-consent process respects the value of autonomy of an individual. In this process, persons must be told (a) that the intervention is research, (b) what alternative treatments there are, if any, and (c) the foreseeable risks and foreseeable benefits to themselves and to others from their participation as subjects. They are informed about the extent of confidentiality and, if the research is of more than minimal risk, of any availability of compensation in the event of an injury. They are told that their participation is voluntary, that they may withdraw at any time, and that, if they decide not to participate in the research, they will not be denied any benefits to which they might otherwise be entitled. They are also given the name of someone of whom they may ask additional questions [2].
Informed consent, verbal or written, must be obtained; but, if it is to be meaningful, information must be explained, and it is no easy task to explain alternative methods, long-term risks and discomforts, and possible benefits, particularly to potential subjects in developing countries [7].
To hand out consent forms to illiterate people is not sufficient. Also, in some areas, signing any papers is considered an act of self-incrimination or makes potential subjects quite uneasy. One might also need to seek consent from the traditional head of an area, a parent, or the chief teacher for schoolchildren. Husbands must also consent for their wives in some areas.
In the United States, we place a high value on autonomy and individual rights. In other cultures, this value is modulated somewhat by values of community solidarity and community authority. In such cultures, it is important to convince local tribal and village elders and religious leaders of the acceptability of the research to be done. This requires more lead time to prepare for a study, but it helps to ensure a high degree of participation and compliance. Obtaining the local leader's consent is necessary, but is not sufficient to guarantee participation. Dr. Keith West of the Johns Hopkins Medical Institution cautions that sensitivity to superstitions, local norms, and local behaviour - indicators that tell a researcher whether someone is consenting or refusing - is essential (personal communication, 1988). De Maar et al. [7] and others writing about the management of clinical trials in developing countries have advised that permanent national or international advisory committees can provide helpful broad support (e.g., scientific advice and data monitoring), but they can be costly. Also, the co-operation and understanding of local health dignitaries and the traditional medicine man or the local dispenser should be sought. Their dissatisfaction could jeopardize a study [7].
In developing countries in which field trials are to be conducted, researchers and IRBs need to consider, then, the constraints on obtaining permission to conduct trials and on obtaining informed consent. E. Ekamen [8] of Nigeria, in discussing methodological constraints and limitations in developing countries, cited illiteracy as the major drawback to the conduct of research. Populations are often ill-informed and have little understanding of the value and objectives of research projects; they do not participate unless some clear benefit is offered. Language barriers should not be underestimated; sometimes there are no local concepts for the technical terms involved in research. Concepts of time and causality are culturally defined. Placebo-controlled trials and blinded and double-blinded studies are not well understood [9]. Also, it should be remembered that selecting persons as potential subjects who are in schools or other institutions in developing countries may lead to a biased sample of the general population.
Previous subcommittee deliberations
The deliberations of the subcommittee in August 1986, as reported in Vitamin A Supplementation: Methodologies for Field Trials [5], identified the following problems:
There were minority opinions among the subcommittee. Some, for example, noted that persons with marginal vitamin-A status are at increased risk of developing severe deficiency. Their risk remains if they are in the control group receiving a placebo. Children who will eventually become xerophthalmic because of vitamin-A deficiency are already compromised, but the ways to measure and diagnose this are not so direct. Because some subcommittee members do not consider vitamin A an "experimental agent," they urged studies without placebo groups in areas of high morbidity and xerophthalmia. Others were concerned that high risks were borne by the control population, whereas benefits would be universal.
One suggestion was that retrospective case-control studies of mortality, based on who accepts and who refuses vitamin-A supplements, might be an alternative to randomized clinical trials that would shift or minimize risks more appropriately. Other possibilities suggested were time-staggered introduction of capsules and other services into planned vitamin-A health care delivery, intensive vitamin-A distribution to selected populations, and regular distribution of government supplements to other populations.
Ethical safeguards considered by the subcommittee included the following:
It appears that tensions surround the questions of how much we already know and of the extent to which subcommittee members agree on the validity and generalizability of the research data about vitamin-A studies. Ethical deliberations are not undertaken within a neat framework. There is always some degree of conflict about the nature and weights of values, benefits, and risks, about the primacy of principles, and about interpretation of previous studies. There are also conflicts about deciding on research priorities and about balancing research versus other immediate health treatment necessities in the face of scarce resources.
The subcommittee needs to address these points in considering future studies. Although there are no easy answers, this paper should help to refresh memories about past discussions and to guide future deliberations.
References
Commentary: Underpinning vitamin-A deficiency prevention and control programmes
Barbara A. Underwood
If you give a hungry man a fish, he is fed for one day but is dependent upon you for continued sustenance. If you teach a hungry man to fish, he is independent for life.
The analogy between this well-known saying and the approaches to the prevention and control of vitamin-A deficiency is obvious: providing children with high-dose capsules of vitamin A saves many from developing clinical symptoms and perhaps reduces mortality and morbidity, as long as the dose can be delivered repeatedly at specified intervals. If the system fails or the individual child is not reached, the problem recurs. Approaches to prevention that foster practical solutions attainable through better utilization of available food and other resources are more difficult to implement and take longer to bring about the needed behavioural changes in child-rearing practices. But they can be permanent and address health and nutrition issues that commonly coexist with vitamin-A deficiency.
Most vitamin-A intervention programmes recognize these facts and include an "educational" component. In practice, however, the educational component takes a back seat to efforts required for the delivery and monitoring of the high-dose capsule. The personnel responsible for capsule delivery frequently inform recipients of what the capsule is for and of foods they should eat that contain the vitamin, but fail to communicate the message in a locally appropriate, meaningful way that changes behaviours: such communication may be perceived as taking too much time. This fact is illustrated by the evaluation report of the Bangladesh vitamin-A distribution programme described in an earlier issue of the Food and Nutrition Bulletin [1].
Clearly there is need to rethink strategies for vitamin-A-deficiency prevention and control. The high-dose medical approach is appropriate under circumstances where a public health problem exists and alternatives are not feasible, e.g. where water, transportation, and food-storage facilities are in short supply or non-existent. Often, however, these circumstances are regionally clustered and not applicable to an entire country. But even under these circumstances, strategies that combine the short-term medical approach with programmes addressing underlying conditions that contribute to high rates of infections - e.g. programmes to improve personal and environmental sanitation and increase immunization coverage - can have beneficial spin-off effects on the vitamin-A deficiency problem. As the evaluation of the Bangladesh programme by Darnton-Hill et al. [1] illustrates, the efficacy of the medical model is limited by the inefficient delivery system. There is no doubt that the programme has saved the sight and lives of many Bengali children, but, as the authors note, it has not reduced the overall prevalence of the problem - even after 14 years. In addition, the struggle to improve the delivery system and its monitoring is consuming much of the national human and economic resource pool.
During the 14 years of the Bangladesh programme, some evidence indicates that diets not only have not improved nutritionally with respect to vitamin A but have deteriorated, and that little change has occurred in personal and environmental health practices. After 23 rounds of vitamin-A-capsule distribution, knowledge about the programme remains limited: 34%-60% of mothers did not know what the capsule was for, 15%-21% had not seen the educational materials, and 51%-75% could not name a vitamin-A-rich food. It is precisely this kind of evaluation data that is frequently used by opponents to illustrate that educational approaches don't work! But can we blame this failure on the educational approach, or should we admit that we have been ineffective communicators in the educational component of the currently operational high-dose programmes? Often we ask overburdened, unmotivated, and minimally trained delivery personnel to get the message out. Or we determine that only those who have higher education have sufficient knowledge to effectively compose and communicate the message, whereas those to whom we want most to relate are underprivileged and often lack formal education and access to other social programmes. But they are survivors. As survivors they have had to make choices - choices that include which of the many messages they hear and programmes forced upon them they will choose to act upon in the use of their limited resources, both of time and of money. Choice, however limited, is valued irrespective of socio-economic status.
People change practices when they are convinced that the change is to their benefit and they choose to change. Choice is too frequently left out of approaches to solving public health problems, including vitamin-A intervention strategies. Most universal capsule distribution programmes do not entertain choice as an option, yet targeted recipients for such programmes choose not to participate in increasing numbers in successive rounds, as evaluations of the Bangladesh and other national programmes illustrate. Indeed, proponents of fortification programmes proclaim the lack of choice as the major advantage of a fortification strategy. But, as occurred with the sugar fortification programme in Guatemala, effective as the programme was while operational, the situation deteriorated rapidly when it was disrupted by internal political and economic changes. No demand for continuation of the programme had been created among the passive recipients.
How can the concept of choice be introduced into strategies for the prevention and control of vitamin-A deficiency? Just as with any other programme, there is not likely to be a universally applicable answer. Each situation has to be evaluated at the national, community, and family levels. The important point is that choices usually do exist if imagination and innovative thinking are applied, and these choices could be made available when considering strategies at each level of intervention. In some instances where clinical deficiency is rare, a national programme to improve the intra-country preservation, storage, and year-round availability of vitamin-A-containing foods, combined with an effective programme to improve consumption, might be an appropriate alternative to a high-dose programme. Elsewhere, a community-based feeding programme, a community- or family-level income-generation programme to provide economic resources to permit a choice of appropriate foods, or a kitchen/community garden may be alternatives - and these programmes are not mutually exclusive. Until we create a demand for a programme or a product, i.e. convert programme recipients into programme consumers, whether for a high-dose capsule, lower-cost green leafy vegetables, or better means of preparing and preserving vitamin-A-rich foods for feeding young children, it is difficult to conceive of achieving the effective sustained behavioural change that must occur to eradicate and control vitamin-A deficiency as a public health problem.
Reference
Breast milk the life saver: Observations from recent studies
Karima A. Dualeh and Fitzroy J. Henry
Breast milk is universally accepted as the best food for infants, and its desirable properties have been extensively described [1]. This paper reviews the results of recent studies that improve our understanding of the role of breast-feeding in child health and survival and concludes that, despite much recent attention, breastfeeding is still much undervalued.
Four important questions are considered:
The growth of exclusively breast-fed children
In 1980, Waterlow and colleagues linked infant mortality with growth faltering [2]. Data presented from several countries in the developing world indicated a decline in the growth rate of infants between the ages of three and four months when compared with the mean for the United Kingdom. Waterlow further showed that, theoretically, after three months of age breast milk alone was insufficient to sustain adequate growth [3]. Inadequate production of breast milk by undernourished mothers would contribute further to growth "faltering" and lead to a degree of relative malnutrition in their offspring, which would increase mortality at this age. Recent studies have attempted to clarify this issue.
A study of 96 exclusively breast-fed infants in the United States indicated that most of these children grew adequately without supplementation during the major part of their first year of life [4]. However, in Australia the weight increments of healthy exclusively breast-fed infants fell below the UK standard of normal growth after three months of age [5]. These and other studies that have attempted to link prolonged exclusive breast-feeding directly with growth [6; 7] are still controversial and inconclusive. One problem is that these attempts to identify an age at which exclusive breast-feeding becomes inadequate are limited by individual variations and self-regulation controls within the infant [8-10].
Ultimately, all infants must be weaned, but the question remains as to when the risk for the infant from malnutrition due to the inadequacy of breast milk is greater than the risk from diarrhoea due to early supplementary feeding, which invariably introduces contamination [11]. To answer this question it is necessary to consider the influence of infections on the relationship between feeding and growth.
Studies in urban Gambia have shown that, although infants spent 15% of their time ill with diarrhoea, the impact of this load of infection on growth was felt mainly by those who were mixed-fed during weaning, not by the children who were exclusively breast-fed [12]. In the Sudan a similar pattern of effect was observed even though diarrhoea was less prevalent there [13]. These studies did not consider the effect of infection on nutrient intake, but they do indicate that breast-feeding reduces the impact of infections on growth.
To what extent does breast-feeding reduce infections?
Studies in poor communities have shown that breast-fed infants have lower diarrhoeal morbidity than other, otherwise similar, infants [14]. Infants who receive no breast milk are at greater risk than those who are exclusively breast-fed [15]. Furthermore, breast-feeding may reduce the severity of diarrhoeal disease [16] and also morbidity from other illnesses, including acute respiratory infections, meningitis, measles, and allergies [17]. One major consequence of infectious disease, particularly diarrhoea, is reduced food intake [18; 19]. However, in one community in Bangladesh, only minor decreases in intake were noted for rural children with diarrhoea [20]. Moreover, in those hospitalized, anorexia was responsible for a substantial reduction in intake of supplemental foods, but the intake of breast milk was apparently unaffected [21].
In summary, these studies suggest that breastfeeding, both exclusive and partial, appears to reduce the severity of diarrhoea and also lessens the effect of reduced nutrient intake that commonly accompanies an attack. The protective effects of breast-feeding are presumed to be due to both the intrinsic anti-infective properties of breast milk and to reduced exposure to contaminated foods. But do breast-fed children experience fewer infections?
When does prolonged breast-feeding cause malnutrition?
It is recognized that breast milk alone is insufficient to support normal growth during the second half of infancy. For this, supplementary feeding is required [22]. But when does continued breast-feeding hamper the growth of children? Several recent studies have reported that prolonged breast-feeding (i.e. beyond 12 months) may be associated with a higher prevalence of malnutrition than is found in non-breast-fed children [23-26]. Recently, it was suggested that breast-feeding should stop at around 18 months [27] because nutritional status has been observed to be poorer at this age in children who are still being breast-fed.
Is there any clear biological reason why breast-fed children should be more malnourished after one year of age than fully weaned children even when they receive supplementary food? Breast milk has the highest energy density (calculated on the basis of dry weight) of all the foods consumed by children of this age in Bangladesh [28]. In Uganda consumption of energy was 17% higher for children over 18 months old who were breast-fed than for those who were not [29]. Similarly, in Kenya children 18-23 months old had 108% and 84% of their recommended daily calorie intake in the breast-fed and non-breast-fed groups respectively [30]. Studies in Zaire [31], New Guinea [32], and Bangladesh [33] have shown that children receive 500 to 600 ml of breast milk per day. Furthermore, women who are poorly nourished can produce up to 700 ml per day during the first six months and 300 to 500 ml per day in the second year [1]. In the lean pre-harvest season in Machakos, Kenya, breast-milk intakes averaged 405 ml per day for children 12-17 months old [30]. The protein in breast milk has the highest biological value compared with other foods [1]. Furthermore, during extended lactation the composition of protein remained constant in studies in Côte d'Ivoire [34]. During the second year a mean daily protein intake of 2.2 g per kilogram of body weight was maintained in both Uganda [35] and Kenya [36].
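To put these volumes in perspective, the short calculation below converts the reported daily intakes into approximate energy contributions. The energy density used (about 0.67 kcal per ml of mature breast milk) and the daily requirement used for comparison (about 1,100 kcal for a child in the second year) are assumed round figures for illustration only, not values drawn from the studies cited above.

# Rough, illustrative arithmetic: energy contribution of breast milk in the
# second year of life. Both constants below are assumed approximations.
ENERGY_DENSITY_KCAL_PER_ML = 0.67   # assumed typical value for mature breast milk
DAILY_REQUIREMENT_KCAL = 1100       # assumed approximate need, child 12-23 months

for intake_ml in (405, 500, 600):   # daily intakes reported in the text above
    kcal = intake_ml * ENERGY_DENSITY_KCAL_PER_ML
    share = kcal / DAILY_REQUIREMENT_KCAL
    print(f"{intake_ml} ml/day ~ {kcal:.0f} kcal ~ {share:.0%} of daily energy needs")

On these assumptions, even the lean-season Kenyan intake would supply roughly a quarter of a young child's daily energy, and the more typical 500 to 600 ml would supply roughly a third, which underlines that continued breast-feeding remains a substantial nutritional contribution.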
Why, then, do breast-fed children tend to be more malnourished around two years of age even when their diet is supplemented with other foods? Is breastfeeding at fault? It should be noted that the studies mentioned above have not collected data on the frequency of breast-feedings or the volume of breast milk taken, or on the adequacy of other foods given to the child and other such information that may help to explain the poorer nutritional state. A recent unpublished observation by A. Briend and colleagues, however, indicates that breast-fed children who were about to be weaned had a lower nutritional status than those who continued breast-feeding. This means that cessation of breast-feeding is not the main cause of the poorer nutritional state. Hence, the mother of a malnourished child should not be advised to stop breast-feeding in an attempt to improve the child's dietary intake. Furthermore, the decision to stop breast-feeding should not be taken merely on the basis of a child's age.
Breast-feeding and survival
A recent case-control study in Brazil [37] shows that infants who received no breast milk were 14 times more likely to die of diarrhoea than exclusively breast-fed infants. Infants receiving animal milks in addition to breast milk were four times more likely to die of diarrhoea than exclusively breast-fed infants. Furthermore, with each additional daily breastfeeding, children showed a 20% decrease in the risk of death from diarrhoea [37]. Data from Bangladesh suggest that breast-feeding may protect against diarrhoeal mortality well into the third year of life [38].
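Figures such as "14 times more likely" come from comparing the feeding histories of infants who died of diarrhoea (cases) with those of surviving controls. The sketch below shows how an odds ratio and an approximate confidence interval are computed from such a two-by-two table; the counts are entirely hypothetical and are not the Brazilian study's data.

from math import exp, log, sqrt

def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    # Odds ratio and approximate 95% confidence interval from a 2x2
    # case-control table. "Exposed" here means not breast-fed.
    or_ = (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)
    se_log = sqrt(1/cases_exposed + 1/cases_unexposed + 1/controls_exposed + 1/controls_unexposed)
    low, high = exp(log(or_) - 1.96 * se_log), exp(log(or_) + 1.96 * se_log)
    return or_, low, high

# Hypothetical counts: 60 of 100 infant diarrhoea deaths were not breast-fed,
# versus 20 of 200 surviving community controls.
or_, low, high = odds_ratio(60, 40, 20, 180)
print(f"odds ratio = {or_:.1f}, 95% CI {low:.1f}-{high:.1f}")

When the outcome is rare, the odds ratio approximates the relative risk, which is how statements such as "14 times more likely to die of diarrhoea" are derived; published analyses of this kind typically also adjust for confounders such as age and socio-economic status.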
The question is: When should breast-feeding stop? The suggestion that the age of weaning (the cessation of breast-feeding) should be 18 months [27] can be misleading and unwise under conditions still prevailing in Bangladesh. This is so because, despite the lower nutritional status of supplemented breast-fed children compared with those weaned earlier [23-26], there is also good evidence that mortality in such children is substantially lower both in hospitals [39] and in the community [38]. The findings indicate that supplemented breast-feeding can have a favourable impact on survival compared with full weaning even when it results in less than optimal nutritional status [38]. Hence, poorer nutritional status alone should not be the criterion for terminating breast-feeding, although it may indicate a need for better complementary feeding.
In Bangladesh women of low socio-economic status tend to breast-feed longer [40]. While this may account for the lower nutritional status of breast-fed children in Bangladesh as judged by anthropometric measurements, it should not be assumed that this is the result of breast-feeding per se, but rather of the lack of resources to obtain sufficient supplementary food. The protective effect of breast-feeding at this age is likely to be critical for the survival of malnourished children [41]. In fact, the protective effect of prolonged breast-feeding on survival was observed only in malnourished children. These studies lead to the conclusion that underprivileged mothers should be advised to breast-feed as long as possible, because human milk provides more than nutrition: it is a protection against diseases. It saves lives, particularly in the malnourished.
References