



Micronutrient deficiencies


Let us turn now to the consequences of deficiencies of the micronutrients iron, iodine, and vitamin A. Fortunately, the classical vitamin-deficiency diseases (beriberi, due to inadequate thiamine; pellagra, due to inadequate niacin and tryptophan; and scurvy, due to inadequate vitamin C) have largely disappeared as public health problems. Unfortunately, they are being seen again in refugee populations in Africa and Asia [64, 65]. There may be some areas where zinc deficiency is of significance, but, with the possible exception of some populations consuming diets based heavily on unleavened bread, this remains unproven.

 

Iron deficiency

Iron deficiency is the most widespread nutrition problem in the world today. Long before its cause was known, the pallor of anaemia was associated with weakness and tiredness. It is now recognized that even mild to moderate iron deficiency without anaemia has adverse functional consequences, although they are less obvious [66]. Iron deficiency has been shown to affect adversely the physical capacity and work performance of adolescents and adults; the cognitive performance, behaviour, and growth of infants and preschool and school children; immune status and morbidity from infections in all age groups; and the maintenance of body temperature in adults exposed to a cool environment. I will briefly review the most recent evidence for these effects.

Prevalence

From 700 to 800 million people are affected by iron-deficiency anaemia [67], and as many more have subclinical iron deficiency. Women of reproductive age and young children in tropical and subtropical developing countries are particularly vulnerable. Table 7 gives iron-deficiency anaemia rates by region for adults and children [68]. The overall rates are 26% for men, 42% for women, 51% for children 0-4 years of age, and 46% for those of school age. Even in industrialized countries, iron-deficiency anaemia is estimated to affect 10% of the population. An equal number of people are iron-deficient without being anaemic.

TABLE 7. Estimated prevalence of anaemia by geographic region and age/sex category around 1980

                      Men            Women 15-49 years                Children
                      15-49 years    Pregnant       All               0-4 years      5-12 years
                      %      N       %      N       %      N          %      N       %      N
Africa                20     23.4    63     11.3    44     46.8       56     48.0    49     47.3
Latin America         13     12.8    30     3.0     17     14.7       26     13.7    26     18.1
East Asia             11     6.1     20     0.5     18     8.4        20     3.2     22     5.6
South Asia            32     123.6   65     27.1    58     191.0      56     118.7   50     139.2
Developing regions    26     162.2   59     41.9    47     255.7      51     183.2   46     208.3

N = estimated number of persons affected, in millions.

A recent report from India gives an anaemia prevalence of 60%-80% for pregnant women in different regions [69]. A corresponding figure of 59% has been reported from Indonesia (see FIG. 8. Anaemia among men, pregnant and non-pregnant women, and children 1-5 years old in Indonesia; data from D. Karyadi, personal communication, 1991). Figure 9 (see FIG. 9. Anaemia among children 0-1 and 0-4 years old in Brazil; data from J. Dutra de Oliveira, personal communication, 1991) shows 29% of children 0-4 years old in Brazil to be anaemic and 83% of them to be iron-deficient, with even higher rates for infants.

Iron-deficiency anaemia affects substantial numbers of persons in the United States. If iron deficiency rather than anaemia is the criterion, one-third of premenopausal women in the United States fall into this category, with minority groups more heavily affected. NHANES II data for 1976-1980 [70] reveal high frequencies of impaired iron status in US preschool children and premenopausal women, particularly among those whose incomes are below the poverty level. The report documents an overall 7.2% prevalence of actual anaemia in women 15-44 years old, with the highest burden in minority and poverty groups. For example, the percentages with two or three abnormal iron-status indicators were 11.7% for white but 31.3% for black adolescents 15-19 years old [70].

Iron and behaviour

The earliest functions to be affected by iron deficiency are those of brain enzymes involved in cognition and behaviour. Performances on tests of brain function and scholastic achievement are reduced in iron deficiency, and this is usually not reversible unless the deficiency is mild. Iron supplementation has been shown to improve the cognitive performance of iron-deficient infants [71], preschool children [72], and adolescents [73] in the United States. This effect has been confirmed by studies in infants in Chile [74], Costa Rica [75], Guatemala [76], and Indonesia [77] and in school children in Egypt [78], India [79], Indonesia [80, 81], and Thailand [82]. In Thailand, performance on Thai language and mathematics tests was significantly related to haemoglobin status and was not reversed by supplementation that restored normal iron parameters in blood [82].

A recent article by Lozoff et al. [78], following up their earlier study in Costa Rica, reports that children who had moderately severe anaemia as infants had lower scores on tests of mental and other functioning at school entry than the rest of the children, even when the data were controlled for a comprehensive set of socio-economic factors. The mechanisms by which iron deficiency impairs learning and behaviour, both reversibly and irreversibly, are probably multiple and related to the stage of physical development. Nevertheless, it is now firmly established that iron deficiency can impair cognitive performance at all stages of life and that, when it occurs in infancy and childhood, its effects may not be reversible. Recent work from Israel, using an animal model, suggests that the development of neurons essential for key neurotransmitters may be inhibited, with consequences evident in later life [83-85].

Iron and infection

Next to suffer are cells of the immune system. A number of antimicrobial systems within the neutrophil are adversely affected by iron deficiency [86-89]. One is the capacity of the cell to kill ingested micro-organisms by a so-called respiratory burst [90]. This involves the iron-containing enzyme myeloperoxidase [91]. Chandra reports that iron deficiency in Indian children decreases the ability of lymphocytes to replicate when stimulated by a mitogen and lowers their capacity for respiratory bursts necessary for killing ingested organisms [92].

Figure 10 (see FIG. 10. Lymphocyte proliferation, intracellular bactericidal capacity of neutrophils, and quantitative nitroblue tetrazolium test, related to serum transferrin saturation. The vertical bars indicate the means and ranges of values obtained in iron-replete controls [93]) shows the impairment with iron deficiency, as indicated by transferrin saturation, of lymphocyte proliferation, the intracellular bactericidal capacity of neutrophils, and the ability of these cells to produce oxidative bursts [93].

There is also extensive evidence from India [90, 94-96] for a direct relationship between iron status and the concentration of cells responsible for cell-mediated immunity. The skin-test response to common antigens, indicating the development of immunity, has been found to be reduced in studies in iron-deficient children in India [90, 94]. Additional evidence for the effects of iron deficiency on immunity has been reviewed by Keusch [97].

Iron deficiency in Alaskan native children has been reported to be associated with increased diarrhoeal and respiratory disease [98], and meningitis was observed to be fatal only in anaemic children [99]. A study by Basta et al. in Indonesia [100] found that the greater morbidity from infection among anaemic rubber tappers decreased after iron supplementation. In field studies in both Egypt [101] and Indonesia [102] a decrease in diarrhoeal and respiratory infections was observed in the groups receiving iron supplementation.

Iron deficiency, work performance, and productivity

Figure 11 (see FIG. 11. Correlation of haemoglobin status with Harvard step-test performance as a measure of physical fitness in Guatemalan agricultural labourers [103]) shows a linear correspondence between haemoglobin status and Harvard step-test (HST) performance in adult plantation workers in Guatemala [103]. When the individuals with low haemoglobin received an iron supplement, their performance improved markedly. In Indonesia Basta et al. [100] found similar differences in HST scores between anaemic and non-anaemic road-construction workers and rubber tappers. The anaemic workers had markedly improved performances when they were given iron for 60 days.

The results among male tea pickers in Sri Lanka were similar [104-106]. In both Sri Lanka and Kenya [107, 108] treadmill performance was shown to be proportional to plasma haemoglobin. Spurr et al. [109] found linear correlations between haemoglobin status, aerobic power, and other measures of physical capacity in Colombian sugar cane workers. Low plasma haemoglobin was also associated with poor running performance in Gambian children [110].

The question remains whether these observed differences in physical capacity affect productivity. Basta et al. [100] subsequently showed a strong correlation between haemoglobin status, HST performance, and the take-home pay of Indonesian rubber-plantation workers. Tappers given iron supplementation for 60 days increased their take-home pay by more than 30%. Even among the weeders who were not paid on an incentive basis, less area was weeded by those who were anaemic, and their output also increased with supplementation.

In a subsequent study in Indonesia [102, 111], the quantity of tea leaves collected per hour was significantly less for anaemic women; their work output increased by 24% after four months of iron supplementation. Similar results were obtained in a larger follow-up study sponsored by the United Nations University on four other tea plantations in the same area. The daily productivity of the anaemic male tea pickers in Sri Lanka increased more than 20% after iron supplementation for one month [104, 105]. An increase in the agricultural productivity of Indian women given iron supplementation has been reported [112]. Productivity also increased after iron supplementation of anaemic male agricultural workers in Colombia [109] and industrial workers in East Africa [108].

The impact of iron deficiency on mothers

There are additional adverse consequences of iron deficiency in childbearing women. Maternal mortality, prenatal and perinatal infant loss, and prematurity are significantly increased [113, 114]. Favourable pregnancy outcomes are 30%-45% less frequent in anaemic mothers than in normal mothers, and their infants have less than one-half of normal iron reserves. Such infants are at greater risk of morbidity and mortality during infancy [115]. Undernutrition during pregnancy leads to low-birth-weight infants who exhaust their iron stores earlier and therefore require more iron than breast milk supplies at an earlier age than infants of normal birth weight [116, 117].

The impact of iron deficiency on child growth

Iron-deficient children given supplementary iron showed improved growth in Indonesia [118], Kenya [119], and Bangladesh [120]. This was also evident in studies in the United Kingdom [121] and the United States [122]. Whether or not an effect of iron supplementation is observed apparently depends on local factors, including infections, age at depletion, and possibly other dietary factors [72].

Iron deficiency and temperature regulation

On the basis of studies in experimental animals showing that iron-deficient anaemic animals readily become hypothermic and have depressed thyroid function, human studies were conducted first in Venezuela [123-126]. Martinez-Torres et al. [126] observed that iron-deficient anaemic subjects submerged in relatively warm water (28 °C) for one hour were unable to maintain normal body temperature. This has since been confirmed by additional studies in the United States [127].

 

Vitamin A

Vitamin A is converted directly to the visual pigment of the eye, which is essential for night vision. An early sign of vitamin A deficiency is night blindness. The epithelium of the conjunctiva and cornea is also particularly susceptible to lesions due to vitamin A deficiency. The eye becomes dry, and foamy areas may appear on the conjunctiva. As the deficiency progresses, the cornea becomes eroded, the iris prolapses, the lens is extruded, and a corneal scar develops, resulting in blindness.

WHO estimates that about 40 million children in the world suffer from vitamin A deficiency, although the prevalence varies greatly between regions and countries [68]. About 350,000 infants and young children become blind annually because of vitamin A deficiency, and 70% of these die within one year, mainly because of their susceptibility to infections. Vitamin A deficiency is recognized as a public health problem in 37 countries, and subclinical deficiency undoubtedly affects many more children. Although most children who go blind from vitamin A deficiency die within a year, about 250,000 of those blinded each year survive to burden their societies.

Four recent studies, summarized in figure 12 (see FIG. 12. Reduction in the mortality of children 0-5 years old with vitamin A supplementation for 12 months in Aceh [130] and West Java [131], Indonesia; Madurai, India [132]; and Sarlahi, Nepal [129]), have reported dramatic decreases in the mortality of children 0-5 years old given supplementary vitamin A for 12 months [128]. The decreases ranged from 30% in Sarlahi, Nepal [129], and 34% in Aceh, Indonesia [130], to 45% in Java [131] and 54% in Madurai, India [132]. Significant effects were not reported from studies in Hyderabad, India [133], and the Sudan [134]; the children in the Sudan study were so malnourished that other nutrients may have been limiting. There is extensive evidence from experimental animals that vitamin A deficiency can adversely affect immunocompetence and other resistance to infections [135]. However, the reason for the effect of vitamin A supplementation on mortality, with little or no detectable effect on morbidity, is not yet known.

A study in Tanzania [136] and two in South Africa [137, 138] have shown a sharp reduction in measles mortality and other complications in children given vitamin A when the disease is diagnosed. As a consequence, WHO recommends administration of vitamin A with the onset of measles wherever the case fatality rate exceeds 1% [139]. Reduction in morbidity from infectious disease with improved vitamin A status has also been shown in studies in Indonesia [140], India [141], and Thailand [142], but this seems situation-dependent [128].

 

Iodine

Until recently, iodine deficiency was identified only with a compensatory swelling of the thyroid gland known as endemic goitre and with cretinism, a manifestation in the child of severe iodine deficiency during gestation. The typical cretin has profound mental deficiency, a characteristic appearance, a shuffling gait, shortened stature, and spastic dysplasia. The subject is usually deaf and mute, and usually dies unless given good care. It is now recognized that, even when cases of cretinism are few in number, they indicate a much larger number of persons who do not have the classic signs of cretinism but whose linear growth, intellectual capacity, and other neurological functions such as coordination are compromised to varying degrees because of iodine deficiency in their mothers [143-147]. In addition, iodine deficiency causes an increased rate of stillbirths and abortions and may have other adverse effects. Endemic goitre rates in schoolchildren are the most convenient indicator of iodine deficiency in a population. WHO recommends that a goitre prevalence above 10% in a population should be taken to indicate a public health problem requiring preventive measures.

About one billion people of all ages are considered to be at risk from iodine deficiency, although cases of endemic goitre are estimated at 200-300 million. About 20 million of these are believed to experience some degree of mental retardation or other neurological change, and about six million show signs of cretinism. Iodine deficiency disorders are significant in at least 90 countries [148, 149].

Endemic goitre is global in distribution wherever populations depend on local food supplies grown on iodine-poor soil. Such soils are found in regions that have been glaciated, in mountainous areas, and where heavy rainfall leaches micronutrients. Except where iodated salt has been introduced, goitre is still prevalent along the Andes, across central Africa, in the Asian subcontinent, and along the entire length of the Himalayan chain, with large pockets of severe disease in Burma, Viet Nam, Indonesia, and New Guinea [146, 150].


Demographic significance of child mortality due to hunger


One of the most logical but pernicious misconceptions is that high mortality rates restrain population growth rates in countries with high fertility rates, and that lowering death rates will increase that growth. On the basis of this false premise, some have suggested that resources devoted to improving child survival should be diverted to family planning. There is no doubt that a spectrum of family planning methods should be made available to all populations. Nor is there any question that high birth rates are still a major problem for most developing countries. Bangladesh, Egypt, Kenya, India, Pakistan, and the Philippines are notorious examples. However, it is also a reality that promoting family planning without improving nutrition, health, education, and social equity has consistently failed.

India has spent more per capita on family planning than any other country and achieved essentially nothing. Egypt has had lavish assistance in family planning from the World Bank and US AID with meagre results. By contrast, Chile, Cuba, Korea, Taiwan, Thailand, and other countries that have reduced their infant and preschool mortality have achieved relatively high acceptance rates for family planning. Within India, the state with the lowest infant mortality and highest literacy has the lowest fertility rates, despite the fact that it is also relatively poor among the states in per capita income.

I had an opportunity to compare infant mortality and the acceptance of family planning in the provinces with the highest and lowest infant mortality rates in Indonesia in 1990. In Yogyakarta, where mortality was lowest, most families had adopted an effective method of contraception. Their goal was no more than two or three children, and we rarely saw preschool children who were closely spaced. In the districts with the highest infant mortality, few families had accepted family planning. Many wanted as many children as "God would give them," and crowds of children were in evidence. This could be dismissed as anecdotal were it not repeated in many other developing countries [151-154].

Figure 13 (see FIG. 13. Rate of annual population increase versus infant mortality in 73 countries; data from PC Globe, 1990) shows that, over a broad range of countries, those with high infant mortality also tend to have high rates of population increase, while those with low infant mortality are the ones with low fertility. The UNICEF report The State of the World's Children, 1991 [155] provides other evidence that reducing infant deaths does not necessarily lead to higher birth rates. It describes the relationship between the mortality rates of children 0-5 years old and total fertility rates in 1960, 1980, and 1988 for 18 developing countries. As illustrated in figure 14 (see FIG. 14. Average drop in countries' crude birth rate, 1960-1988, versus infant mortality in 1960 [156]), the lower the infant mortality rate of a country in 1960, the greater the decline in its birth rate between 1960 and 1988 [156]. The data indicate that, although there is a lag between initial steep falls in mortality and a fall in fertility rates, as time passes and mortality rates fall further, fertility rates drop sharply. The message is that reducing the malnutrition that is the major factor in high infant mortality rates is an essential prerequisite to successful family planning.


Overcoming hunger


It is time to turn from the consequences of hunger to the cost of not overcoming it, and to an examination of what will be required to abolish hunger as a public health problem in the world. I will now review what can and must be done to prevent the hidden hungers described, beginning with the easiest, iodine deficiency, followed by avitaminosis A, iron deficiency, and, most difficult of all, deficiencies of energy and protein.

 

Iodine

Iodine deficiency disorders are the easiest of the hidden hungers to prevent. Table 8 shows the prompt drop in endemic goitre following the addition of iodine to salt for human consumption in the state of Caldas, Colombia [157], and on a national scale in Guatemala [158]. Legislation requiring the iodization of all salt for human consumption need not require either a government subsidy or an increase in retail price. At the time it was introduced in Guatemala, the cost of iodization was only about five US cents per hundred pounds. It does, however, require legislation and cooperation on the part of the producers. Where there are several large producers and a few localized groups of small producers, as in the countries of Latin America, each large producer can be required to have separate iodization equipment, and the small producers can bring their salt to a common centre for the addition of potassium iodate.

TABLE 8. Effect of iodized salt on the prevalence of endemic goitre in two Latin American countries

                      Colombia(a)              Guatemala
                      Year     Goitre (%)      Year     Goitre (%)
Before iodization     1945     82              1952     39
After iodization      1952     37              1962     15
                      1965     3               1965     5

a. State of Caldas.

It is not always this easy. The rock salt used in some parts of India cannot be readily iodized in this form. In some countries the organization of many small producers into marketing cooperatives able to handle iodization has thus far been beyond the capacity of local governments. International or bilateral assistance can help a country to overcome this obstacle.

In some remote mountainous areas, iodine has been administered intramuscularly in iodinated oil [148]. A single large dose is protective for up to four years against the risk of damage to the foetus during pregnancy [159]. Recently it has been shown that similar oral doses are nearly as effective. Under some circumstances iodization of water supplies may be feasible. Potassium iodate can also be substituted for potassium bromate as a bread improver [160].

Since the main cause of endemic goitre is the lack of iodine in local soil and water because it has been removed by heavy rainfall or glaciation, this is the one deficiency that is not amenable to changes in the use of local food supplies. The problem decreases, even in once-goitrous areas, when their populations begin to consume food produced in non-goitrous regions or from the ocean.

 

Vitamin A

Vitamin A deficiency should be equally easy to prevent, since green leaves and other plants that are rich sources of vitamin A activity are available to most populations. For rural populations they can be grown readily in gardens or gathered from the wild. Yet some of the worst concentrations of xerophthalmia and blindness due to vitamin A deficiency occur in populations surrounded by abundant sources of the vitamin in local vegetables and fruits, and no country has yet mounted a successful campaign to solve the vitamin A problem in this way.

In the rubber-plantation study in Indonesia referred to earlier [100], the experimental group received a daily tablet containing iron, and the control group was to have received an identical-appearing tablet without iron. When the results were tabulated, both groups showed a return of haemoglobin status to normal and an improvement in work capacity and productivity. The explanation was that the local representative of the World Bank, which was financing the study, felt sorry for the placebo group and persuaded the novice investigator to give them a small "incentive" of 15 rupiahs, then three US cents, per day. It could not be spent on more rice because of rationing, so it was spent largely on green leaves, mainly cassava leaves and spinach. These provided 3-5 mg of available iron, enough to produce a response when iron mobilization was also improved by the vitamin C, vitamin A, and protein that the leaves also provided. If the margin between sufficiency and deficiency is so narrow, and so readily responsive to local foods, why is it not the obvious solution to the problem everywhere?

I believe improved dietary habits to be the only acceptable long-term solution. However, adult food habits have been hard to change, and children are often resistant to green vegetables. Whether they are reflecting the attitude of the adults caring for them or rebelling at the consistency remains to be determined. There are many societies whose consumption of local fruits and vegetables at all ages frees them of this problem. Well-devised and well-implemented nutrition and health education programmes should be able to achieve this in other populations as well. At the Micronutrient Conference in Montreal [67], a successful campaign in Thailand was described that popularized the "ivy gourd," a plant whose green leaves are rich in vitamin A activity. Moreover, with slightly greater social equity permitting a more varied diet, as in Costa Rica or China, vitamin A deficiency ceased to be a public health problem.

What is to be done in the meantime? Vitamin A is stored in the liver, and, when reserves are built up, they are sufficient for many months. It has now been demonstrated in many countries that the administration of an oral dose of 200,000 units of vitamin A palmitate will protect a young child from eye lesions due to avitaminosis A for four to six months. Furthermore, as mentioned above, this may also result in a dramatic decrease in mortality from infectious disease. The principal thrust of the current WHO/UNICEF and US AID effort is to supply countries with high-dose capsules and to assist with the logistics of delivering them. The problem is that very often, as in India, the programme is erratic and the coverage poor [161].

An alternative strategy is the fortification of appropriate foods with vitamin A. Fortification of commonly consumed foods such as margarine and milk is routine in industrialized countries, but these foods are usually not consumed with sufficient regularity to be useful for this purpose in developing ones. Arroyave et al. [158] demonstrated on a national scale in Costa Rica and Guatemala twenty years ago that it is feasible in Central America to fortify sugar with vitamin A, and this has been successfully maintained in Guatemala. It is now planned for the other Central American countries except Costa Rica, where it is no longer needed because of improved nutrition and health conditions. Monosodium glutamate (MSG) has been used as a vehicle for vitamin A in Indonesia [131].

The successful implementation of a dietary approach will require the use of foods rich in vitamin A activity and selective agricultural breeding and extension programmes to improve the variety and vitamin A activity of local vegetables and fruits. Older methods of agriculture and home economics extension, including assistance with school and home gardens and revision of school curricula, can now be supplemented by modern methods of communication, including radio, television, and interactive videos.

 

Iron deficiency

The main reason that iron deficiency is not a problem among most middle- and upper-income populations is that their diet contains red meat as a source of haem iron, which is much better absorbed than the iron from vegetable diets. Moreover, these groups are less likely to have diseases such as hookworm, malaria, or schistosomiasis that exacerbate iron deficiency. For less-privileged populations the elimination of these diseases is a contribution to the campaign against iron deficiency. For developing-country populations that are predominantly vegetarian for economic, and sometimes religious, reasons, exhortation to eat more red meat is not a feasible or desirable approach.

Yet the absorption of iron from vegetable diets is so low that it is almost impossible for pregnant women to get sufficient iron from them [162]. A number of reasonably well utilized iron compounds are available for addition to staple foods, and this is an attractive approach. However, there is no single vehicle applicable in all countries. In industrialized countries infant formulas and most cereals are fortified with iron. In Egypt, high bread consumption, the limited number of flour mills, and their location in each province make fortification of wheat flour feasible. Fernando Viteri demonstrated in Nicaragua the practicality of fortifying sugar with iron chelated with EDTA (unpublished INCAP data).

An approach that would link the elimination of iodine-deficiency disorders to the prevention of iron deficiency would be the double fortification of salt with both iron and iodine. Such double fortification has been reported by the National Institute of Nutrition [163, 164] to have been successfully field-tested in India and to be ready for wider application. If this apparent success can be replicated in other countries, it could greatly facilitate the effort to eliminate both iron and iodine deficiencies as public health problems worldwide.

The phytates and fibre in cereals and some vegetables reduce intestinal absorption of dietary iron. Nevertheless, an increase in the consumption of fruits and green and yellow vegetables would not only help to eliminate the problem of vitamin A deficiency but also provide additional vitamin C, which would increase the absorption of iron from vegetable diets. In some cultures, drinking tea with meals significantly reduces the absorption of iron, and this should be avoided where iron status is deficient or borderline. As economic status improves, iron deficiency generally decreases, because the diet is likely to have more vegetable sources of iron and red meat as a source of readily available haem iron.


Political mobilization for the conquest of hunger


I must now discuss how to overcome the most difficult kind of hidden hunger, that due to chronic energy and protein deficiencies. Improving food production and post-harvest food conservation, food processing, and the physical distribution of food are each important, but they are not sufficient by themselves to solve the problem of hunger. Chronic energy deficiency is largely the result of the inability of poor families to acquire enough food. The reasons for this are too complex and controversial to attempt to analyse in this paper, but they need to be addressed if this kind of hunger is to be overcome.

A number of recent declarations have set forth such admirable goals as the elimination of starvation and death caused by famine, the elimination of the major micronutrient deficiencies, and the prevention of damage from chronic undernutrition. However, these goals cannot be attained unless governments are willing to take the measures required to achieve them. The 1974 World Food Conference in Rome proved to be a rhetorical exercise that did not result in the actions by either developing or industrialized governments that would be required to achieve the proclaimed goals. Almost the only tangible outcomes were the establishment of the World Food Council, with a tiny budget and the very limited mission of sensitizing governments to food problems and promoting policies to overcome them, and of the International Fund for Agricultural Development (IFAD), which has been a valuable organization but one limited in purpose.

The situation is at last changing, thanks both to leadership from the specialized agencies of the United Nations and to private efforts. In 1978, representatives of 134 countries convened in Alma Ata and endorsed the goal of "Health for all by the year 2000." By this was meant removing the obstacles to health [165]. They went further, however, and proposed a comprehensive strategy for achieving this goal that focused on prevention. Of nine essential elements, the first seven are all nutrition-related. These include health education, proper food supply and nutrition, safe water and basic sanitation, maternal and child health care, immunization, and the prevention and control of endemic disease.

In the 1980s, WHO increased its promotion of primary health care and introduced an expanded programme of immunization in cooperation with UNICEF. It succeeded in increasing the efforts of many developing countries to implement the goals of Alma Ata, but the response was variable and somewhat disappointing. In order to accelerate and focus primary health care resources for mothers and children, UNICEF launched a child-survival initiative in 1983, initially under the acronym "GOBI." This strategy emphasized growth monitoring to determine the adequacy of child feeding, the promotion of rehydration for severe diarrhoea cases, improved personal hygiene to prevent diarrhoea, and expanded immunization to prevent the common communicable diseases of children [166]. Progress since has been impressive and justifies calling it the beginning of a "child survival and development revolution."

For example, the WHO expanded programme of immunization, with strong support from UNICEF and bilateral agencies, is achieving coverage rates of 80%-90% in country after country, compared with previous coverages of less than 20% [155]. Oral rehydration therapy to treat diarrhoea has been introduced with comparable success in many countries. Clean water has been made available to an additional 900 million persons. Four-fifths of the world's population lives in 17 countries in which more than 50% of couples are now using contraception. However, the direct attack on hidden hunger has proved more difficult.

These results have been accelerated by a remarkably successful UNICEF initiative in 1990 that brought together 71 presidents and prime ministers for the first "World Summit for Children." This largest gathering of heads of state and government in history made a commitment to try to end child deaths and child malnutrition on today's scale by the year 2000 [156].

This overall goal was broken down into more than 20 specific targets in a plan of action agreed upon by the 159 nations represented at the summit. The final declaration stated: "We are prepared to make available the resources to meet these commitments." All national and international organizations were asked to participate. In particular, the worlds of religion, education, the communications media, business, and the non-governmental organizations in every country were challenged to join in this decade-long effort.

An immediate result has been an increase in the number of governments actively adopting child-survival strategies and an acceleration in the coverage achieved by ongoing programmes. In October 1991 in Montreal a Policy Conference on Micronutrient Malnutrition [67] was convened by WHO and UNICEF with the participation of other agencies in the UN system and bilateral agencies concerned with nutrition. Delegations from more than 60 developing countries reported on the success of their programmes, and experts reviewed the current global status of deficiencies of vitamin A, iodine, and iron and the availability of affordable and effective actions to eliminate them. The consensus of the more than 300 participants was that this goal can be achieved in the current decade if current resource flows and leadership are sustained and countries keep the commitment their leaders made at the world summit.

The World Food Programme (WFP) is recognizing that its food aid must be integrated into national development programmes and treated as a development resource by governments and that it must be backed by adequate technical and financial support. Used in this way, food aid makes both an immediate and a long-term contribution to preventing hunger. Food aid also continues to make an important contribution to alleviating famine and refugee hunger, although the WFP's capacity to do so is limited more by the level of cooperation of the responsible governments than by the resources obtainable for the purpose.

An International Conference on Nutrition was held in Rome in December 1992. While the conference was organized by FAO and WHO, all other international, bilateral, and voluntary agencies were invited to participate. Almost all developing countries were represented at the conference at the highest political and technical levels. Building on the Summit for Children and the Conference on Micronutrients, this meeting was another landmark in the conquest of hunger.

