Barbara A. Underwood and Abraham Stekel
Prolonged dietary inadequacy alters the biochemical milieu of the body, and consequently enzymatic activities, in advance of the appearance of clinical symptoms and signs. Laboratory measurements of the nutrient adequacy of body fluids or tissues can therefore provide objective, specific, and sensitive indicators of nutriture.
These measurements, judiciously selected for some nutrients, can provide subclinical information useful in evaluating the nutritional impact of nutrition interventions.
Depending on the specific nutrient in question, adequacy may be measured biochemically by the size of reserve stores, by the concentration of the nutrient in blood or urine, or by the activity of nutrient-dependent enzymes.
Which laboratory approach is appropriate for measuring adequacy for a particular nutrient will depend on an understanding of its basic biochemical role, its distribution among body compartments during periods of dietary lack and sufficiency, and how this distribution is influenced by short- and long-term changes in the physiological environment, e.g., acute and chronic infections, and hormonal imbalances and variations.
Figure 4.1 (Generalized Scheme of the Development of a Primary Nutritional Deficiency) depicts the sequence that occurs from the time diets become inadequate to when clinical signs and symptoms become evident. This sequence will vary for different nutrients, as will the sequence of reversal in response to re-supply of the nutrient. An understanding of these time sequences in the depletion and repletion of specific nutrients of concern is therefore crucial in selecting a methodology that will provide the information required to evaluate the nutritional impact of various types of programmes. Laboratory methodology, judiciously selected for a particular nutrient, can be useful in detecting changes in subclinical levels of nutriture; poorly selected, it can be an expensive, non-informative exercise.
It is important to keep in mind that in practice, the relative availability of some nutrients to support biochemical functions varies from day to day with fluctuations in intake. Hence, flow in figure 4.1 may be upward or downward, and a single laboratory measurement or clinical observation will not reveal the direction of events. Furthermore, the rapidity with which the direction of flow responds to alterations in food or nutrient supply will vary. For example, anatomical signs, though the last to appear, may take substantially longer to disappear than the activity of nutrient-dependent enzymes or the concentration of the nutrient in blood takes to recover. Thus the biochemical assessment of nutritional status for a specific nutrient does not always correlate directly with the findings from dietary or clinical assessment, particularly when applied on an individual basis or among populations of small sample size. Generally, when applied to populations for assessment of nutritional status, the trend will be in the same direction for dietary, biochemical, and clinical findings.
To use cross-sectionally obtained measurements for purposes of evaluating the nutritional impact of nutrition interventions, a laboratory measurement should be chosen that represents cumulative effects on nutriture, i.e., nutritional status, rather than immediate responses to dietary intake. On the other hand, as noted earlier, acute changes in the physiological environment (acute infection, hormonal imbalances, etc.), as well as diet, can shift the distribution of nutrients among compartments, thus affecting biochemical events. Therefore it is often also useful to have an indicator of the immediate situation, particularly when dealing with individuals or populations of small size. In all cases, an appropriate comparison group is necessary to evaluate associations between laboratory measurements and the nutritional intervention.
Biochemical measurements with nutritional implications can be made quite non-invasively on tissues such as hair and nails, or at the extreme of invasiveness on liver and muscle. But in practice, blood and its cellular components, and urine, are the most readily available tissues and can be obtained with moderate invasiveness for estimating status in surveys. For most nutrients, urine is unsuitable for nutritional status assessment unless a timed or 24-hour sample can be obtained to compensate for diurnal variations in nutrient excretion rates and in volume. Relating values to creatinine concentrations reduces but does not eliminate this problem, as sketched below. This criticism is most applicable when applied to individuals and to populations of small size. On the other hand, casual urine specimens can be useful if the purpose is to evaluate compliance with a food or nutrient supplementation programme rather than nutritional impact. For example, by including a nutrient marker in the supplement that normally is excreted in the urine, such as riboflavin, increased levels can be detected qualitatively when related to non-participant or non-complying comparison groups.
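To make the creatinine adjustment concrete, the following minimal sketch (in Python, with hypothetical values; the function name and all numbers are ours, not from the text) shows how expressing a casual urine value per gram of creatinine dampens, though it does not eliminate, variation due to urine concentration and volume:

```python
# Illustrative only: expressing a urinary nutrient value per gram of
# creatinine to dampen (not eliminate) diurnal variation in urine
# concentration and volume. All values and names are hypothetical.

def per_gram_creatinine(nutrient_ug_per_dl: float,
                        creatinine_mg_per_dl: float) -> float:
    """Nutrient excretion expressed as ug per g of creatinine."""
    return nutrient_ug_per_dl / (creatinine_mg_per_dl / 1000.0)

# A dilute and a concentrated casual specimen from the same subject give
# similar creatinine-adjusted values despite very different raw readings.
print(per_gram_creatinine(40.0, 60.0))   # -> ~666.7 ug/g creatinine
print(per_gram_creatinine(80.0, 120.0))  # -> ~666.7 ug/g creatinine
```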
As already noted, the choice of fluid or cellular component in blood that best reflects status rather than immediate dietary intake will vary in relation to how specific nutrients are distributed between extra- and intracellular compartments, how responsive this distribution is to dietary change, and how it is influenced by altered physiological conditions, such as acute or chronic infections, drugs, and stressful circumstances, which are unrelated to diet. For many nutrients, compensatory biochemical mechanisms exist to adjust for short-term fluctuations in dietary intake and to delay the onset of clinical signs of inadequacy. Body reserves of varying size and half-life exist for this purpose. Ideally, assessment of change in the magnitude of the reserve supply of a nutrient (step 2, fig. 4.1) would be most useful for nutritional surveillance purposes and for measuring the nutritional impact of certain nutrient-specific interventions where homeostatic mechanisms maintain body fluid levels until reserves are depleted, such as for iron and vitamin A. This would represent the earliest stage of dietary inadequacy and indicate when preventive measures should be instituted. However, biochemical indicators of tissue reserves that are present in blood and accessible to laboratory evaluation in surveys are available for only some nutrients, e.g., serum ferritin as a reflector of tissue iron stores. There is no "true" reserve store of protein, only variation in active protein mass that is not easily measured in surveys by laboratory methods.
Blood samples obtained from fasting subjects are preferred to avoid fluctuations in some nutrients that reflect immediate dietary intake. However, under field conditions, especially among children, non-fasting specimens are often all that can be obtained practically. By selection of the appropriate parameter, non-fasting specimens can be used without prejudice in interpretation for estimation of protein and iron status, and for vitamin A status except following a meal that contains a concentrated source of the vitamin (e.g., animal liver). Since blood samples are usually obtained in the morning hours and rich sources of preformed vitamin A are unusual breakfast items in developing countries, this is unlikely to be a significant confounding variable (1). Fasting specimens may be more critical in the laboratory assessment of certain other nutrients, particularly water-soluble vitamins (2).
Caution must be used in interpreting blood data obtained from subjects with acute infections. These may cause a transient lowering of blood levels of some nutrient-specific transport proteins, such as retinol-binding protein and transferrin. Chronic infections, too, can cause a lowering of circulating levels of some nutrients. These changes in blood concentration may not reflect a depletion of the total available body pool, but rather a temporary redistribution of the nutrient that is without physiological significance.
The appropriateness of using laboratory measurement for nutritional impact evaluation will depend on the nature of the intervention programme and the kind, severity, and prevalence of nutritional problems in the recipient population (see TABLE 4.1. Summary of Laboratory Methodology (Nutritional Biochemistry and Hematology)). Laboratory measurements are most appropriately applied in tandem with the introduction of specific, population-based nutrient interventions, such as iron, iodine, or vitamin A food fortification programmes, or in interventions targeted to individuals who are given specific supplements for which specific before- and after-treatment effects can be measured.
TABLE 4.1. Summary of Laboratory Methodology (Nutritional Biochemistry and Hematology)
| Kind of Information | Kind of Intervention | Unit of Observation | Personnel Training | Resources Required | Time | Survey Level |
|---|---|---|---|---|---|---|
| Albumin/prealbumin | Protein-calorie supplement to malnourished vulnerable groups | Individual | Lab. tech. (2 weeks) | Radial immunodiffusion (RID) kit, refrigeration | 10-30 samples/day | Minimal |
| Prevalence of low serum retinol levels | Vitamin A | Individual | Lab. tech. (1 month) | HPLC; spectrophotometric; colorimetric | 20 specs./day | Sophisticated (HPLC) to appropriate for LDCs |
| Hemoglobin concentration or hematocrit | Iron | Individual | Lab. tech. (2 weeks) | Colorimetric | 3 minutes | Minimum for difficult field conditions |
| Transferrin saturation | Iron | Individual | Lab. tech. (1 month) | Spectrophotometer | 20 minutes | Sophisticated |
| Free erythrocyte protoporphyrin | Iron | Individual | Lab. tech. (1 week) | Hematofluorometer | 3 minutes | Simple |
| Free erythrocyte protoporphyrin | Iron | Individual | Lab. tech. (1 month) | Fluorometer or spectrophotometer | 20 minutes | |
| Serum ferritin | Iron | Individual | Lab. tech. (1 month) | Refrigerator, centrifuge, gamma counter or spectrophotometer, serum ferritin kit | 20 minutes | Sophisticated |
Biochemical measurements are also appropriate where intervention programmes, even non-nutrient-specific ones, are clearly targeted to vulnerable population groups with known significant dietary inadequacies. One example would be preschool children from poor environments in which PEM is prevalent, together with their pregnant and lactating mothers.
Biochemical methods may not be useful in evaluating general food aid programmes for adult workers with only marginally adequate diets. Under these circumstances, the magnitude of the biochemical response to moderate dietary change may be too small to be reliably detected by laboratory measurements. For example, it would clearly be inappropriate to apply laboratory assessment of iron and vitamin A status to evaluate a food aid programme, such as grain distribution in a food-for-work programme, that does not include significant amounts of these nutrients. Furthermore, it is unlikely that significant alterations in protein status would be detected by laboratory methodology in this type of adult recipient population.
In contrast, a feeding programme that includes a protein ration for poor preschool children is likely to shift the distribution of albumin levels upward from the lower range of the distribution curve. Because laboratory measurements are costly, both in the employment of professional personnel and in the continued cooperation they demand of programme recipients, the decision of evaluators to include laboratory assessment should be carefully matched to the specific programme being evaluated, to maximize the potential for obtaining interpretable data on nutritional impact.
The major nutrient deficiencies of public health significance that food or nutrient-specific distribution programmes have most often been designed to alleviate include inadequate food energy intake, protein-energy malnutrition, iron deficiency anaemia, vitamin A deficiency, and iodine deficiency. Evaluation by laboratory measurement of the nutritional impact of food programmes to combat these five nutrient-specific problems is possible using several methods that vary in degree of sophistication, reliability, and accuracy. Individual laboratories should evaluate their own resources and select the method best suited to their situation. We have chosen to include in this chapter those methods that are most practical and least costly and that require a moderate level of training of personnel, yet provide the degree of precision and accuracy required for programme evaluation. Other methods might be more appropriate for research purposes, or where laboratories are well equipped and have a highly trained staff.
Protein-Energy Malnutrition (PEM)
Energy balance is best assessed by non-laboratory measurements (see chapter 3), while protein status can be reflected in biochemical measurements. The speed at which changes in protein status occur will vary according to the turnover rate of the protein in question. In contrast to iron and vitamin A, for which there are tissues that accumulate reserve stores (step 2, fig. 4.1), there is no true storage tissue for protein, only variations in the amount of total active protein mass. Knowledge of the turnover rates of the various protein species found in blood is needed, therefore, to select the parameter appropriate for use in evaluating the nutritional impact of nutritional interventions, i.e., long-term effects rather than short-term responses to the relative availability of a balanced amino acid supply for protein synthesis. In this respect, blood levels of rapidly turning over transport proteins (retinol-binding protein, or RBP; transferrin; and prealbumin) are known to be sensitive to short-term changes in the available protein and energy supply (step 3, fig. 4.1), but do not necessarily reflect depletion in protein mass (step 2, fig. 4.1) or decreased functional level (step 4, fig. 4.1).
The more slowly turning over albumin in blood best reflects longer-term protein status (3). The transport proteins, for example, are most useful for evaluating the immediate response to an intervention that provides an energy or protein supplement, or that decreases the burden of infections that stress protein-energy requirements; albumin levels, on the other hand, better reflect true nutritional impact on protein status. Because some transport proteins, such as transferrin and retinol-binding protein (RBP), are at least partially dependent upon the availability of the nutrient they carry, their use for assessing protein nutriture may be confounded by concurrent deficits of the dependent nutrient. Prealbumin, though a carrier of the RBP-retinol complex as well as the iodine-containing thyroxine, is not dependent on either of these nutrients for its hepatic synthesis or secretion, yet still has a short half-life (2 days) that makes it sensitive to the immediate availability of a balanced amino acid supply (3). Blood levels of prealbumin, like those of albumin and other transport proteins, are dependent upon adequate liver function, and therefore are depressed by liver disease independent of dietary adequacy.
To evaluate programme impact on protein status, a combined assessment of prealbumin and albumin provides information on short- and long-term dietary effects, respectively (4). This methodology is most appropriately applied in the evaluation of intervention programmes targeted to vulnerable groups such as preschool children and pregnant and lactating mothers. Unless evidence exists for substantially inadequate protein or energy intake among other recipient populations, such as school children and adult male workers, prior to the intervention, laboratory assessment of protein nutriture is unlikely to reflect responsiveness to a dietary change achieved through food aid programmes.
Suggested Methods
Both prealbumin and albumin can be determined by relatively inexpensive, reliable laboratory techniques requiring a routinely trained laboratory technician (see appendix). Radial immuno-diffusion (RID) is used to determine blood levels of prealbumin, and kits are available commercially for this purpose (5). The kits contain complete instructions and all the materials necessary for the assay, including the protein standards, with the exception of a microliter syringe and a measuring ruler or caliper. The plates should be stored before use at refrigerator temperatures. The assay can be completed in as little as 18 hours.
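As an illustration of how such plates are commonly read, the sketch below applies the endpoint (Mancini) convention, in which the square of the precipitin-ring diameter varies approximately linearly with antigen concentration. The standard concentrations and ring diameters are hypothetical, not taken from any particular kit:

```python
# A minimal sketch of reading an RID plate: a line fitted to the kit
# standards (ring diameter squared vs. concentration) converts measured
# ring diameters to prealbumin concentrations. Values are hypothetical.
# Requires Python 3.10+ for statistics.linear_regression.

from statistics import linear_regression

standards_mg_dl = [5.0, 10.0, 20.0, 40.0]  # kit standards (hypothetical)
ring_diam_mm = [3.2, 4.1, 5.5, 7.6]        # diameters measured with a caliper

slope, intercept = linear_regression([d * d for d in ring_diam_mm],
                                     standards_mg_dl)

def prealbumin_mg_dl(diameter_mm: float) -> float:
    """Convert a measured ring diameter to concentration via the standard curve."""
    return slope * diameter_mm ** 2 + intercept

print(round(prealbumin_mg_dl(5.0), 1))  # unknown sample -> ~16.0 mg/dl
```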
There are several methods for the determination of serum albumin, including standard electrophoresis, dye-binding, and salt fractionation. Specificity is highest for electrophoresis, followed by dye-binding and salt fractionation, and the relative cost of the analysis and the sophistication of the required equipment follow the same order. For evaluation purposes, the specificity and precision of the dye-binding procedure are adequate.
Vitamin A Nutriture
Interpretation of the biochemical measurement of vitamin A in blood, short of deficient (< 10 µg/dl) or excess (> 70-80 µg/dl) levels, is confounded by homeostatic controls, partially independent of diet, that modulate the release of liver reserve supplies (1). Hence, blood levels (step 3, fig. 4.1) do not necessarily reflect the level in the liver reserve (step 2, fig. 4.1) and, therefore, the relative level of vitamin A nutriture. Blood values that lie between about 15 and 30 µg/dl, particularly in young children and in some individuals, may reflect physiological conditions unrelated to vitamin A reserve stores. Under such circumstances, improved dietary intake of vitamin A through an intervention will not necessarily change the level in the blood. On the other hand, if the blood level lies in this difficult-to-interpret range (15-30 µg/dl) because of chronically inadequate intake and low liver reserves (step 2, fig. 4.1), blood levels will increase in response to an increased intake of vitamin A. Still, it cannot be assumed that adequate reserve stores have been established simply because circulating levels have increased, since blood levels must exceed a threshold before stores are replenished.
Therefore, to evaluate the impact of a food programme that seeks to improve vitamin A nutriture, it is important to look at changes in the lower end of the population distribution curve of blood levels rather than at means or absolute values (6). When the lower end of the distribution curve shifts to the right following an intervention programme, this can be interpreted as programme impact even though there may be no significant change in mean or median values (7). Figures 4.2 (Effect of Sugar Fortification with Vitamin A on the Distribution of Serum Retinol Levels of Rural Preschool Children; A: 1975, 1976; B: 1975, 1977) and 4.3 (Pre-supplementation and Post-supplementation Distribution of Serum Retinol Concentration (A) and Total Vitamin A Concentration (B) in a Group of Normal Elderly Men) illustrate this point for, respectively, children in a national programme of vitamin A fortification of sugar and a shorter-duration vitamin A supplementation programme for elderly men. To assess programme impact on vitamin A status biochemically, changes in the distribution of blood levels under 30 µg/dl in the latter case, and especially under 15 µg/dl in the former, were found to be most useful.
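A sketch of this tail-oriented comparison, using the 15 and 30 µg/dl cutoffs discussed above, follows; the survey values are hypothetical and serve only to show how the lower tail can shrink while the overall shape changes little:

```python
# Illustrative sketch: judging programme impact by the shift in the lower
# tail of the serum retinol distribution rather than by mean values.
# Cutoffs of 15 and 30 ug/dl follow the text; survey values are invented.

def tail_fractions(retinol_ug_dl: list[float]) -> tuple[float, float]:
    """Fraction of subjects below 15 and below 30 ug/dl."""
    n = len(retinol_ug_dl)
    below15 = sum(v < 15.0 for v in retinol_ug_dl) / n
    below30 = sum(v < 30.0 for v in retinol_ug_dl) / n
    return below15, below30

before = [9, 12, 14, 18, 22, 25, 28, 33, 40, 45]
after = [14, 17, 19, 22, 26, 29, 31, 35, 41, 46]

print(tail_fractions(before))  # -> (0.3, 0.7)
print(tail_fractions(after))   # -> (0.1, 0.6): the lower tail has shifted right
```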
Retinol-binding protein has been suggested as a vitamin A adequacy indicator that can be simply assayed in the field by RID techniques. However, as noted above, RBP synthesis is influenced by acute protein deficiency (4) and liver function, as well as by the availability of vitamin A from the diet or reserve tissue stores (8). Furthermore, the RID assay determines total RBP, including that not bound to retinol. The unbound or apo-RBP is physiologically unimportant with respect to vitamin A status. It is important to note that total RBP may remain in the low normal range while available vitamin A (holo-RBP) has declined dangerously (1).
Suggested Methods
The analytic method of choice for determination of vitamin A status uses high-pressure liquid chromatography (HPLC), which is fast, determines retinol directly, and minimizes opportunity for oxidative losses, but is expensive.
Several other analytic methods are available, such as spectrophotometry, fluorometry, and colorimetry, and with care they can be used interchangeably, depending upon available laboratory resources. All methods require a carefully trained and standardized technician. These methods are described in detail elsewhere (1). When a spectrophotometer equivalent to a DU is available, the procedure of Bessey and Lowry, based on UV inactivation, is likely to be least expensive and most reliable. In the absence of a DU spectrophotometer, which involves a relatively high initial investment, the colorimetric assay using trifluoracetic acid, trichloracetic acid, or antimony trichloride can be satisfactory, provided a reliable vitamin A standard is available and proper precautions are exercised (1). Fluorometric procedures are more sensitive than the colorimetric ones. However, spuriously high results are common because of fluorescent contaminants that are difficult to avoid under most laboratory conditions in developing countries.
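Whichever colorimetric or spectrophotometric variant is used, the unknown is commonly read against a vitamin A standard carried through the same run. A minimal single-point sketch follows, with hypothetical absorbances, assuming Beer's-law linearity and prior blank correction:

```python
# Hedged sketch of the single-point standard calculation common to
# colorimetric assays: the sample concentration is read against a
# vitamin A standard of known concentration measured in the same run.
# Absorbance values are hypothetical; real protocols include blank
# correction and timing precautions for transient colour reactions.

def vitamin_a_ug_dl(abs_sample: float, abs_standard: float,
                    standard_conc_ug_dl: float) -> float:
    """Single-point estimate assuming Beer's-law linearity."""
    return abs_sample / abs_standard * standard_conc_ug_dl

print(vitamin_a_ug_dl(abs_sample=0.21, abs_standard=0.42,
                      standard_conc_ug_dl=50.0))  # -> 25.0 ug/dl
```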
Serum levels of vitamin A obtained cross-sectionally and displayed in distribution curves can provide information on the percentage of individuals with low levels of vitamin A who may be subclinically malnourished. When evaluated against an appropriately matched comparison group, this information is useful in assessing differences in the magnitude of the "at risk" group among recipients of an intervention programme. The best way to determine the nutritional impact of a programme on vitamin A status is to have before- and after-treatment laboratory measurements. An alternative is to measure the response of a subsample of recipients with plasma values in the lower portion of the distribution to an additional short-term supplement (8).
Laboratory assessments of vitamin A status are appropriate for the evaluation of nutrition intervention programmes in which the daily intake of vitamin A is increased (vitamin A-containing food fortification programmes), in single, massive-dose intervention programmes, and in programmes to correct severe forms of protein-energy malnutrition. Usefulness in the latter type of programme stems from the intimate interrelationship between protein and vitamin A status. Intervention programmes to correct serious protein deficiency should always provide sources of vitamin A concurrently, since there is no practical way of knowing whether serum levels of vitamin A are lowered by depletion of tissue reserves or are secondary to protein deficiency and impaired mobilization. Stimulation of growth by correcting protein deficiency elevates the need for vitamin A, and if the latter is not supplied, irreversible eye damage can be precipitated in a very short time.
Iodine Nutriture
Clinical examination and palpation of the thyroid gland are generally sufficient to evaluate the success of programmes to correct iodine deficiency and control endemic goiter. However, iodine nutriture can be assessed in the laboratory by measuring blood levels of protein-bound iodine (PBI), urinary excretion of iodine, and radioiodine uptake. Since all three of these parameters may be influenced by various physiological states and drugs, interpretation of iodine nutriture by biochemical methods must be done with caution. As with vitamin A, it is necessary to evaluate the iodine impact of food programmes by looking for shifts in the lower range of PBI and urinary excretion values rather than at absolute mean or median values.
Iron Deficiency Anaemia
Nutritional anaemia is one of the most common and significant nutritional problems in the world today. It is likely, therefore, that anaemia will be prevalent in areas where nutrition intervention programmes take place. Iron deficiency, folate deficiency, protein-calorie malnutrition, acute infection, and chronic disease can all contribute to the occurrence of anaemia. Studies in several parts of the world have demonstrated, however, that iron deficiency is, in most situations, the main etiologic factor.
Definition of Anaemia
Anaemia is usually defined using criteria established by population studies. These studies have determined, for individuals of different sex, age, and physiological condition, levels of haemoglobin concentration below which anaemia is likely to be present. It must be borne in mind, however, that haemoglobin concentrations overlap between normal and anaemic individuals. Therefore the use of fixed limits of normality will misdiagnose as anaemic a certain proportion of individuals with adequate haemoglobin concentration and include among the normal group some anaemic subjects (see FIG. 4.4. Fitted Double Population of Haemoglobin Concentrations in the Third Trimester of Pregnancy in Latin American Women. The vertical line depicts the haemoglobin value used to define anaemia according to WHO criteria. The higher curve represents the distribution of the normal population of pregnant women, and the lower curve the anaemic group. The cross-hatched area above 11 g/dl represents subjects found anaemic by distribution analysis but considered normal by WHO criteria. The shaded area below 11 g/dl represents that portion of the population found normal by distribution analysis but considered anaemic by WHO criteria. Redrawn from Cook et al. [9] by Bothwell et al. [10].). A more precise definition of the prevalence of anaemia in a population can be obtained by determining the proportion of individuals who show a significant haemoglobin response to supplementation with hematinics.
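As a simple illustration of the fixed-cutoff approach and its limitation, the sketch below applies the 11 g/dl WHO criterion for pregnancy cited in figure 4.4 to a hypothetical sample; as the text notes, subjects near the cutoff may be misclassified in either direction:

```python
# Illustrative sketch of screening with a fixed haemoglobin cutoff,
# using the 11 g/dl WHO criterion for pregnancy cited in figure 4.4.
# Sample values are hypothetical; a fixed cutoff misclassifies some
# subjects in the zone where normal and anaemic distributions overlap.

WHO_CUTOFF_PREGNANCY_G_DL = 11.0

def anaemia_prevalence(hb_g_dl: list[float],
                       cutoff: float = WHO_CUTOFF_PREGNANCY_G_DL) -> float:
    """Proportion of the sample falling below the cutoff."""
    return sum(hb < cutoff for hb in hb_g_dl) / len(hb_g_dl)

# Hypothetical third-trimester sample
sample = [9.8, 10.4, 10.9, 11.2, 11.8, 12.3, 12.9, 13.4]
print(anaemia_prevalence(sample))  # -> 0.375
```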
Stages in the Development of Iron Deficiency
A measurable decrease in haemoglobin concentration is a late effect of iron deficiency. The iron-replete individual not only has sufficient iron to synthesize haemoglobin and other essential iron-containing compounds, but also has some iron reserves. The amount of storage iron in normal adult males has been estimated at 500 to 1,000 mg. Stores are lower in women of reproductive age and in infancy and childhood.
The first consequence of a negative iron balance is a decrease in the amount of storage iron, a condition known as iron depletion. Once stores are depleted there may not be a sufficient supply of iron for erythropoiesis, and haemoglobin synthesis is impaired. This state is known as iron-deficient erythropoiesis. After some time, this is reflected in a decrease in haemoglobin concentration and iron deficiency anaemia.
Laboratory Tests
The most useful tests for the evaluation of iron nutritional status are the plasma ferritin, the per cent saturation of transferrin, the concentration of free erythrocyte protoporphyrin, and the haemoglobin concentration. There is a general correlation between the stages of iron deficiency and the changes in these laboratory tests.
Valuable information on the iron status of individuals can be obtained by the measurement of plasma or serum ferritin. It has been shown that the concentration of this compound (step 3, fig. 4.1) reflects the amount of storage iron (step 2, fig. 4.1), and that 1 µg/l of serum ferritin is roughly equivalent to 10 mg of iron stores. With progressive depletion of iron stores there is a parallel drop in serum ferritin, with serum values below 12 µg/l representing absence of storage iron.
After iron stores are depleted there is a drop in the amount of iron being transported in the plasma (plasma iron). The amount of the transport protein transferrin (measured as total iron-binding capacity) concomitantly increases, so that the per cent saturation of transferrin with iron falls from values above 30 per cent to less than 15 per cent. At the same time, because the insufficient iron supply prevents some of the protoporphyrin synthesized by erythrocyte precursors in the bone marrow from being incorporated into heme, the amount of free erythrocyte protoporphyrin in red cells rises from normal values of about 30 µg/dl to above 100 µg/dl. Finally, there is a measurable drop in haemoglobin concentration.
The relationship of these measurements to iron stores, and the values found in the different stages of iron deficiency, are depicted in figure 4.5 (Measurements of Iron Status in Relationship to Body Iron Stores (mg); negative iron stores indicate the amount of iron that must be replaced in circulating red cells before iron reserves can re-accumulate; data from Cook and Finch [11]). Extremely useful information can be obtained by measuring all four parameters. The most important, however, are probably haemoglobin and plasma ferritin, which permit evaluation of both extremes of the spectrum of iron status.
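As a rough summary of the staging described above, the sketch below combines the four measurements into a simple decision rule. The thresholds (ferritin below 12 µg/l, transferrin saturation below 15 per cent, FEP above 100 µg/dl, and the 1 µg/l to 10 mg stores conversion) come from the text; the decision order is our simplification, not a clinical algorithm, and the haemoglobin cutoff must be chosen for the subject's age, sex, and physiological state:

```python
# Simplified staging sketch using thresholds given in the text:
# ferritin < 12 ug/l (stores absent), transferrin saturation < 15 per
# cent, FEP > 100 ug/dl. The decision order is our own simplification.

def iron_stage(ferritin_ug_l: float, tsat_pct: float,
               fep_ug_dl: float, hb_g_dl: float, hb_cutoff: float) -> str:
    if hb_g_dl < hb_cutoff and tsat_pct < 15.0:
        return "iron deficiency anaemia"
    if ferritin_ug_l < 12.0 and (tsat_pct < 15.0 or fep_ug_dl > 100.0):
        return "iron-deficient erythropoiesis"
    if ferritin_ug_l < 12.0:
        return "iron depletion"
    # 1 ug/l serum ferritin is roughly equivalent to 10 mg of stores
    return "iron replete (stores ~%.0f mg)" % (ferritin_ug_l * 10.0)

# Stores depleted and erythropoiesis impaired, but haemoglobin not yet low:
print(iron_stage(ferritin_ug_l=8.0, tsat_pct=12.0, fep_ug_dl=130.0,
                 hb_g_dl=12.5, hb_cutoff=11.0))
# -> iron-deficient erythropoiesis
```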
Monitoring Results of Intervention Programmes
As already mentioned, the two most useful laboratory tests for measuring the effect of interventions on nutritional anaemias are the haemoglobin concentration and the serum or plasma ferritin. In situations where there is a high prevalence of anaemia and supplementation strategies are used, effects will be most readily measured by changes in haemoglobin. With food fortification, on the other hand, especially in populations where there is little anaemia, one can expect relatively modest increases in the amount of daily absorbed iron, and results may be better monitored by measuring changes in iron stores as reflected in serum ferritin.
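A sketch of such monitoring follows, comparing hypothetical pre- and post-fortification ferritin values by median, by the implied median stores (using the 10 mg per µg/l conversion from the text), and by the fraction with absent stores (below 12 µg/l):

```python
# Illustrative sketch: monitoring a fortification programme through the
# shift in serum ferritin rather than haemoglobin. Values are
# hypothetical; the 10 mg-of-stores-per-ug/l conversion follows the text.

from statistics import median

ferritin_before = [6, 9, 11, 14, 18, 22, 30, 38]
ferritin_after = [10, 13, 16, 19, 24, 27, 34, 44]

for label, values in (("before", ferritin_before), ("after", ferritin_after)):
    depleted = sum(v < 12 for v in values) / len(values)
    print(label,
          "median %g ug/l" % median(values),
          "~%d mg median stores" % (median(values) * 10),
          "fraction with absent stores %.2f" % depleted)
# before: median 16 ug/l, ~160 mg, 0.38 with absent stores
# after:  median 21.5 ug/l, ~215 mg, 0.12 with absent stores
```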
The effects of intervention programmes on iron deficiency anaemia can be better evaluated in vulnerable groups such as infants, preschool children, and pregnant women. Ideally, studies should be conducted in representative samples of the target populations and should include appropriate control groups.