Objective. Iron deficiency anemia is known to impair cognitive and psychomotor development. The zinc protoporphyrin/heme (ZPP/H) ratio is a simple, accurate, and sensitive laboratory screening test that detects early iron depletion before the onset of anemia. The objective of this work was to evaluate this test in a primary pediatric practice setting.
Methods. The iron status of a cohort of 361 children was screened during routine examinations at a community pediatric practice. Whole blood hemoglobin concentration, hematocrit ratio, serum transferrin saturation, ferritin concentration, and the ZPP/H ratio were measured. The ZPP/H ratio then was evaluated as a single indicator of iron status by comparing it with other tests for detecting the onset of iron deficiency and for monitoring recovery after iron supplementation.
Results. Significant age- and sex-related differences in the ZPP/H ratio were found. In this cohort, serum ferritin concentration and the ZPP/H ratio independently identified the same fraction of iron-deficient patients (3%–4%), and both tests were more specific than either hemoglobin or hematocrit. Concordance of three iron status parameters lowered the predicted prevalence of iron deficiency to ≤1%. Children <3 years of age and adolescent girls had significantly higher ZPP/H results.
Conclusion. The prevalence of iron deficiency anemia in the typical healthy American pediatric population is low, but iron deficiency without anemia remains relatively common at some stages of development. An increase in the ZPP/H ratio is demonstrated to be a sensitive, specific, and cost-effective test for identifying preanemic iron deficiency in a community pediatric practice.

Key words: anemia, nutrition, development.
- ZPP = zinc protoporphyrin
- ZPP/H = zinc protoporphyrin/heme ratio
- TIBC = total iron-binding capacity
- FEP = free erythrocyte protoporphyrin
- EP = erythrocyte protoporphyrin
Although iron deficiency is decreasing in industrialized societies, it is still the most common childhood nutritional deficiency.1 In the United States, the prevalence of childhood iron deficiency with or without anemia has been estimated in different reports over the last 20 years to range from 3% to 44%,2,3 and deficiency is relatively independent of socioeconomic bounds.3 In young children, iron deficiency with anemia impairs cognitive function and psychomotor development.4–6 Similar effects attributable to preanemic iron deficiency have been reported but are not yet established.7 Iron deficits at an early age are of particular concern because some of the sequelae may be irreversible,8–10 although the question of reversibility remains unresolved.11 Not to be overlooked is the fact that iron deficiency also enhances lead absorption,12 which is a good reason in itself to assure adequate iron nutriture in young children.
Because many factors contribute to the vulnerability of infants and children to negative iron balance and deficiency, the need for a simple, accurate, and sensitive screening test for iron status remains. Iron deficiency most often is diagnosed based on either hemoglobin concentration or hematocrit measurement, ie, after the onset of anemia. But neither iron depletion nor repletion, at least in its incipiency, is reflected consistently by these two measures. In fact, no single biochemical indicator in current routine use is consistently diagnostic of iron deficiency, particularly in the preanemic state. Although combining several iron status indicators provides the best assessment,13,14 multiple testing is impractical for screening purposes, and as a single test, the commonly used hematocrit measurement is inadequate.15,16
Normally, a trace of zinc rather than iron is incorporated into protoporphyrin during the final step of heme biosynthesis.17 In states of iron-deficient erythropoiesis, zinc protoporphyrin (ZPP) formation is enhanced. These interrelated reactions can be summarized as follows:

(A) protoporphyrin IX + Fe²⁺ → heme

(B) protoporphyrin IX + Zn²⁺ → ZPP

Although reaction A greatly predominates at all times (normally ∼30 000 to 1), a very slight decrease in iron availability, as in beginning iron depletion, causes reaction B to increase, with the ZPP binding to globin in maturing erythrocytes. Thus, the zinc protoporphyrin/heme ratio (ZPP/H), which is essentially the ratio of these reaction products, reflects iron status in the bone marrow during hemoglobin formation.18,19 As a physiologic or functional measure of iron utilization, an increase in the erythrocyte ZPP/H ratio compares favorably with a decrease in ferritin concentration in diagnosing not only iron deficiency anemia,20 but also preanemic iron depletion.19 ZPP/H correlates well with both mean cell volume and hemoglobin concentration in states of iron deficiency anemia.21
Using the simple technique of hematofluorometry to measure the ZPP/H ratio in whole blood, we have studied the effectiveness of this index in screening for iron depletion or for deficiency in children seen for routine examination in a community pediatric practice.
Subjects and Clinical Setting
These studies were approved by the Human Subjects Division, Office of Research, University of Washington. The subjects, 361 children (206 girls and 155 boys), were seen in a private pediatric clinic. With the exception of one 2-month-old infant, the children were all between 9 months and 18 years of age, the majority being either white (63%) or black (19%). As judged at the clinic, 56% of the subjects came from lower socioeconomic backgrounds, 36% from middle, and 8% from higher. This heterogeneous group of consenting patients represents a pediatric practice in the area. Although the distributions of age, gender, and ethnicity do not allow rigorous statistical analysis of the data, they do allow an evaluation of the laboratory test being described.
All children were free of complaints and were seen during physician visits for well-child check-up, preschool physical, or sports physical. Iron status of the subjects was evaluated as a routine part of patient care at the clinic. Thirty-two children whose transferrin saturation, hemoglobin concentration, or hematocrit values indicated the presence of iron deficiency were given oral iron therapy and then were monitored by retesting within 4 to 30 weeks (median = 8). Ethylenediaminetetraacetic acid-anticoagulated blood had been obtained for complete blood counts; an aliquot of this same specimen was used to perform the ZPP/H test. Serum specimens were used for assays of ferritin and iron concentrations and for determination of total iron-binding capacity (TIBC). Percent transferrin saturation was calculated as serum iron/TIBC × 100.
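As a worked illustration of the calculation above (the values and function name below are hypothetical, not study data), percent transferrin saturation is simply serum iron concentration divided by TIBC, times 100:

```python
def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """Percent transferrin saturation = serum iron / TIBC x 100."""
    return serum_iron_ug_dl / tibc_ug_dl * 100.0

# A hypothetical child with serum iron 50 ug/dL and TIBC 400 ug/dL:
print(transferrin_saturation(50, 400))  # 12.5, below the 16% limit used in this study
```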
The hematocrit and hemoglobin concentrations were determined with a Coulter S-Plus (Coulter Electronics, Inc, Hialeah, FL). Serum iron concentration and TIBC were determined on a Ferrochem II Serum Iron/TIBC Analyzer (Environmental Sciences Associates, Inc, Bedford, MA). Serum ferritin concentration was determined using an immunoradiometric procedure (Corning Diagnostics Corp, East Walpole, MA). ZPP/H ratio was measured on a ProtoFluor-Z hematofluorometer (Helena Laboratories, Inc, Beaumont, TX). When a result was above our reference range (>80 μmol/mol), plasma was removed, the cells were suspended in 0.9% saline, and the test was repeated.22,23 Statistical analyses were performed using RS/1 Release 3.0 software (BBN Software Products Corp, Cambridge, MA).
Each of the five laboratory tests used for assessing iron status measures a different parameter of iron metabolism. Hematocrit and hemoglobin concentration both reflect the major end product of iron utilization and therefore reveal deficiency only after the onset of anemia. Serum transferrin saturation indicates the level of mobilized iron, but it is influenced greatly by serum iron concentration, which responds to recent intake and diurnal variation. Serum ferritin concentration reflects the level of storage iron; however, ferritin is an acute-phase protein that can give falsely elevated results. The ZPP/H ratio reflects bone marrow iron available during erythrocyte production. Therefore, the ZPP/H ratio might be considered to approximate a marrow iron stain.18
Because patients were not subject to any selection process, we obtained a mixed population representative of the community. Some standardization of results was achieved by using single cutoff values for each iron deficiency test rather than by attempting to adjust for age, gender, or ethnicity.24,25 Iron deficiency indicators chosen for data analysis were those applicable to older children, which seemed appropriate because, with few exceptions, our subjects were of elementary or high school age. Although this decision may have limited our evaluation of the ZPP/H ratio as a test for iron disorders, the results still demonstrate the clinical merits of the test in a community setting.
The ZPP/H ratio is measured directly using a dedicated instrument called a hematofluorometer. This rather simple device costs ∼$4000 and may be used with or without a reagent, which remains a debatable issue.26,27 Also required are microscope cover slips and 0.9% saline solution for washing erythrocytes as needed. A measure of iron status requires one drop of whole blood, and the result can be obtained in ∼1 minute with minimal skill and training. Even so, ZPP/H ratio determination is not on the waived test list under the Clinical Laboratory Improvement Amendments. For billing, Current Procedural Terminology code 84202, described as “Protoporphyrin, RBC; quantitative,” can be used until the ZPP/H ratio is classified specifically. A charge of no more than ∼$5.00 per test should cover the actual cost.
Table 1 summarizes the results for the five iron parameters measured including the ranges of values found, the deficiency limits used, and the prevalence of deficiency. In all cases, the mean value for each given iron parameter (data not shown) was close to the mean of the reference ranges for typical pediatric populations in Western countries. The limits predictive of iron deficiency were those commonly used for older children, and these formed the basis for determining the prevalences of iron deficiency in all of the children. These prevalences ranged from 3.1% based on the ZPP/H ratio to 27.9% based on transferrin saturation.
Prevalences of iron deficiency also were determined based on combinations of three test results (transferrin saturation, ferritin concentration, and ZPP/H ratio). The prevalence of iron deficiency based on any one of the three test results being abnormal reached 33%. However, the prevalence of iron deficiency based on two abnormal results was decreased to 4%, and based on all three abnormal test results was 1%. Of the 121 subjects having only one abnormal iron test result, 73% of these were defined as deficient by transferrin saturation. When hemoglobin was combined with ferritin and ZPP/H in this same type of analysis, the prevalence of iron deficiency ranged from 17% based on one test result (hemoglobin) to <1% when all three test results were abnormal.
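The concordance analysis described above can be sketched as follows. The cutoffs (transferrin saturation <16%, ferritin <10 μg/L, ZPP/H >80 μmol/mol) and the toy cohort are illustrative assumptions for this sketch, not the study's data; the study's actual deficiency limits appear in Table 1:

```python
def n_abnormal(saturation_pct, ferritin_ug_l, zpp_h):
    """Count how many of the three iron tests are abnormal (assumed cutoffs)."""
    flags = [saturation_pct < 16, ferritin_ug_l < 10, zpp_h > 80]
    return sum(flags)

def prevalence(subjects, k):
    """Fraction of subjects with at least k abnormal test results."""
    hits = sum(1 for s in subjects if n_abnormal(*s) >= k)
    return hits / len(subjects)

# Toy cohort of (saturation %, ferritin ug/L, ZPP/H umol/mol) triples:
cohort = [(30, 40, 55), (12, 35, 60), (14, 8, 90), (9, 7, 95)]
print(prevalence(cohort, 1))  # 0.75 -> any one test abnormal
print(prevalence(cohort, 3))  # 0.5  -> all three tests abnormal
```

Requiring concordance of more tests, as in the study, necessarily shrinks the predicted prevalence, because each added criterion can only remove subjects from the "deficient" set.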
Multiple regression analyses of the data were performed to determine which single laboratory test was the best predictor of iron deficiency as defined by hemoglobin concentration, the most commonly used parameter. The ZPP/H ratio was the most highly correlated predictor of hemoglobin concentration (P = 2.0 × 10−7). Ferritin concentration (P = 2.5 × 10−5) and percent transferrin saturation (P = .05) also were correlated significantly with hemoglobin concentration. Similarly, ZPP/H ratio was the most highly correlated predictor of hematocrit measurement (P = 7.0 × 10−7), followed by ferritin concentration (P = 1.0 × 10−6) and percent transferrin saturation (P = .04).
Baseline and follow-up laboratory data on the subset of children (n = 32) who had been placed on iron supplement therapy are shown in Table 2. In keeping with accepted practice, the laboratory basis for a decision to supplement was primarily transferrin saturation (22 of 32 were <16% saturated). Time between the initial visit (baseline) and the follow-up visit ranged from 4 to 30 weeks (median = 8). During this period, the ZPP/H ratios generally decreased while the percent transferrin saturation increased with iron supplementation; the ferritin and hemoglobin concentrations and hematocrit measurements did not change significantly. By the nonparametric Wilcoxon signed rank test, the change between baseline and follow-up was statistically significant for transferrin saturation (P < .0001) and the ZPP/H ratio (P < .032). The latter change occurred despite the fact that the mean initial ZPP/H ratio was 54 and only two ZPP/H values exceeded our upper limit of 80 μmol of ZPP/mol heme.
Comparisons of the trends in ZPP/H ratios with age and sex are shown in Fig 1. There was a noteworthy trend toward higher ZPP/H ratios in children <3 years of age, possibly attributable to prolonged high milk intake.28 A decrease in ZPP/H ratio that was statistically significant (P < .0002) was noted between the <3-year-old and the 3- and 4-year-old groups of healthy children. No significant difference was found in ZPP/H ratios between healthy boys (overall mean ± SD = 50.3 ± 13.7) and healthy girls (overall mean ± SD = 51.7 ± 12.7) until 15 years of age, when the mean ratios diverged. The median ZPP/H ratio for adolescent girls (15–18 years of age) was significantly higher than that of boys of the same ages (P < .0005), as shown in Fig 2.
Although ZPP has been established as a metabolite formed in response to iron deficiency, terminology in the literature often refers to ZPP as free erythrocyte protoporphyrin (FEP) or simply erythrocyte protoporphyrin (EP). Unfortunately, this practice obscures the clinical difference between these porphyrin metabolites.17 FEP (or EP) is an analytical artifact: zinc is stripped from ZPP by the acid solvents used in extracting porphyrin compounds from tissue. Rare exceptions in which a high level of metal-free protoporphyrin is actually formed include some cases of porphyria. Also noteworthy is the fact that investigators and clinical laboratories may report results either as a concentration (FEP, EP, or ZPP) or as a ZPP/heme ratio, the latter being recommended.29
Historically, hemoglobin concentration or hematocrit measurements have been used to screen for iron deficiency. For the generally healthy subjects in this study, hemoglobin and hematocrit overpredicted iron deficiency by approximately twofold and sixfold, respectively. These findings confirm previous research showing hemoglobin concentration and hematocrit measurement to be poor diagnostic tests for iron deficiency,15 ,16 especially in preanemic states.
One of the classic ways to assess nutrient adequacy is dietary evaluation. Diet history is almost always used in conjunction with other nutritional assessment tools because it lacks sensitivity. This shortcoming arises from problems associated with accurately determining the quantities of foods consumed and the variability of the nutrient content in the foods eaten, problems that particularly affect estimations of the nutrient intakes of children. Only in the most controlled environments can the food consumed be estimated by weighing and a portion then analyzed for nutrient content. Hence, assessment of nutrient inadequacy usually involves integration of data from several approaches including clinical, dietary, and laboratory sources. In the case of iron status, the laboratory plays a primary role, and we have shown that for children the ZPP/H ratio is useful, as it is in adults, for assessing iron status.
The gold standard that has been used to define iron deficiency is decreased or absent stainable iron in bone marrow aspirate. Clearly, marrow biopsy is not an acceptable alternative for routine assessment of iron status. Therefore, ferritin concentration was considered the gold standard for this study, although a limitation of ferritin concentration is its positive response to inflammation. Because inflammation blocks release of storage iron, it also elevates the ZPP/H ratio; this response is not seen in acute infections because of the delay before ZPP-laden cells enter the circulation.
Prediction of the prevalence of iron deficiency based on a combination of parameters improves diagnostic accuracy. For example, combining hemoglobin and ferritin concentrations with the ZPP/H ratio showed the increased accuracy achievable by employing multiple parameters. The specificities of the ZPP/H ratio and hemoglobin concentration, with ferritin concentration as the reference standard, were 97.7% and 93.0%, respectively, emphasizing the low likelihood of false-positive diagnoses of iron-deficient erythropoiesis based on the ZPP/H ratio alone.
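As a minimal sketch of how specificity figures like those above are computed against a reference standard (the flag arrays below are invented toy data, not study results):

```python
def specificity(test_abnormal, ref_abnormal):
    """Specificity = true negatives / (true negatives + false positives),
    ie, the fraction of reference-normal subjects whose screening test
    is also normal."""
    tn = sum(1 for t, r in zip(test_abnormal, ref_abnormal) if not t and not r)
    fp = sum(1 for t, r in zip(test_abnormal, ref_abnormal) if t and not r)
    return tn / (tn + fp)

# Toy data for 5 subjects (True = abnormal result); ferritin is the
# reference standard here, as in the study:
zpp_flags      = [True, False, False, False, True]
ferritin_flags = [False, False, False, False, True]
print(specificity(zpp_flags, ferritin_flags))  # 0.75 (1 false positive among 4 ferritin-normal subjects)
```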
As expected, transferrin saturation increased dramatically in response to iron supplementation. Although the ZPP/H ratios at the beginning of supplementation were mostly within the assigned reference range, the ZPP/H ratio was the only other test of iron status to change significantly in response to supplementation (P < .05). The ability of the ZPP/H ratio to detect small changes in overall iron status is indicated clearly by these findings. Because the ZPP/H ratio (but not transferrin saturation) reflects long-term iron status, this ratio seems to be the preferred index for monitoring iron therapy in preanemic patients, which agrees with earlier work from this laboratory showing that the ZPP/H ratio is responsive to the early stages of iron depletion.19 Although a ZPP/H ratio of 80 μmol/mol has been assigned as a practical upper limit, experience has shown that full repletion and/or iron overload may produce ZPP/H ratios of one half (40 μmol/mol) or less of that cutoff. This observation is not to suggest that the ZPP/H ratio should be pushed to the lower limit by iron supplementation, because potential toxicity then becomes a concern.
The unique sensitivity of the ZPP/H ratio to iron status is indicated also by the significantly higher ratios in adolescent girls than in adolescent boys. Up to age 15, boys and girls have nearly identical ZPP/H levels, and these levels fall well below the 80 μmol/mol cutoff (50.0 ± 10.8 for children 3–14 years of age). The incidence of overt iron deficiency was fivefold higher in girls after probable menarche (15% had ZPP/H ratios of 80–100 μmol/mol), and 50% had higher ratios than did any of the boys in this same age group. Previous studies have shown that male and female EP or FEP concentrations diverge similarly during adolescence and that this divergence is maintained until the usual age of menopause.24
The ZPP/H ratio is considered by some investigators to be a nonspecific test. This is attributable to the many causes that may underlie iron-deficient erythropoiesis, including anemia of chronic disease, chronic infections, and indeed any inflammatory processes, as well as nutritional iron deficiency. However, the ZPP/H ratio only reflects iron delivery to the developing erythrocyte. In this respect, the ratio is highly specific.
Our experience with the use of the ZPP/H ratio in hospitalized patients offers a word of caution. As might be predicted, ZPP/H ratios averaged ∼50% higher in this population. Thus, iron-deficient erythropoiesis was caused not only by nutritional iron deficiency but also by anemia of chronic disease, chronic infections, chronic inflammation, or hemoglobinopathies.30 Such observations do not negate the value of the ZPP/H ratio as a diagnostic test; instead, they emphasize the need for caution in interpreting results from sick children when factors other than nutritional deficiency must be considered. Elevated ZPP/H ratios also can serve as an alert that unsuspected problems may exist. No clinical situation is known in which a ZPP/H ratio >80 μmol/mol does not warrant intervention or follow-up. Applications of the test in other pediatric settings have been described.28,30
In summary, the predicted prevalence of iron deficiency in healthy children reported here was approximately the same whether based on the ZPP/H ratio or on serum ferritin measurement. However, the ZPP/H ratio was capable of predicting iron deficiency nearly as well as two or three laboratory tests combined, and it alone could detect preanemic deficiency better than other individual tests. Because the ZPP/H ratio is a functional indicator of long-term iron status, it is an attractive test both to screen for deficiency and to monitor treatment. The ZPP/H ratio is a cost-effective test that is performed using as little as one drop of capillary blood with a benchtop instrument suitable for use at sites remote from a central laboratory.
This work was supported in part by National Institutes of Health Grant DK35816.
- Received October 19, 1998.
- Accepted April 19, 1999.
Reprint requests to (R.F.L.) Department of Laboratory Medicine, University of Washington, Box 359743, Seattle, WA 98195. E-mail:
- Cook JD, Skikne BS, Baynes RD. Iron deficiency: the global perspective. Adv Exp Med Biol. 1994;219–228
- Walter T. Effect of iron deficiency anemia on infant psychomotor development. In: Filer Jr LJ, ed. Dietary Iron: Birth to Two Years. New York, NY: Raven Press; 1989:161–175
- Yip R. The interaction of lead and iron. In: Filer Jr LJ, ed. Dietary Iron: Birth to Two Years. New York, NY: Raven Press; 1989:179–181
- National Committee on Clinical Laboratory Standards. Erythrocyte Protoporphyrin Testing: Approved Guideline. Vol 10. Villanova, PA: National Committee on Clinical Laboratory Standards; 1996
- Copyright © 1999 American Academy of Pediatrics