Everyone knows that anemia is a bad thing, and babies are routinely tested for iron deficiency by their first birthday. But it turns out that the test commonly used to detect iron-poor blood doesn't do a very good job.
That's a big problem, because about 10 percent of American children suffer iron deficiency between ages 1 and 2. Lack of iron in infancy can impair mental development, perhaps permanently, long before a child is considered anemic.
Researchers at Children's Hospital Boston say they've discovered that a simple blood test can detect iron deficiency in children earlier and more accurately than the standard measure.
"While iron deficiency can be readily treated, practitioners haven't had a simple, reliable, and practical screening test to detect it early enough," says Henry Bernstein, associate chief of pediatrics at Children's and principal investigator on the study, which was published in the August 24 Journal of the American Medical Association.
The new test measures hemoglobin in immature red blood cells, which circulate in the bloodstream for just 24 to 48 hours. Traditional tests measure hemoglobin, an iron-containing, oxygen-carrying molecule, in all red blood cells, which can circulate for months.
In a study of 200 healthy infants 9 to 12 months of age, the new test correctly identified 83 percent of iron-deficient infants, compared with 26 percent for the current screening standard, a hemoglobin level below 11 grams per deciliter. Iron deficiency is easily treated with dietary iron supplementation, either with vitamins or through eating iron-rich foods.
Although the test won't be used for routine screening until it undergoes further evaluation, the doctors say it is simple and should cost no more than the test now in use.