Iron fortification and iron supplementation are cost-effective interventions to reduce iron deficiency in four subregions of the world
Iron deficiency is the most common and widespread nutritional disorder in
the world, affecting millions of people in both nonindustrialized and
industrialized countries. We estimated the costs, effects, and
cost-effectiveness of iron supplementation and iron fortification
interventions in four subregions of the world. The effects on population health
were estimated with a population model designed to capture the lifelong impact
of iron supplementation or iron fortification on the individuals benefiting
from these interventions. The model accounted for intervention effectiveness,
patient adherence, and geographic
coverage. Costs were based on primary data collection and on a review of
the literature. At 95% geographic coverage, iron supplementation has a
larger impact on population health than iron fortification. Iron
supplementation would avert from fewer than 12,500 disability-adjusted life
years (DALYs) annually in the European subregion, which has very low rates of
adult and child mortality, to almost 2.5 million DALYs in the African and
Southeast Asian subregions, which have high rates of adult and child
mortality. On the other
hand, fortification is less costly than supplementation and appears to be
more cost-effective, regardless of the geographic coverage of fortification.
We conclude that iron fortification
is economically more attractive than iron supplementation. However,
spending the extra resources to implement iron supplementation is still a
cost-effective option. The results should be interpreted with caution,
because evidence of intervention effectiveness predominantly relates to
small-scale efficacy trials, which may not reflect the actual effect under
expected conditions.
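A minimal sketch of the kind of calculation such a population model implies is given below; the parameter values, variable names, and cost figures are purely illustrative assumptions, not data from the study.

```python
# Hypothetical sketch: DALYs averted are scaled by intervention effectiveness,
# patient adherence, and geographic coverage, and each intervention is ranked
# by its cost per DALY averted. All numbers are illustrative only.

def dalys_averted(potential_dalys, effectiveness, adherence, coverage):
    """DALYs averted = burden reachable by the programme, discounted by
    how well it works, how well people adhere, and whom it reaches."""
    return potential_dalys * effectiveness * adherence * coverage

def cost_per_daly(total_cost, averted):
    """Cost-effectiveness ratio: cost per DALY averted."""
    return total_cost / averted if averted > 0 else float("inf")

if __name__ == "__main__":
    burden = 3_000_000  # DALYs attributable to iron deficiency (illustrative)
    interventions = {
        "supplementation": dict(cost=40_000_000, effectiveness=0.7,
                                adherence=0.6, coverage=0.95),
        "fortification":   dict(cost=10_000_000, effectiveness=0.5,
                                adherence=1.0, coverage=0.95),
    }
    for name, p in interventions.items():
        averted = dalys_averted(burden, p["effectiveness"],
                                p["adherence"], p["coverage"])
        print(f"{name}: {averted:,.0f} DALYs averted, "
              f"US${cost_per_daly(p['cost'], averted):,.2f} per DALY averted")
```

Under such a scheme, an intervention can avert fewer DALYs yet still rank better on cost per DALY averted, which is the pattern the abstract describes for fortification versus supplementation.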
Mild increases in serum hepcidin and interleukin-6 concentrations impair iron incorporation in haemoglobin during an experimental human malaria infection.
The correct selection of individuals who will benefit from iron supplements in malaria-endemic regions requires improved insight into the effects of malaria on host iron homeostasis, as well as innovative biomarkers. We assessed sequential changes in serum hepcidin and in traditional biochemical iron status indicators during an experimental Plasmodium falciparum malaria infection in five adult volunteers. The haemoglobin content of reticulocytes (Ret-H(e)) and of mature red blood cells (RBC-H(e)) represented iron incorporation into haemoglobin. Low-density parasitaemia and its treatment induced a mild increase in interleukin (IL)-6 and serum hepcidin concentrations. Despite this mild increase, a marked hypoferraemia with a strong rise in serum ferritin concentrations developed, which was associated with a sharp fall in Ret-H(e), while RBC-H(e) remained unchanged. The ratio of soluble transferrin receptor (sTfR) to log ferritin concentrations decreased to an average nadir of 63% of the baseline value. We concluded that even mild increases in serum hepcidin and IL-6 concentrations disturb host iron homeostasis. Serum hepcidin, Ret-H(e) and Delta-H(e) (Ret-H(e) minus RBC-H(e)) are promising biomarkers for selecting individuals who will benefit from iron supplements in malaria-endemic regions, whereas the sTfR/log ferritin ratio should be used with caution to assess iron status during malaria.
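As a rough illustration of the two derived markers mentioned above, the sketch below computes Delta-H(e) and the sTfR/log ferritin index (taking the commonly used base-10 logarithm of ferritin); the laboratory values and helper names are hypothetical, not measurements from the trial.

```python
import math

def delta_he(ret_he, rbc_he):
    """Delta-H(e): reticulocyte minus mature red-cell haemoglobin content (pg)."""
    return ret_he - rbc_he

def stfr_log_ferritin_index(stfr_mg_l, ferritin_ug_l):
    """sTfR/log ferritin index, with ferritin log-transformed (base 10)."""
    return stfr_mg_l / math.log10(ferritin_ug_l)

# Illustrative values only (not data from the study).
baseline = stfr_log_ferritin_index(stfr_mg_l=1.4, ferritin_ug_l=60.0)
during   = stfr_log_ferritin_index(stfr_mg_l=1.5, ferritin_ug_l=400.0)

print(f"Delta-H(e): {delta_he(ret_he=30.0, rbc_he=29.0):.1f} pg")
# A rise in ferritin (an acute-phase reactant) lowers the index even if sTfR
# barely changes, which is why the index can mislead during malaria.
print(f"Index fell to {during / baseline:.0%} of baseline")
```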