
    The effects of iron fortification on the gut microbiota in African children: a randomized controlled trial in Côte d'Ivoire.

    BACKGROUND: Iron is essential for the growth and virulence of many pathogenic enterobacteria, whereas beneficial barrier bacteria, such as lactobacilli, do not require iron. Thus, increasing colonic iron could select for a gut microbiota profile unfavorable to the host. OBJECTIVE: The objective was to determine the effect of iron fortification on gut microbiota and gut inflammation in African children. DESIGN: In a 6-mo, randomized, double-blind, controlled trial, 6-14-y-old Ivorian children (n = 139) received biscuits fortified with electrolytic iron, providing 20 mg Fe/d, 4 times/wk, or nonfortified biscuits. We measured changes in hemoglobin concentrations, inflammation, iron status, helminths, diarrhea, fecal calprotectin concentrations, and microbiota diversity and composition (n = 60) and the prevalence of selected enteropathogens. RESULTS: At baseline, there were greater numbers of fecal enterobacteria than of lactobacilli and bifidobacteria (P < 0.02). Iron fortification was ineffective: there were no differences in iron status, anemia, or hookworm prevalence at 6 mo. The fecal microbiota was modified by iron fortification, as shown by a significant increase in profile dissimilarity (P < 0.0001) in the iron group compared with the control group. There was a significant increase in the number of enterobacteria (P < 0.005) and a decrease in lactobacilli (P < 0.0001) in the iron group after 6 mo. In the iron group, there was an increase in the mean fecal calprotectin concentration (P < 0.01), a marker of gut inflammation, that correlated with the increase in fecal enterobacteria (P < 0.05). CONCLUSIONS: Anemic African children carry an unfavorable ratio of fecal enterobacteria to bifidobacteria and lactobacilli, and this ratio is increased by iron fortification. Thus, iron fortification in this population produces a potentially more pathogenic gut microbiota profile, and this profile is associated with increased gut inflammation. This trial was registered at controlled-trials.com as ISRCTN21782274.

    Iron fortification and iron supplementation are cost-effective interventions to reduce iron deficiency in four subregions of the world

    Iron deficiency is the most common and widespread nutritional disorder in the world, affecting millions of people in both nonindustrialized and industrialized countries. We estimated the costs, effects, and cost-effectiveness of iron supplementation and iron fortification interventions in 4 subregions of the world. Effects on population health were estimated with a population model designed to capture the lifelong impact of iron supplementation or iron fortification on the individuals benefiting from these interventions; the model took into account effectiveness, patient adherence, and geographic coverage. Costs were based on primary data collection and on a review of the literature. At 95% geographic coverage, iron supplementation has a larger impact on population health than iron fortification: it would avert fewer than 12,500 disability-adjusted life years (DALYs) annually in the European subregion, which has very low rates of adult and child mortality, and almost 2.5 million DALYs annually in the African and Southeast Asian subregions, which have high rates of adult and child mortality. On the other hand, fortification is less costly than supplementation and appears to be more cost effective, regardless of the geographic coverage of fortification. We conclude that iron fortification is economically more attractive than iron supplementation, although spending the extra resources to implement iron supplementation is still a cost-effective option. The results should be interpreted with caution, because the evidence of intervention effectiveness comes predominantly from small-scale efficacy trials, which may not reflect the actual effect under expected conditions.
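
    The comparison in this abstract rests on cost-effectiveness ratios (cost per DALY averted). The sketch below illustrates that arithmetic only; all cost and DALY figures are hypothetical placeholders, and the paper's actual population model additionally accounts for effectiveness, patient adherence, and geographic coverage.

        # Minimal sketch of a cost-per-DALY-averted comparison.
        # All numbers below are hypothetical, not from the paper.

        def cost_per_daly(total_cost: float, dalys_averted: float) -> float:
            """Average cost-effectiveness ratio: cost per DALY averted."""
            return total_cost / dalys_averted

        # Hypothetical inputs for one subregion at 95% geographic coverage.
        supplementation = cost_per_daly(total_cost=50_000_000, dalys_averted=2_000_000)
        fortification = cost_per_daly(total_cost=10_000_000, dalys_averted=800_000)

        # Fortification can be more cost effective (lower ratio) even though
        # supplementation averts more DALYs in absolute terms.
        print(f"supplementation: ${supplementation:.2f}/DALY averted")  # $25.00
        print(f"fortification:   ${fortification:.2f}/DALY averted")    # $12.50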

    Adjusting plasma ferritin concentrations to remove the effects of subclinical inflammation in the assessment of iron deficiency: a meta-analysis

    Background: The World Health Organization recommends serum ferritin concentration as the best indicator of iron deficiency (ID). Unfortunately, ferritin increases with infection; hence, the prevalence of ID is underestimated. Objective: The objective was to estimate the increase in ferritin in 32 studies of apparently healthy persons by using 2 acute-phase proteins (APPs), C-reactive protein (CRP) and α1-acid glycoprotein (AGP), individually and in combination, and to calculate factors to remove the influence of inflammation from ferritin concentrations. Design: We estimated the increase in ferritin associated with inflammation (ie, CRP >5 mg/L and/or AGP >1 g/L). The 32 studies comprised infants (5 studies), children (7 studies), men (4 studies), and women (16 studies) (n = 8796 subjects). In 2-group analyses (either CRP or AGP), we compared the ratios of log ferritin with or without inflammation in 30 studies. In addition, in 22 studies, the data allowed a comparison of ratios of log ferritin among 4 subgroups: reference (no elevated APP), incubation (elevated CRP only), early convalescence (both CRP and AGP elevated), and late convalescence (elevated AGP only). Results: In the 2-group analysis, inflammation increased ferritin by 49.6% (CRP) or 38.2% (AGP; both P < 0.001). Elevated AGP was more common than elevated CRP, and more so in young persons than in adults. In the 4-group analysis, ferritin was 30%, 90%, and 36% higher (all P < 0.001) in the incubation, early convalescence, and late convalescence subgroups, respectively, with corresponding correction factors of 0.77, 0.53, and 0.75. Overall, inflammation increased ferritin by approximately 30% and was associated with a 14% (CI: 7%, 21%) underestimation of ID. Conclusions: Measurements of both CRP and AGP are needed to estimate the full effect of inflammation and can be used to correct ferritin concentrations. Few differences were observed between age and sex subgroups. Am J Clin Nutr 2010;92:546-55.
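
    The 4-group correction described in the abstract is straightforward to apply in practice. Below is a minimal sketch using the cutoffs (CRP >5 mg/L, AGP >1 g/L) and correction factors (0.77, 0.53, 0.75) given above; the function and variable names are illustrative, not from the paper.

        # Sketch of the 4-group ferritin correction from the abstract.
        # Group definitions and factors are taken from the text above;
        # everything else (names, example values) is hypothetical.

        def corrected_ferritin(ferritin_ug_l: float, crp_mg_l: float, agp_g_l: float) -> float:
            """Multiply measured ferritin by the inflammation-group correction factor."""
            crp_high = crp_mg_l > 5.0   # elevated C-reactive protein
            agp_high = agp_g_l > 1.0    # elevated alpha-1-acid glycoprotein

            if crp_high and agp_high:   # early convalescence: ferritin ~90% higher
                factor = 0.53
            elif crp_high:              # incubation: ferritin ~30% higher
                factor = 0.77
            elif agp_high:              # late convalescence: ferritin ~36% higher
                factor = 0.75
            else:                       # reference: no elevated acute-phase protein
                factor = 1.0
            return ferritin_ug_l * factor

        # Hypothetical example: a measured ferritin of 25 ug/L with both APPs
        # elevated corrects to 25 * 0.53 = 13.25 ug/L, which would cross a
        # common 15 ug/L iron-deficiency cutoff.
        print(corrected_ferritin(25.0, crp_mg_l=8.0, agp_g_l=1.4))  # -> 13.25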

    Dietary Factors Modulate Helicobacter-associated Gastric Cancer in Rodent Models

    Since its discovery in 1982, the global importance of Helicobacter pylori–induced disease, particularly in developing countries, remains high. Rodent models, particularly mice, and the unexpectedly useful gerbil have been used extensively to study the interactions of the host, the pathogen, and the environmental conditions that influence the outcome of persistent H. pylori infection. Dietary factors in humans are increasingly recognized as important modulators of the progression and severity of H. pylori–induced gastric cancer. Studies using rodent models to verify and help explain the mechanisms whereby various dietary ingredients affect disease outcome should continue to be extremely productive. National Institutes of Health (U.S.) (P01CA028842, P01CA026731, P30ES002109).

    Prevalence of anemia and deficiency of iron, folic acid, and zinc in children younger than 2 years of age who use the health services provided by the Mexican Social Security Institute

    Background: In Mexico, as in other developing countries, micronutrient deficiencies are common in infants between 6 and 24 months of age and are an important public health problem. The objective of this study was to determine the prevalence of anemia and of iron, folic acid, and zinc deficiencies in Mexican children under 2 years of age who use the health care services provided by the Mexican Institute for Social Security (IMSS). Methods: A nationwide survey was conducted with a representative sample of children younger than 2 years of age who were beneficiaries and users of the health care services provided by IMSS through its regular regimen (urban populations) and its Oportunidades program (rural areas). A subsample of 4,955 clinically healthy children was studied to determine their micronutrient status. A venous blood sample was drawn to determine hemoglobin, serum ferritin, percentage of transferrin saturation, zinc, and folic acid. Descriptive statistics include point estimates and 95% confidence intervals for the sample and projections for the larger population from which the sample was drawn. Results: Twenty percent of children younger than 2 years of age had anemia, and 27.8% (rural) to 32.6% (urban) had iron deficiency; more than 50% of anemia was not associated with low ferritin concentrations. Iron stores were more depleted as age increased. The prevalence of low serum zinc and of folic acid deficiency was 28% and 10%, respectively, in urban areas, and 13% and 8%, respectively, in rural areas. The prevalence of simultaneous iron and zinc deficiency was 9.2% in urban and 2.7% in rural areas. Children with anemia had a higher prevalence of folic acid deficiency than children with normal iron status. Conclusion: Iron and zinc deficiencies are the principal micronutrient deficiencies in Mexican children younger than 2 years of age who use the health care services provided by IMSS. Anemia not associated with low ferritin values was more prevalent than iron-deficiency anemia. The presence of micronutrient deficiencies at this early age calls for effective preventive public nutrition programs.
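
    The abstract reports point estimates with 95% confidence intervals for prevalences such as the 20% anemia figure. As a simplified illustration of that calculation only, the sketch below computes a Wilson score interval for a proportion; the counts are hypothetical, and a real survey analysis would also account for the complex sampling design (weights and design effects), which this sketch ignores.

        import math

        def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
            """95% Wilson score interval for a proportion (z = 1.96)."""
            p = successes / n
            denom = 1 + z**2 / n
            centre = (p + z**2 / (2 * n)) / denom
            half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
            return centre - half, centre + half

        # Hypothetical counts: ~20% anemia in a subsample of 4,955 children.
        lo, hi = wilson_ci(successes=991, n=4955)
        print(f"prevalence = {991/4955:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")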

    Effects of prenatal food and micronutrient supplementation on child growth from birth to 54 months of age: a randomized trial in Bangladesh

    Background: There is a lack of information on the optimal timing of food supplementation for malnourished pregnant women and on possible combined effects of food and multiple micronutrient supplementation (MMS) on their offspring's growth. We evaluated the effects of prenatal food and micronutrient interventions on postnatal child growth. The hypothesis was that prenatal MMS and an early invitation to food supplementation would increase physical growth in the offspring during 0-54 months and that a combination of these interventions would further improve these outcomes. Methods: In the large, randomized MINIMat trial (Maternal and Infant Nutrition Interventions in Matlab), Bangladesh, 4436 pregnant women were enrolled between November 2001 and October 2003, and their children were followed until March 2009. Participants were randomized to six groups: 30 mg Fe and 400 μg folic acid (Fe30F), 60 mg Fe and 400 μg folic acid (Fe60F), or MMS, each combined with either an early invitation (immediately after identification of pregnancy) or a usual invitation (at a time of the woman's choosing, i.e., usual care in this community) to the food supplementation program. The anthropometry of 3267 children was followed from birth to 54 months, and 2735 children were available for analysis at 54 months. Results: There were no differences in the characteristics of mothers and households among the intervention groups. The average birth weight was 2694 g and the average birth length 47.7 cm, with no differences among groups. Early invitation to food supplementation (compared with the usual invitation) reduced the proportion of stunting from early infancy up to 54 months for boys (p = 0.01), but not for girls (p = 0.31). MMS resulted in more stunting than standard Fe60F (p = 0.02). There was no interaction between the food and micronutrient supplementation on the growth outcomes. Conclusions: Early food supplementation in pregnancy reduced the occurrence of stunting during 0-54 months in boys, but not in girls, and prenatal MMS increased the proportion of stunting in boys. These effects on postnatal growth suggest programming effects in early fetal life. Trial registration: ISRCTN16581394 (http://www.controlled-trials.com/ISRCTN16581394).

    Biomarkers of Nutrition for Development (BOND)—Iron Review

    This is the fifth in the series of reviews developed as part of the Biomarkers of Nutrition for Development (BOND) program. The BOND Iron Expert Panel (I-EP) reviewed the extant knowledge regarding iron biology, public health implications, and the relative usefulness of currently available biomarkers of iron status from deficiency to overload. Approaches to assessing intake, including bioavailability, are also covered. The report also covers technical and laboratory considerations for the use of available biomarkers of iron status, and concludes with a description of research priorities along with a brief discussion of new biomarkers with potential for use across the spectrum of activities related to the study of iron in human health. The I-EP concluded that current iron biomarkers are reliable for accurately assessing many aspects of iron nutrition. However, a clear distinction is made between the relative strengths of biomarkers for assessing the hematological consequences of iron deficiency versus other putative functional outcomes, particularly the relationship between maternal and fetal iron status during pregnancy, birth outcomes, and infant cognitive, motor, and emotional development. The I-EP also highlighted the importance of considering the confounding effects of inflammation and infection on the interpretation of iron biomarker results, as well as the impact of life stage. Finally, alternative approaches to evaluating the risk of nutritional iron overload at the population level are presented, because the currently designated upper limits for the biomarker generally employed (serum ferritin) may not differentiate between true iron overload and the effects of subclinical inflammation.