Methods to assess iron and iodine status
Four methods are recommended for assessment of iodine nutrition: urinary iodine concentration, the goitre rate, and blood concentrations of thyroid stimulating hormone and thyroglobulin. These indicators are complementary, in that urinary iodine is a sensitive indicator of recent iodine intake (days) and thyroglobulin shows an intermediate response (weeks to months), whereas changes in the goitre rate reflect long-term iodine nutrition (months to years). Spot urinary iodine concentrations are highly variable from day to day and should not be used to classify the iodine status of individuals. International reference criteria for thyroid volume in children have recently been published and can be used for identifying even small goitres using thyroid ultrasound. Recent development of a dried blood spot thyroglobulin assay makes sample collection practical even in remote areas. Thyroid stimulating hormone is a useful indicator of iodine nutrition in the newborn, but not in other age groups. For assessing iron status, haemoglobin measurement alone has low specificity and sensitivity. Serum ferritin remains the best indicator of iron stores in the absence of inflammation. Measures of iron-deficient erythropoiesis include transferrin iron saturation and erythrocyte zinc protoporphyrin, but these often do not distinguish anaemia due to iron deficiency from the anaemia of chronic disease. The serum transferrin receptor is useful in this setting, but the assay requires standardisation. In the absence of inflammation, a sensitive method to assess iron status is to combine serum ferritin as a measure of iron stores with the serum transferrin receptor as a measure of tissue iron deficiency.
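As an illustration of combining these two markers, the sketch below applies the body-iron regression of Cook et al. to a ferritin/transferrin-receptor pair. The constants and the example values are assumptions for demonstration, not figures from the work summarised above.

```python
import math

def body_iron_mg_per_kg(stfr_mg_l: float, ferritin_ug_l: float) -> float:
    """Estimate body iron stores (mg/kg body weight) from the sTfR/ferritin ratio,
    using the regression of Cook et al. (2003). Assumes no inflammation and that
    sTfR and ferritin are calibrated to the assays used in that study."""
    ratio = (stfr_mg_l * 1000) / ferritin_ug_l  # sTfR converted to ug/L
    return -(math.log10(ratio) - 2.8229) / 0.1207

# Hypothetical subject: sTfR 5.0 mg/L, ferritin 30 ug/L -> ~5 mg/kg iron stores.
print(round(body_iron_mg_per_kg(stfr_mg_l=5.0, ferritin_ug_l=30.0), 1))
```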
Iodine deficiency in industrialised countries (Conference on ‘Over- and undernutrition: challenges and approaches', Symposium on ‘Geographical and geological influences on nutrition')
Iodine deficiency is not only a problem in developing regions; it also affects many industrialised countries. Globally, two billion individuals have an insufficient iodine intake, and approximately 50% of continental Europe remains mildly iodine deficient. Iodine intakes in other industrialised countries, including the USA and Australia, have fallen in recent years. Iodine deficiency has reappeared in Australia as a result of declining iodine residues in milk products, because of decreased iodophor use by the dairy industry. In the USA, although the general population is iodine sufficient, it is uncertain whether iodine intakes are adequate in pregnancy, which has led to calls for iodine supplementation. The few available data suggest that pregnant women in the Republic of Ireland and the UK are now mildly iodine deficient, possibly as a result of reduced use of iodophors by the dairy industry, as observed in Australia. Representative data on iodine status in children and pregnant women in the UK are urgently needed to inform health policy. In most industrialised countries the best strategy to control iodine deficiency is carefully monitored salt iodisation. However, because approximately 90% of salt consumption in industrialised countries comes from purchased processed foods, iodising household salt alone will not supply adequate iodine. Thus, in order to control iodine deficiency successfully in industrialised countries, it is critical that the food industry use iodised salt. The current push to reduce salt consumption to prevent chronic diseases and the policy of salt iodisation to eliminate iodine deficiency do not conflict; iodisation methods can fortify salt to provide recommended iodine intakes even if per capita salt intakes are reduced to <5 g/d.
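A back-of-envelope sketch of that last point: the iodisation level needed to deliver the adult recommended intake from a reduced salt supply. The coverage and loss figures below are illustrative assumptions, not values from the abstract.

```python
def required_iodisation_ppm(iodine_rni_ug: float = 150.0,
                            salt_intake_g_per_day: float = 5.0,
                            fraction_of_salt_iodised: float = 0.9,
                            processing_and_storage_losses: float = 0.3) -> float:
    """Iodine level (ug I per g salt, i.e. ppm) needed so that total daily salt
    intake still supplies the adult recommended intake. Coverage and loss
    fractions are assumptions for illustration."""
    effective_salt_g = (salt_intake_g_per_day
                        * fraction_of_salt_iodised
                        * (1 - processing_and_storage_losses))
    return iodine_rni_ug / effective_salt_g

# At 5 g/d salt intake, roughly 48 ppm iodine would be required under these assumptions.
print(round(required_iodisation_ppm(), 1))
```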
The impact of iodised salt or iodine supplements on iodine status during pregnancy, lactation and infancy
Objectives: Monitoring of iodine status during pregnancy, lactation and infancy is difficult because there are no established reference criteria for urinary iodine concentration (UI) for these groups, so it is uncertain whether iodised salt programs meet the needs of these life stages. Design and Subjects: The methods used in this paper were: 1) to estimate the median UI concentration that reflects adequate iodine intake during these life stages; and 2) to use these estimates in a review of the literature to assess whether salt iodisation can control iodine deficiency in pregnant and lactating women and their infants. Results: For pregnancy, recommended mean daily iodine intakes of 220-250 μg were estimated to correspond to a median UI concentration of about 150 μg l−1, and larger surveys from iodine-sufficient countries have reported a median UI in pregnant women ≥140 μg l−1. Iodine supplementation in pregnant women who are mild-to-moderately iodine deficient is beneficial, but there is no clear effect on maternal or newborn thyroid hormone levels. In countries where the iodine intake is sufficient, most mothers have a median breast milk iodine concentration (BMIC) greater than the concentration (100-120 μg l−1) required to meet an infant's needs. The median UI concentration during infancy that indicates optimal iodine nutrition is estimated to be ≥100 μg l−1. In iodine-sufficient countries, the median UI concentration in infants ranges from 90 to 170 μg l−1, suggesting adequate iodine intake in infancy. Conclusions: These findings suggest that pregnant and lactating women and their infants in countries with successful, sustained iodised salt programs have adequate iodine status.
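The correspondence between a recommended intake and a median UI concentration can be sketched with a simple conversion using an assumed daily urine volume and the commonly assumed ~92% urinary excretion of absorbed iodine; both parameters below are illustrative assumptions.

```python
def intake_from_uic(uic_ug_per_l: float,
                    urine_volume_l_per_day: float = 1.5,
                    urinary_excretion_fraction: float = 0.92) -> float:
    """Approximate daily iodine intake (ug/d) implied by a urinary iodine
    concentration, assuming ~92% of absorbed iodine is excreted in urine and
    an assumed daily urine volume."""
    return uic_ug_per_l * urine_volume_l_per_day / urinary_excretion_fraction

# A median UIC of 150 ug/L maps to roughly 245 ug/d under these assumptions,
# consistent with the 220-250 ug/d recommendation for pregnancy cited above.
print(round(intake_from_uic(150.0)))
```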
Assessment of iodine nutrition in populations: past, present, and future
Iodine status has historically been assessed by palpation of the thyroid and reported as goiter rates. Goiter is a functional biomarker that can be applied to both individuals and populations, but it is subjective. Iodine status is now assessed using an objective biomarker of exposure, i.e., urinary iodine concentrations (UICs) in spot samples and comparison of the median UIC to UIC cut-offs to categorize population status. This has improved standardization, but inappropriate use of the crude proportion of UICs below the cut-off level of 100 µg/L to estimate the number of iodine-deficient children has led to an overestimation of the prevalence of iodine deficiency. In this review, a new approach is proposed in which UIC data are extrapolated to iodine intakes, adjusted for intraindividual variation, and then interpreted using the estimated average requirement cut-point model. This may allow national programs to define the prevalence of iodine deficiency in the population and to quantify the necessary increase in iodine intakes to ensure sufficiency. In addition, thyroglobulin can be measured on dried blood spots to provide an additional sensitive functional biomarker of iodine status.
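A minimal sketch of the proposed approach, under stated assumptions: spot UICs are extrapolated to intakes, the log-scale distribution is shrunk toward its mean to remove an assumed within-person share of the variance (approximating usual intakes), and the proportion below an illustrative EAR is reported. All constants here are assumptions for demonstration.

```python
import numpy as np

def prevalence_below_ear(spot_uic_ug_l: np.ndarray,
                         urine_volume_l_per_day: float = 1.0,   # assumed for children
                         excretion_fraction: float = 0.92,       # assumed
                         within_person_share_of_variance: float = 0.5,  # assumed
                         ear_ug_per_day: float = 65.0) -> float: # illustrative EAR
    """EAR cut-point calculation on spot urinary iodine concentrations:
    1) extrapolate each UIC to an estimated daily iodine intake,
    2) remove the assumed within-person component of (log-scale) variance
       to approximate the usual-intake distribution,
    3) return the proportion of usual intakes below the EAR."""
    intakes = spot_uic_ug_l * urine_volume_l_per_day / excretion_fraction
    log_intakes = np.log(intakes)
    mean_log = log_intakes.mean()
    adjusted = mean_log + (log_intakes - mean_log) * np.sqrt(1 - within_person_share_of_variance)
    usual_intakes = np.exp(adjusted)
    return float((usual_intakes < ear_ug_per_day).mean())

# Synthetic survey data for illustration only.
rng = np.random.default_rng(0)
spot_uic = rng.lognormal(mean=np.log(100), sigma=0.6, size=500)
print(round(prevalence_below_ear(spot_uic), 2))
```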
Schoolchildren in the Principality of Liechtenstein are mildly iodine deficient
Objective To investigate the iodine status of schoolchildren in the Principality of Liechtenstein. Design A representative, cross-sectional, principality-wide screening of iodine levels in household salt and urinary iodine concentrations (UIC) in primary-school children. Data were compared with the WHO criteria and with 2009 iodine survey data from Switzerland, a neighbouring country that supplies most of the salt used in Liechtenstein. Setting Principality of Liechtenstein. Subjects Schoolchildren (n 228) aged 6-12 years from five different primary schools, representing 11·4 % of children of this age. Results The median UIC was 96 (range: 10-446) μg/l; 11 %, 56 % and 1 % of children had a UIC <50 μg/l, <100 μg/l and >300 μg/l, respectively. In all, 79 % of households were using adequately iodised salt (≥15 ppm). The median UIC was 20 % lower than that in Swiss children of comparable age (120 μg/l; P < 0·05). Conclusions According to the WHO criteria, schoolchildren in Liechtenstein are mildly iodine deficient and household iodised salt coverage is inadequate. Public health measures to increase iodine intakes in the Principality should be considered.
Maximizing the benefits and minimizing the risks of intervention programs to address micronutrient malnutrition: symposium report.
Interventions to address micronutrient deficiencies have large potential to reduce the related disease and economic burden. However, the potential risks of excessive micronutrient intakes are often not well determined. During the Global Summit on Food Fortification, 9-11 September 2015, in Arusha, a symposium was organized on micronutrient risk-benefit assessments. Using case studies on folic acid, iodine and vitamin A, the presenters discussed how to maximize the benefits and minimize the risks of intervention programs to address micronutrient malnutrition. Pre-implementation assessment of dietary intake and/or biomarkers of micronutrient exposure, status and morbidity/mortality is critical for identifying the population segments at risk of inadequate and excessive intake. Dietary intake models make it possible to predict the effect of micronutrient interventions and their combinations, e.g. fortified food and supplements, on the proportion of the population with intakes below adequate and above safe thresholds. Continuous monitoring of micronutrient intake and biomarkers is critical to identify whether the target population is actually reached, whether subgroups receive excessive amounts, and to inform program adjustments. However, for many micronutrients the relation between regular high intake and adverse health consequences is not well understood, nor do biomarkers exist that can detect such effects. More accurate and reliable biomarkers predictive of micronutrient exposure, status and function are needed to ensure effective and safe intake ranges for vulnerable population groups such as young children and pregnant women. Modelling tools that integrate information on program coverage, dietary intake distribution and biomarkers will further enable program planners to design effective, efficient and safe programs.
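As a sketch of the kind of dietary intake model described, the snippet below simulates a usual-intake distribution, applies a hypothetical fortification program with partial coverage, and reports the share of the population below an EAR and above an upper level. Every number here is an assumption for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical usual-intake distribution for one micronutrient (ug/d).
baseline_intake = rng.lognormal(mean=np.log(90), sigma=0.45, size=100_000)

EAR, UPPER_LEVEL = 95.0, 600.0     # illustrative thresholds only
fortification_dose = 60.0          # ug/d added via a fortified staple (assumed)
coverage = 0.8                     # fraction of the population reached (assumed)

reached = rng.random(baseline_intake.size) < coverage
intake_with_program = baseline_intake + np.where(reached, fortification_dose, 0.0)

def summarize(intakes: np.ndarray, label: str) -> None:
    below = (intakes < EAR).mean() * 100
    above = (intakes > UPPER_LEVEL).mean() * 100
    print(f"{label}: {below:.1f}% below EAR, {above:.1f}% above the upper level")

summarize(baseline_intake, "Before fortification")
summarize(intake_with_program, "After fortification")
```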
Assessing Human Iron Kinetics Using Stable Iron Isotopic Techniques
Stable iron isotope techniques are critical for developing strategies to combat iron deficiency anemia, a leading cause of global disability. There are four primary stable iron isotope methods to assess ferrokinetics in humans. (i) The fecal recovery method applies the principles of a metabolic balance study but offers enhanced accuracy because the amount of iron isotope present in feces can be directly traced back to the labeled dose, distinguishing it from endogenous iron lost in stool from shed intestinal cells. (ii) In the plasma isotope appearance method, plasma samples are collected for several hours after oral dosing to evaluate the rate, quantity, and pattern of iron absorption. Key metrics include the time of peak isotope concentration and the area under the curve. (iii) The erythrocyte iron incorporation method measures iron bioavailability (absorption and erythrocyte iron utilization) from a whole blood sample collected 2 weeks after oral dosing. Simultaneous administration of oral and intravenous tracers allows for separate measurements of iron absorption and iron utilization. These three methods determine iron absorption by measuring tracer concentrations in feces, serum, or erythrocytes after administration of a tracer. In contrast, (iv) in iron isotope dilution, an innovative approach, iron of natural composition acts as the tracer, diluting an ad hoc modified isotopic signature obtained via prior isotope administration and equilibration with body iron. This technique enables highly accurate long-term studies of iron absorption, loss, and gain. This review discusses the application of these kinetic methods and their potential to address important questions in hematology and iron biology.
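A minimal sketch of the arithmetic behind method (iii): fractional absorption is estimated from the excess tracer fraction measured in erythrocytes about two weeks after dosing, with circulating iron derived from blood volume and haemoglobin and an assumed 80% incorporation of absorbed iron into red cells. The subject values are hypothetical.

```python
def fractional_absorption(excess_isotope_fraction: float,
                          blood_volume_l: float,
                          hb_g_per_l: float,
                          dose_mg: float,
                          incorporation_fraction: float = 0.8) -> float:
    """Estimate fractional iron absorption from erythrocyte tracer enrichment
    ~14 d after an oral dose. Circulating iron is blood volume x Hb x 3.47 mg Fe/g Hb;
    80% incorporation of absorbed iron into erythrocytes is the usual assumption."""
    circulating_fe_mg = blood_volume_l * hb_g_per_l * 3.47
    tracer_in_erythrocytes_mg = excess_isotope_fraction * circulating_fe_mg
    return tracer_in_erythrocytes_mg / (dose_mg * incorporation_fraction)

# Hypothetical subject: 4.2 L blood volume, Hb 130 g/L, 5 mg 57Fe oral dose,
# measured excess 57Fe fraction of 1.1e-4 in circulating iron -> ~5% absorption.
print(f"{fractional_absorption(1.1e-4, 4.2, 130, 5.0):.1%}")
```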
Mild riboflavin deficiency is highly prevalent in school-age children but does not increase risk for anaemia in Côte d'Ivoire
There are few data on the prevalence of riboflavin deficiency in sub-Saharan Africa, and it remains unclear whether riboflavin status influences the risk for anaemia. The aims of this study were to: (1) measure the prevalence of riboflavin deficiency in children in south-central Côte d'Ivoire; (2) estimate the riboflavin content of the local diet; and (3) determine if riboflavin deficiency predicts anaemia and/or iron deficiency. In 5- to 15-year-old children (n 281), height, weight, haemoglobin (Hb), whole blood zinc protoporphyrin (ZPP), erythrocyte glutathione reductase activity coefficient (EGRAC), serum retinol, C-reactive protein (CRP) and prevalence of Plasmodium spp. (asymptomatic malaria) and Schistosoma haematobium (bilharziosis) infections were measured. Three-day weighed food records were kept in twenty-four households. Prevalence of anaemia in the sample was 52%; 59% were iron-deficient based on an elevated ZPP concentration, and 36% suffered from iron deficiency anaemia. Plasmodium parasitaemia was found in 49% of the children. Nineteen percent of the children were infected with S. haematobium. Median riboflavin intake in 5- to 15-year-old children from the food records was 0·42 mg/d, ~47% of the estimated average requirement for this age group. Prevalence of riboflavin deficiency was 65%, as defined by an EGRAC value >1·2. Age, elevated CRP and iron deficiency were significant predictors of Hb. Riboflavin-deficient children free of malaria were more likely to be iron deficient (odds ratio 3·07; 95% CI 1·12, 8·41). In conclusion, nearly two-thirds of school-age children in south-central Côte d'Ivoire are mildly riboflavin deficient. Riboflavin deficiency did not predict Hb and/or anaemia, but did predict iron deficiency among children free of malaria.
Low iron availability in continuous in vitro colonic fermentations induces strong dysbiosis of the child gut microbial consortium and a decrease in main metabolites
Iron (Fe) deficiency affects an estimated 2 billion people worldwide, and Fe supplements are a common corrective strategy. The impact of Fe deficiency and Fe supplementation on the complex microbial community of the child gut was studied using in vitro colonic fermentation models inoculated with immobilized fecal microbiota. Chyme media ranging from Fe deficiency (all Fe chelated by 2,2′-dipyridyl) to Fe supplementation (26.5 mg Fe L−1) were continuously fermented. Fermentation effluent samples were analyzed daily for microbial composition and metabolites by quantitative PCR, 16S rRNA gene 454-pyrosequencing, and HPLC. Low Fe conditions (1.56 mg Fe L−1) significantly decreased acetate concentrations, and subsequent Fe supplementation (26.5 mg Fe L−1) restored acetate production. High Fe following normal Fe conditions had no impact on the gut microbiota composition and metabolic activity. During very low Fe conditions (0.9 mg Fe L−1 or Fe chelated by 2,2′-dipyridyl), a decrease in Roseburia spp./Eubacterium rectale, Clostridium Cluster IV members and Bacteroides spp. was observed, while Lactobacillus spp. and Enterobacteriaceae increased, consistent with a decrease in butyrate (−84%) and propionate (−55%). The strong dysbiosis of the gut microbiota, together with the decrease in main gut microbiota metabolites observed under very low Fe conditions, could weaken the barrier effect of the microbiota and negatively impact gut health.
Iron deficiency up-regulates iron absorption from ferrous sulphate but not ferric pyrophosphate and consequently food fortification with ferrous sulphate has relatively greater efficacy in iron-deficient individuals
Fe absorption from water-soluble forms of Fe is inversely proportional to Fe status in humans. Whether this is true for poorly soluble Fe compounds is uncertain. Our objectives were therefore (1) to compare the up-regulation of Fe absorption at low Fe status from ferrous sulphate (FS) and ferric pyrophosphate (FPP) and (2) to compare the efficacy of FS with FPP in a fortification trial to increase body Fe stores in Fe-deficient children v. Fe-sufficient children. Using stable isotopes in test meals in young women (n 49) selected for low and high Fe status, we compared the absorption of FPP with FS. We analysed data from previous efficacy trials in children (n 258) to determine whether Fe status at baseline predicted response to FS v. FPP as salt fortificants. Plasma ferritin was a strong negative predictor of Fe bioavailability from FS (P<0·0001) but not from FPP. In the efficacy trials, body Fe at baseline was a negative predictor of the change in body Fe for both FPP and FS, but the effect was significantly greater with FS (P<0·01). Because Fe deficiency up-regulates Fe absorption from FS but not from FPP, food fortification with FS may have relatively greater impact in Fe-deficient children. Thus, more soluble Fe compounds not only demonstrate better overall absorption and can be used at lower fortification levels, but they also have the added advantage that, because their absorption is up-regulated in Fe deficiency, they innately ‘target' Fe-deficient individuals in a population.