
    Methods to assess iron and iodine status

    Four methods are recommended for assessment of iodine nutrition: urinary iodine concentration, the goitre rate, and blood concentrations of thyroid stimulating hormone and thyroglobulin. These indicators are complementary, in that urinary iodine is a sensitive indicator of recent iodine intake (days) and thyroglobulin shows an intermediate response (weeks to months), whereas changes in the goitre rate reflect long-term iodine nutrition (months to years). Spot urinary iodine concentrations are highly variable from day to day and should not be used to classify the iodine status of individuals. International reference criteria for thyroid volume in children have recently been published and can be used to identify even small goitres by thyroid ultrasound. The recent development of a dried blood spot thyroglobulin assay makes sample collection practical even in remote areas. Thyroid stimulating hormone is a useful indicator of iodine nutrition in the newborn, but not in other age groups. For assessing iron status, haemoglobin measurement alone has low specificity and sensitivity. Serum ferritin remains the best indicator of iron stores in the absence of inflammation. Measures of iron-deficient erythropoiesis include transferrin iron saturation and erythrocyte zinc protoporphyrin, but these often do not distinguish anaemia due to iron deficiency from the anaemia of chronic disease. The serum transferrin receptor is useful in this setting, but the assay requires standardization. In the absence of inflammation, a sensitive approach to assessing iron status is to combine serum ferritin as a measure of iron stores with the serum transferrin receptor as a measure of tissue iron deficiency.
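Because spot urinary iodine is too variable to classify individuals, classification is done at the population level using the median. A minimal sketch of that logic follows; the numeric bands are the widely cited WHO epidemiological criteria for school-age children and are assumptions added here, not values stated in the abstract:

```python
from statistics import median

def classify_population_iodine(spot_uic_ug_per_l):
    """Classify POPULATION iodine status from the median spot urinary
    iodine concentration (µg/L). Individual spot values vary too much
    day to day, so only the group median is interpreted.
    Cutoffs follow commonly cited WHO epidemiological criteria for
    school-age children (an assumption; check current WHO guidance)."""
    m = median(spot_uic_ug_per_l)
    if m < 20:
        return "severe deficiency"
    if m < 50:
        return "moderate deficiency"
    if m < 100:
        return "mild deficiency"
    if m < 200:
        return "adequate"
    if m < 300:
        return "above requirements"
    return "excessive"
```

Note that the function deliberately takes the whole sample of spot measurements rather than a single value, mirroring the abstract's caution against individual classification.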

    Assessment of Regression Models for Adjustment of Iron Status Biomarkers for Inflammation in Children with Moderate Acute Malnutrition in Burkina Faso.

    BACKGROUND: Biomarkers of iron status are affected by inflammation. In order to interpret them in individuals with inflammation, the use of correction factors (CFs) has been proposed. OBJECTIVE: The objective of this study was to investigate the use of regression models as an alternative to the CF approach. METHODS: Morbidity data were collected during clinical examinations with morbidity recalls in a cross-sectional study in children aged 6-23 mo with moderate acute malnutrition. C-reactive protein (CRP), α1-acid glycoprotein (AGP), serum ferritin (SF), and soluble transferrin receptor (sTfR) were measured in serum. Generalized additive, quadratic, and linear models were used to model the relation between SF and sTfR as outcomes and CRP and AGP as categorical variables (model 1; equivalent to the CF approach), CRP and AGP as continuous variables (model 2), or CRP and AGP as continuous variables and morbidity covariates (model 3) as predictors. The predictive performance of the models was compared with the use of 10-fold cross-validation and quantified with the use of root mean square errors (RMSEs). SF and sTfR were adjusted with the use of regression coefficients from linear models. RESULTS: Cross-validation revealed no advantage to using generalized additive or quadratic models over linear models in terms of the RMSE. Linear model 3 performed better than models 2 and 1. Furthermore, we found no difference in CFs for adjusting SF and those from a previous meta-analysis. Adjustment of SF and sTfR with the use of the best-performing model led to a 17% point increase and <1% point decrease, respectively, in estimated prevalence of iron deficiency. CONCLUSION: Regression analysis is an alternative to adjust SF and may be preferable in research settings, because it can take morbidity and severity of inflammation into account. In clinical settings, the CF approach may be more practical. There is no benefit from adjusting sTfR. This trial was registered at www.controlled-trials.com as ISRCTN42569496.
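The regression-adjustment idea can be illustrated with a minimal sketch. This is not the study's model (which used CRP and AGP jointly, plus morbidity covariates); it simplifies to a single inflammation marker (CRP) and a hand-rolled least-squares fit on the log scale, shifting each ferritin value to a common reference CRP:

```python
import math

def fit_simple_ols(x, y):
    """Ordinary least squares for y = a + b*x with one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def adjust_ferritin(sf_ug_l, crp_mg_l, crp_ref_mg_l):
    """Regression-adjust serum ferritin for inflammation (sketch).
    Fits log(SF) ~ log(CRP), then removes the estimated CRP effect by
    shifting every observation to a reference CRP (e.g. a low
    percentile of the sample). Returns adjusted ferritin in µg/L."""
    log_sf = [math.log(v) for v in sf_ug_l]
    log_crp = [math.log(v) for v in crp_mg_l]
    _, b = fit_simple_ols(log_crp, log_sf)
    if b <= 0:
        # No positive inflammation effect estimated: leave values as-is.
        return list(sf_ug_l)
    return [math.exp(ls - b * (lc - math.log(crp_ref_mg_l)))
            for ls, lc in zip(log_sf, log_crp)]
```

In contrast, the CF approach (model 1) would multiply ferritin by a fixed factor per inflammation category; the regression form above uses the continuous severity of inflammation, which is the advantage the abstract highlights.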

    Targeting, monitoring and effect of oral iron therapy on haemoglobin levels in older patients discharged to primary care from inpatient rehabilitation: a cohort study using routinely collected data

    Background: Oral iron is commonly prescribed to older patients with suspected or confirmed iron deficiency anaemia; however, few studies have examined the effectiveness of oral iron therapy in the real world in this population. We therefore determined the prevalence of iron deficiency in older people prescribed oral iron, examined the response mounted to therapy and ascertained predictors of response to oral iron. Methods: We analysed a routinely collected, linked dataset from older patients who had undergone inpatient rehabilitation between 1999 and 2011. An initial analysis examined patients within this cohort who were prescribed iron after rehabilitation and derived three groups based upon their ferritin and transferrin indices: probably, possibly and not iron deficient. A second analysis compared pre- and post-treatment haemoglobin to determine the degree of response to iron therapy across each category of deficiency. Finally, patient demographics, linked biochemistry data and comorbid disease based on International Statistical Classification of Disease (ICD-10) codes from previous hospital admissions were used in regression modelling to evaluate factors affecting response to therapy. Results: 490 patients were prescribed oral iron within 90 days of rehabilitation discharge. 413/490 (84%) had iron indices performed; 94 (23%) were possibly deficient, 224 (54%) were probably deficient, and 95 (23%) were not deficient. 360/490 patients had both pre- and post-treatment haemoglobin data and iron indices; probably deficient patients mounted a slightly greater response to oral iron (17 g/L vs 12 g/L for not deficient; p < 0.05). Only pre-treatment haemoglobin, mean cell volume (MCV) and lower gastrointestinal pathology were significant predictors of a response to oral iron therapy. Notably, acid-suppressant use was not a predictor of response. Conclusion: We conclude that many older patients are exposed to oral iron without good evidence of either iron deficiency or a significant response to therapy.

    Peri-operative treatment of anaemia in major orthopaedic surgery: a practical approach from Spain

    In patients undergoing major orthopaedic surgery, pre-operative anaemia, peri-operative bleeding and a liberal transfusion policy are the main risk factors for requiring red blood cell transfusion (RBCT). The clinical and economic disadvantages of RBCT have led to the development and implementation of multidisciplinary, multimodal, individualised strategies, collectively termed patient blood management, which aim to reduce RBCT and improve patients' clinical outcome and safety. Within a patient blood management programme, low pre-operative haemoglobin is one of the few modifiable risk factors for RBCT. However, a survey among Anaesthesia Departments in Spain revealed that, although pre-operative assessment was performed in the vast majority of hospitals, optimisation of haemoglobin concentration was attempted in <40% of patients who may have benefitted from it, despite there being enough time prior to surgery. This indicates that haemoglobin optimisation takes planning and forethought to be implemented in an effective manner. This review, based on available clinical evidence and our experience, is intended to provide clinicians with a practical tool to optimise pre-operative haemoglobin levels, in order to minimise the risk of patients requiring RBCT. To this end, after reviewing the diagnostic value and limitations of available laboratory parameters, we developed an algorithm for the detection, classification and treatment of pre-operative anaemia, with a patient-tailored approach that facilitates decision-making in the pre-operative assessment. We also reviewed the efficacy of the different pharmacological options for pre-operative and post-operative management of anaemia. We consider that such an institutional pathway for anaemia management could be a viable, cost-effective strategy that is beneficial to both patients and healthcare systems.
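A detection-and-classification algorithm of the kind described can be sketched as a simple decision function. This is not the review's actual algorithm; the function name and every threshold below (WHO haemoglobin cutoffs, a ferritin limit for absolute iron deficiency, a transferrin saturation limit) are illustrative assumptions:

```python
def classify_preop_anaemia(hb_g_l, ferritin_ug_l, tsat_pct, crp_mg_l, male):
    """Hypothetical first-pass triage of pre-operative anaemia.
    Thresholds are assumptions for illustration, not the review's:
    - anaemia: Hb < 130 g/L (men) or < 120 g/L (women), WHO-style
    - absolute iron deficiency: ferritin < 30 µg/L or TSAT < 20%
    - possible functional deficiency: inflammation with ferritin < 100 µg/L
    """
    anaemic = hb_g_l < (130 if male else 120)
    if not anaemic:
        return "no anaemia"
    if ferritin_ug_l < 30 or tsat_pct < 20:
        return "iron deficiency: consider iron therapy"
    if crp_mg_l > 5 and ferritin_ug_l < 100:
        return "possible iron deficiency with inflammation"
    return "further work-up (renal, nutritional, inflammatory causes)"
```

The point of such a pathway, as the review argues, is that classification happens early enough before surgery for treatment (oral or intravenous iron, or other measures) to take effect.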

    Detection, evaluation, and management of preoperative anaemia in the elective orthopaedic surgical patient: NATA guidelines

    Previously undiagnosed anaemia is common in elective orthopaedic surgical patients and is associated with increased likelihood of blood transfusion and increased perioperative morbidity and mortality. A standardized approach for the detection, evaluation, and management of anaemia in this setting has been identified as an unmet medical need. A multidisciplinary panel of physicians was convened by the Network for Advancement of Transfusion Alternatives (NATA) with the aim of developing practice guidelines for the detection, evaluation, and management of preoperative anaemia in elective orthopaedic surgery. A systematic literature review and critical evaluation of the evidence was performed, and recommendations were formulated according to the method proposed by the Grades of Recommendation Assessment, Development and Evaluation (GRADE) Working Group. We recommend that elective orthopaedic surgical patients have a haemoglobin (Hb) level determination 28 days before the scheduled surgical procedure if possible (Grade 1C). We suggest that the patient's target Hb before elective surgery be within the normal range, according to the World Health Organization criteria (Grade 2C). We recommend further laboratory testing to evaluate anaemia for nutritional deficiencies, chronic renal insufficiency, and/or chronic inflammatory disease (Grade 1C). We recommend that nutritional deficiencies be treated (Grade 1C). We suggest that erythropoiesis-stimulating agents be used for anaemic patients in whom nutritional deficiencies have been ruled out, corrected, or both (Grade 2A). Anaemia should be viewed as a serious and treatable medical condition, rather than simply an abnormal laboratory value. Implementation of anaemia management in the elective orthopaedic surgery setting will improve patient outcome.

    Serum Hepcidin concentrations decline during pregnancy and may identify iron deficiency: Analysis of a longitudinal pregnancy cohort in The Gambia.

    Background Antenatal anemia is a risk factor for adverse maternal and fetal outcomes and is prevalent in sub-Saharan Africa. Less than half of antenatal anemia is considered responsive to iron; identifying women in need of iron may help target interventions. Iron absorption is governed by the iron-regulatory hormone hepcidin. Objective We sought to characterize changes in hepcidin and its associations with indexes of iron stores, erythropoiesis, and inflammation at weeks 14, 20, and 30 of gestation and to assess hepcidin's diagnostic potential as an index of iron deficiency. Methods We measured hemoglobin and serum hepcidin, ferritin, soluble transferrin receptor (sTfR), and C-reactive protein (CRP) at 14, 20, and 30 wk of gestation in a cohort of 395 Gambian women recruited to a randomized controlled trial. Associations with hepcidin were measured by using linear regression, and hepcidin's diagnostic test accuracy [area under the receiver operating characteristic curve (AUCROC), sensitivity, specificity, cutoffs] for iron deficiency at each time point was analyzed. Results The prevalence of anemia increased from 34.6% at 14 wk of gestation to 50.0% at 20 wk. Hepcidin concentrations declined between study enrollment and 20 wk, whereas ferritin declined between 20 and 30 wk of gestation. The variations in hepcidin explained by ferritin, sTfR, and CRP declined over pregnancy. The AUCROC values for hepcidin to detect iron deficiency (defined as ferritin <15 µg/L) were 0.86, 0.83, and 0.84 at 14, 20, and 30 wk, respectively. Hepcidin was superior to hemoglobin and sTfR as an indicator of iron deficiency. Conclusions In Gambian pregnant women, hepcidin appears to be a useful diagnostic test for iron deficiency and may enable the identification of cases for whom iron would be beneficial. Hepcidin suppression in the second trimester suggests a window for optimal timing for antenatal iron interventions. Hemoglobin does not effectively identify iron deficiency in pregnancy. This trial was registered at www.isrctn.com as ISRCTN49285450.
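The AUCROC values reported here are equivalent to a rank statistic that needs no parametric assumptions. A minimal sketch (not the study's analysis code) computes it directly as the Mann-Whitney probability; because hepcidin falls with iron deficiency, a case is counted as correctly ranked when it scores lower than a non-case:

```python
def auc_mann_whitney(scores_cases, scores_controls):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen case (here, iron-deficient) has a LOWER marker
    value than a randomly chosen control, since low hepcidin signals
    deficiency. Ties count as half. O(n*m); fine for cohort sizes."""
    wins = 0.0
    for c in scores_cases:
        for k in scores_controls:
            if c < k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(scores_cases) * len(scores_controls))
```

An AUC of 0.5 would mean hepcidin carries no information about iron deficiency; the reported 0.83-0.86 indicates useful discrimination at every trimester sampled.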

    Iron deficiency anaemia revisited

    Iron deficiency anaemia is a global health concern affecting children, women and the elderly, whilst also being a common comorbidity in multiple medical conditions. The aetiology is variable and attributed to several risk factors decreasing iron intake and absorption or increasing demand and loss, with multiple aetiologies often coexisting in an individual patient. Although presenting symptoms may be nonspecific, there is emerging evidence on the detrimental effects of iron deficiency anaemia on clinical outcomes across several medical conditions. Increased awareness about the consequences and prevalence of iron deficiency anaemia can aid early detection and management. Diagnosis can be easily made by measurement of haemoglobin and serum ferritin levels, whilst in chronic inflammatory conditions, diagnosis may be more challenging and necessitates consideration of higher serum ferritin thresholds and evaluation of transferrin saturation. Oral and intravenous formulations of iron supplementation are available, and several patient and disease-related factors need to be considered before management decisions are made. This review provides recent updates and guidance on the diagnosis and management of iron deficiency anaemia in multiple clinical settings.

    Effects of Micronutrients during Pregnancy and Early Infancy on Mental and Psychomotor Development

    Spectacular progress has been made in recent decades in the global fight against deficiencies of iodine and vitamin A [1]. As a result, the number of people suffering from iodine deficiency has been reduced from about 1.5 billion in 1990 to about 0.5 billion now, almost entirely due to the introduction in many countries of what has been termed 'universal salt iodization'. In addition, approximately one million child deaths may have been prevented between 1998 and 2000 by vitamin A supplementation [2]. The political and financial commitment that has allowed these achievements has been generated to a large extent by scientific studies that have shown the extent of human suffering caused by these deficiencies, and that have determined the potential health gains of interventions. Progress in eliminating deficiencies of other micronutrients, notably iron, has been much slower. About two billion people, or about one third of the human population, continue to suffer from iron deficiency. Iron supplementation programs have been advocated for infants and preschool children, largely because of concerns about possible adverse effects of iron deficiency on mental and motor development. Similar concerns were instrumental in establishing salt iodization programs. The questions that will be addressed in this chapter concern the extent to which a shortage of iodine and iron during fetal and infant development impairs mental development, and the extent to which this impairment can be redressed by increasing the intake of these micronutrients. First, the stages of brain development in the fetus and infant will be addressed, followed by an assessment of the timing of vulnerable periods when the brain of the fetus and infant is at high risk from an inadequate supply of iodine or iron. Where possible, the mechanisms involved will be discussed. Then, observational and intervention studies will be reviewed that have examined the effect of deficiencies of iodine or iron on mental development. Approximately half of the world's population may be at risk of low zinc intake [3]. Given this high prevalence, inconclusive but mounting evidence that zinc deficiency during pregnancy may impair the infant's neurobehavioral development and immune function should also raise concern [4–10]. However, because of space limitations, such effects and those of other micronutrients [11] will not be reviewed in the present report.

    Markers of inflammation and mortality in a cohort of patients with alcohol dependence

    Inflammation and intestinal permeability are believed to be paramount features in the development of alcohol-related liver damage. We aimed to assess the impact of 3 surrogate markers of inflammation (anemia, fibrinogen, and ferritin levels) on mid-term mortality of patients with alcohol dependence. This longitudinal study included patients with alcohol dependence admitted for hospital detoxification between 2000 and 2010. Mortality was ascertained from clinical charts and the mortality register. Associations between markers of inflammation and all-cause mortality were analyzed with mortality rates and Cox proportional hazards regression models. We also performed a subgroup analysis of mortality rates in patients with anemia, based on their mean corpuscular volume (MCV). We included 909 consecutive patients with alcohol dependence. Patients were mostly male (80.3%), had a median age of 44 years (interquartile range [IQR]: 38-50), and upon admission, their median alcohol consumption was 192 g/day (IQR: 120-265). At admission, 182 (20.5%) patients had anemia; 210 (25.9%) had fibrinogen levels >4.5 mg/dL; and 365 (49.5%) had ferritin levels >200 ng/mL. At the end of follow-up (median 3.8 years [IQR: 1.8-6.5], and a total of 3861.07 person-years), 118 patients had died (12.9% of the study population). Cox regression models showed that the presence of anemia at baseline was associated with mortality (hazard ratio [HR]: 1.67, 95% confidence interval [CI]: 1.11-2.52, P < 0.01); no associations were found between mortality and high fibrinogen or high ferritin levels. A subgroup of patients with anemia was analyzed and compared to a control group of patients without anemia and a normal MCV. The mortality ratios of patients with normocytic and macrocytic anemia were 3.25 (95% CI: 1.41-7.26; P < 0.01) and 3.39 (95% CI: 1.86-6.43; P < 0.01), respectively. Patients with alcohol dependence admitted for detoxification had an increased risk of death when anemia was present at admission. More accurate markers of systemic inflammation are needed to serve as prognostic factors for poor outcomes in this subset of patients.

    Iron deficiency anaemia in mothers and infants with high inflammatory burden: Prevalence and profile in a South African birth cohort

    The scarcity of epidemiological data on anaemia in low- and middle-income countries, coupled with contrasting approaches to the assessment of iron status with inflammation, represent critical research gaps. This study characterised the prevalence and profile of iron deficiency anaemia, including adjustment for inflammation, in mothers and infants from South Africa. Mother-child dyads (n = 394) were recruited (2021-2022) for the Khula birth cohort in Cape Town. Haematological metrics, iron metrics, and inflammatory biomarkers were obtained from mothers antenatally and 3-6 months postnatally, and infants 3-18 months postnatally. The extent to which inflammation impacted iron deficiency was assessed using two methods. Method A: higher serum ferritin thresholds for classifying iron status in participants with inflammation (World Health Organisation); Method B: Biomarkers Reflecting Inflammation and Nutritional Determinants of Anaemia (BRINDA) regression, which corrects serum ferritin based on inflammatory biomarker concentrations. Prevalence of maternal anaemia was 34.74% (107/308) in pregnancy and 22.50% (54/240) in mothers at 3-6 months after childbirth. Of their infants, 46.82% (125/267) and 48.10% (136/283) were anaemic by 6-12 months and 12-18 months, respectively. Using Method A, the prevalence of maternal iron deficiency (regardless of anaemia) increased from 18.35% (20/109) to 55.04% (60/109) in pregnancy, and from 11.97% (28/234) to 46.58% (109/234) postnatally. Similarly, using Method B, maternal iron deficiency prevalence increased to 38.53% (42/109) in pregnancy, and 25.21% (59/234) postnatally. In infants at 12-18 months, the prevalence of iron deficiency increased from 19.79% (19/96) to 31.25% (30/96) and 32.29% (31/96) using Methods A and B, respectively. Approximately half of anaemia cases in mothers antenatally (50%; 20/40) and postnatally (45.10%; 23/51), and infants at 12-18 months (55.56%; 10/18), were attributable to iron deficiency. This is one of the first studies reporting the extent to which iron deficiency anaemia may be underestimated if inflammation is unaccounted for in South African mothers and infants.
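Method A, the threshold-switching approach, is simple enough to sketch directly. The cutoffs below (ferritin <15 µg/L without inflammation, <70 µg/L with; inflammation flagged by CRP >5 mg/L or AGP >1 g/L) follow WHO-style guidance but are assumptions here, not the exact values used in this cohort:

```python
def iron_deficient_method_a(ferritin_ug_l, crp_mg_l, agp_g_l):
    """Method A sketch: raise the serum ferritin cutoff for iron
    deficiency when inflammation is present, rather than adjusting
    the ferritin value itself. Cutoffs and inflammation definitions
    are illustrative assumptions (check current WHO guidance)."""
    inflamed = crp_mg_l > 5 or agp_g_l > 1
    cutoff = 70 if inflamed else 15
    return ferritin_ug_l < cutoff
```

Method B (BRINDA) instead regresses log-ferritin on the inflammatory biomarkers and subtracts the estimated inflammation effect from each value before applying a single cutoff; the two methods gave similar, and similarly large, upward revisions of iron deficiency prevalence in this cohort.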