709 research outputs found
Greetings from...???
In each edition, a study report is published by a B.I.L. member who is studying or has studied abroad for a while. This time it is the turn of third-year student Rianne Beguin. She spent the first semester studying in London
Red blood cell precursor mass as an independent determinant of serum erythropoietin level.
Serum erythropoietin (sEpo) concentration is primarily related to the rate of renal production and, under the stimulus of hypoxia, increases exponentially as hemoglobin (Hb) decreases. Additional factors, however, appear to influence sEpo, and in this work, we performed studies to evaluate the role of the red blood cell precursor mass. We first compared the relationship of sEpo with Hb in patients with low versus high erythroid activity. The first group included 27 patients with erythroid aplasia or hypoplasia and low erythroid activity; the second comprised patients with high erythroid activity, having serum transferrin receptor (sTfR) levels > 10 mg/L (erythroid activity > 2 times normal). There was no difference between the two groups with respect to Hb (8.3 +/- 1.6 v 8.0 +/- 1.3 g/dL, P > .05), but sEpo levels were notably higher in patients with low erythroid activity (1,601 +/- 1,542 v 235 +/- 143 mU/mL, P < .001). In fact, multivariate analysis of variance (ANOVA) showed that, at any given Hb level, sEpo was higher in patients with low erythroid activity (P < .0001). Twenty patients undergoing allogeneic or autologous bone marrow transplantation (BMT) were then investigated. A marked increase in sEpo was seen in all cases at the time of marrow aplasia, disproportionately high when compared with the small decrease in Hb level. Sequential studies were also performed in five patients with iron deficiency anemia undergoing intravenous (IV) iron therapy. Within 24 to 72 hours after starting iron treatment, marked decreases in sEpo (up to one log magnitude) were found before any change in Hb level. Similar observations were made in patients with megaloblastic anemia and in a case of pure red blood cell aplasia. These findings point to an inverse relationship between red blood cell precursor mass and sEpo: at any given Hb level, the higher the number of red blood cell precursors, the lower the sEpo concentration. The most likely explanation for this is that sEpo levels are regulated not only by the rate of renal production, but also by the rate of utilization by erythroid cells.
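As a rough illustration of the log-linear (exponential) sEpo-Hb relationship described above, the sketch below uses placeholder coefficients, not values from this study, to show how a fall in Hb translates into a multiplicative rise in predicted sEpo.

```python
# Hedged illustration only: sEpo rises exponentially as Hb falls, i.e.
# log10(sEpo) is roughly linear in Hb. Coefficients a and b are placeholders,
# not estimates from this study.

def predicted_sepo(hb_g_dl: float, a: float = 4.7, b: float = 0.3) -> float:
    """Placeholder log-linear model: log10(sEpo, mU/mL) = a - b * Hb (g/dL)."""
    return 10 ** (a - b * hb_g_dl)

for hb in (12.0, 10.0, 8.0):
    # with b = 0.3, each 2 g/dL drop in Hb multiplies predicted sEpo by ~4
    print(hb, round(predicted_sepo(hb), 1))
```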
Defective iron supply for erythropoiesis and adequate endogenous erythropoietin production in the anemia associated with systemic-onset juvenile chronic arthritis.
Systemic-onset juvenile chronic arthritis (SoJCA) is associated with high levels of circulating interleukin-6 (IL-6) and is frequently complicated by severe microcytic anemia whose pathogenesis is unclear. Therefore, we studied 20 consecutive SoJCA patients with hemoglobin (Hb) levels <12 g/dL, evaluating erythroid progenitor proliferation, endogenous erythropoietin production, body iron status, and iron supply for erythropoiesis. Hb concentrations ranged from 6.5 to 11.9 g/dL. Hb level was directly related to mean corpuscular volume (r = .82, P < .001) and inversely related to circulating transferrin receptor (r = -.81, P < .001), suggesting that the severity of anemia was directly proportional to the degree of iron-deficient erythropoiesis. Serum ferritin ranged from 18 to 1,660 microgram/L and was unrelated to Hb level. Bone marrow iron stores were markedly reduced in the three children investigated, and they also showed increased serum transferrin receptor and normal-to-high serum ferritin. All 20 patients had elevated IL-6 levels and normal in vitro growth of erythroid progenitors. Endogenous erythropoietin (epo) production was appropriate for the degree of anemia as judged by both the observed to predicted log (serum epo) ratio (0.95 +/- 0.12) and a comparison of the serum epo-Hb regression found in these subjects with that of thalassemia patients. Multiple regression analysis showed that serum transferrin receptor was the parameter most closely related to hemoglobin concentration: variation in circulating transferrin receptor explained 61% of the variation in Hb level (P < .001). In 10 severely anemic patients, amelioration of anemia following intravenous iron administration resulted in normalization of serum transferrin receptor. Defective iron supply to the erythron rather than blunted epo production is the major cause of the microcytic anemia associated with SoJCA. A true body-iron deficiency caused by decreased iron absorption likely complicates long-lasting inflammation in the most anemic children, and this can be recognized by high serum transferrin receptor levels. Although oral iron is of no benefit, intravenous iron saccharate is a safe and effective means for improving iron availability for erythropoiesis and correcting this anemia. Thus, while chronically high endogenous IL-6 levels do not appear to blunt epo production, they are probably responsible for the observed abnormalities in iron metabolism. Anemia of chronic disease encompasses a variety of anemic conditions whose peculiar features may specifically correlate with the type of cytokine(s) predominantly released.
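The observed-to-predicted log(serum epo) ratio used above can be illustrated with the minimal sketch below; the reference regression coefficients are placeholders, since the study's own reference equation is not given in this abstract.

```python
# Hedged sketch of an observed-to-predicted (O/P) log(serum epo) ratio:
# observed log10(sEpo) divided by the value predicted for the same Hb from a
# reference regression in uncomplicated anemia. The reference coefficients
# below are placeholders, not those used in the study.
import math

def op_ratio(observed_sepo_mu_ml: float, hb_g_dl: float,
             ref_intercept: float = 4.7, ref_slope: float = -0.3) -> float:
    """A ratio close to 1 indicates epo output adequate for the degree of anemia."""
    predicted_log_sepo = ref_intercept + ref_slope * hb_g_dl
    return math.log10(observed_sepo_mu_ml) / predicted_log_sepo

print(round(op_ratio(80.0, 9.0), 2))  # ~0.95 with these placeholder inputs
```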
Feedback on the development of a small scale Contact Erosion Test in the laboratory (characteristic size ~ 30 cm)
To determine the hydraulic load required to initiate the contact erosion process, tests are performed with an apparatus called the “Contact Erosion Test”. This device originated from research carried out by Grenoble University, Électricité de France and Compagnie Nationale du Rhône at a scale of ~60 cm. It has been adapted to a smaller scale in the geophyConsult laboratory to conduct tests on samples extracted from core drillings. The instrumentation was improved to enable better control of the hydraulic loading and to avoid biases. The test protocol was modified, in particular to better constrain the soil density at the interface. From the first series of tests, we drew conclusions on the repeatability of the test and on the influence of soil-state parameters. Discrepancies with previous results obtained at the scale of ~60 cm were identified. A new erosion test campaign was therefore planned to confirm these differences and determine their causes
Palliative inpatients in general hospitals: a one-day observational study in Belgium
Background: Hospital care plays a major role at the end of life, but little is known about the overall size and characteristics of the palliative inpatient population. The aim of our study was to analyse these aspects.
Methods: We conducted a one-day observational study in 14 randomly selected Belgian hospitals. Patients who met the definition of palliative patients were identified as such. Then, information about their sociodemographic characteristics, diagnoses, prognosis, and care plan was recorded and analysed.
Results: There were 2639 inpatients on the day of the study; 9.4% of them were identified as “palliative”. The mean age of the group was 72 years. The primary diagnosis was cancer in 51% of patients, and the estimated life expectancy was shorter than 3 months in 33% of patients and longer than 1 year in 28%. For most patients (73%), the professional caregivers expected the treatment to improve comfort rather than prolong life. Antibiotics, transfusions, treatments specific to the pathology, and artificial nutrition were administered in 90%, 78%, 57% and 50% of the patients, respectively, but were generally given with a view to controlling symptoms.
Conclusions: This analysis presents a first national estimate of the palliative inpatient population. Our results confirm that hospitals play a major role at the end of life, with one out of ten inpatients identified as a “palliative” patient. These data also demonstrate the complexity of the palliative population and the substantial diversity of care these patients can require.
Impact of Community-Based Larviciding on the Prevalence of Malaria Infection in Dar es Salaam, Tanzania.
The use of larval source management is not prioritized by contemporary malaria control programs in sub-Saharan Africa despite historical success. Larviciding, in particular, could be effective in urban areas where transmission is focal and accessibility to Anopheles breeding habitats is generally easier than in rural settings. The objective of this study is to assess the effectiveness of a community-based microbial larviciding intervention to reduce the prevalence of malaria infection in Dar es Salaam, United Republic of Tanzania. Larviciding was implemented in 3 out of 15 targeted wards of Dar es Salaam in 2006 after two years of baseline data collection. This intervention was subsequently scaled up to 9 wards a year later, and to all 15 targeted wards in 2008. Continuous randomized cluster sampling of malaria prevalence and socio-demographic characteristics was carried out during 6 survey rounds (2004-2008), which included both cross-sectional and longitudinal data (N = 64,537). Bayesian random effects logistic regression models were used to quantify the effect of the intervention on malaria prevalence at the individual level. Effect size estimates suggest a significant protective effect of the larviciding intervention. After adjustment for confounders, the odds of individuals living in areas treated with larviciding being infected with malaria were 21% lower (Odds Ratio = 0.79; 95% Credible Intervals: 0.66-0.93) than those who lived in areas not treated. The larviciding intervention was most effective during dry seasons and had synergistic effects with other protective measures such as use of insecticide-treated bed nets and house proofing (i.e., complete ceiling or window screens). A large-scale community-based larviciding intervention significantly reduced the prevalence of malaria infection in urban Dar es Salaam
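The individual-level effect reported above (OR = 0.79) comes from a Bayesian random effects logistic regression. A minimal sketch of that kind of model, assuming a ward-level random intercept and using PyMC on simulated placeholder data rather than the study's dataset, is shown below.

```python
# Hedged, illustrative sketch (not the study's actual model): Bayesian
# random-effects logistic regression of malaria infection on larviciding
# exposure with a ward-level random intercept. All data below are simulated.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n, n_wards = 2000, 15
ward = rng.integers(0, n_wards, n)      # cluster (ward) index per person
treated = rng.integers(0, 2, n)         # 1 = lives in a larvicided area
y = rng.integers(0, 2, n)               # 1 = malaria-positive (placeholder outcome)

with pm.Model():
    alpha = pm.Normal("alpha", 0.0, 2.0)               # overall intercept
    beta = pm.Normal("beta", 0.0, 1.0)                 # log-odds effect of larviciding
    sigma_w = pm.HalfNormal("sigma_w", 1.0)            # between-ward spread
    u = pm.Normal("u", 0.0, sigma_w, shape=n_wards)    # ward random intercepts
    logit_p = alpha + beta * treated + u[ward]
    pm.Bernoulli("infected", logit_p=logit_p, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9)

# exp(beta) converts the log-odds coefficient to an odds ratio; a posterior
# OR below 1 is protective (the paper reports OR = 0.79, i.e. 21% lower odds).
beta_draws = idata.posterior["beta"].values.ravel()
print("OR:", np.exp(beta_draws.mean()),
      "95% CrI:", np.exp(np.percentile(beta_draws, [2.5, 97.5])))
```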
Management of Myelodysplastic Syndrome Relapsing after Allogeneic Hematopoietic Stem Cell Transplantation: A Study by the French Society of Bone Marrow Transplantation and Cell Therapies
To identify prognostic factors and to investigate different therapeutic approaches, we report on 147 consecutive patients who relapsed after allogeneic hematopoietic stem cell transplantation (allo-HSCT) for myelodysplastic syndrome (MDS). Sixty-two patients underwent immunotherapy (IT group, second allo-HSCT or donor lymphocyte infusion), 39 received cytoreductive treatment alone (CRT group) and 46 were managed with palliative/supportive care (PSC group). Two-year rates of overall survival (OS) were 32%, 6%, and 2% in the IT, CRT, and PSC groups, respectively (P < .001). In multivariate analysis, 4 factors adversely influenced 2-year rates of OS: history of acute graft-versus-host disease (hazard ratio [HR], 1.83; 95% confidence interval [CI], 1.26 to 2.67; P = .002), relapse within 6 months (HR, 2.69; 95% CI, .82 to 3.98; P < .001), progression to acute myeloid leukemia (HR, 2.59; 95% CI, 1.75 to 3.83; P < .001), and platelet count < 50 G/L at relapse (HR, 1.68; 95% CI, 1.15 to 2.44; P = .007). A prognostic score based on those factors discriminated 2 risk groups with median OS of 13.2 versus 2.4 months, respectively (P < .001). When propensity score, prognostic score, and treatment strategy were included in a Cox model, immunotherapy was found to be an independent factor that favorably impacts OS (HR, .40; 95% CI, .26 to .63; P < .001). In conclusion, immunotherapy should be considered when possible for MDS patients relapsing after allo-HSCT
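A minimal sketch of how the four-factor prognostic score above could be applied is given below; the exact weighting is not stated in this abstract, so an unweighted count of adverse factors is assumed purely for illustration.

```python
# Hedged sketch: one plausible way to form the 4-factor relapse prognostic
# score described above. The abstract does not give the weighting, so a simple
# count of adverse factors present is assumed here for illustration only.
from dataclasses import dataclass

@dataclass
class RelapsePresentation:
    prior_acute_gvhd: bool          # history of acute graft-versus-host disease
    relapse_within_6_months: bool   # relapse < 6 months after allo-HSCT
    progression_to_aml: bool        # progression to acute myeloid leukemia
    platelets_below_50: bool        # platelet count < 50 G/L at relapse

def prognostic_score(p: RelapsePresentation) -> int:
    """Count of adverse factors present (0-4); a higher count implies worse expected OS."""
    return sum([p.prior_acute_gvhd, p.relapse_within_6_months,
                p.progression_to_aml, p.platelets_below_50])

patient = RelapsePresentation(True, True, False, True)
print(prognostic_score(patient))  # 3 adverse factors present
```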
Subclinical iron deficiency is a strong predictor of bacterial vaginosis in early pregnancy
BACKGROUND: Bacterial vaginosis (BV) is the single most common vaginal infection in women of childbearing age and associated with a sizeable infectious disease burden among both non-pregnant and pregnant women, including a significantly elevated risk of adverse pregnancy outcome. Overall, little progress has been made in identifying causal factors involved in BV acquisition and persistence. We sought to evaluate maternal iron status in early pregnancy as a putative risk factor for BV, considering that micronutrients, and iron deficiency in particular, affect the host response against bacterial colonization, even in the setting of mild micronutrient deficiencies. METHODS: In a nested case-control study, we compared maternal iron status at entry to prenatal care (mean gestational age 9.2 ± 2.6 weeks) between eighty women with healthy vaginal microflora and eighteen women with vaginosis-like microflora. Vaginal microflora status was assessed by assigning a modified Nugent score to a Gram-stained vaginal smear. Maternal iron status was assayed by an array of conventional erythrocyte and serum indicators for iron status assessment, but also by more sensitive and more specific indicators of iron deficiency, including soluble transferrin receptors (sTfR) as an accurate measure of cellular and tissue iron deficiency and the iron deficiency log(10)[sTfR/ferritin] index as the presently most accurate measure of body storage iron available. RESULTS: We found no statistically significant correlation between vaginal microflora status and routinely assessed iron parameters. In contrast, a highly significant difference between the healthy and vaginosis-like microflora groups of women was shown in mean values of sTfR concentrations (1.15 ± 0.30 mg/L versus 1.37 ± 0.38 mg/L, p = 0.008) and in mean iron deficiency log(10)[sTfR/ferritin] index values (1.57 ± 0.30 versus 1.08 ± 0.56, p = 0.003), indicating a strong association between iron deficiency and vaginosis-like microflora. An sTfR concentration >1.45 mg/L was associated with a 3-fold increased risk (95%CI: 1.4–6.7) of vaginosis-like microflora and after controlling for maternal age, gestational length, body mass, parity, and smoking habits with an adjusted odds ratio of 4.5 (95%CI: 1.4–14.2). CONCLUSION: We conclude that subclinical iron deficiency, presumably resulting from inadequate preconceptional iron supplies, is strongly and independently associated with vaginosis-like microflora during early pregnancy
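For illustration, the sketch below shows how a crude odds ratio for the sTfR > 1.45 mg/L cut-off reported above would be computed from a 2x2 table; the counts used are placeholders, not the study's data.

```python
# Hedged sketch: crude odds ratio from a 2x2 table (exposure = sTfR > 1.45 mg/L,
# outcome = vaginosis-like microflora), with a 95% CI from the standard
# log-OR normal approximation. Counts below are placeholders only.
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Return (OR, (CI low, CI high)) for a 2x2 exposure-outcome table."""
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    se = math.sqrt(1 / exposed_cases + 1 / exposed_controls +
                   1 / unexposed_cases + 1 / unexposed_controls)
    lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
    return or_, (lo, hi)

print(odds_ratio(10, 20, 8, 60))  # placeholder counts, not the study's table
```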