456 research outputs found

    Early fluid resuscitation with hyperoncotic hydroxyethyl starch 200/0.5 (10%) in severe burn injury

    INTRODUCTION: Despite extensive experience in the management of severe burn injury, controversies remain regarding the best type of fluid resuscitation, especially during the first 24 hours after the trauma. Our study therefore addressed the question of whether hyperoncotic hydroxyethyl starch (HES) 200/0.5 (10%), administered in combination with crystalloids within the first 24 hours after injury, is as effective as 'crystalloids only' in patients with severe burn injury. METHODS: 30 consecutive patients were enrolled in this prospective, interventional, open-label study and assigned either to a traditional 'crystalloids only' or to a 'HES 200/0.5 (10%)' volume resuscitation protocol. The total amount of fluid administered, complications such as pulmonary failure, abdominal compartment syndrome, sepsis and renal failure, and overall mortality were assessed. Cox proportional hazards regression analysis was performed for binary outcomes, with adjustment for potential confounders in the multivariate regression models. For continuous outcome parameters, multiple linear regression analysis was used. RESULTS: Group differences between patients receiving crystalloids only and HES 200/0.5 (10%) were not statistically significant. However, there was a large effect towards increased overall mortality in the HES 200/0.5 (10%) group compared with the crystalloids-only group (43.8% versus 14.3%; adjusted hazard ratio 7.12; P = 0.16). Similarly, the incidence of renal failure was 25.0% in the HES 200/0.5 (10%) group versus 7.1% in the crystalloids-only group (adjusted hazard ratio 6.16; P = 0.42). CONCLUSIONS: This small study indicates that the application of hyperoncotic HES 200/0.5 (10%) within the first 24 hours after severe burn injury may be associated with fatal outcome and should therefore be used with caution.
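    The adjusted hazard ratios quoted above come from Cox proportional hazards models with confounder adjustment. The sketch below is only a hypothetical illustration of that kind of analysis, not the authors' code: the dataset is simulated, and the covariate names (group_hes, age, tbsa) and effect sizes are assumptions.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 60
group_hes = rng.integers(0, 2, n)             # 1 = HES 200/0.5 (10%) arm, 0 = crystalloids-only arm
age = rng.normal(50, 15, n)                   # hypothetical confounder
tbsa = rng.uniform(20, 70, n)                 # hypothetical % total body surface area burned

# Simulated hazard: higher with HES exposure and larger burns (assumed, for illustration only)
hazard = 0.02 * np.exp(1.0 * group_hes + 0.02 * tbsa)
time_to_death = rng.exponential(1 / hazard)
time = np.minimum(time_to_death, 28)          # administrative censoring at day 28
event = (time_to_death <= 28).astype(int)     # 1 = death observed, 0 = censored

df = pd.DataFrame({"time": time, "event": event,
                   "group_hes": group_hes, "age": age, "tbsa": tbsa})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # the exp(coef) column gives the adjusted hazard ratios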

    Estimates of child deaths prevented from malaria prevention scale-up in Africa 2001-2010

    Funding from external agencies for malaria control in Africa has increased dramatically over the past decade, resulting in substantial increases in population coverage by effective malaria prevention interventions. This unprecedented effort to scale up malaria interventions is likely improving child survival and will likely contribute to meeting Millennium Development Goal (MDG) 4, to reduce the under-5 mortality rate by two thirds between 1990 and 2015. The Lives Saved Tool (LiST) model was used to quantify the likely impact that malaria prevention intervention scale-up has had on malaria mortality over the past decade (2001-2010) across 43 malaria-endemic countries in sub-Saharan Africa. The likely impact of insecticide-treated nets (ITNs) and malaria prevention interventions in pregnancy (intermittent preventive treatment [IPTp] and ITNs used during pregnancy) over this period was assessed. The LiST model conservatively estimates that malaria prevention intervention scale-up over the past decade has prevented 842,800 (uncertainty: 562,800-1,364,645) child deaths due to malaria across 43 malaria-endemic countries in Africa, compared to a baseline of the year 2000. Over the entire decade, this represents an 8.2% decrease in the number of malaria-caused child deaths that would have occurred over this period had malaria prevention coverage remained unchanged since 2000. The biggest impact occurred in 2010, with a 24.4% decrease in malaria-caused child deaths compared to what would have happened had malaria prevention interventions not been scaled up beyond 2000 coverage levels. ITNs accounted for 99% of the lives saved. The results suggest that funding for malaria prevention in Africa over the past decade has had a substantial impact on decreasing child deaths due to malaria. Rapidly achieving and then maintaining universal coverage of these interventions should be an urgent priority for malaria control programmes in the future. Successful scale-up in many African countries will likely contribute substantially to meeting MDG 4, as well as to meeting MDG 6 (Target 1), to halt and reverse malaria incidence by 2015.
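    The 8.2% decade-wide figure ties the estimated deaths averted to a counterfactual total (deaths that would have occurred at year-2000 coverage) that the abstract does not state explicitly. A back-of-the-envelope sketch of that arithmetic follows; the counterfactual below is derived from the two reported numbers, not itself a reported result.

```python
# Implied counterfactual behind the reported 8.2% decade-wide reduction.
deaths_averted = 842_800            # LiST estimate of child malaria deaths prevented, 2001-2010
percent_decrease = 8.2 / 100        # reported reduction relative to unchanged 2000 coverage

counterfactual_deaths = deaths_averted / percent_decrease
observed_deaths = counterfactual_deaths - deaths_averted

print(f"implied counterfactual deaths: {counterfactual_deaths:,.0f}")
print(f"implied observed deaths:       {observed_deaths:,.0f}")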

    Two different hematocrit detection methods: Different methods, different results?

    BACKGROUND: Little is known about the influence of hematocrit detection methodology on transfusion triggers. The aim of the present study was therefore to compare two different hematocrit-assessing methods. In a total of 50 critically ill patients, hematocrit was analyzed using (1) a blood gas analyzer (ABLflex 800) and (2) the central laboratory method (ADVIA® 2120), and the results were compared. FINDINGS: Bland-Altman analysis for repeated measurements showed good agreement, with a bias of +1.39% and 2 SD of ±3.12%. The 24%-hematocrit group showed a correlation of r² = 0.87; with a kappa of 0.56, 22.7% of the cases would have been transfused differently. In the 28%-hematocrit group, with a similar correlation (r² = 0.8) and a kappa of 0.58, 21% of the cases would have been transfused differently. CONCLUSIONS: Despite good agreement between the two methods used to determine hematocrit in clinical routine, the calculated difference of 1.4% might substantially influence transfusion triggers, depending on the method employed.
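    The two headline statistics, the Bland-Altman bias with 2 SD limits of agreement and the kappa for discordant transfusion decisions at a given hematocrit trigger, can be computed from paired measurements as sketched below. The paired data are simulated (the study's data are not reproduced here), and the built-in +1.4% offset is an assumption chosen only to mirror the reported bias.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
hct_lab = rng.normal(27, 4, 50)                  # central laboratory hematocrit (%)
hct_bga = hct_lab + rng.normal(1.4, 1.6, 50)     # blood gas analyzer, with an assumed ~+1.4% bias

# Bland-Altman statistics: mean difference (bias) and 2 SD limits of agreement
diff = hct_bga - hct_lab
bias = diff.mean()
two_sd = 2 * diff.std(ddof=1)
print(f"bias = {bias:+.2f}%, 2 SD = +/- {two_sd:.2f}%")

# Agreement on the transfusion decision at a 24% hematocrit trigger
transfuse_lab = hct_lab < 24
transfuse_bga = hct_bga < 24
print("kappa =", round(cohen_kappa_score(transfuse_lab, transfuse_bga), 2))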

    Regulatory role of C5a in LPS-induced IL-6 production by neutrophils during sepsis


    Evidence for a functional role of the second C5a receptor C5L2

    During experimental sepsis in rodents after cecal ligation and puncture (CLP), excessive C5a is generated, leading to interactions with C5aR, loss of innate immune functions of neutrophils, and lethality. In the current study, we have analyzed the expression of the second C5a receptor C5L2, the putative "default" or nonsignaling receptor for C5a. Rat C5L2 was cloned, and an antibody was developed to C5L2 protein. After CLP, blood neutrophils showed a reduction in C5aR followed by its restoration, while C5L2 levels gradually increased, accompanied by the appearance of mRNA for C5L2. mRNA for C5L2 increased in lung and liver during CLP. Substantially increased C5L2 protein (defined by binding of 125I-anti-C5L2 IgG) occurred in lung, liver, heart, and kidney after CLP. With the use of serum IL-6 as a marker for sepsis, infusion of anti-C5aR dramatically reduced serum IL-6 levels, while anti-C5L2 caused a nearly fourfold increase in IL-6 when compared with CLP controls treated with normal IgG. When normal blood neutrophils were stimulated in vitro with LPS and C5a, the antibodies had similar effects on release of IL-6. These data provide the first evidence for a role for C5L2 in balancing the biological responses to C5a.

    Comparative susceptibility of mosquito populations in North Queensland, Australia to oral infection with dengue virus.

    Dengue is the most prevalent arthropod-borne virus, with at least 40% of the world's population at risk of infection each year. In Australia, dengue is not endemic, but viremic travelers trigger outbreaks involving hundreds of cases. We compared the susceptibility of Aedes aegypti mosquitoes from two geographically isolated populations to two strains of dengue virus serotype 2. Interestingly, we found that mosquitoes from a city with no history of dengue were more susceptible to the virus than mosquitoes from an outbreak-prone region, particularly with respect to one dengue strain. These findings suggest recent evolution of population-based differences in vector competence or different historical origins. Future genomic comparisons of these populations could reveal the genetic basis of vector competence and the relative roles of selection and stochastic processes in shaping their differences. Lastly, we report the novel finding of a correlation between midgut dengue titer and the titer in tissues colonized after dissemination.

    IR and UV Galaxies at z=0.6 -- Evolution of Dust Attenuation and Stellar Mass as Revealed by SWIRE and GALEX

    We study dust attenuation and stellar mass of $z \sim 0.6$ star-forming galaxies using new SWIRE observations in the IR and GALEX observations in the UV. Two samples are selected from the SWIRE and GALEX source catalogs in the SWIRE/GALEX field ELAIS-N1-00 ($\Omega = 0.8$ deg$^2$). The UV-selected sample has 600 galaxies with photometric redshift (hereafter photo-z) $0.5 \leq z \leq 0.7$ and NUV $\leq 23.5$ (corresponding to $L_{FUV} \geq 10^{9.6} L_\sun$). The IR-selected sample contains 430 galaxies with $f_{24\mu m} \geq 0.2$ mJy ($L_{dust} \geq 10^{10.8} L_\sun$) in the same photo-z range. It is found that the mean $L_{dust}/L_{FUV}$ ratios of the z=0.6 UV galaxies are consistent with those of their z=0 counterparts of the same $L_{FUV}$. For IR galaxies, the mean $L_{dust}/L_{FUV}$ ratios of the z=0.6 LIRGs ($L_{dust} \sim 10^{11} L_\sun$) are about a factor of 2 lower than those of local LIRGs, whereas z=0.6 ULIRGs ($L_{dust} \sim 10^{12} L_\sun$) have the same mean $L_{dust}/L_{FUV}$ ratios as their local counterparts. This is consistent with the hypothesis that the dominant component of the LIRG population has changed from large, gas-rich spirals at $z > 0.5$ to major mergers at z=0. The stellar mass of z=0.6 UV galaxies of $L_{FUV} \leq 10^{10.2} L_\sun$ is about a factor of 2 less than that of their local counterparts of the same luminosity, indicating growth of these galaxies. The mass of z=0.6 UV-luminous galaxies (UVLGs: $L_{FUV} > 10^{10.2} L_\sun$) and IR-selected galaxies, which are nearly exclusively LIRGs and ULIRGs, is the same as that of their local counterparts. Comment: 27 pages, 8 figures, to be published in the Astrophysical Journal Supplement series dedicated to GALEX results.
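    The dust-attenuation indicator used throughout the abstract is the ratio of total dust (IR) luminosity to FUV luminosity. For readers used to the infrared-excess notation, this is the same quantity in logarithmic form; the line below is purely a notational restatement, not a result of the paper.

```latex
% Infrared excess (IRX): logarithmic form of the L_dust / L_FUV ratio discussed above.
\mathrm{IRX} \equiv \log_{10}\!\left(\frac{L_{\mathrm{dust}}}{L_{\mathrm{FUV}}}\right)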

    Nearby early-type galaxies with ionized gas. The UV emission from GALEX observations

    We present GALEX far-ultraviolet (FUV, $\lambda_{eff} = 1538$ \AA) and near-ultraviolet (NUV, $\lambda_{eff} = 2316$ \AA) surface photometry of 40 early-type galaxies (ETGs) selected from a wider sample of 65 nearby ETGs showing emission lines in their optical spectra. We derive FUV and NUV surface brightness profiles, (FUV-NUV) colour profiles, and D$_{25}$ integrated magnitudes. We extend the photometric study to the optical $r$ band from SDSS imaging for 14 of these ETGs. In general, the (FUV-NUV) radial colour profiles become redder with galactocentric distance in both rejuvenated ($\leq 4$ Gyr) and old ETGs. Colour profiles of NGC 1533, NGC 2962, NGC 2974, NGC 3489, and IC 5063 show rings and/or arm-like structures, bluer than the body of the galaxy, suggesting the presence of recent star formation. Although seven of our ETGs show shell systems in their optical images, only NGC 7135 displays shells in the UV bands. We characterize the UV and optical surface brightness profiles, along the major axis, using a Sersic law. The Sersic law exponent, $n$, varies from 1 to 16 in the UV bands. S0 galaxies tend to have lower values of $n$ ($\leq 5$). The Sersic law exponent $n = 4$ seems to be a watershed: ETGs with $n > 4$ tend to have [$\alpha$/Fe] greater than 0.15, implying a short star-formation time scale. We find a significant correlation between the FUV-NUV colour and the central velocity dispersion $\sigma$, with the UV colours getting bluer at larger $\sigma$. This trend is likely driven by a combined effect of `downsizing' and of the mass-metallicity relation. Comment: Accepted for publication in MNRAS, 33 pages, 7 figures.
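    For reference, the Sersic law fitted to the surface brightness profiles has the standard form below (the general profile shape, not a fit from the paper): $n = 1$ corresponds to an exponential, disc-like profile and $n = 4$ to the classical de Vaucouleurs profile, which is why $n = 4$ acts as a natural dividing line.

```latex
% Standard Sersic intensity profile; I_e is the intensity at the effective
% (half-light) radius r_e, and b_n is the constant fixed by requiring that r_e
% enclose half of the total light (b_n ~ 2n - 1/3 for large n).
I(r) = I_e \exp\left\{ -b_n \left[ \left( \frac{r}{r_e} \right)^{1/n} - 1 \right] \right\}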

    Inhaled Sedation in Patients with COVID-19-Related Acute Respiratory Distress Syndrome: An International Retrospective Study

    Background and objectives: The coronavirus disease 2019 (COVID-19) pandemic and the shortage of intravenous sedatives have led to renewed interest in inhaled sedation for patients with acute respiratory distress syndrome (ARDS). We hypothesized that inhaled sedation would be associated with improved clinical outcomes in COVID-19 ARDS patients. Methods: Retrospective international study including mechanically ventilated patients with COVID-19 ARDS who required sedation and were admitted to 10 European and US intensive care units. The primary endpoint of ventilator-free days through day 28 was analyzed using zero-inflated negative binomial regression, before and after adjustment for site, clinically relevant covariates determined according to the univariate results, and propensity score matching. Results: A total of 196 patients were enrolled, 78 of whom died within 28 days. The number of ventilator-free days through day 28 did not differ significantly between the patients who received inhaled sedation for at least 24 h (n = 111) and those who received intravenous sedation only (n = 85), with medians of 0 (interquartile range [IQR] 0–8) and 0 (IQR 0–17), respectively (odds ratio for having zero ventilator-free days through day 28, 1.63; 95% confidence interval [CI], 0.91–2.92; p = 0.10). The incidence rate ratio for the number of ventilator-free days through day 28, if not 0, was 1.13 (95% CI, 0.84–1.52; p = 0.40). Similar results were found after multivariable adjustment and propensity matching. Conclusion: The use of inhaled sedation in COVID-19 ARDS was not associated with the number of ventilator-free days through day 28. Keywords: coronavirus disease 2019; acute respiratory distress syndrome; inhaled sedation; sevoflurane; isoflurane.
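    The primary analysis is a zero-inflated negative binomial model, which separately models the probability of having zero ventilator-free days (death or ventilation through day 28) and the count of days among the remainder, which is why the abstract reports both an odds ratio for zero days and an incidence rate ratio. The sketch below fits such a model in statsmodels on simulated data; the group labels, zero-inflation rate, and count distribution are assumptions, not the study's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
inhaled = rng.integers(0, 2, n)                          # 1 = inhaled sedation >= 24 h (assumed grouping)

# Simulated ventilator-free days through day 28: a point mass at zero plus an
# over-dispersed count part, capped at 28 days.
zero_mass = rng.random(n) < 0.55
counts = np.minimum(rng.negative_binomial(2, 0.15, size=n), 28)
vfd = np.where(zero_mass, 0, counts)

exog = sm.add_constant(pd.DataFrame({"inhaled": inhaled}))
model = sm.ZeroInflatedNegativeBinomialP(vfd, exog, exog_infl=exog, p=2)
result = model.fit(method="bfgs", maxiter=500, disp=False)
print(result.summary())
# exp(inflate_inhaled) ~ odds ratio for having zero ventilator-free days;
# exp(inhaled)         ~ incidence rate ratio for the non-zero count part.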