
    Anticipating pulmonary complications after thoracotomy: the FLAM Score

    OBJECTIVE: Pulmonary complications after thoracotomy are the result of progressive changes in the respiratory status of the patient. A multifactorial score (the FLAM score) was developed to identify, postoperatively, patients at higher risk for pulmonary complications at least 24 hours before the clinical diagnosis. METHODS: The FLAM score, created in 2002, is based on 7 parameters (dyspnea, chest X-ray, delivered oxygen, auscultation, cough, and quality and quantity of bronchial secretions). To validate the FLAM score, we prospectively calculated scores during the first postoperative week in 300 consecutive patients undergoing posterolateral thoracotomy. RESULTS: Sixty patients (20%) developed pulmonary complications during the postoperative period. The FLAM score increased progressively in complicated patients until the fourth postoperative day (mean 13.5 ± 11.9). FLAM scores in patients with complications were significantly higher (p < 0.05) at least 24 hours before the clinical diagnosis of the complication than FLAM scores in uncomplicated patients. ROC curve analysis showed that the FLAM cut-off value with the best sensitivity and specificity for pulmonary complications was 9 (area under the curve 0.97). Based on the highest FLAM scores recorded, 4 risk classes were identified with increasing incidence of pulmonary complications and mortality. CONCLUSION: Changes in FLAM score were evident at least 24 hours before the clinical diagnosis of pulmonary complications. The FLAM score can be used to categorize patients according to risk of respiratory morbidity and mortality and could be a useful tool in the postoperative management of patients undergoing thoracotomy.
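
    The abstract does not list the per-parameter weights of the FLAM score, so the sketch below only illustrates the workflow it describes: scoring the seven parameters daily, summing them, and comparing the highest postoperative score against the published cut-off of 9. The sub-score values used in the example are hypothetical placeholders, not the published scale.

```python
# Minimal sketch of a FLAM-style daily scoring workflow. The seven parameters
# and the cut-off of 9 come from the abstract; the per-parameter sub-scores
# below (0-3 scales) are hypothetical placeholders, not the published weights.

FLAM_PARAMETERS = [
    "dyspnea", "chest_xray", "delivered_oxygen",
    "auscultation", "cough", "secretion_quality", "secretion_quantity",
]

CUTOFF = 9  # ROC-derived threshold reported in the abstract (AUC 0.97)


def flam_score(assessment: dict) -> int:
    """Sum the sub-scores for the seven parameters of one daily assessment."""
    return sum(assessment[p] for p in FLAM_PARAMETERS)


def flag_high_risk(daily_assessments: list[dict]) -> bool:
    """Flag a patient whose highest postoperative score exceeds the cut-off."""
    return max(flam_score(a) for a in daily_assessments) > CUTOFF


if __name__ == "__main__":
    # Two illustrative postoperative days for one patient (hypothetical values).
    day1 = dict.fromkeys(FLAM_PARAMETERS, 1)                # mild findings
    day2 = {**day1, "delivered_oxygen": 3, "dyspnea": 3}    # worsening course
    print(flam_score(day1), flam_score(day2), flag_high_risk([day1, day2]))
```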

    Ice chemistry in embedded young stellar objects in the Large Magellanic Cloud

    We present spectroscopic observations of a sample of 15 embedded young stellar objects (YSOs) in the Large Magellanic Cloud (LMC). These observations were obtained with the Spitzer Infrared Spectrograph (IRS) as part of the SAGE-Spec Legacy program. We analyze the two prominent ice bands in the IRS spectral range: the bending mode of CO_2 ice at 15.2 micron and the ice band between 5 and 7 micron that includes contributions from the bending mode of water ice at 6 micron among other ice species. The 5-7 micron band is difficult to identify in our LMC sample due to the conspicuous presence of PAH emission superimposed on the ice spectra. We identify water ice in the spectra of two sources; the spectrum of one of these sources also exhibits the 6.8 micron ice feature attributed to ammonium and methanol. We model the CO_2 band in detail, using a combination of laboratory ice profiles available in the literature. We find that a significant fraction (> 50%) of CO_2 ice is locked in a water-rich component, consistent with what is observed for Galactic sources. The majority of the sources in the LMC also require a pure-CO_2 contribution to the ice profile, evidence of thermal processing. There is a suggestion that CO_2 production might be enhanced in the LMC, but the size of the available sample precludes firmer conclusions. We place our results in the context of the star formation environment in the LMC. (Accepted for publication in ApJ; 66 pages, 9 figures, 4 tables; minor corrections to Table 2.)
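
    The CO_2 band modelling described above amounts to fitting the observed 15.2 micron profile with a linear combination of laboratory ice components (a water-rich component plus pure CO_2, among others). The sketch below shows one way such a decomposition can be set up with non-negative least squares; the Gaussian component profiles and the synthetic spectrum are stand-ins, since the actual laboratory profiles are not reproduced here.

```python
# Minimal sketch of fitting an observed CO2 ice band as a non-negative linear
# combination of laboratory ice profiles, as the abstract describes. The
# Gaussian "laboratory" components below are synthetic stand-ins; real work
# would use measured optical-depth profiles of water-rich and pure-CO2 ices.
import numpy as np
from scipy.optimize import nnls

wavelength = np.linspace(14.8, 15.6, 200)  # micron, around the 15.2 um band


def gaussian(x, centre, width):
    return np.exp(-0.5 * ((x - centre) / width) ** 2)


# Stand-in component profiles (columns of the design matrix).
components = {
    "water_rich_co2": gaussian(wavelength, 15.25, 0.12),
    "pure_co2": gaussian(wavelength, 15.15, 0.05),
}
A = np.column_stack(list(components.values()))

# Synthetic "observed" optical depth: 60% water-rich, 40% pure CO2, plus noise.
rng = np.random.default_rng(0)
observed = 0.6 * components["water_rich_co2"] + 0.4 * components["pure_co2"]
observed += rng.normal(0, 0.01, wavelength.size)

# Non-negative least squares keeps the fitted contributions physical (>= 0).
weights, residual = nnls(A, observed)
fractions = weights / weights.sum()
for name, frac in zip(components, fractions):
    print(f"{name}: {frac:.2f} of the fitted CO2 ice")
```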

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, -122 and -200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, -150 and -223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
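
    As a rough illustration of the two statistical steps described in the Methods (logistic regression of mortality on miRNA abundance, and correlation of miRNA levels with soluble biomarkers), the sketch below runs both on a synthetic data frame. The column names and distributions are placeholders, not the trial variables or results.

```python
# Minimal sketch of the analysis steps described in the abstract: logistic
# regression of case/control status on circulating miRNA abundance and
# correlation of miRNA levels with soluble biomarkers. All data below are
# synthetic; column names are illustrative placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 373  # 126 cases + 247 controls, as in the abstract
df = pd.DataFrame({
    "died": np.r_[np.ones(126), np.zeros(247)],
    "miR_21": rng.lognormal(0.0, 0.5, n),    # relative abundance (arbitrary units)
    "IL6": rng.lognormal(1.0, 0.4, n),
    "D_dimer": rng.lognormal(0.5, 0.3, n),
})

# Logistic regression: does miR-21 abundance predict all-cause mortality?
X = sm.add_constant(np.log2(df[["miR_21"]]))
fit = sm.Logit(df["died"], X).fit(disp=0)
print("odds ratio per doubling of miR-21:", np.exp(fit.params["miR_21"]))
print("p-value:", fit.pvalues["miR_21"])

# Spearman correlation between miR-21 and inflammation/coagulation markers.
for marker in ("IL6", "D_dimer"):
    rho, p = spearmanr(df["miR_21"], df[marker])
    print(f"miR-21 vs {marker}: rho={rho:.2f}, p={p:.3f}")
```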

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome : Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG-SAFE is registered with ClinicalTrials.gov, NCT02010073.
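
    The oxygenation categories used in this analysis are defined explicitly in the abstract, so they can be expressed directly as threshold checks. The sketch below encodes those definitions (hyperoxemia PaO2 > 100 mmHg, hypoxemia PaO2 < 55 mmHg, sustained hyperoxemia on days 1 and 2, excess oxygen use FIO2 ≥ 0.60 during hyperoxemia); the patient values in the example are illustrative only.

```python
# Minimal sketch of the oxygenation categories defined in the abstract:
# hyperoxemia (PaO2 > 100 mmHg), hypoxemia (PaO2 < 55 mmHg), sustained
# hyperoxemia (present on day 1 and day 2), and excess oxygen use
# (FIO2 >= 0.60 while hyperoxemic). Patient records here are illustrative.
from dataclasses import dataclass


@dataclass
class DayObservation:
    pao2: float   # arterial oxygen tension, mmHg
    fio2: float   # fraction of inspired oxygen, 0-1


def is_hyperoxemic(obs: DayObservation) -> bool:
    return obs.pao2 > 100.0


def is_hypoxemic(obs: DayObservation) -> bool:
    return obs.pao2 < 55.0


def classify(day1: DayObservation, day2: DayObservation) -> dict:
    return {
        "hyperoxemia_day1": is_hyperoxemic(day1),
        "sustained_hyperoxemia": is_hyperoxemic(day1) and is_hyperoxemic(day2),
        "excess_oxygen_day1": is_hyperoxemic(day1) and day1.fio2 >= 0.60,
        "hypoxemia_day1": is_hypoxemic(day1),
    }


if __name__ == "__main__":
    print(classify(DayObservation(pao2=130, fio2=0.7),
                   DayObservation(pao2=110, fio2=0.4)))
```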

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following study groups: DAD Study Grp; Royal Free Hosp Clin Cohort; INSIGHT Study Grp; SMART Study Grp; ESPRIT Study Grp. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group (risk score < 0, 33 events), rising to a 1:47 and 1:6 chance in the medium (risk score 0-4, 103 events) and high (risk score >= 5, 505 events) risk groups, respectively. Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians in weighing the benefits of certain antiretrovirals against the risk of CKD and in identifying those at greatest risk of CKD.
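
    The published score sums scaled, adjusted incidence rate ratios for nine variables and then reports the number needed to harm (NNTH) over 5 years for different antiretrovirals. The sketch below mimics that structure with hypothetical integer points (not the published weights) and the standard NNTH formula of 1 divided by the absolute risk increase.

```python
# Minimal sketch of a D:A:D-style additive risk score and the number needed to
# harm (NNTH) calculation. The point values below are hypothetical placeholders
# (the published score scales adjusted incidence rate ratios of nine variables);
# the NNTH formula itself is the standard 1 / absolute risk increase.

# Hypothetical integer points per risk factor (not the published weights).
POINTS = {
    "older_age": 4,
    "intravenous_drug_use": 2,
    "hepatitis_c": 1,
    "low_baseline_egfr": 4,
    "female": 1,
    "low_nadir_cd4": 1,
    "hypertension": 1,
    "diabetes": 2,
    "cardiovascular_disease": 2,
}


def risk_score(patient: dict) -> int:
    """Sum the points for the risk factors present in one patient."""
    return sum(pts for factor, pts in POINTS.items() if patient.get(factor))


def nnth(baseline_5y_risk: float, risk_with_drug: float) -> float:
    """Number needed to harm = 1 / absolute risk increase over 5 years."""
    return 1.0 / (risk_with_drug - baseline_5y_risk)


if __name__ == "__main__":
    patient = {"older_age": True, "hypertension": True, "diabetes": True}
    print("risk score:", risk_score(patient))
    # e.g. 5-year CKD risk rising from 1/393 to 1/300 with a nephrotoxic drug:
    print("NNTH:", round(nnth(1 / 393, 1 / 300)))
```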

    The benefit of retraining pKa models studied using internally measured data

    The ionization state of drugs influences many pharmaceutical properties, such as their solubility, permeability, and biological activity. It is therefore important to understand the structure-property relationship for the acid-base dissociation constant pKa during the lead optimization process in order to make better-informed design decisions. Computational approaches, such as those implemented in MoKa, can help with this; however, they often predict with too large an error. In this contribution, we look at how retraining helps to greatly reduce the prediction error. Using a longitudinal study with data measured over 15 years in a drug discovery environment, we assess the impact of model training on prediction accuracy and look at model degradation over time. Using the MoKa software, we demonstrate that regular retraining is required to address changes in chemical space, which lead to model degradation over six to nine months.
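
    MoKa's retraining interface is not described in the abstract, so the sketch below uses a generic scikit-learn regressor as a stand-in to show the kind of longitudinal evaluation the study describes: train on all measurements up to a cut-off date, predict the following window, slide forward, and compare a static model against one retrained at each step. The synthetic data include a slow drift in chemistry to mimic model degradation.

```python
# Minimal sketch of a longitudinal retraining evaluation like the one the
# abstract describes. A scikit-learn regressor stands in for MoKa (whose
# retraining interface is not shown here); descriptors and pKa values are
# synthetic, with a slow drift over time to mimic model degradation.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n = 2000
data = pd.DataFrame(rng.normal(size=(n, 5)), columns=[f"desc_{i}" for i in range(5)])
data["date"] = pd.date_range("2008-01-01", periods=n, freq="3D")
drift = np.linspace(0.0, 2.0, n)  # gradual change in chemical space
data["pKa"] = 7 + data["desc_0"] + 0.5 * data["desc_1"] * drift + rng.normal(0, 0.3, n)

features = [c for c in data.columns if c.startswith("desc_")]
windows = pd.date_range("2010-01-01", data["date"].max(), freq="6MS")  # 6-month steps

# Static model trained once on the earliest data and never updated.
static = RandomForestRegressor(n_estimators=100, random_state=0)
static.fit(data.loc[data["date"] < windows[0], features],
           data.loc[data["date"] < windows[0], "pKa"])

for start, end in zip(windows[:-1], windows[1:]):
    train = data[data["date"] < start]                       # everything measured so far
    test = data[(data["date"] >= start) & (data["date"] < end)]
    retrained = RandomForestRegressor(n_estimators=100, random_state=0)
    retrained.fit(train[features], train["pKa"])
    rmse_static = mean_squared_error(test["pKa"], static.predict(test[features])) ** 0.5
    rmse_retrained = mean_squared_error(test["pKa"], retrained.predict(test[features])) ** 0.5
    print(f"{start.date()}..{end.date()}  static RMSE={rmse_static:.2f}  "
          f"retrained RMSE={rmse_retrained:.2f}")
```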