
    Informing antimicrobial management in the context of COVID-19: Understanding the longitudinal dynamics of C-reactive protein and procalcitonin

    Background: To characterise the longitudinal dynamics of C-reactive protein (CRP) and procalcitonin (PCT) in a cohort of hospitalised patients with COVID-19 and support antimicrobial decision-making. Methods: Longitudinal CRP and PCT concentrations and trajectories of 237 hospitalised patients with COVID-19 were modelled. The dataset comprised 2,021 data points for CRP and 284 for PCT. Pairwise comparisons were performed between: (i) those with or without significant bacterial growth from cultures, and (ii) those who survived or died in hospital. Results: CRP concentrations were higher over time in COVID-19 patients with positive microbiology (day 9: 236 vs 123 mg/L, p < 0.0001) and in those who died (day 8: 226 vs 152 mg/L, p < 0.0001), but only after day 7 from onset of COVID-related symptoms. Failure of CRP to fall in the first week of hospital admission was associated with significantly higher odds of death. PCT concentrations were higher in patients with COVID-19 and positive microbiology or in those who died, although these differences were not statistically significant. Conclusions: Both the absolute CRP concentration and its trajectory during the first week of hospital admission are important factors predicting microbiology culture positivity and outcome in patients hospitalised with COVID-19. Further work is needed to describe the role of PCT in co-infection. Understanding the relationships of these biomarkers can support the development of risk models and inform optimal antimicrobial strategies.
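The trajectory finding above can be sketched as a simple rule. This is an illustrative sketch only, not the paper's longitudinal model: the function name, the day-7 cut-off used as "first week", and the example values are assumptions for demonstration.

```python
# Illustrative rule (not the published model): flag patients whose CRP fails
# to fall during the first week of admission, the pattern the study links to
# higher odds of in-hospital death.
def crp_trajectory_flag(daily_crp_mg_l):
    """Return 'falling' or 'not falling' from admission (day 0) to day 7.

    daily_crp_mg_l: CRP values in mg/L indexed by hospital day, day 0 first.
    """
    first_week = daily_crp_mg_l[:8]  # days 0..7 inclusive
    return "falling" if first_week[-1] < first_week[0] else "not falling"

improving = [210, 190, 160, 140, 120, 95, 80, 70]   # hypothetical patient
worsening = [150, 160, 180, 200, 220, 230, 226, 240]
print(crp_trajectory_flag(improving), crp_trajectory_flag(worsening))
# falling not falling
```

A real risk model would of course use the fitted trajectory rather than two endpoints; this only makes the first-week direction-of-change criterion concrete.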

    Switching Virally Suppressed, Treatment-Experienced Patients to a Raltegravir-Containing Regimen Does Not Alter Levels of HIV-1 DNA

    Background: Current HIV-1 antiretroviral therapy (ART) greatly reduces virus replication but does not significantly affect the viral reservoir. Raltegravir, a recently introduced integrase inhibitor, could, at least theoretically, reduce residual viremia in patients on ART and affect the size of the viral reservoir. The aim of this study was to assess whether switching virally suppressed, treatment-experienced patients to a raltegravir-containing regimen reduces the size of the viral reservoir, and whether such treatment leads to a change in levels of HIV-1 2-LTR circles in this patient group. Methods: 14 ART-experienced individuals with a suppressed viral load (<50 HIV-1 RNA copies/mL plasma) at baseline (for at least 2 months) were switched to a raltegravir-containing regimen. Blood samples were taken at baseline and at ≥2 timepoints up to 48±6 weeks. Levels of total HIV-1 DNA and 2-LTR circles in peripheral blood mononuclear cells (PBMCs) were measured using real-time PCR assays. Results: There was no significant change in HIV-1 total DNA levels over the study duration (p = 0.808), median slope 0.24 (conservative nonparametric 95% CI: −11.78, 26.23). Low levels of 2-LTR circles were detected in 2 patients: one had 16 copies/10⁶ PBMCs at baseline and the other had 34 copies/10⁶ PBMCs at week 51. Conclusions: The switch to a raltegravir-containing regimen was not associated with a significant change in HIV-1 total DNA levels in this cohort. There were no observed changes in the levels of HIV-1 2-LTR circles associated with raltegravir.

    A modified Sequential Organ Failure Assessment score for dengue: development, evaluation and proposal for use in clinical trials

    Background: Dengue is a neglected tropical disease for which no therapeutic agents have shown clinical efficacy to date. Clinical trials have used strikingly variable clinical endpoints, which hampers reproducibility and comparability of findings. We investigated a delta modified Sequential Organ Failure Assessment (delta mSOFA) score as a uniform composite clinical endpoint for use in clinical trials investigating therapeutics for moderate and severe dengue. Methods: We developed a modified SOFA score for dengue and evaluated its performance at baseline and 48 h after enrolment in a prospective observational cohort of 124 adults admitted with dengue shock to a tertiary referral hospital in Vietnam. The modified SOFA score includes pulse pressure in the cardiovascular component. Binary logistic regression, Cox proportional hazards and linear regression models were used to estimate the association between mSOFA, delta mSOFA and clinical outcomes. Results: The analysis included 124 adults with dengue shock. 29 (23.4%) patients required ICU admission for organ support or persistent haemodynamic instability: 9/124 (7.3%) required mechanical ventilation, 8/124 (6.5%) required vasopressors, 6/124 (4.8%) required haemofiltration and 5/124 (4.0%) patients died. In univariate analyses, higher baseline and delta (48 h) mSOFA scores for dengue were associated with admission to ICU, requirement for organ support, mortality, duration of ICU and hospital admission, and IV fluid use. Conclusions: The baseline and delta mSOFA scores for dengue discriminated well between patients with dengue shock by clinical outcome, including duration of ICU and hospital admission, requirement for organ support and death. We plan to use delta mSOFA as the primary endpoint in an upcoming host-directed therapeutic trial and to investigate the performance of this score in other phenotypes of severe dengue in adults and children.
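The delta score is simply the change in the composite total between enrolment and 48 h. A minimal sketch, assuming a SOFA-style additive structure of 0-4 organ-system subscores: the pulse-pressure cut-offs, subscore values, and function names below are hypothetical illustrations, not the published score definition.

```python
# Hypothetical sketch of a delta mSOFA computation; the real component
# definitions and cut-offs are specified in the paper, not reproduced here.
def cardio_subscore(pulse_pressure_mmhg):
    """Map pulse pressure to a 0-4 subscore (illustrative cut-offs only)."""
    if pulse_pressure_mmhg >= 30:
        return 0
    if pulse_pressure_mmhg >= 20:
        return 2
    return 4

def msofa(subscores):
    """Total score = sum of organ-system subscores (0-4 each), SOFA-style."""
    return sum(subscores.values())

def delta_msofa(baseline, at_48h):
    """Change over 48 h; positive values indicate worsening organ function."""
    return msofa(at_48h) - msofa(baseline)

baseline = {"cardio": cardio_subscore(18), "resp": 1, "renal": 0}  # total 5
at_48h   = {"cardio": cardio_subscore(32), "resp": 2, "renal": 1}  # total 3
print(delta_msofa(baseline, at_48h))  # -2, i.e. improvement by 48 h
```

In the study design, this delta value (rather than the raw total) is what serves as the candidate trial endpoint, which is why the sign convention matters: negative means recovery.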

    Is the observable Universe consistent with the cosmological principle?

    The cosmological principle (CP)—the notion that the Universe is spatially isotropic and homogeneous on large scales—underlies a century of progress in cosmology. It is conventionally formulated through the Friedmann-Lemaître-Robertson-Walker (FLRW) metric and culminates in the successful and highly predictive Λ-Cold-Dark-Matter (ΛCDM) model. Yet, tensions have emerged within the ΛCDM model, most notably a statistically significant discrepancy in the value of the Hubble constant, H0. Since the notion of cosmic expansion determined by a single parameter is intimately tied to the CP, implications of the H0 tension may extend beyond ΛCDM to the CP itself. This review surveys current observational hints for deviations from the expectations of the CP, highlighting synergies and disagreements that warrant further study. Setting aside the debate about individual large structures, potential deviations from the CP include variations of cosmological parameters on the sky, discrepancies in the cosmic dipoles, and mysterious alignments in quasar polarizations and galaxy spins. While it is possible that a host of observational systematics are impacting results, it is equally plausible that precision cosmology may have outgrown the FLRW paradigm, an extremely pragmatic but non-fundamental symmetry assumption.

    New genetic loci link adipose and insulin biology to body fat distribution.

    Body fat distribution is a heritable trait and a well-established predictor of adverse metabolic outcomes, independent of overall adiposity. To increase our understanding of the genetic basis of body fat distribution and its molecular links to cardiometabolic traits, here we conduct genome-wide association meta-analyses of traits related to waist and hip circumferences in up to 224,459 individuals. We identify 49 loci (33 new) associated with waist-to-hip ratio adjusted for body mass index (BMI), and an additional 19 loci newly associated with related waist and hip circumference measures (P < 5 × 10⁻⁸). In total, 20 of the 49 waist-to-hip ratio adjusted for BMI loci show significant sexual dimorphism, 19 of which display a stronger effect in women. The identified loci were enriched for genes expressed in adipose tissue and for putative regulatory elements in adipocytes. Pathway analyses implicated adipogenesis, angiogenesis, transcriptional regulation and insulin resistance as processes affecting fat distribution, providing insight into potential pathophysiological mechanisms.

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45-85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
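The derivation-to-validation pipeline above can be sketched in a few lines: convert model coefficients into additive integer points, then measure discrimination on the validation cohort with the area under the ROC curve. This is an illustrative sketch, not the published model: the factor names, point weights, and toy cohort are assumptions.

```python
# Illustrative sketch (not the CholeS model): an additive integer risk score
# for long (> 90 min) operations, checked with a rank-based ROC AUC.
def risk_score(patient, weights):
    """Sum integer points for each risk factor present (hypothetical weights)."""
    return sum(pts for factor, pts in weights.items() if patient.get(factor))

def auc(scores, labels):
    """Rank-based AUC: P(score of a 'long' case > score of a 'short' case),
    counting ties as half. Equivalent to the area under the ROC curve."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

weights = {"high_bmi": 2, "thick_gallbladder_wall": 3, "asa_3_plus": 2}
cohort = [
    ({"high_bmi": True, "thick_gallbladder_wall": True}, True),   # long op
    ({"asa_3_plus": True}, False),                                # short op
    ({}, False),                                                  # short op
]
scores = [risk_score(p, weights) for p, _ in cohort]
labels = [y for _, y in cohort]
print(auc(scores, labels))  # 1.0: the long case outranks both short cases
```

An AUC of 0.708, as reported for the external cohort, corresponds to a 70.8% chance that a randomly chosen long operation scores higher than a randomly chosen short one.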

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we determined the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often not sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
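The categories above are defined by simple thresholds and can be written as a small classifier. The thresholds (PaO2 > 100 mmHg, PaO2 < 55 mmHg, FIO2 ≥ 0.60, presence on both days) come from the abstract itself; the function and key names are assumptions for illustration.

```python
# The LUNG SAFE analysis categories as a small classifier: hyperoxemia is
# PaO2 > 100 mmHg, hypoxemia is PaO2 < 55 mmHg, excess oxygen use is
# FIO2 >= 0.60 while hyperoxemic, and "sustained" means present on
# both day 1 and day 2. Function/key names are illustrative, not from the study.
def classify_day(pao2_mmhg, fio2):
    state = {"hypoxemia": pao2_mmhg < 55, "hyperoxemia": pao2_mmhg > 100}
    state["excess_o2"] = state["hyperoxemia"] and fio2 >= 0.60
    return state

def sustained_hyperoxemia(day1, day2):
    return day1["hyperoxemia"] and day2["hyperoxemia"]

d1 = classify_day(pao2_mmhg=130, fio2=0.70)  # hyperoxemic on high FIO2
d2 = classify_day(pao2_mmhg=110, fio2=0.40)  # still hyperoxemic, low FIO2
print(d1["excess_o2"], sustained_hyperoxemia(d1, d2))  # True True
```

The second example day mirrors the study's observation that most day-2 hyperoxemic patients were on relatively low FIO2, i.e. hyperoxemia without excess oxygen use.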

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10% (i.q.r. 1-30%) of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Socializing One Health: an innovative strategy to investigate social and behavioral risks of emerging viral threats

    In an effort to strengthen global capacity to prevent, detect, and control infectious diseases in animals and people, the United States Agency for International Development's (USAID) Emerging Pandemic Threats (EPT) PREDICT project funded development of regional, national, and local One Health capacities for early disease detection, rapid response, disease control, and risk reduction. From the outset, the EPT approach was inclusive of social science research methods designed to understand the contexts and behaviors of communities living and working at human-animal-environment interfaces considered high-risk for virus emergence. Using qualitative and quantitative approaches, PREDICT behavioral research aimed to identify and assess a range of socio-cultural behaviors that could be influential in zoonotic disease emergence, amplification, and transmission. This broad approach to behavioral risk characterization enabled us to identify and characterize human activities that could be linked to the transmission dynamics of new and emerging viruses. This paper provides a discussion of the implementation of a social science approach within a zoonotic surveillance framework. We conducted in-depth ethnographic interviews and focus groups to better understand the individual- and community-level knowledge, attitudes, and practices that potentially put participants at risk for zoonotic disease transmission from the animals they live and work with, across 6 interface domains. When we asked highly-exposed individuals (i.e., bushmeat hunters, wildlife or guano farmers) about the risk they perceived in their occupational activities, most did not perceive them to be risky, whether because the activity was normalized by years (or generations) of practice, or due to a lack of information about potential risks.
Integrating the social sciences allows investigation of the specific human activities that are hypothesized to drive disease emergence, amplification, and transmission, in order to better substantiate behavioral disease drivers, along with the social dimensions of infection and transmission dynamics. Understanding these dynamics is critical to achieving health security (the protection from threats to health), which requires investments in both collective and individual health security. Integrating behavioral sciences into zoonotic disease surveillance allowed us to push toward fuller community integration and engagement, and toward dialogue and implementation of recommendations for disease prevention and improved health security.

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016): part two

    Background: The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective only in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd. Methods: We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background. Results: First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade significantly increased the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes.
In fact, TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, expression of PD-1 on the cell surface seemed to be restricted to the tumor microenvironment, while CD4+ T cells also showed high expression of PD-1 in lymphoid organs. Interestingly, we found that the levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001). Conclusions: In conclusion, we demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.