19 research outputs found

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
    OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
    DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
    INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
    MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
    RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
    CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
    TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
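As a rough plausibility check on posterior probabilities like those reported here, one can approximate the posterior of the log odds ratio as normal, reading the 95% credible interval as spanning ±1.96 standard deviations on the log scale. This is a hedged back-of-the-envelope sketch, not the trial's actual bayesian cumulative logistic model:

```python
from math import erf, log, sqrt

def prob_harm(median_or, ci_low, ci_high):
    """P(OR < 1), i.e. probability of harm, under a normal approximation
    to the posterior of log(OR), treating the 95% credible interval as
    mean +/- 1.96 SD on the log scale."""
    mu = log(median_or)
    sd = (log(ci_high) - log(ci_low)) / (2 * 1.96)
    z = (0 - mu) / sd                    # standardized distance of OR = 1
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF

# ACE inhibitor vs control: median OR 0.77 (95% CrI, 0.58-1.06)
print(prob_harm(0.77, 0.58, 1.06))  # ~0.96, near the reported 94.9%
# ARB vs control: median OR 0.76 (95% CrI, 0.56-1.05)
print(prob_harm(0.76, 0.56, 1.05))  # ~0.96, near the reported 95.4%
```

The small gap between this approximation and the published values reflects that the trial's posterior is not exactly normal on the log scale.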

    Retrospective evaluation of whole exome and genome mutation calls in 746 cancer samples

    Funder: NCI U24CA211006
    Abstract: The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) curated consensus somatic mutation calls using whole exome sequencing (WES) and whole genome sequencing (WGS), respectively. Here, as part of the ICGC/TCGA Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium, which aggregated whole genome sequencing data from 2,658 cancers across 38 tumour types, we compare WES and WGS side by side in 746 TCGA samples, finding that ~80% of mutations overlap in covered exonic regions. We estimate that low variant allele fraction (VAF < 15%) and clonal heterogeneity contribute up to 68% of private WGS mutations and 71% of private WES mutations. We observe that ~30% of private WGS mutations trace to mutations identified by a single variant caller in WES consensus efforts. WGS captures both ~50% more variation in exonic regions and otherwise unobserved mutations in loci with variable GC content. Together, our analysis highlights technological divergences between two reproducible somatic variant detection efforts.
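The overlap and private-call bookkeeping described above amounts to set operations over variant keys, with a VAF threshold flagging calls that low allele fraction could explain. A minimal sketch with toy placeholder calls, not PCAWG data:

```python
# Hypothetical variant keys (chrom, pos, ref, alt); real comparisons operate
# on VCF records restricted to exonic regions covered by both assays.
wes_calls = {("chr1", 100, "A", "T"), ("chr2", 200, "G", "C"), ("chr3", 300, "C", "A")}
wgs_calls = {("chr1", 100, "A", "T"), ("chr2", 200, "G", "C"), ("chr4", 400, "T", "G")}
vaf = {("chr4", 400, "T", "G"): 0.08, ("chr3", 300, "C", "A"): 0.40}  # variant allele fractions

shared = wes_calls & wgs_calls          # called by both platforms
private_wgs = wgs_calls - wes_calls     # WGS-only calls
private_wes = wes_calls - wgs_calls     # WES-only calls

overlap_frac = len(shared) / len(wes_calls | wgs_calls)
low_vaf_private_wgs = {v for v in private_wgs if vaf.get(v, 1.0) < 0.15}

print(overlap_frac)          # fraction of the union seen by both platforms
print(low_vaf_private_wgs)   # private WGS calls attributable to low VAF
```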

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which to date exceeds 100,000 generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
    RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
    RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing, coupled with a shorter turnaround time from sampling to sequence submission. Ongoing evolution necessitated continual updating of primer sets; eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection, each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, both between Africa and the rest of the world and within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, the limitations imposed by low testing proportions. We also highlight the early-warning capacity that genomic surveillance in Africa has provided for the rest of the world through the detection of new lineages and variants, most recently the characterization of various Omicron subvariants.
    CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve, not only to help combat SARS-CoV-2 on the continent but also as a platform to address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because it is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
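The turnaround-time metric discussed above (the lag from sampling to sequence submission) can be computed per country as follows; the records below are hypothetical illustrations, not the study's surveillance data:

```python
from datetime import date
from statistics import median

# Hypothetical (country, sampling date, submission date) records; the study
# measures turnaround as the lag between these two events.
records = [
    ("Kenya",        date(2021, 6, 1),   date(2021, 6, 20)),
    ("Kenya",        date(2021, 7, 3),   date(2021, 7, 15)),
    ("South Africa", date(2021, 11, 8),  date(2021, 11, 16)),
    ("South Africa", date(2021, 11, 20), date(2021, 12, 2)),
]

turnaround = {}
for country, sampled, submitted in records:
    turnaround.setdefault(country, []).append((submitted - sampled).days)

# Median lag in days per country
for country, lags in sorted(turnaround.items()):
    print(country, median(lags))
```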

    A Deep-Learning Approach to Soil Moisture Estimation with GNSS-R

    GNSS reflection measurements in the form of delay-Doppler maps (DDMs) can complement soil moisture measurements from the SMAP mission, whose revisit rate is too slow for some hydrological/meteorological studies. The standard approach, which considers only the peak value of the DDM, carries significant uncertainty because the peak is affected not only by soil moisture but also by complex topography, inundation, and overlying vegetation. We hypothesize that information from the entire 2D DDM could help reduce uncertainty under these varied conditions. Deep-learning techniques have the potential to extract additional information from the entire DDM while simultaneously allowing the incorporation of contextual information from external datasets. This work explored the data-driven approach of convolutional neural networks (CNNs) to determine complex relationships between the reflection measurement and surface parameters, providing the groundwork for improved global soil moisture estimates. A CNN was trained on CYGNSS DDMs and contextual ancillary datasets as inputs, with aligned SMAP soil moisture values as the targets. Predictions from the CNN were studied using an unbiased subset of samples and showed strong correlation with the SMAP target values. With this network, a soil moisture product was generated using DDMs from 2017–2019; it is generally comparable to existing global soil moisture products and shows potential advantages in spatial resolution and in coverage over regions where SMAP does not perform well. Comparisons with in situ measurements demonstrate correlation between the network predictions and ground truth at high temporal resolution.
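The kind of evaluation described above (correlation of network predictions against SMAP targets) can be sketched as below, together with unbiased RMSE, a standard soil-moisture validation metric; the prediction and target values are hypothetical, not CYGNSS/SMAP data:

```python
from math import sqrt

def pearson(xs, ys):
    """Sample Pearson correlation between predictions and targets."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ubrmse(pred, target):
    """Unbiased RMSE: RMSE after removing the mean bias between
    predictions and targets."""
    n = len(pred)
    bias = sum(p - t for p, t in zip(pred, target)) / n
    return sqrt(sum((p - t - bias) ** 2 for p, t in zip(pred, target)) / n)

# Hypothetical CNN predictions vs. SMAP soil moisture targets (m^3/m^3)
smap = [0.12, 0.25, 0.31, 0.18, 0.40]
cnn  = [0.14, 0.23, 0.33, 0.20, 0.38]
print(pearson(cnn, smap))  # correlation on the held-out samples
print(ubrmse(cnn, smap))   # bias-corrected error in m^3/m^3
```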

    Disinfection of Virtual Reality Devices in Health Care Settings: In Vitro Assessment and Survey Study

    Background: Virtual reality (VR) devices are increasingly used in health care settings. Shared use among patients has the potential to unintentionally transmit pathogens between patients and hospital staff, and no standard operating procedure for disinfection exists to ensure safe use between patients.
    Objective: This study aims to determine the efficacy of disinfectants on VR devices in order to ensure safe use in health care settings.
    Methods: Three types of bacteria were inoculated onto porous and nonporous surfaces of 2 VR devices: the Meta Oculus Quest and Meta Oculus Quest 2. Disinfection was performed using either isopropyl alcohol or alcohol-free quaternary ammonium wipes. Quantitative culture was used to assess the adequacy of disinfection. A survey was separately sent to VR device technicians at other pediatric health care institutes to compare methods of disinfection and how they were established.
    Results: Both products achieved adequate disinfection of the treated surfaces; however, a greater log-kill was achieved on nonporous surfaces than on porous surfaces, and alcohol performed better than quaternary ammonium on porous surfaces. Survey respondents reported wide variability in disinfection processes, with only 1 respondent reporting an established standard operating procedure.
    Conclusions: Disinfection can be achieved with either isopropyl alcohol or quaternary ammonium products. Porous surfaces showed lower log-kill than nonporous surfaces, indicating that an added barrier may be of benefit and should be a point of future research. Given the variability in disinfection processes across health care systems, a standard operating procedure is proposed.
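Log-kill, the metric used above to compare surfaces, is the log10 reduction in colony-forming units (CFU) recovered by quantitative culture before and after disinfection. A minimal sketch with hypothetical counts, not the study's data:

```python
from math import log10

def log_kill(cfu_before, cfu_after):
    """Log10 reduction in CFU after disinfection; e.g. a 3-log kill
    means 99.9% of organisms were removed."""
    # Floor the post-disinfection count at 1 CFU so a complete kill
    # (zero recovered colonies, within the assay's detection limit)
    # does not divide by zero.
    return log10(cfu_before / max(cfu_after, 1))

# Hypothetical quantitative-culture counts
print(log_kill(1_000_000, 100))    # 4.0-log kill, e.g. a nonporous surface
print(log_kill(1_000_000, 5_000))  # ~2.3-log kill, e.g. a porous surface
```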