
    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    Importance: Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
    Objective: To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
    Design, Setting, and Participants: In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
    Interventions: Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
    Main Outcomes and Measures: The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
    Results: On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
    Conclusions and Relevance: In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
    Trial Registration: ClinicalTrials.gov Identifier: NCT0273570
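
    The primary endpoint and analysis described above lend themselves to a compact illustration. The sketch below shows, under stated assumptions, how an organ support–free days score can be constructed (in-hospital death coded as −1, otherwise days free of support through day 21) and how posterior probabilities of benefit or harm can be read off posterior draws of the odds ratio. The function name and the simulated draws are hypothetical; this is not the trial's analysis code, and the cumulative logistic model itself is assumed to have been fitted elsewhere.

```python
import numpy as np

rng = np.random.default_rng(0)

def organ_support_free_days(died_in_hospital: bool, days_on_support: int, window: int = 21) -> int:
    """Composite score: -1 for in-hospital death, otherwise days alive and free
    of cardiovascular/respiratory support through the 21-day window."""
    if died_in_hospital:
        return -1
    return max(0, window - days_on_support)

# Hypothetical posterior draws of the log odds ratio for the ACE inhibitor
# comparison (centred on the reported OR of 0.77); in the trial these would
# come from the Bayesian cumulative logistic model, fitted elsewhere.
log_or_draws = rng.normal(loc=np.log(0.77), scale=0.16, size=20_000)
or_draws = np.exp(log_or_draws)

p_benefit = np.mean(or_draws > 1.0)   # posterior probability of improvement
p_harm = np.mean(or_draws < 1.0)      # posterior probability of harm

print(f"P(OR > 1) = {p_benefit:.3f}, P(OR < 1) = {p_harm:.3f}")
print(organ_support_free_days(died_in_hospital=False, days_on_support=5))  # -> 16
```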

    Simvastatin in critically ill patients with Covid-19

    Background: The efficacy of simvastatin in critically ill patients with coronavirus disease 2019 (Covid-19) is unclear.
    Methods: In an ongoing international, multifactorial, adaptive platform, randomized, controlled trial, we evaluated simvastatin (80 mg daily) as compared with no statin (control) in critically ill patients with Covid-19 who were not receiving statins at baseline. The primary outcome was respiratory and cardiovascular organ support-free days, assessed on an ordinal scale combining in-hospital death (assigned a value of -1) and days free of organ support through day 21 in survivors; the analysis used a Bayesian hierarchical ordinal model. The adaptive design included prespecified statistical stopping criteria for superiority (>99% posterior probability that the odds ratio was >1) and futility (>95% posterior probability that the odds ratio was <1.2).
    Results: Enrollment began on October 28, 2020. On January 8, 2023, enrollment was closed on the basis of a low anticipated likelihood that prespecified stopping criteria would be met as Covid-19 cases decreased. The final analysis included 2684 critically ill patients. The median number of organ support-free days was 11 (interquartile range, -1 to 17) in the simvastatin group and 7 (interquartile range, -1 to 16) in the control group; the posterior median adjusted odds ratio was 1.15 (95% credible interval, 0.98 to 1.34) for simvastatin as compared with control, yielding a 95.9% posterior probability of superiority. At 90 days, the hazard ratio for survival was 1.12 (95% credible interval, 0.95 to 1.32), yielding a 91.9% posterior probability of superiority of simvastatin. The results of secondary analyses were consistent with those of the primary analysis. Serious adverse events, such as elevated levels of liver enzymes and creatine kinase, were reported more frequently with simvastatin than with control.
    Conclusions: Although recruitment was stopped because cases had decreased, among critically ill patients with Covid-19, simvastatin did not meet the prespecified criteria for superiority to control.
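
    Since the abstract spells out the platform's stopping rules numerically, a small sketch can make them concrete: given posterior draws of the odds ratio, superiority is declared when P(OR > 1) exceeds 0.99 and futility when P(OR < 1.2) exceeds 0.95. The draws below are simulated to roughly match the reported posterior (median 1.15, 95% credible interval 0.98 to 1.34); everything here is an illustrative assumption, not the trial's statistical code.

```python
import numpy as np

def check_stopping(or_draws: np.ndarray,
                   superiority_prob: float = 0.99,
                   futility_prob: float = 0.95,
                   futility_or: float = 1.2) -> str:
    """Apply the prespecified stopping rules to posterior draws of the odds
    ratio: superiority if P(OR > 1) > 0.99, futility if P(OR < 1.2) > 0.95."""
    p_superior = np.mean(or_draws > 1.0)
    p_futile = np.mean(or_draws < futility_or)
    if p_superior > superiority_prob:
        return "stop for superiority"
    if p_futile > futility_prob:
        return "stop for futility"
    return "continue enrolment"

# Hypothetical draws roughly matching the reported posterior
# (median OR 1.15, 95% credible interval 0.98 to 1.34).
rng = np.random.default_rng(1)
draws = np.exp(rng.normal(loc=np.log(1.15), scale=0.08, size=50_000))
print(np.mean(draws > 1.0))   # ~0.96, mirroring the 95.9% probability of superiority
print(check_stopping(draws))  # -> "continue enrolment" under these rules
```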

    A persistent major mutation in canonical jasmonate signaling is embedded in an herbivory-elicited gene network

    When insect herbivores attack plants, elicitors from oral secretions and regurgitants (OS) enter wounds during feeding, eliciting defense responses. These generally require plant jasmonate (JA) signaling, specifically, a jasmonoyl-L-isoleucine (JA-Ile) burst, for their activation and are well studied in the native tobacco Nicotiana attenuata. We used intraspecific diversity captured in a 26-parent MAGIC population planted in nature and an updated genome assembly to impute natural variation in the OS-elicited JA-Ile burst linked to a mutation in the JA-Ile biosynthetic gene NaJAR4. Experiments revealed that NaJAR4 variants were associated with higher fitness in the absence of herbivores but compromised foliar defenses, with two NaJAR homologues (4 and 6) complementing each other spatially and temporally. From decade-long seed collections of natural populations, we uncovered enzymatically inactive variants occurring at variable frequencies, consistent with a balancing selection regime maintaining variants. Integrative analyses of OS-induced transcriptomes and metabolomes of natural accessions revealed that NaJAR4 is embedded in a nonlinear complex gene coexpression network orchestrating responses to OS, which we tested by silencing four hub genes in two connected coexpressed networks and examining their OS-elicited metabolic responses. Lines silenced in two hub genes (NaGLR and NaFB67) co-occurring in the NaJAR4/6 module showed responses proportional to JA-Ile accumulations; two from an adjacent module (NaERF and NaFB61) had constitutively expressed defenses with high resistance. We infer that mutations with large fitness consequences can persist in natural populations due to compensatory responses from gene networks, which allow for diversification in conserved signaling pathways and are generally consistent with predictions of an omnigenic model.
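
    As a rough sketch of the kind of coexpression analysis described (identifying hub genes such as NaGLR or NaFB67 in an OS-induced network), one simple approach is to link genes whose expression profiles are strongly correlated and rank them by connectivity. The function, cutoff, and file name below are assumptions for illustration, not the authors' network pipeline, which may use weighted or module-based methods.

```python
import numpy as np
import pandas as pd

def hub_genes(expr: pd.DataFrame, corr_cutoff: float = 0.8, top_n: int = 5) -> pd.Series:
    """Rank genes by connectivity in a simple correlation-based coexpression
    network: genes (columns of a samples x genes matrix) are linked when
    |Pearson r| exceeds the cutoff, and hubs are the most connected nodes."""
    corr = expr.corr().abs()
    np.fill_diagonal(corr.values, 0.0)            # ignore self-correlation
    connectivity = (corr > corr_cutoff).sum(axis=1)
    return connectivity.sort_values(ascending=False).head(top_n)

# Hypothetical usage with an OS-induced expression table (samples x genes):
# expr = pd.read_csv("os_induced_expression.csv", index_col=0)
# print(hub_genes(expr))
```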

    Evaluating potential of leaf reflectance spectra to monitor plant genetic variation

    Remote sensing of vegetation by spectroscopy is increasingly used to characterize trait distributions in plant communities. How leaves interact with electromagnetic radiation is determined by their structure and contents of pigments, water, and abundant dry matter constituents such as lignins, phenolics, and proteins. High-resolution (“hyperspectral”) spectroscopy can characterize trait variation at finer scales and may help to reveal underlying genetic variation: information important for assessing the potential of populations to adapt to global change. Here, we use a set of 360 inbred genotypes of the wild coyote tobacco Nicotiana attenuata (wild accessions, recombinant inbred lines (RILs), and transgenic lines (TLs) with targeted changes to gene expression) to dissect genetic versus non-genetic influences on variation in leaf spectra across three experiments. We calculated leaf reflectance from hand-held field spectroradiometer measurements covering visible to short-wave infrared wavelengths of electromagnetic radiation (400–2500 nm), using a standard radiation source and backgrounds, resulting in a small and quantifiable measurement uncertainty. Plants were grown in more controlled (glasshouse) or more natural (field) environments, and leaves were measured both on- and off-plant, so the measurement set-up itself ranged from more to less controlled environmental conditions. Entire spectra varied across genotypes and environments. We found that the greatest variance in leaf reflectance was explained by between-experiment and non-genetic between-sample differences, with subtler and more specific variation distinguishing groups of genotypes. The visible spectral region was most variable, distinguishing experimental settings as well as groups of genotypes within experiments, whereas parts of the short-wave infrared may vary more specifically with genotype. Overall, more genetically variable plant populations also showed more varied leaf spectra. We highlight key considerations for the application of field spectroscopy to assess genetic variation in plant populations.
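
    A minimal sketch of the reflectance calculation implied above: radiance measured from a leaf is ratioed against radiance from a white reference measured under the same illumination, scaled by the panel's known reflectance. The function name, the synthetic spectra, and the 1 nm sampling are assumptions for illustration; they are not the study's processing chain.

```python
import numpy as np

def leaf_reflectance(leaf_radiance: np.ndarray,
                     reference_radiance: np.ndarray,
                     panel_reflectance: float = 0.99) -> np.ndarray:
    """Convert radiance spectra to reflectance by ratioing the leaf measurement
    against a white-reference measurement taken under the same illumination,
    scaled by the reference panel's known reflectance."""
    return panel_reflectance * leaf_radiance / reference_radiance

# Hypothetical spectra on the 400-2500 nm range at 1 nm sampling.
wavelengths = np.arange(400, 2501)
reference = np.full(wavelengths.shape, 120.0)                   # arbitrary radiance units
leaf = reference * (0.05 + 0.40 * np.exp(-((wavelengths - 850) / 500.0) ** 2))
spectrum = leaf_reflectance(leaf, reference)
print(wavelengths[spectrum.argmax()], spectrum.max())           # reflectance peaks at the simulated NIR feature
```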

    Risk of COVID-19 after natural infection or vaccination

    Background: While vaccines have established utility against COVID-19, phase 3 efficacy studies have generally not comprehensively evaluated protection provided by previous infection or hybrid immunity (previous infection plus vaccination). Individual patient data from US government-supported harmonized vaccine trials provide an unprecedented sample population to address this issue. We characterized the protective efficacy of previous SARS-CoV-2 infection and hybrid immunity against COVID-19 early in the pandemic over three- to six-month follow-up and compared it with vaccine-associated protection.
    Methods: In this post-hoc cross-protocol analysis of the Moderna, AstraZeneca, Janssen, and Novavax COVID-19 vaccine clinical trials, we allocated participants into four groups based on previous-infection status at enrolment and treatment: no previous infection/placebo; previous infection/placebo; no previous infection/vaccine; and previous infection/vaccine. The main outcome was RT-PCR-confirmed COVID-19 >7–15 days (per original protocols) after final study injection. We calculated crude and adjusted efficacy measures.
    Findings: Previous infection/placebo participants had a 92% decreased risk of future COVID-19 compared to no previous infection/placebo participants (overall hazard ratio [HR]: 0.08; 95% CI: 0.05–0.13). Among single-dose Janssen participants, hybrid immunity conferred greater protection than vaccine alone (HR: 0.03; 95% CI: 0.01–0.10). Too few infections were observed to draw statistical inferences comparing hybrid immunity to vaccine alone for other trials. Vaccination, previous infection, and hybrid immunity all provided near-complete protection against severe disease.
    Interpretation: Previous infection, any hybrid immunity, and two-dose vaccination all provided substantial protection against symptomatic and severe COVID-19 through the early Delta period. Thus, as a surrogate for natural infection, vaccination remains the safest approach to protection.
    Funding: National Institutes of Health.
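
    To make the reported comparisons concrete, the sketch below fits a Cox proportional hazards model (via the third-party lifelines package) to simulated person-level data for the four groups, with no previous infection/placebo as the reference; the estimated hazard ratios play the role of the crude efficacy measures. The group labels, event rates, and follow-up window are hypothetical and are not the pooled trial data or the paper's adjusted analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # third-party survival-analysis package

# Hypothetical person-level records for the four groups; none of this is the
# pooled trial data, and the rates below are invented for illustration only.
rng = np.random.default_rng(2)
n = 4_000
groups = ["noinf_placebo", "inf_placebo", "noinf_vaccine", "inf_vaccine"]
daily_rate = {"noinf_placebo": 0.0010, "inf_placebo": 0.0001,
              "noinf_vaccine": 0.0002, "inf_vaccine": 0.00005}
group = rng.choice(groups, size=n)
time_to_covid = rng.exponential(1.0 / np.array([daily_rate[g] for g in group]))
follow_up_days = 180.0                                   # roughly six months
df = pd.DataFrame({
    "time": np.minimum(time_to_covid, follow_up_days),
    "event": (time_to_covid <= follow_up_days).astype(int),
})
for g in groups[1:]:                                     # noinf_placebo is the reference
    df[g] = (group == g).astype(int)

# Crude hazard ratios for COVID-19 relative to no previous infection/placebo.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)
```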

    Factors influencing terrestriality in primates of the Americas and Madagascar

    Among mammals, the order Primates is exceptional in having a high taxonomic richness in which the taxa are arboreal, semiterrestrial, or terrestrial. Although habitual terrestriality is pervasive among the apes and African and Asian monkeys (catarrhines), it is largely absent among monkeys of the Americas (platyrrhines), as well as galagos, lemurs, and lorises (strepsirrhines), which are mostly arboreal. Numerous ecological drivers and species-specific factors are suggested to set the conditions for an evolutionary shift from arboreality to terrestriality, and current environmental conditions may provide analogous scenarios to those transitional periods. Therefore, we investigated predominantly arboreal, diurnal primate genera from the Americas and Madagascar that lack fully terrestrial taxa, to determine whether ecological drivers (habitat canopy cover, predation risk, maximum temperature, precipitation, primate species richness, human population density, and distance to roads) or species-specific traits (body mass, group size, and degree of frugivory) associate with increased terrestriality. We collated 150,961 observation hours across 2,227 months from 47 species at 20 sites in Madagascar and 48 sites in the Americas. Multiple factors were associated with ground use in these otherwise arboreal species, including increased temperature, a decrease in canopy cover, a dietary shift away from frugivory, and larger group size. These factors mostly explain intraspecific differences in terrestriality. As humanity modifies habitats and causes climate change, our results suggest that species already inhabiting hot, sparsely canopied sites, and exhibiting more generalized diets, are more likely to shift toward greater ground use.
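
    As an illustration of relating ground use to the drivers listed above, the sketch below fits an ordinary logistic regression of a binary "observed on the ground" outcome on maximum temperature, canopy cover, degree of frugivory, and group size using statsmodels. The simulated records and effect sizes are hypothetical, and the published analysis is more elaborate (e.g., accounting for species and site structure), so treat this purely as a sketch.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical observation records (one row per scan of a focal group); the
# predictors mirror those named in the abstract, but the data are invented.
rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "max_temperature_c": rng.normal(30, 4, n),
    "canopy_cover_pct": rng.uniform(20, 95, n),
    "frugivory_pct": rng.uniform(10, 90, n),
    "group_size": rng.integers(2, 40, n),
})
linpred = (-2.5
           + 0.10 * (df["max_temperature_c"] - 30)
           - 0.03 * (df["canopy_cover_pct"] - 60)
           - 0.02 * (df["frugivory_pct"] - 50)
           + 0.04 * df["group_size"])
df["on_ground"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(df[["max_temperature_c", "canopy_cover_pct",
                        "frugivory_pct", "group_size"]])
fit = sm.GLM(df["on_ground"], X, family=sm.families.Binomial()).fit()
print(fit.summary())   # coefficient signs should roughly echo the simulated effects
```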

    Tidewater-glacier response to supraglacial lake drainage

    The flow speed of the Greenland Ice Sheet changes dramatically in inland regions when surface meltwater drains to the bed, but ice-sheet discharge to the ocean is dominated by fast-flowing outlet glaciers, where the effect of increasing surface melt on annual discharge is unknown. Observations of a supraglacial lake drainage at Helheim Glacier, and a consequent velocity pulse propagating down-glacier, provide a natural experiment for assessing the impact of changes in injected meltwater, and allow us to interrogate the subglacial hydrological system. We find a highly efficient subglacial drainage system, such that summertime lake drainage has little net effect on ice discharge. Our results question the validity of common remote-sensing approaches for inferring subglacial conditions, knowledge of which is needed for improved projections of sea-level rise.