12 research outputs found

    Quantitative Evidence for Revising the Definition of Primary Graft Dysfunction after Lung Transplant

    Get PDF
    RATIONALE: Primary graft dysfunction (PGD) is a form of acute lung injury that occurs after lung transplantation. The definition of PGD was standardized in 2005. Since that time, clinical practice has evolved, and this definition is increasingly used as a primary endpoint for clinical trials; therefore, validation is warranted.
    OBJECTIVES: We sought to determine whether refinements to the 2005 consensus definition could further improve construct validity.
    METHODS: Data from the Lung Transplant Outcomes Group multicenter cohort were used to compare variations on the PGD definition, including alternate oxygenation thresholds, inclusion of additional severity groups, and effects of procedure type and mechanical ventilation. Convergent and divergent validity were compared for mortality prediction and concurrent lung injury biomarker discrimination.
    MEASUREMENTS AND MAIN RESULTS: A total of 1,179 subjects from 10 centers were enrolled from 2007 to 2012. Median length of follow-up was 4 years (interquartile range = 2.4-5.9). No mortality differences were noted between no PGD (grade 0) and mild PGD (grade 1). Significantly better mortality discrimination was evident for all definitions using later time points (48, 72, or 48-72 hours; P < 0.001). Biomarker divergent discrimination was superior when collapsing grades 0 and 1. Additional severity grades, use of mechanical ventilation, and transplant procedure type had minimal or no effect on mortality or biomarker discrimination.
    CONCLUSIONS: The PGD consensus definition can be simplified by combining lower PGD grades. Construct validity of grading was present regardless of transplant procedure type or use of mechanical ventilation. Additional severity categories had minimal impact on mortality or biomarker discrimination.
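    The consensus grading that this study revisits is based on the PaO2/FiO2 (P/F) ratio and the presence of radiographic infiltrates. The Python sketch below is an illustrative rendering of that threshold logic, using the commonly cited cut-offs of 300 and 200 mmHg; the exact thresholds and edge-case handling are assumptions for illustration, not the study's protocol, and the second function simply mirrors the conclusion that grades 0 and 1 can be combined.

        def pgd_grade(pf_ratio: float, infiltrates: bool) -> int:
            """Illustrative PGD grade from P/F ratio and chest-radiograph findings.

            Uses commonly cited 2005-style cut-offs (>300, 200-300, <200 mmHg);
            thresholds and edge cases here are assumptions for illustration only.
            """
            if not infiltrates:
                return 0          # no infiltrates: grade 0 regardless of oxygenation
            if pf_ratio > 300:
                return 1          # infiltrates with preserved oxygenation
            if pf_ratio >= 200:
                return 2
            return 3              # severe hypoxaemia with infiltrates

        def simplified_grade(pf_ratio: float, infiltrates: bool) -> int:
            """Variant suggested by the conclusions: collapse grades 0 and 1."""
            grade = pgd_grade(pf_ratio, infiltrates)
            return 0 if grade <= 1 else grade

        # Example: P/F of 250 with infiltrates is grade 2 under either scheme,
        # while P/F of 350 with infiltrates drops from grade 1 to grade 0.
        assert simplified_grade(250, True) == 2 and simplified_grade(350, True) == 0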

    Pervasive gaps in Amazonian ecological research

    Get PDF
    Biodiversity loss is one of the main challenges of our time [1,2], and attempts to address it require a clear understanding of how ecological communities respond to environmental change across time and space [3,4]. While the increasing availability of global databases on ecological communities has advanced our knowledge of biodiversity sensitivity to environmental changes [5,6,7], vast areas of the tropics remain understudied [8,9,10,11]. In the American tropics, Amazonia stands out as the world's most diverse rainforest and the primary source of Neotropical biodiversity [12], but it remains among the least known forests in America and is often underrepresented in biodiversity databases [13,14,15]. To worsen this situation, human-induced modifications [16,17] may eliminate pieces of the Amazon's biodiversity puzzle before we can use them to understand how ecological communities are responding. To increase generalization and applicability of biodiversity knowledge [18,19], it is thus crucial to reduce biases in ecological research, particularly in regions projected to face the most pronounced environmental changes. We integrate ecological community metadata of 7,694 sampling sites for multiple organism groups in a machine learning model framework to map the research probability across the Brazilian Amazonia, while identifying the region's vulnerability to environmental change. 15%–18% of the most neglected areas in ecological research are expected to experience severe climate or land use changes by 2050. This means that unless we take immediate action, we will not be able to establish their current status, much less monitor how it is changing and what is being lost.
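    The mapping step described above, in which a machine-learning model turns sampling-site metadata into a research-probability surface, can be pictured as a presence/background classification over environmental and accessibility covariates. The Python snippet below is a minimal sketch of that kind of workflow under assumed inputs; the file name, column names, covariates and model choice are hypothetical and do not reproduce the authors' actual pipeline.

        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier

        # Hypothetical input: one row per grid cell of Brazilian Amazonia, with
        # environmental/accessibility covariates and a flag for whether any of the
        # 7,694 sampling sites falls inside the cell (all names are assumptions).
        grid = pd.read_csv("amazonia_grid.csv")
        covariates = ["dist_to_river", "dist_to_road", "elevation", "precipitation"]

        X = grid[covariates]
        y = grid["has_sampling_site"]          # 1 if the cell contains a site, else 0

        # Fit a classifier and use its predicted probability as a crude
        # "research probability" surface for every cell.
        model = RandomForestClassifier(n_estimators=500, random_state=0)
        model.fit(X, y)
        grid["research_probability"] = model.predict_proba(X)[:, 1]

        # Cells with low research probability but high projected climate or land-use
        # change would be the "neglected but vulnerable" areas discussed above.
        threshold = grid["research_probability"].quantile(0.25)
        neglected = grid[grid["research_probability"] < threshold]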

    A reference genome assembly and adaptive trait analysis of Castanea mollissima ‘Vanuxem,’ a source of resistance to chestnut blight in restoration breeding

    No full text
    Forest tree species are increasingly subject to severe mortalities from exotic pests, pathogens, and invasive organisms, accelerated by climate change. Such forest health issues are threatening multiple species and ecosystem sustainability globally. One of the most extreme examples of forest ecosystem disruption is the extirpation of the American chestnut (Castanea dentata) caused by the introduction of chestnut blight and root rot pathogens from Asia. Asian species of chestnut are being employed as donors of disease resistance genes to restore native chestnut species in North America and Europe. To aid in the restoration of threatened chestnut species, we present the assembly of a reference genome for Chinese chestnut (C. mollissima) "Vanuxem," one of the donors of disease resistance for American chestnut restoration. From the de novo assembly of the complete genome (725.2 Mb in 14,110 contigs), over half of the sequences have been anchored to the 12 genetic linkage groups. The anchoring is validated by genetic maps and in situ hybridization to chromosomes. We demonstrate the value of the genome as a platform for research and species restoration, including signatures of selection differentiating American chestnut from Chinese chestnut to identify important candidate genes for disease resistance, comparisons of genome organization with other woody species, and a genome-wide examination of progress in backcross breeding for blight resistance. This reference assembly should prove of great value in the understanding, improvement, and restoration of chestnut species.

    Delivering clinical trials at home: protocol, design and implementation of a direct-to-family paediatric lupus trial

    No full text
    Introduction: Direct-to-family clinical trials efficiently provide data while reducing the participation burden for children and their families. Although these trials can offer significant advantages over traditional clinical trials, the process of designing and implementing direct-to-family studies is poorly defined, especially in children with rheumatic disease. This paper provides lessons learnt from the design and implementation of a self-controlled, direct-to-family pilot trial aimed to evaluate the effects of a medication management device on adherence to hydroxychloroquine in paediatric SLE.
    Methods: Several design features accommodate a direct-to-family approach. Participants meeting eligibility criteria from across the USA were identified a priori through a disease registry, and all outcome data are collected remotely. The primary outcome (medication adherence) is evaluated using electronic medication event-monitoring, plasma drug levels, patient questionnaires and pill counts. Secondary and exploratory endpoints include (1) lupus disease activity measured by a remote SLE Disease Activity Index examination and the Systemic Lupus Activity Questionnaire; and (2) hydroxychloroquine pharmacokinetics and pharmacodynamics. Recruitment of the initial target of 20 participants was achieved within 10 days. Due to initial recruitment success, enrolment was increased to 26 participants. Additional participants who were interested were placed on a waiting list in case of dropouts during the study.
    Discussion and dissemination: Direct-to-family trials offer several advantages but present unique challenges. Lessons learnt from the protocol development, design, and implementation of this trial will inform future direct-to-family trials for children and adults with rheumatic diseases. Additionally, the data collected remotely in this trial will provide critical information regarding the accuracy of teleresearch in lupus, the impact of adherence to hydroxychloroquine on disease activity and a pharmacokinetic analysis to inform paediatric-specific dosing of hydroxychloroquine.
    Trial registration number: ClinicalTrials.gov Registry (NCT04358302)

    First low-frequency Einstein@Home all-sky search for continuous gravitational waves in Advanced LIGO data

    No full text
    We report results of a deep all-sky search for periodic gravitational waves from isolated neutron stars in data from the first Advanced LIGO observing run. This search investigates the low frequency range of Advanced LIGO data, between 20 and 100 Hz, much of which was not explored in initial LIGO. The search was made possible by the computing power provided by the volunteers of the Einstein@Home project. We find no significant signal candidate and set the most stringent upper limits to date on the amplitude of gravitational wave signals from the target population, corresponding to a sensitivity depth of 48.7 [1/√Hz]. At the frequency of best strain sensitivity, near 100 Hz, we set 90% confidence upper limits of 1.8×10⁻²⁵. At the low end of our frequency range, 20 Hz, we achieve upper limits of 3.9×10⁻²⁴. At 55 Hz we can exclude sources with ellipticities greater than 10⁻⁵ within 100 pc of Earth, with a fiducial value of the principal moment of inertia of 10³⁸ kg m².
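    For context on how a strain upper limit translates into the quoted ellipticity exclusion, the standard relation for the gravitational-wave amplitude of a steadily rotating triaxial neutron star (a textbook expression from the continuous-wave literature, not stated in this abstract) gives, with the quoted fiducial numbers inserted:

        h_0 = \frac{4\pi^2 G}{c^4}\,\frac{I_{zz}\,\varepsilon\, f_{\mathrm{gw}}^{2}}{d}
            \approx 3\times10^{-25}
            \left(\frac{\varepsilon}{10^{-5}}\right)
            \left(\frac{I_{zz}}{10^{38}\,\mathrm{kg\,m^2}}\right)
            \left(\frac{f_{\mathrm{gw}}}{55\,\mathrm{Hz}}\right)^{2}
            \left(\frac{100\,\mathrm{pc}}{d}\right),

    so a source with ellipticity above 10⁻⁵ within 100 pc would produce h₀ of roughly 3×10⁻²⁵ or more at 55 Hz, comparable to or above the upper limits reached there, which is the sense in which such sources are excluded.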

    First narrow-band search for continuous gravitational waves from known pulsars in advanced detector data

    No full text
    Spinning neutron stars asymmetric with respect to their rotation axis are potential sources of continuous gravitational waves for ground-based interferometric detectors. In the case of known pulsars, a fully coherent search, based on matched filtering, which uses the position and rotational parameters obtained from electromagnetic observations, can be carried out. Matched filtering maximizes the signal-to-noise ratio (SNR), but a large sensitivity loss is expected in case of even a very small mismatch between the assumed and the true signal parameters. For this reason, narrow-band analysis methods have been developed, allowing a fully coherent search for gravitational waves from known pulsars over a fraction of a hertz and several spin-down values. In this paper we describe a narrow-band search of 11 pulsars using data from Advanced LIGO’s first observing run. Although we have found several initial outliers, further studies show no significant evidence for the presence of a gravitational wave signal. Finally, we have placed upper limits on the signal strain amplitude lower than the spin-down limit for 5 of the 11 targets over the bands searched; in the case of J1813-1749 the spin-down limit has been beaten for the first time. For an additional 3 targets, the median upper limit across the search bands is below the spin-down limit. This is the most sensitive narrow-band search for continuous gravitational waves carried out so far.
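    As a rough order-of-magnitude illustration (not taken from the paper) of why even a tiny frequency mismatch degrades a fully coherent search: over a coherent observation time T_obs, a frequency offset Δf accumulates a phase error of roughly 2πΔf·T_obs, so keeping that error below about one radian over months of data requires knowing the frequency to within a few times 10⁻⁸ Hz, which motivates coherently covering a small band (and a range of spin-down values) rather than a single template:

        \Delta\Phi \sim 2\pi\,\Delta f\,T_{\mathrm{obs}} \lesssim 1
        \;\;\Longrightarrow\;\;
        \Delta f \lesssim \frac{1}{2\pi T_{\mathrm{obs}}} \approx 2\times10^{-8}\,\mathrm{Hz}
        \quad \text{for } T_{\mathrm{obs}} \approx 10^{7}\,\mathrm{s}.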

    Search for intermediate mass black hole binaries in the first observing run of Advanced LIGO

    No full text
    During their first observational run, the two Advanced LIGO detectors attained an unprecedented sensitivity, resulting in the first direct detections of gravitational-wave signals produced by stellar-mass binary black hole systems. This paper reports on an all-sky search for gravitational waves (GWs) from merging intermediate mass black hole binaries (IMBHBs). The combined results from two independent search techniques were used in this study: the first employs a matched-filter algorithm that uses a bank of filters covering the GW signal parameter space, while the second is a generic search for GW transients (bursts). No GWs from IMBHBs were detected; therefore, we constrain the rate of several classes of IMBHB mergers. The most stringent limit is obtained for black holes of individual mass 100 M⊙, with spins aligned with the binary orbital angular momentum. For such systems, the merger rate is constrained to be less than 0.93 Gpc⁻³ yr⁻¹ in comoving units at the 90% confidence level, an improvement of nearly 2 orders of magnitude over previous upper limits.
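    As background on how a 90% rate limit follows from a non-detection (a standard zero-event Poisson argument, simplified relative to the full analysis): if no signals are observed over a sensitive space-time volume ⟨VT⟩, the rate excluded at 90% confidence is

        P(0 \mid R\langle VT\rangle) = e^{-R\langle VT\rangle} = 0.10
        \;\;\Longrightarrow\;\;
        R_{90\%} = \frac{-\ln(0.10)}{\langle VT\rangle} \approx \frac{2.3}{\langle VT\rangle},

    so the quoted 0.93 Gpc⁻³ yr⁻¹ would correspond, under this simplified counting, to a sensitive volume–time of roughly 2.5 Gpc³ yr for the 100 M⊙ + 100 M⊙ aligned-spin systems.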

    Model comparison from LIGO–Virgo data on GW170817’s binary components and consequences for the merger remnant

    No full text
    GW170817 is the very first observation of gravitational waves originating from the coalescence of two compact objects in the mass range of neutron stars, accompanied by electromagnetic counterparts, and offers an opportunity to directly probe the internal structure of neutron stars. We perform Bayesian model selection on a wide range of theoretical predictions for the neutron star equation of state. For the binary neutron star hypothesis, we find that we cannot rule out the majority of theoretical models considered. In addition, the gravitational-wave data alone does not rule out the possibility that one or both objects were low-mass black holes. We discuss the possible outcomes in the case of a binary neutron star merger, finding that all scenarios from prompt collapse to long-lived or even stable remnants are possible. For long-lived remnants, we place an upper limit of 1.9 kHz on the rotation rate. If a black hole was formed any time after merger and the coalescing stars were slowly rotating, then the maximum baryonic mass of non-rotating neutron stars is at most , and three equations of state considered here can be ruled out. We obtain a tighter limit of for the case that the merger results in a hypermassive neutron star.
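    For reference, Bayesian model selection of the kind described compares candidate equations of state through the ratio of their evidences; the expression below is the generic Bayes factor (a textbook definition, with no numbers specific to this work):

        \mathcal{B}_{12} = \frac{Z_{1}}{Z_{2}}
            = \frac{\int \mathcal{L}(d \mid \theta, \mathrm{EOS}_{1})\, \pi(\theta \mid \mathrm{EOS}_{1})\, d\theta}
                   {\int \mathcal{L}(d \mid \theta, \mathrm{EOS}_{2})\, \pi(\theta \mid \mathrm{EOS}_{2})\, d\theta},

    where d is the GW170817 strain data, θ the binary parameters, and an equation of state is disfavoured when its evidence is much smaller than that of the alternatives.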