207 research outputs found

    A Gap Analysis Methodology for Collecting Crop Genepools: A Case Study with Phaseolus Beans

    Background: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land-use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. Methodology/Principal Findings: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to their absence from, or under-representation in, genebanks; 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap "hotspots", representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. Conclusions/Significance: Results of the gap analysis method largely align with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including predictive threat factors in the analysis, such as climate change or habitat destruction, or by adding further prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources.
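
    The abstract does not spell out how the three gap scores are combined or which thresholds define the four priority classes, so the sketch below is a hypothetical illustration only: it assumes each gap is already scaled to 0-10 and averages the scores into a collecting-priority category. The function name, weights, and cut-offs are invented for illustration.

        # Hypothetical sketch of a gap-analysis prioritization step; the actual
        # scoring, weighting, and thresholds used in the paper are not given in
        # the abstract, so everything here is an assumption.

        def priority_category(sampling_gap, geographic_gap, environmental_gap):
            """Average three gap scores (0 = no gap, 10 = maximal gap) and map
            the result to a collecting-priority category."""
            final_score = (sampling_gap + geographic_gap + environmental_gap) / 3.0
            if final_score >= 7:
                return "high priority"
            if final_score >= 5:
                return "medium priority"
            if final_score >= 3:
                return "low priority"
            return "adequately represented"

        # Example: a narrowly endemic wild Phaseolus taxon with almost no
        # genebank accessions would land in the high-priority class.
        print(priority_category(sampling_gap=9.0, geographic_gap=8.0, environmental_gap=6.0))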

    Decoding accuracy in supplementary motor cortex correlates with perceptual sensitivity to tactile roughness

    Perceptual sensitivity to tactile roughness varies across individuals for the same degree of roughness. A number of neurophysiological studies have investigated the neural substrates of tactile roughness perception, but the neural processing underlying the strong individual differences in perceptual roughness sensitivity remains unknown. In this study, we explored the human brain activation patterns associated with the behavioral discriminability of surface texture roughness using functional magnetic resonance imaging (fMRI). First, a whole-brain searchlight multi-voxel pattern analysis (MVPA) was used to find brain regions from which we could decode roughness information. The searchlight MVPA revealed four brain regions showing significant decoding results: the supplementary motor area (SMA), the contralateral postcentral gyrus (S1), and the superior portion of the bilateral temporal pole (STP). Next, we evaluated each individual's behavioral roughness discrimination sensitivity using the just-noticeable difference (JND) and correlated it with the decoding accuracy in each of the four regions. We found that only the SMA showed a significant correlation between neuronal decoding accuracy and JND across individuals: participants with a smaller JND (i.e., better discrimination ability) exhibited higher decoding accuracy from their voxel response patterns in the SMA. Our findings suggest that multivariate voxel response patterns in the SMA represent individual perceptual sensitivity to tactile roughness, and that people with greater perceptual sensitivity to tactile roughness are likely to have more distinct neural representations of different roughness levels in their SMA. © 2015 Kim et al.
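
    As a minimal sketch of the across-participant analysis described above, the snippet below correlates per-participant decoding accuracies with JNDs using a Pearson correlation. The numbers are synthetic stand-ins, not the study's data; in the paper the accuracies come from searchlight MVPA in the SMA and the JNDs from a roughness-discrimination task.

        import numpy as np
        from scipy.stats import pearsonr

        # Synthetic stand-in data: one decoding accuracy and one JND per participant.
        rng = np.random.default_rng(0)
        jnd = rng.uniform(0.1, 0.6, size=20)                  # smaller JND = better discrimination
        decoding_accuracy = 0.75 - 0.4 * jnd + rng.normal(0, 0.05, size=20)

        # Across-participant correlation: a negative r means that participants with
        # smaller JNDs (higher perceptual sensitivity) show higher decoding accuracy.
        r, p = pearsonr(jnd, decoding_accuracy)
        print(f"r = {r:.2f}, p = {p:.3f}")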

    Why Are Outcomes Different for Registry Patients Enrolled Prospectively and Retrospectively? Insights from the Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF).

    Background: Retrospective and prospective observational studies are designed to reflect real-world evidence on clinical practice, but can yield conflicting results. The GARFIELD-AF Registry includes both methods of enrolment and allows analysis of the differences in patient characteristics and outcomes that may result. Methods and Results: Patients with atrial fibrillation (AF) and ≥1 risk factor for stroke at diagnosis of AF were recruited either retrospectively (n = 5069) or prospectively (n = 5501) from 19 countries and then followed prospectively. The retrospectively enrolled cohort comprised patients with established AF (for at least 6, and up to 24, months before enrolment), who were identified retrospectively (baseline and partial follow-up data were collected from their electronic medical records) and then followed prospectively for 0-18 months (such that the total follow-up time was 24 months; data collection between Dec-2009 and Oct-2010). In the prospectively enrolled cohort, patients with newly diagnosed AF (≤6 weeks after diagnosis) were recruited between Mar-2010 and Oct-2011 and were followed for 24 months after enrolment. Differences between the cohorts were observed in clinical characteristics, including type of AF, stroke prevention strategies, and event rates. More patients in the retrospectively identified cohort received vitamin K antagonists (62.1% vs. 53.2%) and fewer received non-vitamin K oral anticoagulants (1.8% vs. 4.2%). All-cause mortality rates per 100 person-years during the prospective follow-up (from the first study visit up to 1 year) were significantly lower in the retrospectively than in the prospectively identified cohort (3.04 [95% CI 2.51 to 3.67] vs. 4.05 [95% CI 3.53 to 4.63]; p = 0.016). Conclusions: Interpretations of data from registries that aim to evaluate the characteristics and outcomes of patients with AF must take account of differences in registry design and the impact of the recall bias and survivorship bias that are incurred with retrospective enrolment. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF: NCT01090362.
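
    For readers unfamiliar with the rate metric used above, the following sketch shows one standard way to compute an all-cause mortality rate per 100 person-years with an approximate 95% confidence interval (log-normal approximation). The event and person-year counts are illustrative assumptions chosen only to land near the reported 3.04 per 100 person-years; the registry's raw counts are not given in the abstract.

        import math

        def rate_per_100_person_years(events, person_years):
            """Incidence rate per 100 person-years with an approximate 95% CI
            (log-normal approximation, SE of the log rate = 1/sqrt(events))."""
            rate = 100.0 * events / person_years
            se_log = 1.0 / math.sqrt(events)
            lower = rate * math.exp(-1.96 * se_log)
            upper = rate * math.exp(1.96 * se_log)
            return rate, lower, upper

        # Illustrative only: ~150 deaths over ~4940 person-years gives a rate
        # close to the 3.04 per 100 person-years reported for the retrospective cohort.
        print(rate_per_100_person_years(events=150, person_years=4940))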

    Elective Cancer Surgery in COVID-19-Free Surgical Pathways During the SARS-CoV-2 Pandemic: An International, Multicenter, Comparative Cohort Study.

    PURPOSE: As cancer surgery restarts after the first COVID-19 wave, health care providers urgently require data to determine where elective surgery is best performed. This study aimed to determine whether COVID-19-free surgical pathways were associated with lower postoperative pulmonary complication rates compared with hospitals with no defined pathway. PATIENTS AND METHODS: This international, multicenter cohort study included patients who underwent elective surgery for 10 solid cancer types without preoperative suspicion of SARS-CoV-2. Participating hospitals included patients from local emergence of SARS-CoV-2 until April 19, 2020. At the time of surgery, hospitals were defined as having a COVID-19-free surgical pathway (complete segregation of the operating theater, critical care, and inpatient ward areas) or no defined pathway (incomplete or no segregation, areas shared with patients with COVID-19). The primary outcome was 30-day postoperative pulmonary complications (pneumonia, acute respiratory distress syndrome, unexpected ventilation). RESULTS: Of 9,171 patients from 447 hospitals in 55 countries, 2,481 were operated on in COVID-19-free surgical pathways. Patients who underwent surgery within COVID-19-free surgical pathways were younger with fewer comorbidities than those in hospitals with no defined pathway but with similar proportions of major surgery. After adjustment, pulmonary complication rates were lower with COVID-19-free surgical pathways (2.2% v 4.9%; adjusted odds ratio [aOR], 0.62; 95% CI, 0.44 to 0.86). This was consistent in sensitivity analyses for low-risk patients (American Society of Anesthesiologists grade 1/2), propensity score-matched models, and patients with negative SARS-CoV-2 preoperative tests. The postoperative SARS-CoV-2 infection rate was also lower in COVID-19-free surgical pathways (2.1% v 3.6%; aOR, 0.53; 95% CI, 0.36 to 0.76). CONCLUSION: Within available resources, dedicated COVID-19-free surgical pathways should be established to provide safe elective cancer surgery during current and before future SARS-CoV-2 outbreaks.
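
    The adjusted odds ratio quoted above comes from a multivariable model; as a hedged illustration of the general technique (not the study's actual model or covariate set), the sketch below fits a logistic regression on synthetic data and reads off the odds ratio for the pathway indicator with its 95% confidence interval. The covariates, coefficients, and data are assumptions.

        import numpy as np
        import statsmodels.api as sm

        # Synthetic data: exposure (COVID-19-free pathway), two illustrative
        # confounders, and a simulated pulmonary-complication outcome.
        rng = np.random.default_rng(1)
        n = 5000
        pathway = rng.integers(0, 2, n)            # 1 = COVID-19-free surgical pathway
        age = rng.normal(65, 10, n)
        major_surgery = rng.integers(0, 2, n)
        logit = -3.0 - 0.6 * pathway + 0.03 * (age - 65) + 0.5 * major_surgery
        complication = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        # Logistic regression; exponentiated coefficients are (adjusted) odds ratios.
        X = sm.add_constant(np.column_stack([pathway, age, major_surgery]))
        fit = sm.Logit(complication, X).fit(disp=0)
        aor = np.exp(fit.params[1])                # OR for the pathway indicator
        lo, hi = np.exp(fit.conf_int()[1])         # its 95% confidence interval
        print(f"aOR = {aor:.2f}, 95% CI {lo:.2f} to {hi:.2f}")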

    Global overview of the management of acute cholecystitis during the COVID-19 pandemic (CHOLECOVID study)

    Background: This study provides a global overview of the management of patients with acute cholecystitis during the initial phase of the COVID-19 pandemic. Methods: CHOLECOVID is an international, multicentre, observational comparative study of patients admitted to hospital with acute cholecystitis during the COVID-19 pandemic. Data on management were collected for a 2-month study interval coincident with the WHO declaration of the SARS-CoV-2 pandemic and compared with an equivalent pre-pandemic time interval. Mediation analysis examined the influence of SARS-CoV-2 infection on 30-day mortality. Results: This study collected data on 9783 patients with acute cholecystitis admitted to 247 hospitals across the world. The pandemic was associated with reduced availability of surgical workforce and operating facilities globally, a significant shift towards greater severity of disease, and increased use of conservative management. There was a reduction (both absolute and proportionate) in the number of patients undergoing cholecystectomy, from 3095 patients (56.2 per cent) pre-pandemic to 1998 patients (46.2 per cent) during the pandemic, but there was no difference in 30-day all-cause mortality after cholecystectomy between the pre-pandemic and pandemic intervals (13 patients (0.4 per cent) pre-pandemic vs 13 patients (0.6 per cent) during the pandemic; P = 0.355). In mediation analysis, admission with acute cholecystitis during the pandemic was associated with a non-significant increase in the risk of death (OR 1.29, 95 per cent c.i. 0.93 to 1.79, P = 0.121). Conclusion: CHOLECOVID provides a unique overview of the treatment of patients with cholecystitis across the globe during the first months of the SARS-CoV-2 pandemic. The study highlights the need for system resilience in the retention of elective surgical activity. Cholecystectomy was associated with a low risk of mortality, whereas deferral of treatment resulted in an increase in avoidable morbidity that represents the non-COVID cost of this pandemic.
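
    As a quick check on the mortality comparison quoted above, the counts given in the abstract (13 deaths among 3095 cholecystectomies pre-pandemic vs 13 among 1998 during the pandemic) can be compared with a 2x2 test. The abstract does not state which test produced P = 0.355, so the chi-squared test with continuity correction below is just one plausible choice.

        from scipy.stats import chi2_contingency

        # 2x2 table built from the counts quoted in the abstract:
        # rows = pre-pandemic / pandemic, columns = died / survived within 30 days.
        table = [[13, 3095 - 13],
                 [13, 1998 - 13]]
        chi2, p, dof, expected = chi2_contingency(table)   # continuity correction applied by default for 2x2
        print(f"chi-squared = {chi2:.2f}, p = {p:.3f}")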

    Familial hypercholesterolaemia in children and adolescents from 48 countries: a cross-sectional study

    Background: Approximately 450 000 children are born with familial hypercholesterolaemia worldwide every year, yet only 2·1% of adults with familial hypercholesterolaemia were diagnosed before age 18 years via current diagnostic approaches, which are derived from observations in adults. We aimed to characterise children and adolescents with heterozygous familial hypercholesterolaemia (HeFH) and understand current approaches to the identification and management of familial hypercholesterolaemia to inform future public health strategies. Methods: For this cross-sectional study, we assessed children and adolescents younger than 18 years with a clinical or genetic diagnosis of HeFH at the time of entry into the Familial Hypercholesterolaemia Studies Collaboration (FHSC) registry between Oct 1, 2015, and Jan 31, 2021. Data in the registry were collected from 55 regional or national registries in 48 countries. Diagnoses relying on self-reported history of familial hypercholesterolaemia and suspected secondary hypercholesterolaemia were excluded from the registry; people with untreated LDL cholesterol (LDL-C) of at least 13·0 mmol/L were excluded from this study. Data were assessed overall and by WHO region, World Bank country income status, age, diagnostic criteria, and index-case status. The main objective of this study was to assess the current identification and management of children and adolescents with familial hypercholesterolaemia. Findings: Of 63 093 individuals in the FHSC registry, 11 848 (18·8%) were children or adolescents younger than 18 years with HeFH and were included in this study; 5756 (50·2%) of 11 476 included individuals were female and 5720 (49·8%) were male. Sex data were missing for 372 (3·1%) of 11 848 individuals. Median age at registry entry was 9·6 years (IQR 5·8-13·2). 10 099 (89·9%) of 11 235 included individuals had a final genetically confirmed diagnosis of familial hypercholesterolaemia and 1136 (10·1%) had a clinical diagnosis. Genetically confirmed diagnosis data or clinical diagnosis data were missing for 613 (5·2%) of 11 848 individuals. Genetic diagnosis was more common in children and adolescents from high-income countries (9427 [92·4%] of 10 202) than in children and adolescents from non-high-income countries (199 [48·0%] of 415). 3414 (31·6%) of 10 804 children or adolescents were index cases. Familial-hypercholesterolaemia-related physical signs, cardiovascular risk factors, and cardiovascular disease were uncommon, but were more common in non-high-income countries. 7557 (72·4%) of 10 428 included children or adolescents were not taking lipid-lowering medication (LLM) and had a median LDL-C of 5·00 mmol/L (IQR 4·05-6·08). Compared with genetic diagnosis, the use of unadapted clinical criteria intended for use in adults and reliant on more extreme phenotypes could result in 50-75% of children and adolescents with familial hypercholesterolaemia not being identified. Interpretation: Clinical characteristics observed in adults with familial hypercholesterolaemia are uncommon in children and adolescents with familial hypercholesterolaemia, hence detection in this age group relies on measurement of LDL-C and genetic confirmation. Where genetic testing is unavailable, increased availability and use of LDL-C measurements in the first few years of life could help reduce the current gap between prevalence and detection, enabling increased use of combination LLM to reach recommended LDL-C targets early in life.

    The DUNE Far Detector Interim Design Report, Volume 3: Dual-Phase Module

    The DUNE IDR describes the proposed physics program and technical designs of the DUNE far detector modules in preparation for the full TDR to be published in 2019. It is intended as an intermediate milestone on the path to a full TDR, justifying the technical choices that flow down from the high-level physics goals through requirements at all levels of the Project. These design choices will enable the DUNE experiment to make the ground-breaking discoveries that will help to answer fundamental physics questions. Volume 3 describes the dual-phase module's subsystems, the technical coordination required for its design, construction, installation, and integration, and its organizational structure.

    Improved risk stratification of patients with atrial fibrillation: an integrated GARFIELD-AF tool for the prediction of mortality, stroke and bleed in patients with and without anticoagulation.

    OBJECTIVES: To provide an accurate, web-based tool for stratifying patients with atrial fibrillation to facilitate decisions on the potential benefits/risks of anticoagulation, based on mortality, stroke and bleeding risks. DESIGN: The new tool was developed using stepwise regression for all patients and then applied to lower-risk patients. C-statistics were compared with CHA2DS2-VASc using 30-fold cross-validation to control for overfitting. External validation was undertaken in an independent dataset, the Outcome Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). PARTICIPANTS: Data from 39 898 patients enrolled in the prospective GARFIELD-AF registry provided the basis for deriving and validating an integrated risk tool to predict stroke risk, mortality and bleeding risk. RESULTS: The discriminatory value of the GARFIELD-AF risk model was superior to CHA2DS2-VASc for patients with or without anticoagulation. C-statistics (95% CI) for all-cause mortality, ischaemic stroke/systemic embolism and haemorrhagic stroke/major bleeding (treated patients) were 0.77 (0.76 to 0.78), 0.69 (0.67 to 0.71) and 0.66 (0.62 to 0.69), respectively, for the GARFIELD-AF risk models, and 0.66 (0.64 to 0.67), 0.64 (0.61 to 0.66) and 0.64 (0.61 to 0.68), respectively, for CHA2DS2-VASc (or HAS-BLED for bleeding). In very low to low risk patients (CHA2DS2-VASc 0 or 1 in men and 1 or 2 in women), the CHA2DS2-VASc and HAS-BLED (for bleeding) scores offered weak discriminatory value for mortality, stroke/systemic embolism and major bleeding. C-statistics for the GARFIELD-AF risk tool were 0.69 (0.64 to 0.75), 0.65 (0.56 to 0.73) and 0.60 (0.47 to 0.73) for each end point, respectively, versus 0.50 (0.45 to 0.55), 0.59 (0.50 to 0.67) and 0.55 (0.53 to 0.56) for CHA2DS2-VASc (or HAS-BLED for bleeding). Upon validation in the ORBIT-AF population, C-statistics showed that the GARFIELD-AF risk tool was effective for predicting 1-year all-cause mortality with both the full and the simplified model (C-statistics 0.75 (0.73 to 0.77) and 0.75 (0.73 to 0.77), respectively) and for predicting any stroke or systemic embolism over 1 year (C-statistic 0.68 (0.62 to 0.74)). CONCLUSIONS: Performance of the GARFIELD-AF risk tool was superior to CHA2DS2-VASc in predicting stroke and mortality and superior to HAS-BLED for bleeding, overall and in lower-risk patients. The GARFIELD-AF tool has the potential for incorporation into routine electronic systems and, for the first time, permits simultaneous evaluation of ischaemic stroke, mortality and bleeding risks. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifiers: GARFIELD-AF (NCT01090362) and ORBIT-AF (NCT01165710).
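
    As a hedged sketch of the cross-validation step described above (30-fold cross-validation of a C-statistic), the snippet below estimates a cross-validated area under the ROC curve with scikit-learn. The data and the plain logistic model are synthetic stand-ins, not the GARFIELD-AF risk equations.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # Synthetic cohort: five illustrative predictors and a binary 1-year outcome.
        rng = np.random.default_rng(2)
        n = 3000
        X = rng.normal(size=(n, 5))
        logit = -2.5 + X @ np.array([0.8, 0.5, 0.3, 0.0, 0.0])
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        # The C-statistic equals the ROC AUC for a binary outcome; cv=30 mirrors
        # the 30-fold cross-validation used to control for overfitting.
        c_stats = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                                  cv=30, scoring="roc_auc")
        print(f"cross-validated C-statistic = {c_stats.mean():.2f} (SD {c_stats.std():.2f})")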