VarSight: prioritizing clinically reported variants with binary classification algorithms.
Background: When applying genomic medicine to a rare disease patient, the primary goal is to identify one or more genomic variants that may explain the patient's phenotypes. Typically, this is done through annotation, filtering, and then prioritization of variants for manual curation. However, prioritization of variants in rare disease patients remains a challenging task due to the high degree of variability in phenotype presentation and molecular source of disease. Thus, methods that can identify and/or prioritize variants to be clinically reported in the presence of such variability are of critical importance.
Methods: We tested the application of classification algorithms that ingest variant annotations along with phenotype information for predicting whether a variant will ultimately be clinically reported and returned to a patient. To test the classifiers, we performed a retrospective study on variants that were clinically reported to 237 patients in the Undiagnosed Diseases Network.
Results: We treated the classifiers as variant prioritization systems and compared them to four variant prioritization algorithms and two single-measure controls. We showed that the trained classifiers outperformed all other tested methods, with the best classifiers ranking 72% of all reported variants and 94% of reported pathogenic variants in the top 20.
Conclusions: We demonstrated how freely available binary classification algorithms can be used to prioritize variants even in the presence of real-world variability. Furthermore, these classifiers outperformed all other tested methods, suggesting that they may be well suited for working with real rare disease patient datasets.
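The core idea of treating a binary classifier as a prioritization system can be sketched as follows. This is a minimal illustration, not the authors' VarSight code: the features, synthetic labels, and the choice of a random forest are assumptions, and training and ranking on the same data is done here only to keep the example short.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_variants = 500
# Illustrative variant annotations (e.g. allele frequency, conservation,
# phenotype-match score); real systems would use many more features.
X = rng.normal(size=(n_variants, 3))
# Synthetic "clinically reported" labels correlated with the features.
y = (X @ np.array([1.5, 1.0, 2.0]) + rng.normal(size=n_variants) > 2.5).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
scores = clf.predict_proba(X)[:, 1]   # predicted probability of being reported
ranking = np.argsort(-scores)         # highest-scoring variants first

# Fraction of reported variants ranked in the top 20 (cf. the 72%/94% figures).
top20_recall = y[ranking[:20]].sum() / max(y.sum(), 1)
```

Ranking by the classifier's predicted probability is what turns a binary classifier into a prioritization system: the curator reviews variants from the top of the list down.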
Predicting mental imagery based BCI performance from personality, cognitive profile and neurophysiological patterns
Mental-imagery based brain-computer interfaces (MI-BCIs) allow their users to send commands to a computer using their brain activity alone (typically measured by electroencephalography, EEG), which is processed while they perform specific mental tasks. While very promising, MI-BCIs remain barely used outside laboratories because of the difficulty users encounter in controlling them. Indeed, although some users obtain good control performance after training, a substantial proportion remains unable to reliably control an MI-BCI. This large variability in user performance has led the community to look for predictors of MI-BCI control ability. However, these predictors have only been explored for motor-imagery based BCIs, and mostly for a single training session per subject. In this study, 18 participants were instructed to learn to control an EEG-based MI-BCI by performing 3 MI tasks, 2 of which were non-motor tasks, across 6 training sessions on 6 different days. Relationships between the participants' BCI control performance and their personality, cognitive profile and neurophysiological markers were explored. While no relevant relationships with neurophysiological markers were found, strong correlations between MI-BCI performance and mental-rotation scores (reflecting spatial abilities) were revealed. A predictive model of MI-BCI performance based on psychometric questionnaire scores was also proposed. A leave-one-subject-out cross-validation process demonstrated the stability and reliability of this model: it predicted participants' performance with a mean error of less than 3 points. This study determined how users' profiles impact their MI-BCI control ability and thus paves the way for designing novel MI-BCI training protocols, adapted to the profile of each user.
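The leave-one-subject-out evaluation described above can be sketched as below. The data are synthetic stand-ins (the abstract does not publish the questionnaire scores), and a plain linear regression is an assumption about the model class; only the cross-validation protocol mirrors the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(1)
n_subjects = 18
# Illustrative psychometric scores (e.g. mental rotation) for each subject.
X = rng.normal(50, 10, size=(n_subjects, 2))
# Synthetic BCI performance (%) with a linear dependence on the scores.
y = 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 2, size=n_subjects) + 30

# Leave-one-subject-out: each subject is predicted by a model trained
# on the 17 others, so the error estimate reflects unseen subjects.
errors = []
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    errors.append(abs(model.predict(X[test_idx])[0] - y[test_idx][0]))
mean_error = float(np.mean(errors))
```

With only 18 subjects, leave-one-subject-out is the natural choice: it uses nearly all data for training while never letting a subject predict themselves.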
Using decision analysis to support proactive management of emerging infectious wildlife diseases
Despite calls for improved responses to emerging infectious diseases in wildlife, management is seldom considered until a disease has been detected in affected populations. Reactive approaches may limit the potential for control and increase total response costs. An alternative, proactive management framework can identify immediate actions that reduce future impacts even before a disease is detected, and plan subsequent actions that are conditional on disease emergence. We identify four main obstacles to developing proactive management strategies for the newly discovered salamander pathogen Batrachochytrium salamandrivorans (Bsal). Given that uncertainty is a hallmark of wildlife disease management and that associated decisions are often complicated by multiple competing objectives, we advocate using decision analysis to create and evaluate trade-offs between proactive (pre-emergence) and reactive (post-emergence) management options. Policy makers and natural resource agency personnel can apply principles from decision analysis to improve strategies for countering emerging infectious diseases
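The proactive-versus-reactive trade-off at the heart of this argument can be illustrated with a toy expected-cost calculation. All numbers below are assumptions for illustration, not values from the paper: the point is only that an upfront proactive cost can beat a cheaper-looking reactive strategy once the probability and cost of emergence are weighed.

```python
# Assumed probability that Bsal emerges in the managed population.
p_emergence = 0.3

# Each strategy: (upfront cost, additional cost if the disease emerges,
# additional cost if it does not). Units are arbitrary.
strategies = {
    "proactive": (10.0, 20.0, 0.0),  # pay now; emergence response is cheaper
    "reactive":  (0.0, 60.0, 0.0),   # pay nothing unless the disease appears
}

expected_cost = {
    name: upfront + p_emergence * if_emerges + (1 - p_emergence) * if_not
    for name, (upfront, if_emerges, if_not) in strategies.items()
}
best = min(expected_cost, key=expected_cost.get)  # strategy with lowest expected cost
```

Here the proactive strategy wins (expected cost 16 vs. 18) even though it is more expensive when the disease never emerges; decision analysis makes that trade-off explicit instead of leaving it implicit in a wait-and-see policy.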
Resilience trinity: safeguarding ecosystem functioning and services across three different time horizons and decision contexts
Ensuring ecosystem resilience is an intuitive approach to safeguard the functioning of ecosystems and hence the future provisioning of ecosystem services (ES). However, resilience is a multi-faceted concept that is difficult to operationalize. Focusing on resilience mechanisms, such as diversity, network architectures or adaptive capacity, has recently been suggested as a means to operationalize resilience. Still, the focus on mechanisms is not specific enough. We suggest a conceptual framework, resilience trinity, to facilitate management based on resilience mechanisms in three distinctive decision contexts and time horizons: i) reactive, when there is an imminent threat to ES resilience and a high pressure to act, ii) adjustive, when the threat is known in general but there is still time to adapt management, and iii) provident, when time horizons are very long and the nature of the threats is uncertain, leading to a low willingness to act. Resilience has different interpretations and implications at these different time horizons, which also prevail in different disciplines. Social ecology, ecology, and engineering often implicitly focus on provident, adjustive, or reactive resilience, respectively, but these different notions of resilience and their corresponding social, ecological, and economic trade-offs need to be reconciled. Otherwise, we keep risking unintended consequences of reactive actions, or shying away from provident action because of uncertainties that cannot be reduced. The suggested trinity of time horizons and their decision contexts could help ensure that longer-term management actions are not missed while urgent threats to ES are given priority.
Ultrafast Evolution and Loss of CRISPRs Following a Host Shift in a Novel Wildlife Pathogen, Mycoplasma gallisepticum
Measurable rates of genome evolution are well documented in human pathogens but are less well understood in bacterial pathogens in the wild, particularly during and after host switches. Mycoplasma gallisepticum (MG) is a pathogenic bacterium that has evolved predominantly in poultry and recently jumped to wild house finches (Carpodacus mexicanus), a common North American songbird. For the first time we characterize the genome and measure rates of genome evolution in House Finch isolates of MG, as well as in poultry outgroups. Using whole-genome sequences of 12 House Finch isolates across a 13-year serial sample and an additional four newly sequenced poultry strains, we estimate a nucleotide diversity in House Finch isolates of only ∼2% of that of ancestral poultry strains, and a nucleotide substitution rate of 0.8–1.2 × 10⁻⁵ per site per year both in poultry and in House Finches, an exceptionally fast rate rivaling some of the highest estimates reported thus far for bacteria. We also found high diversity and complete turnover of CRISPR arrays in poultry MG strains prior to the switch to the House Finch host, but after the invasion of House Finches there is progressive loss of CRISPR repeat diversity, and recruitment of novel CRISPR repeats ceases. Recent (2007) House Finch MG strains retain only ∼50% of the CRISPR repertoire of founding (1994–95) strains and have lost the CRISPR-associated genes required for CRISPR function. Our results suggest that genome evolution in bacterial pathogens of wild birds can be extremely rapid and in this case is accompanied by apparent functional loss of CRISPRs.
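A common way to estimate a substitution rate from a serial sample like the 13-year House Finch series is to regress sequence divergence against sampling date and read the rate off the slope (a root-to-tip style regression). The sketch below uses synthetic data at the abstract's reported order of magnitude; it is not the authors' pipeline, and the sampling years and noise level are assumptions.

```python
import numpy as np

# Assumed sampling years spanning the serial sample.
years = np.array([1994, 1995, 1998, 2001, 2004, 2007], dtype=float)

# Synthetic per-site divergence from the founding strain, accumulating at
# ~1e-5 substitutions per site per year (the order reported in the abstract).
true_rate = 1.0e-5
rng = np.random.default_rng(2)
divergence = true_rate * (years - years[0]) + rng.normal(0, 1e-6, size=years.size)

# Least-squares slope of divergence vs. year = estimated substitution rate.
slope, intercept = np.polyfit(years, divergence, 1)
```

With clock-like evolution the points fall on a line, and the slope recovers the per-site, per-year rate; deviations from the line flag rate variation or recombination.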
Reduced fire severity offers near-term buffer to climate-driven declines in conifer resilience across the western United States
Increasing fire severity and warmer, drier postfire conditions are making forests in the western United States (West) vulnerable to ecological transformation. Yet, the relative importance of and interactions between these drivers of forest change remain unresolved, particularly over upcoming decades. Here, we assess how the interactive impacts of changing climate and wildfire activity influenced conifer regeneration after 334 wildfires, using a dataset of postfire conifer regeneration from 10,230 field plots. Our findings highlight declining regeneration capacity across the West over the past four decades for the eight dominant conifer species studied. Postfire regeneration is sensitive to high-severity fire, which limits seed availability, and postfire climate, which influences seedling establishment. In the near term, projected differences in recruitment probability between low- and high-severity fire scenarios were larger than projected climate change impacts for most species, suggesting that reductions in fire severity, and resultant impacts on seed availability, could partially offset expected climate-driven declines in postfire regeneration. Across 40 to 42% of the study area, we project postfire conifer regeneration to be likely following low-severity but not high-severity fire under future climate scenarios (2031 to 2050). However, increasingly warm, dry climate conditions are projected to eventually outweigh the influence of fire severity and seed availability. The percent of the study area considered unlikely to experience conifer regeneration, regardless of fire severity, increased from 5% (1981 to 2000) to 26 to 31% by mid-century, highlighting a limited time window over which management actions that reduce fire severity may effectively support postfire conifer regeneration.
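The interaction the abstract describes, where fire severity dominates near-term regeneration probability but aridity eventually overwhelms it, can be illustrated with a toy logistic model. The coefficients and the aridity scale below are invented for illustration and are not the study's fitted model.

```python
import math

def regen_probability(high_severity: bool, aridity: float) -> float:
    """Toy logistic model of postfire regeneration; coefficients are
    illustrative assumptions, not fitted values from the study."""
    logit = 1.0 - 2.0 * (1.0 if high_severity else 0.0) - 1.5 * aridity
    return 1.0 / (1.0 + math.exp(-logit))

# Near term (moderate aridity): severity drives a large probability gap.
p_low = regen_probability(False, aridity=0.3)
p_high = regen_probability(True, aridity=0.3)

# Warmer, drier future: high aridity suppresses regeneration even after
# low-severity fire, shrinking the benefit of severity reduction.
p_low_dry = regen_probability(False, aridity=2.0)
```

In this toy model, lowering severity raises regeneration probability substantially at moderate aridity, but under high aridity even the low-severity probability collapses, mirroring the closing window the abstract describes.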
Native diversity buffers against severity of non-native tree invasions
Determining the drivers of non-native plant invasions is critical for managing native ecosystems and limiting the spread of invasive species. Tree invasions in particular have been relatively overlooked, even though they have the potential to transform ecosystems and economies. Here, leveraging global tree databases, we explore how the phylogenetic and functional diversity of native tree communities, human pressure and the environment influence the establishment of non-native tree species and the subsequent invasion severity. We find that anthropogenic factors are key to predicting whether a location is invaded, but that invasion severity is underpinned by native diversity, with higher diversity predicting lower invasion severity. Temperature and precipitation emerge as strong predictors of invasion strategy, with non-native species invading successfully when they are similar to the native community in cold or dry extremes. Yet, despite the influence of these ecological forces in determining invasion strategy, we find evidence that these patterns can be obscured by human activity, with lower ecological signal in areas with higher proximity to shipping ports. Our global perspective of non-native tree invasion highlights that human drivers influence non-native tree presence, and that native phylogenetic and functional diversity have a critical role in the establishment and spread of subsequent invasions