
    Biologically-informed interpretable deep learning techniques for BMI prediction and gene interaction detection

    The analysis of genetic point mutations at the population level can offer insights into the genetic basis of human traits, which in turn could potentially lead to new diagnostic and treatment options for heritable diseases. However, existing genetic data analysis methods tend to rely on simplifying assumptions that ignore nonlinear interactions between variants. The ability to model and describe nonlinear genetic interactions could lead to both improved trait prediction and enhanced understanding of the underlying biology. Deep learning (DL) models offer the possibility of automatically learning complex nonlinear genetic architectures, but it is currently unclear how best to optimise them for genetic data. It is also essential that any models be able to “explain” what they have learned in order for them to be used for genetic discovery or clinical applications, which can be difficult due to the black-box nature of DL predictors. This thesis addresses a number of methodological gaps in applying explainable DL models end-to-end on variant-level genetic data. We propose novel methods for encoding genetic data for deep learning applications and show that feature encodings designed specifically for genetic variants offer the possibility of improved model efficiency and performance. We then benchmark a variety of models for the prediction of Body Mass Index (BMI) using data from the UK Biobank, yielding insights into DL performance in this domain. We then propose a series of novel DL model interpretation methods with features optimised for biological insights. We first show how these can be used to validate that the network has automatically replicated existing knowledge, and then illustrate their ability to detect complex nonlinear genetic interactions that influence BMI in our cohort. Overall, we show that DL model training and interpretation procedures that have been optimised for genetic data can be used to yield new insights into disease aetiology.
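
    The encodings themselves are not detailed in the abstract, so the following is a minimal sketch and not the thesis's actual method: two common baselines for turning biallelic SNP genotypes (0, 1, or 2 copies of the alternate allele) into model inputs, an additive allele-count feature and a one-hot expansion. All names here are illustrative.

        # Minimal sketch (not the thesis's method): two common ways to encode
        # biallelic SNP genotypes (0, 1, or 2 alternate alleles) as inputs
        # for a neural network.
        import numpy as np

        def additive_encoding(genotypes: np.ndarray) -> np.ndarray:
            """Scale allele counts {0, 1, 2} to [0, 1]; one feature per variant."""
            return genotypes.astype(np.float32) / 2.0

        def one_hot_encoding(genotypes: np.ndarray) -> np.ndarray:
            """One-hot encode each genotype; three features per variant."""
            n_samples, n_variants = genotypes.shape
            out = np.zeros((n_samples, n_variants, 3), dtype=np.float32)
            rows, cols = np.indices(genotypes.shape)
            out[rows, cols, genotypes] = 1.0  # set out[i, j, genotype[i, j]] = 1
            return out.reshape(n_samples, n_variants * 3)

        genotypes = np.array([[0, 1, 2], [2, 0, 1]])  # 2 individuals x 3 variants
        X_add = additive_encoding(genotypes)           # shape (2, 3)
        X_hot = one_hot_encoding(genotypes)            # shape (2, 9)

    The additive encoding assumes each extra allele copy has a roughly linear effect, while the one-hot form lets the network learn arbitrary per-genotype effects at the cost of three times as many features.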

    The contingent valuation method as a neoliberal tool for environmental planning

    This thesis offers a political critique of the contingent valuation method, which is used to assign monetary values to non-market environmental goods for environmental planning purposes. The method relies on a survey asking a sample of individuals how much they would be willing to pay to protect an environmental good of interest. The total value is then incorporated into a cost-benefit analysis, as a cost in the case of a development project or as a benefit in the case of a conservation project. The method's reliability and theoretical foundations have been widely criticised. In several studies, respondents proved largely insensitive to the prices and quantities proposed, leading some researchers to conclude that the survey results cannot be used in cost-benefit analyses and should instead be interpreted as those of a pseudo-referendum. Faced with this finding, rather than proposing to abandon the economic framework in favour of openly political consultations, some suggest using the method even more widely in order to accustom respondents to it. Such persistence in using the method led us to regard it as a tool that, far from being neutral, belongs to a much broader political project: neoliberalism. Since neoliberal policies are based entirely on utilitarian rationality, neoliberalism must produce suitably adapted subjects, that is, individuals who make choices solely on the basis of a calculation of self-interest, like firms. This is why neoliberalism can be understood, following Michel Foucault, as a mode of subjectivation. Drawing on Foucault's reflections in Naissance de la biopolitique (The Birth of Biopolitics), as well as on the writings of several of his intellectual heirs, we develop the argument that the proliferation of monetary valuation surveys contributes to the shaping of neoliberal subjects. Forced to think about environmental questions within an economic framework, these subjects can no longer conceive of any justice other than what they have declared themselves willing to pay and the environmental goods and services they receive in return. The use of the method therefore has the effect of depoliticising the relationship between the state and its citizens. To move beyond this a-democratic neoliberal rationality, other environmental planning tools are available, such as multi-criteria evaluation, which makes it possible to dispense with the monetary dimension of these assessments.
    AUTHOR KEYWORDS: contingent valuation method, non-market environmental goods, environmental planning, neoliberalism, mode of subjectivation, Michel Foucault

    Development of the correction procedure for High Volume Instrument elongation measurement

    Cotton spinning mills need high-quality fibers to maintain their manufacturing efficiency. Machinery throughput is increasing, which can translate into processing steps that impose higher breaking stresses on the fibers. Consequently, more fibers are susceptible to breakage or damage. To face this problem, breeders must develop new varieties whose fibers can better withstand this mechanical stress. The main tool utilized in cotton breeding programs is the High Volume Instrument (HVI), which rapidly reports measurements such as micronaire, length, color, and strength. This instrument can also determine fiber elongation, but no correction procedure currently exists for this measurement. Both elongation and strength factor into the work-to-break of fibers, which plays a direct role in fiber breakage and spinning performance. The objective of this work was to develop cotton elongation standards, devise a correction procedure for HVI lines, evaluate measurement stability, and validate these results with a set of independent samples. Two commercial bales, one with low and one with high HVI elongation, were identified as potential elongation standards. The potential standards were produced and evaluated. After validation, they were used to correct HVI lines against Stelometer (STrength-ELOngation-METER) measurements. An independent set of samples was tested on corrected HVIs to confirm the effectiveness of the corrected elongation measurements. The HVI data were at least as good as the Stelometer data, with increased data acquisition speed and precision. This research can help cotton breeders to improve fiber elongation and strength at the same time, resulting in better fibers for yarn spinning.
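
    The exact correction formula is not given in the abstract. As an illustration only of the general idea of correcting an instrument against reference values using one low and one high standard, a two-point linear calibration might look like the sketch below; all numbers are hypothetical, not the study's standards.

        # Illustrative sketch only: a two-point linear correction, in the spirit
        # of calibrating HVI elongation readings against reference (Stelometer)
        # values using a low- and a high-elongation standard. The procedure
        # actually developed in this work may differ.

        def two_point_correction(low_hvi, high_hvi, low_ref, high_ref):
            """Return a function mapping raw HVI elongation to corrected values."""
            slope = (high_ref - low_ref) / (high_hvi - low_hvi)
            offset = low_ref - slope * low_hvi
            return lambda raw: slope * raw + offset

        # Hypothetical standard values (percent elongation):
        correct = two_point_correction(low_hvi=5.1, high_hvi=8.4,
                                       low_ref=4.8, high_ref=8.9)
        print(correct(6.7))  # corrected elongation for a raw HVI reading of 6.7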

    Mining rare Earth elements: Identifying the plant species most threatened by ore extraction in an insular hotspot

    Conservation efforts in global biodiversity hotspots often face a common predicament: an urgent need for conservation action hampered by a significant lack of knowledge about that biodiversity. In recent decades, the computerisation of primary biodiversity data worldwide has provided the scientific community with raw material to increase our understanding of our shared natural heritage. These datasets, however, suffer from numerous geographical and taxonomic inaccuracies. Automated tools developed to enhance their reliability have shown that detailed expert examination remains the best way to achieve robust and exhaustive datasets. In New Caledonia, one of the most important biodiversity hotspots worldwide, the plant diversity inventory is still underway, and most taxa awaiting formal description are narrow endemics, hence by definition hard to discern in the datasets. In the meantime, anthropogenic pressures, such as nickel-ore mining, are threatening the unique ultramafic ecosystems at an increasing rate. The conservation challenge is therefore a race against time, as the rarest species must be identified and protected before they vanish. In this study, based on all available datasets and resources, we applied a workflow capable of highlighting the lesser known taxa. The main challenges addressed were to aggregate all data available worldwide and to tackle the geographical and taxonomic biases while avoiding the data loss that results from automated filtering. Every doubtful specimen went through a careful taxonomic analysis by a panel of local and international taxonomists. Geolocation of the whole dataset was achieved through dataset cross-checking, local botanists’ field knowledge, and examination of historical material. Field studies were also conducted to clarify the most unresolved taxa. With the help of this method and by analysing over 85,000 records, we were able to double the number of known narrow endemic taxa, elucidate 68 putative new species, and update our knowledge of the rarest species’ distributions so as to promote conservation measures.
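
    As a toy illustration of the workflow's key design choice, routing doubtful records to expert review rather than discarding them through automated filtering, flagging might look like the sketch below; the column names and values are hypothetical and not the study's pipeline.

        # Toy sketch (hypothetical columns, not the study's pipeline): instead of
        # silently dropping records that fail automated checks, flag them for
        # expert taxonomic/geographic review so no occurrence data are lost.
        import pandas as pd

        records = pd.DataFrame({
            "taxon":     ["Pycnandra sp.", "Costularia sp.", None],
            "latitude":  [-21.3, None, -22.1],
            "longitude": [165.4, 166.0, 166.7],
        })

        missing_taxon = records["taxon"].isna()
        missing_coords = records["latitude"].isna() | records["longitude"].isna()
        records["needs_expert_review"] = missing_taxon | missing_coords

        print(records[records["needs_expert_review"]])  # queue for the expert panel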

    DNA barcoding identifies cryptic animal tool materials

    Funding: Biotechnology and Biological Sciences Research Council (BBSRC) (Grants BB/G023913/1 and BB/G023913/2 to C.R., and studentship to B.C.K.), the School of Biology at the University of St Andrews (studentships to M.P.S. and B.C.K.), and the Leverhulme Trust (Grant RPG-2015-273 to P.M.H.).
    Some animals fashion tools or constructions out of plant materials to aid foraging, reproduction, self-maintenance, or protection. Their choice of raw materials can affect the structure and properties of the resulting artifacts, with considerable fitness consequences. Documenting animals’ material preferences is challenging, however, as manufacture behavior is often difficult to observe directly, and materials may be processed so heavily that they lack identifying features. Here, we use DNA barcoding to identify, from just a few recovered tool specimens, the plant species New Caledonian crows (Corvus moneduloides) use for crafting elaborate hooked stick tools in one of our long-term study populations. The method succeeded where extensive fieldwork using an array of conventional approaches—including targeted observations, camera traps, radio-tracking, bird-mounted video cameras, and behavioral experiments with wild and temporarily captive subjects—had failed. We believe that DNA barcoding will prove useful for investigating many other tool and construction behaviors, helping to unlock significant research potential across a wide range of study systems.
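
    For readers unfamiliar with the technique, a minimal sketch of the general DNA-barcoding step, matching a sequenced marker against a public reference database, could look like the following in Biopython. This is illustrative only, not the authors' pipeline, and the sequence shown is a dummy placeholder.

        # Minimal sketch of the general DNA-barcoding idea (not the authors'
        # pipeline): query a plant barcode fragment (e.g., from rbcL) against
        # NCBI's nucleotide database and report the top hits.
        from Bio.Blast import NCBIWWW, NCBIXML

        barcode_seq = "ATGTCACCACAAACAGAGACTAAAGCAAGT"  # dummy fragment; use the real read

        handle = NCBIWWW.qblast("blastn", "nt", barcode_seq)  # remote BLAST search
        record = NCBIXML.read(handle)

        for alignment in record.alignments[:5]:
            best_hsp = alignment.hsps[0]
            identity = 100.0 * best_hsp.identities / best_hsp.align_length
            print(f"{alignment.title[:60]}  identity={identity:.1f}%")

    In practice, a hit is usually only trusted to species level when identity to a vouchered reference is high and the next-best species is clearly worse.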

    Genomic signature to guide adjuvant chemotherapy treatment decisions for early breast cancer patients in France: a cost-effectiveness analysis

    Introduction: Chemotherapy (CT) is commonly used as an adjuvant treatment for women with early breast cancer (BC). However, not all patients benefit from CT, while all are exposed to its short- and long-term toxicity. The Oncotype DX® test assesses cancer-related gene expression to estimate the risk of BC recurrence and predict the benefit of chemotherapy. The aim of this study was to estimate, from the French National Health Insurance (NHI) perspective, the cost-effectiveness of the Oncotype DX® test compared to standard of care (SoC; involving clinicopathological risk assessment only) among women with early, hormone receptor-positive, human epidermal growth factor receptor 2-negative BC considered at high clinicopathological risk of recurrence.
    Methods: Clinical outcomes and costs were estimated over a lifetime horizon based on a two-component model that comprised a short-term decision tree representing the adjuvant treatment choice guided by the therapeutic decision support strategy (Oncotype DX® test or SoC) and a Markov model to capture long-term outcomes.
    Results: In the base case, the Oncotype DX® test reduced CT use by 55.2% and resulted in 0.337 incremental quality-adjusted life-years gained and cost savings of €3,412 per patient, compared with SoC. Being more effective and less costly than SoC, Oncotype DX® testing was the dominant strategy.
    Discussion: Widespread implementation of Oncotype DX® testing would improve patient care, provide equitable access to more personalized medicine, and bring cost savings to the health system.
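
    As a hedged sketch of how the long-term component of such a model works, the toy Markov cohort model below accumulates discounted costs and quality-adjusted life-years (QALYs) over a lifetime horizon. Every transition probability, cost, and utility in it is hypothetical and not taken from the paper.

        # Toy Markov cohort model in the spirit of the study's long-term component
        # (all transition probabilities, costs, and utilities are hypothetical,
        # NOT the paper's inputs). States: recurrence-free, recurrence, dead.
        import numpy as np

        P = np.array([            # annual transition matrix (rows sum to 1)
            [0.96, 0.03, 0.01],   # recurrence-free -> (RF, recurrence, dead)
            [0.00, 0.85, 0.15],   # recurrence      -> (RF, recurrence, dead)
            [0.00, 0.00, 1.00],   # dead is absorbing
        ])
        utilities = np.array([0.85, 0.60, 0.0])     # QALY weight per state-year
        costs = np.array([1_000.0, 15_000.0, 0.0])  # annual cost per state (EUR)

        cohort = np.array([1.0, 0.0, 0.0])  # everyone starts recurrence-free
        discount = 0.04                     # annual discount rate
        total_qalys = total_costs = 0.0

        for year in range(40):              # roughly a lifetime horizon
            w = 1.0 / (1.0 + discount) ** year
            total_qalys += w * cohort @ utilities
            total_costs += w * cohort @ costs
            cohort = cohort @ P             # advance the cohort one cycle

        print(f"Discounted QALYs: {total_qalys:.2f}, costs: EUR {total_costs:,.0f}")

    Running such a model once per strategy (test-guided versus SoC-guided treatment mix) and differencing the totals is what yields incremental QALYs and costs of the kind reported above.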

    The efficacy of therapeutic plasma exchange in COVID-19 patients on endothelial tightness in vitro is hindered by platelet activation

    Coronavirus disease 2019 (COVID-19) is characterised in particular by vascular inflammation with platelet activation and endothelial dysfunction. During the pandemic, therapeutic plasma exchange (TPE) was used to reduce the cytokine storm in the circulation and to delay or prevent ICU admissions. This procedure consists of replacing the inflammatory plasma with fresh frozen plasma from healthy donors and is often used to remove pathogenic molecules from plasma (autoantibodies, immune complexes, toxins, etc.). This study uses an in vitro model of platelet-endothelial cell interactions to assess the changes in these interactions induced by plasma from COVID-19 patients and to determine the extent to which TPE reduces such changes. We noted that exposing an endothelial monolayer to plasma from COVID-19 patients collected after TPE induced less endothelial permeability than control COVID-19 plasma. Yet, when endothelial cells were co-cultured with healthy platelets and exposed to the plasma, the beneficial effect of TPE on endothelial permeability was somewhat reduced. This was linked to phenotypical activation of platelets and endothelial cells, but not to the secretion of inflammatory molecules. Our work shows that, in parallel to the beneficial removal of inflammatory factors from the circulation, TPE triggers cellular activation, which may partly explain the reduction in its efficacy against endothelial dysfunction. These findings provide new insights for improving the efficacy of TPE using supporting treatments targeting platelet activation, for instance.