
    Bayesian optimization for materials design

    We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good designs in as few experiments as possible. We focus on the case in which material designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian process regression, which predicts the performance of a new design based on previously tested designs. After providing a detailed introduction to Gaussian process regression, we introduce two Bayesian optimization methods: expected improvement, for design problems with noise-free evaluations, and the knowledge-gradient method, which generalizes expected improvement and may be used in design problems with noisy evaluations. Both methods are derived using a value-of-information analysis and enjoy one-step Bayes-optimality.
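The two ingredients this abstract describes, Gaussian process regression and expected improvement, can be sketched in a few lines. Everything below (the RBF kernel, the toy 1-D "performance" function, the candidate grid) is an illustrative assumption, not the paper's actual setup:

```python
import numpy as np
from math import erf

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential kernel between 1-D input arrays
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(x_train, y_train, x_test, jitter=1e-8):
    # Gaussian process regression: posterior mean and variance at x_test
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mu = Ks.T @ alpha
    var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))
    return mu, np.maximum(var, 1e-12)

def norm_pdf(z):
    return np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)

def norm_cdf(z):
    return 0.5 * (1 + np.vectorize(erf)(z / np.sqrt(2)))

def expected_improvement(mu, var, best):
    # EI for maximisation with noise-free observations
    sigma = np.sqrt(var)
    z = (mu - best) / sigma
    return (mu - best) * norm_cdf(z) + sigma * norm_pdf(z)

# Toy design loop step: pick the next 1-D design to evaluate
x_train = np.array([0.1, 0.5, 0.9])
y_train = np.sin(3 * x_train)          # stand-in for "material performance"
x_cand = np.linspace(0.0, 1.0, 101)
mu, var = gp_posterior(x_train, y_train, x_cand)
ei = expected_improvement(mu, var, y_train.max())
x_next = x_cand[np.argmax(ei)]         # design with highest expected improvement
```

The knowledge-gradient method generalizes this acquisition rule by valuing how each (possibly noisy) evaluation improves the posterior itself rather than only the chance of immediately beating the incumbent.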

    A 19-SNP coronary heart disease gene score profile in subjects with type 2 diabetes: the coronary heart disease risk in type 2 diabetes (CoRDia study) study baseline characteristics

    Background: The coronary risk in diabetes (CoRDia) trial (n = 211) compares the effectiveness of usual diabetes care with a self-management intervention (SMI), with and without personalised risk information (including genetics), on clinical and behavioural outcomes. Here we present an assessment of randomisation, the cardiac risk genotyping assay, and the genetic characteristics of the recruits.
    Methods: Ten-year coronary heart disease (CHD) risk was calculated using the UKPDS score. Genetic CHD risk was determined by genotyping 19 single nucleotide polymorphisms (SNPs) using Randox’s Cardiac Risk Prediction Array and calculating a gene score (GS). Accuracy of the array was assessed by genotyping a subset of pre-genotyped samples (n = 185).
    Results: Overall, 10-year CHD risk ranged from 2–72 % but did not differ between the randomisation groups (p = 0.13). The array results were 99.8 % concordant with the pre-determined genotypes. The GS did not differ (p = 0.80) between the Caucasian participants in the CoRDia SMI plus risk group (n = 66) and a sample of UK healthy men (n = 1360). The GS was also associated with LDL-cholesterol (p = 0.05) and family history (p = 0.03) in the sample of UK healthy men.
    Conclusions: CHD risk is high in this group of T2D subjects. The risk array is an accurate genotyping assay and is suitable for estimating an individual’s genetic CHD risk.
    Trial registration: This study has been registered at ClinicalTrials.gov; registration identifier NCT0189178
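An additive gene score of the kind described here counts risk alleles per SNP, weighted by a per-SNP effect size. The sketch below illustrates the pattern only; the SNP ids and odds ratios are placeholders, not the actual values used on the Randox array:

```python
import math

# Hypothetical per-SNP weights: log odds ratio for each risk allele.
# These ids and ORs are illustrative stand-ins, not the CoRDia panel.
snp_weights = {
    "rs1333049":  math.log(1.29),
    "rs10757278": math.log(1.25),
    "rs4977574":  math.log(1.27),
}

def gene_score(genotypes):
    """genotypes maps SNP id -> risk-allele count in {0, 1, 2}."""
    return sum(snp_weights[snp] * count for snp, count in genotypes.items())

# One participant carrying 2, 1 and 0 copies of the three risk alleles
score = gene_score({"rs1333049": 2, "rs10757278": 1, "rs4977574": 0})
```

An unweighted score is the special case where every weight is 1; weighting by log odds ratios lets SNPs with stronger reported associations contribute more.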

    Moving towards a cure in genetics: what is needed to bring somatic gene therapy to the clinic?

    Clinical trials using somatic gene editing (e.g., CRISPR-Cas9) have started in Europe and the United States and may provide safe and effective treatment and cure, not only for cancers but also for some monogenic conditions. In a workshop at the 2018 European Human Genetics Conference, the challenges of bringing somatic gene editing therapies to the clinic were discussed. The regulatory process needs to be considered early in the clinical development pathway to produce the data necessary to support approval by the European Medicines Agency. The roles and responsibilities for geneticists may include counselling to explain the treatment possibilities and safety interpretation.

    'Reaching the hard to reach' - lessons learned from the VCS (Voluntary and Community Sector). A qualitative study.

    Background: The notion 'hard to reach' is a contested and ambiguous term commonly used within the spheres of social care and health, especially in discourse around health and social inequalities. There is a need to address health inequalities and to engage the marginalised and socially excluded sectors of society in services.
    Methods: This paper describes a pilot study involving interviews with representatives from eight Voluntary and Community Sector (VCS) organisations. The purpose of the study was to explore the notion of 'hard to reach' and perceptions of the barriers and facilitators to accessing services for 'hard to reach' groups from a voluntary and community sector perspective.
    Results: The 'hard to reach' may include drug users, people living with HIV, people from sexual minority communities, asylum seekers, refugees, people from black and ethnic minority communities, and homeless people, although defining the notion of the 'hard to reach' is not straightforward. It may be that certain groups resist engaging with treatment services and are deemed hard to reach by a particular service or from a societal stance. There are a number of potential barriers for people who may try to access services, including bad past experiences, the location and opening times of services, and how services are funded and managed. A number of areas of commonality were found in terms of how access to services for 'hard to reach' individuals and groups could be improved, including respectful treatment of service users, establishing trust with service users, offering service flexibility, partnership working with other organisations, and harnessing service user involvement.
    Conclusions: If health services are to engage with groups that are deemed 'hard to reach' and marginalised from mainstream health services, the experiences and practices for engagement from within the VCS may serve as useful lessons for service improvement for statutory health services.

    NAVI: Category-Agnostic Image Collections with High-Quality 3D Shape and Pose Annotations

    Recent advances in neural reconstruction enable high-quality 3D object reconstruction from casually captured image collections. Current techniques mostly analyze their progress on relatively simple image collections where Structure-from-Motion (SfM) techniques can provide ground-truth (GT) camera poses. We note that SfM techniques tend to fail on in-the-wild image collections such as image search results with varying backgrounds and illuminations. To enable systematic research progress on 3D reconstruction from casual image captures, we propose NAVI: a new dataset of category-agnostic image collections of objects with high-quality 3D scans along with per-image 2D-3D alignments providing near-perfect GT camera parameters. These 2D-3D alignments allow us to extract accurate derivative annotations such as dense pixel correspondences, depth and segmentation maps. We demonstrate the use of NAVI image collections on different problem settings and show that NAVI enables more thorough evaluations that were not possible with existing datasets. We believe NAVI is beneficial for systematic research progress on 3D reconstruction and correspondence estimation. NeurIPS 2023 camera ready. Project page: https://navidataset.github.io

    Pathological and ecological host consequences of infection by an introduced fish parasite

    The infection consequences of the introduced cestode fish parasite Bothriocephalus acheilognathi were studied in a cohort of wild, young-of-the-year common carp Cyprinus carpio that lacked co-evolution with the parasite. Within the cohort, parasite prevalence was 42% and parasite burdens were up to 12% body weight. Pathological changes within the intestinal tract of parasitized carp included distension of the gut wall, epithelial compression and degeneration, pressure necrosis and varied inflammatory changes. These were most pronounced in regions containing the largest proportion of mature proglottids. Although the body lengths of parasitized and non-parasitized fish were not significantly different, parasitized fish were of lower body condition and reduced weight compared to non-parasitized conspecifics. Stable isotope analysis (δ15N and δ13C) revealed trophic impacts associated with infection, particularly for δ15N where values for parasitized fish were significantly reduced as their parasite burden increased. In a controlled aquarium environment where the fish were fed ad libitum on an identical food source, there was no significant difference in values of δ15N and δ13C between parasitized and non-parasitized fish. The growth consequences remained, however, with parasitized fish growing significantly slower than non-parasitized fish, with their feeding rate (items s−1) also significantly lower. Thus, infection by an introduced parasite had multiple pathological, ecological and trophic impacts on a host with no experience of the parasite

    Ezrin interacts with the SARS coronavirus spike protein and restrains infection at the entry stage

    © 2012 Millet et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
    Background: Entry of Severe Acute Respiratory Syndrome coronavirus (SARS-CoV) and its envelope fusion with the host cell membrane are controlled by a series of complex molecular mechanisms, largely dependent on the viral envelope glycoprotein Spike (S). There are still many unknowns about the implication of cellular factors that regulate the entry process.
    Methodology/Principal Findings: We performed a yeast two-hybrid screen using as bait the carboxy-terminal endodomain of S, which faces the cytosol during and after opening of the fusion pore at early stages of the virus life cycle. Here we show that the ezrin membrane-actin linker interacts with the S endodomain through the F1 lobe of its FERM domain and that both the eight carboxy-terminal amino acids and a membrane-proximal cysteine cluster of the S endodomain are important for this interaction in vitro. Interestingly, we found that ezrin is present at the site of entry of S-pseudotyped lentiviral particles in Vero E6 cells. Targeting ezrin function by small interfering RNA increased S-mediated entry of pseudotyped particles into epithelial cells. Furthermore, deletion of the eight carboxy-terminal amino acids of S enhanced infection by S-pseudotyped particles. Expression of the dominant-negative ezrin FERM domain enhanced cell susceptibility to infection by SARS-CoV and S-pseudotyped particles and potentiated S-dependent membrane fusion.
    Conclusions/Significance: Ezrin interacts with the SARS-CoV S endodomain and limits virus entry and fusion. Our data present a novel mechanism involving a cellular factor in the regulation of S-dependent early events of infection.
    This work was supported by the Research Grants Council of Hong Kong (RGC #760208) and the RESPARI project of the International Network of Pasteur Institutes.

    Chess databases as a research vehicle in psychology: modeling large data

    Get PDF
    The game of chess has often been used for psychological investigations, particularly in cognitive science. The clear-cut rules and well-defined environment of chess provide a model for investigations of basic cognitive processes, such as perception, memory, and problem solving, while the precise rating system for the measurement of skill has enabled investigations of individual differences and expertise-related effects. In the present study, we focus on another appealing feature of chess: the large archive databases associated with the game. The German national chess database presented in this study represents fruitful ground for the investigation of multiple longitudinal research questions, since it collects the data of over 130,000 players and spans more than 25 years. The database covers all players, including hobby players, and all tournaments played. This results in a rich and complete record of the skill, age, and activity of the whole population of chess players in Germany. The database therefore complements the commonly used expertise approach in cognitive science by opening up new possibilities for investigating the multiple factors that underlie expertise and skill acquisition. Since large datasets are not common in psychology, their introduction also raises the question of optimal and efficient statistical analysis. We offer the database for download and illustrate how it can be used by providing concrete examples and a step-by-step tutorial using different statistical analyses on a range of topics, including skill development over the lifetime, birth cohort effects, effects of activity and inactivity on skill, and gender differences.
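As a minimal example of the kind of analysis such a database supports, the sketch below builds a cross-sectional skill-development curve (mean rating by age) from synthetic data. The columns and the underlying rating model are invented for illustration, not the database's actual schema:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for database rows: one (age, rating) pair per player.
# Assumed model: skill rises to a peak around age 35, then declines slowly.
n = 5000
age = rng.integers(10, 70, size=n)
true_skill = 1400 + 12 * np.minimum(age, 35) - 4 * np.maximum(age - 35, 0)
rating = true_skill + rng.normal(0, 120, size=n)   # noisy observed ratings

# Mean rating per age: a simple cross-sectional skill-development curve
ages = np.arange(10, 70)
curve = np.array([rating[age == a].mean() for a in ages])
peak_age = ages[np.argmax(curve)]
```

With the real database, the same grouping logic applied per birth cohort (rather than cross-sectionally) separates age effects from cohort effects, one of the topics the tutorial covers.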

    The Impact of Global Warming and Anoxia on Marine Benthic Community Dynamics: an Example from the Toarcian (Early Jurassic)

    The Pliensbachian-Toarcian (Early Jurassic) fossil record is an archive of natural data of benthic community response to global warming and marine long-term hypoxia and anoxia. In the early Toarcian mean temperatures increased by the same order of magnitude as that predicted for the near future; laminated, organic-rich, black shales were deposited in many shallow water epicontinental basins; and a biotic crisis occurred in the marine realm, with the extinction of approximately 5% of families and 26% of genera. High-resolution quantitative abundance data of benthic invertebrates were collected from the Cleveland Basin (North Yorkshire, UK), and analysed with multivariate statistical methods to detect how the fauna responded to environmental changes during the early Toarcian. Twelve biofacies were identified. Their changes through time closely resemble the pattern of faunal degradation and recovery observed in modern habitats affected by anoxia. All four successional stages of community structure recorded in modern studies are recognised in the fossil data (i.e. Stage III: climax; II: transitional; I: pioneer; 0: highly disturbed). Two main faunal turnover events occurred: (i) at the onset of anoxia, with the extinction of most benthic species and the survival of a few adapted to thrive in low-oxygen conditions (Stages I to 0) and (ii) in the recovery, when newly evolved species colonized the re-oxygenated soft sediments and the path of recovery did not retrace the pattern of ecological degradation (Stages I to II). The ordination of samples coupled with sedimentological and palaeotemperature proxy data indicates that the onset of anoxia and the extinction horizon coincide with both a rise in temperature and sea level. Our study of how faunal associations co-vary with long and short term sea level and temperature changes has implications for predicting the long-term effects of “dead zones” in modern oceans.
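The ordination step mentioned in this abstract (projecting community abundance samples onto a few axes so that change through time can be read off a low-dimensional plot) can be sketched generically with a principal-component ordination. The abundance matrix below is random stand-in data, not the Cleveland Basin counts, and the study's own ordination method may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in community matrix: 30 stratigraphic samples x 12 taxa counts
abundance = rng.poisson(5, size=(30, 12)).astype(float)

# Centre the data, then eigendecompose the covariance matrix
X = abundance - abundance.mean(axis=0)
cov = X.T @ X / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
order = np.argsort(eigvals)[::-1]               # largest axes first

# 2-D ordination coordinates: samples close together have similar faunas
scores = X @ eigvecs[:, order[:2]]
```

Plotting `scores` in stratigraphic order traces the community trajectory through degradation and recovery stages.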

    Information and Discriminability as Measures of Reliability of Sensory Coding

    Response variability is a fundamental issue in neural coding because it limits all information processing. The reliability of neuronal coding is quantified by various approaches in different studies. In most cases it is largely unclear to what extent the conclusions depend on the applied reliability measure, making a comparison across studies almost impossible. We demonstrate that different reliability measures can lead to very different conclusions even if applied to the same set of data: in particular, we applied information theoretical measures (Shannon information capacity and Kullback-Leibler divergence) as well as a discrimination measure derived from signal-detection theory to the responses of blowfly photoreceptors, which represent a well established model system for sensory information processing. We stimulated the photoreceptors with white noise modulated light intensity fluctuations of different contrasts. Surprisingly, the signal-detection approach leads to a safe discrimination of the photoreceptor response even when the response signal-to-noise ratio (SNR) is well below unity, whereas Shannon information capacity and also Kullback-Leibler divergence indicate a very low performance. Applying different measures can therefore lead to very different interpretations concerning the system's coding performance. As a consequence of the lower sensitivity compared to the signal-detection approach, the information theoretical measures overestimate internal noise sources and underestimate the importance of photon shot noise. We stress that none of the used measures, and most likely no other measure alone, allows for an unbiased estimation of a neuron's coding properties. Therefore the applied measure needs to be selected with respect to the scientific question and the analyzed neuron's functional context.
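The contrast this abstract draws, between information-theoretic measures and discriminability at low SNR, can be sketched on a toy additive Gaussian channel. The distributions and numbers below are illustrative assumptions, not the photoreceptor recordings:

```python
import numpy as np

def snr(signal, noise):
    # Signal-to-noise ratio as a variance ratio
    return np.var(signal) / np.var(noise)

def gaussian_capacity_bits(snr_value):
    # Shannon capacity per sample of an additive Gaussian channel
    return 0.5 * np.log2(1.0 + snr_value)

def kl_gaussians(mu0, s0, mu1, s1):
    # KL divergence D( N(mu0, s0^2) || N(mu1, s1^2) ) in nats
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.5, 10_000)   # low-contrast stimulus
noise = rng.normal(0.0, 1.0, 10_000)    # response noise: SNR below unity

ratio = snr(signal, noise)              # roughly 0.25, well below 1
bits = gaussian_capacity_bits(ratio)    # capacity per sample looks very low
kl = kl_gaussians(0.0, 1.0, 0.5, 1.0)   # per-sample evidence between two
                                        # responses shifted by the signal s.d.
```

Even though the per-sample capacity and KL divergence are small here, a signal-detection decision that pools evidence over many samples can still discriminate the two response distributions reliably, which is the divergence between measures the study highlights.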