
    Systematic comparison of ranking aggregation methods for gene lists in experimental results

    MOTIVATION: A common experimental output in biomedical science is a list of genes implicated in a given biological process or disease. The gene lists produced by a group of studies addressing the same, or similar, questions can be combined by ranking aggregation methods to find a consensus or a more reliable answer. Evaluating a ranking aggregation method on a specific type of data before using it is necessary to ensure reliability, since the properties of a dataset can influence the performance of an algorithm. Such evaluation on gene lists is usually based on simulated data because of the lack of a known truth for real data. However, simulated datasets tend to be much smaller than experimental data and neglect key features, including heterogeneity of quality, relevance and the inclusion of unranked lists. RESULTS: In this study, a group of existing methods and their variations that are suitable for meta-analysis of gene lists are compared using simulated and real data. Simulated data were used to explore the performance of the aggregation methods under conditions emulating common scenarios in real genomic data, varying the heterogeneity of quality, the noise level and the mix of unranked and ranked lists, with 20 000 possible entities. In addition to the evaluation with simulated data, a comparison using real genomic data on the SARS-CoV-2 virus, cancer (non-small cell lung cancer) and bacteria (macrophage apoptosis) was performed. We summarize the results of our evaluation in a simple flowchart for selecting a ranking aggregation method, and in an automated implementation that uses the meta-analysis by information content (MAIC) algorithm to infer heterogeneity of data quality across input datasets. AVAILABILITY AND IMPLEMENTATION: Code for simulated data generation and for running the edited versions of the algorithms: https://github.com/baillielab/comparison_of_RA_methods. Code to perform an optimal selection of methods based on the results of this review, using the MAIC algorithm to infer the characteristics of an input dataset, can be downloaded here: https://github.com/baillielab/maic. An online service for running MAIC: https://baillielab.net/maic. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
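
    For readers unfamiliar with ranking aggregation, the sketch below illustrates the general idea with a simple Borda-style scheme, one of the classical baselines in this family; it is not the method the paper recommends, and the gene lists used in the example are hypothetical.

```python
# Minimal Borda-style rank aggregation over gene lists (illustrative only).
# Ranked lists contribute graded scores; unranked lists contribute a flat score.

def aggregate_gene_lists(ranked_lists, unranked_lists=(), flat_score=0.5):
    """Return genes sorted by a simple consensus score (higher = stronger support)."""
    scores = {}
    for genes in ranked_lists:
        n = len(genes)
        for position, gene in enumerate(genes):
            # Top-ranked gene contributes 1.0, the last contributes 1/n.
            scores[gene] = scores.get(gene, 0.0) + (n - position) / n
    for genes in unranked_lists:
        for gene in genes:
            scores[gene] = scores.get(gene, 0.0) + flat_score
    return sorted(scores, key=scores.get, reverse=True)

if __name__ == "__main__":
    # Hypothetical input lists, e.g. gene hits from three independent studies.
    ranked = [["IFITM3", "ACE2", "TMPRSS2"], ["ACE2", "IFITM3", "CCR5"]]
    unranked = [["ACE2", "TYK2"]]
    print(aggregate_gene_lists(ranked, unranked))
```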

    Climate change impacts on US agriculture and forestry: benefits of global climate stabilization

    Increasing atmospheric carbon dioxide levels, higher temperatures, altered precipitation patterns, and other climate change impacts have already begun to affect US agriculture and forestry, with impacts expected to become more substantial in the future. There have been numerous studies of climate change impacts on agriculture or forestry, but relatively little research examining the long-term net impacts of a stabilization scenario relative to a case with unabated climate change. We provide an analysis of the potential benefits of global climate change mitigation for US agriculture and forestry through 2100, accounting for landowner decisions regarding land use, crop mix, and management practices. The analytic approach involves a combination of climate models, a crop process model (EPIC), a dynamic vegetation model used for forests (MC1), and an economic model of the US forestry and agricultural sector (FASOM-GHG). We find substantial impacts on productivity, commodity markets, and consumer and producer welfare for the stabilization scenario relative to unabated climate change, though the magnitude and direction of impacts vary across regions and commodities. Although there is variability in welfare impacts across climate simulations, we find positive net benefits from stabilization in all cases, with cumulative impacts ranging from $32.7 billion to $54.5 billion over the period 2015–2100. Our estimates contribute to the literature on potential benefits of GHG mitigation and can help inform policy decisions weighing alternative mitigation and adaptation actions. United States. Environmental Protection Agency. Climate Change Division (Contract EP-BPA-12-H-0023, Call Order EP-B13H-00143)

    Theorising interprofessional pedagogic evaluation: framework for evaluating the impact of interprofessional CPD on practice change

    This paper outlines the development of a conceptual framework to guide the evaluation of the impact of the pedagogy employed in continuing professional development for professionals in education, health and social care. The work is developed as part of the Centre for Excellence in Teaching and Learning: Interprofessional Learning across the Public Sector (CETL: IPPS) at the University of Southampton. The paper briefly outlines the field for pedagogic research and comments on the underpinning theories that have so far been used to guide research into interprofessional learning (IPL). It maps out the development of interprofessional CPD in its specific context as part of the CETL: IPPS with its links to a local authority undergoing service reorganisation and the role of continuing professional development (CPD) in effecting change. It then brings together a theoretical framework with the potential to explore, explain and evaluate the essential features of the model of pedagogy used in interprofessional CPD, in which professionals from education have for the first time been included alongside those from health and social care. The framework draws upon elements of situated learning theory, Activity Theory and Dreier’s work (2002, 1999) on trajectories of participation, particularly Personal Action Potency. By combining the resulting analytic framework with an adapted version of an established evaluation model, a theoretically-driven, practicable evaluation matrix is developed. The matrix has potential use in evaluating the impact of pedagogic input on practice change. The paper models a process for developing a conceptual framework to steer pedagogic evaluation. Such a process and the resulting matrix may be of use to other researchers who are similarly developing pedagogic evaluation

    Complex circular subsidence structures in tephra deposited on large blocks of ice: Varða tuff cone, Öræfajökull, Iceland

    Several broadly circular structures up to 16 m in diameter, into which higher strata have sagged and locally collapsed, are present in a tephra outcrop on southwest Öræfajökull, southern Iceland. The tephra was sourced in a nearby basaltic tuff cone at Varða. The structures have not previously been described in tuff cones, and they probably formed by the melting out of large buried blocks of ice emplaced during a preceding jökulhlaup that may have been triggered by a subglacial eruption within the Öræfajökull ice cap. They are named ice-melt subsidence structures, and they are analogous to kettle holes that are commonly found in proglacial sandurs and some lahars sourced in ice-clad volcanoes. The internal structure is better exposed in the Varða examples because of an absence of fluvial infilling and reworking, and erosion of the outcrop to reveal the deeper geometry. The ice-melt subsidence structures at Varða are a proxy for buried ice. They are the only known evidence for a subglacial eruption and associated jökulhlaup that created the ice blocks. The recognition of such structures elsewhere will be useful in reconstructing more complete regional volcanic histories as well as for identifying ice-proximal settings during palaeoenvironmental investigations

    Percutaneous Transendocardial Delivery of Self-complementary Adeno-associated Virus 6 Achieves Global Cardiac Gene Transfer in Canines

    Achieving efficient cardiac gene transfer in a large animal model has proven to be technically challenging. Previous strategies have used cardiopulmonary bypass or dual catheterization with the aid of vasodilators to deliver vectors, such as adenovirus, adeno-associated virus (AAV), or plasmid DNA. Although single-stranded AAV (ssAAV) vectors have shown the greatest promise, they suffer from delayed expression, which might be circumvented using self-complementary vectors. We sought to optimize cardiac gene transfer using a percutaneous transendocardial injection catheter to deliver adeno-associated viral vectors to the canine myocardium. Four vectors were evaluated: ssAAV9, self-complementary AAV9 (scAAV9), scAAV8 and scAAV6, so that comparison could be made between single-stranded and self-complementary vectors as well as among serotypes 9, 8 and 6. We demonstrate that scAAV is superior to ssAAV and that AAV6 is superior to the other serotypes evaluated. Biodistribution studies revealed that vector genome copies were 15-4,000 times more abundant in the heart than in any other organ for scAAV6. Percutaneous transendocardial injection of scAAV6 is a safe, effective method to achieve efficient cardiac gene transfer

    Social democracy, embeddedness and decommodification: On the conceptual innovations and intellectual affiliations of Karl Polanyi

    Of the several debates that revolve around the work of the economic historian and political economist Karl Polanyi, one that continues to exercise minds concerns his analysis of, and political attitudes toward, post-war capitalism and the welfare state. Simplified a little, it is a debate with two sides. To borrow Iván Szelényi's terms, one side constructs a ‘hard’ Karl Polanyi, the other a ‘soft’ one. The former advocated a socialist mixed economy dominated by redistributive mechanisms. He was a radical socialist for whom the market should never be the dominant mechanism of economic coordination. His ‘soft’ alter ego insisted that the market system remain essentially intact but be complemented by redistributive mechanisms. The ‘double movement’ – the central thesis of his ‘Great Transformation’ – acts, in this reading, as a self-correcting mechanism that moderates the excesses of market fundamentalism; its author was positioned within the social-democratic mainstream for which the only realistic desirable goal is a regulated form of capitalism. In terms of textual evidence there is much to be said for both interpretations. In this article I suggest a different approach, one that focuses upon the meaning of Polanyi's concepts in relation to their socio-political and intellectual environment

    Localised and delocalised plasmons in metallic nano-voids

    Nanostructured metal films composed of periodically arranged spherical voids are grown by electrochemical deposition through a self-assembled template. Detailed measurements of the angle- and orientation-dependent reflectivity for different sample geometries reveal the spectral dispersion of several different types of surface plasmon modes. The dependence of the energies of both delocalized Bragg and localized Mie plasmons on the void geometry is presented, along with theoretical models to explain some of these experimental findings. Strong interactions between the different plasmon modes as well as other mixing processes are identified. Understanding such plasmonic crystals allows for the engineering of devices tailored for a wide range of sensing applications
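
    As background for the localized (Mie-like) void modes mentioned above, the block below sketches the standard quasi-static textbook condition for the multipolar resonances of a spherical cavity of permittivity ε_d embedded in a metal with permittivity ε_m(ω); this is generic electrostatics for a sphere under a Drude-metal assumption, not a result taken from the paper, and it ignores retardation and the truncated-void geometry of the real samples.

```latex
% Quasi-static resonance condition for mode l of a spherical void
% (dielectric sphere, permittivity eps_d, inside a metal eps_m):
\[
  l\,\varepsilon_d + (l+1)\,\varepsilon_m(\omega) = 0,
  \qquad
  \varepsilon_m(\omega) = 1 - \frac{\omega_p^2}{\omega^2}
  \;\Rightarrow\;
  \omega_{l=1}^{\mathrm{void}} = \omega_p\sqrt{\tfrac{2}{3}}
  \quad (\varepsilon_d = 1),
\]
% i.e. the dipolar void mode of a lossless Drude metal sits above the
% corresponding solid-sphere resonance at \omega_p/\sqrt{3}.
```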

    Geographically touring the eastern bloc: British geography, travel cultures and the Cold War

    This paper considers the role of travel in the generation of geographical knowledge of the eastern bloc by British geographers. Based on oral history and surveys of published work, the paper examines the roles of three kinds of travel experience: individual private travels, tours via state tourist agencies, and tours by academic delegations. Examples are drawn from across the eastern bloc, including the USSR, Poland, Romania, East Germany and Albania. The relationship between travel and publication is addressed, notably within textbooks, and in the Geographical Magazine. The study argues for the extension of accounts of cultures of geographical travel, and seeks to supplement the existing historiography of Cold War geography

    Variability in the analysis of a single neuroimaging dataset by many teams

    Data analysis workflows in many scientific domains have become increasingly complex and flexible. To assess the impact of this flexibility on functional magnetic resonance imaging (fMRI) results, the same dataset was independently analyzed by 70 teams, testing nine ex ante hypotheses [1]. The flexibility of analytic approaches is exemplified by the fact that no two teams chose identical workflows to analyze the data. This flexibility resulted in sizeable variation in hypothesis test results, even for teams whose statistical maps were highly correlated at intermediate stages of their analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Importantly, a meta-analytic approach that aggregated information across teams yielded a significant consensus in activated regions. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset [2-5]. Our findings show that analytic flexibility can have substantial effects on scientific conclusions, and identify factors that may be related to variability in fMRI analyses. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for multiple analyses of the same data. Potential approaches to mitigate issues related to analytical variability are discussed
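
    To make the idea of a meta-analytic consensus across teams concrete, the sketch below combines per-team statistical maps with an unweighted Stouffer z-combination, one common image-based aggregation scheme; the arrays, team count and signal structure are hypothetical, and the actual study used a more elaborate pipeline, so this is purely illustrative.

```python
import numpy as np

def stouffer_consensus(z_maps):
    """Combine per-team z-statistic maps (teams x voxels) into one consensus map.

    Unweighted Stouffer combination: sum the z-scores and divide by the square
    root of the number of teams. Real pipelines also handle sign conventions,
    missing voxels and between-team correlation.
    """
    z = np.asarray(z_maps, dtype=float)
    return z.sum(axis=0) / np.sqrt(z.shape[0])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_teams, n_voxels = 70, 1000
    # Hypothetical maps: a weak shared signal in the first 100 voxels plus noise.
    signal = np.zeros(n_voxels)
    signal[:100] = 0.5
    z_maps = signal + rng.standard_normal((n_teams, n_voxels))
    consensus = stouffer_consensus(z_maps)
    print("mean |z| in signal region:", np.abs(consensus[:100]).mean().round(2))
    print("mean |z| elsewhere:      ", np.abs(consensus[100:]).mean().round(2))
```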