    Security-oriented infrastructures for social simulation

    The JISC-funded National e-Infrastructure for Social Simulation (NeISS) project aims to develop and provide new services to social scientists and public/private sector policymakers interested in “what-if” questions that have an impact upon society and can be tackled through social simulation. As a first what-if question, the project has identified a traffic simulation modelling how congestion will affect routes within a city or region, projected across a time-span of decades. This paper describes the work done in implementing a secure, user-oriented environment that provides seamless access to nationally significant data sets such as the 2001 Census and demographic transition statistics from the British Household Panel Survey (BHPS), and to a Population Reconstruction Model (PRM) simulator, which simulates a population of individuals or households based upon these data sets.

    GPs’ perspectives on the management of patients with multimorbidity: Systematic review and synthesis of qualitative research

    Objective: To synthesise the existing published literature on the perceptions of general practitioners (GPs) or their equivalent on the clinical management of multimorbidity and determine targets for future research that aims to improve clinical care in multimorbidity. Design: Systematic review and metaethnographic synthesis of primary studies that used qualitative methods to explore GPs’ experiences of clinical management of multimorbidity or multiple chronic diseases. Data sources: EMBASE, MEDLINE, CINAHL, PsycInfo, Academic Search Complete, SocIndex, Social Science Full Text and digital theses/online libraries (database inception to September 2012) to identify literature using qualitative methods (focus groups or interviews). Review methods: The 7-step metaethnographic approach described by Noblit and Hare, which involves cross-interpretation between studies while preserving the context of the primary data. Results: Of 1805 articles identified, 37 were reviewed in detail and 10 were included, covering a total of 275 GPs in 7 different countries. Four areas of difficulty specific to the management of multimorbidity emerged from these papers: disorganisation and fragmentation of healthcare; the inadequacy of guidelines and evidence-based medicine; challenges in delivering patient-centred care; and barriers to shared decision-making. A ‘line of argument’ was drawn which described GPs’ sense of isolation in decision-making for multimorbid patients. Conclusions: This systematic review shows that the problem areas for GPs in the management of multimorbidity may be classified into four domains. There will be no ‘one size fits all’ intervention for multimorbidity, but these domains may be useful targets to guide the development of interventions that will assist and improve the provision of care to multimorbid patients.

    UTOPIA—User-Friendly Tools for Operating Informatics Applications

    Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human-computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements.

    On the Interpretation of Supernova Light Echo Profiles and Spectra

    The light echo systems of historical supernovae in the Milky Way and local group galaxies provide an unprecedented opportunity to reveal the effects of asymmetry on observables, particularly optical spectra. Scattering dust at different locations on the light echo ellipsoid witnesses the supernova from different perspectives, and the light consequently scattered towards Earth preserves the shape of line profile variations introduced by asymmetries in the supernova photosphere. However, the interpretation of supernova light echo spectra to date has not involved a detailed consideration of the effects of outburst duration and geometrical scattering modifications due to finite scattering dust filament dimension, inclination, and image point-spread function and spectrograph slit width. In this paper, we explore the implications of these factors and present a framework for future resolved supernova light echo spectra interpretation, and test it against Cas A and SN 1987A light echo spectra. We conclude that the full modeling of the dimensions and orientation of the scattering dust using the observed light echoes at two or more epochs is critical for the correct interpretation of light echo spectra. Indeed, without doing so one might falsely conclude that differences exist when none are actually present. (Comment: 18 pages, 22 figures, accepted for publication in Ap)
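    The scattering geometry this abstract relies on follows the classical light-echo relation: dust that echoes at a given time after outburst lies on a paraboloid around the line of sight. A minimal sketch (the function name and unit choices are ours, not the paper's):

    ```python
    def echo_depth(rho_ly, t_yr):
        """Line-of-sight depth z (light-years) of scattering dust seen to
        echo t_yr years after outburst at projected radius rho_ly.
        The path-length condition r - z = c*t, with r the dust's distance
        from the supernova, gives the paraboloid
            z = rho**2 / (2*c*t) - c*t / 2,
        where z > 0 points towards the observer and c*t equals t_yr when
        distances are measured in light-years."""
        ct = t_yr  # c * t expressed in light-years
        return rho_ly ** 2 / (2.0 * ct) - ct / 2.0

    # Dust seen at projected radius rho = c*t lies in the supernova's plane
    # (z = 0); on-axis dust behind the source echoes from z = -c*t/2.
    ```

    Dust at different (rho, z) positions on this surface views the photosphere from different angles, which is why resolved echoes probe asymmetry.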

    HypTrails: A Bayesian Approach for Comparing Hypotheses About Human Trails on the Web

    When users interact with the Web today, they leave sequential digital trails on a massive scale. Examples of such human trails include Web navigation, sequences of online restaurant reviews, or online music playlists. Understanding the factors that drive the production of these trails can be useful for, e.g., improving underlying network structures, predicting user clicks or enhancing recommendations. In this work, we present a general approach called HypTrails for comparing a set of hypotheses about human trails on the Web, where hypotheses represent beliefs about transitions between states. Our approach utilizes Markov chain models with Bayesian inference. The main idea is to incorporate hypotheses as informative Dirichlet priors and to leverage the sensitivity of Bayes factors on the prior for comparing hypotheses with each other. For eliciting Dirichlet priors from hypotheses, we present an adaptation of the so-called (trial) roulette method. We demonstrate the general mechanics and applicability of HypTrails by performing experiments with (i) synthetic trails for which we control the mechanisms that have produced them and (ii) empirical trails stemming from different domains including website navigation, business reviews and online music listening. Our work expands the repertoire of methods available for studying human trails on the Web. (Comment: Published in the proceedings of WWW'1)
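    The core mechanic described here (expressing hypotheses as Dirichlet priors over Markov-chain transition rows and ranking them by marginal likelihood) can be illustrated with a small sketch. This is not the authors' code: the 3-state counts, the two hypothesis matrices and the weighting factor `k` are all invented for illustration.

    ```python
    from math import lgamma

    def log_evidence(counts, alpha):
        """Log marginal likelihood of observed transition counts under a
        first-order Markov chain with an independent Dirichlet prior on
        each row: sum over rows of log B(alpha + n) - log B(alpha),
        where B is the multivariate beta function."""
        total = 0.0
        for n_row, a_row in zip(counts, alpha):
            post = [a + c for a, c in zip(a_row, n_row)]
            total += sum(lgamma(x) for x in post) - lgamma(sum(post))
            total -= sum(lgamma(x) for x in a_row) - lgamma(sum(a_row))
        return total

    # Invented 3-state example: observed trails dominated by self-transitions.
    counts = [[8, 1, 1],
              [1, 8, 1],
              [1, 1, 8]]
    k = 5.0  # assumed "belief strength" spread over each hypothesis row
    uniform  = [[1 + k / 3] * 3 for _ in range(3)]             # H1: all moves alike
    selfloop = [[1 + (k if i == j else 0) for j in range(3)]   # H2: users stay put
                for i in range(3)]

    log_bayes_factor = log_evidence(counts, selfloop) - log_evidence(counts, uniform)
    # A positive log Bayes factor means the data favour the self-loop hypothesis.
    ```

    Because each row's Dirichlet-multinomial evidence has a closed form, comparing many hypotheses at a given belief strength reduces to evaluating this sum once per hypothesis.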

    Enabling quantitative data analysis through e-infrastructures

    This paper discusses how quantitative data analysis in the social sciences can engage with and exploit an e-Infrastructure. We highlight how a number of activities which are central to quantitative data analysis, referred to as ‘data management’, can benefit from e-infrastructure support. We conclude by discussing how these issues are relevant to the DAMES (Data Management through e-Social Science) research Node, an ongoing project that aims to develop e-Infrastructural resources for quantitative data analysis in the social sciences.

    Secure, performance-oriented data management for nanoCMOS electronics

    The EPSRC pilot project Meeting the Design Challenges of nanoCMOS Electronics (nanoCMOS) is focused upon delivering a production-level e-Infrastructure to meet the challenges facing the semiconductor industry in dealing with the next generation of ‘atomic-scale’ transistor devices. This scale means that previous assumptions on the uniformity of transistor devices in electronics circuit and systems design are no longer valid, and the industry as a whole must deal with variability throughout the design process. Infrastructures to tackle this problem must provide seamless access to very large HPC resources for computationally expensive simulation of statistical ensembles of microscopically varying physical devices, and manage the many hundreds of thousands of files and metadata associated with these simulations. A key challenge in undertaking this is in protecting the intellectual property associated with the data, simulations and design process as a whole. In this paper we present the nanoCMOS infrastructure and outline an evaluation undertaken on the Storage Resource Broker (SRB) and the Andrew File System (AFS), considering in particular the extent to which they meet the performance and security requirements of the nanoCMOS domain. We also describe how metadata management is supported and linked to simulations and results in a scalable and secure manner.