4,741 research outputs found

    Practising dance history: reflections on the shared processes of dance historians and dance makers.

    Get PDF
    Recent trends have identified ways in which practitioners engage in research processes commensurate with those of traditional scholarship. Using historiography as an example, it is argued that scholarship is also, conversely, an artistic act in its use of 'expert intuition' and in the creation of its language and narrative fictions.

    The First Century of the Current Events Club of Gardiner, Maine 1892-1992

    Digitized with permissions from the Gardiner Current Events Club. https://digitalmaine.com/gardiner_books/1002/thumbnail.jp

    Why people attend science festivals: interests, motivations and self-reported benefits of public engagement with research

    As a form of public engagement, science festivals have rapidly expanded in size and number over recent years. However, as with other domains of informal public engagement that are not linked to policy outcomes, existing research does not fully address science festivals' impacts and popularity. This study adduces evidence from surveys and focus groups to elucidate the perspectives of visitors at a large UK science festival. Results show that visitors value the opportunities science festivals afford to interact with scientific researchers and to encounter different types of science engagement aimed at adults, children and families. The most significant self-reported impact of attending a science festival was the development of increased interest and curiosity about new areas of scientific knowledge within a socially stimulating and enjoyable setting.

    Development of an Interpretive Simulation Tool for the Proton Radiography Technique

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper we describe a new simulation tool that propagates realistic laser-driven point-like proton sources through three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high-resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from PIC or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field 'primitives' is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high-resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ~10^8 particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ~10 mm^3. Insights derived from this application show that the tool can support understanding of HED plasmas. Comment: Figures and tables related to the Appendix are included in the published journal article.
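The point-projection geometry such a tool synthesizes can be illustrated with a toy model (every field value, distance and proton speed below is assumed for illustration, not taken from the paper): protons fan out from a point source, pick up a small transverse angular kick in a thin field region, and drift to a detector, where the mean displacement of the hits encodes the path-integrated field.

```python
import random

q_over_m = 9.58e7        # proton charge-to-mass ratio, C/kg
v = 2.0e7                # proton speed, m/s (assumed)
L1, L2 = 0.01, 0.10      # source-to-field and field-to-detector distances, m
B, thickness = 5.0, 1e-3 # uniform 5 T field over a 1 mm slab (assumed)

def detector_position(theta):
    """Map a small launch angle to a detector coordinate."""
    x_field = L1 * theta                   # entry position in the field region
    kick = q_over_m * B * thickness / v    # angular kick; uniform B kicks every proton equally
    return x_field + L2 * (theta + kick)   # straight-line drift to the detector

random.seed(0)
hits = [detector_position(random.gauss(0.0, 0.01)) for _ in range(1000)]
shift = sum(hits) / len(hits)  # mean displacement reveals the field strength
```

With a structured (non-uniform) field the kick would vary with entry position, which is what produces the caustics and intensity modulations the abstract mentions.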

    Information-anchored sensitivity analysis: theory and application

    Analysis of longitudinal randomised controlled trials is frequently complicated because patients deviate from the protocol. Where such deviations are relevant for the estimand, we are typically required to make an untestable assumption about post-deviation behaviour in order to perform our primary analysis and estimate the treatment effect. In such settings, it is now widely recognised that we should follow this with sensitivity analyses to explore the robustness of our inferences to alternative assumptions about post-deviation behaviour. Although there has been a lot of work on how to conduct such sensitivity analyses, little attention has been given to the appropriate loss of information due to missing data within sensitivity analysis. We argue more attention needs to be given to this issue, showing it is quite possible for sensitivity analysis to either decrease or increase the information about the treatment effect. To address this critical issue, we introduce the concept of information-anchored sensitivity analysis. By this we mean sensitivity analysis in which the proportion of information about the treatment estimate lost due to missing data is the same as the proportion of information about the treatment estimate lost due to missing data in the primary analysis. We argue this forms a transparent, practical starting point for interpretation of sensitivity analysis. We then derive results showing that, for longitudinal continuous data, a broad class of controlled and reference-based sensitivity analyses performed by multiple imputation are information-anchored. We illustrate the theory with simulations and an analysis of a peer review trial, then discuss our work in the context of other recent work in this area. Our results give a theoretical basis for the use of controlled multiple imputation procedures for sensitivity analysis.
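Treating statistical information as inverse variance, the anchoring condition the abstract defines can be checked numerically; the variances below are hypothetical, chosen only to make the ratios easy to follow:

```python
# Information about the treatment effect is the reciprocal of the
# estimator's variance, so the proportion of information lost to
# missing data is 1 - V_full / V_missing.

def information_loss(var_full_data, var_with_missing):
    """Proportion of information lost due to missing data."""
    return 1.0 - var_full_data / var_with_missing

# Primary analysis: variance 0.040 with full data, 0.050 after
# multiple imputation of the missing values.
primary_loss = information_loss(0.040, 0.050)  # 20% of information lost

# An information-anchored sensitivity analysis must lose the same
# proportion: if its full-data variance is 0.044, its variance
# under imputation should be about 0.044 / (1 - 0.20) = 0.055.
anchored_var = 0.044 / (1.0 - primary_loss)
```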

    Motorcyclists and pillion passengers with open lower-limb fractures: a study using TARN data 2007-2014

    Introduction We aimed to identify population demographics of motorcyclists and pillion passengers with isolated open lower-limb fractures, to ascertain the impact of the revised 2009 British Orthopaedic Association/British Association of Plastic Reconstructive and Aesthetic Surgeons joint standards for the management of open fractures of the lower limb (BOAST 4), in terms of time to skeletal stabilisation and soft-tissue coverage, and to observe any impact on patient movement. Methods Retrospective cohort data were collected by the Trauma Audit and Research Network (TARN). A longitudinal analysis was performed between two timeframes in England (pre- and post-BOAST 4 revision): 2007-2009 and 2010-2014. Results A total of 1564 motorcyclists and 64 pillion passengers were identified. Of these, 93% (1521/1628) were male. The median age was 30.5 years for males and 36.7 years for females. There was a statistically significant difference in the number of patients who underwent skeletal stabilisation (49% vs 65%, P < 0.0001), the time from injury to skeletal stabilisation (7.33 hours vs 14.3 hours, P < 0.0001) and the proportion receiving soft-tissue coverage (26% vs 43%, P < 0.0001). There was no difference in the time from injury to soft-tissue coverage (62.3 hours vs 63.7 hours, P = 0.726). The number of patients taken directly to a major trauma centre (or its equivalent) increased between the two timeframes (12.5% vs 41%, P < 0.001). Conclusions Since the 2009 BOAST 4 revision, there has been no difference in the time taken from injury to soft-tissue coverage but the time from injury to skeletal stabilisation is longer. There has also been an increase in patient movement to centres offering joint orthopaedic and plastic care.

    Analysis and Synthesis of Metadata Goals for Scientific Data

    The proliferation of discipline-specific metadata schemes contributes to artificial barriers that can impede interdisciplinary and transdisciplinary research. The authors considered this problem by examining the domains, objectives, and architectures of nine metadata schemes used to document scientific data in the physical, life, and social sciences. They used a mixed-methods content analysis and Greenberg's (2005) metadata objectives, principles, domains, and architectural layout (MODAL) framework, and derived 22 metadata-related goals from textual content describing each metadata scheme. Relationships are identified between the domains (e.g., scientific discipline and type of data) and the categories of scheme objectives. For each strong correlation (>0.6), a Fisher's exact test for nonparametric data was used to determine significance (p < .05). Significant relationships were found between the domains and objectives of the schemes. Schemes describing observational data are more likely to have "scheme harmonization" (compatibility and interoperability with related schemes) as an objective; schemes with the objective "abstraction" (a conceptual model exists separate from the technical implementation) also have the objective "sufficiency" (the scheme defines a minimal amount of information to meet the needs of the community); and schemes with the objective "data publication" do not have the objective "element refinement." The analysis indicates that many metadata-driven goals expressed by communities are independent of scientific discipline or the type of data, although they are constrained by historical community practices and workflows as well as the technological environment at the time of scheme creation. The analysis reveals 11 fundamental metadata goals for metadata documenting scientific data in support of sharing research data across disciplines and domains. The authors report these results and highlight the need for more metadata-related research, particularly in the context of recent funding agency policy changes.
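The significance screen described above pairs a correlation threshold with Fisher's exact test on a 2x2 table of domain membership against objective presence. A self-contained sketch of that test (the counts are hypothetical, not the authors' data) is:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is at least as extreme as the observed one."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def p(k):  # probability of a table whose top-left cell is k
        return comb(r1, k) * comb(r2, c1 - k) / comb(n, c1)

    p_obs = p(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(p(k) for k in range(lo, hi + 1) if p(k) <= p_obs * (1 + 1e-9))

# Hypothetical screen: 4 of 5 observational-data schemes list
# "scheme harmonization" as an objective, versus 0 of 4 other schemes.
pval = fisher_exact_two_sided(4, 1, 0, 4)  # ~0.048, significant at p < .05
```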

    The impact of COVID-19 on research

    This article is made available for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. Coronavirus disease 2019 (COVID-19) has swept across the globe, causing hundreds of thousands of deaths, shutting down economies, closing borders and wreaking havoc on an unprecedented scale. It has strained healthcare services and personnel to the brink in many regions and will certainly deeply mark medical research in both the short and the long term.

    How Will Astronomy Archives Survive The Data Tsunami?

    Full text link
    The field of astronomy is starting to generate more data than can be managed, served and processed by current techniques. This paper outlines practices for developing next-generation tools and techniques for surviving this data tsunami, including rigorous evaluation of new technologies, partnerships between astronomers and computer scientists, and training of scientists in high-end software engineering skills. Comment: 8 pages, 3 figures; ACM Queue, Vol 9, Number 10, October 2011 (http://queue.acm.org/detail.cfm?id=2047483).
