
    A New Vaccine for Tuberculosis: The Challenges of Development and Deployment

    Tuberculosis (TB) is one of the world’s leading causes of death due to infection, and efforts to control TB would be substantially aided by the availability of an improved TB vaccine. There are currently nine new TB vaccines in clinical development, and the first efficacy trials are due to commence in 2009. There are many complex ethical issues which arise at all stages of TB vaccine development, from the need to conduct trials in developing countries to informed consent and the process of ethical review. While it is important that these issues are discussed, it may also be timely to consider the challenges which may arise if a vaccine in clinical development proves to be highly effective. We examine a number of scenarios where decisions on the deployment of a new TB vaccine may impact on the rights and liberty of the individual.

    Better governance, better access: practising responsible data sharing in the METADAC governance infrastructure.

    BACKGROUND: Genomic and biosocial research data about individuals are rapidly proliferating, bringing novel opportunities for data integration and use. The scale, pace and novelty of these applications raise a number of urgent sociotechnical, ethical and legal questions, including optimal methods of data storage, management and access. Although the open science movement advocates unfettered access to research data, many of the UK's longitudinal cohort studies operate systems of managed data access, in which access is governed by legal and ethical agreements between stewards of research datasets and researchers wishing to make use of them. Amongst other things, these agreements aim to respect the reasonable expectations of the research participants who provided data and samples, as expressed in the consent process. Arguably, responsible data management and governance of data and sample use are foundational to the consent process in longitudinal studies and are an important source of trustworthiness in the eyes of those who contribute data to genomic and biosocial research. METHODS: This paper presents an ethnographic case study exploring the foundational principles of a governance infrastructure for Managing Ethico-social, Technical and Administrative issues in Data ACcess (METADAC), which are operationalised through a committee known as the METADAC Access Committee. METADAC governs access to phenotype, genotype and 'omic' data and samples from five UK longitudinal studies. FINDINGS: Using the example of METADAC, we argue that three key structural features are foundational for practising responsible data sharing: independence and transparency; interdisciplinarity; and participant-centric decision-making. We observe that the international research community is proactively working towards optimising the use of research data, integrating/linking these data with routine data generated by health and social care services and other administrative data services to improve the analysis, interpretation and utility of these data. The governance of these new complex data assemblages will require a range of expertise from across a number of domains and disciplines, including that of study participants. Human-mediated decision-making bodies will be central to ensuring achievable, reasoned and responsible decisions about the use of these data; the METADAC model described in this paper provides an example of how this could be realised.

    Ages for exoplanet host stars

    Age is an important characteristic of a planetary system, but also one that is difficult to determine. Assuming that the host star and the planets are formed at the same time, the challenge is to determine the stellar age. Asteroseismology provides precise age determination, but in many cases the required detailed pulsation observations are not available. Here we concentrate on other techniques, which may have broader applicability but also serious limitations. Further development of this area requires improvements in our understanding of the evolution of stars and their age-dependent characteristics, combined with observations that allow reliable calibration of the various techniques. (Comment: To appear in "Handbook of Exoplanets", eds. Deeg, H.J. & Belmonte, J.A., Springer, 2018.)

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
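
    As an illustration of the style of analysis described in this abstract, the Python sketch below correlates an ordinal difficulty grade with a dichotomous outcome and scores its discrimination. The data are randomly generated and the variable names (grade, converted) are hypothetical stand-ins; none of the study's variables or results are reproduced, and the Jonckheere–Terpstra step for continuous outcomes is omitted.

        # Hedged sketch: ordinal-grade trend and discrimination analysis on synthetic data.
        import numpy as np
        from scipy.stats import kendalltau
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        grade = rng.integers(1, 6, size=1000)            # ordinal difficulty grade, 1-5 (hypothetical)
        converted = rng.random(1000) < 0.05 * grade      # dichotomous outcome, risk rising with grade

        # Kendall's tau-b: association between the ordinal grade and a binary outcome
        tau, p = kendalltau(grade, converted)

        # AUROC: how well the grade alone discriminates the outcome
        auroc = roc_auc_score(converted, grade)
        print(f"tau = {tau:.2f} (p = {p:.3g}), AUROC = {auroc:.3f}")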

    Measurement of the cross-section of high transverse momentum vector bosons reconstructed as single jets and studies of jet substructure in pp collisions at √s = 7 TeV with the ATLAS detector

    This paper presents a measurement of the cross-section for high transverse momentum W and Z bosons produced in pp collisions and decaying to all-hadronic final states. The data used in the analysis were recorded by the ATLAS detector at the CERN Large Hadron Collider at a centre-of-mass energy of √s = 7 TeV and correspond to an integrated luminosity of 4.6 fb⁻¹. The measurement is performed by reconstructing the boosted W or Z bosons in single jets. The reconstructed jet mass is used to identify the W and Z bosons, and a jet substructure method based on energy cluster information in the jet centre-of-mass frame is used to suppress the large multi-jet background. The cross-section for events with a hadronically decaying W or Z boson, with transverse momentum p_T > 320 GeV and pseudorapidity |η| < 1.9, is measured to be σ_{W+Z} = 8.5 ± 1.7 pb and is compared to next-to-leading-order calculations. The selected events are further used to study jet grooming techniques.
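
    For orientation, a measurement of this kind generically extracts the cross-section from the background-subtracted event yield. The relation below is a standard textbook form with placeholder symbols, not the exact statistical treatment used in the paper:

        \sigma_{W+Z} \;=\; \frac{N_{\mathrm{data}} - N_{\mathrm{bkg}}}{\varepsilon \int L \,\mathrm{d}t}

    where N_data is the number of selected candidate jets, N_bkg the estimated multi-jet background, ε the signal acceptance times selection efficiency, and ∫L dt = 4.6 fb⁻¹ the integrated luminosity quoted above.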

    Search for pair-produced long-lived neutral particles decaying to jets in the ATLAS hadronic calorimeter in pp collisions at √s = 8 TeV

    The ATLAS detector at the Large Hadron Collider at CERN is used to search for the decay of a scalar boson to a pair of long-lived particles, neutral under the Standard Model gauge group, in 20.3 fb⁻¹ of data collected in proton–proton collisions at √s = 8 TeV. This search is sensitive to long-lived particles that decay to Standard Model particles producing jets at the outer edge of the ATLAS electromagnetic calorimeter or inside the hadronic calorimeter. No significant excess of events is observed. Limits are reported on the product of the scalar boson production cross section times branching ratio into long-lived neutral particles, as a function of the proper lifetime of the particles. Limits are reported for boson masses from 100 GeV to 900 GeV, and a long-lived neutral particle mass from 10 GeV to 150 GeV.

    Search for direct pair production of the top squark in all-hadronic final states in proton-proton collisions at √s = 8 TeV with the ATLAS detector

    The results of a search for direct pair production of the scalar partner to the top quark, using an integrated luminosity of 20.1 fb⁻¹ of proton–proton collision data at √s = 8 TeV recorded with the ATLAS detector at the LHC, are reported. The top squark is assumed to decay via t̃ → tχ̃⁰₁ or t̃ → bχ̃±₁ → bW(*)χ̃⁰₁, where χ̃⁰₁ (χ̃±₁) denotes the lightest neutralino (chargino) in supersymmetric models. The search targets a fully-hadronic final state in events with four or more jets and large missing transverse momentum. No significant excess over the Standard Model background prediction is observed, and exclusion limits are reported in terms of the top squark and neutralino masses and as a function of the branching fraction of t̃ → tχ̃⁰₁. For a branching fraction of 100%, top squark masses in the range 270–645 GeV are excluded for χ̃⁰₁ masses below 30 GeV. For a branching fraction of 50% to either t̃ → tχ̃⁰₁ or t̃ → bχ̃±₁, and assuming the χ̃±₁ mass to be twice the χ̃⁰₁ mass, top squark masses in the range 250–550 GeV are excluded for χ̃⁰₁ masses below 60 GeV.

    Super-Resolution Imaging Strategies for Cell Biologists Using a Spinning Disk Microscope

    In this study we use a spinning disk confocal microscope (SD) to generate super-resolution images of multiple cellular features from any plane in the cell. We obtain super-resolution images by using stochastic intensity fluctuations of biological probes, combining Photoactivated Localisation Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) methodologies. We compared different image analysis algorithms for processing super-resolution data to identify the most suitable for analysis of particular cell structures. Super-resolution optical fluctuation imaging (SOFI) was chosen for the X and Y dimensions and achieved a resolution of ca. 80 nm; higher resolutions, approaching 30 nm, were possible depending on the super-resolution image analysis algorithm used. Our method uses low laser power and fluorescent probes which are available either commercially or through the scientific community, and is therefore gentle enough for biological imaging. Through comparative studies with structured illumination microscopy (SIM) and widefield epifluorescence imaging, we found our methodology advantageous for imaging cellular structures which are not immediately at the cell–substrate interface, including the nuclear architecture and mitochondria. We have shown that it is possible to obtain two-colour images, which highlights the potential this technique has for high-content screening, imaging of multiple epitopes and live cell imaging.
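
    To make the fluctuation-based reconstruction concrete, below is a minimal second-order SOFI sketch in Python. It is illustrative only: the synthetic Poisson movie, the array shapes and the sofi2 helper are hypothetical stand-ins under the stated assumptions, not the pipeline used in the study.

        # Minimal second-order SOFI sketch on synthetic data (hypothetical, for illustration).
        import numpy as np

        def sofi2(stack, lag=1):
            """Second-order SOFI image from a fluorescence movie.

            stack : ndarray, shape (T, H, W) -- time series of frames
            lag   : temporal lag in frames; lag >= 1 suppresses shot noise,
                    which is uncorrelated between frames
            """
            delta = stack - stack.mean(axis=0)   # intensity fluctuations dF(r, t)
            # second-order auto-cumulant <dF(t) * dF(t + lag)>, averaged over time
            return (delta[:-lag] * delta[lag:]).mean(axis=0)

        # usage with a synthetic movie standing in for a blinking-emitter acquisition
        rng = np.random.default_rng(0)
        movie = rng.poisson(50, size=(500, 64, 64)).astype(float)
        img = sofi2(movie, lag=1)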