
    Should all anticoagulated patients with head injury receive a CT scan? Decision-analysis modelling of an observational cohort

    Objectives: It is not currently clear whether all anticoagulated patients with a head injury should receive CT scanning or only those with evidence of traumatic brain injury (e.g. loss of consciousness or amnesia). We aimed to determine the cost-effectiveness of CT for all anticoagulated patients with a head injury compared with selective CT use. Design: Decision-analysis modelling of data from a multi-centre observational study. Setting: 33 Emergency Departments in England and Scotland. Participants: 3566 adults (aged ≄16 years) who had suffered blunt head injury, were taking warfarin and underwent selective CT scanning. Main outcome measures: Estimated expected benefit in quality-adjusted life years (QALYs) had the entire cohort received a CT scan; the estimated additional costs of CT scanning; and the potential cost implications associated with patient survival and improved health. These values were used to estimate the cost per QALY of a CT-for-all strategy compared with observed practice based on guidelines recommending selective CT use. Results: Of the 1420/3534 patients (40%) who did not receive a CT scan, 7 (0.5%) suffered a potentially avoidable head-injury-related adverse outcome. If CT scanning had been performed in all patients, appropriate treatment could have gained 3.41 additional QALYs but would have incurred ÂŁ193,149 in additional treatment costs and ÂŁ130,683 in additional CT costs. The incremental cost-effectiveness ratio of ÂŁ94,895/QALY gained for unselective compared with selective CT use is markedly above the ÂŁ20,000-30,000/QALY threshold used by the UK National Institute for Health and Care Excellence to determine cost-effectiveness. Conclusions: CT scanning for all anticoagulated patients with head injury is not cost-effective compared with selective use of CT scanning based on guidelines recommending scanning only for those with evidence of traumatic brain injury.
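    The headline ICER can be reproduced from the figures quoted in the abstract; the short sketch below uses the rounded values given there, so it lands within rounding of the published ÂŁ94,895/QALY:

```python
# Incremental cost-effectiveness ratio (ICER) for "CT for all" versus
# selective CT, using the rounded figures quoted in the abstract (the small
# gap from the published GBP 94,895/QALY comes from rounding the QALY gain).
qaly_gain = 3.41            # additional quality-adjusted life years
treatment_costs = 193_149   # additional treatment costs (GBP)
ct_costs = 130_683          # additional CT scanning costs (GBP)

icer = (treatment_costs + ct_costs) / qaly_gain
print(f"ICER: GBP {icer:,.0f}/QALY")        # -> GBP 94,965/QALY

# NICE threshold band cited in the abstract: GBP 20,000-30,000/QALY.
print("Cost-effective?", icer <= 30_000)    # -> False
```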

    Large-scale assessment of 7-11-year-olds’ cognitive and sensorimotor function within the Born in Bradford longitudinal birth cohort study [version 2; peer review: 3 approved, 1 approved with reservations]

    Background: Cognitive ability and sensorimotor function are crucial aspects of children’s development, and are associated with physical and mental health outcomes and educational attainment. This paper describes cross-sectional sensorimotor and cognitive function data on over 15,000 children aged 7-10 years, collected as part of the Born in Bradford (BiB) longitudinal birth-cohort study. Methodological details of the large-scale data collection process are described, along with initial analyses covering the relationship between cognitive/sensorimotor ability and age and task difficulty, and the associations between tasks. Method: Data collection was completed in 86 schools between May 2016 and July 2019. Children were tested at school, individually, using a tablet computer with a digital stylus or finger touch for input. Assessments comprised a battery of three sensorimotor tasks (Tracking, Aiming, & Steering) and five cognitive tasks (three Working Memory tasks, Inhibition, and Processing Speed), which took approximately 40 minutes. Results: Performance improved with increasing age and decreasing task difficulty for each task. Performance on all three sensorimotor tasks was correlated, as was performance on the three working memory tasks. In addition, a composite working memory score correlated with performance on both inhibition and processing speed. Interestingly, within-age-group variation was much larger than between-age-group variation. Conclusions: The current project collected computerised measures of a range of cognitive and sensorimotor functions at 7-10 years of age in over 15,000 children. Performance varied as expected by age and task difficulty, and showed the predicted correlations between related tasks. Large within-age-group variation highlights the need to consider the profile of individual children when studying cognitive and sensorimotor development. These data can be linked to the wider BiB dataset, including measures of physical and mental health, biomarkers and genome-wide data, socio-demographic information, and routine data from local health and education services.
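    A minimal sketch of the kind of task-association analysis reported above, using synthetic data (the column names and values are ours, not the study's; the real analysis runs on the per-child BiB records):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000  # illustrative sample; the study tested over 15,000 children

# Hypothetical per-child task scores, one row per child.
df = pd.DataFrame(
    rng.standard_normal((n, 5)),
    columns=["wm_1", "wm_2", "wm_3", "inhibition", "processing_speed"],
)

# Correlation matrix across tasks (the paper reports correlations among the
# three working-memory tasks and among the three sensorimotor tasks).
print(df.corr().round(2))

# Composite working-memory score: mean of the z-scored WM tasks.
wm = df[["wm_1", "wm_2", "wm_3"]]
df["wm_composite"] = ((wm - wm.mean()) / wm.std()).mean(axis=1)

# Association of the composite with inhibition and processing speed.
print(df[["wm_composite", "inhibition", "processing_speed"]].corr().round(2))
```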

    Constraining the expansion rate of the Universe using low-redshift ellipticals as cosmic chronometers

    We present a new methodology to determine the expansion history of the Universe by analyzing the spectral properties of early-type galaxies (ETG). We found that for these galaxies the 4000 Å break is a spectral feature that correlates with their relative ages. In this paper we describe the method, explore its robustness using theoretical synthetic stellar population models, and apply it to a SDSS sample of ~14,000 ETGs. Our motivation to look for a new technique has been to minimise the dependence of the cosmic chronometer method on systematic errors. In particular, as a test of our method, we derive the value of the Hubble constant H0 = 72.6 ± 2.8 (stat) ± 2.3 (syst) km/s/Mpc (68% confidence), which is not only fully compatible with the value derived from the Hubble Key Project, but also has a comparable error budget. Using the SDSS, we also derive, assuming w = constant, a value for the dark energy equation of state parameter w = -1.0 ± 0.2 (stat) ± 0.3 (syst). Given that the SDSS ETG sample only reaches z ~ 0.3, this result shows the potential of the method. In future papers we will present results for the high-redshift universe, yielding a determination of H(z) up to z ~ 1.
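    The cosmic chronometer method behind this result rests on H(z) = -1/(1+z) · dz/dt, so a differential age between galaxy populations at nearby redshifts yields H directly. A minimal numerical sketch, with illustrative numbers rather than the SDSS measurements:

```python
# Cosmic-chronometer estimate of H(z): for passively evolving galaxies,
#   H(z) = -1/(1+z) * dz/dt,
# so a differential age dt between two nearby redshift bins gives H directly.

GYR_TO_S = 3.156e16      # seconds per gigayear
MPC_TO_KM = 3.086e19     # kilometres per megaparsec

def hubble_from_chronometers(z1, z2, age1_gyr, age2_gyr):
    """H at the midpoint redshift, in km/s/Mpc, from differential ages."""
    z_mid = 0.5 * (z1 + z2)
    dz = z2 - z1
    dt_s = (age2_gyr - age1_gyr) * GYR_TO_S
    h_per_s = -dz / dt_s / (1.0 + z_mid)     # H in 1/s
    return h_per_s * MPC_TO_KM               # convert to km/s/Mpc

# Illustrative: galaxies at z=0.10 are ~0.62 Gyr older than those at z=0.15.
print(f"H ~ {hubble_from_chronometers(0.15, 0.10, 11.0, 11.62):.0f} km/s/Mpc")
# -> H ~ 70 km/s/Mpc
```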

    Calculating the carbon footprint: implications for governing emissions and gender relations

    In this article, we use fresh empirical evidence and draw on feminist and critical accounting and organisational theories to contend that carbon calculators can be interpreted as discriminatory control technologies. They act as such by providing a new and flexible vocabulary for governing expenses, costs and investments at a distance, avoiding a sense of direct intervention by the government. Thus, given our stance that the carbon calculator cannot be considered a neutral tool, we argue that it has the potential to control personal responsibilities regarding both environmental and family-based issues.

    The Japanese model in retrospective: industrial strategies, corporate Japan and the 'hollowing out' of Japanese industry

    This article provides a retrospective look at the Japanese model of industrial development. This model combined an institutional approach to production based around the Japanese firm (Aoki's J-mode) with strategic state intervention in industry by the Japanese Ministry of International Trade and Industry (MITI). For a long period, the alignment of state and corporate interests appeared to match the wider public interest as the Japanese economy prospered. However, since the early 1990s, the global ambitions of the corporate sector have contributed to a significant 'hollowing out' of Japan's industrial base. As the world today looks for a new direction in economic management, we suggest the Japanese model provides policy-makers with a salutary lesson in tying the wider public interest to the interests of the corporate sector.

    A Performance Analysis Framework for WiFi/WiMAX Heterogeneous Metropolitan Networks Based on Cross-Layer Design

    The communication between network nodes within different protocol domains is often regarded simply as a black box with unknown configuration conditions along the path. We address network heterogeneity using a white-box approach and focus on its interconnection processes. To this end, we propose a Performance Analysis Framework (PAF) composed of a formalization of these interconnection processes using process algebra (PA) and the corresponding teletraffic performance models. In this contribution, we target the IEEE 802.16 and IEEE 802.11 protocols. For the teletraffic models, we extend previous models for such scenarios with the following protocol operational parameters (metrics): bit error rate (BER), packet error ratio (PER), and packet length (pl). From the framework's teletraffic models, the optimal packet length (OPL), end-to-end throughput, delay, and packet loss are obtained. The PAF outperforms previous modeling solutions in terms of delay and throughput relative to NS3 simulation results.
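    The role BER, PER, and packet length play in such models can be illustrated with the textbook relation PER = 1 - (1 - BER)^L and a brute-force search for the throughput-optimal packet length; this is a generic sketch under independent-bit-error assumptions, not the paper's cross-layer teletraffic model:

```python
import numpy as np

def per(ber: float, length_bits: int) -> float:
    """Packet error ratio for independent bit errors."""
    return 1.0 - (1.0 - ber) ** length_bits

def efficiency(ber: float, payload_bits: int, header_bits: int = 320) -> float:
    """Goodput efficiency: payload fraction times packet success probability."""
    total = payload_bits + header_bits
    return (payload_bits / total) * (1.0 - per(ber, total))

# Longer packets amortise the header but are more likely to be corrupted,
# so efficiency peaks at an optimal packet length (OPL).
ber = 1e-5
payloads = np.arange(100, 20_000, 100)
eff = [efficiency(ber, p) for p in payloads]
opl = payloads[int(np.argmax(eff))]
print(f"BER={ber:g}: optimal payload ~ {opl} bits, efficiency {max(eff):.3f}")
```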

    Cosmic Chronometers: Constraining the Equation of State of Dark Energy. I: H(z) Measurements

    We present new determinations of the cosmic expansion history from red-envelope galaxies. For this purpose we have obtained high-quality spectra with the Keck-LRIS spectrograph of red-envelope galaxies in 24 galaxy clusters in the redshift range 0.2 < z < 1.0. We complement these Keck spectra with high-quality, publicly available archival spectra from the SPICES and VVDS surveys. We improve over our previous expansion history measurements in Simon et al. (2005) by providing two new determinations of the expansion history: H(z) = 97 ± 62 km/s/Mpc at z = 0.5 and H(z) = 90 ± 40 km/s/Mpc at z = 0.8. We discuss the uncertainty in the expansion history determination that arises from uncertainties in the synthetic stellar-population models. We then use these new measurements in concert with cosmic-microwave-background (CMB) measurements to constrain cosmological parameters, with a special emphasis on dark-energy parameters and constraints on the curvature. In particular, we demonstrate the usefulness of direct H(z) measurements by constraining the dark-energy equation of state parameterized by w0 and wa and allowing for arbitrary curvature. Further, using only CMB and H(z) data, we constrain the number of relativistic degrees of freedom to be 4 ± 0.5 and their total mass to be < 0.2 eV, both at 1σ.
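    How direct H(z) points feed a dark-energy fit can be sketched with a chi-square of a flat CPL (w0, wa) model against the two measurements above; the CMB data, curvature freedom, and priors of the actual analysis are omitted, so this only shows the role the H(z) points play:

```python
import numpy as np

# The two new H(z) determinations quoted in the abstract.
z_obs = np.array([0.5, 0.8])
h_obs = np.array([97.0, 90.0])   # km/s/Mpc
sigma = np.array([62.0, 40.0])

def hubble(z, h0=70.0, om=0.3, w0=-1.0, wa=0.0):
    """H(z) for a flat universe with CPL dark energy, w(a) = w0 + wa*(1-a)."""
    a = 1.0 / (1.0 + z)
    rho_de = a ** (-3.0 * (1.0 + w0 + wa)) * np.exp(-3.0 * wa * (1.0 - a))
    return h0 * np.sqrt(om * (1.0 + z) ** 3 + (1.0 - om) * rho_de)

def chi2(w0, wa):
    return float(np.sum(((hubble(z_obs, w0=w0, wa=wa) - h_obs) / sigma) ** 2))

print(chi2(-1.0, 0.0))   # LCDM
print(chi2(-0.5, 0.5))   # a shallower, evolving equation of state
```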

    Towards a high precision calculation for the pion-nucleus scattering lengths

    We calculate the leading isospin-conserving few-nucleon contributions to pion scattering on ÂČH, ÂłHe, and ⁎He. We demonstrate that the strong contributions to the pion-nucleus scattering lengths can be controlled theoretically to an accuracy of a few percent for isoscalar nuclei and of 10% for isovector nuclei. In particular, we find the π-ÂłHe scattering length to be (62 ± 4 ± 7) × 10^-3 m_π^-1, where the uncertainties are due to ambiguities in the π-N scattering lengths and few-nucleon effects, respectively. To establish this accuracy we need to identify a suitable power counting for pion-nucleus scattering. For this purpose we study the dependence of the two-nucleon contributions to the scattering length on the binding energy of ÂČH. Furthermore, we investigate the relative size of the leading two-, three-, and four-nucleon contributions. For the numerical evaluation of the pertinent integrals, a Monte Carlo method suitable for momentum space is devised. Our results show that in general the power counting suggested by Weinberg is capable of properly predicting the relative importance of N-nucleon operators; however, it fails to capture the relative strength of N- and (N+1)-nucleon operators, where we find a suppression by a factor of 5 compared to the predicted factor of 50. The relevance for the extraction of the isoscalar π-N scattering length from pionic ÂČH and ⁎He is discussed. As a side result, we show that the calculation of the π-ÂČH scattering length is already beyond the range of applicability of heavy-pion effective field theory.
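    The paper devises its own momentum-space Monte Carlo scheme; the sketch below only illustrates the generic idea of such an evaluation, sampling momenta from a normalized weight and averaging a kernel (all functions and scales here are toy choices, not the paper's operators):

```python
import numpy as np

# Monte Carlo estimate of a momentum-space integral of the schematic form
#   I = integral d^3p |phi(p)|^2 g(p),
# by drawing momenta from |phi(p)|^2 and averaging g over the samples.

rng = np.random.default_rng(42)
n_samples = 200_000

# Toy "wave function": Gaussian in momentum with scale kappa (fm^-1),
# so each Cartesian component is sampled from a normal distribution.
kappa = 0.23
p = rng.normal(scale=kappa, size=(n_samples, 3))

# Toy operator kernel g(p): a simple regulated 1/(p^2 + kappa^2).
p2 = np.einsum("ij,ij->i", p, p)
g = 1.0 / (p2 + kappa**2)

estimate = g.mean()
error = g.std(ddof=1) / np.sqrt(n_samples)   # statistical MC uncertainty
print(f"<g> = {estimate:.3f} +/- {error:.3f}")
```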

    Anatomical subgroup analysis of the MERIDIAN cohort: failed commissuration

    Objective: To assess the contribution of in utero magnetic resonance (iuMR) imaging in fetuses diagnosed with either agenesis or hypogenesis of the corpus callosum (grouped as failed commissuration) on antenatal ultrasonography (USS) in the MERIDIAN cohort. Methods: We report a sub-group analysis of fetuses with failed commissuration diagnosed on USS (with or without ventriculomegaly) from the MERIDIAN study who had iuMR imaging within 2 weeks of USS and for whom outcome reference data were available. The diagnostic accuracy of USS and iuMR is reported, as well as indicators of diagnostic confidence and effects on prognosis/clinical management. Results: 79 fetuses with failed commissuration are reported (55 with agenesis and 24 with hypogenesis as the USS diagnoses). The diagnostic accuracy for detecting 'failed commissuration' as a group label was 34.2% for USS and 94.9% for iuMR (difference = 60.7%, 95% confidence interval 47.6% to 73.9%, p < 0.0001). The diagnostic accuracy for detecting hypogenesis of the corpus callosum as a discrete entity was 8.3% for USS and 87.5% for iuMR, whilst the diagnostic accuracy for detecting agenesis of the corpus callosum as a distinct entity was 40.0% for USS and 92.7% for iuMR. There was a statistically significant improvement in 'appropriate' diagnostic confidence when using iuMR imaging, as assessed by a 'score-based weighted average' method (p < 0.0001). Prognostic information given to the women changed in 36/79 (45.6%) cases after iuMR imaging, and its overall effect on clinical management was 'significant', 'major' or 'decisive' in 35/79 cases (44.3%). Conclusions: Our data suggest that any woman whose fetus has failed commissuration as the only intracranial finding detected on USS should have iuMR imaging for further evaluation.
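    The headline accuracy comparison can be approximately reproduced from the abstract: 34.2% of 79 implies 27 correct USS diagnoses and 94.9% implies 75 correct iuMR diagnoses. The sketch below uses an unpaired normal-approximation confidence interval, so it will not exactly match the published 47.6%-73.9% interval (USS and iuMR were assessed on the same fetuses, i.e. paired data):

```python
import math

n = 79
correct_uss = 27    # 34.2% of 79, inferred from the abstract
correct_iumr = 75   # 94.9% of 79, inferred from the abstract

p1, p2 = correct_uss / n, correct_iumr / n
diff = p2 - p1      # ~60.8% before rounding; the paper quotes 60.7%

# Unpaired normal-approximation 95% CI for the difference in proportions.
se = math.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"difference = {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```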
    • 

    corecore