
    Reanalysis of cancer mortality in Japanese A-bomb survivors exposed to low doses of radiation: bootstrap and simulation methods

    Background: The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods: Cancers with over 100 deaths in the 0 - 20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and bias-corrected accelerated (BCa) methods and simulation of the likelihood ratio test lead to confidence intervals for excess relative risk (ERR) and tests against the linear model. Results: The linear model shows significant, large, positive values of ERR for liver and urinary cancers at latencies from 37 - 43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence intervals for ERR are comparable using bootstrap and likelihood ratio test methods, and BCa 95% confidence intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0 - 20 mSv and 5 - 500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5 - 500 mSv range. Conclusion: Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0 - 20 mSv and 5 - 500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit.
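
    A rough sketch of the percentile-bootstrap step described above (not the authors' code: the toy ERR estimator, the stratified data layout, and the resample count are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(0)

        def err_estimate(deaths, person_years, dose):
            # Toy excess-relative-risk estimator: slope of a linear relative-risk
            # model through the origin, weighted by expected deaths.
            baseline = deaths.sum() / person_years.sum()
            expected = baseline * person_years
            excess = deaths / expected - 1.0
            return np.sum(expected * dose * excess) / np.sum(expected * dose ** 2)

        def bootstrap_ci(deaths, person_years, dose, n_boot=2000, alpha=0.05):
            # Percentile bootstrap: resample dose strata with replacement and take
            # the alpha/2 and 1 - alpha/2 quantiles of the re-estimated ERR values.
            n = len(deaths)
            stats = np.empty(n_boot)
            for b in range(n_boot):
                idx = rng.integers(0, n, n)
                stats[b] = err_estimate(deaths[idx], person_years[idx], dose[idx])
            return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

    The BCa variant adjusts the two quantile cut-points with bias and acceleration terms, typically estimated from jackknife replicates, which is what gives its intervals better coverage than the plain percentile method.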

    Entanglement-free Heisenberg-limited phase estimation

    Measurement underpins all quantitative science. A key example is the measurement of optical phase, used in length metrology and many other applications. Advances in precision measurement have consistently led to important scientific discoveries. At the fundamental level, measurement precision is limited by the number N of quantum resources (such as photons) that are used. Standard measurement schemes, using each resource independently, lead to a phase uncertainty that scales as 1/sqrt(N) - known as the standard quantum limit. However, it has long been conjectured that it should be possible to achieve a precision limited only by the Heisenberg uncertainty principle, dramatically improving the scaling to 1/N. It is commonly thought that achieving this improvement requires the use of exotic quantum entangled states, such as the NOON state. These states are extremely difficult to generate. Measurement schemes with counted photons or ions have been performed with N ≤ 6, but few have surpassed the standard quantum limit and none have shown Heisenberg-limited scaling. Here we demonstrate experimentally a Heisenberg-limited phase estimation procedure. We replace entangled input states with multiple applications of the phase shift on unentangled single-photon states. We generalize Kitaev's phase estimation algorithm using adaptive measurement theory to achieve a standard deviation scaling at the Heisenberg limit. For the largest number of resources used (N = 378), we estimate an unknown phase with a variance more than 10 dB below the standard quantum limit; achieving this variance would require more than 4,000 resources using standard interferometry. Our results represent a drastic reduction in the complexity of achieving quantum-enhanced measurement precision. Comment: Published in Nature; this is the final version.
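
    A minimal Monte Carlo sketch of why the standard scheme is stuck at the 1/sqrt(N) limit (purely illustrative: the single-photon fringe model, the true phase, and the trial counts are assumptions, and the adaptive Kitaev-style protocol of the paper is not implemented here):

        import numpy as np

        rng = np.random.default_rng(1)

        def sql_phase_estimate(phi_true, n_photons):
            # Standard scheme: N independent single-photon detections on one fringe.
            p_click = (1 + np.cos(phi_true)) / 2
            clicks = rng.binomial(n_photons, p_click)
            p_hat = np.clip(clicks / n_photons, 1e-9, 1 - 1e-9)
            return np.arccos(2 * p_hat - 1)

        phi = 0.7
        for n in (10, 100, 1000):
            estimates = [sql_phase_estimate(phi, n) for _ in range(2000)]
            print(f"N={n:5d}  std={np.std(estimates):.4f}  "
                  f"SQL~{1 / np.sqrt(n):.4f}  HL~{1 / n:.4f}")

    The printed standard deviations track the 1/sqrt(N) column; the Heisenberg-limit column, a factor of sqrt(N) smaller, is the scaling the adaptive multi-pass protocol approaches.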

    Increasing blood pressure variability predicts poor functional outcome following acute stroke

    Introduction: Increasing blood pressure variability has been reported following acute stroke, but there is uncertainty about how best to measure it and about its impact on prognosis following acute ischaemic stroke and transient ischaemic attack. Methods: Enhanced casual blood pressure and ambulatory blood pressure monitoring were completed at baseline (≤48 hours post symptom onset). Blood pressure variability was defined by the standard deviation and coefficient of variation of systolic, diastolic, mean arterial, and pulse pressures. A modified Rankin Scale score ≥3 defined poor functional outcome, assessed at 1 and 12 months post-stroke. Multivariable logistic regression models incorporating each blood pressure variability measure and other factors were fitted, and odds ratios with 95% confidence intervals are reported. Results: 232 patients were recruited; 45 were dependent at 1 month, and 37 at 12 months. Dependent patients were more likely to be older, with a higher burden of pre-morbid conditions and increased blood pressure variability. Enhanced casual standard deviations of diastolic blood pressure [1.19 (1.02-1.39)] and mean arterial pressure [1.20 (1.00-1.43)] predicted dependency at 1 month. Predictors of 12-month dependency included: enhanced casual standard deviation of mean arterial pressure [1.21 (1.0-1.46)]; 24-hour ambulatory standard deviations of diastolic blood pressure [2.30 (1.08-4.90)] and mean arterial pressure [1.72 (1.09-2.72)], and the coefficient of variation of mean arterial pressure [1.76 (1.05-2.94)]; day-time ambulatory coefficients of variation of systolic blood pressure [1.44 (1.02-2.03)] and mean arterial pressure [1.46 (1.02-2.08)]; and night-time ambulatory standard deviation of diastolic blood pressure [1.65 (1.03-2.63)] and coefficients of variation of mean arterial pressure [1.38 (1.00-1.90)] and pulse pressure [1.29 (1.00-1.65)]. Conclusion: Increasing blood pressure variability is independently and modestly associated with poor functional outcome at 1 and 12 months following acute stroke.
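
    For readers unfamiliar with the variability metrics, a small sketch of how the standard deviation and coefficient of variation would be computed from repeated readings (hypothetical numbers, not the study's analysis pipeline):

        import numpy as np

        def bp_variability(readings_mmHg):
            # Sample standard deviation and coefficient of variation (as a
            # percentage of the mean) for one patient's repeated readings.
            x = np.asarray(readings_mmHg, dtype=float)
            sd = x.std(ddof=1)
            cv = 100.0 * sd / x.mean()
            return sd, cv

        # Six made-up enhanced casual systolic readings from one patient
        sd, cv = bp_variability([152, 147, 161, 139, 158, 150])
        print(f"SD = {sd:.1f} mmHg, CV = {cv:.1f}%")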

    Assessing the contribution of the herpes simplex virus DNA polymerase to spontaneous mutations

    BACKGROUND: The thymidine kinase (tk) mutagenesis assay is often utilized to determine the frequency of herpes simplex virus (HSV) replication-mediated mutations. Using this assay, clinical and laboratory HSV-2 isolates were shown to have a 10- to 80-fold higher frequency of spontaneous mutations compared to HSV-1. METHODS: A panel of HSV-1 and HSV-2 isolates, along with polymerase-recombinant viruses expressing type 2 polymerase (Pol) within a type 1 genome, were evaluated using the tk and non-HSV DNA mutagenesis assays to measure HSV replication-dependent errors and determine whether the higher mutation frequency of HSV-2 is a distinct property of type 2 polymerases. RESULTS: Although HSV-2 isolates have mutation frequencies higher than HSV-1 in the tk assay, these errors are assay-specific. In fact, wild-type HSV-1 and the antimutator HSV-1 PAA(r)5 exhibited a 2- to 4-fold higher frequency than HSV-2 in the non-HSV DNA mutagenesis assay. Furthermore, regardless of assay, HSV-1 recombinants expressing HSV-2 Pol had error rates similar to HSV-1, whereas the high-mutator virus, HSV-2 6757, consistently showed significant errors. Additionally, plasmid DNA containing the HSV-2 tk gene, but not type 1 tk or LacZ DNA, was shown to form an anisomorphic DNA structure. CONCLUSIONS: This study suggests that the Pol is not solely responsible for the virus-type specific differences in mutation frequency. Accordingly, it is possible that (a) mutations may be modulated by other viral polypeptides cooperating with Pol, and (b) the localized secondary structure of the viral genome may partially account for the apparently enhanced error frequency of HSV-2.
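
    The fold differences quoted above are simple ratios of mutation frequencies; a toy calculation with made-up plaque counts (not data from the study) illustrates the arithmetic:

        def mutation_frequency(resistant_plaques, total_plaques):
            # Fraction of drug-resistant (e.g. tk-deficient) plaques among all plaques scored.
            return resistant_plaques / total_plaques

        # Hypothetical counts for a tk-style selection
        f_hsv1 = mutation_frequency(12, 1.2e6)   # 1.0e-5
        f_hsv2 = mutation_frequency(480, 1.0e6)  # 4.8e-4
        print(f"HSV-1: {f_hsv1:.1e}  HSV-2: {f_hsv2:.1e}  fold difference: {f_hsv2 / f_hsv1:.0f}x")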

    Democratic population decisions result in robust policy-gradient learning: A parametric study with GPU simulations

    High performance computing on the Graphics Processing Unit (GPU) is an emerging field driven by the promise of high computational power at a low cost. However, GPU programming is a non-trivial task, and architectural limitations raise the question of whether investing effort in this direction is worthwhile. In this work, we use GPU programming to simulate a two-layer network of Integrate-and-Fire neurons with varying degrees of recurrent connectivity and investigate its ability to learn a simplified navigation task using a policy-gradient learning rule stemming from Reinforcement Learning. The purpose of this paper is twofold. First, we want to support the use of GPUs in the field of Computational Neuroscience. Second, using GPU computing power, we investigate the conditions under which this architecture and learning rule perform best. Our work indicates that networks featuring strong Mexican-Hat-shaped recurrent connections in the top layer, where decision making is governed by the formation of a stable activity bump in the neural population (a "non-democratic" mechanism), achieve mediocre learning results at best. In the absence of recurrent connections, where all neurons "vote" independently ("democratic") for a decision via population vector readout, the task is generally learned better and more robustly. Our study would have been extremely difficult to carry out on a desktop computer without GPU programming. We present the routines developed for this purpose and show that they provide a speed improvement of 5x to 42x over optimised Python code. The higher speeds are achieved when we exploit the parallelism of the GPU in the search for learning parameters. This suggests that efficient GPU programming can significantly reduce the time needed for simulating networks of spiking neurons, particularly when multiple parameter configurations are investigated.
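
    The "democratic" readout mentioned above is a standard population-vector decode; a minimal NumPy sketch (illustrative only: the neuron count, tuning profile and firing rates are assumptions, and the paper's GPU implementation is not reproduced here):

        import numpy as np

        def population_vector_readout(rates_hz, preferred_angles_rad):
            # Each neuron "votes" with its firing rate along its preferred
            # direction; the decision is the angle of the resultant vector.
            x = np.sum(rates_hz * np.cos(preferred_angles_rad))
            y = np.sum(rates_hz * np.sin(preferred_angles_rad))
            return np.arctan2(y, x)

        # 64 output neurons with evenly spaced preferred directions
        angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
        rates = np.exp(np.cos(angles - 1.0))  # broad activity profile peaking near 1 rad
        print(f"decoded direction ~ {population_vector_readout(rates, angles):.2f} rad")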

    Ultraviolet radiation shapes seaweed communities
