
    OT 060420: A Seemingly Optical Transient Recorded by All-Sky Cameras

    We report on a ~5th magnitude flash detected for approximately 10 minutes by two CONCAM all-sky cameras located at Cerro Pachón, Chile, and La Palma, Spain. A third all-sky camera, located at Cerro Paranal, Chile, did not detect the flash; we therefore suggest that the flash was a series of cosmic-ray hits, meteors, or satellite glints. An alternative hypothesis is that the flash was an astronomical transient with variable luminosity. In this paper we discuss bright optical transient detection using fish-eye all-sky monitors, analyze the apparent false-positive optical transient, and propose possible causes of false optical transient detections in all-sky cameras. Comment: 7 figures, 3 tables, accepted PAS

    Treating and Preventing Influenza in Aged Care Facilities: A Cluster Randomised Controlled Trial

    PMCID: PMC3474842. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

    A probabilistic approach for acoustic emission based monitoring techniques: with application to structural health monitoring

    It has been demonstrated that acoustic-emission (AE) inspection of structures can offer advantages over other types of monitoring techniques in the detection of damage, namely an increased sensitivity to damage and an ability to localise its source. There are, however, numerous challenges associated with the analysis of AE data. One issue is the high sampling frequency required to capture AE activity: in just a few seconds, a recording can generate very high volumes of data, of which a significant portion may be of little interest for analysis. Identifying the individual AE events in a recorded time series is therefore a necessary step to reduce the size of the dataset. Another challenge generally encountered in practice is determining the sources of AE, an important exercise if one wishes to enhance the quality of the diagnostic scheme. In this paper, a state-of-the-art technique is presented that can automatically identify AE events and simultaneously help characterise them from a probabilistic perspective. A nonparametric Bayesian approach, based on the Dirichlet process (DP), is employed to overcome some of the challenges associated with these tasks. Two main sets of AE data are considered in this work: (1) from a journal bearing in operation, and (2) from an Airbus A320 main landing gear subjected to fatigue testing.
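
    The event-identification step described in the abstract can be sketched with a simple amplitude-threshold picker, a common precursor to probabilistic characterisation; the threshold, merge gap, and synthetic signal below are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

def pick_ae_events(signal, fs, threshold, min_gap=0.001):
    """Extract bursts whose absolute amplitude exceeds `threshold`.

    Hits closer together than `min_gap` seconds are merged into one
    event; returns a list of (start, end) sample indices.
    """
    hits = np.flatnonzero(np.abs(signal) > threshold)
    if hits.size == 0:
        return []
    gap = int(min_gap * fs)
    # split the hit indices wherever consecutive hits are > gap apart
    breaks = np.flatnonzero(np.diff(hits) > gap)
    starts = np.r_[hits[0], hits[breaks + 1]]
    ends = np.r_[hits[breaks], hits[-1]]
    return list(zip(starts, ends))

# synthetic recording: low-level noise plus two short bursts
fs = 1_000_000                       # 1 MHz sampling, plausible for AE
rng = np.random.default_rng(0)
x = 0.01 * rng.standard_normal(fs // 10)
for t0 in (20_000, 70_000):
    x[t0:t0 + 200] += np.sin(np.linspace(0, 40 * np.pi, 200))

events = pick_ae_events(x, fs, threshold=0.5)
print(len(events))                   # two bursts recovered
```

    Only the retained windows would then be passed to the DP-based clustering, which is where the event characterisation in the paper happens.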

    Plausibility functions and exact frequentist inference

    In the frequentist program, inferential methods with exact control on error rates are a primary focus. The standard approach, however, is to rely on asymptotic approximations, which may not be suitable. This paper presents a general framework for the construction of exact frequentist procedures based on plausibility functions. It is shown that the plausibility function-based tests and confidence regions have the desired frequentist properties in finite samples, with no large-sample justification needed. An extension of the proposed method is also given for problems involving nuisance parameters. Examples demonstrate that the plausibility function-based method is both exact and efficient in a wide variety of problems. Comment: 21 pages, 5 figures, 3 tables
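
    A minimal sketch of the idea, for a binomial model: take the relative likelihood as the base statistic and calibrate it by Monte Carlo at each candidate parameter value, so that the region {θ : pl(θ) > α} has exact coverage. The data (7 successes in 20 trials), grid, and Monte Carlo size are illustrative assumptions, not the paper's examples:

```python
import numpy as np

def binom_loglik(theta, x, n):
    # log-likelihood up to the constant binomial coefficient
    eps = 1e-12
    return x * np.log(theta + eps) + (n - x) * np.log(1 - theta + eps)

def plausibility(theta, x, n, n_mc=20_000, seed=0):
    """Monte Carlo plausibility of theta for Binomial(n, theta) data:
    pl(theta) = P_theta{ T(X, theta) <= T(x, theta) }, where T is the
    relative likelihood L(theta) / L(mle)."""
    rng = np.random.default_rng(seed)

    def rel_loglik(data):
        mle = np.clip(data / n, 1e-12, 1 - 1e-12)
        return binom_loglik(theta, data, n) - binom_loglik(mle, data, n)

    sims = rng.binomial(n, theta, size=n_mc)
    return np.mean(rel_loglik(sims) <= rel_loglik(np.asarray(x, float)))

# observed 7 successes in 20 trials; a 90% plausibility region is
# {theta : pl(theta) > 0.10}
x, n = 7, 20
grid = np.linspace(0.01, 0.99, 99)
pl = np.array([plausibility(t, x, n) for t in grid])
region = grid[pl > 0.10]
print(region.min().round(2), region.max().round(2))
```

    Because the cutoff is computed under each θ itself rather than from an asymptotic χ² approximation, the coverage guarantee holds at the given sample size.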

    Minimum Decision Cost for Quantum Ensembles

    For a given ensemble of N independent and identically prepared particles, we calculate the binary decision costs of different strategies for the measurement of polarised spin-1/2 particles. The result proves that, for any given values of the prior probabilities and any number of constituent particles, the cost for a combined measurement is always less than or equal to that for any combination of separate measurements upon sub-ensembles. The Bayes cost, which is that associated with the optimal strategy (i.e., a combined measurement), is obtained in a simple closed form. Comment: 11 pages, uses RevTeX
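
    The central claim can be illustrated numerically with the standard Helstrom bound for discriminating two pure qubit preparations (equal priors and a majority-vote separate strategy are assumptions of this sketch, not the paper's general closed form):

```python
import numpy as np
from math import comb, sqrt

def combined_cost(N, c, p0=0.5):
    """Helstrom minimum error for a joint measurement on N copies of
    two pure states whose single-copy overlap is |<a|b>| = c."""
    p1 = 1 - p0
    return 0.5 * (1 - sqrt(1 - 4 * p0 * p1 * c ** (2 * N)))

def separate_cost(N, c):
    """Error of the optimal single-copy measurement repeated on each
    particle, followed by a majority vote (N odd, equal priors)."""
    q = 0.5 * (1 - sqrt(1 - c ** 2))   # single-copy Helstrom error
    return sum(comb(N, k) * q ** k * (1 - q) ** (N - k)
               for k in range(N // 2 + 1, N + 1))

c = np.cos(np.pi / 8)                  # overlap of the two preparations
for N in (1, 3, 5, 7):
    # the combined (joint) measurement never does worse
    assert combined_cost(N, c) <= separate_cost(N, c) + 1e-12
    print(N, round(combined_cost(N, c), 4), round(separate_cost(N, c), 4))
```

    For N = 1 the two strategies coincide; for N > 1 the joint measurement is strictly cheaper, in line with the abstract's inequality.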

    Effects of alteplase for acute stroke on the distribution of functional outcomes: a pooled analysis of 9 trials

    Background—Thrombolytic therapy with intravenous alteplase within 4.5 hours of ischemic stroke onset increases the overall likelihood of an excellent outcome (no, or nondisabling, symptoms). Any improvement in the distribution of functional outcomes has value, and here we assess the effect of alteplase on the distribution of functional level by treatment delay, age, and stroke severity. Methods—Prespecified pooled analysis of 6756 patients from 9 randomized trials comparing alteplase versus placebo/open control. Ordinal logistic regression models assessed treatment differences after adjustment for treatment delay, age, stroke severity, and relevant interaction terms. Results—Treatment with alteplase was beneficial for treatment delays extending to 4.5 hours after stroke onset, with greater benefit from earlier treatment. Neither age nor stroke severity significantly influenced the slope of the relationship between benefit and time to treatment initiation. For the observed case mix of patients treated within 4.5 hours of stroke onset (mean 3 hours and 20 minutes), the net absolute benefit from alteplase (i.e., the difference between those who would do better if given alteplase and those who would do worse) was 55 patients per 1000 treated (95% confidence interval, 13–91; P=0.004). Conclusions—Treatment with intravenous alteplase initiated within 4.5 hours of stroke onset increases the chance of achieving an improved level of function for all patients across the age spectrum, including those over 80, and across all severities of stroke studied (top versus bottom fifth means: 22 versus 4); the earlier treatment is initiated, the greater the benefit.
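
    The ordinal (proportional-odds) model used in the pooled analysis shifts the whole outcome distribution by a common odds ratio on the cumulative probabilities. A minimal sketch of that shift, using a made-up control-arm distribution and odds ratio rather than the trial data:

```python
import numpy as np

# hypothetical control-arm distribution over mRS categories 0..6
# (illustrative numbers only, not the pooled-trial data)
p_control = np.array([0.10, 0.12, 0.12, 0.16, 0.20, 0.12, 0.18])
OR = 1.3    # assumed common odds ratio favouring better outcomes

cum = np.cumsum(p_control)[:-1]            # P(mRS <= j), j = 0..5
odds = cum / (1 - cum)
cum_treat = (OR * odds) / (1 + OR * odds)  # proportional-odds shift
p_treat = np.diff(np.r_[0.0, cum_treat, 1.0])

# the shift moves probability mass toward lower (better) categories
print(p_treat.round(3))
```

    This is why "any improvement in the distribution" counts as benefit under the model: every cumulative cut-point moves in the favourable direction at once.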

    Electromagnetic Cascades and Cascade Nucleosynthesis in the Early Universe

    We describe a calculation of electromagnetic cascading in radiation and matter in the early universe initiated by the decay of massive particles or by some other process. We have used a combination of Monte Carlo and numerical techniques which enables us to use exact cross sections, where known, for all the relevant processes. In cascades initiated after the epoch of big bang nucleosynthesis, γ-rays in the cascades will photodisintegrate 4He, producing 3He and deuterium. Using the observed 3He and deuterium abundances we are able to place constraints on the cascade energy deposition as a function of cosmic time. In the case of the decay of massive primordial particles, we place limits on the density of massive primordial particles as a function of their mean decay time, and on the expected intensity of decay neutrinos. Comment: compressed and uuencoded postscript. We now include a comparison with previous work of the photon spectrum in the cascade and the limits we calculate for the density of massive particles. The method of calculation of photon spectra at low energies has been improved. Most figures are revised. Our conclusions are substantially unchanged.

    Constraining Antimatter Domains in the Early Universe with Big Bang Nucleosynthesis

    We consider the effect of a small-scale matter-antimatter domain structure on big bang nucleosynthesis and place upper limits on the amount of antimatter in the early universe. For small domains, which annihilate before nucleosynthesis, this limit comes from underproduction of He-4. For larger domains, the limit comes from He-3 overproduction. Most of the He-3 produced by antiproton-helium annihilation is itself annihilated; the main source of He-3 is photodisintegration of He-4 by the electromagnetic cascades initiated by the annihilation. Comment: 4 pages, 2 figures, revtex (slightly shortened)

    Dark matter and non-Newtonian gravity from General Relativity coupled to a fluid of strings

    An exact solution of Einstein's field equations for a point mass surrounded by a static, spherically symmetric fluid of strings is presented. The solution is singular at the origin. Near the string cloud limit there is a 1/r correction to Newton's force law. It is noted that at large distances and small accelerations, this law coincides with the phenomenological force law invented by Milgrom in order to explain the flat rotation curves of galaxies without introducing dark matter. When interpreted in the context of a cosmological model with a string fluid, the new solution naturally explains why the critical acceleration of Milgrom is of the same order of magnitude as the Hubble parameter. Comment: 12 pages, REVTeX, no figures
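
    Why a 1/r correction to the force law flattens rotation curves can be seen in two lines: for circular orbits v²/r = a, so an acceleration a = GM/r² + k/r gives v² = GM/r + k, which tends to the constant k at large radius. A toy numerical check (GM and k are arbitrary illustrative units, not values from the paper's solution):

```python
import numpy as np

GM = 1.0        # Newtonian term G*M, arbitrary units
k = 0.05        # strength of the assumed 1/r correction

r = np.linspace(1, 200, 400)
a = GM / r**2 + k / r          # modified acceleration law
v = np.sqrt(a * r)             # circular speed from v^2 / r = a

v_newton = np.sqrt(GM / r)     # pure Newtonian, falls as r**-0.5

# at large radius the modified curve flattens near sqrt(k),
# while the Newtonian curve keeps declining
print(round(v[-1], 3), round(np.sqrt(k), 3), round(v_newton[-1], 3))
```

    The same algebra shows why the crossover happens at accelerations of order k: below that scale the 1/r term dominates, which is the regime Milgrom's phenomenology addresses.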

    Lithium-6: A Probe of the Early Universe

    I consider the synthesis of 6Li due to the decay of relic particles, such as gravitinos or moduli, after the epoch of Big Bang Nucleosynthesis. The synthesized 6Li/H ratio may be compared to 6Li/H in metal-poor stars, which, in the absence of stellar depletion of 6Li, yields significantly stronger constraints on relic particle densities than the usual consideration of overproduction of 3He. Production of 6Li during such an era of non-thermal nucleosynthesis may also be regarded as a possible explanation for the relatively high 6Li/H ratios observed in metal-poor halo stars. Comment: final version, Physical Review Letters, additional figure giving limits on relic decaying particles