
    Treatment of the background error in the statistical analysis of Poisson processes

    The formalism that allows one to take into account the error sigma_b on the expected mean background b in the statistical analysis of a Poisson process with the frequentist method is presented. It is shown that the error sigma_b cannot be neglected if it is not much smaller than sqrt(b). The resulting confidence belt is larger than the one for sigma_b = 0, leading to larger confidence intervals for the mean mu of signal events. Comment: 15 pages including 2 figures, RevTeX. Final version published in Phys. Rev. D 59 (1999) 11300
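
    To make the effect concrete, here is a minimal numerical sketch, not the paper's exact Neyman construction: it compares a classical 90% CL Poisson upper limit on the signal mean mu when the expected background b is taken as exact versus smeared by a truncated Gaussian of width sigma_b. The grid integration and bisection are illustrative assumptions.

```python
# Minimal sketch (an assumption-laden illustration, not the paper's exact
# construction): classical 90% CL Poisson upper limits on the signal mean mu,
# with the expected background b either exact or smeared by a Gaussian sigma_b.

import math

def poisson_cdf(n_obs, lam):
    """P(N <= n_obs) for a Poisson mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n_obs + 1))

def smeared_cdf(n_obs, mu, b, sigma_b, steps=400):
    """P(N <= n_obs | mu) after smearing b with a Gaussian of width sigma_b (b' >= 0)."""
    if sigma_b == 0.0:
        return poisson_cdf(n_obs, mu + b)
    total, norm = 0.0, 0.0
    for i in range(steps):
        bp = b - 4.0 * sigma_b + (i + 0.5) * (8.0 * sigma_b / steps)
        if bp < 0.0:                       # the background cannot be negative
            continue
        w = math.exp(-0.5 * ((bp - b) / sigma_b) ** 2)
        total += w * poisson_cdf(n_obs, mu + bp)
        norm += w
    return total / norm

def upper_limit(n_obs, b, sigma_b, cl=0.90):
    """Smallest mu with P(N <= n_obs | mu) <= 1 - cl, found by bisection."""
    lo, hi = 0.0, 50.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if smeared_cdf(n_obs, mid, b, sigma_b) > 1.0 - cl:
            lo = mid
        else:
            hi = mid
    return hi

if __name__ == "__main__":
    b = 4.0                                # sqrt(b) = 2
    for sigma_b in (0.0, 1.0, 2.0):        # sigma_b = 2 is clearly "not small"
        print(f"sigma_b = {sigma_b}: 90% CL upper limit on mu = {upper_limit(3, b, sigma_b):.2f}")
```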

    Comment on "Including Systematic Uncertainties in Confidence Interval Construction for Poisson Statistics"

    The incorporation of systematic uncertainties into confidence interval calculations has been addressed recently in a paper by Conrad et al. (Physical Review D 67 (2003) 012002). In their work, systematic uncertainties in detector efficiencies and background flux predictions were incorporated following the hybrid frequentist-Bayesian prescription of Cousins and Highland, but using the likelihood ratio ordering of Feldman and Cousins in order to produce "unified" confidence intervals. In general, the resulting intervals behaved as one would intuitively expect, i.e. they increased with increasing uncertainties. However, it was noted that for numbers of observed events less than or of the order of the expected background, the intervals could sometimes behave in a completely counter-intuitive fashion: they were seen to initially decrease in the face of increasing uncertainties, but only in the case of increasing signal efficiency uncertainty. In this comment, we show that the problematic behaviour is due to integration over the signal efficiency uncertainty while maximising the best-fit alternative hypothesis likelihood. If the alternative hypothesis likelihood is instead determined by unconditionally maximising with respect to both the unknown signal and the signal efficiency uncertainty, the limits display the correct intuitive behaviour. Comment: Submitted to Physical Review
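
    The distinction between the two treatments of the alternative-hypothesis likelihood can be sketched numerically. The toy below is not the authors' code; the observed count, background, Gaussian efficiency constraint and coarse grid searches are assumptions. It contrasts a denominator built by integrating over the efficiency and maximising only over the signal with one built by unconditionally maximising over both the signal and the efficiency.

```python
# Toy illustration (not the authors' code) of the two choices of the
# alternative-hypothesis ("best fit") likelihood discussed above.

import math

def poisson(n, lam):
    return math.exp(-lam) * lam**n / math.factorial(n)

def gauss(x, mean, sigma):
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def integrated_like(n, s, b, sigma_eps, steps=200):
    """Likelihood of n at signal s after integrating over the efficiency eps."""
    de = 8.0 * sigma_eps / steps
    total = 0.0
    for i in range(steps):
        eps = 1.0 - 4.0 * sigma_eps + (i + 0.5) * de
        if eps <= 0.0:
            continue
        total += poisson(n, eps * s + b) * gauss(eps, 1.0, sigma_eps) * de
    return total

def denom_integrated(n, b, sigma_eps):
    """Maximise the eps-integrated likelihood over s >= 0 (the problematic choice)."""
    return max(integrated_like(n, 0.1 * k, b, sigma_eps) for k in range(400))

def denom_unconditional(n, b, sigma_eps):
    """Maximise the un-integrated likelihood over both s >= 0 and eps > 0."""
    return max(poisson(n, 0.02 * j * 0.1 * k + b) * gauss(0.02 * j, 1.0, sigma_eps)
               for k in range(400) for j in range(1, 100))

if __name__ == "__main__":
    n_obs, b = 2, 3.0                      # fewer observed events than expected background
    for sigma_eps in (0.1, 0.3):
        print(f"sigma_eps = {sigma_eps}: integrated denominator = "
              f"{denom_integrated(n_obs, b, sigma_eps):.4f}, "
              f"unconditional denominator = {denom_unconditional(n_obs, b, sigma_eps):.4f}")
```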

    Access Anytime Anyplace: An Empirical Investigation of Patterns of Technology Use in Nomadic Computing Environments

    With the increasing pervasiveness of mobile technologies such as cellular phones, personal digital assistants and handheld computers, mobile technologies promise the next major technological and cultural shift. Like the Internet, it is predicted that the greatest impact will not come from hardware devices or software programs, but from emerging social practices which were not possible before. To capitalize on the benefits of mobile technologies, organizations have begun to implement nomadic computing environments. Nomadic computing environments make available the systems support needed to provide computing and communication capabilities and services to the mobile work force as they move from place to place in a manner that is transparent, integrated, convenient and adaptive. Already, anecdotes suggest that within organizations these environments are having social implications, with both intended and unintended consequences. The problems of nomadic computing users have widely been described in terms of the challenges presented by the interplay of time, space and context, yet no theory has been developed that analyzes this interplay in a single effort. A temporal human agency perspective proposes that stakeholders’ actions are influenced by their ability to recall the past, respond to the present and imagine the future. By extending the temporal human agency perspective through the recognition of the combined influence of space and context on human action, I investigated how the individual practices of eleven nomadic computing users changed after implementation. Under the umbrella of the interpretive paradigm, and using a cross-case methodology, this research develops a theoretical account of how several stakeholders engaged with different nomadic computing environments and explores the context of their effectiveness. Applying a literal and theoretical replication strategy to multiple longitudinal and retrospective cases, six months were spent in the field interviewing and observing participants. Data analysis included three types of coding: descriptive, interpretive and pattern coding. The findings reveal that patterns of technology use in nomadic computing environments are influenced by stakeholders’ temporal orientations: their ability to remember the past, imagine the future and respond to the present. As stakeholders all have different temporal orientations and experiences, they exhibit different practices even when engaging initially with the same organizational and technical environments. Opposing forces emerge as users attempt to be effective by resolving the benefits and disadvantages of the environment as they undergo different temporal, contextual and spatial experiences. Insights about the ability to predict future use suggest that, because social processes are difficult to envisage in advance, they inhibit the predictability of which technologies users will adopt. The framework presented highlights the need to focus on understanding the diversity in nomadic computing use practices by examining how they are influenced by individual circumstances as well as shared meanings across individuals.

    Unbiased cut selection for optimal upper limits in neutrino detectors: the model rejection potential technique

    We present a method for optimising experimental cuts in order to place the strongest constraints (upper limits) on theoretical signal models. The method relies only on signal and background expectations derived from Monte Carlo simulations, so no bias is introduced by looking at actual data, for instance by setting a limit based on the expected signal above the "last remaining data event." After discussing the concept of the "average upper limit," based on the expectation from an ensemble of repeated experiments with no true signal, we show how the best model rejection potential is achieved by optimising the cuts to minimise the ratio of this "average upper limit" to the expected signal from the model. As an example, we use this technique to determine the limit sensitivity of kilometre-scale neutrino detectors to extra-terrestrial neutrino fluxes from a variety of models, e.g. active galaxies and gamma-ray bursts. We suggest that these model-rejection-potential-optimised limits be used as a standard method of comparing the sensitivity of proposed neutrino detectors. Comment: 18 pages, 7 figures, submitted to Astroparticle Physics
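
    A hedged sketch of the optimisation loop follows. The toy cut dependence of the signal and background expectations and the simple one-sided Poisson upper limit (a stand-in for the Feldman-Cousins limit used in the paper) are assumptions made purely for illustration.

```python
# Sketch of the model rejection potential optimisation: minimise the ratio of
# the average upper limit (over background-only experiments) to the expected
# model signal, as a function of the cut.  All inputs here are toy assumptions.

import math

def poisson_pmf(n, lam):
    return math.exp(-lam) * lam**n / math.factorial(n)

def upper_limit(n_obs, b, cl=0.90):
    """Classical one-sided upper limit on the signal mean, found by bisection."""
    lo, hi = 0.0, 60.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if sum(poisson_pmf(k, mid + b) for k in range(n_obs + 1)) > 1.0 - cl:
            lo = mid
        else:
            hi = mid
    return hi

def average_upper_limit(b):
    """Average limit over an ensemble of background-only experiments."""
    n_max = int(b + 5.0 * math.sqrt(b) + 10.0)
    return sum(poisson_pmf(n, b) * upper_limit(n, b) for n in range(n_max))

def model_rejection_factor(cut, b0=20.0, s0=10.0):
    """Toy cuts: the background falls faster than the signal as the cut tightens."""
    b = b0 * math.exp(-2.0 * cut)          # expected background after the cut
    ns = s0 * math.exp(-0.5 * cut)         # expected model signal after the cut
    return average_upper_limit(b) / ns

if __name__ == "__main__":
    mrf, best_cut = min((model_rejection_factor(0.2 * k), 0.2 * k) for k in range(20))
    print(f"best cut = {best_cut:.1f}, model rejection factor = {mrf:.3f}")
```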

    Relations of Mercury to other Metals

    n/a

    Setting UBVRI Photometric Zero-Points Using Sloan Digital Sky Survey ugriz Magnitudes

    We discuss the use of Sloan Digital Sky Survey (SDSS) ugriz point-spread function (PSF) photometry for setting the zero points of UBVRI CCD images. From a comparison with the Landolt (1992) standards and our own photometry, we find that there is a fairly abrupt change in the B, V, R, and I zero points around g, r, i ~ 14.5, and in the U zero point at u ~ 16. These changes correspond to where there is significant interpolation due to saturation in the SDSS PSF fluxes. There also seems to be another, much smaller systematic effect for stars with g, r > 19.5; the latter effect is consistent with a small Malmquist bias. Because of the difficulties with the PSF fluxes of brighter stars, we recommend that comparisons of ugriz and UBVRI photometry only be made for unsaturated stars with g, r and i in the range 14.5 - 19.5, and u in the range 16 - 19.5. We give a prescription for setting the UBVRI zero points for CCD images, and general equations for transforming from ugriz to UBVRI. Comment: 13 pages, 6 figures. Accepted for publication in the Astronomical Journal
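
    A possible shape of such a prescription is sketched below. The selection cuts follow the recommended magnitude ranges quoted above, but the transformation coefficients are placeholders rather than the values derived in the paper, and the field star list is invented for the example.

```python
# Hedged sketch (a sketch under assumptions, not the paper's prescription):
# keep SDSS stars in the recommended unsaturated ranges, predict a Johnson V
# magnitude with a linear colour transformation, and take the median offset
# from the instrumental magnitudes as the zero point.  The coefficients a0 and
# a1 are placeholders, NOT the values derived in the paper.

def predict_v(g, r, a0=0.0, a1=-0.59):
    """Toy linear transformation V = g + a1*(g - r) + a0 (placeholder coefficients)."""
    return g + a1 * (g - r) + a0

def usable(g, r, i, u=None):
    """Recommended ranges: 14.5 < g, r, i < 19.5 and, if U is needed, 16 < u < 19.5."""
    ok = all(14.5 < m < 19.5 for m in (g, r, i))
    if u is not None:
        ok = ok and 16.0 < u < 19.5
    return ok

def v_zero_point(stars):
    """stars: list of dicts with SDSS PSF mags 'g', 'r', 'i' and an instrumental 'v_inst'."""
    offsets = sorted(predict_v(s["g"], s["r"]) - s["v_inst"]
                     for s in stars if usable(s["g"], s["r"], s["i"]))
    n = len(offsets)
    return 0.5 * (offsets[(n - 1) // 2] + offsets[n // 2])   # median offset = zero point

if __name__ == "__main__":
    toy_field = [
        {"g": 15.2, "r": 14.8, "i": 14.7, "v_inst": -5.83},
        {"g": 17.0, "r": 16.4, "i": 16.2, "v_inst": -4.30},
        {"g": 13.9, "r": 13.5, "i": 13.4, "v_inst": -7.10},   # rejected: too bright (saturated PSF flux)
    ]
    print(f"V-band zero point = {v_zero_point(toy_field):.2f}")
```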

    Including Systematic Uncertainties in Confidence Interval Construction for Poisson Statistics

    One way to incorporate systematic uncertainties into the calculation of confidence intervals is by integrating over probability density functions parametrizing the uncertainties. In this note we present a development of this method which takes into account uncertainties in the prediction of background processes and uncertainties in the signal and background detection efficiencies, and which allows for a correlation between the signal and background detection efficiencies. We implement this method with the Feldman & Cousins unified approach, with and without conditioning. We present studies of coverage for the Feldman & Cousins and Neyman ordering schemes. In particular, we present two different types of coverage test for the case where systematic uncertainties are included. To illustrate the method, we show the relative effect of including systematic uncertainties in the case of a dark matter search as performed by modern neutrino telescopes. Comment: 23 pages, 10 figures, replaced to match published version
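
    The central ingredient, the Poisson probability smeared over the PDFs parametrizing the uncertainties, can be sketched as follows. The Gaussian nuisance PDFs, the correlation coefficient rho and the grid integration are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: Poisson probability with correlated signal and background
# efficiencies integrated out over Gaussian PDFs (illustrative assumptions).

import math

def poisson(n, lam):
    return math.exp(-lam) * lam**n / math.factorial(n)

def gauss2d(x, y, sx, sy, rho):
    """Correlated 2D Gaussian, centred on (1, 1), for the two efficiencies."""
    dx, dy = (x - 1.0) / sx, (y - 1.0) / sy
    z = (dx * dx - 2.0 * rho * dx * dy + dy * dy) / (1.0 - rho * rho)
    return math.exp(-0.5 * z) / (2.0 * math.pi * sx * sy * math.sqrt(1.0 - rho * rho))

def smeared_pmf(n, s, b, sig_s=0.2, sig_b=0.2, rho=0.5, steps=60):
    """P(n | s) with the signal and background efficiencies integrated out."""
    total, norm = 0.0, 0.0
    for i in range(steps):
        eps_s = 1.0 - 3.0 * sig_s + (i + 0.5) * (6.0 * sig_s / steps)
        if eps_s <= 0.0:
            continue
        for j in range(steps):
            eps_b = 1.0 - 3.0 * sig_b + (j + 0.5) * (6.0 * sig_b / steps)
            if eps_b <= 0.0:
                continue
            w = gauss2d(eps_s, eps_b, sig_s, sig_b, rho)
            total += w * poisson(n, eps_s * s + eps_b * b)
            norm += w
    return total / norm

if __name__ == "__main__":
    # Compare the smeared probability with the no-systematics Poisson probability.
    print(f"smeared: {smeared_pmf(3, 2.0, 3.0):.4f}   exact nuisances: {poisson(3, 5.0):.4f}")
```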

    A Possible Massive Asteroid Belt Around zeta Lep

    We have used the Keck I telescope to image at 11.7 microns and 17.9 microns the dust emission around zeta Lep, a main-sequence A-type star 21.5 pc from the Sun with an infrared excess. The excess is at most marginally resolved at 17.9 microns. The dust distance from the star is probably less than or equal to 6 AU, although some dust may extend to 9 AU. The mass of observed dust is ~10^22 g. Since the lifetime of the dust particles is about 10,000 years because of the Poynting-Robertson effect, we robustly estimate that at least 4 x 10^26 g must reside in parent bodies, which may be asteroids, if the system is in a steady state and has an age of ~300 Myr. This mass is approximately 200 times that contained within the main asteroid belt in our solar system. Comment: 12 pages, 3 figures, ApJL in press
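
    The steady-state argument can be checked with round numbers taken from the abstract; they reproduce the order of magnitude of the quoted 4 x 10^26 g. The main-belt mass used for scale below is an approximate, commonly quoted value, not a figure from the paper.

```python
# Back-of-the-envelope check of the steady-state parent-body mass quoted above,
# using round numbers from the abstract.

dust_mass_g = 1e22          # observed dust mass (~10^22 g)
pr_lifetime_yr = 1e4        # Poynting-Robertson lifetime of the grains (~10,000 yr)
system_age_yr = 3e8         # assumed system age (~300 Myr)

# In steady state, the dust destroyed over the system's lifetime must be
# resupplied, so the parent bodies must hold at least dust_mass * (age / lifetime).
parent_body_mass_g = dust_mass_g * (system_age_yr / pr_lifetime_yr)

main_belt_mass_g = 2e24     # approximate mass of the solar system's main asteroid belt
print(f"parent-body mass >= {parent_body_mass_g:.1e} g "
      f"(~{parent_body_mass_g / main_belt_mass_g:.0f} x the main asteroid belt)")
```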