
    Statistical methods applied to composition studies of ultrahigh energy cosmic rays

    The mass composition of high-energy cosmic rays above 10^{17} eV is a crucial issue for resolving open questions in astrophysics such as the acceleration and propagation mechanisms. Unfortunately, the standard procedures to identify the primary particle of a cosmic-ray shower have low efficiency, mainly due to large fluctuations and limited experimental observables. We present a statistical method for composition studies based on several measurable features of the longitudinal development of the CR shower, such as N_{max}, X_{max}, asymmetry, skewness and kurtosis. Principal component analysis (PCA) was used to evaluate the relevance of each parameter in the representation of the overall shower features, and a linear discriminant analysis (LDA) was used to combine the different parameters to maximize the discrimination between different particle showers. The new parameter obtained from LDA separates primary gammas, protons and iron nuclei better than procedures based on X_{max} alone. The method proposed here was successfully tested in the energy range from 10^{17} to 10^{20} eV, even when limitations on the shower track length were included in order to simulate the field of view of fluorescence telescopes.
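    As a rough illustration of the kind of pipeline described in this abstract, the sketch below runs scikit-learn's PCA and LDA on a set of shower observables (N_{max}, X_{max}, asymmetry, skewness, kurtosis). The feature values and class spreads are synthetic placeholders, not the authors' simulation data, and the numerical choices are assumptions for illustration only.

        # Hedged sketch: PCA to rank feature relevance, then LDA to build a single
        # composition-sensitive parameter. Feature values below are synthetic
        # placeholders standing in for simulated gamma/proton/iron showers.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        n = 500
        # columns: N_max, X_max, asymmetry, skewness, kurtosis (arbitrary toy scales)
        def toy_showers(center, spread):
            return rng.normal(center, spread, size=(n, 5))

        gamma  = toy_showers([1.0, 850.0, 0.10, 0.3, 3.0], [0.1, 60.0, 0.05, 0.1, 0.5])
        proton = toy_showers([1.0, 780.0, 0.15, 0.4, 3.2], [0.1, 55.0, 0.05, 0.1, 0.5])
        iron   = toy_showers([1.0, 700.0, 0.20, 0.5, 3.4], [0.1, 25.0, 0.05, 0.1, 0.5])

        X = np.vstack([gamma, proton, iron])
        y = np.repeat([0, 1, 2], n)          # 0 = gamma, 1 = proton, 2 = iron

        # PCA: how much shower-to-shower variance each component carries
        pca = PCA().fit(X)
        print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

        # LDA: linear combination of the observables maximizing class separation
        lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
        z = lda.transform(X)                 # new composition-sensitive parameter(s)
        print("mean LDA score per primary:", [z[y == k, 0].mean().round(2) for k in range(3)])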

    Application and evaluation of sediment fingerprinting techniques in the Manawatu River catchment, New Zealand : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Geography at Massey University, Palmerston North, New Zealand

    Suspended sediment is an important component of the fluvial environment, contributing not only to the physical form, but also the chemical and ecological character of river channels and adjacent floodplains. Fluvial sediment flux reflects erosion of the contributing catchment, which when enhanced can lead to a reduction in agricultural productivity, effect morphological changes in the riparian environment and alter aquatic ecosystems by elevating turbidity levels and degrading water quality. It is therefore important to identify catchment-scale erosion processes and understand rates of sediment delivery, transport and deposition into the fluvial system to be able to mitigate such adverse effects. Sediment fingerprinting is a well-used tool for evaluating sediment sources, capable of directly quantifying sediment supply through differentiating sediment sources based on their inherent geochemical signatures and statistical modelling. Confluence-based sediment fingerprinting has achieved broad-scale geochemical discrimination within the 5870 km² Manawatu catchment, which drains terrain comprising soft-rock Tertiary and Quaternary sandstones, mudstones, limestones and more indurated greywacke. Multiple sediment samples were taken upstream and downstream of major river confluences, sieved to 40 and > 35 respectively. The revised mixing model estimated Mudstone terrain to contribute 59.3 % and 61.8 %, with significant contributions estimated from Mountain Range (12.0 % and 11.4 %) and Hill Surface (11.5 % and 11.3 %) respectively, indicating that Tm, Ni, Cu, Ca, P, Mn and Cr have an influence on these sediment source estimations.
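    A minimal sketch of the kind of linear un-mixing model commonly used in sediment fingerprinting is shown below: find non-negative source proportions summing to one that minimize the mismatch between source-group tracer signatures and a downstream sediment sample. The source names, tracer concentrations and residual definition are invented placeholders, not the Manawatu data or the thesis' exact mixing model.

        # Hedged sketch of a generic sediment un-mixing model: estimate source
        # proportions p (non-negative, summing to 1) that best reproduce the tracer
        # signature of a downstream sediment sample. Tracer values are invented
        # placeholders, not measurements from the Manawatu catchment.
        import numpy as np
        from scipy.optimize import minimize

        # rows: candidate sources (e.g. Mudstone, Mountain Range, Hill Surface)
        # columns: tracer concentrations (e.g. Ni, Cu, Ca, P), normalized units
        sources = np.array([
            [1.00, 0.40, 0.80, 0.30],   # "Mudstone"       (hypothetical values)
            [0.30, 0.90, 0.20, 0.60],   # "Mountain Range" (hypothetical values)
            [0.50, 0.50, 0.40, 0.90],   # "Hill Surface"   (hypothetical values)
        ])
        mixture = np.array([0.75, 0.50, 0.60, 0.45])   # downstream sample signature

        def objective(p):
            # sum of squared relative residuals between modelled and measured mixture
            model = p @ sources
            return np.sum(((mixture - model) / mixture) ** 2)

        n_src = sources.shape[0]
        res = minimize(
            objective,
            x0=np.full(n_src, 1.0 / n_src),
            bounds=[(0.0, 1.0)] * n_src,
            constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],
            method="SLSQP",
        )
        print("estimated source proportions:", np.round(res.x, 3))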

    A new composition-sensitive parameter for Ultra-High Energy Cosmic Rays

    A new family of parameters intended for composition studies in cosmic ray surface array detectors is proposed. The application of this technique to different array layout designs has been analyzed. The parameters make exclusive use of surface data, combining the information from the total signal at each triggered detector and the array geometry. They are sensitive to the combined effects of the different muon and electromagnetic components on the lateral distribution function of proton- and iron-initiated showers at any given primary energy. Analytical and numerical studies have been performed in order to assess the reliability, stability and optimization of these parameters. Experimental uncertainties, the underestimation of the muon component in the shower simulation codes, intrinsic fluctuations and reconstruction errors are considered and discussed in a quantitative way. The potential discrimination power of these parameters, under realistic experimental conditions, is compared in a simplified, albeit quantitative, way with that expected from other surface and fluorescence estimators. Comment: 27 pages, 17 figures. Submitted to a refereed journal.
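    One plausible member of such a family of surface-only parameters is a signal- and geometry-weighted sum of the form sum_i S_i (r_i / r_0)^b over the triggered stations; the sketch below computes it on toy station data. The functional form, the exponent b and the reference radius r_0 are illustrative assumptions, not necessarily the definition used in the paper.

        # Hedged sketch: a surface-array composition parameter of the assumed form
        # S_b = sum_i S_i * (r_i / r0)^b, built only from station signals S_i and
        # core distances r_i. The exponent b and reference radius r0 are
        # illustrative choices, not values taken from the paper.
        import numpy as np

        def composition_parameter(signals, core_distances, b=3.0, r0=1000.0):
            """Signal- and geometry-weighted sum over triggered stations."""
            signals = np.asarray(signals, dtype=float)          # e.g. in VEM
            core_distances = np.asarray(core_distances, float)  # metres
            return np.sum(signals * (core_distances / r0) ** b)

        # Toy event: a steeper vs a flatter lateral distribution function
        r = np.array([300.0, 600.0, 900.0, 1200.0, 1500.0])
        steep_ldf = 200.0 * (r / 1000.0) ** -3.0
        flat_ldf  = 200.0 * (r / 1000.0) ** -2.5
        print("steep LDF :", round(composition_parameter(steep_ldf, r), 1))
        print("flat LDF  :", round(composition_parameter(flat_ldf, r), 1))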

    Extending Item Response Theory to Online Homework

    Item Response Theory becomes an increasingly important tool when analyzing "Big Data" gathered from online educational venues. However, the mechanism was originally developed in traditional exam settings, and several of its assumptions are infringed upon when deployed in the online realm. For a large-enrollment physics course for scientists and engineers, the study compares outcomes from IRT analyses of exam and homework data, and then proceeds to investigate the effects of each confounding factor introduced in the online realm. It is found that IRT yields the correct trends for learner ability and meaningful item parameters, yet overall agreement with exam data is moderate. It is also found that learner ability and item discrimination are robust over wide ranges with respect to model assumptions and introduced noise, while item difficulty is less so.
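    For readers unfamiliar with the item parameters mentioned here, the sketch below evaluates the standard two-parameter logistic (2PL) IRT model, in which the probability of a correct response depends on learner ability, item discrimination and item difficulty. The parameter values are illustrative and are not taken from the study.

        # Hedged sketch of the standard two-parameter logistic (2PL) IRT model:
        # P(correct) depends on learner ability theta, item discrimination a, and
        # item difficulty b. The values below are illustrative only.
        import numpy as np

        def p_correct_2pl(theta, a, b):
            """Probability of a correct response under the 2PL model."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        abilities = np.linspace(-3, 3, 7)          # learner abilities (logit scale)
        exam_item = {"a": 1.5, "b": 0.0}           # well-discriminating exam item
        noisy_hw  = {"a": 0.6, "b": -1.0}          # easier, less discriminating homework item

        for name, item in [("exam", exam_item), ("homework", noisy_hw)]:
            probs = p_correct_2pl(abilities, item["a"], item["b"])
            print(name, np.round(probs, 2))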

    Learning Fair Naive Bayes Classifiers by Discovering and Eliminating Discrimination Patterns

    As machine learning is increasingly used to make real-world decisions, recent research efforts aim to define and ensure fairness in algorithmic decision making. Existing methods often assume a fixed set of observable features to define individuals, but do not address the possibility that some features may be unobserved at test time. In this paper, we study fairness of naive Bayes classifiers, which allow partial observations. In particular, we introduce the notion of a discrimination pattern, which refers to an individual receiving different classifications depending on whether some sensitive attributes were observed. Then a model is considered fair if it has no such pattern. We propose an algorithm to discover discrimination patterns in a naive Bayes classifier, and show how to learn maximum-likelihood parameters subject to these fairness constraints. Our approach iteratively discovers and eliminates discrimination patterns until a fair model is learned. An empirical evaluation on three real-world datasets demonstrates that we can remove exponentially many discrimination patterns by only adding a small fraction of them as constraints.
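    The notion of a discrimination pattern can be made concrete with a tiny naive Bayes model: compare the predicted probability of the positive decision when a sensitive attribute is observed with the probability when it is left unobserved, and flag the partial assignment if the gap exceeds a threshold. The conditional probability tables below are invented, and the brute-force enumeration is only an illustration, not the authors' search or learning algorithm.

        # Hedged sketch: flagging "discrimination patterns" in a tiny naive Bayes
        # model. A pattern is flagged when observing a sensitive attribute S shifts
        # P(decision=1 | evidence) by more than a threshold relative to leaving S
        # unobserved. CPTs are invented; this brute-force check is illustrative only.
        from itertools import product

        p_d = {1: 0.5, 0: 0.5}                                  # prior over decision
        # P(feature=1 | decision) for each feature; S is the sensitive attribute
        p_f1 = {"S": {1: 0.7, 0: 0.3}, "X": {1: 0.6, 0: 0.4}}

        def posterior(evidence):
            """P(decision=1 | evidence); unmentioned features are marginalized out."""
            weights = {}
            for d in (1, 0):
                w = p_d[d]
                for feat, val in evidence.items():
                    p1 = p_f1[feat][d]
                    w *= p1 if val == 1 else 1.0 - p1
                weights[d] = w
            return weights[1] / (weights[1] + weights[0])

        THRESHOLD = 0.05
        for x_val, s_val in product((0, 1), repeat=2):
            without_s = posterior({"X": x_val})
            with_s = posterior({"X": x_val, "S": s_val})
            gap = abs(with_s - without_s)
            if gap > THRESHOLD:
                print(f"pattern X={x_val}, S={s_val}: |delta P| = {gap:.3f} (flagged)")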

    Mass hierarchy discrimination with atmospheric neutrinos in large volume ice/water Cherenkov detectors

    Large-mass ice/water Cherenkov experiments, optimized to detect low-energy (1-20 GeV) atmospheric neutrinos, have the potential to discriminate between the normal and inverted neutrino mass hierarchies. The sensitivity depends on several model and detector parameters, such as the neutrino flux profile and normalization, the Earth density profile, the oscillation parameter uncertainties, and the detector effective mass and resolution. A proper evaluation of the mass hierarchy discrimination power requires a robust statistical approach. In this work, a toy Monte Carlo based on an extended unbinned likelihood ratio test statistic was used. The effect of each model and detector parameter, as well as the required detector exposure, was then studied. While uncertainties on the Earth density and atmospheric neutrino flux profiles were found to have a minor impact on the mass hierarchy discrimination, the flux normalization, as well as some of the oscillation parameter (\Delta m^2_{31}, \theta_{13}, \theta_{23}, and \delta_{CP}) uncertainties and correlations, proved critical. Finally, the minimum required detector exposure, the optimization of the low-energy threshold, and the detector resolutions were also investigated. Comment: 23 pages, 16 figures.
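    A much-simplified sketch of this statistical machinery follows: pseudo-experiments are generated under each hierarchy hypothesis and a log-likelihood-ratio test statistic is evaluated for each, so that the two resulting distributions can be compared. The "templates" are arbitrary Poisson-mean arrays standing in for binned (energy, zenith) spectra; no detector response or oscillation physics is modelled, and the paper's extended unbinned likelihood is replaced here by a simple binned Poisson likelihood.

        # Hedged, much-simplified sketch of a toy-Monte-Carlo likelihood-ratio study:
        # pseudo-experiments are drawn under each hierarchy hypothesis and the test
        # statistic T = -2 ln(L_NH / L_IH) is computed for each. The templates below
        # are invented Poisson means, not physical spectra.
        import numpy as np

        rng = np.random.default_rng(1)

        # expected event counts per analysis bin under each hypothesis (invented)
        mu_nh = np.array([120.0, 95.0, 80.0, 60.0, 45.0])
        mu_ih = np.array([110.0, 100.0, 85.0, 55.0, 50.0])

        def log_likelihood(counts, mu):
            """Poisson log-likelihood up to a counts-only constant."""
            return np.sum(counts * np.log(mu) - mu)

        def test_statistic(counts):
            return -2.0 * (log_likelihood(counts, mu_nh) - log_likelihood(counts, mu_ih))

        def pseudo_experiments(mu_true, n_toys=20000):
            toys = rng.poisson(mu_true, size=(n_toys, mu_true.size))
            return np.array([test_statistic(t) for t in toys])

        t_nh = pseudo_experiments(mu_nh)   # T distribution if nature is normal hierarchy
        t_ih = pseudo_experiments(mu_ih)   # T distribution if nature is inverted hierarchy

        # crude separation measure between the two T distributions
        sep = abs(t_nh.mean() - t_ih.mean()) / np.sqrt(0.5 * (t_nh.var() + t_ih.var()))
        print(f"mean T (NH truth) = {t_nh.mean():.2f}, mean T (IH truth) = {t_ih.mean():.2f}")
        print(f"approximate separation: {sep:.2f} sigma")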

    Multiscale autocorrelation function: a new approach to anisotropy studies

    We present a novel catalog-independent method, based on a scale-dependent approach, to detect anisotropy signatures in the arrival direction distribution of ultra-high energy cosmic rays (UHECR). The method provides good discrimination power for both large and small data sets, even in the presence of a strong contaminating isotropic background. We present some applications to simulated data sets of events corresponding to plausible scenarios for charged particles detected by world-wide surface detector-based observatories in the last decades. Comment: 18 pages, 9 figures.
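    One simple way to realize a scale-dependent correlation estimator on the sphere is sketched below: for each angular scale, count event pairs separated by less than that scale and compare with the expectation from isotropic Monte Carlo sets; a large deviation at some scale flags anisotropy. This generic pair-counting construction is for illustration only and is not necessarily the estimator defined in the paper.

        # Hedged sketch of a scale-dependent pair-counting estimator on the sphere:
        # for each angular scale, compare the number of event pairs closer than that
        # scale with the isotropic expectation from Monte Carlo sets. Illustrative
        # construction only, not necessarily the paper's exact estimator.
        import numpy as np

        rng = np.random.default_rng(2)

        def isotropic_directions(n):
            """Unit vectors drawn uniformly on the sphere."""
            v = rng.normal(size=(n, 3))
            return v / np.linalg.norm(v, axis=1, keepdims=True)

        def pair_counts(dirs, scales_deg):
            """Number of event pairs with angular separation below each scale."""
            cosang = np.clip(dirs @ dirs.T, -1.0, 1.0)
            sep = np.degrees(np.arccos(cosang))
            iu = np.triu_indices(len(dirs), k=1)          # unique pairs only
            return np.array([(sep[iu] < s).sum() for s in scales_deg])

        scales = np.array([5.0, 10.0, 20.0, 30.0])
        data = isotropic_directions(200)                  # stand-in for a UHECR data set

        # isotropic reference distribution from Monte Carlo
        mc = np.array([pair_counts(isotropic_directions(200), scales) for _ in range(200)])
        mean, std = mc.mean(axis=0), mc.std(axis=0)

        z_scores = (pair_counts(data, scales) - mean) / std
        for s, z in zip(scales, z_scores):
            print(f"scale {s:4.1f} deg: deviation from isotropy = {z:+.2f} sigma")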