
    On the influence of statistics on the determination of the mean value of the depth of shower maximum for ultra high energy cosmic ray showers

    The chemical composition of ultra high energy cosmic rays is still uncertain. The latest results obtained by the Pierre Auger Observatory and the HiRes Collaboration, concerning the measurement of the mean value and the fluctuations of the atmospheric depth at which showers reach their maximum development, Xmax, are inconsistent. Comparison with air shower simulations shows that, while the Auger data may be interpreted as a gradual transition to heavy nuclei at energies above ~2-3x10^18 eV, the HiRes data are consistent with a composition dominated by protons. Ref. [1] suggests that the observed deviation of the mean value of Xmax from the proton expectation in the Auger data could originate in a statistical bias arising from the approximately exponential shape of the Xmax distribution, combined with the decrease of the number of events as a function of primary energy. In this paper we consider a better description of the Xmax distribution and show that the possible bias in the Auger data is at least one order of magnitude smaller than the one obtained when assuming an exponential distribution. We therefore conclude that the deviation of the Auger data from the proton expectation is unlikely to be explained by such a statistical effect.
    Comment: To be published in Journal of Physics G: Nuclear and Particle Physics
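    A quick way to see the effect under discussion is a toy Monte Carlo. The sketch below assumes a shifted-exponential Xmax distribution with made-up shift and slope parameters (X0 and LAMBDA are illustrative round numbers, not Auger or HiRes values); it shows that while the sample mean of such a skewed distribution is unbiased in expectation, its median undershoots the true mean when the number of events is small, which is the kind of low-statistics effect at issue here.

```python
# Toy Monte Carlo for the small-sample behaviour of the mean of a
# skewed (exponential-tailed) Xmax distribution. Illustrative only:
# the shift X0 and slope LAMBDA are made-up round numbers, not
# parameters from any experiment.
import numpy as np

rng = np.random.default_rng(42)

X0, LAMBDA = 700.0, 60.0   # g/cm^2; hypothetical shift and exponential slope
TRUE_MEAN = X0 + LAMBDA    # mean of a shifted exponential

for n_events in (10, 50, 500):
    # many repeated "experiments", each with n_events showers
    samples = X0 + rng.exponential(LAMBDA, size=(100_000, n_events))
    sample_means = samples.mean(axis=1)
    print(f"N={n_events:4d}  <mean>={sample_means.mean():7.2f}  "
          f"median(mean)={np.median(sample_means):7.2f}  "
          f"true={TRUE_MEAN:.2f}")
```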

    Prospects for GMRT to Observe Radio Waves from UHE Particles Interacting with the Moon

    Ultra high energy (UHE) particles of cosmic origin impact the lunar regolith and produce radio signals through the Askaryan effect, signals that can be detected by Earth-based radio telescopes. We calculate the expected sensitivity for observation of such events at the Giant Metrewave Radio Telescope (GMRT), both for UHE cosmic ray (CR) and UHE neutrino interactions. We find that for 30 days of observation time a significant number of detectable events is expected above 10^{20} eV for UHECR or neutrino fluxes close to the current limits. Null detection over a period of 30 days would lower the experimental bounds on UHE particle fluxes to levels competitive with both present and future experiments at the very highest energies.
    Comment: 21 pages, 9 figures
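    A sensitivity estimate of this kind reduces to folding a flux with an effective aperture over the observation time, N = integral of flux(E) x aperture(E) x T over energy. Below is a back-of-envelope sketch of that integral; the power-law flux normalisation and the aperture are hypothetical placeholders, not the GMRT figures derived in the paper.

```python
# Back-of-envelope event count for a lunar Askaryan search:
# N = integral over energy of flux(E) * aperture(E) * observation time.
# The flux normalisation and the aperture below are placeholders chosen
# for illustration; they are not the values computed in the paper.
import numpy as np

T_OBS = 30 * 86400.0                  # 30 days in seconds

# log-spaced energy grid above 10^20 eV
E = np.logspace(20.0, 21.5, 200)      # eV

# hypothetical differential flux dN/dE ~ E^-2 (illustrative normalisation)
flux = 1e-33 * (E / 1e20) ** -2.0     # particles / (eV m^2 s sr)

# hypothetical effective aperture rising slowly with energy
aperture = 1e7 * (E / 1e20) ** 0.5    # m^2 sr

integrand = flux * aperture * T_OBS
# trapezoidal integration over the energy grid
n_events = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E))
print(f"expected events in 30 days: {n_events:.1f}")
```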

    A Three-Point Cosmic Ray Anisotropy Method

    The two-point angular correlation function is a traditional method used to search for deviations from expectations of isotropy. In this paper we develop and explore a statistically descriptive three-point method, with the intended application being the search for deviations from isotropy in the highest energy cosmic rays. We compare the sensitivity of a two-point method and a "shape-strength" method for a variety of Monte Carlo simulated anisotropic signals. Studies are done with anisotropic source signals diluted by an isotropic background. Type I and II errors for rejecting the hypothesis of isotropic cosmic ray arrival directions are evaluated for four different event sample sizes: 27, 40, 60 and 80 events, consistent with near-term data expectations from the Pierre Auger Observatory. In all cases the ability to reject the isotropic hypothesis improves with sample size and with the fraction of anisotropic signal. While ~40-event data sets should be sufficient for reliable identification of anisotropy in cases of rather extreme (highly anisotropic) data, much larger data sets are suggested for reliable identification of more subtle anisotropies. The shape-strength method consistently performs better than the two-point method and can be easily adapted to an arbitrary experimental exposure on the celestial sphere.
    Comment: Fixed PDF error
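    For reference, a minimal sketch of the baseline two-point method: compute all pairwise angular separations of the arrival directions and compare their cumulative distribution against isotropic Monte Carlo. The binning and the deviation statistic below are illustrative choices, not the paper's exact implementation, and the triplet-based "shape-strength" method is not reproduced here.

```python
# Minimal two-point angular correlation sketch for arrival directions on
# the sphere: compare the observed pairwise-separation distribution with
# isotropic Monte Carlo via a maximum-deviation statistic.
import numpy as np

rng = np.random.default_rng(0)

def random_isotropic(n):
    """n unit vectors drawn uniformly on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def pair_separations(v):
    """All pairwise angular separations (radians)."""
    cosangles = np.clip(v @ v.T, -1.0, 1.0)
    iu = np.triu_indices(len(v), k=1)       # upper triangle: each pair once
    return np.arccos(cosangles[iu])

def two_point_statistic(events, n_mc=200):
    """Max deviation of the cumulative pair-separation histogram
    from the mean isotropic curve."""
    bins = np.linspace(0.0, np.pi, 91)
    obs = np.histogram(pair_separations(events), bins=bins)[0].cumsum()
    iso = np.mean([np.histogram(pair_separations(
        random_isotropic(len(events))), bins=bins)[0].cumsum()
        for _ in range(n_mc)], axis=0)
    return np.max(np.abs(obs - iso))

# example with a 40-event isotropic sky, matching one of the sample sizes
print(two_point_statistic(random_isotropic(40)))
```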

    Noisy Optimization: Convergence with a Fixed Number of Resamplings

    It is known that evolution strategies in continuous domains might not converge in the presence of noise. It is also known that, under mild assumptions and using an increasing number of resamplings, one can mitigate the effect of additive noise and recover convergence. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings; in particular, we get fast rates (log-linear convergence) provided that the variance decreases around the optimum slightly faster than in the so-called multiplicative noise model. Keywords: noisy optimization, evolutionary algorithm, theory.
    Comment: EvoStar (2014)
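    A minimal sketch of the setting, assuming a toy (1+1) evolution strategy on a noisy sphere function: each fitness evaluation is averaged over a fixed number of resamplings, and the noise scale is made to decay near the optimum slightly faster than in the multiplicative model. All constants below are illustrative choices, not the paper's conditions or algorithm.

```python
# Toy (1+1) evolution strategy with a FIXED number of resamplings per
# fitness evaluation, on a sphere function whose noise variance shrinks
# near the optimum. Illustrative of the setting only.
import numpy as np

rng = np.random.default_rng(1)

def noisy_fitness(x, k_resample):
    """Sphere function plus noise whose scale decays near the optimum
    (exponent 1.1, i.e. slightly faster than multiplicative noise),
    averaged over k_resample independent evaluations."""
    base = float(x @ x)
    noise_scale = base ** 1.1
    return base + noise_scale * rng.normal(size=k_resample).mean()

def one_plus_one_es(dim=5, k_resample=10, iters=2000):
    x = np.ones(dim)
    sigma = 1.0
    fx = noisy_fitness(x, k_resample)
    for _ in range(iters):
        y = x + sigma * rng.normal(size=dim)   # Gaussian mutation
        fy = noisy_fitness(y, k_resample)
        if fy <= fx:                            # accept and expand step size
            x, fx, sigma = y, fy, sigma * 1.1
        else:                                   # reject and shrink step size
            sigma *= 0.99
    return x

x_final = one_plus_one_es()
print("distance to optimum:", np.linalg.norm(x_final))
```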

    Assessment of left atrial volume before and after pulmonary thromboendarterectomy in chronic thromboembolic pulmonary hypertension.

    Background: Impaired left ventricular diastolic filling is common in chronic thromboembolic pulmonary hypertension (CTEPH), and recent studies support left ventricular underfilling as a cause. To investigate this further, we assessed left atrial volume index (LAVI) in patients with CTEPH before and after pulmonary thromboendarterectomy (PTE). Methods: Forty-eight consecutive CTEPH patients had pre- and post-PTE echocardiograms and right heart catheterizations. Parameters included mean pulmonary artery pressure (mPAP), pulmonary vascular resistance (PVR), cardiac index, LAVI, and mitral E/A ratio. Echocardiograms were performed 6 ± 3 days pre-PTE and 10 ± 4 days post-PTE. Regression analyses compared pre- and post-PTE LAVI with the other parameters. Results: Pre-op LAVI (mean 19.0 ± 7 mL/m2) correlated significantly with pre-op PVR (R = -0.45, p = 0.001), mPAP (R = -0.28, p = 0.05) and cardiac index (R = 0.38, p = 0.006). Post-PTE, LAVI increased by 18% to 22.4 ± 7 mL/m2 (p = 0.003). This change correlated with the changes in PVR (765 to 311 dyne-s/cm5, p = 0.01), cardiac index (2.6 to 3.2 L/min/m2, p = 0.02), and E/A (0.95 to 1.44, p = 0.002). Conclusion: In CTEPH, smaller LAVI is associated with lower cardiac output, higher mPAP, and higher PVR. LAVI increases by ~20% after PTE, and this change correlates with the changes in PVR and mitral E/A. The rapid increase in LAVI supports the concept that left ventricular diastolic impairment and low E/A pre-PTE are due to left heart underfilling rather than inherent left ventricular diastolic dysfunction.
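    The statistics reported above boil down to correlations and a paired pre/post comparison. The sketch below illustrates those computations; the arrays are randomly generated placeholders with roughly the reported magnitudes, not the study data.

```python
# Illustrative correlation and paired comparison, mirroring the kinds of
# analyses described above. The arrays are made-up placeholder values,
# NOT the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

pvr_pre = rng.normal(765, 150, size=48)                  # dyne-s/cm5 (placeholder)
lavi_pre = 30 - 0.015 * pvr_pre + rng.normal(0, 3, 48)   # mL/m2 (placeholder)
lavi_post = lavi_pre * 1.18 + rng.normal(0, 2, 48)       # ~18% increase (placeholder)

r, p = stats.pearsonr(pvr_pre, lavi_pre)                 # LAVI vs PVR correlation
print(f"LAVI vs PVR: R = {r:.2f}, p = {p:.3g}")

t, p = stats.ttest_rel(lavi_post, lavi_pre)              # paired pre/post test
print(f"pre vs post PTE LAVI: t = {t:.2f}, p = {p:.3g}")
```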

    Composition of UHECR and the Pierre Auger Observatory Spectrum

    We fit the recently published Pierre Auger ultra-high energy cosmic ray spectrum assuming that either nucleons or nuclei are emitted at the sources. We consider the simplified cases of pure proton, pure oxygen, or pure iron injection. We perform an exhaustive scan over the source evolution factor, the spectral index, the maximum energy of the source spectrum Z E_{max}, and the minimum distance to the sources. We show that the Pierre Auger spectrum agrees with any of the source compositions we assumed. For iron, in particular, there are two distinct solutions with high and low E_{max} (e.g. 6.4x10^{20} eV and 2x10^{19} eV, respectively), which could be distinguished by either a large fraction or the near absence of proton primaries at the highest energies. We raise the possibility that an iron-dominated injected flux may be in line with the latest composition measurement from the Pierre Auger Observatory, where a hint of heavy element dominance is seen.
    Comment: 19 pages, 6 figures (33 panels); uses iopart.cls and iopart12.clo; in version 2: addition of a few sentences and two references
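    At its core, a scan of this kind is a brute-force chi-square minimisation over a parameter grid. The sketch below illustrates that machinery on a toy power-law-with-cutoff spectrum; the grids, the synthetic "data", and the model are placeholders, and the actual analysis also varies source evolution and minimum source distance and includes propagation effects.

```python
# Brute-force chi-square grid scan over spectral index and maximum
# energy for a toy power-law-with-cutoff spectrum. All numbers are
# placeholders for illustration, not the paper's fit.
import numpy as np

E = np.logspace(18.5, 20.5, 20)                 # eV, toy energy bins

def model_flux(E, gamma, e_max):
    """Toy injected spectrum: power law with exponential cutoff."""
    return E ** -gamma * np.exp(-E / e_max)

# synthetic "data": a gamma = 2.6 spectrum, cutoff 6e19 eV, 5% scatter
rng = np.random.default_rng(3)
data = model_flux(E, 2.6, 6e19) * rng.normal(1.0, 0.05, size=E.size)
err = 0.05 * data

best = (np.inf, None)
for gamma in np.linspace(2.0, 3.0, 51):
    for e_max in np.logspace(19.0, 21.0, 51):
        model = model_flux(E, gamma, e_max)
        # float the overall normalisation analytically
        norm = np.sum(data * model / err**2) / np.sum(model**2 / err**2)
        chi2 = np.sum(((data - norm * model) / err) ** 2)
        if chi2 < best[0]:
            best = (chi2, (gamma, e_max))

print(f"best chi2 = {best[0]:.1f} at gamma = {best[1][0]:.2f}, "
      f"E_max = {best[1][1]:.2g} eV")
```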

    The eLISA/NGO Data Processing Centre

    Data analysis for the eLISA/NGO mission is going to be performed in several steps. The telemetry is unpacked and checked at ESA's Science Operations Centre (SOC). The instrument teams provide the necessary calibration files for the SOC to process the Level 1 data. The next steps, namely source identification, parameter extraction, and construction of a catalogue of sources, are performed at the Data Processing Centre (DPC). This includes determining the physical and astrophysical parameters of the sources and their strain time series. At the end of the processing, the produced Level 2 and Level 3 data are transferred back to the SOC, which provides the data archive and the interface for the scientific community. The DPC is organised by the member states of the consortium. In this paper we describe a possible outline of the data processing centre, including the tasks to be performed and the organisational structure.
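    The division of labour can be summarised as a simple pipeline: the SOC produces Level 1 data from telemetry plus calibration files, the DPC turns it into Level 2/3 products, and the results return to the SOC archive. The schematic below uses hypothetical function and field names purely to illustrate that flow; it is not mission software.

```python
# Schematic of the SOC/DPC data flow described above. All names are
# hypothetical illustrations of the processing levels, not mission code.
def soc_produce_level1(telemetry, calibration_files):
    """SOC: unpack, check, and calibrate telemetry into Level 1 data."""
    return {"level": 1, "data": telemetry, "cal": calibration_files}

def dpc_produce_level2_3(level1):
    """DPC: identify sources, extract parameters, build the catalogue."""
    catalogue = [{"source": "placeholder", "params": {}, "strain": []}]
    return ({"level": 2, "catalogue": catalogue},
            {"level": 3, "archive_ready": True})

level1 = soc_produce_level1(telemetry=b"...", calibration_files=["cal.fits"])
level2, level3 = dpc_produce_level2_3(level1)   # returned to the SOC archive
```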

    Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple "linear filter" models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
    Comment: 4 pages, 2 figures; accepted at PRL
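    The 1/(4π) bound (the Gabor limit) is saturated by a Gaussian pulse, which is easy to verify numerically. The FFT sketch below computes the RMS widths of |s(t)|^2 and |S(f)|^2 for a Gaussian and recovers the product 1/(4π) ≈ 0.0796; it is independent of the paper's psychoacoustic experiments.

```python
# Numerical check of the Gabor limit: for a Gaussian pulse the product
# of the temporal and spectral RMS widths (of |s(t)|^2 and |S(f)|^2)
# equals 1/(4*pi).
import numpy as np

n, dt = 2 ** 16, 1e-4
t = (np.arange(n) - n / 2) * dt
s = np.exp(-t ** 2 / (2 * 0.01 ** 2))     # Gaussian pulse, sigma_t = 10 ms

def rms_width(x, w):
    """RMS width of coordinate x under (unnormalised) weights w."""
    p = w / w.sum()
    mu = np.sum(x * p)
    return np.sqrt(np.sum((x - mu) ** 2 * p))

f = np.fft.fftshift(np.fft.fftfreq(n, dt))
S = np.fft.fftshift(np.fft.fft(s))

dt_rms = rms_width(t, np.abs(s) ** 2)
df_rms = rms_width(f, np.abs(S) ** 2)
print(dt_rms * df_rms, 1 / (4 * np.pi))   # both ~0.0796
```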

    Dark Matter Structures in the Universe: Prospects for Optical Astronomy in the Next Decade

    The Cold Dark Matter theory of gravitationally driven hierarchical structure formation has earned its status as a paradigm by explaining the distribution of matter over large spans of cosmic distance and time. However, its central tenet, that most of the matter in the universe is dark and exotic, is still unproven; the dark matter hypothesis is sufficiently audacious as to continue to warrant a diverse battery of tests. While local searches for dark matter particles or their annihilation signals could prove the existence of the substance itself, studies of cosmological dark matter in situ are vital to fully understand its role in structure formation and evolution. We argue that gravitational lensing provides the cleanest and farthest-reaching probe of dark matter in the universe, which can be combined with other observational techniques to answer the most challenging and exciting questions that will drive the subject in the next decade: What is the distribution of mass on sub-galactic scales? How do galaxy disks form and bulges grow in dark matter halos? How accurate are CDM predictions of halo structure? Can we distinguish between a need for a new substance (dark matter) and a need for new physics (departures from General Relativity)? What is the dark matter made of anyway? We propose that the central tool in this program should be a wide-field optical imaging survey, whose true value is realized with support in the form of high-resolution, cadenced optical/infrared imaging and massive-throughput optical spectroscopy.
    Comment: White paper submitted to the 2010 Astronomy & Astrophysics Decadal Survey