
    Magnitude Uncertainties Impact Seismic Rate Estimates, Forecasts and Predictability Experiments

    The Collaboratory for the Study of Earthquake Predictability (CSEP) aims to prospectively test time-dependent earthquake probability forecasts for their consistency with observations. To compete, time-dependent seismicity models are calibrated on earthquake catalog data, but catalogs carry substantial observational uncertainty. We study the impact of magnitude uncertainties on rate estimates in clustering models, on their forecasts, and on their evaluation by CSEP's consistency tests. First, we quantify magnitude uncertainties. We find that the magnitude uncertainty is more heavy-tailed than a Gaussian and is better described by a double-sided exponential distribution with scale parameter nu_c = 0.1 - 0.3. Second, we study the impact of such noise on the forecasts of a simple clustering model that captures the main ingredients of popular short-term models. We prove that the deviations of noisy forecasts from an exact forecast are power-law distributed in the tail with exponent alpha = 1/(a*nu_c), where a is the exponent of the aftershock productivity law. We further prove that the typical scale of the fluctuations remains sensitively dependent on the specific catalog. Third, we study how noisy forecasts are evaluated in CSEP consistency tests. Noisy forecasts are rejected more frequently than expected for a given confidence limit, and the Poisson assumption of the consistency tests is inadequate for short-term forecast evaluations. To capture the idiosyncrasies of each model together with any propagating uncertainties, forecasts need to specify the entire likelihood distribution of seismic rates.
    Comment: 35 pages, including 15 figures, AGU style
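
    As an illustrative sketch (not the paper's code), the effect described above can be simulated in a few lines: the catalog magnitudes, the Laplace noise scale nu_c, and the productivity exponent a below are assumed values, and the ratio of noisy to exact rates, 10**(a*error), exhibits the heavy, power-law tail the abstract refers to.

        import numpy as np

        # Assumed parameter values: the abstract quotes nu_c = 0.1 - 0.3 but does not
        # fix a productivity exponent here, so a = 0.8 is purely illustrative.
        rng = np.random.default_rng(0)
        a, nu_c, n = 0.8, 0.2, 100_000

        true_mag = rng.uniform(3.0, 6.0, n)               # stand-in catalog magnitudes
        noisy_mag = true_mag + rng.laplace(0.0, nu_c, n)  # double-sided exponential errors

        # Productivity law: expected aftershock counts scale like 10**(a * m), so the
        # ratio of noisy to exact rates is 10**(a * error), which is heavy-tailed.
        ratio = 10.0 ** (a * (noisy_mag - true_mag))
        print("median rate ratio:", np.median(ratio))
        print("99.9th percentile rate ratio:", np.quantile(ratio, 0.999))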

    Comment on "Analysis of the Spatial Distribution between Successive Earthquakes" by Davidsen and Paczuski

    By analyzing a southern California earthquake catalog, Davidsen and Paczuski [Phys. Rev. Lett. 94, 048501 (2005)] claim to have found evidence contradicting the theory of aftershock zone scaling in favor of scale-free statistics. We present four elements showing that Davidsen and Paczuski's results may be insensitive to the existence of physical length scales associated with aftershock zones or mainshock rupture lengths, so that their claim is unsubstantiated: (i) their exponent smaller than 1 for a probability density function implies that the power-law statistics they report are at best an intermediate asymptotic; (ii) their power law is not robust to the removal of 6 months of data around the Landers earthquake within a period of 17 years; (iii) the same analysis for Japan and northern California shows no evidence of robust power laws; (iv) a statistical model of earthquake triggering that explicitly obeys aftershock zone scaling can reproduce the observed histogram of Davidsen and Paczuski, demonstrating that their statistic may not be sensitive to the presence of characteristic scales associated with earthquake triggering.
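
    A minimal sketch of the kind of robustness check mentioned in point (ii), assuming a catalog given as arrays of event times, latitudes, and longitudes; the column layout and the exclusion window are assumptions, not the authors' code.

        import numpy as np

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance in km between consecutive epicenters."""
            lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
            dlat, dlon = lat2 - lat1, lon2 - lon1
            h = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
            return 2 * 6371.0 * np.arcsin(np.sqrt(h))

        def successive_distances(times, lats, lons, exclude=None):
            """Distances between successive events, optionally masking a time window."""
            order = np.argsort(times)
            t, la, lo = times[order], lats[order], lons[order]
            if exclude is not None:  # e.g. a 6-month window around a large mainshock
                keep = (t < exclude[0]) | (t > exclude[1])
                t, la, lo = t[keep], la[keep], lo[keep]
            return haversine_km(la[:-1], lo[:-1], la[1:], lo[1:])

        # Hypothetical usage: compare the distance histogram with and without the
        # 6 months of data surrounding a mainshock time t0 (times in years).
        # d_all  = successive_distances(times, lats, lons)
        # d_trim = successive_distances(times, lats, lons, exclude=(t0 - 0.25, t0 + 0.25))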

    Quark-Gluon-Plasma Formation at SPS Energies?

    By colliding ultrarelativistic ions, one presently achieves energy densities close to the critical value for the formation of a quark-gluon plasma. This indicates the importance of fluctuations and the necessity to go beyond the investigation of average events. Therefore, we introduce a percolation approach to model the final stage ($\tau > 1$ fm/c) of ion-ion collisions, the initial stage being treated by well-established methods based on strings and Pomerons. The percolation approach amounts to finding high-density domains and treating them as quark-matter droplets. In this way, we have a realistic, microscopic, Monte Carlo based model which allows for the formation of quark matter. We find that even at SPS energies large quark-matter droplets are formed, albeit at a low rate. In other words, large quark-matter droplets are formed due to geometrical fluctuations, but not in the average event.
    Comment: 7 pages, HD-TVP-94-6 (1 uuencoded figure)
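
    A hedged sketch of the percolation idea, under stated assumptions: the parton positions and energies below are random stand-ins, and the grid spacing and critical density are illustrative, not values from the paper. Cells above the threshold are grouped into connected domains, which play the role of quark-matter droplets.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(1)

        # Stand-in transverse positions (fm) and energies (GeV) of produced partons
        x, y = rng.normal(0.0, 3.0, (2, 5000))
        energy = rng.exponential(0.5, 5000)

        # Deposit energy on a transverse grid and convert to a surface density
        grid, _, _ = np.histogram2d(x, y, bins=40, range=[[-10, 10], [-10, 10]], weights=energy)
        density = grid / (20.0 / 40) ** 2           # GeV per fm^2, illustrative units

        eps_c = 1.0                                 # assumed critical density
        labels, n_domains = ndimage.label(density > eps_c)
        sizes = np.bincount(labels.ravel())[1:]     # cells per high-density domain
        largest = int(sizes.max()) if n_domains else 0
        print("high-density domains:", n_domains, "largest (cells):", largest)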

    Model atmospheres of X-ray bursting neutron stars

    We present an extended set of model atmospheres and emergent spectra of X-ray bursting neutron stars in low-mass X-ray binaries. Compton scattering is taken into account. The models were computed in the LTE approximation for six chemical compositions: pure hydrogen and pure helium atmospheres, and atmospheres with a solar mix of hydrogen and helium and various heavy-element abundances (Z = 1, 0.3, 0.1, and 0.01 Z_sun), for three values of gravity, log g = 14.0, 14.3, and 14.6, and for 20 values of the relative luminosity l = L/L_Edd in the range 0.001 - 0.98. The emergent spectra of all models are fitted by diluted blackbody spectra in the observed RXTE/PCA band 3 - 20 keV, and the corresponding color correction factors f_c are presented. We also show how to use these dependencies to estimate the neutron star's basic parameters.
    Comment: 2 pages, 1 figure, conference "Astrophysics of Neutron Stars - 2010" in honor of M. Ali Alpar, Izmir, Turkey
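
    A small sketch of the diluted-blackbody fit described above, under stated assumptions: the "model atmosphere" spectrum below is a synthetic stand-in (a blackbody hardened by a factor of 1.5), kTeff = 1.8 keV is an assumed value, and only the 3 - 20 keV band follows the abstract; the point is simply to recover the color correction factor f_c by least squares.

        import numpy as np
        from scipy.optimize import curve_fit

        def planck(E_keV, kT_keV):
            """Blackbody spectral shape, B_E ~ E**3 / (exp(E/kT) - 1), in keV units."""
            return E_keV**3 / np.expm1(E_keV / kT_keV)

        kTeff = 1.8                           # assumed effective temperature in keV
        E = np.linspace(3.0, 20.0, 200)       # the RXTE/PCA band quoted in the abstract

        # Synthetic stand-in for a model-atmosphere spectrum: a blackbody hardened by
        # a factor of 1.5 and diluted by 1.5**-4, so the fit should recover f_c ~ 1.5.
        model_flux = 1.5 ** -4 * planck(E, 1.5 * kTeff)

        def diluted_bb(E_keV, w, fc):
            """Diluted blackbody w * B_E(f_c * T_eff) with T_eff held fixed."""
            return w * planck(E_keV, fc * kTeff)

        (w_fit, fc_fit), _ = curve_fit(diluted_bb, E, model_flux, p0=(1.0, 1.4))
        print(f"dilution w = {w_fit:.3f}, color correction f_c = {fc_fit:.3f}")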

    Correlative Capacity of Composite Quantum States

    We characterize the optimal correlative capacity of entangled, separable, and classically correlated states. Introducing the notions of the infimum and supremum within majorization theory, we construct the least disordered separable state compatible with a set of marginals. The maximum separable correlation information supportable by the marginals of a multi-qubit pure state is shown to be an LOCC monotone. The least disordered composite of a pair of qubits is found for the above classes, with classically correlated states defined as diagonal in the product of marginal bases.
    Comment: 4 pages, 1 figure
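
    For readers unfamiliar with the majorization order the abstract relies on, here is a minimal check (not the paper's construction) of whether one probability vector majorizes another; the infimum and supremum introduced in the abstract are defined with respect to this partial order.

        import numpy as np

        def majorizes(p, q, tol=1e-12):
            """True if probability vector p majorizes q (p is less disordered than q)."""
            p = np.sort(np.asarray(p, dtype=float))[::-1]
            q = np.sort(np.asarray(q, dtype=float))[::-1]
            return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

        # The maximally mixed qubit marginal is majorized by any other qubit state.
        print(majorizes([0.9, 0.1], [0.5, 0.5]))   # True
        print(majorizes([0.5, 0.5], [0.9, 0.1]))   # False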

    Semihard Interactions in Nuclear Collisions Based on a Unified Approach to High Energy Scattering

    Our ultimate goal is the construction of a model for interactions of two nuclei in the energy range between several tens of GeV and several TeV per nucleon in the centre-of-mass system. Such nuclear collisions are very complex, being composed of many components, and therefore some strategy is needed to construct a reliable model. The central point of our approach is the hypothesis that the behavior of high-energy interactions is universal (universality hypothesis). So, for example, the hadronization of partons in nuclear interactions follows the same rules as in electron-positron annihilation, and the radiation of off-shell partons in nuclear collisions is based on the same principles as in deep inelastic scattering. We construct the model for nuclear interactions in a modular fashion. The individual modules, based on the universality hypothesis, are identified as building blocks for more elementary interactions (such as e^+ e^- or lepton-proton scattering) and can therefore be studied in a much simpler context. With these building blocks under control, we can provide a quite reliable model for nucleus-nucleus scattering, which in particular provides very useful tests for the complicated numerical procedures based on Monte Carlo techniques.
    Comment: 10 pages, no figures; Proc. of the "Workshop on Nuclear Matter in Different Phases and Transitions", Les Houches, France, March 31 - April 10, 199