
    Extreme Value Theory and the Solar Cycle

    We investigate the statistical properties of the extreme events of the solar cycle as measured by the sunspot number. Recent advances in the methodology of extreme value theory are applied to the maximal extremes of the sunspot time series. We focus on the extreme events that exceed a carefully chosen threshold and fit a generalized Pareto distribution to the tail of the empirical cumulative distribution. A maximum likelihood method is used to estimate the parameters of the generalized Pareto distribution, and confidence intervals are given for the parameters. Because there is no automatic procedure for selecting the threshold, we analyze the sensitivity of the fitted generalized Pareto distribution to the exact value of the threshold. According to the available data, which span only the previous ~250 years, the cumulative distribution of the time series is bounded, yielding an upper limit of 324 for the sunspot number. We also estimate that the return value for each solar cycle is ~188, while the return value for a century increases to ~228. Finally, the results indicate that a large event like the maximum of solar cycle 19 most probably returns once every ~700 years, and that the probability of finding such a large event with a frequency smaller than ~50 years is very small. In spite of the essentially extrapolative character of these results, their statistical significance is very large. Comment: 6 pages, 4 figures, accepted for publication in A&A
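    The peaks-over-threshold procedure described in the abstract can be sketched numerically. The snippet below uses synthetic Gamma-distributed data as a stand-in for the sunspot record (the real series is not reproduced here), and the threshold quantile and return period are illustrative choices, not the paper's values.

    ```python
    # Sketch of a peaks-over-threshold analysis with a generalized Pareto fit.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    series = rng.gamma(shape=2.0, scale=40.0, size=3000)  # stand-in "sunspot numbers"

    threshold = np.quantile(series, 0.95)        # a carefully chosen high threshold
    excesses = series[series > threshold] - threshold

    # Maximum-likelihood fit of a generalized Pareto distribution to the excesses.
    # floc=0 fixes the location at the threshold, the usual POT convention.
    shape, loc, scale = stats.genpareto.fit(excesses, floc=0)

    # A negative shape parameter implies a bounded tail with upper endpoint
    # threshold - scale/shape (the analogue of the paper's limit of 324).
    if shape < 0:
        print(f"bounded tail, upper limit ~ {threshold - scale / shape:.1f}")
    else:
        print("unbounded tail for this sample")

    # m-observation return level: the value exceeded on average once per m draws.
    m = 1000
    p_exceed = excesses.size / series.size       # empirical exceedance rate
    return_level = threshold + stats.genpareto.ppf(
        1 - 1 / (m * p_exceed), shape, loc=0, scale=scale
    )
    print(f"{m}-observation return level ~ {return_level:.1f}")
    ```

    The sensitivity analysis mentioned in the abstract would repeat this fit over a grid of thresholds and inspect the stability of the fitted shape parameter.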

    Applications and identification of surface correlations

    We compare theoretical, experimental, and computational approaches to random rough surfaces. The aim is to produce rough surfaces with desirable correlations and to analyze the correlation functions extracted from the surface profiles. Physical applications include ultracold neutrons in a rough waveguide, lateral electronic transport, and scattering of long-wavelength particles and waves. The results provide guidance on how to deal with experimental and computational data on rough surfaces. A supplemental goal is to optimize the neutron waveguide for GRANIT experiments. The measured correlators are identified by fitting functions or by direct spectral analysis. The results are used to compare the calculated observables with theoretical values. Because of fluctuations, the fitting procedures lead to inaccurate physical results, even when the quality of the fit is very good, unless one guesses the right shape of the fitting function. Reliable extraction of the correlation function from the measured surface profile seems virtually impossible without independent information on the structure of the correlation function. Direct spectral analysis of raw data rarely works better than the use of a "wrong" fitting function. Analysis of surfaces with a large correlation radius is hindered by the presence of domains and interdomain correlations.
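    The point about fitting-function shape can be illustrated numerically: below, a profile with a known Gaussian correlator is generated by spectral filtering, its correlator is measured via the Wiener-Khinchin theorem, and both a Gaussian and an exponential trial shape are fitted. The grid size, correlation radius, and lag window are illustrative, not taken from the paper.

    ```python
    # Extract a correlation function from a simulated rough profile and fit it
    # with two different trial shapes; the "wrong" shape still fits plausibly.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)
    N, R = 4096, 10.0                       # grid points, target correlation radius

    # Generate a Gaussian-correlated random profile via spectral filtering.
    k = np.fft.fftfreq(N) * 2 * np.pi
    power = np.exp(-(k * R) ** 2 / 4)       # spectrum of the Gaussian correlator
    noise = np.fft.fft(rng.standard_normal(N))
    profile = np.real(np.fft.ifft(noise * np.sqrt(power)))
    profile /= profile.std()

    # Measured correlator via the Wiener-Khinchin theorem.
    spec = np.abs(np.fft.fft(profile)) ** 2 / N
    corr = np.real(np.fft.ifft(spec))
    lags = np.arange(40)
    corr = corr[:40] / corr[0]              # normalized correlator at short lags

    gauss = lambda x, r: np.exp(-(x / r) ** 2)
    expo = lambda x, r: np.exp(-x / r)

    r_gauss, _ = curve_fit(gauss, lags, corr, p0=[R])
    r_expo, _ = curve_fit(expo, lags, corr, p0=[R])
    print(f"Gaussian fit: R ~ {r_gauss[0]:.2f}, exponential fit: R ~ {r_expo[0]:.2f}")
    ```

    The Gaussian fit recovers a radius near the true value, while the exponential fit returns a different radius despite looking reasonable by eye, which is the failure mode the abstract warns about.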

    Rank-normalization, folding, and localization: An improved $\widehat{R}$ for assessing convergence of MCMC

    Markov chain Monte Carlo is a key computational tool in Bayesian statistics, but it can be challenging to monitor the convergence of an iterative stochastic algorithm. In this paper we show that the convergence diagnostic $\widehat{R}$ of Gelman and Rubin (1992) has serious flaws. Traditional $\widehat{R}$ will fail to correctly diagnose convergence failures when the chain has a heavy tail or when the variance varies across the chains. In this paper we propose an alternative rank-based diagnostic that fixes these problems. We also introduce a collection of quantile-based local efficiency measures, along with a practical approach for computing Monte Carlo error estimates for quantiles. We suggest that common trace plots should be replaced with rank plots from multiple chains. Finally, we give recommendations for how these methods should be used in practice. Comment: Minor revision for improved clarity
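    A compact sketch of the rank-normalized split-$\widehat{R}$ idea follows, assuming the usual normal-score transform (rank - 3/8)/(S + 1/4); this is an illustration, not the authors' reference implementation (see e.g. ArviZ for a maintained version).

    ```python
    # Rank-normalize pooled draws, then apply the classical split-R-hat.
    import numpy as np
    from scipy import stats

    def split_rhat(x):
        """Classical split-R-hat for draws x with shape (chains, iterations)."""
        n = x.shape[1] // 2
        chains = np.concatenate([x[:, :n], x[:, n:2 * n]], axis=0)  # split chains
        m, n = chains.shape
        within = chains.var(axis=1, ddof=1).mean()
        between = n * chains.mean(axis=1).var(ddof=1)
        var_hat = (n - 1) / n * within + between / n
        return np.sqrt(var_hat / within)

    def rank_normalized_rhat(x):
        """Rank all pooled draws, map ranks to normal scores, then split-R-hat."""
        z = stats.norm.ppf((stats.rankdata(x) - 3 / 8) / (x.size + 1 / 4))
        return split_rhat(z.reshape(x.shape))

    rng = np.random.default_rng(2)
    good = rng.standard_normal((4, 1000))             # well-mixed chains
    bad = good + np.array([[0.], [0.], [0.], [3.]])   # one chain stuck elsewhere
    print(rank_normalized_rhat(good))
    print(rank_normalized_rhat(bad))
    ```

    Because the diagnostic is computed on ranks rather than raw draws, it remains meaningful for heavy-tailed chains whose variance is infinite, which is where the traditional statistic fails.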

    Spreading of waves in nonlinear disordered media

    We analyze mechanisms and regimes of wave packet spreading in nonlinear disordered media. We predict that wave packets can spread in two regimes of strong and weak chaos. We discuss resonance probabilities and nonlinear diffusion equations, and predict a dynamical crossover from strong to weak chaos. The crossover is controlled by the ratio of the nonlinear frequency shifts to the average eigenvalue spacing of eigenstates of the linear equations within one localization volume. We consider generalized models in higher lattice dimensions and obtain critical values for the nonlinearity power, the dimension, and the norm density, which influence the possible dynamical outcomes in a qualitative way. Comment: 24 pages, 3 figures. arXiv admin note: text overlap with arXiv:0901.441
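    The stated crossover criterion reduces to a back-of-the-envelope comparison, sketched below. All numbers (nonlinearity strength beta, norm density n, spectrum width W, localization volume V) are illustrative assumptions, not values from the paper.

    ```python
    # Strong chaos is expected when the typical nonlinear frequency shift
    # exceeds the average eigenvalue spacing within one localization volume.
    def regime(beta, n, W, V):
        """Classify the expected spreading regime of the disordered lattice.

        beta: nonlinearity strength, n: norm density,
        W: width of the eigenvalue spectrum, V: localization volume.
        """
        shift = beta * n        # typical nonlinear frequency shift
        spacing = W / V         # average eigenvalue spacing in one volume
        return "strong chaos" if shift > spacing else "weak chaos"

    print(regime(beta=1.0, n=0.1, W=4.0, V=10))   # shift 0.1 < spacing 0.4
    print(regime(beta=1.0, n=0.5, W=4.0, V=10))   # shift 0.5 > spacing 0.4
    ```

    As the packet spreads, the norm density decays, so a packet launched in the strong-chaos regime eventually crosses over into weak chaos, which is the dynamical crossover the abstract predicts.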

    Non-Gaussian Geostatistical Modeling using (skew) t Processes

    We propose a new model for regression and dependence analysis when addressing spatial data with possibly heavy tails and an asymmetric marginal distribution. We first propose a stationary process with t marginals obtained through scale mixing of a Gaussian process with an inverse square root process with Gamma marginals. We then generalize this construction by considering a skew-Gaussian process, thus obtaining a process with skew-t marginal distributions. For the proposed (skew) t process we study the second-order and geometrical properties, and in the t case we provide analytic expressions for the bivariate distribution. In an extensive simulation study, we investigate the use of the weighted pairwise likelihood as a method of estimation for the t process. Moreover, we compare the performance of the optimal linear predictor of the t process with that of the optimal Gaussian predictor. Finally, the effectiveness of our methodology is illustrated by analyzing a georeferenced dataset of maximum temperatures in Australia.
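    The marginal mechanism behind the construction can be checked in a few lines: dividing a Gaussian variable by the square root of an independent Gamma(nu/2, nu/2) variable yields Student-t marginals with nu degrees of freedom. This sketch illustrates the marginal scale mixture only, not the full spatial (skew) t process.

    ```python
    # Monte Carlo check that the Gaussian / sqrt(Gamma) scale mixture has
    # Student-t marginals. nu = 5 is an illustrative choice.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    nu = 5.0                                   # degrees of freedom
    z = rng.standard_normal(100_000)           # Gaussian marginals
    g = rng.gamma(shape=nu / 2, scale=2 / nu, size=100_000)  # Gamma(nu/2, nu/2)
    x = z / np.sqrt(g)                         # scale mixture -> t_nu marginals

    # Compare empirical quantiles with the exact t distribution.
    qs = [0.05, 0.25, 0.5, 0.75, 0.95]
    emp = np.quantile(x, qs)
    exact = stats.t.ppf(qs, df=nu)
    print(np.round(emp, 2))
    print(np.round(exact, 2))
    ```

    In the spatial model the Gamma mixing variable is itself a correlated process rather than an independent draw per site, which is what makes the resulting t process a valid geostatistical model rather than just a marginal transformation.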

    A continuous time random walk model for financial distributions

    We apply the formalism of the continuous time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known: the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the US dollar/Deutsche Mark futures exchange, finding good agreement between theory and the observed data. Comment: 14 pages, 5 figures, revtex4, submitted for publication
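    The two-density structure of the formalism is easy to simulate. In the sketch below the pausing-time density is exponential and the jump-magnitude density is Laplace; these are illustrative stand-ins, not the densities fitted to the exchange data in the paper.

    ```python
    # Minimal continuous time random walk: a price change X(t) accumulates
    # jumps drawn from one density at times drawn from another.
    import numpy as np

    rng = np.random.default_rng(4)

    def ctrw_sample(t, mean_wait=1.0, jump_scale=0.5, n_paths=20_000):
        """Sample X(t): total jump accumulated by jumps arriving before time t."""
        out = np.empty(n_paths)
        for i in range(n_paths):
            clock, x = 0.0, 0.0
            while True:
                clock += rng.exponential(mean_wait)   # pausing-time density
                if clock > t:
                    break
                x += rng.laplace(scale=jump_scale)    # jump-magnitude density
            out[i] = x
        return out

    # With exponential waiting times the jump count is Poisson(t / mean_wait),
    # so Var[X(t)] = (t / mean_wait) * Var[jump] grows linearly in t.
    for t in (1.0, 2.0, 4.0):
        x = ctrw_sample(t)
        print(f"t={t}: var ~ {x.var():.2f}")
    ```

    Heavier-tailed choices for either density change the scaling of the price distribution, which is exactly the freedom the formalism exploits when matching empirical data.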