1,503,269 research outputs found

    Logarithmic Quantile Estimation for Rank Statistics

    We prove an almost sure weak limit theorem for simple linear rank statistics for samples with continuous distribution functions. As corollaries, the result extends to samples with ties, and to a vector version of an a.s. central limit theorem for vectors of linear rank statistics. Moreover, we derive such a weak convergence result for some quadratic forms. These results are then applied to quantile estimation and to hypothesis testing for nonparametric statistical designs, demonstrated here by the c-sample problem, where the samples may be dependent. In general, the method is known to be comparable to the bootstrap and other nonparametric methods (\cite{THA, FRI}), and we confirm this finding for the c-sample problem.
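
A minimal sketch of the kind of statistic the theorem covers: a simple linear rank statistic (the Wilcoxon rank-sum score) with ties handled by midranks. This only illustrates the object of study; the paper's logarithmic quantile estimation itself is not implemented here, and the simple variance below omits the tie correction.

```python
import numpy as np

def midranks(x):
    # Assign midranks: tied observations share the average of the ranks
    # they would occupy, the standard treatment of ties for rank statistics.
    x = np.asarray(x, dtype=float)
    order = np.argsort(x, kind="stable")
    sx = x[order]
    ranks = np.empty(len(x))
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and sx[j + 1] == sx[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2.0 + 1.0  # average of ranks i+1 .. j+1
        i = j + 1
    return ranks

def wilcoxon_rank_sum(x, y):
    # Standardized rank-sum statistic of sample x within the pooled sample
    # (ties via midranks; variance without the tie correction).
    pooled = np.concatenate([x, y])
    r = midranks(pooled)
    n1, n2 = len(x), len(y)
    n = n1 + n2
    w = r[:n1].sum()
    mean = n1 * (n + 1) / 2.0
    var = n1 * n2 * (n + 1) / 12.0
    return (w - mean) / np.sqrt(var)
```

For identical samples the standardized statistic is zero; well-separated samples give a large value.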

    Asymptotics and optimal bandwidth selection for highest density region estimation

    We study kernel estimation of highest-density regions (HDR). Our main contributions are two-fold. First, we derive a uniform-in-bandwidth asymptotic approximation to a risk that is appropriate for HDR estimation. This approximation is then used to derive a bandwidth selection rule for HDR estimation possessing attractive asymptotic properties. We also present the results of numerical studies that illustrate the benefits of our theory and methodology. Comment: Published at http://dx.doi.org/10.1214/09-AOS766 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
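
To make the object concrete, here is a common plug-in construction of a kernel HDR estimate: threshold the estimated density at the alpha-quantile of its values at the sample points. The bandwidth below is an ad hoc choice for a toy example, not the selection rule derived in the paper.

```python
import numpy as np

def gaussian_kde(x, data, h):
    # Evaluate a Gaussian kernel density estimate with bandwidth h at points x.
    z = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def hdr_threshold(data, h, alpha):
    # Plug-in HDR threshold: the alpha-quantile of the estimated density
    # at the sample points; the region {x : fhat(x) >= c} then has
    # estimated coverage roughly 1 - alpha.
    return np.quantile(gaussian_kde(data, data, h), alpha)

rng = np.random.default_rng(0)
data = rng.normal(size=1000)   # toy data; h below is ad hoc, not the paper's rule
h, alpha = 0.3, 0.1
c = hdr_threshold(data, h, alpha)
grid = np.linspace(-4.0, 4.0, 801)
inside = grid[gaussian_kde(grid, data, h) >= c]
# For N(0,1) the true 90% HDR is approximately [-1.645, 1.645]
```

The recovered interval should roughly match the known 90% HDR of the standard normal.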

    Spectral estimation of the fractional order of a Lévy process

    We consider the problem of estimating the fractional order of a Lévy process from low-frequency historical and options data. An estimation methodology is developed which allows us to treat both the estimation and calibration problems in a unified way. The corresponding procedure consists of two steps: the estimation of a conditional characteristic function, and the weighted least squares estimation of the fractional order in the spectral domain. While the second step is identical for both calibration and estimation, the first one depends on the problem at hand. Minimax rates of convergence for the fractional order estimate are derived, asymptotic normality is proved, and a data-driven algorithm based on aggregation is proposed. The performance of the estimator in both the estimation and calibration setups is illustrated by a simulation study. Comment: Published at http://dx.doi.org/10.1214/09-AOS715 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
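
The fractional order alpha governs the decay of the characteristic function, log|phi(u)| ~ -c|u|^alpha. A toy version of the spectral-domain idea: estimate phi empirically from increments and fit alpha by least squares on log(-log|phi(u)|) versus log u. Here the increments are Cauchy (alpha = 1); the paper's weighted least squares procedure on low-frequency and options data is far more involved.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_cauchy(100_000)   # increments of a Cauchy process, alpha = 1
u = np.linspace(0.5, 3.0, 20)      # frequencies for the spectral fit

# Empirical characteristic function |phi_hat(u)| = |mean exp(i u X)|,
# then a log-log regression: log(-log|phi|) = alpha * log u + log c.
phi = np.abs(np.exp(1j * np.outer(u, x)).mean(axis=1))
alpha_hat, log_c = np.polyfit(np.log(u), np.log(-np.log(phi)), 1)
```

For the Cauchy case -log|phi(u)| = u exactly, so the fitted slope should be close to 1.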

    Consistency of maximum likelihood estimation for some dynamical systems

    We consider the asymptotic consistency of maximum likelihood parameter estimation for dynamical systems observed with noise. Under suitable conditions on the dynamical systems and the observations, we show that maximum likelihood parameter estimation is consistent. Our proof involves ideas from both information theory and dynamical systems. Furthermore, we show how some well-studied properties of dynamical systems imply the general statistical properties related to maximum likelihood estimation. Finally, we exhibit classical families of dynamical systems for which maximum likelihood estimation is consistent. Examples include shifts of finite type with Gibbs measures and Axiom A attractors with SRB measures. Comment: Published at http://dx.doi.org/10.1214/14-AOS1259 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    On the Robustness of the Bayes and Wiener Estimators under Model Uncertainty

    This paper deals with the robust estimation problem of a signal given noisy observations. We assume that the actual statistics of the signal and observations belong to a ball about the nominal statistics, formed by placing a bound on the Tau-divergence family between the actual and the nominal statistics. The robust estimator is then obtained by minimizing the mean square error according to the least favorable statistics in that ball, yielding a divergence-family-based minimax approach to robust estimation. We show that, in the case where the signal and observations have no dynamics, the Bayes estimator is the optimal solution. Moreover, in the dynamic case, the optimal offline estimator is the noncausal Wiener filter.
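
For reference, the nominal (non-robust) noncausal Wiener smoother for y = x + v with independent signal and noise is H(f) = S_x(f) / (S_x(f) + S_v(f)) in the frequency domain. The sketch below uses toy spectra (the signal spectrum is even taken from the realization itself) and illustrates only this nominal filter, not the paper's least-favorable minimax construction.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4096
x = np.convolve(rng.normal(size=n), np.ones(20) / 20.0, mode="same")  # smooth signal
v = 0.5 * rng.normal(size=n)                                          # white noise
y = x + v

Sx = np.abs(np.fft.fft(x))**2 / n   # signal periodogram (oracle, toy choice)
Sv = np.full(n, 0.25)               # known white-noise spectral density
H = Sx / (Sx + Sv)                  # noncausal Wiener smoother
xhat = np.real(np.fft.ifft(H * np.fft.fft(y)))

mse_raw = np.mean((y - x)**2)        # error of the raw observations
mse_filtered = np.mean((xhat - x)**2)
```

The filtered estimate should have a much smaller mean square error than the raw observations, since the signal energy is concentrated at low frequencies where H is close to 1.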

    Estimation of sums of random variables: Examples and information bounds

    This paper concerns the estimation of sums of functions of observable and unobservable variables. Lower bounds for the asymptotic variance and a convolution theorem are derived in general finite- and infinite-dimensional models. An explicit relationship is established between efficient influence functions for the estimation of sums of variables and the estimation of their means. Certain ``plug-in'' estimators are proved to be asymptotically efficient in finite-dimensional models, while the ``u,v'' estimators of Robbins are proved to be efficient in infinite-dimensional mixture models. Examples include certain species, network and data confidentiality problems. Comment: Published at http://dx.doi.org/10.1214/009053605000000390 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Non-linear regression models for Approximate Bayesian Computation

    Approximate Bayesian inference on the basis of summary statistics is well suited to complex problems for which the likelihood is either mathematically or computationally intractable. However, methods that use rejection suffer from the curse of dimensionality as the number of summary statistics increases. Here we propose a machine-learning approach to the estimation of the posterior density that introduces two innovations. The new method fits a nonlinear conditional heteroscedastic regression of the parameter on the summary statistics, and then adaptively improves the estimate using importance sampling. The new algorithm is compared to state-of-the-art approximate Bayesian methods, and achieves a considerable reduction of the computational burden in two examples of inference, in statistical genetics and in a queueing model. Comment: 4 figures; version 3, minor changes; to appear in Statistics and Computing.
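
A minimal ABC sketch of the pipeline being improved: rejection on a summary statistic followed by a *linear* regression adjustment of the accepted draws (the classical version of the conditional-regression idea; the paper replaces the linear fit with a nonlinear heteroscedastic one and adds importance sampling). The toy model, summary statistic, and all numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: n i.i.d. N(theta, 1) observations; summary = sample mean.
def summary(theta, n=50):
    return rng.normal(theta, 1.0, size=n).mean()

s_obs = 0.8                                    # "observed" summary (simulated setup)
theta_prior = rng.uniform(-5.0, 5.0, 20_000)   # draws from a flat prior
s_sim = np.array([summary(t) for t in theta_prior])

# Rejection step: keep the 2% of draws whose summaries are closest to s_obs.
d = np.abs(s_sim - s_obs)
keep = d <= np.quantile(d, 0.02)
theta_acc, s_acc = theta_prior[keep], s_sim[keep]

# Regression adjustment: fit theta ~ a + b * (s - s_obs) on accepted draws
# and project each accepted theta to s = s_obs.
X = np.column_stack([np.ones_like(s_acc), s_acc - s_obs])
coef, *_ = np.linalg.lstsq(X, theta_acc, rcond=None)
theta_adj = theta_acc - coef[1] * (s_acc - s_obs)
# theta_adj approximates draws from the posterior given s_obs
```

With a flat prior and a Gaussian likelihood, the adjusted draws should center near s_obs, the true posterior mean for this toy model.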

    Asymptotic normality of maximum likelihood and its variational approximation for stochastic blockmodels

    Variational methods for parameter estimation are an active research area, potentially offering computationally tractable heuristics with theoretical performance bounds. We build on recent work that applies such methods to network data, and establish asymptotic normality rates for parameter estimates of stochastic blockmodel data obtained by either maximum likelihood or variational estimation. The result also applies to various sub-models of the stochastic blockmodel found in the literature. Comment: Published at http://dx.doi.org/10.1214/13-AOS1124 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
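
A complete-data sketch of the parameter being estimated: with the community labels treated as known, the maximum likelihood estimate of the blockmodel's connectivity matrix is just the observed edge density between each pair of blocks. The paper studies the harder problem where labels are estimated jointly, by maximum likelihood or variationally; the simulation below only checks the complete-data step.

```python
import numpy as np

def sbm_connectivity_mle(A, z, K):
    # A: symmetric 0/1 adjacency matrix (no self-loops); z: labels in 0..K-1.
    # MLE of P[a, b] is the fraction of present edges between blocks a and b.
    P = np.zeros((K, K))
    for a in range(K):
        for b in range(K):
            rows = np.where(z == a)[0]
            cols = np.where(z == b)[0]
            # Ordered pairs; within a block the symmetric sum double-counts
            # edges, matching the m*(m-1) ordered-pair count.
            pairs = len(rows) * (len(rows) - 1) if a == b else len(rows) * len(cols)
            if pairs:
                P[a, b] = A[np.ix_(rows, cols)].sum() / pairs
    return P

rng = np.random.default_rng(4)
n, K = 300, 2
z = np.repeat([0, 1], n // 2)
P_true = np.array([[0.10, 0.02], [0.02, 0.12]])   # illustrative connectivity
upper = np.triu(rng.random((n, n)) < P_true[z][:, z], 1)
A = (upper | upper.T).astype(int)
P_hat = sbm_connectivity_mle(A, z, K)
```

With 150 nodes per block the estimated densities should sit close to the generating values.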