
    High-dimensional change point detection for mean and location parameters

    Change point inference refers to the detection of structural breaks in a sequence of observations, which may exhibit one or more distributional shifts such as changes in mean or covariance. In this dissertation, we consider the offline multiple change point problem, in which the sample size is fixed in advance or after observation. In particular, we concentrate on the high-dimensional setup, where the dimension p can be much larger than the sample size n and traditional distributional assumptions can easily fail. The goal is to employ non-parametric approaches to identify change points without intermediate estimation of the cross-sectional dependence. In the first part, we consider cumulative sum (CUSUM) statistics, which are widely used in change point inference and identification. We study two problems for high-dimensional mean vectors based on the ℓ∞-norm of the CUSUM statistics. For testing the existence of a change point in an independent sample generated from the mean-shift model, we introduce a Gaussian multiplier bootstrap to calibrate critical values of the CUSUM test statistics in high dimensions. The proposed bootstrap CUSUM test is fully data-dependent and has strong theoretical guarantees under arbitrary dependence structures and mild moment conditions. Specifically, we show that with a boundary removal parameter, the bootstrap CUSUM test is uniformly valid in size under the null and achieves the minimax separation rate under sparse alternatives when p ≫ n. Once a change point is detected, we estimate its location by maximizing the ℓ∞-norm of the generalized CUSUM statistics at two different weighting scales. The first estimator is based on the covariance-stationary CUSUM statistics, and we prove its consistency in estimating the location at the nearly parametric rate n^{-1/2} for sub-exponential observations. The second estimator is based on non-stationary CUSUM statistics, which assign less weight to the boundary data points; in this case, we show that it achieves a nearly optimal rate of convergence of order n^{-1}. In both cases the dimension affects the rate of convergence only through logarithmic factors, so consistency of the CUSUM location estimators is possible when p is much larger than n. In the presence of multiple change points, we propose a principled bootstrap-assisted binary segmentation (BABS) algorithm that dynamically adjusts the change point detection rule and recursively estimates the change point locations, and we derive its rate of convergence under suitable signal separation and strength conditions. All results are non-asymptotic, and we provide extensive simulation studies to assess the finite sample performance; the empirical evidence shows encouraging agreement with our theoretical results.
    In the second part, we analyze change point detection for high-dimensional distributions in a location family. We propose a robust, tuning-free (i.e., fully data-dependent), and easy-to-implement change point test formulated in the multivariate U-statistics framework with anti-symmetric and nonlinear kernels. It achieves robustness in a non-parametric setting in which CUSUM statistics are sensitive to outliers and heavy-tailed distributions. Specifically, the within-sample noise is cancelled by the anti-symmetry of the kernel, while the signal distortion under certain nonlinear kernels can be controlled so that the between-sample change point signal is preserved in magnitude. A (half) jackknife multiplier bootstrap (JMB), tailored to the change point detection setting, is proposed to calibrate the distribution of our ℓ∞-norm aggregated test statistic. Subject to mild moment conditions on the kernels, we derive uniform rates of convergence for the JMB approximation of the sampling distribution of the test statistic, and analyze its size and power properties. Extensions to multiple change point testing and estimation are discussed and illustrated with numerical studies.
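    As a rough illustration of the first part, the sketch below computes the ℓ∞-norm of covariance-stationary CUSUM statistics with boundary removal and calibrates the critical value by a Gaussian multiplier bootstrap applied to centered observations. The function names, weighting, and bootstrap details are simplifications for illustration only, not the dissertation's exact procedure.

        import numpy as np

        def cusum_max_stat(X, b=0.05):
            """Max over candidate times and coordinates of the weighted CUSUM statistic.

            X is an (n, p) array; b is the boundary-removal fraction (candidate change
            points within b*n of either end are skipped).
            """
            n, p = X.shape
            S = np.cumsum(X, axis=0)                        # partial sums S_t
            total = S[-1]
            ts = np.arange(int(b * n) + 1, n - int(b * n))  # boundary removal
            stats = []
            for t in ts:
                w = np.sqrt(t * (n - t) / n)                # covariance-stationary weighting
                C_t = w * (S[t - 1] / t - (total - S[t - 1]) / (n - t))
                stats.append(np.max(np.abs(C_t)))           # l_inf-norm over coordinates
            stats = np.array(stats)
            return stats.max(), ts[stats.argmax()]

        def multiplier_bootstrap_threshold(X, alpha=0.05, B=200, b=0.05, seed=None):
            """Gaussian multiplier bootstrap critical value for the CUSUM test (illustrative)."""
            rng = np.random.default_rng(seed)
            n, _ = X.shape
            Xc = X - X.mean(axis=0)                         # center to mimic the null
            boot = np.empty(B)
            for k in range(B):
                e = rng.standard_normal(n)[:, None]         # i.i.d. N(0, 1) multipliers
                boot[k], _ = cusum_max_stat(Xc * e, b=b)
            return np.quantile(boot, 1 - alpha)

        # toy example: a mean shift at t = 60 in 5 out of 200 coordinates (p > n)
        rng = np.random.default_rng(0)
        X = rng.standard_normal((120, 200))
        X[60:, :5] += 1.0
        stat, loc = cusum_max_stat(X)
        crit = multiplier_bootstrap_threshold(X, alpha=0.05, seed=1)
        print(f"CUSUM statistic {stat:.2f}, critical value {crit:.2f}, argmax at t = {loc}")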

    A framework for adaptive Monte-Carlo procedures

    Adaptive Monte Carlo methods are recent variance reduction techniques. In this work, we propose a mathematical setting which greatly relaxes the assumptions needed for the adaptive importance sampling techniques presented by Vazquez-Abad and Dufresne, Fu and Su, and Arouna. We establish the convergence and asymptotic normality of the adaptive Monte Carlo estimator under local assumptions which are easily verifiable in practice. We present one way of approximating the optimal importance sampling parameter using a randomly truncated stochastic algorithm. Finally, we apply this technique to some examples of valuation of financial derivatives.
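    As a rough sketch of the kind of procedure covered by such a framework, the code below estimates E[f(Z)] for Z ~ N(0, 1) with a Gaussian mean-shift importance sampling proposal, driving the drift parameter toward the variance-minimizing value by a Robbins-Monro stochastic approximation; the crude clipping stands in for a randomly truncated algorithm, and all names and constants are illustrative.

        import numpy as np

        def adaptive_is_estimate(f, n_iter=50_000, seed=0):
            """Adaptive importance sampling for E[f(Z)], Z ~ N(0, 1), using a
            N(theta, 1) proposal whose drift theta is learned on the fly.
            """
            rng = np.random.default_rng(seed)
            theta, est = 0.0, 0.0
            for k in range(1, n_iter + 1):
                x = rng.normal(theta, 1.0)                  # draw from the shifted proposal
                lr = np.exp(-theta * x + 0.5 * theta**2)    # likelihood ratio dN(0,1)/dN(theta,1)
                est += (f(x) * lr - est) / k                # running IS estimate of E[f(Z)]
                # unbiased gradient of the second moment of the IS estimator w.r.t. theta
                grad = (f(x) ** 2) * lr**2 * (theta - x)
                theta -= grad / (100.0 + k)                 # decreasing Robbins-Monro step size
                theta = float(np.clip(theta, -10.0, 10.0))  # crude truncation of the parameter
            return est, theta

        # example: option-style payoff f(z) = max(exp(z) - 2, 0) under a standard normal
        est, theta = adaptive_is_estimate(lambda z: np.maximum(np.exp(z) - 2.0, 0.0))
        print(f"adaptive IS estimate {est:.5f}, learned drift {theta:.3f}")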

    Robust estimation and inference for heavy tailed GARCH

    We develop two new estimators for a general class of stationary GARCH models with possibly heavy-tailed, asymmetrically distributed errors, covering processes with symmetric and asymmetric feedback such as GARCH, Asymmetric GARCH, VGARCH, and Quadratic GARCH. The first estimator arises from negligibly trimming the QML criterion equations according to error extremes. The second embeds negligibly transformed errors into QML score equations for a Method of Moments estimator. In this case, we exploit a sub-class of redescending transforms that includes tail-trimming and functions popular in the robust estimation literature, and we re-center the transformed errors to minimize small-sample bias. The negligible transforms allow both identification of the true parameter and asymptotic normality. We present a consistent estimator of the covariance matrix that permits classic inference without knowledge of the rate of convergence. A simulation study shows that both of our estimators outperform existing ones, including QML, Log-LAD, and two types of non-Gaussian QML (Laplace and Power-Law), in terms of sharpness and approximate normality. Finally, we apply the tail-trimmed QML estimator to financial data.
    Comment: Published at http://dx.doi.org/10.3150/14-BEJ616 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
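    The trimming idea can be sketched as follows: a Gaussian QML criterion for a GARCH(1,1) in which observations whose standardized residuals fall in the extreme tail quantiles are dropped from the likelihood. This is only an illustration of negligible trimming under assumed numpy/scipy tooling; the paper's estimators, transforms, and asymptotic theory are considerably more elaborate.

        import numpy as np
        from scipy.optimize import minimize

        def trimmed_qml_garch11(y, trim=0.02):
            """Trimmed Gaussian QML for y_t = sigma_t * e_t with
            sigma_t^2 = omega + alpha * y_{t-1}^2 + beta * sigma_{t-1}^2.
            """
            y = np.asarray(y, float)
            n = len(y)

            def neg_loglik(params):
                omega, alpha, beta = params
                if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
                    return np.inf                            # stay inside the stationary region
                sig2 = np.empty(n)
                sig2[0] = y.var()
                for t in range(1, n):
                    sig2[t] = omega + alpha * y[t - 1] ** 2 + beta * sig2[t - 1]
                e = y / np.sqrt(sig2)                        # standardized residuals
                lo, hi = np.quantile(e, [trim, 1 - trim])    # trim the extreme tails
                keep = (e > lo) & (e < hi)
                return 0.5 * np.sum(np.log(sig2[keep]) + y[keep] ** 2 / sig2[keep])

            return minimize(neg_loglik, x0=[0.05, 0.05, 0.8], method="Nelder-Mead").x

        # toy data: GARCH(1,1) with heavy-tailed Student-t(3) errors
        rng = np.random.default_rng(1)
        omega, alpha, beta = 0.1, 0.1, 0.8
        y, sig2 = np.empty(2000), 0.5
        for t in range(2000):
            y[t] = np.sqrt(sig2) * rng.standard_t(3)
            sig2 = omega + alpha * y[t] ** 2 + beta * sig2
        print("trimmed QML estimates (omega, alpha, beta):", trimmed_qml_garch11(y))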

    Detecting gradual changes in locally stationary processes

    In a wide range of applications, the stochastic properties of the observed time series change over time. The changes often occur gradually rather than abruptly: the properties are (approximately) constant for some time and then slowly start to change. In many cases, it is of interest to locate the time point where the properties start to vary. In contrast to the analysis of abrupt changes, methods for detecting smooth or gradual change points are less developed and often require strong parametric assumptions. In this paper, we develop a fully nonparametric method to estimate a smooth change point in a locally stationary framework. We set up a general procedure which allows us to deal with a wide variety of stochastic properties including the mean, (auto)covariances and higher moments. The theoretical part of the paper establishes the convergence rate of the new estimator. In addition, we examine its finite sample performance by means of a simulation study and illustrate the methodology by two applications to financial return data.
    Comment: Published at http://dx.doi.org/10.1214/14-AOS1297 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
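    A simplified sketch of the underlying idea (not the paper's estimator): measure the time-variation of the mean on each initial segment of the series and report the last rescaled time at which that measure is still at noise level; the threshold constant and the toy drift below are illustrative choices.

        import numpy as np

        def gradual_change_start(x, kappa=2.0):
            """Estimate the rescaled time u0 at which the mean starts to drift.

            D(t) = max_{s <= t} (s/t) * |mean(x[:s]) - mean(x[:t])| stays at noise
            level while the mean is constant on [0, t] and grows once the gradual
            change has begun; u0 is estimated as the last t/n with sqrt(t) * D(t)
            below the threshold kappa.
            """
            x = np.asarray(x, float)
            n = len(x)
            means = np.cumsum(x) / np.arange(1, n + 1)      # running means of x[:t]
            below = []
            for t in range(10, n + 1):                      # skip very short segments
                s = np.arange(1, t + 1)
                D = np.max(s / t * np.abs(means[:t] - means[t - 1]))
                if np.sqrt(t) * D <= kappa:
                    below.append(t)
            return max(below) / n if below else 0.0

        # toy example: mean constant on [0, 0.5], then drifting up linearly (no jump)
        rng = np.random.default_rng(2)
        n = 1000
        u = np.arange(n) / n
        x = np.where(u < 0.5, 0.0, 3.0 * (u - 0.5)) + rng.standard_normal(n)
        print("estimated start of the gradual change (rescaled time):", gradual_change_start(x))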