    Improved estimation of the covariance matrix of stock returns with an application to portfolio selection

    This paper proposes to estimate the covariance matrix of stock returns by an optimally weighted average of two existing estimators: the sample covariance matrix and the single-index covariance matrix. This method is generally known as shrinkage, and it is standard in decision theory and in empirical Bayesian statistics. Our shrinkage estimator can be seen as a way to account for extra-market covariance without having to specify an arbitrary multi-factor structure. For NYSE and AMEX stock returns from 1972 to 1995, it can be used to select portfolios with significantly lower out-of-sample variance than a set of existing estimators, including multi-factor models.
    Keywords: Covariance matrix estimation, factor models, portfolio selection, shrinkage
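
    To make the recipe concrete, here is a minimal NumPy sketch of shrinking the sample covariance matrix toward a single-index (market model) target. The equal-weighted market proxy and the shrinkage intensity `delta` are illustrative placeholders; the paper derives an optimal, data-driven intensity rather than taking it as given.

```python
import numpy as np

def single_index_target(returns, market):
    """Covariance matrix implied by the single-index (market) model.

    returns: (T, N) matrix of stock returns; market: (T,) market returns.
    """
    var_m = market.var(ddof=1)
    betas = np.array([np.cov(returns[:, i], market, ddof=1)[0, 1] / var_m
                      for i in range(returns.shape[1])])
    resid = returns - np.outer(market, betas)
    F = var_m * np.outer(betas, betas)
    # Diagonal: systematic variance plus idiosyncratic (residual) variance.
    F[np.diag_indices_from(F)] += resid.var(axis=0, ddof=1)
    return F

def shrunk_covariance(returns, market, delta):
    """Weighted average of the single-index target F and the sample covariance S."""
    S = np.cov(returns, rowvar=False, ddof=1)
    F = single_index_target(returns, market)
    return delta * F + (1.0 - delta) * S

# Toy usage: simulated returns, equal-weighted market proxy, placeholder delta.
rng = np.random.default_rng(0)
R = rng.normal(0.0, 0.02, size=(60, 10))  # 60 months, 10 stocks
Sigma_hat = shrunk_covariance(R, R.mean(axis=1), delta=0.5)
```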

    Optimal trading strategies - a time series approach

    Motivated by recent advances in the spectral theory of auto-covariance matrices, we revisit a reformulation of Markowitz's mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows one to find an optimal trading strategy which, for a given return, is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second-order stationary or to exhibit second-order stationary increments. Attention is paid to the consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate these effects are investigated. Finally, we apply our framework to real-world data.
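
    As a rough sketch of the time-domain formulation: estimate the Toeplitz auto-covariance matrix of the (assumed second-order stationary) price increments over p lags, then solve a constrained quadratic program for the fluctuation-minimizing weights. This simplification drops the paper's return constraint, keeping only the budget constraint, so it shows the flavor of the approach rather than the full model.

```python
import numpy as np
from scipy.linalg import toeplitz

def autocov_matrix(x, p):
    """Toeplitz auto-covariance matrix of lags 0..p-1 estimated from series x."""
    x = x - x.mean()
    n = len(x)
    acov = np.array([x[:n - k] @ x[k:] / n for k in range(p)])
    return toeplitz(acov)

def min_variance_weights(C):
    """Solve min_w w' C w subject to sum(w) = 1: w = C^{-1} 1 / (1' C^{-1} 1)."""
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)
    return w / (ones @ w)

# Synthetic price process with second-order stationary AR(1) increments.
rng = np.random.default_rng(1)
eps = rng.normal(size=5000)
incr = np.zeros_like(eps)
for t in range(1, len(eps)):
    incr[t] = 0.3 * incr[t - 1] + eps[t]
C = autocov_matrix(incr, p=20)
w = min_variance_weights(C)  # weights over 20 consecutive periods
```

    In the small-sample regime the abstract warns about, C would first be cleaned (for instance, shrunk toward its diagonal) before being inverted.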

    Covariance Estimation: The GLM and Regularization Perspectives

    Finding an unconstrained and statistically interpretable reparameterization of a covariance matrix is still an open problem in statistics. Its solution is of central importance in covariance estimation, particularly in the recent high-dimensional data environment where enforcing the positive-definiteness constraint could be computationally expensive. We provide a survey of the progress made in modeling covariance matrices from two relatively complementary perspectives: (1) generalized linear models (GLM), or parsimony and use of covariates in low dimensions, and (2) regularization, or sparsity for high-dimensional data. An emerging, unifying and powerful trend in both perspectives is that of reducing a covariance estimation problem to that of estimating a sequence of regression problems. We point out several instances of the regression-based formulation. A notable case is in sparse estimation of a precision matrix or a Gaussian graphical model, leading to the fast graphical LASSO algorithm. Some advantages and limitations of the regression-based Cholesky decomposition relative to the classical spectral (eigenvalue) and variance-correlation decompositions are highlighted. The former provides an unconstrained and statistically interpretable reparameterization, and guarantees the positive-definiteness of the estimated covariance matrix. It reduces the unintuitive task of covariance estimation to that of modeling a sequence of regressions, at the cost of imposing an a priori order among the variables. Elementwise regularization of the sample covariance matrix such as banding, tapering and thresholding has desirable asymptotic properties, and the sparse estimated covariance matrix is positive definite with probability tending to one for large samples and dimensions.
    Comment: Published at http://dx.doi.org/10.1214/11-STS358 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
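
    The regression-based formulation the survey highlights can be sketched directly: regressing each variable on its predecessors yields the modified Cholesky factors of the covariance matrix. This unpenalized version is for illustration only; the sparse variants the survey covers penalize the regression coefficients, and note the order dependence the abstract mentions.

```python
import numpy as np

def cholesky_regressions(X):
    """Modified Cholesky decomposition via a sequence of regressions.

    Regressing each variable on its predecessors gives a unit
    lower-triangular T and diagonal D with T Sigma T' = D, so the
    precision estimate T' D^{-1} T is positive definite by construction.
    The result depends on the a priori order of the variables.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    T = np.eye(p)
    d = np.empty(p)
    d[0] = Xc[:, 0].var(ddof=1)
    for j in range(1, p):
        phi, *_ = np.linalg.lstsq(Xc[:, :j], Xc[:, j], rcond=None)
        resid = Xc[:, j] - Xc[:, :j] @ phi
        T[j, :j] = -phi
        d[j] = resid.var(ddof=1)
    Omega = T.T @ np.diag(1.0 / d) @ T  # estimated precision matrix
    return T, d, Omega
```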

    Estimating High Dimensional Covariance Matrices and its Applications

    Estimating covariance matrices is an important part of portfolio selection, risk management, and asset pricing. This paper reviews recent developments in estimating high-dimensional covariance matrices, where the number of variables can be greater than the number of observations. The limitations of the sample covariance matrix are discussed. Several new approaches are presented, including the shrinkage method, the observable and latent factor method, the Bayesian approach, and the random matrix theory approach. For each method, the construction of the covariance matrix estimator is given. The relationships among these methods are discussed.
    Keywords: Factor analysis, Principal components, Singular value decomposition, Random matrix theory, Empirical Bayes, Shrinkage method, Optimal portfolios, CAPM, APT, GMM
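
    Of the approaches listed, the random matrix theory one admits a short sketch: eigenvalues of the sample correlation matrix below the Marchenko-Pastur upper edge are treated as noise and replaced by their average, preserving the trace. The unit-variance edge used here is the textbook form; practical variants fit the noise variance instead of assuming it.

```python
import numpy as np

def rmt_clipped_covariance(X):
    """Eigenvalue clipping of the sample correlation matrix via the
    Marchenko-Pastur law, for an (n, p) data matrix X."""
    n, p = X.shape
    q = p / n
    Xs = (X - X.mean(0)) / X.std(0, ddof=1)
    C = (Xs.T @ Xs) / n
    lam, V = np.linalg.eigh(C)
    edge = (1.0 + np.sqrt(q)) ** 2        # MP upper edge for unit variance
    noise = lam < edge
    lam_clean = lam.copy()
    if noise.any():
        # Replace noise eigenvalues by their average to preserve the trace.
        lam_clean[noise] = lam[noise].mean()
    return (V * lam_clean) @ V.T

# Toy usage in the p comparable to n regime the paper addresses.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 100))
C_clean = rmt_clipped_covariance(X)
```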

    The generalized shrinkage estimator for the analysis of functional connectivity of brain signals

    We develop a new statistical method for estimating functional connectivity between neurophysiological signals represented by a multivariate time series. We use partial coherence as the measure of functional connectivity. Partial coherence identifies the frequency bands that drive the direct linear association between any pair of channels. To estimate partial coherence, one first needs an estimate of the spectral density matrix of the multivariate time series. Parametric estimators of the spectral density matrix provide good frequency resolution but can be sensitive to misspecification of the parametric model. Smoothing-based nonparametric estimators are robust to model misspecification and are consistent but may have poor frequency resolution. In this work, we develop the generalized shrinkage estimator, which is a weighted average of a parametric estimator and a nonparametric estimator. The optimal weights are frequency-specific and derived under the quadratic risk criterion, so that whichever estimator performs better at a particular frequency receives the heavier weight. We validate the proposed estimator in a simulation study and apply it to electroencephalogram recordings from a visual-motor experiment.
    Comment: Published at http://dx.doi.org/10.1214/10-AOAS396 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
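
    A minimal sketch of the two steps involved, with the frequency-specific weights alpha taken as given (the paper derives them under the quadratic risk criterion): average the two spectral matrix estimates frequency by frequency, then read partial coherence off the inverse of the result.

```python
import numpy as np

def combine_spectra(f_param, f_nonpar, alpha):
    """Frequency-specific weighted average of two spectral matrix estimates.

    f_param, f_nonpar: (n_freq, p, p) complex arrays; alpha: (n_freq,) weights
    in [0, 1], assumed already chosen (the paper optimizes them).
    """
    a = alpha[:, None, None]
    return a * f_param + (1.0 - a) * f_nonpar

def partial_coherence(f):
    """Partial coherence at each frequency from a spectral matrix estimate.

    With g = f^{-1}, the partial coherence between channels i and j is
    |g_ij|^2 / (g_ii g_jj).
    """
    g = np.linalg.inv(f)                              # batched over frequencies
    d = np.abs(np.diagonal(g, axis1=1, axis2=2))      # g_ii, real and positive
    pc = np.abs(g) ** 2 / (d[:, :, None] * d[:, None, :])
    idx = np.arange(f.shape[-1])
    pc[:, idx, idx] = 0.0                             # zero diagonal for readability
    return pc

# Toy check on random Hermitian positive-definite spectral matrices.
rng = np.random.default_rng(3)
A = rng.normal(size=(8, 3, 3)) + 1j * rng.normal(size=(8, 3, 3))
f_np_est = A @ A.conj().transpose(0, 2, 1) + 3 * np.eye(3)
f_par_est = f_np_est + 0.1 * np.eye(3)  # stand-in "parametric" estimate
alpha = rng.uniform(size=8)
pc = partial_coherence(combine_spectra(f_par_est, f_np_est, alpha))
```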