Common Principal Components for Dependent Random Vectors

Abstract

Let the kp-variate random vector X be partitioned into k subvectors Xᵢ of dimension p each, and let the covariance matrix Ψ of X be partitioned analogously into submatrices Ψᵢⱼ. The common principal component (CPC) model for dependent random vectors assumes the existence of an orthogonal p × p matrix β such that βᵀΨᵢⱼβ is diagonal for all (i, j). After a formal definition of the model, normal theory maximum likelihood estimators are obtained. The asymptotic theory for the estimated orthogonal matrix is derived by a new technique of choosing proper subsets of functionally independent parameters.

Keywords: asymptotic distribution, eigenvalue, eigenvector, entropy, maximum likelihood estimation, multivariate normal distribution, patterned covariance matrices
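To make the model structure concrete, the following is a minimal NumPy sketch that constructs a covariance matrix Ψ satisfying the CPC property and verifies that a single orthogonal matrix β diagonalizes every block Ψᵢⱼ. It illustrates only the definition of the model, not the paper's maximum likelihood estimation or asymptotic theory; the dimensions k, p and all variable names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    k, p = 3, 4  # k subvectors of dimension p each (illustrative sizes)

    # Random orthogonal p x p matrix beta (QR factor of a Gaussian matrix).
    beta, _ = np.linalg.qr(rng.standard_normal((p, p)))

    # For each common component r, draw a k x k PSD matrix M_r whose (i, j)
    # entry becomes the r-th diagonal element of Lambda_ij.  This choice makes
    # the assembled kp x kp matrix Psi a valid (symmetric PSD) covariance.
    M = np.empty((p, k, k))
    for r in range(p):
        A = rng.standard_normal((k, k))
        M[r] = A @ A.T

    # Assemble Psi block-wise: Psi_ij = beta @ Lambda_ij @ beta.T with
    # Lambda_ij diagonal, which is exactly the CPC structure.
    Psi = np.empty((k * p, k * p))
    for i in range(k):
        for j in range(k):
            Lam = np.diag(M[:, i, j])
            Psi[i*p:(i+1)*p, j*p:(j+1)*p] = beta @ Lam @ beta.T

    # Verify the defining property: beta.T @ Psi_ij @ beta is diagonal
    # for every pair (i, j).
    for i in range(k):
        for j in range(k):
            D = beta.T @ Psi[i*p:(i+1)*p, j*p:(j+1)*p] @ beta
            off = D - np.diag(np.diag(D))
            assert np.allclose(off, 0.0, atol=1e-10)

    print("all k*k blocks are jointly diagonalized by beta")

Since βᵀΨᵢⱼβ = Λᵢⱼ by construction, every off-diagonal residual vanishes; fitting β from data, as the paper does, would instead require joint diagonalization via maximum likelihood.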
