
    Inference for covariate adjusted regression via varying coefficient models

    We consider covariate adjusted regression (CAR), a regression method for situations where predictors and response are observed after being distorted by a multiplicative factor. The distorting factors are unknown functions of an observable covariate, where one specific distorting function is associated with each predictor or response. The dependence of both response and predictors on the same confounding covariate may alter the underlying regression relation between undistorted but unobserved predictors and response. We consider a class of highly flexible adjustment methods for parameter estimation in the underlying regression model, which is the model of interest. Asymptotic normality of the estimates is obtained by establishing a connection to varying coefficient models. These distribution results, combined with proposed consistent estimates of the asymptotic variance, are used for the construction of asymptotic confidence intervals for the regression coefficients. The proposed approach is illustrated with data on serum creatinine, and finite sample properties of the proposed procedures are investigated through a simulation study.
    Comment: Published at http://dx.doi.org/10.1214/009053606000000083 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
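    As a rough schematic of the distortion setup described above (the notation here is illustrative and may not match the paper's), the unobserved predictors $X_1,\dots,X_p$ and response $Y$ follow the regression model of interest, while only multiplicatively distorted versions are observed:
    $$Y = \gamma_0 + \sum_{r=1}^{p} \gamma_r X_r + e, \qquad \tilde{Y} = \psi(U)\,Y, \qquad \tilde{X}_r = \phi_r(U)\,X_r, \quad r = 1,\dots,p,$$
    where $U$ is the observable confounding covariate and $\psi, \phi_1, \dots, \phi_p$ are unknown smooth distorting functions, commonly normalized so that $E[\psi(U)] = E[\phi_r(U)] = 1$ for identifiability.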

    Principal Component Analysis for Functional Data on Riemannian Manifolds and Spheres

    Functional data analysis on nonlinear manifolds has drawn recent interest. Sphere-valued functional data, which are encountered for example as movement trajectories on the surface of the earth, are an important special case. We consider an intrinsic principal component analysis for smooth Riemannian manifold-valued functional data and study its asymptotic properties. Riemannian functional principal component analysis (RFPCA) is carried out by first mapping the manifold-valued data through Riemannian logarithm maps to tangent spaces around the time-varying Fréchet mean function, and then performing a classical multivariate functional principal component analysis on the linear tangent spaces. Representations of the Riemannian manifold-valued functions and the eigenfunctions on the original manifold are then obtained with exponential maps. The tangent-space approximation through functional principal component analysis is shown to be well-behaved in terms of controlling the residual variation if the Riemannian manifold has nonnegative curvature. Specifically, we derive a central limit theorem for the mean function, as well as root-$n$ uniform convergence rates for other model components, including the covariance function, eigenfunctions, and functional principal component scores. Our applications include a novel framework for the analysis of longitudinal compositional data, achieved by mapping longitudinal compositional data to trajectories on the sphere, illustrated with longitudinal fruit fly behavior patterns. RFPCA is shown to be superior in terms of trajectory recovery in comparison to an unrestricted functional principal component analysis in applications and simulations and is also found to produce principal component scores that are better predictors for classification compared to traditional functional principal component scores.
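    As a minimal sketch of the RFPCA pipeline outlined above, specialized to sphere-valued curves on the unit sphere S^2 (the function names, the plain SVD step, and the gradient iteration for the Fréchet mean are illustrative choices, not the authors' implementation):

    import numpy as np

    def log_map(p, x):
        # Riemannian log map on the unit sphere: tangent vector at p pointing toward x.
        cos_t = np.clip(np.dot(p, x), -1.0, 1.0)
        theta = np.arccos(cos_t)
        v = x - cos_t * p
        nv = np.linalg.norm(v)
        return np.zeros_like(p) if nv < 1e-12 else theta * v / nv

    def exp_map(p, v):
        # Riemannian exponential map on the unit sphere.
        nv = np.linalg.norm(v)
        return p.copy() if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * v / nv

    def frechet_mean(points, n_iter=100, tol=1e-9):
        # Intrinsic (Frechet) mean of points on the sphere via simple gradient iterations.
        mu = points.mean(axis=0)
        mu /= np.linalg.norm(mu)
        for _ in range(n_iter):
            grad = np.mean([log_map(mu, x) for x in points], axis=0)
            if np.linalg.norm(grad) < tol:
                break
            mu = exp_map(mu, grad)
        return mu

    def rfpca(curves, n_components=2):
        # curves: array of shape (n, T, 3) holding unit vectors on a common time grid.
        n, T, d = curves.shape
        # Time-varying Frechet mean function.
        mu = np.stack([frechet_mean(curves[:, t]) for t in range(T)])
        # Map each curve into the tangent spaces along the mean function via log maps.
        V = np.stack([[log_map(mu[t], curves[i, t]) for t in range(T)] for i in range(n)])
        # Classical multivariate FPCA on the flattened tangent-space data (via SVD).
        X = V.reshape(n, T * d)
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        comps = Vt[:n_components].reshape(n_components, T, d)
        scores = Xc @ Vt[:n_components].T
        return mu, comps, scores

    def reconstruct(mu, comps, score):
        # Map a truncated tangent-space representation back onto the sphere via exp maps.
        tangent = np.tensordot(score, comps, axes=1)   # shape (T, 3)
        return np.stack([exp_map(mu[t], tangent[t]) for t in range(len(mu))])

    Comparing curves reconstructed from a few leading components with the original trajectories corresponds to the trajectory-recovery comparison mentioned in the abstract.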