
    Spectral Density Bandwidth Choice: Source of Nonmonotonic Power for Tests of a Mean Shift in a Time Series

    Data-dependent bandwidth choices for zero-frequency spectral density estimators of a time series are shown to be an important source of nonmonotonic power when testing for a shift in mean. It is shown that if the spectral density is estimated under the null hypothesis of a stable mean using a data-dependent bandwidth (with or without prewhitening), nonmonotonic power appears naturally for some popular tests, including the CUSUM test. On the other hand, under some fixed bandwidth choices, power is monotonic. Empirical examples and simulations illustrate these power properties. Theoretical explanations for the power results are provided.
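    The mechanism is straightforward to reproduce. Below is a minimal R sketch (illustrative, not the paper's code) of a CUSUM statistic whose long-run variance is estimated under the null with a Bartlett kernel, comparing a fixed bandwidth rule against a data-dependent AR(1) plug-in rule in the spirit of Andrews (1991); all function names are our own.

    ```r
    # Bartlett-kernel long-run variance at a given bandwidth
    lrv_bartlett <- function(x, bw) {
      n <- length(x)
      bw <- min(bw, n - 1)
      x <- x - mean(x)                          # demean under the null of a stable mean
      g <- sapply(0:bw, function(k) sum(x[1:(n - k)] * x[(1 + k):n]) / n)
      g[1] + 2 * sum((1 - (1:bw) / (bw + 1)) * g[-1])
    }

    # data-dependent bandwidth: AR(1) plug-in rule for the Bartlett kernel
    bw_andrews_ar1 <- function(x) {
      rho <- cor(x[-1], x[-length(x)])
      ceiling(1.1447 * (4 * rho^2 * length(x) / (1 - rho^2)^2)^(1/3))
    }

    # CUSUM statistic: sup_k |S_k| / (sigma_hat * sqrt(n))
    cusum_stat <- function(x, bw) {
      s <- cumsum(x - mean(x))
      max(abs(s)) / sqrt(length(x) * lrv_bartlett(x, bw))
    }

    set.seed(1)
    x <- c(rnorm(100), rnorm(100, mean = 2))    # large mean shift at t = 100
    cusum_stat(x, bw = floor(length(x)^(1/3)))  # fixed bandwidth: large statistic
    cusum_stat(x, bw = bw_andrews_ar1(x))       # data-dependent: the unmodeled shift
                                                # inflates rho, hence the bandwidth and
                                                # the variance estimate, shrinking the
                                                # statistic as the shift grows
    ```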

    Longitudinal Scalar-on-Function Regression with Application to Tractography Data

    We propose a class of estimation techniques for scalar-on-function regression in longitudinal studies where both outcomes, such as test results on motor functions, and functional predictors, such as brain images, may be observed at multiple visits. Our methods are motivated by a longitudinal brain diffusion tensor imaging (DTI) tractography study. One of the primary goals of the study is to evaluate the contemporaneous association between human function and brain imaging over time. The complexity of the study requires the development of methods that can simultaneously incorporate: (1) multiple functional (and scalar) regressors; (2) longitudinal measurements of both outcomes and functional predictors for each patient; (3) Gaussian or non-Gaussian outcomes; and (4) missing values within functional predictors. We review existing approaches designed to handle such data and discuss their limitations. We propose two versions of a new method, longitudinal functional principal components regression. These methods extend the well-known functional principal component regression approach and allow for different effects of subject-specific trends in curves and of visit-specific deviations from that trend. The different methods are compared in simulation studies, and the most promising approaches are used for analyzing the tractography data.
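    As a rough illustration of the general strategy (not the authors' implementation), one can extract functional principal component scores from the functional predictors and feed them into a longitudinal mixed model; in the R sketch below, W, y, and subject are hypothetical inputs, and the subject-level random intercept is a crude stand-in for the paper's separation of subject-specific trends from visit-specific deviations.

    ```r
    library(lme4)

    # W: (total visits) x (grid length) matrix of functional predictors;
    # y: scalar outcome per visit; subject: subject id per visit
    fpca_scores <- function(W, npc = 3) {
      Wc <- sweep(W, 2, colMeans(W))   # center each grid point across visits
      sv <- svd(Wc, nu = 0, nv = npc)
      Wc %*% sv$v                      # principal component scores
    }

    fit_lfpcr <- function(y, W, subject, npc = 3) {
      S <- fpca_scores(W, npc)
      colnames(S) <- paste0("pc", seq_len(npc))
      dat <- data.frame(y = y, subject = subject, S)
      # scores enter as fixed effects; repeated visits per subject are
      # absorbed by a random intercept
      f <- reformulate(c(colnames(S), "(1 | subject)"), response = "y")
      lmer(f, data = dat)
    }
    ```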

    Fast Covariance Estimation for High-dimensional Functional Data

    For smoothing covariance functions, we propose two fast algorithms that scale linearly with the number of observations per function. Most available methods and software cannot smooth covariance matrices of dimension J x J with J > 500; the recently introduced sandwich smoother is an exception, but it is not adapted to covariance matrices of large dimensions such as J >= 10,000. Covariance matrices of order J = 10,000, and even J = 100,000, are becoming increasingly common, e.g., in 2- and 3-dimensional medical imaging and high-density wearable sensor data. We introduce two new algorithms that can handle very large covariance matrices: (1) FACE, a fast implementation of the sandwich smoother, and (2) SVDS, a two-step procedure that first applies singular value decomposition to the data matrix and then smooths the eigenvectors. Compared to existing techniques, these new algorithms are at least an order of magnitude faster in high dimensions and drastically reduce memory requirements. The new algorithms provide nearly instantaneous (a few seconds) smoothing for matrices of dimension J = 10,000 and very fast (under 10 minutes) smoothing for J = 100,000. Although SVDS is simpler than FACE, we provide ready-to-use, scalable R software for FACE. When incorporated into the R package refund, FACE improves the speed of penalized functional regression by an order of magnitude, even for data of normal size (J < 500). We recommend that FACE be used in practice for the analysis of noisy and high-dimensional functional data.
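    A minimal sketch of the SVDS idea, assuming an n x J matrix Y of noisy curves sampled on a common grid (an illustration of the two-step logic, not the released FACE/refund code): take a thin SVD of the centered data and smooth only the leading right singular vectors, so the cost stays linear in J and the J x J covariance matrix is never formed explicitly.

    ```r
    svds_cov <- function(Y, npc = 10) {
      Yc <- sweep(Y, 2, colMeans(Y))            # center each grid point
      sv <- svd(Yc, nu = 0, nv = npc)           # step 1: thin SVD of the data matrix
      V  <- apply(sv$v, 2, function(v)          # step 2: smooth each eigenvector
              smooth.spline(v)$y)               # (re-orthogonalization omitted)
      ev <- sv$d[1:npc]^2 / (nrow(Y) - 1)       # eigenvalues of the sample covariance
      # the smoothed covariance is represented in low rank as V diag(ev) V'
      list(evectors = V, evalues = ev)
    }
    ```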

    Bayesian Functional Data Analysis Using WinBUGS

    We provide user-friendly software for Bayesian analysis of functional data models using WinBUGS 1.4. The excellent properties of Bayesian analysis in this context are due to: (1) dimensionality reduction, which leads to low-dimensional projection bases; (2) the mixed model representation of functional models, which provides a modular approach to model extension; and (3) the orthogonality of the principal component bases, which contributes to excellent chain convergence and mixing properties. Our paper provides one more, essential, reason for using Bayesian analysis for functional models: the existence of software.

    Bayesian Analysis for Penalized Spline Regression Using WinBUGS

    Penalized splines can be viewed as BLUPs in a mixed model framework, which allows the use of mixed model software for smoothing. Thus, software originally developed for Bayesian analysis of mixed models can be used for penalized spline regression. Bayesian inference for nonparametric models enjoys the flexibility of nonparametric models and the exact inference provided by the Bayesian inferential machinery. This paper provides a simple, yet comprehensive, set of programs for the implementation of nonparametric Bayesian analysis in WinBUGS. Good mixing properties of the MCMC chains are obtained by using low-rank thin-plate splines, while simulation times per iteration are reduced by employing WinBUGS-specific computational tricks.
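    The BLUP correspondence is easy to write out directly. The R sketch below (illustrative, with the smoothing parameter held fixed) builds a truncated-line spline basis and solves the mixed-model normal equations; mixed model software, or WinBUGS as in this paper, would instead estimate lambda = sigma_eps^2 / sigma_u^2 from the data.

    ```r
    set.seed(1)
    n <- 200
    x <- sort(runif(n))
    y <- sin(2 * pi * x) + rnorm(n, sd = 0.3)

    K <- 20                                       # number of interior knots
    kappa <- quantile(unique(x), seq(0, 1, length.out = K + 2)[-c(1, K + 2)])
    X <- cbind(1, x)                              # fixed effects: intercept and slope
    Z <- outer(x, kappa, function(x, k) pmax(x - k, 0))  # truncated lines = random effects
    C <- cbind(X, Z)
    D <- diag(c(0, 0, rep(1, K)))                 # penalize only the spline coefficients

    lambda <- 10                                  # ratio sigma_eps^2 / sigma_u^2, fixed here
    coefs <- solve(crossprod(C) + lambda * D, crossprod(C, y))  # ridge = BLUP solution
    fhat <- drop(C %*% coefs)                     # the penalized spline fit
    ```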

    Proofs of theorems for the JRSS-B paper 'Likelihood ratio tests in linear mixed models with one variance component'

    Proofs of theorems for the JRSS-B paper 'Likelihood ratio tests in linear mixed models with one variance component'.

    Multilevel functional principal component analysis

    The Sleep Heart Health Study (SHHS) is a comprehensive landmark study of sleep and its impacts on health outcomes. A primary metric of the SHHS is the in-home polysomnogram, which includes two electroencephalographic (EEG) channels for each subject, at two visits. The volume and importance of these data present enormous challenges for analysis. To address these challenges, we introduce multilevel functional principal component analysis (MFPCA), a novel statistical methodology designed to extract core intra- and inter-subject geometric components of multilevel functional data. Though motivated by the SHHS, the proposed methodology is generally applicable, with potential relevance to many modern scientific studies of hierarchical or longitudinal functional outcomes. Notably, using MFPCA, we identify and quantify associations between EEG activity during sleep and adverse cardiovascular outcomes. Published in the Annals of Applied Statistics (http://dx.doi.org/10.1214/08-AOAS206) by the Institute of Mathematical Statistics (http://www.imstat.org).
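    For the balanced case of exactly two visits per subject, the core decomposition can be sketched with a method-of-moments estimator (an illustration of the idea, not the SHHS pipeline; Y1 and Y2 are hypothetical I x T matrices holding each subject's two curves): the between-subject covariance is estimated from cross-visit products, the within-subject covariance by subtraction, and each is eigendecomposed separately.

    ```r
    mfpca_mom <- function(Y1, Y2) {
      mu <- colMeans(rbind(Y1, Y2))
      R1 <- sweep(Y1, 2, mu)
      R2 <- sweep(Y2, 2, mu)
      I  <- nrow(R1)
      KT <- (crossprod(R1) + crossprod(R2)) / (2 * I)         # total covariance
      KB <- (crossprod(R1, R2) + crossprod(R2, R1)) / (2 * I) # between-subject level:
                                                              # cross-visit products kill
                                                              # the visit-specific part
      KW <- KT - KB                                           # within-subject level
      list(between = eigen(KB, symmetric = TRUE),
           within  = eigen(KW, symmetric = TRUE))
    }
    ```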

    Nonlinear tube-fitting for the analysis of anatomical and functional structures

    We are concerned with the estimation of the exterior surface and interior summaries of tube-shaped anatomical structures. This interest is motivated by two distinct scientific goals, one dealing with the distribution of HIV microbicide in the colon and the other with measuring degradation in white-matter tracts in the brain. Our problem is posed as the estimation of the support of a distribution in three dimensions from a sample from that distribution, possibly measured with error. We propose a novel tube-fitting algorithm to construct such estimators. Further, we conduct a simulation study to aid in the choice of a key parameter of the algorithm, and we test our algorithm with a validation study tailored to the motivating data sets. Finally, we apply the tube-fitting algorithm to a colon image produced by single photon emission computed tomography (SPECT) and to a white-matter tract image produced using diffusion tensor imaging (DTI). Published in the Annals of Applied Statistics (http://dx.doi.org/10.1214/10-AOAS384) by the Institute of Mathematical Statistics (http://www.imstat.org).
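    The sketch below is not the authors' tube-fitting algorithm, just a simplified stand-in that produces the same two summaries, a centerline and a radius, using an off-the-shelf principal curve for the interior summary and a distance quantile for the exterior surface; the point cloud P and the 95% coverage level are hypothetical choices.

    ```r
    library(princurve)

    fit_tube <- function(P, coverage = 0.95) {   # P: n x 3 point cloud
      pc <- principal_curve(P)                   # smooth centerline through the cloud
      r  <- sqrt(rowSums((P - pc$s)^2))          # distance of each point to the curve
      list(centerline = pc$s[pc$ord, ],          # fitted points ordered along the curve
           radius = quantile(r, coverage))       # tube radius covering most points
    }
    ```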

    Soft Null Hypotheses: A Case Study of Image Enhancement Detection in Brain Lesions

    This work is motivated by a study of a population of multiple sclerosis (MS) patients using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) to identify active brain lesions. At each visit, a contrast agent is administered intravenously to a subject and a series of images is acquired to reveal the location and activity of MS lesions within the brain. Our goal is to identify and quantify lesion enhancement location at the subject level and lesion enhancement patterns at the population level. With this example, we aim to address the difficult problem of transforming a qualitative scientific null hypothesis, such as "this voxel does not enhance", into a well-defined and numerically testable null hypothesis based on existing data. We call the procedure "soft null hypothesis" testing, as opposed to the standard "hard null hypothesis" testing. This problem is fundamentally different from: (1) testing when a quantitative null hypothesis is given; (2) clustering using a mixture distribution; or (3) identifying a reasonable threshold with a parametric null assumption. We analyze a total of 20 subjects scanned at 63 visits (~30 GB), the largest population of such clinical brain images.
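    In its simplest generic form (a sketch of the idea, not the paper's estimator), soft-null testing amounts to calibrating the null distribution of a voxelwise enhancement statistic from data in which enhancement should be absent, then testing each voxel against that empirical null with false discovery rate control; stat_post and stat_null below are hypothetical inputs.

    ```r
    soft_null_test <- function(stat_post, stat_null, fdr = 0.05) {
      # empirical p-value of each voxel's statistic under the data-driven null
      p <- sapply(stat_post, function(s)
             (1 + sum(stat_null >= s)) / (1 + length(stat_null)))
      which(p.adjust(p, method = "BH") <= fdr)   # voxels declared "enhancing"
    }
    ```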