
    Functional data analysis in an operator-based mixed-model framework

    Functional data analysis in a mixed-effects model framework is done using operator calculus. In this approach the functional parameters are treated as serially correlated effects, giving an alternative to the penalized likelihood approach, where the functional parameters are treated as fixed effects. Operator approximations for the necessary matrix computations are proposed, and semi-explicit and numerically stable formulae of linear computational complexity are derived for likelihood analysis. The operator approach renders the usage of a functional basis unnecessary and clarifies the role of the boundary conditions.

    Comment: Published at http://dx.doi.org/10.3150/11-BEJ389 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
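    A rough illustration of the linear-complexity likelihood idea (not the paper's operator calculus): if the functional parameter is treated as a serially correlated effect, here simplified to a Gaussian random walk, the marginal likelihood can be accumulated in O(n) by a Kalman-type recursion. The model choice, variance parameters, and function names below are illustrative assumptions.

```python
# Minimal sketch, not the paper's method: a functional parameter modelled as a
# serially correlated effect (Gaussian random walk), with the marginal
# log-likelihood computed in linear time by a Kalman filter.
import numpy as np

def random_walk_loglik(y, sigma_obs, sigma_state):
    """O(n) marginal log-likelihood of y_t = f_t + eps_t, f_t = f_{t-1} + xi_t."""
    m, P = 0.0, 1e6                       # diffuse prior on f_0
    loglik = 0.0
    for t in range(len(y)):
        P = P + sigma_state**2            # predict
        S = P + sigma_obs**2              # innovation variance
        v = y[t] - m                      # innovation
        loglik += -0.5 * (np.log(2 * np.pi * S) + v**2 / S)
        K = P / S                         # Kalman gain
        m = m + K * v                     # update
        P = (1 - K) * P
    return loglik

# usage: noisy observations of a smooth curve on an even grid
t = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)
print(random_walk_loglik(y, sigma_obs=0.1, sigma_state=0.05))
```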

    Functional Linear Mixed Models for Irregularly or Sparsely Sampled Data

    We propose an estimation approach to analyse correlated functional data which are observed on unequal grids or even sparsely. The model we use is a functional linear mixed model, a functional analogue of the linear mixed model. Estimation is based on dimension reduction via functional principal component analysis and on mixed model methodology. Our procedure allows the decomposition of the variability in the data as well as the estimation of mean effects of interest, and borrows strength across curves. Confidence bands for mean effects can be constructed conditional on estimated principal components. We provide R code implementing our approach. The method is motivated by and applied to data from speech production research.
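    As a rough sketch of the dimension-reduction step, the following assumes curves already observed on a common dense grid and performs plain functional principal component analysis; the paper's approach additionally handles irregular or sparse grids and feeds the scores into a mixed model. Names and data are illustrative, not taken from the authors' R code.

```python
# Minimal FPCA sketch on a common grid (the sparse-data machinery of the
# paper is not reproduced here).
import numpy as np

def fpca(curves, n_components=3):
    """curves: (n_curves, n_grid) array. Returns mean, eigenfunctions, scores."""
    mean = curves.mean(axis=0)
    centered = curves - mean
    cov = centered.T @ centered / (curves.shape[0] - 1)   # covariance surface
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    phi = eigvecs[:, order]                               # principal components
    scores = centered @ phi                               # per-curve scores
    return mean, phi, scores

# usage: 50 noisy sine curves with random amplitudes
grid = np.linspace(0, 1, 100)
amps = 1 + 0.3 * np.random.randn(50)
data = amps[:, None] * np.sin(2 * np.pi * grid) + 0.05 * np.random.randn(50, 100)
mean, phi, scores = fpca(data)
print(scores.shape)   # (50, 3)
```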

    Graph-Based Decoding Model for Functional Alignment of Unaligned fMRI Data

    Aggregating multi-subject functional magnetic resonance imaging (fMRI) data is indispensable for generating valid and general inferences from patterns distributed across human brains. The disparities in anatomical structures and functional topographies of human brains warrant aligning fMRI data across subjects. However, existing functional alignment methods cannot handle many of today's fMRI datasets well, especially when they are not temporally aligned, i.e., some of the subjects may lack the responses to some stimuli, or different subjects might follow different sequences of stimuli. In this paper, a cross-subject graph that depicts the (dis)similarities between samples across subjects is used as prior information for developing a more flexible framework that suits an assortment of fMRI datasets. However, the high dimension of fMRI data and the use of multiple subjects make the crude framework time-consuming or impractical. To address this issue, we further regularize the framework so that a novel, feasible kernel-based optimization, which permits nonlinear feature extraction, can be developed theoretically. Specifically, a low-dimension assumption is imposed on each new feature space to avoid overfitting caused by the high-spatial, low-temporal resolution of fMRI data. Experimental results on five datasets suggest that the proposed method is not only superior to several state-of-the-art methods on temporally aligned fMRI data, but also suitable for dealing with temporally unaligned fMRI data.

    Comment: 17 pages, 10 figures, Proceedings of the Association for the Advancement of Artificial Intelligence (AAAI-20)
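    A minimal sketch of one ingredient, the cross-subject (dis)similarity graph used as prior information; the RBF kernel, its bandwidth, and the synthetic data are assumptions for illustration and do not reproduce the paper's construction or its kernel-based optimization.

```python
# Sketch: pool samples from all subjects and build a pairwise similarity graph.
import numpy as np

def cross_subject_graph(subjects, gamma=1e-3):
    """subjects: list of (n_samples_s, n_voxels) arrays, possibly unaligned in time.
    Returns a matrix of pairwise RBF similarities between all pooled samples."""
    X = np.vstack(subjects)                                     # pool samples
    sq_norms = (X ** 2).sum(axis=1)
    d2 = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T    # squared distances
    W = np.exp(-gamma * d2)                                     # similarity weights
    np.fill_diagonal(W, 0.0)                                    # no self-edges
    return W

# usage: three subjects with different numbers of samples (temporally unaligned)
subs = [np.random.randn(n, 500) for n in (40, 55, 30)]
W = cross_subject_graph(subs)
print(W.shape)   # (125, 125)
```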

    Aggregated functional data model for Near-Infrared Spectroscopy calibration and prediction

    Calibration and prediction for NIR spectroscopy data are performed based on a functional interpretation of the Beer-Lambert formula. Considering that, for each chemical sample, the resulting spectrum is a continuous curve obtained as the summation of overlapped absorption spectra from each analyte plus a Gaussian error, we assume that each individual spectrum can be expanded as a linear combination of a B-spline basis. Calibration is then performed using two procedures for estimating the individual analyte curves: basis smoothing and smoothing splines. Prediction is done by minimizing the squared error of prediction. To assess the variance of the predicted values, we use a leave-one-out jackknife technique. Departures from the standard error models are discussed through a simulation study, in particular how correlated errors affect the calibration step and, consequently, the prediction of the analytes' concentrations. Finally, the performance of our methodology is demonstrated through the analysis of two publicly available datasets.

    Comment: 27 pages, 7 figures, 7 tables
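    A small sketch of the B-spline expansion underlying the calibration step: an observed spectrum is regressed on a B-spline basis by least squares (the basis-smoothing route). The knot layout, basis size, and synthetic spectrum are assumptions; the per-analyte decomposition and concentration prediction are not reproduced.

```python
# Sketch: expand one spectrum in a clamped B-spline basis by least squares.
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(wavelengths, n_basis=20, degree=3):
    """Evaluate a clamped B-spline basis on the wavelength grid."""
    inner = np.linspace(wavelengths.min(), wavelengths.max(), n_basis - degree + 1)
    knots = np.concatenate(([inner[0]] * degree, inner, [inner[-1]] * degree))
    return BSpline(knots, np.eye(n_basis), degree)(wavelengths)   # (n_wl, n_basis)

# usage: fit one synthetic spectrum by least squares on the basis
wl = np.linspace(1100, 2500, 700)                          # wavelengths in nm
spectrum = np.exp(-((wl - 1700) / 120) ** 2) + 0.01 * np.random.randn(wl.size)
B = bspline_basis(wl)
coef, *_ = np.linalg.lstsq(B, spectrum, rcond=None)        # basis smoothing
fitted = B @ coef
print(np.round(np.mean((fitted - spectrum) ** 2), 5))      # residual mean square
```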

    Smoothing sparse and unevenly sampled curves using semiparametric mixed models: An application to online auctions

    Functional data analysis can be challenging when the functional objects are sampled only very sparsely and unevenly. Most approaches rely on smoothing to recover the underlying functional object from the data, which can be difficult if the data are irregularly distributed. In this paper we present a new approach that can overcome this challenge. The approach is based on the ideas of mixed models. Specifically, we propose a semiparametric mixed model with boosting to recover the functional object. While the model can handle sparse and unevenly distributed data, it also results in conceptually more meaningful functional objects. In particular, we motivate our method within the framework of eBay's online auctions. Online auctions produce monotonically increasing price curves that are often correlated across auctions. The semiparametric mixed model accounts for this correlation in a parsimonious way. It also estimates the underlying increasing trend from the data without imposing model constraints. Our application shows that the resulting functional objects are conceptually more appealing. Moreover, when used to forecast the outcome of an online auction, our approach also results in more accurate price predictions compared to standard approaches. We illustrate our model on a set of 183 closed auctions for Palm M515 personal digital assistants.
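    For illustration only, the sketch below recovers a monotone price trend from sparse, unevenly spaced bids using isotonic regression as a simple stand-in for the semiparametric mixed model with boosting described above; the bid times and amounts are invented.

```python
# Sketch: fit an increasing price trend to sparse, uneven bid data.
import numpy as np
from sklearn.isotonic import IsotonicRegression

# sparse, uneven bid times (days into a 7-day auction) and bid amounts (USD)
bid_times = np.array([0.1, 0.3, 2.4, 2.5, 5.9, 6.7, 6.9, 6.95])
bid_price = np.array([5.0, 12.0, 45.0, 47.5, 120.0, 180.0, 215.0, 230.0])

iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
iso.fit(bid_times, bid_price)

grid = np.linspace(0, 7, 50)                 # evaluate the fitted price curve
print(np.round(iso.predict(grid)[::10], 1))
```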

    The Ising Model for Neural Data: Model Quality and Approximate Methods for Extracting Functional Connectivity

    We study pairwise Ising models for describing the statistics of multi-neuron spike trains, using data from a simulated cortical network. We explore efficient ways of finding the optimal couplings in these models and examine their statistical properties. To do this, we extract the optimal couplings for subsets of up to 200 neurons, essentially exactly, using Boltzmann learning. We then study the quality of several approximate methods for finding the couplings by comparing their results with those found from Boltzmann learning. Two of these methods, inversion of the TAP equations and an approximation proposed by Sessak and Monasson, are remarkably accurate. Using these approximations for larger subsets of neurons, we find that extracting couplings using data from a subset smaller than the full network tends systematically to overestimate their magnitude. This effect is described qualitatively by infinite-range spin glass theory for the normal phase. We also show that a globally correlated input to the neurons in the network leads to a small increase in the average coupling. However, the pair-to-pair variation of the couplings is much larger than this and reflects intrinsic properties of the network. Finally, we study the quality of these models by comparing their entropies with that of the data. We find that they perform well for small subsets of the neurons in the network, but the fit quality starts to deteriorate as the subset size grows, signalling the need to include higher-order correlations to describe the statistics of large networks.

    Comment: 12 pages, 10 figures
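    A compact sketch of mean-field-style inverse Ising inference from binarized spike data: the naive mean-field couplings are read off the inverse correlation matrix, and the TAP correction solves the quadratic relation (C^{-1})_ij = -J_ij - 2 J_ij^2 m_i m_j for each pair. Boltzmann learning and the Sessak-Monasson expansion are not reproduced, and the data here are synthetic.

```python
# Sketch: naive mean-field and TAP estimates of Ising couplings from spin data.
import numpy as np

def infer_couplings(S):
    """S: (n_samples, n_neurons) array of spins in {-1, +1}."""
    m = S.mean(axis=0)                              # magnetizations
    C = np.cov(S, rowvar=False)                     # connected correlations
    Cinv = np.linalg.inv(C)
    J_nmf = -Cinv.copy()                            # naive mean-field estimate
    np.fill_diagonal(J_nmf, 0.0)
    # TAP correction: take the root of the quadratic that reduces to nMF as m -> 0
    mm = np.outer(m, m)
    disc = np.sqrt(np.maximum(1.0 - 8.0 * mm * Cinv, 0.0))
    with np.errstate(divide="ignore", invalid="ignore"):
        J_tap = (-1.0 + disc) / (4.0 * mm)
    J_tap = np.where(np.abs(mm) < 1e-8, J_nmf, J_tap)   # fall back where m_i m_j ~ 0
    np.fill_diagonal(J_tap, 0.0)
    return J_nmf, J_tap

# usage: fake spin data for 20 "neurons"
S = np.sign(np.random.randn(5000, 20))
J_nmf, J_tap = infer_couplings(S)
print(J_nmf.shape, J_tap.shape)
```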