
    Tensor Analysis and Fusion of Multimodal Brain Images

    Current high-throughput data acquisition technologies probe dynamical systems with different imaging modalities, generating massive data sets at different spatial and temporal resolutions and posing challenging problems in multimodal data fusion. A case in point is the attempt to parse out the brain structures and networks that underpin human cognitive processes by analyzing different neuroimaging modalities (functional MRI, EEG, NIRS, etc.). We emphasize that the multimodal, multi-scale nature of neuroimaging data is well reflected by a multi-way (tensor) structure in which the underlying processes can be summarized by a relatively small number of components or "atoms". We introduce Markov-Penrose diagrams, an integration of Bayesian DAG and tensor network notation, to analyze these models. These diagrams not only clarify matrix and tensor EEG and fMRI time/frequency analysis and inverse problems, but also help in understanding multimodal fusion via Multiway Partial Least Squares and Coupled Matrix-Tensor Factorization. We show here, for the first time, that Granger causal analysis of brain networks is a tensor regression problem, thus allowing the atomic decomposition of brain networks. Analysis of EEG and fMRI recordings shows the potential of the methods and suggests their use in other scientific domains.
    Comment: 23 pages, 15 figures, submitted to Proceedings of the IEEE
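    As an illustration of the coupled factorization idea mentioned above, the following is a minimal sketch of Coupled Matrix-Tensor Factorization fit by alternating least squares, assuming a third-order EEG-like tensor (channels x frequencies x time) and an fMRI-like matrix (voxels x time) coupled through a shared temporal factor. All array names, shapes, and the plain least-squares updates are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of Coupled Matrix-Tensor Factorization (CMTF) via
# alternating least squares: X ~ CP model [[A, B, C]], Y ~ D @ C.T, with the
# temporal factor C shared between the tensor and the matrix.
import numpy as np

def khatri_rao(P, Q):
    """Column-wise Kronecker product, shape (P.rows * Q.rows, R)."""
    return np.einsum('ir,jr->ijr', P, Q).reshape(-1, P.shape[1])

def unfold(T, mode):
    """Mode-n unfolding of a tensor, C ordering."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cmtf_als(X, Y, rank, n_iter=100, seed=0):
    """X: (I, J, K) tensor; Y: (M, K) matrix sharing the third (time) mode."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    M = Y.shape[0]
    A, B, C, D = (rng.standard_normal((n, rank)) for n in (I, J, K, M))
    X0, X1, X2 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    for _ in range(n_iter):
        # update the factors that appear only in the tensor
        A = np.linalg.lstsq(khatri_rao(B, C), X0.T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), X1.T, rcond=None)[0].T
        # coupled update: C must explain both the tensor and the matrix
        W = np.vstack([khatri_rao(A, B), D])
        T = np.hstack([X2, Y.T])
        C = np.linalg.lstsq(W, T.T, rcond=None)[0].T
        # update the matrix-specific factor
        D = np.linalg.lstsq(C, Y.T, rcond=None)[0].T
    return A, B, C, D
```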

    Preconditioning Kernel Matrices

    The computational and storage complexity of kernel machines presents the primary barrier to their scaling to large, modern datasets. A common way to tackle the scalability issue is to use the conjugate gradient algorithm, which relieves the constraints on both storage (the kernel matrix need not be stored) and computation (both stochastic gradients and parallelization can be used). Even so, conjugate gradients are not without their own issues: the conditioning of kernel matrices is often such that conjugate gradients will have poor convergence in practice. Preconditioning is a common approach to alleviating this issue. Here we propose preconditioned conjugate gradients for kernel machines, and develop a broad range of preconditioners particularly useful for kernel matrices. We describe a scalable approach to both solving kernel machines and learning their hyperparameters. We show this approach is exact in the limit of iterations and outperforms state-of-the-art approximations for a given computational budget.
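    The following is a minimal sketch of the kind of scheme described above: preconditioned conjugate gradients for the regularized kernel system (K + sigma^2 I) alpha = y, with a Nystrom-style preconditioner applied through the Woodbury identity. The RBF kernel, the random choice of inducing points, and all parameter names are assumptions made for illustration, not the paper's exact preconditioners.

```python
# Hypothetical sketch: PCG for a kernel ridge / GP-style linear system with a
# Nystrom-style preconditioner P = K_nm K_mm^{-1} K_mn + sigma^2 I.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def nystrom_preconditioner(X, sigma2, n_inducing=100, lengthscale=1.0, seed=0):
    """Return a function v -> P^{-1} v, using the Woodbury identity."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(n_inducing, len(X)), replace=False)
    K_nm = rbf_kernel(X, X[idx], lengthscale)
    K_mm = rbf_kernel(X[idx], X[idx], lengthscale)
    # Woodbury: P^{-1} v = (v - K_nm (sigma^2 K_mm + K_mn K_nm)^{-1} K_mn v) / sigma^2
    inner = sigma2 * K_mm + K_nm.T @ K_nm
    L = np.linalg.cholesky(inner + 1e-8 * np.eye(len(idx)))  # jitter for stability
    def apply(v):
        t = np.linalg.solve(L.T, np.linalg.solve(L, K_nm.T @ v))
        return (v - K_nm @ t) / sigma2
    return apply

def pcg(matvec, b, precond, tol=1e-6, max_iter=500):
    """Preconditioned conjugate gradients for a symmetric positive-definite system."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Usage sketch (X: training inputs, y: targets, sigma2: noise variance):
# K = rbf_kernel(X, X)
# alpha = pcg(lambda v: K @ v + sigma2 * v, y, nystrom_preconditioner(X, sigma2))
```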

    Multipolar Acoustic Source Reconstruction from Sparse Far-Field Data using ALOHA

    The reconstruction of multipolar acoustic or electromagnetic sources from their far-field signature plays a crucial role in numerous applications. Most of the existing techniques require dense multi-frequency data at the Nyquist sampling rate. Sub-sampling the measurement grid enlarges the null space of the source-to-data operator, which causes significant imaging artifacts; mitigating them requires additional knowledge about the source or regularization. In this letter, we propose a novel two-stage strategy for multipolar source reconstruction from sub-sampled sparse data that takes advantage of the sparsity of the sources in the physical domain. The data at the Nyquist sampling rate is first recovered from the sub-sampled data, and then a conventional inversion algorithm is used to reconstruct the sources. The data recovery problem is linked to a spectrum recovery problem for signals with a finite rate of innovation (FRI), which is solved using an annihilating filter-based structured Hankel matrix completion approach (ALOHA). A Fourier inversion algorithm is then used for an accurate reconstruction. The suitability of the approach is supported by experiments.
    Comment: 11 pages, 2 figures
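    To make the Hankel-completion step concrete, here is a simplified sketch in the spirit of ALOHA: the sub-sampled signal is embedded in a Hankel matrix, which is assumed to be low rank under the FRI model, and the missing samples are filled in by alternating a hard rank truncation with a data-consistency projection. The actual ALOHA algorithm uses a more elaborate ADMM / matrix-factorization scheme; this is only an illustrative variant, and all names and defaults are assumptions. The completed Nyquist-rate data would then feed a conventional Fourier inversion step as described above.

```python
# Hypothetical sketch of low-rank Hankel matrix completion for recovering a
# fully sampled signal/spectrum from sub-sampled measurements (ALOHA-style,
# simplified to rank truncation + data consistency).
import numpy as np

def hankel(x, p):
    """Build the (len(x)-p+1) x p Hankel matrix H[i, j] = x[i + j]."""
    n = len(x)
    return np.array([x[i:i + p] for i in range(n - p + 1)])

def dehankel(H):
    """Average anti-diagonals back into a length-(rows+cols-1) signal."""
    rows, cols = H.shape
    n = rows + cols - 1
    x = np.zeros(n, dtype=H.dtype)
    counts = np.zeros(n)
    for i in range(rows):
        for j in range(cols):
            x[i + j] += H[i, j]
            counts[i + j] += 1
    return x / counts

def hankel_completion(y, mask, rank, p=None, n_iter=200):
    """y: zero-filled measurements; mask: boolean array of observed samples."""
    n = len(y)
    p = p or n // 2
    x = y.copy()
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(hankel(x, p), full_matrices=False)
        s[rank:] = 0.0                 # hard rank truncation (FRI low-rank prior)
        x = dehankel((U * s) @ Vt)     # map the low-rank matrix back to a signal
        x[mask] = y[mask]              # keep the observed samples unchanged
    return x
```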