
Fast structure learning with modular regularization

By Greg Ver Steeg, Hrayr Harutyunyan, Daniel Moyer and Aram Galstyan

Abstract

Estimating graphical model structure from high-dimensional and undersampled data is a fundamental problem in many scientific fields. Existing approaches, such as GLASSO, latent variable GLASSO, and latent tree models, suffer from high computational complexity and may impose unrealistic sparsity priors in some cases. We introduce a novel method that leverages a newly discovered connection between information-theoretic measures and structured latent factor models to derive an optimization objective which encourages modular structures where each observed variable has a single latent parent. The proposed method has linear stepwise computational complexity w.r.t. the number of observed variables. Our experiments on synthetic data demonstrate that our approach is the only method that recovers modular structure better as the dimensionality increases. We also use our approach for estimating covariance structure for a number of real-world datasets and show that it consistently outperforms state-of-the-art estimators at a fraction of the computational cost. Finally, we apply the proposed method to high-resolution fMRI data (with more than 10^5 voxels) and show that it is capable of extracting meaningful patterns.

Comment: 22 pages, accepted to NeurIPS 2019
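To make the "modular structure" assumption concrete: the abstract posits latent factor models in which every observed variable has exactly one latent parent. The sketch below is not the paper's algorithm; it is a minimal NumPy illustration of that generative assumption, using a crude recovery step (PCA plus a hard argmax assignment of each variable to a single factor) in place of the authors' information-theoretic objective. All names (`parents`, `per_group`, etc.) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic modular data: m independent latent factors; each observed
# variable has exactly one latent parent (the modular assumption).
m, per_group, n = 3, 10, 2000
p = m * per_group
parents = np.repeat(np.arange(m), per_group)      # true parent of each variable
Z = rng.standard_normal((n, m))                   # latent factors
loadings = rng.uniform(0.7, 1.0, size=p)
X = Z[:, parents] * loadings + 0.5 * rng.standard_normal((n, p))

# Crude stand-in for structure recovery (NOT the paper's method):
# estimate m factors with PCA, then assign each observed variable to
# the single factor it is most correlated with, enforcing one parent.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:m].T                            # top-m factor estimates
corr = np.abs(np.corrcoef(X.T, scores.T)[:p, p:]) # |corr(variable, factor)|
assigned = corr.argmax(axis=1)                    # hard single-parent assignment

def partition(labels):
    """Group variable indices by label, ignoring label names."""
    return {frozenset(np.flatnonzero(labels == k)) for k in np.unique(labels)}

# With strong loadings and n >> p, the recovered grouping matches the
# true modular partition up to a relabeling of the factors.
recovered = partition(assigned) == partition(parents)
print(recovered)
```

Because the factors are independent, the population covariance of `X` is block diagonal, so the leading principal components concentrate on single blocks and the argmax assignment recovers the modular partition in this easy regime; the paper's contribution is an objective that encourages such structure directly, with linear stepwise cost in the number of variables.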

Topics: Statistics - Machine Learning, Computer Science - Information Theory
Year: 2019
OAI identifier: oai:arXiv.org:1706.03353
