
Differential Network Learning Beyond Data Samples

By Arshdeep Sekhon, Beilun Wang, Zhe Wang and Yanjun Qi

Abstract

Learning the change in statistical dependencies between random variables is an essential task for many real-life applications, mostly in the high-dimensional, low-sample regime. In this paper, we propose a novel differential parameter estimator that, in contrast to current methods, simultaneously offers (a) flexible integration of multiple sources of information (data samples, variable groupings, extra pairwise evidence, etc.), (b) scalability to a large number of variables, and (c) a sharp asymptotic convergence rate. Our experiments, on more than 100 simulated and two real-world datasets, validate the flexibility of our approach and highlight the benefits of integrating spatial and anatomical information for brain connectome change discovery and epigenetic network identification.

Comment: 9 pages of main draft; 25 pages of appendix; 5 tables; 14 figures; learning of the structure difference between two graphical models
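The task the abstract describes, recovering which conditional dependencies change between two conditions, can be illustrated with a simple baseline that is not the paper's estimator: fit a sparse precision matrix to each sample set with the graphical lasso and take the difference of the two estimates. A minimal sketch, assuming `scikit-learn` and `numpy` are available; the penalty `alpha` and the change threshold are arbitrary illustrative choices.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 10

# Two ground-truth precision matrices that differ in exactly one edge (0, 1).
prec1 = np.eye(p)
prec2 = np.eye(p)
prec2[0, 1] = prec2[1, 0] = 0.4

# Draw Gaussian samples from each condition (covariance = inverse precision).
X1 = rng.multivariate_normal(np.zeros(p), np.linalg.inv(prec1), size=500)
X2 = rng.multivariate_normal(np.zeros(p), np.linalg.inv(prec2), size=500)

# Naive two-step baseline: estimate each sparse precision matrix separately.
est1 = GraphicalLasso(alpha=0.05).fit(X1)
est2 = GraphicalLasso(alpha=0.05).fit(X2)

# The differential network is the (thresholded) difference of the estimates.
delta = est2.precision_ - est1.precision_
changed = np.abs(delta) > 0.1
np.fill_diagonal(changed, False)

changed_edges = sorted({tuple(sorted(e)) for e in zip(*np.where(changed))})
print(changed_edges)
```

Direct estimators of the difference (like the one proposed here) avoid estimating each network separately, which matters when the individual networks are dense but their difference is sparse.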

Topics: Computer Science - Machine Learning, Statistics - Machine Learning
Year: 2020
OAI identifier: oai:arXiv.org:2004.11494
