From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models
Submodular functions have been studied extensively in machine learning and
data mining. In particular, the optimization of submodular functions over the
integer lattice (integer submodular functions) has recently attracted much
interest, because this domain relates naturally to many practical problem
settings, such as multilabel graph cut, budget allocation and revenue
maximization with discrete assignments. In contrast, the use of these functions
for probabilistic modeling has received surprisingly little attention so far.
In this work, we first propose the Generalized Multilinear Extension, a
continuous DR-submodular extension for integer submodular functions. We study
central properties of this extension and formulate a new probabilistic model
which is defined through integer submodular functions. Then, we introduce a
block-coordinate ascent algorithm to perform approximate inference for this
class of models. Finally, we demonstrate its effectiveness and viability on
several real-world social connection graph datasets with integer submodular
objectives.
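The abstract does not spell out the inference procedure, but the coordinate-wise flavor of lattice optimization it builds on can be illustrated on a toy example. Below is a minimal, hypothetical sketch (not the paper's Generalized Multilinear Extension or its block-coordinate inference algorithm): a budget-allocation-style integer DR-submodular objective, maximized by naive coordinate ascent over the integer lattice. All function names and parameters are illustrative.

```python
def budget_allocation_value(x, probs, weights):
    """Toy integer DR-submodular objective (budget-allocation style):
    weighted expected coverage when channel i sends x[i] messages and
    each message reaches target j independently with probability probs[i][j]."""
    total = 0.0
    for j, w in enumerate(weights):
        miss = 1.0  # probability that target j is never reached
        for i, xi in enumerate(x):
            miss *= (1.0 - probs[i][j]) ** xi
        total += w * (1.0 - miss)
    return total

def coordinate_ascent(f, n, budget, sweeps=10):
    """Naive coordinate ascent over the integer lattice {0, ..., budget}^n:
    re-optimize one coordinate at a time with all others held fixed."""
    x = [0] * n
    for _ in range(sweeps):
        improved = False
        for i in range(n):
            orig = x[i]
            best_v, best_val = orig, None
            for v in range(budget + 1):
                x[i] = v
                val = f(x)
                if best_val is None or val > best_val:
                    best_v, best_val = v, val
            x[i] = best_v
            improved = improved or best_v != orig
        if not improved:
            break
    return x
```

Because this toy objective is monotone, coordinate ascent simply drives every coordinate to its box constraint; with a shared budget constraint or a non-monotone objective, the coordinate-wise structure is the same but each inner step would search a restricted range.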
The Lazy Flipper: MAP Inference in Higher-Order Graphical Models by Depth-limited Exhaustive Search
This article presents a new search algorithm for the NP-hard problem of
optimizing functions of binary variables that decompose according to a
graphical model. It can be applied to models of any order and structure. The
main novelty is a technique to constrain the search space based on the topology
of the model. When pursued to the full search depth, the algorithm is
guaranteed to converge to a global optimum, passing through a series of
monotonically improving local optima that are guaranteed to be optimal within a
given and increasing Hamming distance. For a search depth of 1, it specializes
to Iterated Conditional Modes. Between these extremes, a useful tradeoff
between approximation quality and runtime is established. Experiments on models
derived from both illustrative and real problems show that approximations found
with limited search depth match or improve those obtained by state-of-the-art
methods based on message passing and linear programming.

Comment: C++ source code available from
http://hci.iwr.uni-heidelberg.de/software.ph
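The abstract notes that at search depth 1 the algorithm specializes to Iterated Conditional Modes. For reference, here is a minimal, self-contained sketch of that classical depth-1 baseline on a toy pairwise binary model; this is not the Lazy Flipper itself, and the energy function and weights are purely illustrative.

```python
def energy(x, unary, pairs):
    """Total energy of a binary labeling x: per-variable costs plus a
    penalty w for each edge (i, j, w) whose endpoints disagree."""
    e = sum(u[xi] for u, xi in zip(unary, x))
    for i, j, w in pairs:
        if x[i] != x[j]:
            e += w
    return e

def icm(x, unary, pairs):
    """Iterated Conditional Modes (depth-1 local search): flip one binary
    variable at a time, keeping a flip only if it strictly lowers the energy.
    Converges to a local optimum within Hamming distance 1."""
    current = energy(x, unary, pairs)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            x[i] ^= 1  # tentative flip
            e = energy(x, unary, pairs)
            if e < current:
                current = e
                improved = True
            else:
                x[i] ^= 1  # revert
    return x, current
```

A depth-d variant in the spirit of the abstract would instead consider flips of up to d variables at once, trading runtime for stronger local-optimality guarantees.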
New perspectives and applications for greedy algorithms in machine learning
Approximating probability densities is a core problem in Bayesian statistics, where inference involves the computation of a posterior distribution. Variational Inference (VI) is a technique for approximating posterior distributions through optimization: it involves specifying a set of tractable densities, out of which the final approximation is to be chosen. While VI is traditionally motivated by the goal of tractability, the focus of this dissertation is to use Bayesian approximation to obtain parsimonious distributions. With this goal in mind, we develop greedy algorithm variants and study their theoretical properties by establishing novel connections between the resulting optimization problems in parsimonious VI and traditional studies in the discrete optimization literature. Specific realizations lead to efficient solutions for many sparse probabilistic models, such as sparse regression, sparse PCA, and sparse Collective Matrix Factorization (CMF). For cases where existing results are insufficient to provide acceptable approximation guarantees, we extend the optimization results for some large-scale algorithms to a much larger class of functions. The developed methods are applied to both simulated and real-world datasets, including high-dimensional functional Magnetic Resonance Imaging (fMRI) datasets, and to the real-world tasks of interpreting data exploration and model predictions.
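The dissertation's greedy variants are specific to parsimonious VI, but the forward-selection pattern they build on can be sketched. Below is a generic matching-pursuit-style greedy for sparse regression, given here only as an illustration of the pattern; it is not one of the dissertation's algorithms, and all names are made up. It assumes nonzero feature columns.

```python
def matching_pursuit(columns, y, k):
    """Greedy forward selection for sparse regression: at each of k steps,
    pick the feature column whose (normalized) correlation with the current
    residual is largest, then subtract its contribution from the residual."""
    residual = list(y)
    selected, coefs = [], []
    for _ in range(k):
        best_j, best_score = None, 0.0
        for j, col in enumerate(columns):
            dot = sum(c * r for c, r in zip(col, residual))
            norm2 = sum(c * c for c in col)  # assumed nonzero
            score = dot * dot / norm2
            if score > best_score:
                best_j, best_score = j, score
        if best_j is None:
            break  # residual is orthogonal to every column
        col = columns[best_j]
        coef = sum(c * r for c, r in zip(col, residual)) / sum(c * c for c in col)
        residual = [r - coef * c for r, c in zip(residual, col)]
        selected.append(best_j)
        coefs.append(coef)
    return selected, coefs
```

Greedy schemes of this shape are exactly where submodularity-style arguments from the discrete optimization literature yield approximation guarantees, which is the connection the abstract alludes to.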