Minimizing a sum of submodular functions
We consider the problem of minimizing a function represented as a sum of
submodular terms. We assume each term allows an efficient computation of {\em
exchange capacities}. This holds, for example, for terms depending on a small
number of variables, or for certain cardinality-dependent terms.
A naive application of submodular minimization algorithms would not exploit
the existence of specialized exchange capacity subroutines for individual
terms. To overcome this, we cast the problem as a {\em submodular flow} (SF)
problem in an auxiliary graph, and show that applying most existing SF
algorithms would rely only on these subroutines.
We then explore in more detail Iwata's capacity scaling approach for
submodular flows (Math. Programming, 76(2):299--308, 1997). In particular, we
show how to improve its complexity in the case when the function contains
cardinality-dependent terms.
Comment: accepted to "Discrete Applied Mathematics"
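To make the setting concrete, the following toy sketch minimizes a sum of two submodular terms (a pairwise cut term and a cardinality-dependent term) by brute-force enumeration. The paper's contribution is precisely to avoid this exponential enumeration via a submodular flow reduction; the specific terms and ground set here are illustrative choices, not taken from the paper.

```python
import math
from itertools import combinations

# Ground set V; f(S) = cut_term(S) + card_term(S), each term submodular.
V = [0, 1, 2, 3]

def cut_term(S):
    # graph-cut style term on variables {0, 1}: pay 1 if they are separated
    return 1 if (0 in S) != (1 in S) else 0

def card_term(S):
    # concave function of |S ∩ {1, 2, 3}|, hence submodular
    return math.sqrt(len(S & {1, 2, 3}))

def f(S):
    return cut_term(S) + card_term(S)

# Exhaustive minimization over all 2^|V| subsets (feasible only for tiny V).
subsets = (frozenset(c) for r in range(len(V) + 1)
           for c in combinations(V, r))
best = min(subsets, key=f)
print(best, f(best))  # the empty set, with value 0.0
```

Both terms are nonnegative and vanish on the empty set, so the minimum is 0; for sums of many such terms the enumeration above is hopeless, which is what motivates exploiting per-term exchange capacity subroutines.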
Discrete Convex Functions on Graphs and Their Algorithmic Applications
The present article is an exposition of a theory of discrete convex functions
on certain graph structures, developed by the author in recent years. This
theory is a spin-off of discrete convex analysis by Murota, and is motivated by
combinatorial dualities in multiflow problems and the complexity classification
of facility location problems on graphs. We outline the theory and algorithmic
applications in combinatorial optimization problems.
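As a concrete entry point to the discrete convex analysis this theory builds on, one defining property of L♮-convex functions on Z^n is discrete midpoint convexity: f(p) + f(q) ≥ f(⌈(p+q)/2⌉) + f(⌊(p+q)/2⌋) for all p, q. The sketch below checks this inequality for a small illustrative function (not one from the article) on an integer grid.

```python
import itertools
import math

def f(z):
    # x^2 + y^2 + |x - y|: separable convex plus a convex function of the
    # difference, a standard example of an L#-convex function on Z^2
    x, y = z
    return x * x + y * y + abs(x - y)

def midpoint_convex(f, points):
    # verify f(p) + f(q) >= f(ceil((p+q)/2)) + f(floor((p+q)/2)) on all pairs
    for p, q in itertools.product(points, repeat=2):
        up = tuple(math.ceil((a + b) / 2) for a, b in zip(p, q))
        dn = tuple(math.floor((a + b) / 2) for a, b in zip(p, q))
        if f(p) + f(q) < f(up) + f(dn):
            return False
    return True

grid = list(itertools.product(range(-2, 3), repeat=2))
print(midpoint_convex(f, grid))  # True
```

For an ordinary convex function on R^n the analogous midpoint inequality is immediate; on Z^n the rounding to ⌈·⌉ and ⌊·⌋ is what makes the discrete theory nontrivial.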
An Algorithmic Theory of Dependent Regularizers, Part 1: Submodular Structure
We present an exploration of the rich theoretical connections between several
classes of regularized models, network flows, and recent results in submodular
function theory. This work unifies key aspects of these problems under a common
theory, leading to novel methods for working with several important models of
interest in statistics, machine learning and computer vision.
In Part 1, we review the concepts of network flows and submodular function
optimization theory foundational to our results. We then examine the
connections between network flows and the minimum-norm algorithm from
submodular optimization, extending and improving several current results. This
leads to a concise representation of the structure of a large class of pairwise
regularized models important in machine learning, statistics and computer
vision.
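The flow/cut connection behind such pairwise regularized models can be illustrated in miniature: a submodular binary pairwise energy E(x) = Σ_i u_i(x_i) + Σ_(i,j) w_ij·[x_i ≠ x_j] is minimized exactly by an s-t minimum cut. The graph construction below is the standard one; the toy Edmonds-Karp max-flow and the three-variable instance are illustrative, not code or data from the manuscript.

```python
from collections import deque

def max_flow(cap, s, t):
    # Edmonds-Karp: repeatedly augment along a shortest residual path.
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        parent = [-1] * n          # BFS for an augmenting path
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total, flow
        path, v = [], t            # recover the path, then augment
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[u][v] - flow[u][v] for u, v in path)
        for u, v in path:
            flow[u][v] += aug
            flow[v][u] -= aug
        total += aug

# A chain 0 - 1 - 2 of binary variables; x_i = 1 iff node i ends on the
# source side of the cut.
unary = [(0, 5), (2, 2), (4, 0)]   # (cost of x_i = 0, cost of x_i = 1)
pairs = [(0, 1, 3), (1, 2, 3)]     # (i, j, w_ij)

n = 3
s, t = n, n + 1
cap = [[0] * (n + 2) for _ in range(n + 2)]
for i, (c0, c1) in enumerate(unary):
    cap[s][i] = c0                 # edge s->i is cut exactly when x_i = 0
    cap[i][t] = c1                 # edge i->t is cut exactly when x_i = 1
for i, j, w in pairs:
    cap[i][j] = cap[j][i] = w      # cut when x_i != x_j

min_energy, _ = max_flow(cap, s, t)
print(min_energy)  # 5, matching exhaustive enumeration of the 8 labelings
```

Every cut value equals the energy of the corresponding labeling, so the max-flow value is the exact minimum energy; this is the discrete counterpart of the continuous min-norm-point connection discussed above.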
In Part 2, we describe the full regularization path of a class of penalized
regression problems with dependent variables, a class that includes the
graph-guided LASSO and total variation constrained models. This description
also motivates a practical algorithm, which allows us to efficiently find the
regularization path of discretized TV penalized models. Ultimately, our new
algorithms scale to high-dimensional problems with millions of variables.
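The piecewise-linear structure of such regularization paths is visible already in two variables. For min_x 0.5(x1−y1)² + 0.5(x2−y2)² + λ|x1−x2|, the two coordinates move toward each other at unit speed in λ until they fuse at λ = |y1−y2|/2, then sit at the mean. The closed form below is a hedged illustration of this path behavior, not the paper's path algorithm.

```python
def tv_pair_path(y1, y2, lam):
    # Closed-form solution of the two-variable fused/TV problem
    #   min 0.5*(x1-y1)^2 + 0.5*(x2-y2)^2 + lam*|x1 - x2|
    gap = abs(y1 - y2)
    if lam >= gap / 2:           # fused regime: both equal the mean
        m = (y1 + y2) / 2
        return m, m
    s = 1 if y1 > y2 else -1     # unfused: the gap shrinks by 2*lam
    return y1 - s * lam, y2 + s * lam

# Fusion event: the gap |4 - 1| = 3 closes at lam = 1.5.
print(tv_pair_path(4.0, 1.0, 0.5))   # (3.5, 1.5)
print(tv_pair_path(4.0, 1.0, 1.5))   # (2.5, 2.5)
print(tv_pair_path(4.0, 1.0, 9.0))   # (2.5, 2.5)
```

On larger graphs the same phenomenon recurs: as λ grows, groups of variables fuse at discrete events and the solution is linear in λ between events, which is what makes tracking the entire path tractable.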