Many combinatorial problems arising in machine learning can be reduced to the
problem of minimizing a submodular function. Submodular functions are a natural
discrete analog of convex functions, and can be minimized in strongly
polynomial time. Unfortunately, state-of-the-art algorithms for general
submodular minimization are intractable for large problems. In this paper, we
introduce a novel subclass of submodular minimization problems that we call
decomposable. Decomposable submodular functions are those that can be
represented as sums of concave functions applied to modular functions.
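Concretely, a decomposable function admits a representation of the form

    f(S) = \sum_{j=1}^{m} \phi_j( w_j(S) ),  where  w_j(S) = \sum_{e \in S} w_j(e),

in which each w_j is a modular (additive) set function and each \phi_j is
concave. (The symbols \phi_j and w_j are illustrative labels; the standard
assumption of nonnegative weights w_j(e) is what makes each summand, and hence
the sum, submodular, since increments of a concave function are nonincreasing.)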
We develop an algorithm, SLG, that can efficiently minimize decomposable
submodular functions with tens of thousands of variables. Our algorithm
exploits recent results in smoothed convex minimization. We apply SLG to
synthetic benchmarks and a joint classification-and-segmentation task, and show
that it outperforms the state-of-the-art general-purpose submodular
minimization algorithms by several orders of magnitude.