An Inexact Successive Quadratic Approximation Method for Convex L-1 Regularized Optimization
We study a Newton-like method for the minimization of an objective function
that is the sum of a smooth convex function and an l-1 regularization term.
This method, which is sometimes referred to in the literature as a proximal
Newton method, computes a step by minimizing a piecewise quadratic model of the
objective function. In order to make this approach efficient in practice, it is
imperative to perform this inner minimization inexactly. In this paper, we give
inexactness conditions that guarantee global convergence and that can be used
to control the local rate of convergence of the iteration. Our inexactness
conditions are based on a semi-smooth function that represents a (continuous)
measure of the optimality conditions of the problem, and that embodies the
soft-thresholding iteration. We give careful consideration to the algorithm
employed for the inner minimization, and report numerical results on two test
sets originating in machine learning.
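To make the setup concrete, here is a minimal Python sketch, not the paper's code, of one inexact proximal Newton step for F(x) = f(x) + lam * ||x||_1: the piecewise quadratic model is minimized approximately by a small, fixed number of ISTA (soft-thresholding) iterations, which stands in for the paper's adaptive inexactness conditions, and the optimality residual assumes a unit steplength.

    import numpy as np

    def soft_threshold(z, t):
        # proximal operator of t * ||.||_1, applied componentwise
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def inexact_prox_newton_step(x, grad, H, lam, inner_iters=10):
        # Approximately minimize the piecewise quadratic model
        #   q(d) = grad.T d + 0.5 d.T H d + lam * ||x + d||_1
        # with a few ISTA iterations; the fixed inner_iters budget is a
        # stand-in for the paper's adaptive inexactness test.
        d = np.zeros_like(x)
        L = np.linalg.norm(H, 2)          # Lipschitz constant of the model gradient
        for _ in range(inner_iters):
            g = grad + H @ d              # gradient of the smooth part of q at d
            d = soft_threshold(x + d - g / L, lam / L) - x
        return d

    def optimality_residual(x, grad, lam):
        # Continuous optimality measure built on soft-thresholding
        # (unit steplength assumed); it is zero exactly at a solution.
        return np.linalg.norm(x - soft_threshold(x - grad, lam))

A full solver would wrap this step in a line search and tighten the inner budget as the iterates approach a solution.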
Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions
We investigate three related and important problems connected to machine
learning: approximating a submodular function everywhere, learning a submodular
function (in a PAC-like setting [53]), and constrained minimization of
submodular functions. We show that the complexity of all three problems depends
on the 'curvature' of the submodular function, and provide lower and upper
bounds that refine and improve previous results [3, 16, 18, 52]. Our proof
techniques are fairly generic. We either use a black-box transformation of the
function (for approximation and learning), or a transformation of algorithms to
use an appropriate surrogate function (for minimization). Curiously, curvature
has been known to influence approximations for submodular maximization [7, 55],
but its effect on minimization, approximation and learning has hitherto been
open. We complete this picture, and also support our theoretical claims by
empirical results.
Comment: 21 pages. A shorter version appeared in Advances of NIPS-201
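For a monotone submodular function f with f(empty) = 0, the curvature referred to above is commonly defined as kappa_f = 1 - min_j [f(V) - f(V \ {j})] / f({j}): a modular function has kappa_f = 0, and kappa_f grows toward 1 as marginal gains shrink. The short Python sketch below, with our own toy functions rather than anything from the paper, simply evaluates this definition.

    def total_curvature(f, V):
        # kappa_f = 1 - min_j [f(V) - f(V \ {j})] / f({j})
        # for a monotone submodular f with f([]) = 0
        V = list(V)
        f_V = f(V)
        ratios = [(f_V - f([x for x in V if x != j])) / f([j])
                  for j in V if f([j]) > 0]
        return 1.0 - min(ratios)

    # A modular (additive) function has zero curvature ...
    modular = lambda S: 2.0 * len(S)

    # ... while a coverage function is curved: the covered blocks overlap,
    # so marginal gains shrink as the set grows.
    blocks = {0: {"a", "x"}, 1: {"b", "x"}, 2: {"c", "x"}}
    coverage = lambda S: float(len(set().union(*(blocks[i] for i in S)))) if S else 0.0

    ground = [0, 1, 2]
    print(total_curvature(modular, ground))    # 0.0
    print(total_curvature(coverage, ground))   # 0.5 for this toy example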
Symmetric Submodular Function Minimization Under Hereditary Family Constraints
We present an efficient algorithm to find non-empty minimizers of a symmetric
submodular function over any family of sets closed under inclusion. This
includes, for example, families defined by a cardinality constraint, a knapsack
constraint, a matroid independence constraint, or any combination of such
constraints. Our algorithm makes O(n^3) oracle calls to the submodular
function, where n is the cardinality of the ground set. In contrast, the
problem of minimizing a general submodular function under a cardinality
constraint is known to be inapproximable within o(sqrt(n / ln n)) (Svitkina
and Fleischer [2008]).
The algorithm is similar to an algorithm of Nagamochi and Ibaraki [1998] to
find all nontrivial inclusionwise minimal minimizers of a symmetric submodular
function over a set of cardinality n using O(n^3) oracle calls. Their
procedure in turn is based on Queyranne's algorithm [1998] to minimize a
symmetric submodular function.
Comment: 13 pages, Submitted to SODA 201
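As a rough illustration of the unconstrained building block cited above, here is a minimal Python sketch of Queyranne's pendant-pair scheme for symmetric submodular minimization; the cut-function example and all helper names are ours, and the paper's hereditary-constraint machinery is not reproduced.

    def pendant_pair(f, groups):
        # Order the current groups by the Queyranne / Nagamochi-Ibaraki rule
        # and return the last two groups in the ordering (a "pendant pair").
        order = [groups[0]]
        remaining = list(groups[1:])
        while remaining:
            W = [x for g in order for x in g]
            # the next group u minimizes f(W + u) - f(u)
            u = min(remaining, key=lambda g: f(W + list(g)) - f(list(g)))
            order.append(u)
            remaining.remove(u)
        return order[-2], order[-1]

    def queyranne(f, V):
        # Return a nontrivial set S (neither empty nor all of V) minimizing
        # the symmetric submodular function f.
        groups = [(v,) for v in V]
        best_val, best_set = float("inf"), None
        while len(groups) > 1:
            t, u = pendant_pair(f, groups)
            if f(list(u)) < best_val:
                best_val, best_set = f(list(u)), set(u)
            groups.remove(t)
            groups.remove(u)
            groups.append(t + u)          # contract the pendant pair
        return best_set, best_val

    # Example: the cut function of a weighted triangle is symmetric submodular.
    edges = {(0, 1): 1.0, (1, 2): 3.0, (0, 2): 1.0}
    cut = lambda S: sum(w for (a, b), w in edges.items()
                        if (a in set(S)) != (b in set(S)))
    print(queyranne(cut, [0, 1, 2]))       # ({1, 2}, 2.0)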
A transformation method for constrained-function minimization
A direct method for constrained-function minimization is discussed. The method involves the construction of an appropriate function mapping all of one finite-dimensional space onto the region defined by the constraints. Functions which produce such a transformation are constructed for a variety of constraint regions including, for example, those arising from linear and quadratic inequalities and equalities. In addition, the computational performance of this method is studied in the situation where the Davidon-Fletcher-Powell algorithm is used to solve the resulting unconstrained problem. Good performance is demonstrated for 19 test problems by achieving rapid convergence to a solution from several widely separated starting points.
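A minimal sketch of the transformation idea, assuming a simple box constraint and a sin^2 map onto the feasible region; the toy objective is ours, and SciPy's BFGS stands in for the Davidon-Fletcher-Powell update used in the paper's experiments.

    import numpy as np
    from scipy.optimize import minimize

    # Toy problem: minimize (x0 - 2)^2 + (x1 - 0.5)^2 subject to 0 <= x <= 1.
    lo, hi = np.zeros(2), np.ones(2)

    def transform(y):
        # Maps all of R^2 onto the box [lo, hi], so the transformed
        # problem is unconstrained in y.
        return lo + (hi - lo) * np.sin(y) ** 2

    def objective(y):
        x = transform(y)
        return (x[0] - 2.0) ** 2 + (x[1] - 0.5) ** 2

    # Solve the unconstrained problem in y with a quasi-Newton method
    # (BFGS here, standing in for Davidon-Fletcher-Powell).
    res = minimize(objective, x0=np.array([0.3, 0.3]), method="BFGS")
    print(transform(res.x))                # close to [1.0, 0.5]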
New Query Lower Bounds for Submodular Function Minimization
We consider submodular function minimization in the oracle model: given
black-box access to a submodular set function f on a ground set of n elements,
find an element of arg min_S f(S) using as few queries to f as possible.
State-of-the-art algorithms succeed with
about n^2 queries (ignoring logarithmic factors) [LeeSW15], yet the best-known
lower bound has never been improved beyond n [Harvey08].
We provide a query lower bound of 2n for submodular function minimization, a
3n/2 - 2 query lower bound for the non-trivial minimizer of a symmetric
submodular function, and a binom(n, 2) query lower bound for the non-trivial
minimizer of an asymmetric submodular function.
Our 3n/2 - 2 lower bound results from a connection between SFM lower bounds
and a novel concept we term the cut dimension of a graph. Interestingly, this
yields a 3n/2 - 2 cut-query lower bound for finding the global mincut in an
undirected, weighted graph, but we also prove it cannot yield a lower bound
better than n + 1 for s-t mincut, even in a directed, weighted graph.
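As we understand it, the cut dimension of a weighted graph is the dimension of the span of the edge-indicator vectors of its minimum cuts; the brute-force Python sketch below (our own code, feasible only for tiny graphs) simply evaluates that definition.

    import itertools
    import numpy as np

    def cut_dimension(n, weights):
        # Dimension of the span of the edge-indicator vectors of all minimum
        # cuts of a weighted graph on vertices 0..n-1 (brute force, tiny n only).
        edges = sorted(weights)
        def crossing(S):
            return np.array([1.0 if (a in S) != (b in S) else 0.0 for a, b in edges])
        cuts = []
        for r in range(1, n):
            for T in itertools.combinations(range(n), r):
                S = set(T)
                if 0 not in S:            # count each cut {S, V \ S} only once
                    continue
                value = sum(w for (a, b), w in weights.items() if (a in S) != (b in S))
                cuts.append((value, S))
        min_val = min(v for v, _ in cuts)
        vectors = [crossing(S) for v, S in cuts if abs(v - min_val) < 1e-9]
        return int(np.linalg.matrix_rank(np.array(vectors)))

    # Unit-weight 4-cycle: six minimum cuts of value 2, whose indicator
    # vectors span all of R^4, so the cut dimension is 4.
    w = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (0, 3): 1.0}
    print(cut_dimension(4, w))             # 4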
