Solving Multiple-Block Separable Convex Minimization Problems Using Two-Block Alternating Direction Method of Multipliers
In this paper, we consider solving multiple-block separable convex
minimization problems using the alternating direction method of multipliers (ADMM).
Motivated by the fact that the existing convergence theory for ADMM is mostly
limited to the two-block case, we analyze in this paper, both theoretically and
numerically, a new strategy that first transforms a multi-block problem into an
equivalent two-block problem (either in the primal domain or in the dual
domain) and then solves it using the standard two-block ADMM. In particular, we
derive convergence results for this two-block ADMM approach to solve
multi-block separable convex minimization problems, including an improved
O(1/\epsilon) iteration complexity result. Moreover, we compare the numerical
efficiency of this approach with the standard multi-block ADMM on several
separable convex minimization problems which include basis pursuit, robust
principal component analysis and latent variable Gaussian graphical model
selection. The numerical results show that the multi-block ADMM, although it
lacks theoretical convergence guarantees, typically outperforms the two-block
ADMM variants.
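To make the two-block setting concrete, here is a minimal sketch of standard two-block ADMM applied to basis pursuit (one of the test problems above), splitting min ||z||_1 s.t. Ax = b with the consensus constraint x = z. This is an illustration of plain two-block ADMM, not the paper's multi-block-to-two-block transformation; the penalty parameter rho and iteration count are illustrative choices.

```python
import numpy as np

def basis_pursuit_admm(A, b, rho=1.0, n_iter=1000):
    """Two-block ADMM for basis pursuit: min ||z||_1  s.t.  Ax = b, x = z."""
    m, n = A.shape
    AAt_inv = np.linalg.inv(A @ A.T)          # cached for the affine projection
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    for _ in range(n_iter):
        # x-update: project (z - u) onto the affine set {x : Ax = b}
        v = z - u
        x = v - A.T @ (AAt_inv @ (A @ v - b))
        # z-update: soft-thresholding, the prox of the l1 norm
        w = x + u
        z = np.sign(w) * np.maximum(np.abs(w) - 1.0 / rho, 0.0)
        # dual (scaled multiplier) update
        u = u + x - z
    return x, z
```

Since each x-update is an exact affine projection, every iterate satisfies Ax = b; the two-block convergence theory then guarantees x and z approach a common sparse solution.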
Private Multiplicative Weights Beyond Linear Queries
A wide variety of fundamental data analyses in machine learning, such as
linear and logistic regression, require minimizing a convex function defined by
the data. Since the data may contain sensitive information about individuals,
and these analyses can leak that sensitive information, it is important to be
able to solve convex minimization in a privacy-preserving way.
A series of recent results show how to accurately solve a single convex
minimization problem in a differentially private manner. However, the same data
is often analyzed repeatedly, and little is known about solving multiple convex
minimization problems with differential privacy. For simpler data analyses,
such as linear queries, there are remarkable differentially private algorithms
such as the private multiplicative weights mechanism (Hardt and Rothblum, FOCS
2010) that accurately answer exponentially many distinct queries. In this work,
we extend these results to the case of convex minimization and show how to give
accurate and differentially private solutions to *exponentially many* convex
minimization problems on a sensitive dataset.
The Asymptotic Behavior of the Composition of Firmly Nonexpansive Mappings
In this paper we provide a unified treatment of some convex minimization
problems, which allows for a better understanding and, in some cases,
improvement of results in this direction proved recently in spaces of curvature
bounded above. For this purpose, we analyze the asymptotic behavior of
compositions of finitely many firmly nonexpansive mappings in the setting of
p-uniformly convex geodesic spaces, focusing on asymptotic regularity and
convergence results.
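In the familiar Euclidean special case, metric projections onto closed convex sets are firmly nonexpansive, and iterating their composition exhibits exactly the asymptotic regularity discussed above: the distance between successive iterates tends to zero. A minimal numerical sketch, with the ball and half-space chosen purely for illustration:

```python
import numpy as np

def proj_ball(x, radius=1.0):
    """Metric projection onto a Euclidean ball (firmly nonexpansive)."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def proj_halfspace(x, a, beta):
    """Metric projection onto {y : <a, y> <= beta} (firmly nonexpansive)."""
    excess = a @ x - beta
    return x if excess <= 0 else x - (excess / (a @ a)) * a

# Iterate the composition T = P_halfspace o P_ball from an arbitrary start.
a, beta = np.array([1.0, 1.0]), 0.5
x = np.array([3.0, 3.0])
steps = []
for _ in range(200):
    x_new = proj_halfspace(proj_ball(x), a, beta)
    steps.append(np.linalg.norm(x_new - x))   # asymptotic regularity: -> 0
    x = x_new
```

Here the two sets intersect, so the composition's iterates settle on a point in the intersection (the alternating projections picture); the geodesic-space results generalize this behavior beyond Hilbert spaces.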
Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints
In this paper we consider a class of optimization problems with a strongly
convex objective function and the feasible set given by an intersection of a
simple convex set with a set given by a number of linear equality and
inequality constraints. Many optimization problems in applications can be
stated in this form; examples include entropy-linear programming, ridge
regression, the elastic net, and regularized optimal transport. We extend the
Fast Gradient Method applied to the dual problem so that it becomes
primal-dual: it not only solves the dual problem but also constructs a nearly
optimal and nearly feasible solution of the primal problem. We
also prove a theorem about the convergence rate for the proposed algorithm in
terms of the objective function value and the linear constraint infeasibility.
Comment: Submitted for DOOR 201
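A hedged sketch of the dual fast-gradient idea in the simplest instance, min (1/2)||x - c||^2 s.t. Ax = b, where the inner Lagrangian minimization has a closed form. The paper's algorithm, which handles general strongly convex objectives with equality and inequality constraints and recovers the primal solution from weighted averages of the iterates, is more refined than this toy version.

```python
import numpy as np

def dual_fast_gradient(A, b, c, n_iter=3000):
    """Solve min (1/2)||x - c||^2  s.t.  Ax = b via Nesterov's method on the dual.

    For this 1-strongly convex objective the inner minimization is closed-form,
    x(lam) = c - A.T @ lam, and the concave dual has a Lipschitz gradient
    A @ x(lam) - b with constant ||A||_2^2.
    """
    L = np.linalg.norm(A, 2) ** 2
    lam = np.zeros(len(b))
    y, t = lam.copy(), 1.0
    for _ in range(n_iter):
        x = c - A.T @ y                  # primal point for the current dual iterate
        lam_new = y + (A @ x - b) / L    # gradient ascent step on the dual
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = lam_new + ((t - 1) / t_new) * (lam_new - lam)   # momentum step
        lam, t = lam_new, t_new
    return c - A.T @ lam                 # primal solution recovered from the dual
```

Because strong convexity makes x(lam) continuous in lam, driving the dual iterate toward the dual optimum also drives x(lam) toward the primal optimum; the last primal point is used here only because this toy problem admits it.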
Entropic Projections and Dominating Points
Generalized entropic projections and dominating points are solutions to
convex minimization problems related to conditional laws of large numbers. They
appear in many areas of applied mathematics such as statistical physics,
information theory, mathematical statistics, ill-posed inverse problems or
large deviation theory. By means of convex conjugate duality and functional
analysis, criteria are derived for their existence. Representations of the
generalized entropic projections are obtained: they are the "measure
component" of some extended entropy minimization problem.
Comment: ESAIM P&S (2011), to appear
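On a finite alphabet the entropic projection takes a simple concrete form: minimizing the relative entropy KL(p || r) under a mean constraint yields an exponential tilt of the reference measure r, with the Lagrange multiplier found by one-dimensional root finding. A toy sketch under these assumptions (the fair-die example is illustrative, not from the paper):

```python
import numpy as np

def entropic_projection(r, a, m, lo=-50.0, hi=50.0, tol=1e-12):
    """I-projection of r onto {p : sum_i p_i a_i = m} on a finite alphabet.

    Minimizing KL(p || r) under the mean constraint gives an exponential tilt
    p_i proportional to r_i * exp(theta * a_i); bisection finds theta.
    """
    def tilted(theta):
        logw = theta * a + np.log(r)
        w = np.exp(logw - logw.max())    # shift for numerical stability
        p = w / w.sum()
        return p, p @ a
    while hi - lo > tol:                 # the tilted mean is increasing in theta
        mid = (lo + hi) / 2
        if tilted(mid)[1] < m:
            lo = mid
        else:
            hi = mid
    return tilted((lo + hi) / 2)[0]

# Tilt a fair die toward mean 4.5 (a classic conditional-LLN example).
r = np.ones(6) / 6
a = np.arange(1.0, 7.0)
p = entropic_projection(r, a, 4.5)
```

This is the discrete shadow of the conditional law of large numbers mentioned above: conditioned on an atypical empirical mean, the empirical distribution concentrates near this tilted measure.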