Minimum Relative Entropy for Quantum Estimation: Feasibility and General Solution
We propose a general framework for solving quantum state estimation problems
using the minimum relative entropy criterion. A convex optimization approach
allows us to decide the feasibility of the problem given the data and, whenever
necessary, to relax the constraints in order to allow for a physically
admissible solution. Building on these results, the variational analysis can be
completed ensuring existence and uniqueness of the optimum. The latter can then
be computed by efficient standard algorithms for convex optimization,
without resorting to approximate methods or restrictive assumptions on its
rank.
Comment: 9 pages, no figures
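One ingredient of relaxing the constraints toward a physically admissible solution can be illustrated in a minimal NumPy sketch: projecting a Hermitian estimate (which may have negative eigenvalues) onto the set of density matrices, i.e. positive semidefinite matrices with unit trace. The function name and the projection-onto-the-simplex route are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def project_to_density_matrix(H):
    """Project a Hermitian matrix onto the set of density matrices
    (positive semidefinite, unit trace) by Euclidean projection of
    its eigenvalues onto the probability simplex."""
    w, V = np.linalg.eigh(H)
    # Projection of eigenvalues onto {x : x >= 0, sum(x) = 1}
    u = np.sort(w)[::-1]                  # eigenvalues, descending
    css = np.cumsum(u)
    k = np.arange(1, len(u) + 1)
    cond = u - (css - 1) / k > 0
    k_star = k[cond][-1]                  # largest index satisfying the condition
    theta = (css[cond][-1] - 1) / k_star  # shift that normalizes the trace
    w_proj = np.maximum(w - theta, 0)
    return (V * w_proj) @ V.conj().T

# Example: an unphysical "estimate" with a negative eigenvalue
H = np.array([[0.9, 0.3], [0.3, -0.1]])
rho = project_to_density_matrix(H)  # PSD with trace 1
```

The projected `rho` is the closest density matrix to `H` in Frobenius norm; in a minimum-relative-entropy pipeline such a step would only be a fallback when the raw data admit no physical state.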
Stochastic Variance Reduction Methods for Saddle-Point Problems
We consider convex-concave saddle-point problems where the objective
functions may be split in many components, and extend recent stochastic
variance reduction methods (such as SVRG or SAGA) to provide the first
large-scale linearly convergent algorithms for this class of problems, which is
common in machine learning. While the algorithmic extension is straightforward,
it comes with challenges and opportunities: (a) the convex minimization
analysis does not apply and we use the notion of monotone operators to prove
convergence, showing in particular that the same algorithm applies to a larger
class of problems, such as variational inequalities, (b) there are two notions
of splits, in terms of functions, or in terms of partial derivatives, (c) the
split does need to be done with convex-concave terms, (d) non-uniform sampling
is key to an efficient algorithm, both in theory and practice, and (e) these
incremental algorithms can be easily accelerated using a simple extension of
the "catalyst" framework, leading to an algorithm which is always superior to
accelerated batch algorithms.
Comment: Neural Information Processing Systems (NIPS), 2016, Barcelona, Spain
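The variance-reduction idea for saddle points can be sketched on a toy problem: an SVRG-style update applied to the monotone operator of a strongly convex-concave quadratic, where each component operator is F_i(x, y) = (x + b_i y, -b_i x + y). The toy objective, step size, and epoch schedule below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
b = rng.normal(size=n)  # per-component coupling coefficients (toy data)

def F_i(z, i):
    # Monotone operator of the i-th component of
    # L_i(x, y) = x^2/2 + b_i*x*y - y^2/2:
    # (gradient in x, minus gradient in y)
    x, y = z
    return np.array([x + b[i] * y, -b[i] * x + y])

def F_full(z):
    # Full operator: average of F_i over all components
    x, y = z
    bbar = b.mean()
    return np.array([x + bbar * y, -bbar * x + y])

def svrg_saddle(z0, step=0.1, epochs=30, inner=100):
    """SVRG-style iteration on the operator: at each snapshot, store the
    full operator; inner steps use a variance-reduced stochastic estimate."""
    z = z0.copy()
    for _ in range(epochs):
        z_snap = z.copy()
        g_snap = F_full(z_snap)
        for _ in range(inner):
            i = rng.integers(n)
            g = F_i(z, i) - F_i(z_snap, i) + g_snap  # unbiased, low variance
            z = z - step * g
    return z

z_star = svrg_saddle(np.array([5.0, -3.0]))  # converges toward the saddle (0, 0)
```

Because the correction term vanishes as the snapshot approaches the saddle point, the iteration converges linearly with a constant step size, which is the property plain stochastic gradient descent-ascent lacks.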