78 research outputs found
Forward-backward truncated Newton methods for convex composite optimization
This paper proposes two proximal Newton-CG methods for convex nonsmooth
optimization problems in composite form. The algorithms are based on a
reformulation of the original nonsmooth problem as the unconstrained
minimization of a continuously differentiable function, namely the
forward-backward envelope (FBE). The first algorithm uses a standard
line search strategy, whereas the second combines the global efficiency
estimates of the corresponding first-order methods with fast asymptotic
convergence rates. Furthermore, the methods are computationally attractive
since each Newton iteration requires the approximate solution of a linear
system of usually small dimension.
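The FBE can be written down concretely for a lasso-type composite objective F(x) = 0.5||Ax - b||^2 + lam*||x||_1. The sketch below is illustrative and not taken from the paper: the function names and the lasso instance are our own, and it only evaluates the envelope and the forward-backward step, whereas the paper's methods go on to minimize the envelope with truncated-Newton directions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fbe_lasso(x, A, b, lam, gamma):
    """Forward-backward envelope of F(x) = 0.5*||Ax - b||^2 + lam*||x||_1.

    phi_gamma(x) = f(x) + <grad f(x), z - x> + ||z - x||^2/(2*gamma) + g(z),
    where z = prox_{gamma*g}(x - gamma*grad f(x)) is the forward-backward
    (proximal gradient) step.  Returns (phi_gamma(x), z).
    """
    r = A @ x - b
    f = 0.5 * (r @ r)
    grad = A.T @ r
    z = soft_threshold(x - gamma * grad, gamma * lam)  # forward-backward step
    d = z - x
    phi = f + grad @ d + (d @ d) / (2.0 * gamma) + lam * np.abs(z).sum()
    return phi, z
```

For a step size gamma below the reciprocal of the Lipschitz constant of the gradient, phi_gamma(x) sits between F(z) and F(x), which is the sandwich property that makes minimizing the smooth envelope equivalent to minimizing the original nonsmooth objective.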
Recovery under Side Constraints
This paper addresses sparse signal reconstruction under various types of
structural side constraints with applications in multi-antenna systems. Side
constraints may result from prior information on the measurement system and the
sparse signal structure. They may involve the structure of the sensing matrix,
the structure of the non-zero support values, the temporal structure of the
sparse representation vector, and the nonlinear measurement structure. First, we
demonstrate how a priori information in the form of structural side constraints
influences recovery guarantees (null space properties) using L1-minimization.
Furthermore, for constant modulus signals, signals with row-, block- and
rank-sparsity, as well as non-circular signals, we illustrate how structural
prior information can be used to devise efficient algorithms with improved
recovery performance and reduced computational complexity. Finally, we address
the measurement system design for linear and nonlinear measurements of sparse
signals. Moreover, we discuss the linear mixing matrix design based on
coherence minimization. Then we extend our focus to nonlinear measurement
systems where we design parallel optimization algorithms to efficiently compute
stationary points in the sparse phase retrieval problem with and without
dictionary learning.
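The mixing-matrix design mentioned above is driven by the mutual coherence of the sensing matrix, i.e. the largest correlation between distinct normalized columns, which such designs seek to minimize. A minimal sketch (the function name is our own):

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute inner product between distinct normalized columns of A."""
    G = A / np.linalg.norm(A, axis=0)  # normalize each column
    M = np.abs(G.T @ G)                # column-correlation (Gram) matrix
    np.fill_diagonal(M, 0.0)           # ignore trivial self-correlations
    return M.max()
```

An orthonormal matrix has coherence 0, while a matrix with two parallel columns has coherence 1; smaller coherence gives stronger sparse-recovery guarantees for L1-minimization.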
Linear programming on the Stiefel manifold
Linear programming on the Stiefel manifold (LPS) is studied for the first
time. It aims at minimizing a linear objective function over the set of all
-tuples of orthonormal vectors in satisfying additional
linear constraints. Despite the classical polynomial-time solvable case ,
general (LPS) is NP-hard. According to the Shapiro-Barvinok-Pataki theorem,
(LPS) admits an exact semidefinite programming (SDP) relaxation when
, which is tight when . Surprisingly, we can greatly
strengthen this sufficient exactness condition to , which covers the
classical case and . Regarding (LPS) as a smooth nonlinear
programming problem, we reveal a nice property that under the linear
independence constraint qualification, the standard first- and second-order
local necessary optimality conditions are sufficient for global
optimality when
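The classical polynomial-time solvable case referred to above — a linear objective over the Stiefel manifold with no additional linear constraints — has a closed-form SVD solution: min tr(C^T X) subject to X^T X = I equals minus the nuclear norm of C, attained at the negative polar factor of C. A sketch under that assumption (helper name ours):

```python
import numpy as np

def min_linear_stiefel(C):
    """Minimize tr(C^T X) over the Stiefel manifold {X : X^T X = I}.

    For the thin SVD C = U @ diag(s) @ V^T (C of size n x p, n >= p),
    the minimizer is X* = -U @ V^T and the optimal value is -sum(s),
    i.e. minus the nuclear norm of C.
    """
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return -U @ Vt, -s.sum()
```

Adding general linear constraints destroys this structure, which is the source of the NP-hardness result stated in the abstract.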
- …