4 research outputs found
Convex separable problems with linear and box constraints
In this work, we focus on separable convex optimization problems with linear
and box constraints and compute the solution in closed-form as a function of
some Lagrange multipliers that can be easily computed in a finite number of
iterations. This allows us to bridge the gap between a wide family of power
allocation problems of practical interest in signal processing and
communications and their efficient implementation in practice.
Comment: 5 pages, 2 figures. Published at IEEE International Conference on
Acoustics, Speech and Signal Processing (ICASSP 2014).
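The closed-form-in-the-multiplier structure described in this abstract is exemplified by the classic water-filling power allocation. A minimal sketch, assuming the canonical rate-maximization instance (maximize sum of log(1 + g_i * p_i) subject to sum(p_i) = P, p_i >= 0) and using plain bisection on the Lagrange multiplier as a simplified stand-in for the paper's finite-step procedure:

```python
def waterfill(gains, total_power):
    """Water-filling: the optimal power for channel i is
    p_i = max(0, mu - 1/g_i), where the water level mu is the
    Lagrange multiplier of the total-power constraint. Here mu is
    found by bisection (illustrative; the paper computes the
    multipliers exactly in finitely many iterations)."""
    lo, hi = 0.0, total_power + max(1.0 / g for g in gains)
    for _ in range(100):  # bisect on the water level mu
        mu = 0.5 * (lo + hi)
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > total_power:
            hi = mu
        else:
            lo = mu
    mu = 0.5 * (lo + hi)
    return [max(0.0, mu - 1.0 / g) for g in gains]

powers = waterfill([2.0, 1.0, 0.5], 1.0)
# powers ≈ [0.75, 0.25, 0.0]: stronger channels receive more power
```

The box-constraint variant only changes the clipping in the closed form; the multiplier search is unchanged.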
On Solving Convex Optimization Problems with Linear Ascending Constraints
In this paper, we propose two algorithms for solving convex optimization
problems with linear ascending constraints. When the objective function is
separable, we propose a dual method which terminates in a finite number of
iterations. In particular, the worst case complexity of our dual method
improves over the best-known result for this problem in Padakandla and
Sundaresan [SIAM J. Optimization, 20 (2009), pp. 1185-1204]. We then propose a
gradient projection method to solve a more general class of problems in which
the objective function is not necessarily separable. Numerical experiments show
that both our algorithms work well in test problems.
Comment: 20 pages. The final version of this paper is published in
Optimization Letters.
A Decomposition Algorithm for Nested Resource Allocation Problems
We propose an exact polynomial algorithm for a resource allocation problem
with convex costs and constraints on partial sums of resource consumptions, in
the presence of either continuous or integer variables. No assumption of strict
convexity or differentiability is needed. The method solves a hierarchy of
resource allocation subproblems, whose solutions are used to convert
constraints on sums of resources into bounds for separate variables at higher
levels. The resulting time complexity for the integer problem is
O(n log m log(B/n)), and the complexity of obtaining an ε-approximate
solution for the continuous case is O(n log m log(B/ε)), n being
the number of variables, m the number of ascending constraints (such
that m ≤ n), ε a desired precision, and B the total resource. This
algorithm attains the best-known complexity when m = n, and improves it when
log m = o(log n). Extensive experimental analyses are conducted with four
recent algorithms on various continuous problems issued from theory and
practice. The proposed method achieves a higher performance than previous
algorithms, addressing all problems with up to one million variables in less
than one minute on a modern computer.
Comment: Working Paper -- MIT, 23 pages.
Separable Convex Optimization with Nested Lower and Upper Constraints
We study a convex resource allocation problem in which lower and upper bounds
are imposed on partial sums of allocations. This model is linked to a large
range of applications, including production planning, speed optimization,
stratified sampling, support vector machines, portfolio management, and
telecommunications. We propose an efficient gradient-free divide-and-conquer
algorithm, which uses monotonicity arguments to generate valid bounds from the
recursive calls, and eliminates linking constraints based on the information
from sub-problems. This algorithm does not need strict convexity or
differentiability. It produces an ε-approximate solution for the
continuous problem in O(n log m log(B/ε)) time
and an integer solution in O(n log m log B) time, where n is
the number of decision variables, m is the number of constraints, and B is
the resource bound. A complexity of O(n log m) is also achieved
for the linear and quadratic cases. These are the best complexities known to
date for this important problem class. Our experimental analyses confirm the
good performance of the method, which produces optimal solutions for problems
with up to 1,000,000 variables in a few seconds. Promising applications to the
support vector ordinal regression problem are also investigated.
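The nested-constraint model can be made concrete with a small sketch. This is not the paper's divide-and-conquer algorithm but the textbook incremental greedy baseline for the integer problem (nested upper bounds on partial sums form a polymatroid, for which the greedy is optimal with convex separable costs), shown only to illustrate the problem structure; its O(B log n) running time is precisely what the decomposition methods above improve upon:

```python
import heapq

def greedy_nested(costs, B, ub):
    """Minimize sum_i f_i(x_i) over nonnegative integers with
    sum_i x_i = B and nested bounds x_0 + ... + x_k <= ub[k].
    Each unit of resource goes to the cheapest marginal increment
    that keeps all nested partial sums feasible."""
    n = len(costs)
    x = [0] * n
    # heap of (marginal cost of the next unit, variable index)
    heap = [(costs[i](1) - costs[i](0), i) for i in range(n)]
    heapq.heapify(heap)
    placed = 0
    while placed < B and heap:
        delta, i = heapq.heappop(heap)
        # one more unit at i raises every partial sum covering i
        if all(sum(x[:k + 1]) + 1 <= ub[k] for k in range(i, n)):
            x[i] += 1
            placed += 1
            heapq.heappush(heap, (costs[i](x[i] + 1) - costs[i](x[i]), i))
        # else: i stays blocked forever (partial sums only grow), drop it
    return x

# quadratic costs f_i(v) = w_i * v^2 with a tight bound on the first variable
alloc = greedy_nested(
    [lambda v: 1 * v * v, lambda v: 2 * v * v, lambda v: 3 * v * v],
    B=6, ub=[2, 5, 6])
# alloc == [2, 2, 2]: the cap x_0 <= 2 forces resource to costlier variables
```

Handling lower bounds on partial sums as well, without strict convexity, is exactly where the recursive bounding of the divide-and-conquer method comes in.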