On Solving Convex Optimization Problems with Linear Ascending Constraints
In this paper, we propose two algorithms for solving convex optimization
problems with linear ascending constraints. When the objective function is
separable, we propose a dual method which terminates in a finite number of
iterations. In particular, the worst case complexity of our dual method
improves over the best-known result for this problem in Padakandla and
Sundaresan [SIAM J. Optimization, 20 (2009), pp. 1185-1204]. We then propose a
gradient projection method to solve a more general class of problems in which
the objective function is not necessarily separable. Numerical experiments show
that both our algorithms work well on test problems.
Comment: 20 pages. The final version of this paper is published in Optimization Letters.
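To make the problem class concrete, here is a small sketch of a convex separable objective under linear ascending constraints (cumulative-sum bounds with a nondecreasing right-hand side). All numbers are hypothetical, and the generic SLSQP solver stands in for the paper's dual and gradient projection methods, which are not reproduced here:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical instance: minimize the separable convex objective
#   f(x) = sum_i (x_i - c_i)^2
# subject to the linear ascending constraints
#   x_1 + ... + x_k <= a_k  for k = 1..n-1,
#   x_1 + ... + x_n  = a_n,   x >= 0,
# where a_1 <= a_2 <= ... <= a_n ("ascending").
c = np.array([3.0, 1.0, 2.0, 4.0])
a = np.array([2.0, 4.0, 7.0, 8.0])
n = len(c)

def f(x):
    # Separable convex objective: sum of per-coordinate quadratics.
    return float(np.sum((x - c) ** 2))

# Partial-sum inequalities a_k - sum_{i<=k} x_i >= 0, plus the total-sum equality.
cons = [{"type": "ineq", "fun": lambda x, k=k: a[k] - np.sum(x[: k + 1])}
        for k in range(n - 1)]
cons.append({"type": "eq", "fun": lambda x: np.sum(x) - a[-1]})

res = minimize(f, x0=np.full(n, a[-1] / n), method="SLSQP",
               bounds=[(0, None)] * n, constraints=cons)
x = res.x
```

The starting point spreads the total budget `a[-1]` evenly, which is feasible here; any feasible point would do for this illustration.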
Scheduling under Linear Constraints
We introduce a parallel machine scheduling problem in which the processing
times of jobs are not given in advance but are determined by a system of linear
constraints. The objective is to minimize the makespan, i.e., the maximum job
completion time among all feasible choices. This novel problem is motivated by
various real-world application scenarios. We discuss the computational
complexity and algorithms for various settings of this problem. In particular,
we show that if there is only one machine with an arbitrary number of linear
constraints, or there is an arbitrary number of machines with no more than two
linear constraints, or both the number of machines and the number of linear
constraints are fixed constants, then the problem is polynomial-time solvable
via solving a series of linear programming problems. If both the number of
machines and the number of constraints are inputs of the problem instance, then
the problem is NP-hard. We further propose several approximation algorithms for
the latter case.
Comment: 21 pages
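The single-machine case above can be illustrated directly: with one machine the makespan is just the sum of the processing times, so minimizing it over a polytope of feasible processing times is a single linear program. The constraint matrix below is a hypothetical instance, not one from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: processing times p may be any point of the polytope
# {A p >= b, p >= 0}; on a single machine the makespan equals sum_j p_j,
# so minimizing it is one LP.
A = np.array([[1.0, 1.0, 0.0],   # p1 + p2      >= 4
              [0.0, 1.0, 2.0]])  #      p2 + 2*p3 >= 6
b = np.array([4.0, 6.0])

# linprog minimizes c^T p subject to A_ub @ p <= b_ub, so the ">=" rows
# are negated to fit that convention.
res = linprog(c=np.ones(3), A_ub=-A, b_ub=-b, bounds=[(0, None)] * 3)
makespan = res.fun  # optimal value 5.0 at, e.g., p = (0, 4, 1)
```

With multiple machines an assignment of jobs to machines must also be chosen, which is where the enumeration over a series of such LPs (and the hardness for many machines and constraints) comes in.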