842 research outputs found

    How the structure of precedence constraints may change the complexity class of scheduling problems

    Full text link
    This survey aims at demonstrating that the structure of precedence constraints plays a tremendous role in the complexity of scheduling problems. Indeed, many problems are NP-hard under general precedence constraints, while they become polynomially solvable for particular classes of precedence constraints. We also show that many exciting challenges remain in this research area.
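    As a concrete illustration of how restricted precedence structure can collapse the complexity: for unit-time jobs under in-tree precedence on identical machines, Hu's classic level algorithm minimizes the makespan in polynomial time, whereas the same problem with general precedence constraints is NP-hard. The sketch below is only illustrative; the function name and toy instance are ours, not the survey's.

        # Minimal sketch of Hu's level algorithm: unit-time jobs, in-tree precedence
        # (each job has at most one successor), m identical machines, minimize makespan.
        def hu_schedule(succ, m):
            # succ maps each job to its unique successor, or to None for the root.
            level = {}
            def lvl(j):                             # number of jobs on the path from j to the root
                if j not in level:
                    level[j] = 1 if succ[j] is None else 1 + lvl(succ[j])
                return level[j]
            pending = {j: sum(1 for i in succ if succ[i] == j) for j in succ}
            ready = [j for j in succ if pending[j] == 0]
            slots, done = [], 0
            while done < len(succ):
                ready.sort(key=lvl, reverse=True)   # highest level (longest chain to the root) first
                batch, ready = ready[:m], ready[m:]
                slots.append(batch)                 # jobs executed in this unit-length time slot
                done += len(batch)
                for j in batch:                     # release successors whose predecessors are all done
                    s = succ[j]
                    if s is not None:
                        pending[s] -= 1
                        if pending[s] == 0:
                            ready.append(s)
            return slots

        # Toy in-tree 1->3, 2->3, 3->5, 4->5 on two machines: makespan 3, e.g. [[1, 2], [4, 3], [5]].
        print(hu_schedule({1: 3, 2: 3, 3: 5, 4: 5, 5: None}, m=2))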

    Throughput Maximization in Multiprocessor Speed-Scaling

    Full text link
    We are given a set of $n$ jobs that have to be executed on a set of $m$ speed-scalable machines that can vary their speeds dynamically using the energy model introduced in [Yao et al., FOCS'95]. Every job $j$ is characterized by its release date $r_j$, its deadline $d_j$, its processing volume $p_{i,j}$ if $j$ is executed on machine $i$, and its weight $w_j$. We are also given a budget of energy $E$ and our objective is to maximize the weighted throughput, i.e. the total weight of jobs that are completed between their respective release dates and deadlines. We propose a polynomial-time approximation algorithm where the preemption of the jobs is allowed but not their migration. Our algorithm uses a primal-dual approach on a linearized version of a convex program with linear constraints. Furthermore, we present two optimal algorithms for the non-preemptive case where the number of machines is bounded by a fixed constant. More specifically, we consider: (a) the case of identical processing volumes, i.e. $p_{i,j}=p$ for every $i$ and $j$, for which we present a polynomial-time algorithm for the unweighted version, which becomes a pseudopolynomial-time algorithm for the weighted throughput version, and (b) the case of agreeable instances, i.e. for which $r_i \le r_j$ if and only if $d_i \le d_j$, for which we present a pseudopolynomial-time algorithm. Both algorithms are based on a discretization of the problem and the use of dynamic programming.
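    For orientation, the speed-scaling energy model of [Yao et al., FOCS'95] invoked above, together with the weighted-throughput objective, can be summarized as follows; this is a standard restatement (with the usual convention $\alpha > 1$), not the paper's exact formulation.

        % Power grows as speed^alpha; energy is power integrated over time, summed over machines.
        \[
          P(s) = s^{\alpha} \quad (\alpha > 1), \qquad
          E_{\mathrm{used}} = \sum_{i=1}^{m} \int_{0}^{\infty} s_i(t)^{\alpha}\, dt .
        \]
        % Weighted throughput under the energy budget E; without migration, each scheduled
        % job j must receive its full volume p_{i,j} on a single machine i within [r_j, d_j].
        \[
          \max_{S \subseteq \{1,\dots,n\}} \; \sum_{j \in S} w_j
          \qquad \text{s.t.} \qquad E_{\mathrm{used}} \le E .
        \]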

    Parameterized complexity of machine scheduling: 15 open problems

    Full text link
    Machine scheduling problems are a long-time key domain of algorithms and complexity research. A novel approach to machine scheduling problems is the use of fixed-parameter algorithms. To stimulate this thriving research direction, we propose 15 open questions in this area whose resolution we expect to lead to the discovery of new approaches and techniques both in scheduling and parameterized complexity theory. Comment: Version accepted to Computers & Operations Research.
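    For readers outside parameterized complexity, a fixed-parameter algorithm for a problem with parameter $k$ (for instance the number of machines or of distinct processing times) is one whose running time on an instance $I$ is bounded as below for some computable function $f$ of the parameter alone; this is the standard definition, not anything specific to the paper.

        % Fixed-parameter tractable (FPT) running time: the superpolynomial part
        % depends only on the parameter k, not on the instance size |I|.
        \[
          f(k) \cdot |I|^{O(1)} .
        \]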

    Single-machine scheduling with stepwise tardiness costs and release times

    Get PDF
    We study a scheduling problem that belongs to the yard operations component of railroad planning problems, namely the hump sequencing problem. The scheduling problem is characterized as a single-machine problem with stepwise tardiness cost objectives. This is a new scheduling criterion, which is also relevant in the context of traditional machine scheduling problems. We produce complexity results that characterize some cases of the problem as pseudo-polynomially solvable. For the difficult-to-solve cases of the problem, we develop mathematical programming formulations and propose heuristic algorithms. We test the formulations and heuristic algorithms on randomly generated single-machine scheduling problems and real-life datasets for the hump sequencing problem. Our experiments show promising results for both sets of problems.
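    The stepwise tardiness criterion can be made concrete with a small sketch: each job incurs a cost given by a nondecreasing step function of its completion time, defined by breakpoints and cost levels. The function names and the toy instance below are illustrative assumptions, not the paper's notation.

        import bisect

        def stepwise_tardiness(completion_time, breakpoints, levels):
            # Cost jumps to levels[k] once the completion time exceeds the k-th breakpoint.
            # breakpoints is sorted; levels has one more entry than breakpoints, with levels[0] = 0.
            return levels[bisect.bisect_left(breakpoints, completion_time)]

        def sequence_cost(sequence, proc, breaks, levels):
            # Total stepwise tardiness when the jobs run back to back on one machine.
            t, total = 0, 0
            for j in sequence:
                t += proc[j]
                total += stepwise_tardiness(t, breaks[j], levels[j])
            return total

        # Toy instance: two jobs with a single breakpoint each (an ordinary due date) and unit penalty.
        proc = {1: 3, 2: 2}
        breaks = {1: [4], 2: [2]}
        levels = {1: [0, 1], 2: [0, 1]}
        print(sequence_cost([2, 1], proc, breaks, levels))  # job 2 on time, job 1 finishes at 5: cost 1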

    A common framework and taxonomy for multicriteria scheduling problems with Interfering and competing Jobs: Multi-agent scheduling problems

    Get PDF
    Most classical scheduling research assumes that the objectives sought are common to all jobs to be scheduled. However, many real-life applications can be modeled by considering different sets of jobs, each one with its own objective(s), and an increasing number of papers addressing these problems has appeared over the last few years. Since the area has so far lacked a unified view, the studied problems have received different names (such as interfering jobs, multi-agent scheduling, mixed-criteria, etc.), some authors do not seem to be aware of important contributions in related problems, and solution procedures are often developed without taking into account existing ones. Therefore, the topic is in need of a common framework that allows for a systematic collection of existing contributions, as well as a clear definition of the main research avenues. In this paper we review multicriteria scheduling problems involving two or more sets of jobs and propose a unified framework providing a common definition, name and notation for these problems. Moreover, we systematically review and classify the existing contributions in terms of the complexity of the problems and the proposed solution procedures, discuss the main advances, and point out future research lines in the topic.

    The Lazy Bureaucrat Scheduling Problem

    Full text link
    We introduce a new class of scheduling problems in which the optimization is performed by the worker (single "machine") who performs the tasks. A typical worker's objective is to minimize the amount of work he does (he is "lazy"), or more generally, to schedule as inefficiently (in some sense) as possible. The worker is subject to the constraint that he must be busy when there is work that he can do; we make this notion precise both in the preemptive and nonpreemptive settings. The resulting class of "perverse" scheduling problems, which we denote "Lazy Bureaucrat Problems," gives rise to a rich set of new questions that explore the distinction between maximization and minimization in computing optimal schedules. Comment: 19 pages, 2 figures, LaTeX. To appear, Information and Computation.
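    A toy brute-force sketch can convey the flavor under one simple reading of the rules: jobs are non-preemptive, all released at time 0, a job may be skipped only once it can no longer meet its deadline, and the objective is to minimize the total work performed. The instance and this exact feasibility rule are our illustrative assumptions, not the paper's formal model.

        from itertools import permutations

        def work_done(order, jobs):
            # Consider jobs in the given order; a still-completable job cannot be skipped.
            t, total = 0, 0
            for j in order:
                p, d = jobs[j]
                if t + p <= d:          # job can still meet its deadline, so the worker must do it
                    t += p
                    total += p
            return total

        jobs = {"A": (1, 5), "B": (4, 4)}   # job -> (processing time, deadline)
        best = min(permutations(jobs), key=lambda order: work_done(order, jobs))
        # Doing the short job A first lets B expire, so only 1 unit of work is forced.
        print(best, work_done(best, jobs))  # ('A', 'B') 1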

    Throughput Maximization in the Speed-Scaling Setting

    Get PDF
    We are given a set of $n$ jobs and a single processor that can vary its speed dynamically. Each job $J_j$ is characterized by its processing requirement (work) $p_j$, its release date $r_j$ and its deadline $d_j$. We are also given a budget of energy $E$ and we study the scheduling problem of maximizing the throughput (i.e. the number of jobs which are completed on time). We propose a dynamic programming algorithm that solves the preemptive case of the problem, i.e. when the execution of the jobs may be interrupted and resumed later, in pseudo-polynomial time. Our algorithm can be adapted for solving the weighted version of the problem where every job is associated with a weight $w_j$ and the objective is the maximization of the sum of the weights of the jobs that are completed on time. Moreover, we provide a strongly polynomial time algorithm to solve the non-preemptive unweighted case when the jobs have the same processing requirements. For the weighted case, our algorithm can be adapted for solving the non-preemptive version of the problem in pseudo-polynomial time. Comment: Submitted to SODA 201
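    A standard fact underlying such speed-scaling algorithms (assuming the usual power function $P(s)=s^{\alpha}$ with $\alpha>1$, which this abstract does not spell out) is that, once the set of on-time jobs and the time allotted to each is fixed, convexity makes a constant speed energy-optimal: executing $p_j$ units of work during $\ell$ time units costs

        % Energy for running work p_j at the constant speed p_j / ell for ell time units.
        \[
          E_j \;=\; \ell \cdot \Bigl(\frac{p_j}{\ell}\Bigr)^{\alpha} \;=\; \frac{p_j^{\alpha}}{\ell^{\alpha-1}} .
        \]

    The combinatorial difficulty therefore lies in choosing which jobs to complete and how to divide the available time among them.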

    Structural Properties of an Open Problem in Preemptive Scheduling

    Get PDF
    Structural properties of optimal preemptive schedules have been studied in a number of recent papers with a primary focus on two structural parameters: the minimum number of preemptions necessary, and a tight lower bound on 'shifts', i.e., the sizes of intervals bounded by the times created by preemptions, job starts, or completions. So far only rough bounds for these parameters have been derived for specific problems. This paper sharpens the bounds on these structural parameters for a well-known open problem in the theory of preemptive scheduling: Instances consist of in-trees of $n$ unit-execution-time jobs with release dates, and the objective is to minimize the total completion time on two processors. This is among the current, tantalizing 'threshold' problems of scheduling theory: Our literature survey reveals that any significant generalization leads to an NP-hard problem, but that any significant simplification leads to a tractable problem. For the above problem, we show that the number of preemptions necessary for optimality need not exceed $2n-1$; that the number must be of order $\Omega(\log n)$ for some instances; and that the minimum shift need not be less than $2^{-2n+1}$. These bounds are obtained by combinatorial analysis of optimal schedules rather than by the analysis of polytope corners for linear-program formulations, an approach to be found in earlier papers. The bounds immediately follow from a fundamental structural property called 'normality', by which the minimal shifts of a job decrease exponentially. In particular, the first interval between a preempted job's start and its preemption is a multiple of 1/2, the second such interval is a multiple of 1/4, and in general, the $i$-th preemption occurs at a multiple of $2^{-i}$. We expect the new structural properties to play a prominent role in finally settling a vexing, still-open question of complexity.
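    The shift bound quoted above is consistent with the normality property: if the $i$-th preemption falls on a multiple of $2^{-i}$ and at most $2n-1$ preemptions are needed, then (assuming integer release dates, our reading of the setting) every event time is a multiple of $2^{-(2n-1)}$, so no two distinct event times can be closer than

        \[
          2^{-(2n-1)} \;=\; 2^{-2n+1} .
        \]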