An anticipative scheduling approach with controllable processing times
In practice, machine schedules are usually subject to disruptions, which have to be repaired by reactive scheduling decisions. The most popular predictive approach in the project management and machine scheduling literature is to leave idle times (time buffers) in schedules to cope with disruptions, i.e., the resources will be under-utilized. Therefore, preparing initial schedules by considering possible disruption times along with rescheduling objectives is critical for the performance of rescheduling decisions. In this paper, we show that if the processing times are controllable, then an anticipative approach can be used to form an initial schedule so that the limited capacity of the production resources is utilized more effectively. To illustrate the anticipative scheduling idea, we consider a non-identical parallel machining environment, where processing times can be controlled at a certain compression cost. When there is a disruption during the execution of the initial schedule, a match-up time strategy is utilized such that the repaired schedule has to catch up with the initial schedule at some point in the future. This requires changing machine–job assignments and processing times for the rest of the schedule, which implies increased manufacturing costs. We show that by making anticipative job sequencing decisions based on failure and repair time distributions and the flexibility of jobs, one can repair schedules at a lower manufacturing cost. Our computational results show that the match-up time strategy is very sensitive to the initial schedule and that the proposed anticipative scheduling algorithm can be very helpful in reducing rescheduling costs.
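The match-up idea in this abstract can be sketched on a single machine: after a breakdown of known length, find the earliest point at which the repaired schedule can rejoin the original one by compressing jobs, cheapest first. The code below is an illustrative toy, not the paper's algorithm; the prefix scan, the greedy cheapest-first rule, and the linear compression cost are all simplifying assumptions.

```python
def matchup_repair(jobs, downtime):
    """Find the earliest match-up point on one machine after a disruption.

    jobs: remaining jobs in sequence, each a dict with nominal time 'p',
    fully-compressed time 'p_min', a 'name', and a unit compression 'cost'.
    Returns (match-up index, compression per job, total cost), or None if
    the downtime can never be absorbed.
    """
    for k in range(1, len(jobs) + 1):
        prefix = jobs[:k]
        # Total time the first k jobs can give up by running compressed.
        slack = sum(j["p"] - j["p_min"] for j in prefix)
        if slack >= downtime:
            need, total_cost, compression = downtime, 0.0, {}
            # Greedy: compress the cheapest jobs in the prefix first.
            for j in sorted(prefix, key=lambda j: j["cost"]):
                c = min(j["p"] - j["p_min"], need)
                if c > 0:
                    compression[j["name"]] = c
                    total_cost += c * j["cost"]
                    need -= c
                if need == 0:
                    break
            return k, compression, total_cost
    return None
```

The match-up index grows with the downtime, which mirrors the abstract's point: an initial sequence that places flexible (cheaply compressible) jobs well keeps the repaired segment short and the compression bill small.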
Stochastic lot sizing problem with controllable processing times
In this study, we consider the stochastic capacitated lot sizing problem with controllable processing times, where processing times can be reduced in return for an extra compression cost. We assume that the compression cost function is convex, as this may reflect the increasing marginal cost of larger reductions and may be more appropriate when resource life, energy consumption, or carbon emissions are taken into consideration. We consider this problem under the static uncertainty strategy and α service level constraints. We first introduce a nonlinear mixed-integer programming formulation of the problem, use recent advances in second-order cone programming to strengthen it, and then solve it with a commercial solver. Our computational experiments show that taking the processing times as constant may lead to more costly production plans, and the value of controllable processing times becomes more evident in a stochastic environment with limited capacity. Moreover, we observe that controllable processing times increase solution flexibility and provide a better solution in most problem instances, although the largest improvements are obtained when setup costs are high and the system has medium-sized capacities.
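The abstract's use of second-order cone programming rests on a standard identity: a convex quadratic compression cost can be pushed into a (rotated) second-order cone constraint, which conic solvers handle natively. The snippet below only checks this identity numerically; the function names and tolerance are illustrative, not part of the paper.

```python
import math

def in_rotated_cone(y, s, t, tol=1e-9):
    """Membership test for y**2 <= s*t (with s, t >= 0), written in the
    equivalent second-order cone form ||(2y, s - t)||_2 <= s + t."""
    return math.hypot(2 * y, s - t) <= s + t + tol

def cost_epigraph_ok(c, y, s):
    """Check the epigraph c*y**2 <= s of a quadratic compression cost:
    equivalently (sqrt(c)*y)**2 <= s * 1, a rotated cone with t = 1."""
    return in_rotated_cone(math.sqrt(c) * y, s, 1.0)
```

In the strengthened formulation, each nonlinear cost term is replaced by an auxiliary variable s constrained this way, so the objective stays linear and the nonlinearity lives entirely in cone constraints.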
Robust Resource Allocations in Temporal Networks
Temporal networks describe workflows of time-consuming tasks whose processing order is constrained by precedence relations. In many cases, the durations of the network tasks can be influenced by the assignment of resources. This leads to the problem of selecting an 'optimal' resource allocation, where optimality is measured by network characteristics such as the makespan (i.e., the time required to complete all tasks). In this paper, we study a robust resource allocation problem where the functional relationship between task durations and resource assignments is uncertain and the goal is to minimise the worst-case makespan. We show that this problem is generically NP-hard. We then develop convergent bounds on the optimal objective value, as well as feasible allocations whose objective values are bracketed by these bounds. Numerical results provide empirical support for the proposed method.
Keywords: Robust Optimisation, Temporal Networks, Resource Allocation Problem
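The worst-case makespan objective can be illustrated on a finite uncertainty set: evaluate the critical path of the temporal network under each duration scenario and take the maximum. This is a didactic sketch, not the paper's bounding scheme; it assumes a finite scenario list and task keys listed in topological order.

```python
def makespan(durations, preds):
    """Critical-path length of a temporal network given as a DAG.
    durations maps task -> duration; preds maps task -> predecessor list.
    Assumes the keys of `durations` appear in topological order."""
    finish = {}
    for task, d in durations.items():
        start = max((finish[p] for p in preds.get(task, [])), default=0)
        finish[task] = start + d
    return max(finish.values())

def worst_case_makespan(scenarios, preds):
    """Worst case over a finite set of duration scenarios."""
    return max(makespan(d, preds) for d in scenarios)
```

In the paper's setting the durations vary continuously with the uncertain resource-to-duration mapping, so the worst case cannot be enumerated like this; that is what the convergent upper and lower bounds are for.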
Energy Management
Forecasts point to a huge increase in energy demand over the next 25 years, with a direct and immediate impact on the exhaustion of fossil fuels, the increase in pollution levels, and global warming, which will have significant consequences for all sectors of society. Irrespective of the likelihood of these predictions, or of what researchers in different scientific disciplines may believe or publicly say about how critical the energy situation may be at a world level, it is without doubt one of the great debates that has stirred up public interest in modern times. We should probably already be thinking about the design of a worldwide strategic plan for energy management. It would include measures to raise awareness, educate the different actors involved, develop policies, provide resources, prioritise actions, and establish contingency plans. This process is complex and depends on political, social, economic, and technological factors that are hard to take into account simultaneously. In the meantime, before such a plan is formulated, studies such as those described in this book can serve to illustrate what Information and Communication Technologies have to offer in this sphere and, with luck, to create a reference that encourages investigators in the pursuit of new and better solutions.
Establishing an Extendable Benchmarking Framework for E-Fulfillment
The growth in attended home deliveries motivates research in prescriptive analytics for e-fulfillment. Introducing new analytics solutions, for instance for vehicle routing or revenue management, requires simulation-based benchmarking and analyses on relevant problem scenarios. Unfortunately, creating the required systems induces high overhead for analytics researchers. This paper introduces the simulation-based benchmarking framework SiLFul, which aims to support the scientific rigor and practical relevance of research by reducing this overhead. It provides a toolbox of approaches, a modular and extendable architecture, and a comprehensive, application-related data model. Thereby, it facilitates controllable analyses and transparent, replicable research. Moreover, we propose a research process that leverages the framework for evaluating analytics and allows continuous development of the framework as a community effort.
Two-machine flowshop scheduling with flexible operations and controllable processing times
Ankara : The Department of Industrial Engineering and the Graduate School of Engineering and Science of Bilkent University, 2011. Thesis (Master's) -- Bilkent University, 2011. Includes bibliographical references (leaves 77-84).
In this study, we consider a two-machine flowshop scheduling problem with identical jobs. Each of these jobs has three operations: the first operation must be performed on the first machine, the second operation must be performed on the second machine, and the third operation (the flexible operation) can be performed on either machine but cannot be preempted. Highly flexible CNC machines are capable of performing different operations as long as the required cutting tools are loaded on them. The processing times on these machines can be changed easily, albeit at a higher manufacturing cost, by adjusting machining parameters such as the speed of the machine, the feed rate, and/or the depth of cut. The overall problem is to determine the assignment of the flexible operations to the machines and the processing times for each job simultaneously, with the bicriteria objective of minimizing the manufacturing cost and minimizing the makespan. For such a bicriteria problem, there is no unique optimum but a set of nondominated solutions. Using the ε-constraint approach, the problem can be transformed into minimizing the total manufacturing cost objective for a given upper limit on the makespan objective. The resulting single-criterion problem is a nonlinear mixed-integer formulation. For the cases where the exact algorithm may not be efficient in terms of computation time, we propose an efficient approximation algorithm.
Uruk, Zeynep. M.S.
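The ε-constraint transformation described in the abstract can be sketched as a sweep: for each upper limit ε on the makespan, solve a single-criterion cost-minimisation subproblem and keep the nondominated (makespan, cost) pairs. The toy subproblem below, with its 10/p cost model and identical jobs on one machine, is a made-up stand-in for the thesis's nonlinear mixed-integer formulation.

```python
def pareto_frontier(solve_min_cost, eps_values):
    """Sweep makespan bounds; solve_min_cost(eps) returns a feasible
    (makespan, cost) pair or None. Keep only nondominated pairs."""
    frontier = []
    for eps in sorted(eps_values):
        sol = solve_min_cost(eps)
        if sol and not any(m <= sol[0] and c <= sol[1] for m, c in frontier):
            frontier.append(sol)
    return frontier

def toy_subproblem(eps, n=4, p_min=1.0, p_max=3.0):
    """Toy stand-in: n identical jobs, processing time p per job in
    [p_min, p_max], manufacturing cost 10/p per job (compressing a job
    makes it more expensive). Makespan is n*p."""
    p = min(p_max, eps / n)        # run as slowly (cheaply) as the bound allows
    if p < p_min:
        return None                # the makespan bound cannot be met
    return n * p, n * 10.0 / p
```

Tightening ε trades cost for makespan, tracing out exactly the kind of nondominated set the abstract describes.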
Rescheduling parallel machines with controllable processing times
Ankara : The Department of Industrial Engineering and the Graduate School of Engineering and Science of Bilkent University, 2012. Thesis (Master's) -- Bilkent University, 2012. Includes bibliographical references.
In many manufacturing environments, production does not always proceed as planned. It is often interrupted by a disruption such as a machine breakdown or power loss. In our problem, we are given an original production schedule in a non-identical parallel machine environment, and we assume that one of the machines is disrupted at time t.
Our aim is to revise the schedule, subject to several restrictions. The disrupted machine is unavailable for a certain time. The new schedule has to satisfy the maximum completion time constraint of each machine. Furthermore, the revised start time of a job cannot be earlier than its original start time, because we assume that jobs are not ready before their original start times in the revised schedule.
We therefore look for an alternative solution that decreases the negative impact of the disruption as much as possible. One way to process a disrupted job in the revised schedule is to reallocate it to another machine. The other is to keep it on its original machine but delay its start until after the disruption ends. Since the machines might be fully utilized originally, we may have to compress some of the processing times in order to add a new job to a machine or to reallocate jobs after the disruption ends. Consequently, we assume that the processing times are controllable within given lower and upper bounds.
Our first objective is to minimize the sum of reallocation and nonlinear compression costs. It is also important to deliver orders on time, neither earlier nor later than promised, so we try to maintain the original completion times as much as possible. The second objective is therefore to minimize the total absolute deviation of the completion times in the revised schedule from the original completion times.
We developed a bi-criteria nonlinear mathematical model to solve this non-identical parallel machine rescheduling problem. Since we have two objectives, we handled the second objective by giving it an upper bound and adding this bound as a constraint to the problem. Using second-order cone programming, we solved this mixed-integer nonlinear mathematical model with a commercial MIP solver such as CPLEX. We also propose a decision-tree-based heuristic algorithm. Our algorithm generates a set of solutions for a problem instance, and we test its solution quality by solving the same instances with the mathematical model. According to our computational experiments, the proposed heuristic approach obtains solutions close to those of the mathematical model for the first objective, for a given upper bound on the second objective.
Muhafız, Müge. M.S.
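The core repair decision in the abstract, reallocate a disrupted job or delay it on its own machine and compress the jobs behind it, can be caricatured as a two-option cost comparison. The linear compression cost and the fixed reallocation cost below are toy assumptions; the thesis uses nonlinear compression costs and a full bi-criteria model.

```python
def repair_choice(delay, unit_compression_cost, compressible, realloc_cost):
    """Pick the cheaper repair for one disrupted job.

    delay: time units the job would be pushed back on its own machine;
    compressible: total compression available in the later jobs there;
    realloc_cost: fixed cost of moving the job to another machine.
    """
    options = [("reallocate", realloc_cost)]
    if compressible >= delay:      # later jobs can absorb the delay
        options.append(("delay", delay * unit_compression_cost))
    return min(options, key=lambda o: o[1])
```

When the machines are tightly utilized, `compressible` shrinks and reallocation wins more often, which is the interaction between compression and reallocation that the model captures jointly.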
A Linear Parameter-Varying Approach to Data Predictive Control
By means of the linear parameter-varying (LPV) Fundamental Lemma, we derive novel data-driven predictive control (DPC) methods for LPV systems. In particular, we present output-feedback and state-feedback LPV-DPC methods with terminal ingredients, which guarantee exponential stability and recursive feasibility. We provide methods for the data-based computation of these terminal ingredients. Furthermore, we give an in-depth analysis of the properties and implementation aspects of the LPV-DPC schemes, including alternative recursive formulations, application to nonlinear systems, and the handling of noise-disturbed data. We demonstrate the performance of the proposed methods on a simulation example involving a nonlinear unbalanced disc system.
Comment: Submitted to IEEE TAC. Extended version. 17 pages.
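The LPV Fundamental Lemma extends Willems' Fundamental Lemma for LTI systems; the sketch below illustrates only that simpler LTI building block: one persistently exciting input-output trajectory, arranged in Hankel matrices, predicts the response to a new input by solving a linear system. The scalar system, horizon lengths, and variable names are assumptions for the demo, not the paper's LPV-DPC scheme.

```python
import numpy as np

def hankel(w, L):
    """Stack length-L windows of the signal w (shape (T,)) as columns."""
    return np.column_stack([w[i:i + L] for i in range(len(w) - L + 1)])

# Offline data from an assumed scalar system x+ = 0.5*x + u, y = x.
rng = np.random.default_rng(0)
u = rng.standard_normal(40)
x = np.zeros(41)
for k in range(40):
    x[k + 1] = 0.5 * x[k] + u[k]
y = x[:40]

L, Tini = 4, 2                 # trajectory length and initialisation window
Hu, Hy = hankel(u, L), hankel(y, L)

# New input and, for comparison, the true response from x0 = 0.
u_new = np.array([1.0, -1.0, 0.5, 0.0])
xs = [0.0]
for uk in u_new:
    xs.append(0.5 * xs[-1] + uk)
y_true = np.array(xs[:L])

# Data-driven prediction: find g matching the input and the first Tini
# outputs; the remaining entries of Hy @ g are the predicted outputs.
A = np.vstack([Hu, Hy[:Tini]])
b = np.concatenate([u_new, y_true[:Tini]])
g = np.linalg.lstsq(A, b, rcond=None)[0]
y_pred = Hy @ g                # matches y_true up to numerical error
```

A DPC controller repeats this step inside an optimisation over the future inputs; the LPV version additionally weights the data columns by the measured scheduling signal.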