
    Modeling and Solving Flow Shop Scheduling Problem Considering Worker Resource

    In this paper, an uninterrupted hybrid flow shop scheduling problem is modeled under uncertainty. Because processing times in workshops are uncertain, a fuzzy programming method is used to handle the processing-time and preparation-time parameters. In the proposed model, several jobs must be processed by machines and workers at successive stages. The main purpose of the model is to determine the correct sequence of operations and to assign each operation to a machine and a worker at each stage so that the total completion time (Cmax) is minimized. Fuzzy programming controls the uncertain parameters, and the sample problems are solved with GAMS. The results for small and medium-sized instances show that as uncertainty increases, processing times, and consequently the completion time of all jobs, increase. Conversely, increasing the number of machines and workers at each stage reduces the completion time of all jobs, owing to the higher efficiency of the machines. The innovations of this paper include scheduling an uninterrupted hybrid flow shop with fuzzy processing and preparation times, and the joint allocation of workers and machines to jobs.
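    As a rough illustration of the Cmax objective only (a minimal sketch, not the paper's hybrid model with worker assignment and setup times), the Python below computes the makespan of a simple permutation flow shop after defuzzifying triangular fuzzy processing times; the defuzzification formula and all data are assumptions made for illustration.

```python
# Sketch: permutation flow shop makespan with triangular fuzzy processing times.
# Not the paper's model; defuzzification rule and data are illustrative assumptions.

def defuzzify(tfn):
    """Graded-mean defuzzification of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + 4 * b + c) / 6

def makespan(sequence, fuzzy_times):
    """fuzzy_times[job][machine] = (a, b, c); returns Cmax for the given job sequence."""
    n_machines = len(fuzzy_times[sequence[0]])
    completion = [0.0] * n_machines          # latest completion time on each machine
    for job in sequence:
        for m in range(n_machines):
            p = defuzzify(fuzzy_times[job][m])
            ready = completion[m] if m == 0 else max(completion[m], completion[m - 1])
            completion[m] = ready + p
    return completion[-1]

# Illustrative data: 3 jobs on 2 machines, times as (optimistic, most likely, pessimistic)
times = {
    0: [(2, 3, 5), (1, 2, 4)],
    1: [(1, 2, 3), (3, 4, 6)],
    2: [(4, 5, 7), (2, 3, 4)],
}
print(makespan([0, 1, 2], times))
```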

    SICStus MT - A Multithreaded Execution Environment for SICStus Prolog

    The development of intelligent software agents and other complex applications that continuously interact with their environments is one of the reasons why explicit concurrency has become a necessity in a modern Prolog system. Such applications need to perform several tasks that may differ greatly in how they are implemented in Prolog, and performing these tasks simultaneously is very tedious without language support. This paper describes the design, implementation and evaluation of a prototype multithreaded execution environment for SICStus Prolog. Threads are dynamically managed using a small and compact set of Prolog primitives, implemented in a portable way that requires almost no support from the underlying operating system.
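    As a loose, hypothetical analogue only (Python rather than Prolog, and not the actual SICStus MT primitives), the sketch below shows how a compact spawn/send interface over per-thread message queues can be enough to manage worker threads dynamically.

```python
# Hypothetical analogue, not the SICStus MT API: each worker runs "goals"
# (callables) taken from its own inbox until it receives a shutdown message.
import queue
import threading

def spawn():
    """Start a worker thread with its own message queue; return (inbox, thread)."""
    inbox = queue.Queue()
    def worker():
        while True:
            goal = inbox.get()
            if goal is None:      # None acts as the shutdown message
                break
            goal()                # run the goal
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return inbox, t

inbox, t = spawn()
inbox.put(lambda: print("solving a goal in a separate thread"))
inbox.put(None)
t.join()
```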

    Four decades of research on the open-shop scheduling problem to minimize the makespan

    One of the basic scheduling problems, the open-shop scheduling problem has a broad range of applications across different sectors. The problem concerns scheduling a set of jobs, each consisting of a set of operations, on a set of different machines. Each machine can process at most one operation at a time, and the order in which a job's operations are processed on the machines is immaterial, i.e., it has no implication for the scheduling outcome. The aim is to determine a schedule, i.e., the completion times of the operations processed on the machines, such that a performance criterion is optimized. While research on the problem dates back to the 1970s, there has been renewed interest in recent years in the computational complexity of variants of the problem and in solution methodologies. Aiming to provide a complete road map for future research on the open-shop scheduling problem, we present an up-to-date and comprehensive review of studies on the problem that focuses on minimizing the makespan, and discuss potential research opportunities.
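    For reference, writing $p_{ij}$ for the processing time of job $j$ on machine $i$, the makespan criterion reviewed here admits the classical open-shop lower bound (maximum machine load versus maximum job length):

$$
C_{\max} \;\ge\; \max\Big\{\, \max_{i} \sum_{j} p_{ij},\ \max_{j} \sum_{i} p_{ij} \,\Big\}.
$$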

    Dynamic scheduling in a multi-product manufacturing system

    To remain competitive in the global marketplace, manufacturing companies need to improve their operational practices. One way to increase competitiveness in manufacturing is to implement a proper scheduling system. This is important to enable job orders to be completed on time, minimize waiting times and maximize the utilization of equipment and machinery. The dynamics of a real manufacturing system are very complex in nature. Schedules developed with deterministic algorithms are unable to deal effectively with uncertainties in demand and capacity, and significant differences can be found between planned schedules and their actual implementation. This study attempted to develop a scheduling system that can react quickly and reliably to accommodate changes in product demand and manufacturing capacity. A case study, a 6-by-6 job shop scheduling problem, was adapted with uncertainty elements added to the data sets. A simulation model was designed and implemented using the ARENA simulation package to generate various job shop scheduling scenarios. Their performance was evaluated using scheduling rules, namely first-in-first-out (FIFO), earliest due date (EDD), and shortest processing time (SPT). An artificial neural network (ANN) model was developed and trained on the scheduling scenarios generated by the ARENA simulation. The experimental results suggest that the ANN scheduling model can provide moderately reliable predictions for limited scenarios when predicting the number of completed jobs, maximum flowtime, average machine utilization, and average queue length. This study has provided a better understanding of the effects of changes in demand and capacity on job shop schedules. Areas for further study include: (i) fine-tuning the proposed ANN scheduling model; (ii) considering a wider variety of job shop environments; and (iii) incorporating an expert system for interpreting results. The theoretical framework proposed in this study can be used as a basis for further investigation.
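    To make the three dispatching rules concrete, here is a minimal Python sketch of how the next job would be picked from a machine queue under FIFO, EDD and SPT; the job attributes and data are illustrative assumptions, and the study itself evaluates the rules inside an ARENA simulation rather than code like this.

```python
# Illustrative dispatching-rule selection; job fields and data are assumptions.
def next_job(queue, rule):
    """queue: list of dicts with 'arrival', 'due_date', 'proc_time' fields."""
    if rule == "FIFO":
        return min(queue, key=lambda j: j["arrival"])      # earliest arrival first
    if rule == "EDD":
        return min(queue, key=lambda j: j["due_date"])      # earliest due date first
    if rule == "SPT":
        return min(queue, key=lambda j: j["proc_time"])     # shortest processing time first
    raise ValueError(f"unknown rule: {rule}")

queue = [
    {"id": "A", "arrival": 0, "due_date": 20, "proc_time": 7},
    {"id": "B", "arrival": 2, "due_date": 12, "proc_time": 3},
    {"id": "C", "arrival": 5, "due_date": 30, "proc_time": 1},
]
for rule in ("FIFO", "EDD", "SPT"):
    print(rule, next_job(queue, rule)["id"])
```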

    A simulation modelling approach to improve the OEE of a bottling line

    This dissertation presents a simulation approach to improving the efficiency, in terms of OEE, of an automated bottling line. A simulation model of the system is created with the AnyLogic software and used to solve the case. The problems addressed are a sequencing problem related to the order in which bottle formats are processed, and a buffer-sizing problem. Both theoretical aspects of OEE, job sequencing and simulation, and practical aspects are presented.
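    For context, OEE is conventionally the product of availability, performance and quality; the short Python sketch below uses this textbook definition with made-up figures, not data from the dissertation.

```python
# Textbook OEE = availability x performance x quality; all figures are illustrative.
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Illustrative shift: 480 min planned, 400 min running, 0.5 min ideal cycle time,
# 700 bottles filled, 680 of them good.
print(round(oee(480, 400, 0.5, 700, 680), 2))   # ~0.71
```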

    Ensuring Service Level Agreements for Composite Services by Means of Request Scheduling

    Building distributed systems according to the Service-Oriented Architecture (SOA) simplifies the integration process, reduces development costs and increases scalability, interoperability and openness. SOA endorses the reuse of existing services and their aggregation into new service layers for future recycling. At the same time, the complexity of large service-oriented systems is reflected negatively in their behaviour in terms of the exhibited Quality of Service. To address this problem, this thesis focuses on using request scheduling to meet Service Level Agreements (SLAs). Special focus is given to composite services specified by means of workflow languages. The proposed solution uses two-level scheduling: global and local. The global policies assign the response-time requirements for component service invocations; the local scheduling policies are responsible for scheduling requests so as to meet these requirements. The proposed scheduling approach can be deployed without altering the code of the scheduled services, does not require a central point of control, and is platform independent. Experiments, conducted using a simulation, were used to study the effectiveness and feasibility of the proposed scheduling schemes with respect to various deployment requirements. The validity of the simulation was confirmed by comparing its results to those obtained in experiments with a real-world service. The proposed approach was shown to work well under different traffic conditions and with different types of SLAs.
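    As one plausible shape for the local level (an assumed earliest-deadline-first policy, not necessarily the policy proposed in the thesis), a request can carry a deadline derived from the response-time budget that a global policy assigned to its invocation, as in the sketch below.

```python
# Assumed local policy: earliest-deadline-first over SLA-derived deadlines.
import heapq
import itertools

class EDFScheduler:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()    # tie-breaker for equal deadlines

    def submit(self, request, arrival_time, budget):
        """budget: response-time allowance assigned by a (hypothetical) global policy."""
        deadline = arrival_time + budget
        heapq.heappush(self._heap, (deadline, next(self._counter), request))

    def next_request(self):
        if not self._heap:
            return None
        _, _, request = heapq.heappop(self._heap)
        return request

s = EDFScheduler()
s.submit("invoke service A", arrival_time=0.0, budget=0.200)
s.submit("invoke service B", arrival_time=0.0, budget=0.050)
print(s.next_request())   # "invoke service B" has the tighter deadline
```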

    A graph based process model measurement framework using scheduling theory

    Software development processes, as a means of ensuring software quality and productivity, have been widely accepted within the software development community; software process modeling, on the other hand, continues to be a subject of interest in the research community. Even in organizations that have achieved higher SEI maturity levels, processes are by and large described in documents and reinforced as guidelines or laws governing software development activities. The lack of industry-wide adoption of software process modeling as part of development activities can be attributed to two major reasons: the lack of forecasting power in (software) process modeling, and the lack of an integration mechanism for the described process to interact seamlessly with daily development activities. This dissertation describes research through which a framework has been established in which processes can be manipulated, measured, and dynamically modified by interacting with project management techniques and activities in an integrated process modeling environment, thus closing the gap between process modeling and software development. In this research, processes are described using directed graphs, similar to the techniques used in CPM. The graphs can thus be manipulated visually while their properties are used to check their validity. The partial ordering and the precedence relationships of the tasks in the graphs are similar to those studied in other research [Delcambre94] [Mills96]. Measurements of the effectiveness of the processes are added in this research; these measurements provide a basis for judgment when manipulating the graphs to produce or modify a process. Software development can be considered as activities related to three sets: a set of tasks (τ), a set of resources (ρ), and a set of constraints (γ). The process P is then a function of these sets interacting with each other: P = {τ, ρ, γ}. The interactions of these sets can be described in terms of different machine models using scheduling theory. While trying to produce an optimal solution satisfying a set of prescribed conditions by analytical methods leads to a practically infeasible formulation, many heuristic algorithms from scheduling theory, combined with manual manipulation of the tasks, can help produce a reasonably good process whose effectiveness is reflected through a set of measurement criteria, in particular the make-span, the float, and the bottlenecks. Through an integrated process modeling environment, these measurements can be obtained in real time, thus providing a feedback loop during process execution. This feedback loop is essential for risk management and control.
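    A minimal CPM-style sketch of the measurements mentioned above (make-span and total float on a directed task graph) is given below; the task names, durations and the fixed-duration assumption are illustrative and are not the dissertation's framework.

```python
# Illustrative CPM computation: make-span and total float for a task DAG.
from collections import defaultdict, deque

def cpm(durations, edges):
    """durations: {task: duration}; edges: (predecessor, successor) pairs."""
    preds, succs, indeg = defaultdict(list), defaultdict(list), defaultdict(int)
    for u, v in edges:
        succs[u].append(v)
        preds[v].append(u)
        indeg[v] += 1
    order, ready = [], deque(t for t in durations if indeg[t] == 0)
    while ready:                                   # topological ordering
        t = ready.popleft()
        order.append(t)
        for s in succs[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    ef = {}                                        # forward pass: earliest finish
    for t in order:
        es = max((ef[p] for p in preds[t]), default=0)
        ef[t] = es + durations[t]
    makespan = max(ef.values())
    lf, total_float = {}, {}                       # backward pass: latest finish, float
    for t in reversed(order):
        lf[t] = min((lf[s] - durations[s] for s in succs[t]), default=makespan)
        total_float[t] = lf[t] - ef[t]             # zero float => on the critical path
    return makespan, total_float

durations = {"design": 3, "code": 5, "test": 2, "docs": 1}
edges = [("design", "code"), ("code", "test"), ("design", "docs")]
print(cpm(durations, edges))    # make-span 10; only "docs" has slack
```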

    Task scheduling for application integration: A strategy for large volumes of data

    Enterprise Application Integration is the research field that provides methodologies, techniques and tools for modelling and implementing integration processes. An integration process performs the orchestration of a set of applications to keep them synchronised or to allow the creation of new features; it can be represented by a workflow composed of tasks and communication channels. Integration platforms are tools for the design and execution of integration processes, in which the runtime system is the component responsible for executing the tasks and allocating the computational resources that perform them. Processing a large volume of data, corresponding to the execution of millions of tasks, can cause overload situations, characterised by the accumulation of tasks in internal queues awaiting computational resources in the runtime system, which results in unacceptable response times for external applications and users. Our research hypothesis is that the runtime systems of integration platforms use simplistic heuristics for scheduling tasks, which do not allow them to maintain acceptable levels of performance in overload situations. In this research work, we developed (i) a representation for integration processes, (ii) a characterisation of their task schedules, (iii) a heuristic to deal with overload situations, (iv) a mathematical model for a performance metric of the execution of integration processes, and (v) a simulation tool for task scheduling heuristics. Our research results indicate that, in overload situations, our heuristic promotes a balanced workload distribution and an increase in the performance of the execution of integration processes.
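    Purely as an illustration of what a load-balancing dispatch heuristic can look like under overload (this is not the dissertation's heuristic or its performance metric), the sketch below assigns queued tasks to the currently least-loaded worker.

```python
# Illustrative least-loaded dispatch; task costs and worker count are assumptions.
import heapq

def dispatch(task_costs, n_workers):
    """Assign each task to the currently least-loaded worker; return assignment and loads."""
    loads = [(0.0, w) for w in range(n_workers)]
    heapq.heapify(loads)
    assignment = {}
    for task, cost in task_costs.items():
        load, w = heapq.heappop(loads)           # worker with the smallest load so far
        assignment[task] = w
        heapq.heappush(loads, (load + cost, w))
    return assignment, sorted(loads)

tasks = {"t1": 4.0, "t2": 1.0, "t3": 3.0, "t4": 2.0, "t5": 2.0}
print(dispatch(tasks, n_workers=2))
```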