    Multiprocessor task scheduling in multistage hybrid flowshops: a genetic algorithm approach

    This paper considers multiprocessor task scheduling in a multistage hybrid flow-shop environment. The objective is to minimize the makespan, that is, the completion time of all the tasks in the last stage. This problem is of practical interest in the textile and process industries. A genetic algorithm (GA) is developed to solve the problem. The GA is tested against a lower bound from the literature as well as against heuristic rules on a test bed comprising 400 problems with up to 100 jobs, 10 stages, and up to five processors at each stage. For small problems, solutions found by the GA are compared to optimal solutions obtained by total enumeration. For larger problems, optimal solutions are estimated by a statistical prediction technique. Computational results show that the GA is both effective and efficient for the current problem. Test problems are provided on a website at www.benchmark.ibu.edu.tr/mpt-hfsp.
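    A GA for this class of problem typically encodes a job permutation and evaluates each chromosome with a schedule decoder. The Python sketch below shows one plausible list-scheduling decoder for a multistage hybrid flow shop with multiprocessor tasks; the function name, data layout and earliest-available-processors rule are illustrative assumptions, not the paper's exact procedure.

        def decode_makespan(sequence, proc_req, p_time, machines_per_stage):
            """Decode a job permutation into a schedule and return its makespan.

            sequence           -- job order applied at every stage
            proc_req[j][s]     -- processors job j needs simultaneously at stage s
            p_time[j][s]       -- processing time of job j at stage s
            machines_per_stage -- number of identical processors at each stage
            """
            ready = {j: 0.0 for j in sequence}          # time each job leaves the previous stage
            for s, n_proc in enumerate(machines_per_stage):
                free = [0.0] * n_proc                   # availability time of each processor
                for j in sequence:
                    k = proc_req[j][s]
                    free.sort()                         # take the k earliest-available processors
                    start = max(free[k - 1], ready[j])
                    finish = start + p_time[j][s]
                    for i in range(k):                  # occupy the chosen processors
                        free[i] = finish
                    ready[j] = finish
            return max(ready.values())                  # latest completion at the last stage

    A GA would then evolve permutations, using a decoder like this as its fitness function.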

    Spatial-temporal data modelling and processing for personalised decision support

    The purpose of this research is to model dynamic data without losing any of the temporal relationships, and to predict the likelihood of an outcome as far in advance of its actual occurrence as possible. To this end, a novel computational architecture for personalised (individualised) modelling of spatio-temporal data based on spiking neural network methods (PMeSNNr), with a three-dimensional visualisation of relationships between variables, is proposed. In brief, the architecture transfers spatio-temporal data patterns from a multidimensional input stream into internal patterns in the spiking neural network reservoir. These patterns are then analysed to produce a personalised model for either classification or prediction, depending on the specific needs of the situation. The architecture was constructed using MatLab© in several individual modules linked together to form NeuCube (M1). This methodology has been applied to two real-world case studies: first, to data for the prediction of stroke occurrences on an individual basis, and second, to ecological data on aphid pest abundance prediction. The two main objectives when judging outcomes of the modelling are accurate prediction and achieving it at the earliest possible time point. The findings have significant implications for health care management and environmental control. Because the case studies represent vastly different application fields, they reveal more of the potential and usefulness of NeuCube (M1) for modelling data in an integrated manner. This in turn can identify previously unknown (or less understood) interactions, both increasing the level of reliance that can be placed on the model created and enhancing our understanding of the complexities of the world around us without the need for oversimplification.
    Keywords: Personalised modelling; Spiking neural network; Spatial-temporal data modelling; Computational intelligence; Predictive modelling; Stroke risk prediction
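    Reservoir-based architectures of this kind first convert each continuous input stream into spike trains. A minimal sketch of threshold-based (delta) spike encoding, a common first step in such spiking pipelines, is given below; the threshold value, the two-channel positive/negative layout and the toy signals are assumptions, not the exact NeuCube (M1) encoder.

        import numpy as np

        def threshold_spike_encode(series, threshold=0.1):
            """Emit a spike whenever a variable rises or falls by more than `threshold`.

            series -- array of shape (timesteps, variables)
            returns (pos, neg) binary spike arrays of shape (timesteps - 1, variables)
            """
            diffs = np.diff(np.asarray(series, dtype=float), axis=0)
            pos = (diffs > threshold).astype(np.int8)    # upward changes
            neg = (diffs < -threshold).astype(np.int8)   # downward changes
            return pos, neg

        # Example: two synthetic variables sampled over 100 time steps
        t = np.linspace(0, 4 * np.pi, 100)
        data = np.stack([np.sin(t), np.cos(2 * t)], axis=1)
        pos, neg = threshold_spike_encode(data)
        print(pos.sum(), neg.sum())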

    An investigation into minimising total energy consumption and total completion time in a flexible job shop for recycling carbon fiber reinforced polymer

    The increased use of carbon fiber reinforced polymer (CFRP) in industry, coupled with European Union restrictions on landfill disposal, has created a need to develop relevant recycling technologies. Several methods, such as mechanical grinding, thermolysis and solvolysis, have been tried to recover the carbon fibers. Optimisation techniques for reducing the energy consumed by the above processes have also been developed. However, the energy efficiency of recycling CFRP at the workshop level has not been considered before. This paper presents an approach that incorporates energy reduction into the scheduling plans of a CFRP recycling workshop. The research is set in a flexible job shop environment, and a model for the bi-objective problem of minimising total processing energy consumption and makespan is developed. A modified genetic algorithm for solving the raw-material lot-splitting problem is developed. A case study of the lot-sizing problem in the flexible job shop for recycling CFRP is presented to show how scheduling plans affect energy consumption and to demonstrate the feasibility of the model and the developed algorithm.
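    In a genetic algorithm for such a bi-objective problem, each chromosome must be ranked on both energy and makespan. One simple option is a weighted-sum scalarisation, sketched below; the weights, the field names and the linear weighting itself are assumptions (the paper may instead use Pareto ranking or another aggregation).

        def weighted_fitness(operations, w_energy=0.5, w_makespan=0.5):
            """Scalarised fitness combining total processing energy and makespan.

            operations -- iterable of dicts with 'finish' (completion time)
                          and 'energy' (energy consumed by that operation)
            """
            makespan = max(op["finish"] for op in operations)
            total_energy = sum(op["energy"] for op in operations)
            return w_energy * total_energy + w_makespan * makespan

        # Toy schedule with three operations
        ops = [{"finish": 10, "energy": 4.0},
               {"finish": 14, "energy": 2.5},
               {"finish": 9,  "energy": 3.0}]
        print(weighted_fitness(ops))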

    The integrated use of enterprise and system dynamics modelling techniques in support of business decisions

    Enterprise modelling techniques support business process re-engineering by capturing existing processes and, based on perceived outputs, supporting the design of future process models capable of meeting enterprise requirements. System dynamics modelling tools, on the other hand, are used extensively for policy analysis and for modelling aspects of dynamics which impact on businesses. In this paper, the use of enterprise and system dynamics modelling techniques has been integrated to facilitate qualitative and quantitative reasoning about the structures and behaviours of processes and resource systems used by a manufacturing enterprise during the production of composite bearings. The case study testing reported has led to the specification of a new modelling methodology for analysing and managing dynamics and complexities in production systems. This methodology is based on a systematic transformation process which synergises the use of a selection of public-domain enterprise modelling, causal loop and continuous simulation modelling techniques. The success of the modelling process defined relies on the creation of useful CIMOSA process models, which are then converted to causal loops. The causal loop models are then structured and translated into equivalent dynamic simulation models using the proprietary continuous simulation modelling tool iThink.
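    To make the causal-loop-to-simulation step concrete, the toy stock-and-flow model below integrates a single balancing loop (inventory adjusted towards a target) with Euler steps, in the spirit of what a continuous simulation tool such as iThink does numerically; the variable names, loop structure and parameter values are illustrative assumptions, not the bearing-production model of the paper.

        def simulate_inventory(days=30, dt=1.0, demand=100.0, target=500.0, adjust_time=5.0):
            """One stock (inventory) driven by a demand outflow and a corrective inflow."""
            inventory = 300.0
            history = []
            for _ in range(int(days / dt)):
                production = demand + (target - inventory) / adjust_time  # balancing loop
                inventory += (production - demand) * dt                   # Euler integration
                history.append(inventory)
            return history

        print(simulate_inventory()[-1])   # inventory settles towards the 500-unit target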

    Efficient heuristics for the parallel blocking flow shop scheduling problem

    We consider the NP-hard problem of scheduling n jobs in F identical parallel flow shops, each consisting of a series of m machines, subject to a blocking constraint. The criterion is to minimize the makespan, i.e., the maximum completion time of all the jobs in the F flow shops (lines). The Parallel Flow Shop Scheduling Problem (PFSP) is conceptually similar to another problem known in the literature as the Distributed Permutation Flow Shop Scheduling Problem (DPFSP), which models the scheduling process in companies with more than one factory, each with a flow shop configuration. Therefore, the proposed methods can solve the scheduling problem under the blocking constraint in both settings, which, to the best of our knowledge, has not been studied previously. In this paper, we propose a mathematical model along with constructive and improvement heuristics to solve the parallel blocking flow shop problem (PBFSP) and thus minimize the maximum completion time among lines. The proposed constructive procedures use two approaches that are entirely different from those proposed in the literature. These methods are used as initial-solution procedures for an iterated local search (ILS) and an iterated greedy algorithm (IGA), both of which are combined with a variable neighborhood search (VNS). The proposed constructive procedure and the improvement methods take into account the characteristics of the problem. The computational evaluation demonstrates that both of them, especially the IGA, perform considerably better than the algorithms adapted from the DPFSP literature.
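    The blocking constraint changes how a sequence's makespan is computed: a job that finishes on machine i stays there, blocking it, until machine i+1 becomes free. The sketch below implements the standard departure-time recursion for a single line; evaluating a parallel (F-line) solution would then take the maximum of the per-line makespans once jobs are assigned to lines. The data layout is an assumption.

        def blocking_makespan(perm, p):
            """Makespan of job order `perm` in a blocking flow shop; p[j][i] is
            the processing time of job j on machine i (machines 0..m-1)."""
            m = len(p[perm[0]])
            dep_prev = [0.0] * (m + 1)        # departure times of the previous job
            for j in perm:
                dep = [0.0] * (m + 1)
                dep[0] = dep_prev[1]          # first machine frees when the previous job leaves it
                for i in range(1, m):
                    # finish processing, then possibly stay blocked until the next machine is free
                    dep[i] = max(dep[i - 1] + p[j][i - 1], dep_prev[i + 1])
                dep[m] = dep[m - 1] + p[j][m - 1]   # no blocking after the last machine
                dep_prev = dep
            return dep_prev[m]

        # Example: 3 jobs on 3 machines
        p = [[3, 2, 4], [2, 5, 1], [4, 1, 3]]
        print(blocking_makespan([0, 1, 2], p))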

    Combination of ACO and PSO to minimize makespan in ordered flowshop scheduling problems

    Flowshop production scheduling is one of the most common problems encountered in many industries. Effective scheduling is important because it has a significant impact on reducing costs and increasing productivity. However, solving the ordered flowshop scheduling problem with the aim of minimizing makespan is computationally hard (NP-hard). This research contributes an application of a combined ACO and PSO approach to minimize makespan in the ordered flowshop scheduling problem. The performance of the proposed scheduling algorithm is evaluated on a data set of 600 ordered flowshop scheduling problems with various combinations of job and machine sizes. The test results show that the ACO-PSO algorithm provides better scheduling solutions for small-dimension instances, namely 76 of the 600 instances, but is less effective at obtaining good makespans for large-dimension instances. The execution time of the ACO-PSO algorithm increases as the instance dimensions (number of jobs and number of machines) increase.
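    In an ACO-PSO hybrid of this kind, the ACO component typically builds candidate sequences probabilistically from pheromone and heuristic information, and the PSO component then refines the best candidates. The construction step is sketched below; the [position][job] pheromone layout, parameter values and roulette-wheel selection are generic ACO assumptions, not the specific hybrid of this paper.

        import random

        def aco_construct_sequence(n_jobs, pheromone, heuristic, alpha=1.0, beta=2.0):
            """Build one job sequence: at each position, choose the next job with
            probability proportional to pheromone**alpha * heuristic**beta."""
            remaining = list(range(n_jobs))
            sequence = []
            for pos in range(n_jobs):
                weights = [(pheromone[pos][j] ** alpha) * (heuristic[pos][j] ** beta)
                           for j in remaining]
                r, acc = random.uniform(0, sum(weights)), 0.0
                for j, w in zip(remaining, weights):
                    acc += w
                    if acc >= r:                 # roulette-wheel pick
                        sequence.append(j)
                        remaining.remove(j)
                        break
            return sequence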

    A linear programming-based method for job shop scheduling

    We present a decomposition heuristic for a large class of job shop scheduling problems. This heuristic utilizes information from the linear programming formulation of the associated optimal timing problem to solve subproblems, can be used for any objective function whose associated optimal timing problem can be expressed as a linear program (LP), and is particularly effective for objectives that include a component that is a function of individual operation completion times. Using the proposed heuristic framework, we address job shop scheduling problems with a variety of objectives where intermediate holding costs need to be explicitly considered. In computational testing, we demonstrate the performance of our proposed solution approach.
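    To see why the optimal timing problem is an LP once the processing sequence is fixed, note that every routing or machine-order precedence "operation a before operation b" becomes a linear constraint start_b >= start_a + p_a, and completion-time-based objectives are linear in the start times. The toy two-job instance below (using scipy, with made-up data and a minimal total-completion-time objective) illustrates this; it is not the paper's formulation.

        import numpy as np
        from scipy.optimize import linprog

        # Operations 0,1 belong to job A (machine order M0 -> M1);
        # operations 2,3 belong to job B (machine order M1 -> M0).
        p = [3, 2, 4, 1]                          # processing times
        prec = [(0, 1), (2, 3), (0, 3), (2, 1)]   # fixed precedences: routings + machine orders

        # One inequality row per precedence: start_a - start_b <= -p_a
        A_ub = np.zeros((len(prec), 4))
        b_ub = np.zeros(len(prec))
        for row, (a, b) in enumerate(prec):
            A_ub[row, a], A_ub[row, b] = 1.0, -1.0
            b_ub[row] = -p[a]

        # Minimise the sum of each job's last-operation start time
        # (total completion time up to an additive constant).
        c = np.array([0.0, 1.0, 0.0, 1.0])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")
        print(res.x)                              # optimal start times for the fixed sequence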