933 research outputs found

    Survey of dynamic scheduling in manufacturing systems


    A synthesis of logic and bio-inspired techniques in the design of dependable systems

    Much of the development of model-based design and dependability analysis in the design of dependable systems, including software-intensive systems, can be attributed to advances in formal logic and their application to fault forecasting and verification of systems. In parallel, work on bio-inspired technologies has shown potential for the evolutionary design of engineering systems via automated exploration of potentially large design spaces. We have not yet seen the emergence of a design paradigm that effectively combines these two techniques, schematically founded on the two pillars of formal logic and biology, from the early stages of, and throughout, the design lifecycle. Such a design paradigm would apply these techniques synergistically and systematically to enable optimal refinement of new designs that can be driven effectively by dependability requirements. The paper sketches such a model-centric paradigm for the design of dependable systems, presented in the scope of the HiP-HOPS tool and technique, that brings these technologies together to realise their combined potential benefits. The paper begins by identifying current challenges in model-based safety assessment and then overviews the use of meta-heuristics at various stages of the design lifecycle, covering topics that span from allocation of dependability requirements, through dependability analysis, to multi-objective optimisation of system architectures and maintenance schedules.
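    The last stage this abstract mentions, multi-objective optimisation of system architectures, rests on Pareto dominance between candidate designs. The sketch below shows that core comparison in Python under assumed objectives (cost and unavailability, both minimised); it is an illustration of the concept only, not the HiP-HOPS implementation, and the design points are hypothetical.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimised, e.g. cost and
    unavailability of a candidate architecture)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only the non-dominated design points."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# (cost, unavailability) of four hypothetical architectures
designs = [(10, 0.010), (12, 0.004), (15, 0.004), (11, 0.020)]
print(pareto_front(designs))  # -> [(10, 0.01), (12, 0.004)]
```

    A meta-heuristic such as a genetic algorithm would repeatedly generate candidate architectures and retain the Pareto front, rather than collapsing the objectives into a single score.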

    An agent-based simulator for quantifying the cost of uncertainty in production systems

    Product-mix problems, where a range of products that generate different incomes compete for a limited set of production resources, are key to the success of many organisations. In their deterministic forms, these are simple optimisation problems; however, the consideration of stochasticity may turn them into analytically and/or computationally intractable problems. Thus, simulation becomes a powerful approach for providing efficient solutions to real-world product-mix problems. In this paper, we develop a simulator for exploring the cost of uncertainty in these production systems using Petri nets and agent-based techniques. Specifically, we implement a stochastic version of Goldratt's PQ problem that incorporates uncertainty in the volume and mix of customer demand. Through statistical analysis, we derive regression models that link the net profit to the level of variability in the volume and mix. While the net profit decreases as uncertainty grows, we find that the system is able to effectively accommodate a certain level of variability when using a Drum-Buffer-Rope mechanism. In this regard, we reveal that the system is more robust to mix than to volume uncertainty. Later, we analyse the cost-benefit trade-off of uncertainty reduction, which has important implications for practitioners and may help them optimise the profitability of investments. In this regard, we observe that mitigating volume uncertainty should be given higher consideration when the costs of reducing variability are low, while efforts are best concentrated on alleviating mix uncertainty under high costs.

    This article was financially supported by the State Research Agency of the Spanish Ministry of Science and Innovation (MCIN/AEI/10.13039/501100011033), via the project SPUR, grant ref. PID2020-117021GB-I00. In addition, the authors greatly appreciate the valuable and constructive feedback received from the Editorial team of this journal and two anonymous reviewers at the different stages of the review process.
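    The core experiment this abstract describes, estimating net profit under stochastic demand volume, can be illustrated with a small Monte Carlo sketch. All parameters below (prices, mean demands, bottleneck minutes, operating expense) are invented for illustration and are not the actual PQ-problem data; the priority rule is a simple stand-in for a Drum-Buffer-Rope exploitation policy.

```python
import random
import statistics

def simulate_profit(volume_cv, n_weeks=2000, seed=42):
    """Monte Carlo sketch: mean weekly net profit of a two-product mix
    under stochastic demand volume (coefficient of variation volume_cv).
    All figures are illustrative."""
    rng = random.Random(seed)
    price = {"P": 90.0, "Q": 100.0}        # revenue per unit
    base_demand = {"P": 100, "Q": 50}      # mean weekly demand
    capacity = 2400                        # bottleneck minutes per week
    minutes = {"P": 15, "Q": 30}           # bottleneck minutes per unit
    operating_expense = 6000.0
    profits = []
    for _ in range(n_weeks):
        # stochastic demand: Gaussian around the mean, floored at zero
        demand = {p: max(0, rng.gauss(d, volume_cv * d))
                  for p, d in base_demand.items()}
        # serve the product with the best revenue per bottleneck minute first
        left = capacity
        revenue = 0.0
        for p in sorted(demand, key=lambda p: price[p] / minutes[p], reverse=True):
            units = min(demand[p], left / minutes[p])
            revenue += units * price[p]
            left -= units * minutes[p]
        profits.append(revenue - operating_expense)
    return statistics.mean(profits)

# profit typically falls as volume uncertainty grows
print(simulate_profit(0.0), simulate_profit(0.4))
```

    Because the bottleneck is fully loaded at mean demand, upside demand yields no extra revenue while downside demand loses sales, so mean profit decreases as the variability grows, matching the qualitative finding in the abstract.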

    Using statistical-model-checking-based simulation for evaluating the robustness of a production schedule

    Published in Service Orientation in Holonic and Multi-Agent Manufacturing, Borangiu T., Trentesaux D., Thomas A., Cardin O. (eds), Studies in Computational Intelligence, vol 762, pp. 345-357, Springer, Cham. Industry 4.0 implies new scheduling problems linked to the optimal use of flexible resources and to the mass customisation of products. In this context, first research results show that Discrete Event Systems models and tools are a relevant alternative to the classical approaches for modelling and solving scheduling problems. Moreover, the challenges of Industry 4.0 mean taking into account the uncertainties linked to mass customisation (volume and mix of demand) but also to the states of the resources (failures, operation durations, etc.). The goal of this paper is to show how simulation based on statistical model checking can be used to take these uncertainties into account and to evaluate the robustness of a given schedule.
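    The essence of simulation-based statistical model checking here is estimating, over many randomised runs, the probability that a schedule still meets its deadline. A minimal sketch, assuming a single resource and uniformly distributed operation durations (a toy stand-in for a full Discrete Event Systems model, not the authors' tooling):

```python
import random

def robustness(schedule, deadline, n_runs=10000, seed=0):
    """Estimate P(makespan <= deadline) for a sequence of operations
    with uncertain durations. `schedule` is an ordered list of
    (mean, spread) pairs; each run samples a duration uniformly in
    [mean - spread, mean + spread]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_runs):
        makespan = sum(rng.uniform(m - s, m + s) for m, s in schedule)
        if makespan <= deadline:
            hits += 1
    return hits / n_runs

# five operations, nominal makespan 25, deadline 27
sched = [(5.0, 1.5)] * 5
print(robustness(sched, 27.0))
```

    A statistical model checker such as a UPPAAL-SMC-style tool automates exactly this kind of estimate over a formal model, and additionally reports confidence intervals on the probability.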

    Evaluating the Robustness of Resource Allocations Obtained through Performance Modeling with Stochastic Process Algebra

    Recent developments in the field of parallel and distributed computing have led to a proliferation of efforts to solve large and computationally intensive mathematical, science, or engineering problems that consist of several parallelizable parts and several non-parallelizable (sequential) parts. In a parallel and distributed computing environment, the performance goal is to optimize the execution of the parallelizable parts of an application on concurrent processors. This requires efficient application scheduling and resource allocation for mapping applications to a set of suitable parallel processors such that the overall performance goal is achieved. However, such computational environments are often prone to unpredictable variations in application (problem and algorithm) and system characteristics. Therefore, a robustness study is required to guarantee a desired level of performance. Given an initial workload, a mapping of applications to resources is considered robust if that mapping optimizes execution performance and guarantees a desired level of performance in the presence of unpredictable perturbations at runtime. In this research, a stochastic process algebra, Performance Evaluation Process Algebra (PEPA), is used to obtain resource allocations via a numerical analysis of performance models of the parallel execution of applications on parallel computing resources. The PEPA performance model is translated into an underlying Markov chain model from which performance measures are obtained. Further, a robustness analysis of the allocation techniques is performed to find a robust mapping from a set of initial mapping schemes. The numerical analysis of the performance models has confirmed similarity with the simulation results of earlier research available in the existing literature.
    When compared to direct experiments and simulations, numerical models and the corresponding analyses are easier to reproduce, do not incur any setup or installation costs, do not impose any prerequisites for learning a simulation framework, and are not limited by the complexity of the underlying infrastructure or simulation libraries.
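    The PEPA-to-Markov-chain pipeline this abstract describes can be illustrated on the smallest useful case: a birth-death continuous-time Markov chain (an M/M/1/K queue) whose steady-state distribution yields performance measures such as server utilisation. This is a hand-solvable sketch of the general idea, not PEPA itself; the rates and buffer size are illustrative.

```python
def mm1k_steady_state(lam, mu, K):
    """Steady-state distribution of a birth-death CTMC with K+1 states
    (arrival rate lam, service rate mu, buffer size K) -- the kind of
    Markov chain a process-algebra model compiles to. Uses the detailed
    balance relation pi_n = pi_0 * (lam/mu)**n, then normalises."""
    rho = lam / mu
    weights = [rho ** n for n in range(K + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def utilisation(lam, mu, K):
    """A derived performance measure: probability the server is busy."""
    return 1.0 - mm1k_steady_state(lam, mu, K)[0]

pi = mm1k_steady_state(2.0, 3.0, 4)
print(pi, utilisation(2.0, 3.0, 4))
```

    For a general PEPA model the chain has no closed form, so the tool builds the generator matrix and solves the global balance equations numerically, which is the "numerical analysis" the abstract contrasts with simulation.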

    An agile and adaptive holonic architecture for manufacturing control

    Doctoral thesis. Electrical and Computer Engineering. 2004. Faculdade de Engenharia, Universidade do Porto.

    Activity Report: Automatic Control 1997
