
    Effective and efficient estimation of distribution algorithms for permutation and scheduling problems.

    Estimation of Distribution Algorithms (EDAs) are a branch of evolutionary computation that learn a probabilistic model of good solutions. Probabilistic models are used to represent relationships between solution variables, which may give useful, human-understandable insights into real-world problems. Developing an effective probabilistic model has also been shown to significantly reduce the number of function evaluations needed to reach good solutions. This is particularly useful for real-world problems because their representations are often complex, requiring more computation to arrive at good solutions. In particular, many real-world problems are naturally represented as permutations and have expensive evaluation functions. EDAs can, however, be computationally expensive when their models are too complex. There has therefore been much recent work on developing suitable EDAs for permutation representations. EDAs can now produce state-of-the-art performance on some permutation benchmark problems. However, the models are still complex and computationally expensive, making them hard to apply to real-world problems. This study investigates some limitations of EDAs in solving permutation and scheduling problems. The focus of this thesis is on addressing redundancies in the random-key representation, preserving diversity in EDAs, simplifying the complexity attributed to the use of multiple local improvement procedures, and transferring knowledge from solving a benchmark project scheduling problem to a similar real-world problem. In this thesis, we achieve state-of-the-art performance on Permutation Flowshop Scheduling Problem benchmarks while significantly reducing both the computational effort required to build the probabilistic model and the number of function evaluations. We also achieve competitive results on project scheduling benchmarks. Methods adapted for solving a real-world project scheduling problem present significant improvements.
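    To make the random-key idea concrete, the following is a minimal, illustrative sketch of an EDA over a random-key representation in Python. The function names, the simple per-position Gaussian model, and the toy cost function are assumptions chosen for brevity, not the thesis's actual algorithm or benchmarks.

        import numpy as np

        def random_key_eda(evaluate, n_jobs, pop_size=50, elite_frac=0.2, generations=100, seed=0):
            """Minimal random-key EDA: each individual is a vector of real-valued keys,
            and sorting the keys decodes it into a permutation of n_jobs positions."""
            rng = np.random.default_rng(seed)
            keys = rng.uniform(0.0, 1.0, size=(pop_size, n_jobs))
            n_elite = max(2, int(elite_frac * pop_size))
            best_perm, best_cost = None, float("inf")

            for _ in range(generations):
                perms = np.argsort(keys, axis=1)                 # decode keys into permutations
                costs = np.array([evaluate(p) for p in perms])
                order = np.argsort(costs)
                if costs[order[0]] < best_cost:
                    best_cost, best_perm = costs[order[0]], perms[order[0]]
                elite = keys[order[:n_elite]]                    # keep the best key vectors
                mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-3
                keys = rng.normal(mu, sigma, size=(pop_size, n_jobs))   # sample the next population
            return best_perm, best_cost

        # Toy usage with a hypothetical cost: distance of each job from its 'ideal' position.
        perm, cost = random_key_eda(lambda p: int(np.abs(p - np.arange(len(p))).sum()), n_jobs=10)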

    Energy efficiency in discrete-manufacturing systems: insights, trends, and control strategies

    Owing to the depletion of fossil energy sources, rising energy prices, and governmental regulatory restrictions, the manufacturing industry is shifting towards more efficient and sustainable systems. This transformation has promoted the identification of energy-saving opportunities and the development of new technologies and strategies oriented towards improving the energy efficiency of such systems. This paper outlines and discusses most of the research reported during the last decade regarding energy efficiency in manufacturing systems and the current technologies and strategies to improve that efficiency, identifying and highlighting those related to the design of management and control strategies. On this basis, the paper provides a review of strategies for reducing energy consumption and optimizing the use of resources within a plant in the context of discrete manufacturing. The review, which covers the current context of manufacturing systems, the control systems implemented, and their transformation towards Industry 4.0, may be useful in both the academic and industrial dimensions to identify trends and critical points and to suggest further research lines.

    Systems Engineering: Availability and Reliability

    Current trends in Industry 4.0 are largely related to issues of reliability and availability. As a result of these trends and the complexity of engineering systems, research and development in this area needs to focus on new solutions for integrating intelligent machines and systems, with an emphasis on changes in production processes aimed at increasing production efficiency and equipment reliability. The emergence of innovative technologies and of new business models based on innovation, cooperation networks, and the enhancement of endogenous resources is expected to contribute strongly to the development of competitive economies around the world. Innovation and engineering focused on sustainability, reliability, and availability of resources play a key role in this context. The scope of this Special Issue is closely associated with that of the ICIE’2020 conference. Together, the conference and the journal's Special Issue present current innovations and engineering achievements of leading scientists and industrial practitioners in thematic areas related to reliability and risk assessment, innovations in maintenance strategies, production process scheduling, management and maintenance, and systems analysis, simulation, design, and modelling.

    Analyses and optimizations of timing-constrained embedded systems considering resource synchronization and machine learning approaches

    Nowadays, embedded systems have become ubiquitous, powering a vast array of applications from consumer electronics to industrial automation. Concurrently, statistical and machine learning algorithms are being increasingly adopted across application domains such as medical diagnosis, autonomous driving, and environmental analysis, offering sophisticated data analysis and decision-making capabilities. As the demand for intelligent and time-sensitive applications continues to surge, accompanied by growing concerns about data privacy, deploying machine learning models on embedded devices has become an indispensable requirement. However, this integration introduces both significant opportunities for performance enhancement and complex challenges in deployment optimization. On the one hand, deploying machine learning models on embedded systems with limited computational capacity, tight power budgets, and stringent timing requirements necessitates additional adjustments to ensure optimal performance and meet the imposed timing constraints. On the other hand, the inherent capabilities of machine learning, such as self-adaptation at runtime, prove invaluable in addressing challenges encountered in embedded systems, aiding optimization and decision-making processes. This dissertation introduces two primary contributions to the analysis and optimization of timing-constrained embedded systems. First, it addresses the relatively long access times required for the shared resources of machine learning tasks. Second, it considers the limited communication resources and data privacy concerns that arise when deploying machine learning models in distributed embedded systems. Additionally, this work provides a use case that employs a machine learning method to tackle challenges specific to embedded systems. By addressing these key aspects, this dissertation contributes to the analysis and optimization of timing-constrained embedded systems, considering resource synchronization and machine learning models to enable improved performance and efficiency in real-time applications with stringent constraints.
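    For context on why shared-resource access times matter for timing guarantees, the sketch below shows the standard iterative response-time test for a fixed-priority task with a blocking term for shared resources. It is a generic textbook formulation, not the analysis developed in this dissertation, and the task parameters in the usage line are purely hypothetical.

        import math

        def response_time(task, higher_priority, blocking):
            """Fixed-point response-time analysis for a fixed-priority task.
            'task' and each entry of 'higher_priority' carry 'C' (worst-case execution
            time); higher-priority entries also carry 'T' (period); 'blocking' bounds
            the time the task can wait for shared resources held by other tasks."""
            C, D = task["C"], task["D"]
            R = C + blocking
            while True:
                interference = sum(math.ceil(R / hp["T"]) * hp["C"] for hp in higher_priority)
                R_next = C + blocking + interference
                if R_next == R:
                    return R if R <= D else None       # None signals a deadline miss
                if R_next > D:
                    return None
                R = R_next

        # Hypothetical task set: the task is schedulable if the result stays within its deadline D.
        print(response_time({"C": 2, "D": 10}, [{"C": 1, "T": 4}, {"C": 2, "T": 8}], blocking=1))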

    Fault diagnosis and scheduling optimization in an e-maintenance framework.

    The main objective is to demonstrate the potential for improvement that techniques and methodologies related to prescriptive analytics can provide in industrial maintenance applications. The technologies developed can be grouped into three areas:
    - E-maintenance, fundamentally related to the development of collaborative and intelligent platforms that allow the integration of new sensors, communication systems, standards and protocols, concepts, storage and analysis methods, etc., which continually expand our range of possibilities and enable ongoing improvement in the optimization of assets and processes and in the interoperability between systems.
    - Bayesian Networks (BNs), which, together with other information-gathering methodologies used in engineering, make it possible to automate fault diagnosis and prediction (see the sketch below).
    - The optimization of maintenance strategies, through failure simulations and cost-effectiveness analyses that support decision-making when selecting a suitable maintenance strategy for an asset. In addition, optimization algorithms improve maintenance scheduling, reducing the time and cost of carrying out tasks across a fleet of assets.
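    As a minimal illustration of automating diagnosis with a Bayesian network, a two-layer fault/symptom model can be queried by enumeration, as in the Python sketch below. The probabilities, symptom names, and the conditional-independence assumption are hypothetical simplifications; the models developed in the thesis are richer.

        def fault_posterior(prior, likelihoods, observations):
            """Posterior P(fault | observed symptoms) for a two-layer Bayesian network.
            prior: P(fault); likelihoods: {symptom: (P(symptom|fault), P(symptom|no fault))};
            observations: {symptom: True/False}. Symptoms are assumed conditionally
            independent given the fault state (naive structure, for illustration only)."""
            p_fault, p_ok = prior, 1.0 - prior
            for symptom, observed in observations.items():
                p_s_fault, p_s_ok = likelihoods[symptom]
                p_fault *= p_s_fault if observed else (1.0 - p_s_fault)
                p_ok *= p_s_ok if observed else (1.0 - p_s_ok)
            return p_fault / (p_fault + p_ok)

        # Hypothetical numbers: a vibration alarm is active, no overheating is reported.
        print(fault_posterior(
            prior=0.05,
            likelihoods={"vibration": (0.9, 0.1), "overheating": (0.7, 0.05)},
            observations={"vibration": True, "overheating": False},
        ))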

    Compilers that learn to optimise: a probabilistic machine learning approach

    Compiler optimisation is the process of making a compiler produce better code, i.e. code that, for example, runs faster on a target architecture. Although numerous program transformations for optimisation have been proposed in the literature, these transformations are not always beneficial and they can interact in very complex ways. Traditional approaches adopted by compiler writers fix the order of the transformations and decide when and how these transformations should be applied to a program by using hard-coded heuristics. However, these heuristics require a lot of time and effort to construct and may sacrifice performance on programs they have not been tuned for. This thesis proposes a probabilistic machine learning solution to the compiler optimisation problem that automatically determines "good" optimisation strategies for programs. This approach uses predictive modelling in order to search the space of compiler transformations. Unlike most previous work that learns when/how to apply a single transformation in isolation or a fixed-order set of transformations, the techniques proposed in this thesis are capable of tackling the general problem of predicting "good" sequences of compiler transformations. This is achieved by exploiting transference across programs with two different techniques: Predictive Search Distributions (PSD) and multi-task Gaussian process prediction (multi-task GP). While the former directly addresses the problem of predicting "good" transformation sequences, the latter learns regression models (or proxies) of the performance of the programs in order to rapidly scan the space of transformation sequences. Both methods, PSD and multi-task GP, are formulated as general machine learning techniques. In particular, the PSD method is proposed in order to speed up search in combinatorial optimisation problems by learning a distribution over good solutions on a set of problem instances and using that distribution to search the optimisation space of a problem that has not been seen before. Likewise, multi-task GP is proposed as a general method for multi-task learning that directly models the correlation between several machine learning tasks, exploiting the shared information across the tasks. Additionally, this thesis presents an extension to the well-known analysis of variance (ANOVA) methodology in order to deal with sequence data. This extension is used to address the problem of optimisation space characterisation by identifying and quantifying the main effects of program transformations and their interactions. Finally, the machine learning methods proposed are successfully applied to a data set that has been generated as a result of the application of source-to-source transformations to 12 C programs from the UTDSP benchmark suite.
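    To illustrate the general flavour of learning a distribution over good transformation sequences (not the thesis's actual PSD or multi-task GP formulation), the sketch below fits a simple per-position categorical model from sequences that worked well on training programs and samples candidate sequences for an unseen program. The per-position independence assumption and all names are simplifications introduced for illustration.

        import numpy as np

        def learn_sequence_distribution(good_sequences, n_transforms, smoothing=1.0):
            """Per-position categorical model over transformation indices, estimated
            from sequences that performed well on previously seen programs."""
            seq_len = len(good_sequences[0])
            counts = np.full((seq_len, n_transforms), smoothing)
            for seq in good_sequences:
                for pos, t in enumerate(seq):
                    counts[pos, t] += 1.0
            return counts / counts.sum(axis=1, keepdims=True)

        def sample_candidate_sequences(probs, n_samples, seed=0):
            """Sample candidate transformation sequences to evaluate on an unseen program."""
            rng = np.random.default_rng(seed)
            seq_len, n_transforms = probs.shape
            return [[int(rng.choice(n_transforms, p=probs[pos])) for pos in range(seq_len)]
                    for _ in range(n_samples)]

        # Hypothetical data: 4 transformations, length-3 sequences that performed well elsewhere.
        probs = learn_sequence_distribution([[0, 2, 1], [0, 3, 1], [0, 2, 2]], n_transforms=4)
        candidates = sample_candidate_sequences(probs, n_samples=5)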