28 research outputs found

    Estimating performance indexes of a baggage handling system using metamodels

    Full text link
    In this study, we develop deterministic metamodels to quickly and accurately predict the behavior of a technically complex system. The underlying system is a stochastic, discrete-event simulation model of a large baggage handling system. The highly detailed simulation model is used to conduct experiments and log data, which are then used to train artificial neural network metamodels. The results show that the developed metamodels predict different performance measures related to the travel time of bags within this system well. In contrast to the simulation models, which are computationally expensive and require considerable expertise to develop, run, and maintain, the artificial neural network metamodels can serve as real-time decision-aiding tools that are fast, accurate, simple to use, and reliable.
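    The metamodeling loop the abstract describes (simulate, log data, train a network, then query the network instead of the simulator) can be sketched as follows. The input features, the response surface, and the network size are illustrative assumptions, with synthetic data standing in for the simulation logs.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical design inputs: bag arrival rate, conveyor speed, open lanes.
X = rng.uniform([50.0, 1.0, 2.0], [400.0, 3.0, 10.0], size=(500, 3))
# Synthetic response standing in for logged mean travel times (minutes).
y = 5 + 0.02 * X[:, 0] / X[:, 1] + 30 / X[:, 2] + rng.normal(0, 0.3, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The metamodel: a small feed-forward network behind input standardization.
metamodel = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0))
metamodel.fit(X_tr, y_tr)

# Once trained, a prediction costs microseconds, versus a full simulation run.
print("held-out R^2:", round(metamodel.score(X_te, y_te), 3))
```

    The held-out score is what makes the "precise" claim checkable: the network is only a useful simulator replacement where it generalizes to unsimulated designs.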

    Constructing prediction intervals for neural network metamodels of complex systems

    Full text link

    Predicting amount of saleable products using neural network metamodels of casthouses

    Full text link
    This study aims to develop abstract metamodels for approximating highly nonlinear relationships within a metal casting plant. Metal casting product quality depends nonlinearly on many controllable and uncontrollable factors. To improve the productivity of the system, it is vital for operation planners to predict the amount of high-quality product in advance. Neural network metamodels are developed and applied in this study to predict the amount of saleable products. The metamodels are trained using the Levenberg-Marquardt and Bayesian learning methods. Statistical measures are calculated for the developed metamodels over a grid of neural network structures. The results indicate that the Bayesian neural network metamodels outperform the Levenberg-Marquardt-based metamodels in terms of both prediction accuracy and robustness to metamodel complexity. In contrast, the latter metamodels are computationally less expensive and generate results more quickly.
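    The Levenberg-Marquardt idea can be sketched with SciPy's `least_squares(method="lm")` fitting the weights of a toy one-hidden-layer network; a crude stand-in for Bayesian regularization is shown by folding an L2 weight penalty into the residual vector (the full Bayesian framework also adapts the penalty weight, which is fixed here). The network size, data, and penalty weight are all illustrative assumptions, not the study's setup.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1] + rng.normal(0, 0.05, 200)

H = 6  # hidden units in the toy metamodel

def predict(w, X):
    """One-hidden-layer tanh network with flat weight vector w."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[4 * H]
    return np.tanh(X @ W1 + b1) @ W2 + b2

w0 = rng.normal(0, 0.5, size=4 * H + 1)

# Levenberg-Marquardt: minimize the plain residual vector.
fit_lm = least_squares(lambda w: predict(w, X) - y, w0, method="lm")

# Crude Bayesian-style regularization: append an L2 weight penalty
# (sqrt(alpha) * w) to the residuals; alpha is fixed here, not adapted.
alpha = 1e-3
fit_bayes = least_squares(
    lambda w: np.concatenate([predict(w, X) - y, np.sqrt(alpha) * w]),
    w0, method="lm")

mse_lm = np.mean((predict(fit_lm.x, X) - y) ** 2)
mse_bayes = np.mean((predict(fit_bayes.x, X) - y) ** 2)
print(f"training MSE  LM: {mse_lm:.4f}  regularized: {mse_bayes:.4f}")
```

    The penalty trades a slightly larger training error for smaller weights, which is the mechanism behind the robustness-to-complexity result the abstract reports.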

    Developing optimal neural network metamodels based on prediction intervals

    Full text link

    An Interval Based Approach To Model Input Uncertainty In Discrete-event Simulation

    Get PDF
    The objective of this research is to increase the robustness of discrete-event simulation (DES) when input uncertainties associated with models and parameters are present. Input uncertainties in simulation have different sources, including lack of data, conflicting information and beliefs, lack of introspection, measurement errors, and lack of information about dependency. A reliable solution is obtained from a simulation mechanism that accounts for these uncertainty components. An interval-based simulation (IBS) mechanism based on imprecise probabilities is proposed, in which the statistical distribution parameters of the simulation are intervals instead of precise real numbers. This approach incorporates both variability and uncertainty in systems. In this research, a standard procedure for estimating the interval parameters of probability distributions is developed based on a measurement of simulation robustness. New mechanisms based on the inverse transform are proposed to generate interval random variates. A generic approach to specify the replication length required to achieve a desired level of robustness is derived. Furthermore, three simulation clock advancement approaches for interval-based simulation are investigated. A library of Java-based IBS toolkits that simulates queueing systems is developed to demonstrate the proposed reliable simulation. New interval statistics for interval data analysis are proposed to support decision making. To assess the performance of the IBS, we developed an interval-based metamodel for automated material handling systems, which generates interval performance measures that are more reliable and computationally more efficient than traditional DES results.
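    The inverse-transform mechanism for interval random variates can be illustrated for an exponential distribution whose rate is only known to lie in an interval. The dissertation's toolkit is Java-based, so this Python fragment is only a sketch of the idea, and the rate interval is an illustrative assumption.

```python
import math
import random

def interval_exponential(rate_lo, rate_hi, u=None):
    """One interval variate from Exp(rate), where the rate is only known to
    lie in [rate_lo, rate_hi], via the inverse transform x = -ln(1-u)/rate.
    The same uniform draw u is pushed through both parameter bounds, so the
    result is an interval [x_lo, x_hi] bracketing the unknown true variate."""
    if u is None:
        u = random.random()
    # The inverse CDF is decreasing in the rate, so the high rate
    # produces the lower endpoint of the interval.
    return (-math.log(1 - u) / rate_hi, -math.log(1 - u) / rate_lo)

random.seed(42)
samples = [interval_exponential(0.8, 1.2) for _ in range(5)]
for lo, hi in samples:
    print(f"[{lo:.3f}, {hi:.3f}]")
```

    Feeding such interval variates through event times is what turns ordinary performance measures into the interval performance measures the abstract describes.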

    Cycle Time Estimation in a Semiconductor Wafer Fab: A concatenated Machine Learning Approach

    Get PDF
    The ongoing digitalization of all areas of life and industry is increasing the demand for microchips. More and more sectors, including the automotive industry, are finding that their supply chains now depend on semiconductor manufacturers, which recently led to the semiconductor shortage. This situation increases the need for accurate predictions of semiconductor delivery times. Since semiconductor production is extremely difficult, however, such estimates are not easy to produce. Common approaches are either too simplistic (e.g., mean or rolling-mean estimators) or require too much time for detailed scenario analyses (e.g., discrete-event simulations). This thesis therefore proposes a new methodology intended to be more accurate than mean or rolling-mean estimators but faster than simulations. The methodology uses a concatenation of machine learning models capable of predicting waiting times in a semiconductor fab on the basis of a set of features. This thesis develops and analyzes the methodology. It includes a detailed analysis of the features required for each model, an analysis of the exact production process each product must pass through, referred to as its "route", and strategies for handling uncertainty when feature values are not known in advance. In addition, the proposed methodology is evaluated with real operational data from a wafer fab of Robert Bosch GmbH. It is shown that the methodology is superior to the mean and rolling-mean estimators, particularly in situations where the cycle time of a lot deviates significantly from the mean. It is also shown that the execution time of the method is significantly shorter than that of a detailed simulation.
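    The concatenation idea (one model per route step, each step's predicted waiting time feeding the next step's model, and the cycle-time estimate summed along the route) can be sketched as below. The features, the synthetic waiting-time law, and the choice of plain linear regressors are illustrative assumptions, not the thesis's models or the Bosch fab data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n, steps = 300, 3  # n lots, a 3-step toy route

# Hypothetical per-step features: tool utilization and queue length on arrival.
util = rng.uniform(0.5, 0.95, size=(n, steps))
queue = rng.integers(0, 20, size=(n, steps))
# Synthetic ground-truth waits: grow with utilization and queue length.
waits = 1 / (1 - util) + 0.5 * queue + rng.normal(0, 0.2, (n, steps))

models, carried, total_pred = [], np.zeros(n), np.zeros(n)
for s in range(steps):
    # Each step's model sees its own features plus the previous prediction.
    X = np.column_stack([util[:, s], queue[:, s], carried])
    m = LinearRegression().fit(X, waits[:, s])
    carried = m.predict(X)  # this step's predicted wait feeds the next model
    total_pred += carried
    models.append(m)

true_total = waits.sum(axis=1)
mae_chain = np.mean(np.abs(total_pred - true_total))
mae_mean = np.mean(np.abs(true_total.mean() - true_total))
print(f"chained-model MAE: {mae_chain:.2f}  mean-estimator MAE: {mae_mean:.2f}")
```

    The comparison against the constant mean estimator mirrors the thesis's baseline: the chain helps most for lots whose cycle time deviates strongly from the average.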

    SURROGATE SEARCH: A SIMULATION OPTIMIZATION METHODOLOGY FOR LARGE-SCALE SYSTEMS

    Get PDF
    For settings in which system performance cannot be evaluated by analytical methods, simulation models are widely utilized, especially for complex systems. To optimize these models, simulation optimization techniques have been developed; these attempt to identify the system designs and parameters that result in (near-)optimal system performance. Although simulation can provide more realistic results, the computational time for simulator execution, and consequently for simulation optimization, may be very long. Hence, the major challenge in determining improved system designs by combining simulation and search methodologies is to develop more efficient simulation optimization heuristics or algorithms. This dissertation develops a new approach, Surrogate Search, to determine near-optimal system designs for large-scale simulation problems that contain combinatorial decision variables. First, surrogate objective functions are identified by analyzing simulation results to observe system behavior. Multiple linear regression is utilized to examine the simulation results and construct the surrogate objective functions. The identified surrogate objective functions, which can be evaluated quickly, are then utilized as simulator replacements in the search methodologies. For multiple problems containing different settings of the same simulation model, only one surrogate objective function needs to be identified. The development of surrogate objective functions benefits the optimization process by reducing the number of simulation iterations. Surrogate Search approaches are developed for two combinatorial problems, operator assignment and task sequencing, using a large-scale sortation system simulation model. The experimental results demonstrate that Surrogate Search can be applied to such large-scale simulation problems and outperforms a recognized simulation optimization methodology, Scatter Search (SS).
This dissertation provides a systematic methodology for performing simulation optimization on complex operations research problems and contributes to the simulation optimization field.
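    The Surrogate Search workflow (sample a few expensive simulation runs, fit a multiple-linear-regression surrogate, search the whole combinatorial design space on the cheap surrogate, then confirm the incumbent with the simulator) can be sketched with a toy stand-in for the sortation-system simulator. The design space, features, and response below are invented for illustration.

```python
import itertools
import random
import numpy as np
from sklearn.linear_model import LinearRegression

def simulate(x1, x2):
    """Expensive black box: e.g., average throughput for design (x1, x2)."""
    return 10 * x1 - 0.8 * x1 * x1 + 6 * x2 - 0.5 * x2 * x2 + random.gauss(0, 0.5)

random.seed(3)
design_space = list(itertools.product(range(11), range(11)))  # 121 designs

# 1) A small sample of real (expensive) simulation runs.
sampled = random.sample(design_space, 15)

def features(x1, x2):
    # Quadratic terms let a *linear* regression capture curvature.
    return [x1, x2, x1 * x1, x2 * x2]

X = np.array([features(*d) for d in sampled])
y = np.array([simulate(*d) for d in sampled])

# 2) Multiple linear regression as the surrogate objective function.
surrogate = LinearRegression().fit(X, y)

# 3) Exhaustive search on the surrogate (cheap), one confirming real run.
best = max(design_space, key=lambda d: surrogate.predict([features(*d)])[0])
print("surrogate-optimal design:", best, "confirmed value:", simulate(*best))
```

    The saving is in step 3: the search touches the simulator once, instead of once per candidate design.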

    Semantic data integration for supply chain management: with a specific focus on applications in the semiconductor industry

    Get PDF
    Supply Chain Management (SCM) is essential to monitor, control, and enhance the performance of SCs. Increasing globalization and the diversity of Supply Chains (SCs) lead to complex SC structures, limited visibility among SC partners, and challenging collaboration caused by dispersed data silos. Digitalization is driving and transforming the SCs of fundamental sectors such as the semiconductor industry. This is further accelerated by the essential role that semiconductor products play in electronics, IoT, and security systems. Semiconductor SCM is unique in that its SC operations exhibit special features, e.g., long production lead times and short product lives. Hence, systematic SCM is required to establish information exchange, overcome inefficiencies resulting from incompatibility, and adapt to industry-specific challenges. The Semantic Web is designed for linking data and establishing information exchange. Semantic models provide high-level descriptions of the domain that enable interoperability. Semantic data integration consolidates heterogeneous data into meaningful and valuable information. The main goal of this thesis is to investigate Semantic Web Technologies (SWT) for SCM, with a specific focus on applications in the semiconductor industry. As part of SCM, end-to-end SC modeling ensures visibility of SC partners and flows. Existing models are limited in the way they represent operational SC relationships beyond one-to-one structures. The scarcity of empirical data from multiple SC partners hinders both the analysis of the impact supply network partners have on each other and the benchmarking of overall SC performance. In our work, we investigate (i) how semantic models can be used to standardize and benchmark SCs. Moreover, in a volatile and unpredictable environment, SC experts require methodical and efficient approaches to integrate various data sources for informed decision-making about SC behavior.
Thus, this work addresses (ii) how semantic data integration can help make SCs more efficient and resilient. Moreover, to secure a good position in a competitive market, semiconductor SCs strive to implement operational strategies to control demand variation, i.e., the bullwhip effect, while maintaining sustainable relationships with customers. We examine (iii) how semantic technologies can be applied to specifically support semiconductor SCs. In this thesis, we provide semantic models that integrate, in a standardized way, SC processes, structure, and flows, ensuring an elaborate understanding of holistic SCs while including granular operational details. We demonstrate that these models enable the instantiation of a synthetic SC for benchmarking. We contribute semantic data integration applications that enable interoperability and make SCs more efficient and resilient. Moreover, we leverage ontologies and knowledge graphs (KGs) to implement customer-oriented bullwhip-taming strategies. We create semantic-based approaches intertwined with Artificial Intelligence (AI) algorithms to address semiconductor industry specifics and ensure operational excellence. The results show that relying on semantic technologies contributes to rigorous and systematic SCM. We expect that better standardization, simulation, benchmarking, and analysis, as elaborated in the contributions, will help master more complex SC scenarios. SC stakeholders can increasingly understand the domain and are thus better equipped with effective control strategies to restrain disruption accelerators such as the bullwhip effect. In essence, the proposed Semantic Web Technology-based strategies unlock the potential to increase the efficiency, resilience, and operational excellence of supply networks in general and the semiconductor SC in particular.
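    A library-free sketch of the integration idea underneath this: data from separate silos are merged as subject-predicate-object triples, and a transitive query then exposes end-to-end supply links that no single silo holds on its own. The silo contents and the `suppliesTo` relation are invented for illustration; the thesis itself works with full Semantic Web technologies (ontologies, KGs, SPARQL) rather than this toy triple store.

```python
# Two hypothetical data silos, each holding part of the supply network
# as subject-predicate-object triples.
erp_silo = [("WaferFab", "suppliesTo", "TestSite")]
logistics_silo = [("TestSite", "suppliesTo", "OEM")]

# Semantic integration, at its simplest: the union of the triples.
graph = set(erp_silo) | set(logistics_silo)

def reachable(graph, start, predicate):
    """All nodes reachable from `start` via repeated `predicate` edges
    (the transitive closure a SPARQL property path would compute)."""
    found, frontier = set(), {start}
    while frontier:
        nxt = {o for (s, p, o) in graph if p == predicate and s in frontier}
        frontier = nxt - found
        found |= nxt
    return found

# End-to-end visibility: the fab indirectly supplies the OEM,
# a fact present in neither silo alone.
print(sorted(reachable(graph, "WaferFab", "suppliesTo")))
```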

    Design and Management of Manufacturing Systems

    Get PDF
    Although the design and management of manufacturing systems have been explored in the literature for many years, they remain topical problems in current scientific research. Changing market trends, globalization, constant pressure to reduce production costs, and technical and technological progress make it necessary to search for new manufacturing methods and ways of organizing them, and to modify manufacturing system design paradigms. This book presents current research in different areas connected with the design and management of manufacturing systems, covering such subject areas as: methods supporting the design of manufacturing systems; methods of improving maintenance processes in companies; the design and improvement of manufacturing processes; the control of production processes in modern manufacturing systems; production methods and techniques used in modern manufacturing systems; and environmental aspects of production and their impact on the design and management of manufacturing systems. The wide range of research findings reported in this book confirms that the design of manufacturing systems is a complex problem, and that achieving the goals set for modern manufacturing systems requires interdisciplinary knowledge and the simultaneous design of the product, process, and system, as well as knowledge of modern manufacturing and organizational methods and techniques.

    Positioning of a wireless relay node for useful cooperative communication

    Get PDF
    Given the exorbitant amount of data transmitted and the increasing demand for data connectivity in the 21st century, it has become imperative to search for proactive and sustainable solutions to effectively alleviate the overwhelming burden imposed on wireless networks. In this study, a Decode-and-Forward cooperative relay channel is analyzed, with Maximal Ratio Combining employed at the destination node as the method of diversity combining. The system framework is based on a three-node relay channel with a source node, a relay node, and a destination node. A model of the wireless communication channel is formulated so that simulation can be carried out to investigate the performance impact of relaying for a node placed at the edge of the cell. First, an AWGN channel is used before the effect of Rayleigh fading is taken into consideration. Results show that the performance of cooperative relaying is always superior or similar to that of conventional relaying. Additionally, relaying is beneficial when the relay is placed closer to the receiver.
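    The Rayleigh-fading comparison can be sketched as a Monte-Carlo experiment: BPSK over a fading link, direct transmission versus decode-and-forward relaying with maximal ratio combining of the two copies at the destination. The geometry and power allocation are simplified, all links are assumed to have equal average SNR, and the relay is assumed to decode perfectly, which is an idealization of DF.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
snr = 10 ** (10 / 10)  # 10 dB average SNR, assumed equal on every link

bits = rng.integers(0, 2, n)
s = 2 * bits - 1.0  # BPSK symbols +/-1

def faded_copy():
    """One Rayleigh-faded, noisy copy of the transmitted symbols."""
    h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    w = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2 * snr)
    return h, h * s + w

# Direct transmission: coherent detection on the source->destination copy.
h_sd, r_sd = faded_copy()
ber_direct = np.mean((np.real(np.conj(h_sd) * r_sd) > 0).astype(int) != bits)

# DF relaying (relay assumed to decode perfectly): the destination
# maximal-ratio-combines the source and relay copies before deciding.
h_rd, r_rd = faded_copy()
combined = np.conj(h_sd) * r_sd + np.conj(h_rd) * r_rd
ber_mrc = np.mean((np.real(combined) > 0).astype(int) != bits)

print(f"BER direct: {ber_direct:.4f}   BER with MRC relaying: {ber_mrc:.4f}")
```

    The second independently faded branch gives MRC a diversity advantage, which is why the cooperative scheme's error rate falls well below the direct link's in fading even at the same per-link SNR.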