
    On the use of the Core Manufacturing Simulation Data (CMSD) standard: experiences and recommendations

    The Core Manufacturing Simulation Data (CMSD) information model is defined by the SISO standards SISO-STD-008-01-2012 and SISO-STD-008-2010. The main objective of CMSD is to facilitate interoperability between simulation systems and other information systems in the manufacturing domain. While CMSD is mainly intended as a standardized data exchange format, its capabilities go beyond simple data exchange. Frequently, CMSD-based system descriptions are used for automatic simulation model generation. In this paper, we report on practical experiences using the CMSD standard for this purpose as well as for simulation model initialization and simulation output data collection. Based on our experiences, we suggest potential enhancements for a future revision of the standard.
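
    To make the exchange idea concrete, the sketch below serializes a resource and a job into a deliberately simplified, CMSD-style XML document; the element names are illustrative assumptions and do not reproduce the normative SISO-STD-008 schema.

```python
# Sketch of a simplified, CMSD-style exchange document (element names are
# illustrative only, not the normative SISO-STD-008 schema).
import xml.etree.ElementTree as ET

def build_exchange_document(resources, jobs):
    """Serialize resource and job records into a simple CMSD-like XML tree."""
    root = ET.Element("ManufacturingData")
    res_section = ET.SubElement(root, "Resources")
    for r in resources:
        el = ET.SubElement(res_section, "Resource", identifier=r["id"])
        ET.SubElement(el, "ResourceType").text = r["type"]
    job_section = ET.SubElement(root, "Jobs")
    for j in jobs:
        el = ET.SubElement(job_section, "Job", identifier=j["id"])
        ET.SubElement(el, "PlannedEffort", unit="minute").text = str(j["minutes"])
        ET.SubElement(el, "ResourceRequired").text = j["resource"]
    return ET.ElementTree(root)

doc = build_exchange_document(
    resources=[{"id": "M1", "type": "machine"}],
    jobs=[{"id": "J1", "minutes": 12, "resource": "M1"}],
)
doc.write("exchange.xml", encoding="utf-8", xml_declaration=True)
```

    A model generator, an initialization routine, or an output-data collector can each read or append to such a document, which is the multi-purpose usage the paper reports on.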

    Framework for the usage of data from real-time indoor localization systems to derive inputs for manufacturing simulation

    Discrete-event simulation is becoming increasingly important in the planning and operation of complex manufacturing systems. A major problem with today's approach to manufacturing simulation studies is the collection and processing of data from heterogeneous sources, because the data is often of poor quality and does not contain all the information a simulation needs. This work introduces a framework that uses a real-time indoor localization system (RTILS) as a central data harmonizer designed to feed production data into a manufacturing simulation from a single source of truth. Based on different data quality dimensions, it is shown how this contributes to better overall data quality in manufacturing simulation. Furthermore, a detailed overview of which simulation inputs can be derived from the RTILS data is given.
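
    As an illustration of the kind of input derivation the framework targets, the sketch below computes per-zone dwell times from hypothetical RTILS position events; the event layout and zone names are assumptions made for this example, not the framework's actual interface.

```python
# Sketch: derive raw dwell-time samples per zone from (hypothetical) RTILS
# events of the form (part_id, timestamp, zone). Such samples could feed
# processing- and transport-time distributions of a manufacturing simulation.
from collections import defaultdict
from statistics import mean

events = [
    ("part-1", 0.0,  "milling"),
    ("part-1", 14.5, "transport"),
    ("part-1", 17.0, "assembly"),
    ("part-1", 41.0, "outbound"),
]

def dwell_times(events):
    """Group position events per part and measure the time spent in each zone."""
    by_part = defaultdict(list)
    for part, t, zone in events:
        by_part[part].append((t, zone))
    samples = defaultdict(list)
    for trace in by_part.values():
        trace.sort()
        for (t0, zone), (t1, _) in zip(trace, trace[1:]):
            samples[zone].append(t1 - t0)
    return samples

for zone, s in dwell_times(events).items():
    print(zone, "mean dwell [min]:", round(mean(s), 1))
```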

    Orchestrated Platform for Cyber-Physical Systems

    One of the main driving forces in the era of cyber-physical systems (CPSs) is the introduction of massive sensor networks (and, nowadays, various Internet of Things solutions) into manufacturing processes, connected cars, precision agriculture, and so on. Large amounts of sensor data therefore have to be ingested at the server side in order to generate the "digital twin" model or virtual factory of the existing physical processes and make it usable for, among other purposes, predictive simulation and scheduling. In this paper, we focus on our ultimate goal: a novel software-container-based approach with cloud-agnostic orchestration facilities that enables system operators in industry to create and manage scalable virtual IT platforms on demand for two typical major pillars of CPS: (1) a server-side (i.e., back-end) framework for sensor networks and (2) a configurable simulation tool for predicting the behavior of manufacturing systems. The paper discusses the scalability of the applied discrete-event simulation tool and of the layered back-end framework, ranging from a simple virtual-machine-level setup to a sophisticated multilevel autoscaling use case scenario. The presented achievements and evaluations leverage, among other things, the synergy of the existing EasySim simulator, our new CQueue software container manager, the continuously developed Octopus cloud orchestrator tool, and the latest version of the evolving MiCADO framework for integrating such tools into a unified platform.
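
    The multilevel autoscaling behavior can be pictured with a minimal threshold-based policy for simulation worker containers; the sketch below is a generic illustration and does not reflect the actual MiCADO, Octopus, or CQueue interfaces.

```python
# Generic sketch of a threshold-based scaling decision for worker containers;
# thresholds and limits are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ScalingPolicy:
    min_workers: int = 1
    max_workers: int = 16
    scale_up_load: float = 0.8    # average utilization that triggers scale-out
    scale_down_load: float = 0.3  # utilization below which a worker is removed

    def target_workers(self, current: int, avg_load: float) -> int:
        """Return the desired number of workers for the observed load."""
        if avg_load > self.scale_up_load and current < self.max_workers:
            return current + 1
        if avg_load < self.scale_down_load and current > self.min_workers:
            return current - 1
        return current

policy = ScalingPolicy()
print(policy.target_workers(current=4, avg_load=0.92))  # -> 5
```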

    Packet Dispatching Schemes for Three-Stage Buffered Clos-Network Switches


    Design and Development of an Architecture for Demonstrating the Interplay of Emerging SISO Standards

    Simulation Interoperability Standards Organization (SISO) SIW conference paper. The Simulation Interoperability Standards Organization (SISO) focuses on facilitating simulation interoperability across government and non-government applications worldwide. A number of standards are emerging that will individually have great impact on the development and operation of simulation systems, as well as on interoperation across simulation systems and command and control systems. Taken together, however, the emerging standards represent a set of capabilities and technologies that can revolutionize the simulation industry, radically improving the way we develop and deliver interoperable systems.

    Automatische Generierung adaptiver Modelle zur Simulation von Produktionssystemen

    The simulation of production and logistics processes is used today in a variety of industries. Simulation supports the analysis, design, and optimization of production and logistics processes and their resource requirements, and can be applied during planning, commissioning, and actual operation. The undisputed great potential of material flow simulation stands against the high cost and effort of implementing simulation models and conducting simulation studies. Due to the poor integration and standardization of simulation, and the increasing demands of companies with respect to accuracy, flexibility, adaptability, speed, cost, and reusability, the expense of using simulation keeps growing. One approach that has been attempted repeatedly for several years as a contribution to mitigating this cost, particularly in small and medium-sized enterprises (SMEs), is the automatic generation of simulation models. Automatic model generation refers to different approaches that permit simulation models, or parts of models, to be produced by means of algorithms. So far, no approach has been published that yields good results across a broad spectrum of application areas and industries. In this work, a comprehensive framework for the integration and automation of simulation was designed and validated. The framework consists of organizational, methodical, and prototypical technical components. In this context, it is argued that broadly applicable automatic model generation requires the use of standards for data exchange and conceptual modeling. Specifically, the Core Manufacturing Simulation Data (CMSD) standard is proposed as a suitable standard, and a reference implementation of the standard provides the basis for the entire work. Support for all simulation phases, i.e., not only model building but also the evaluation of alternatives, initialization, evaluation of results, etc., is ensured throughout the entire framework. Furthermore, model generation methods and procedures for representing dynamic behavior in simulation models were classified, and selected methods were implemented and presented.
    Also available in print: Automatische Generierung adaptiver Modelle zur Simulation von Produktionssystemen / Sören Bergmann. Ilmenau: Univ.-Verl. Ilmenau, 2014. XXXVII, 221 pp. ISBN 978-3-86360-084-6. Price: 31.20.
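
    The generation step argued for above can be pictured by mapping exchange-format records onto simulation objects. The sketch below uses plain dictionaries standing in for parsed CMSD data and SimPy purely as an illustrative simulation target; it does not reproduce the reference implementation described in the thesis.

```python
# Sketch: generate a running simulation model from data records (dicts stand in
# for parsed CMSD content). SimPy is an illustrative choice, not the thesis' tool.
import simpy

def generate_model(env, resources, jobs):
    """Create SimPy resources and job processes from data records."""
    stations = {r["id"]: simpy.Resource(env, capacity=1) for r in resources}

    def job_process(job):
        with stations[job["resource"]].request() as req:
            yield req                          # wait until the station is free
            yield env.timeout(job["minutes"])  # planned processing effort
            print(f"{job['id']} finished at t={env.now}")

    for job in jobs:
        env.process(job_process(job))

env = simpy.Environment()
generate_model(
    env,
    resources=[{"id": "M1"}],
    jobs=[{"id": "J1", "resource": "M1", "minutes": 12},
          {"id": "J2", "resource": "M1", "minutes": 8}],
)
env.run()
```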

    Automatische Generierung eines Simulationsmodells zur Unterstützung der Umplanung einer Baustellenmontage

    Fixed-Layout Assembly (FLA) systems are used to assemble large and bulky products. These products are often unique and require customer-specific engineering and customization. FLA systems are frequently prone to disturbances and plan deviations during operation: delayed deliveries, incompatibility or failure of equipment, and unplanned absences of operators. Planners therefore need a simple and efficient tool to quickly forecast the impact of changes on the whole assembly system. A solution concept was presented by the authors in a previous publication (Billiet and Stark, 2022): a method to automatically generate a simulation model using data on products, orders, and shifts from the ERP system. This paper describes the implementation of that solution concept by applying it to an FLA for the production of Large Motors and Converters (LMC) in Berlin.
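
    A minimal sketch of the intended what-if use: re-run the generated model with a plan deviation (here a delayed material delivery) and compare completion times. The single order, the durations, and the delay value are hypothetical and not taken from the LMC case.

```python
# Sketch: forecast the impact of a plan deviation by comparing a baseline run
# with a disturbed run. Values are illustrative only.
import simpy

def run_assembly(delivery_delay=0.0, assembly_hours=30.0):
    env = simpy.Environment()
    finish = {}

    def order(env):
        yield env.timeout(delivery_delay)  # wait for the (possibly late) delivery
        yield env.timeout(assembly_hours)  # fixed-layout assembly work content
        finish["t"] = env.now

    env.process(order(env))
    env.run()
    return finish["t"]

baseline = run_assembly()
delayed = run_assembly(delivery_delay=8.0)
print(f"Plan deviation shifts completion by {delayed - baseline:.1f} h")
```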

    ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community and designed for storing and analyzing petabytes of data efficiently. Any instance of a C++ class can be stored in a ROOT file in a machine-independent compressed binary format. In ROOT, the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques; these containers can span a large number of files on local disks, the web, or different shared file systems. To analyze the data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, ROOT offers packages for complex data modeling and fitting, as well as multivariate classification based on machine learning techniques. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats such as PostScript and PDF or in bitmap formats such as JPG or GIF. Results can also be stored as ROOT macros that allow the graphics to be fully recreated and reworked. Users typically create their analysis macros step by step with the interactive C++ interpreter CINT while running over small data samples. Once development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks (e.g., data mining in HEP) by using PROOF, which takes care of optimally distributing the work over the available resources in a transparent way.
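
    A minimal sketch of the workflow described above, assuming the PyROOT Python bindings are available ("import ROOT"); the file, tree, and histogram names are arbitrary.

```python
# Sketch: write a small TTree and histogram to a ROOT file and fit the
# histogram with the built-in Gaussian model (PyROOT bindings assumed).
from array import array
import ROOT

f = ROOT.TFile("example.root", "RECREATE")   # machine-independent binary file
tree = ROOT.TTree("events", "toy event data")
x = array("d", [0.0])
tree.Branch("x", x, "x/D")                   # one double-precision branch

rng = ROOT.TRandom3(42)
hist = ROOT.TH1D("hx", "x distribution", 100, -5.0, 5.0)
for _ in range(10000):
    x[0] = rng.Gaus(0.0, 1.0)
    tree.Fill()
    hist.Fill(x[0])

hist.Fit("gaus", "Q")                        # quiet built-in Gaussian fit
f.Write()                                    # persist tree and histogram
f.Close()
```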

    A fracture-controlled path-following technique for phase-field modeling of brittle fracture

    Get PDF
    In the phase-field description of brittle fracture, the fracture-surface area can be expressed as a functional of the phase field (or damage field). In this work we study the applicability of this explicit expression as a (non-linear) path-following constraint to robustly track the equilibrium path in quasi-static fracture propagation simulations, which can include snap-back phenomena. Moreover, we derive a fracture-controlled staggered solution procedure by systematically decoupling the path-following-controlled elasticity and phase-field problems. The fracture-controlled monolithic and staggered solution procedures are studied for a series of numerical test cases. The numerical results demonstrate the robustness of the new approach and provide insight into the advantages and disadvantages of the monolithic and staggered procedures.
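
    For orientation, a common AT2-type form of the regularized fracture-surface functional and the resulting fracture-controlled constraint are sketched below; the paper's exact normalization and notation may differ.

```latex
% AT2-type crack-surface functional and a fracture-controlled path-following
% constraint (sketch; normalization and notation may differ from the paper).
\begin{align}
  A_\ell(d) &= \int_\Omega \left( \frac{d^2}{2\ell}
               + \frac{\ell}{2}\,\lvert \nabla d \rvert^2 \right) \mathrm{d}\Omega , \\
  g(d_{n+1}) &= A_\ell(d_{n+1}) - A_\ell(d_n) - \Delta\tau = 0 .
\end{align}
```

    Here d is the phase (damage) field, \ell the regularization length, and \Delta\tau the prescribed increment of regularized fracture-surface area per step; the load factor enters the system as the additional unknown that this constraint determines.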