
    Realising the open virtual commissioning of modular automation systems

    To address the challenges posed in the automotive industry by the need to rapidly manufacture more product variants, and the resultant need for more adaptable production systems, radical changes are now required in the way in which such systems are developed and implemented. In this context, two enabling approaches for achieving more agile manufacturing, namely modular automation systems and virtual commissioning, are briefly reviewed in this contribution. Ongoing research conducted at Loughborough University, which aims to provide a modular approach to automation systems design coupled with a virtual engineering toolset for the (re)configuration of such manufacturing automation systems, is reported. The problems faced in the virtual commissioning of modular automation systems are outlined. AutomationML, an emerging neutral data format with the potential to address these integration problems, is discussed. The paper proposes and illustrates a collaborative framework in which AutomationML is adopted for the data exchange and data representation of related models to enable efficient open virtual prototype construction and virtual commissioning of modular automation systems. A case study shows how to create an AutomationML-based data model describing a modular automation system.
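
    As a rough illustration of the kind of AutomationML/CAEX data model such a case study describes, the sketch below builds a minimal instance hierarchy in Python; the station and module names are hypothetical and the schema details are simplified.

```python
# Minimal sketch of an AutomationML-style (CAEX) description of a modular
# automation system. Element names follow the CAEX convention of
# InstanceHierarchy/InternalElement/ExternalInterface, but the station and
# module names are hypothetical and the schema is heavily simplified.
import xml.etree.ElementTree as ET

root = ET.Element("CAEXFile", FileName="assembly_cell.aml")
hierarchy = ET.SubElement(root, "InstanceHierarchy", Name="AssemblyCell")

station = ET.SubElement(hierarchy, "InternalElement", Name="Station1")
for module_name in ("Gripper", "Transport", "ClampUnit"):
    module = ET.SubElement(station, "InternalElement", Name=module_name)
    # Each module exposes an interface through which other models can connect.
    ET.SubElement(module, "ExternalInterface", Name=f"{module_name}_IO")

print(ET.tostring(root, encoding="unicode"))
```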

    Plant-Wide Diagnosis: Cause-and-Effect Analysis Using Process Connectivity and Directionality Information

    Production plants used in the modern process industry must produce products that meet stringent environmental, quality and profitability constraints. In such integrated plants, non-linearity and strong process dynamic interactions among process units complicate root-cause diagnosis of plant-wide disturbances, because disturbances may propagate to units at some distance away from the primary source of the upset. Similarly, implemented advanced process control strategies, backup and recovery systems, use of recycle streams and heat integration may hamper detection and diagnostic efforts. It is important to track down the root-cause of a plant-wide disturbance because, once corrective action is taken at the source, secondary propagated effects can be quickly eliminated with minimum effort and reduced downtime, with a resultant positive impact on process efficiency, productivity and profitability. In order to diagnose the root-cause of disturbances that manifest plant-wide, it is crucial to incorporate and utilize knowledge about the overall process topology or interrelated physical structure of the plant, such as is contained in Piping and Instrumentation Diagrams (P&IDs). Traditionally, process control engineers have intuitively referred to the physical structure of the plant by visual inspection and manual tracing of fault propagation paths within the process structures, such as the process drawings on printed P&IDs, in order to make logical conclusions based on the results from data-driven analysis. This manual approach, however, is prone to various sources of error and can quickly become complicated in real processes. The aim of this thesis, therefore, is to establish innovative techniques for the electronic capture and manipulation of process schematic information from large plants such as refineries in order to provide an automated means of diagnosing plant-wide performance problems. This report also describes the design and implementation of a computer application program that integrates: (i) process connectivity and directionality information from intelligent P&IDs, (ii) results from data-driven cause-and-effect analysis of process measurements and (iii) process know-how, to help process control engineers and plant operators gain process insight. This work explored intelligent P&IDs, created with AVEVA® P&ID, a Computer Aided Design (CAD) tool, and exported as an ISO 15926 compliant, platform- and vendor-independent, text-based XML description of the plant. The XML output was processed by a software tool developed in the Microsoft® .NET environment in this research project to computationally generate a connectivity matrix that shows plant items and their connections. The connectivity matrix produced can be exported to an Excel® spreadsheet as a basis for other applications and has served as a precursor to other research work. The final version of the developed software tool links statistical results of cause-and-effect analysis of process data with the connectivity matrix to simplify the cause-and-effect analysis and gain insight from the connectivity information. Process know-how and understanding are incorporated to generate logical conclusions. The thesis presents a case study of an atmospheric crude heating unit as an illustrative example to drive home key concepts and also describes an industrial case study involving refinery operations.
In the industrial case study, in addition to confirming the root-cause candidate, the developed software tool was tasked with determining the physical sequence of the fault propagation path within the plant. This was then compared with the hypothesis about the disturbance propagation sequence generated by a purely data-driven method. The results show a high degree of overlap, which helps to validate the statistical data-driven technique and to easily identify any spurious results from the data-driven multivariable analysis. This significantly increases control engineers' confidence in the data-driven methods used for root-cause diagnosis. The thesis concludes with a discussion of the approach and presents ideas for further development of the methods.
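
    A minimal sketch of the connectivity-matrix idea is shown below; the plant items and connections are hypothetical, and Python stands in for the Microsoft® .NET implementation developed in the thesis.

```python
# Minimal sketch of deriving a directed connectivity matrix from a list of
# connections between plant items, as might be extracted from an ISO 15926
# compliant P&ID export. Item names and connections are hypothetical.
items = ["FeedPump", "Preheater", "Furnace", "Column"]
connections = [("FeedPump", "Preheater"),
               ("Preheater", "Furnace"),
               ("Furnace", "Column")]

index = {name: i for i, name in enumerate(items)}
matrix = [[0] * len(items) for _ in range(len(items))]
for src, dst in connections:
    matrix[index[src]][index[dst]] = 1   # 1 = flow is directed from src to dst

# Row i lists the items immediately downstream of item i; repeated traversal
# of the matrix gives the possible fault propagation paths.
for name, row in zip(items, matrix):
    print(f"{name:<10}", row)
```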

    An approach to open virtual commissioning for component-based automation

    Increasing market demands for highly customised products with shorter time-to-market and at lower prices are forcing manufacturing systems to be built and operated in more efficient ways. In order to overcome some of the limitations of traditional methods of automation system engineering, this thesis focuses on the creation of a new approach to Virtual Commissioning (VC). In current VC approaches, virtual models are driven by pre-programmed PLC control software. These approaches are still time-consuming and heavily reliant on control expertise, as the required programming and debugging activities are mainly performed by control engineers. Another current limitation is that virtual models validated during VC are difficult to reuse due to a lack of tool-independent data models. Therefore, in order to maximise the potential of VC, there is a need for new VC approaches and tools to address these limitations. The main contributions of this research are: (1) to develop a new approach and the related engineering tool functionality for directly deploying PLC control software based on component-based VC models and reusable components; and (2) to build tool-independent common data models for describing component-based virtual automation systems in order to enable data reusability. [Continues.]
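
    As a rough sketch of what a tool-independent, component-based data model might look like, the snippet below pairs a behaviour-model reference with the control signals a component exposes; all names and file paths are hypothetical and are not taken from the thesis.

```python
# Minimal sketch of a tool-independent data model for a component-based
# virtual automation system: each component couples a behaviour-model
# reference with the control interface it exposes. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlSignal:
    name: str
    direction: str   # "input" or "output" from the component's point of view

@dataclass
class Component:
    name: str
    behaviour_model: str                              # e.g. a state-machine file
    signals: List[ControlSignal] = field(default_factory=list)

clamp = Component(
    name="Clamp1",
    behaviour_model="models/clamp_state_machine.xml",
    signals=[ControlSignal("Extend", "input"),
             ControlSignal("Extended", "output")],
)
print(clamp)
```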

    Development of bioplastics from oil plant by-products

    The feasibility of making bioplastics from the oil cake of Crambe abyssinica, Brassica carinata and Brassica napus (rapeseed) was studied in order to produce added value from these by-products. The materials were hot pressed and extruded at 100 °C using glycerol as a plasticizer. Pressing without plasticizer produced brittle materials. Tensile properties, moisture content, and water and oil absorption were determined for the hot-pressed sheets. Ball milling of the oil cake did not result in improved properties of the pressed sheets. As all three plants have semi-drying oils, a siccative was added in different concentrations to crosslink the oil and add to the matrix of the bioplastic. Sheets were pressed after preheating the materials for 0 h, 2 h or 6 h. Sheets were also pressed from granulated extrudate with different concentrations of siccative after preheating for 0 h or 2 h. Tensile properties, moisture content, and water and oil absorption were determined for the pressed sheets. Preheating before pressing had a larger impact than siccative concentration on the properties of crambe and carinata sheets. Siccative concentration had a larger influence on the properties of rapeseed cake sheets. Carinata and rapeseed cake were easier to process than crambe as they flowed better. Carinata sheets and extrudates were more flexible than crambe and rapeseed cake sheets and extrudates. A tray could be pressed from a mixture of crambe and rapeseed cake. All three materials show potential to be used as bioplastics for rigid items, perhaps in packaging applications.

    Management of large data volumes in the process industry

    The idea of the Internet of Things (IoT) is to connect all devices into one network and to enable interoperability between them. Interoperability also benefits the process industry, where control devices and software can interoperate with management software. One part of the industrial IoT is being able to efficiently analyze the data from field devices so that, for example, predictive maintenance can be achieved. Information modelling is needed to enable communication between different software and to make analyzing data easier. This thesis examines the state of the IoT and the benefits of information modelling. The aim is to find the information modelling standard most suitable for the process industry and to determine how standard-conforming information models are created. The literature part of this thesis studies the current state and the future of the IoT. The focus is especially on the possibilities it brings for the oil and gas industry. A broad collection of information modelling standards is introduced. Based on the comparison made, OPC UA was selected in this work as the standard most suitable for the needs of the process industry. In the experimental part, the information modelling process is introduced and three OPC UA modelling tools are examined. Instructions for information modelling with OPC UA were created. An OPC UA standard-conforming information model of a distillation column was created to be used to configure a soft sensor. The model was validated using expert knowledge. The model was also successfully connected to a data source, which in this case was a DCS emulator.
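
    A minimal sketch of exposing such an information model is shown below, using the open-source python-opcua library rather than the modelling tools examined in the thesis; the node names, values and namespace URI are hypothetical.

```python
# Minimal sketch of an OPC UA server exposing a (hypothetical) distillation
# column object with one variable, using the python-opcua library.
from opcua import Server

server = Server()
server.set_endpoint("opc.tcp://0.0.0.0:4840/freeopcua/server/")
idx = server.register_namespace("http://example.org/distillation")  # hypothetical URI

objects = server.get_objects_node()
column = objects.add_object(idx, "DistillationColumn")
top_temp = column.add_variable(idx, "TopTemperature", 78.4)  # placeholder value
top_temp.set_writable()   # allow a data source (e.g. a DCS emulator) to update it

server.start()
try:
    print("Server running; TopTemperature =", top_temp.get_value())
finally:
    server.stop()
```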

    Proceedings of the 2009 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory

    The joint workshop of the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB, Karlsruhe, and the Vision and Fusion Laboratory (Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT)) has been organized annually since 2005 with the aim of reporting on the latest research and development findings of the doctoral students of both institutions. This book provides a collection of 16 technical reports on the research results presented at the 2009 workshop.

    Modeling and Simulation Methodologies for Digital Twin in Industry 4.0

    The concept of Industry 4.0 represents an innovative vision of what the factory of the future will be. The principles of this new paradigm are based on interoperability and data exchange between different pieces of industrial equipment. In this context, Cyber-Physical Systems (CPSs) play one of the main roles in this revolution. The combination of models and the integration of real data coming from the field make it possible to obtain a virtual copy of the real plant, also called a Digital Twin. The entire factory can be seen as a set of CPSs, and the resulting system is also called a Cyber-Physical Production System (CPPS). This CPPS represents the Digital Twin of the factory, with which it is possible to analyze the real factory. The interoperability between the real industrial equipment and the Digital Twin makes it possible to derive predictions concerning the quality of the products. In more detail, these analyses relate to the variability of production quality, prediction of the maintenance cycle, accurate estimation of energy consumption and other extra-functional properties of the system. Several tools [2] allow a production line to be modelled, considering different aspects of the factory (e.g. geometrical properties, information flows etc.). However, these simulators do not natively provide any solution for the design integration of CPSs, making it impossible to obtain precise analyses concerning the real factory. Furthermore, to the best of our knowledge, there is no solution for clearly integrating data coming from real equipment into the CPS models that compose the entire production line. In this context, this thesis aims to define a unified methodology to design and simulate the Digital Twin of a plant, integrating data coming from real equipment. In detail, the presented methodologies focus mainly on: integration of heterogeneous models in production-line simulators; integration of heterogeneous models with ad-hoc simulation strategies; a multi-level simulation approach for CPSs; and integration of real data coming from sensors into models. All the presented contributions produce an environment that allows the plant to be simulated based not only on synthetic data, but also on real data coming from the equipment.
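
    As a rough illustration of driving a simulation with real data rather than purely synthetic inputs, the sketch below lets sparse measured samples override a toy behaviour model; the model, sensor trace and parameters are all hypothetical and only stand in for the methodology described above.

```python
# Minimal sketch of mixing real measurements into a simulation loop: each
# step takes the latest sensor sample when one is available and otherwise
# falls back to the model's own prediction. Everything here is hypothetical.
def model_step(state, setpoint):
    """First-order lag as a stand-in for a CPS behaviour model."""
    return state + 0.1 * (setpoint - state)

sensor_trace = {0: 20.0, 3: 22.5, 6: 21.0}   # sparse real measurements by step
state, setpoint = 20.0, 25.0

for step in range(10):
    if step in sensor_trace:
        state = sensor_trace[step]   # real equipment data overrides the model
    state = model_step(state, setpoint)
    print(f"step {step}: simulated value = {state:.2f}")
```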

    Computer-aided applications in process plant safety

    Process plants that produce chemical products through pre-designed processes are fundamental in the chemical engineering industry. The safety of hazardous processing plants is of paramount importance, as an accident could cause major damage to property and/or injury to people. HAZID is a computer system that helps designers and operators of process plants to identify potential design and operation problems given a process plant design. However, there are issues that need to be addressed before such a system will be accepted for common use. Since HAZID is a model-based system with a library of equipment models, this research project considers how to improve the usability and acceptability of such a system by developing tools to test the developed models, so that users can gain confidence in HAZID's output. The research also investigates the development of computer-aided safety applications and how they can be integrated together to extend HAZID to support different kinds of safety-related reasoning tasks. Three computer-aided tools and one reasoning system have been developed in this project. The first is called the Model Test Bed, which tests the correctness of the models that have been built. The second is called the Safe Isolation Tool, which defines an isolation boundary and identifies potential hazards for isolation work. The third is an Instrument Checker, which lists all the instruments and their connections with process items in a process plant, so that engineers can consider whether each instrument and its loop provide safeguards to the equipment during the hazard identification procedure. The fourth is a cause-effect analysis system that automatically generates cause-effect tables showing process events and the corresponding process responses designed by the control engineer, allowing control engineers to assess the safety of the plant's control design. The thesis provides a full description of the above four tools and how they are integrated into the HAZID system to perform control safety analysis and hazard identification in process plants.
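
    A minimal sketch of how such a cause-effect table could be tabulated is shown below; the process events, tags and responses are hypothetical and do not come from the HAZID system itself.

```python
# Minimal sketch of tabulating cause-and-effect pairs: each process event
# (cause) is mapped to the process responses (effects) associated with it in
# the control design. Tags and actions are hypothetical examples.
cause_effect = {
    "HighLevel_LT101": ["Close FV101", "Raise alarm LAH101"],
    "LowFlow_FT205":  ["Start standby pump P205B"],
}

header = f"{'Cause':<20}| Effect"
print(header)
print("-" * len(header))
for cause, effects in cause_effect.items():
    for effect in effects:
        print(f"{cause:<20}| {effect}")
```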

    An aggregating OPC UA server for general information integration

    OPC UA is an industrial communication protocol that enables the modelling of complex information with semantics and exposing it in the address space of an OPC UA server. With developments such as the Industrial Internet of Things and Industrie 4.0, the amount of data in the industrial environment is increasing, and it is provided by an increasing number of sources. This can lead to information becoming increasingly scattered, which creates difficulties and inefficiencies in getting a view of all the available information. This thesis presents the design and implementation of a software solution that can integrate information from multiple OPC UA source servers that provide information in different ways and from different viewpoints. An existing aggregating OPC UA server was improved, based on elicited requirements, to implement an integration platform that can group together and display heterogeneous information sources in its specially organized address space. The developed software solution consists of three parts: instance aggregation, type aggregation and service mappings, which work together to provide the needed functionality. The implemented prototype solution was evaluated in several test cases and found to meet the goals set for it. The instance aggregation procedure is able to find and group relevant information from different sources, while the type aggregation and service mappings keep the type definitions of the aggregated information intact. The instance aggregation procedure can also be configured by the user with a set of rules that enable compatibility with different use-case needs. In the future, the results of this thesis will be used as a starting point in the incremental development of improved versions of the aggregation feature.
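
    As a rough sketch of rule-driven instance aggregation, the snippet below groups nodes read from several source servers according to configurable matching rules; the server names, node paths and rule format are hypothetical simplifications of the approach described above.

```python
# Minimal sketch of rule-driven instance aggregation: nodes collected from
# several source servers are grouped in the aggregating server's address
# space when they match a configured rule. All names are hypothetical.
source_nodes = [
    {"server": "ServerA", "path": "Plant/Pump1", "type": "PumpType"},
    {"server": "ServerB", "path": "Assets/Pump1/Status", "type": "StatusType"},
    {"server": "ServerB", "path": "Assets/Tank7", "type": "TankType"},
]

rules = [
    {"group": "Pumps", "match": lambda n: "Pump" in n["path"]},
    {"group": "Tanks", "match": lambda n: n["type"] == "TankType"},
]

aggregated = {}
for node in source_nodes:
    for rule in rules:
        if rule["match"](node):
            aggregated.setdefault(rule["group"], []).append(
                f'{node["server"]}:{node["path"]}')

print(aggregated)   # grouped view that the aggregating server would expose
```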