
    Advancements in Real-Time Simulation of Power and Energy Systems

    Modern power and energy systems are characterized by the wide integration of distributed generation, storage, and electric vehicles, the adoption of ICT solutions, the interconnection of different energy carriers, and consumer engagement, posing new challenges and creating new opportunities. Advanced testing and validation methods are needed to efficiently validate power equipment and controls in this complex environment and to support the transition to a cleaner and more sustainable energy system. Real-time hardware-in-the-loop (HIL) simulation has proven to be an effective method for validating and de-risking power system equipment under highly realistic, flexible, and repeatable conditions. Controller hardware-in-the-loop (CHIL) and power hardware-in-the-loop (PHIL) are the two main HIL simulation methods used in industry and academia; both enhance system-level testing by exploiting the flexibility of digital simulation to test actual controllers and power equipment. This book addresses recent advances in real-time HIL simulation across several domains, including new and promising areas, together with technique improvements that promote its wider use. It comprises 14 papers dealing with advances in HIL testing of power electronic converters, power system protection, modeling for real-time digital simulation, co-simulation, geographically distributed HIL, and multiphysics HIL, among other topics.
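    The CHIL idea described above can be illustrated with a minimal fixed-step loop in which a simulated plant exchanges signals with a controller at every time step. This is an illustrative sketch only: the first-order plant model, the proportional gain, and the step size are all hypothetical, and in a real CHIL setup the controller would be physical hardware connected through the simulator's I/O.

```python
# Minimal CHIL-style fixed-step loop sketch. In a real test bench the
# "controller" side would be an actual hardware controller exchanging
# signals with the real-time simulator; here both sides are simulated.

def simulate_chil(steps=100, dt=0.001, setpoint=1.0):
    """Run a fixed-step loop: the plant is advanced one step per tick
    using the control signal computed from the latest measurement."""
    state = 0.0   # plant state (e.g. a filtered voltage), hypothetical
    kp = 5.0      # hypothetical proportional controller gain
    tau = 0.05    # hypothetical plant time constant (seconds)
    for _ in range(steps):
        control = kp * (setpoint - state)      # controller reads measurement
        state += dt * (control - state) / tau  # plant advances one step
    return state

final = simulate_chil()  # settles near kp/(1+kp) = 5/6 for this plant
```

In a real-time setting the loop body must complete within `dt` of wall-clock time each iteration; that hard deadline is what distinguishes HIL from offline simulation.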

    Ontology-Driven Guidelines for Architecting Digital Twins in Factory Automation Applications

    The rapid emergence of technologies in various fields has permitted the creation of simulation tools. These tools are designed to replicate physical systems in order to provide faster, cheaper, and more detailed analysis of the physical system. In this regard, the concept of digital twins was introduced to describe such simulation tools. According to Michael Grieves, who coined the term, a digital twin is defined as a physical system, a digital replica of that physical system, and the information flow between the two. This definition is simple and generic, yet holistic; its breadth creates a challenge for developers who aim to build such applications. Therefore, this paper presents a paradigm for architecting digital twins for manufacturing processes. The approach is inspired by the definitions of the ISA-95 standard and the onion model of computer applications to create multi-layer and multi-level concepts. Furthermore, to satisfy the different features required by industry, the approach adopts a multi-perspective concept that separates digital twin views by functionality. This paradigm aims to provide a modular, scalable, reusable, interoperable, and composable approach to developing digital twins. An implementation of the approach is then introduced using an ontology-based system and the IEC 61499 standard, and demonstrated on a discrete manufacturing assembly line.
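    Grieves' three-part definition summarized above (physical system, digital replica, information flow between them) can be sketched in code. The class and attribute names below are hypothetical and not taken from the paper; the sketch only illustrates the structure of the definition.

```python
# Illustrative sketch of the three-part digital-twin definition:
# a physical system, its digital replica, and the information flow
# that keeps the two synchronized. All names are hypothetical.

class PhysicalAsset:
    """Stands in for a real machine reporting sensor readings."""
    def __init__(self):
        self.temperature = 20.0
    def read_sensors(self):
        return {"temperature": self.temperature}

class DigitalReplica:
    """Mirrors the asset's last known state for analysis and queries."""
    def __init__(self):
        self.state = {}
    def update(self, readings):
        self.state.update(readings)  # information flow: physical -> digital

class DigitalTwin:
    """Binds the two halves and carries the information flow."""
    def __init__(self, asset, replica):
        self.asset, self.replica = asset, replica
    def synchronize(self):
        self.replica.update(self.asset.read_sensors())

asset = PhysicalAsset()
twin = DigitalTwin(asset, DigitalReplica())
asset.temperature = 72.5  # the physical state changes...
twin.synchronize()        # ...and the replica catches up
```

The paper's multi-layer, multi-perspective paradigm would sit on top of a skeleton like this, splitting the replica into functional views rather than one flat state dictionary.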

    Supercomputing Frontiers

    This open access book constitutes the refereed proceedings of the 6th Asian Supercomputing Conference, SCFA 2020, which was planned to be held in February 2020 but whose physical meeting was cancelled due to the COVID-19 pandemic. The 8 full papers presented in this book were carefully reviewed and selected from 22 submissions. They cover a range of topics including file systems, memory hierarchy, HPC cloud platforms, container image configuration workflows, large-scale applications, and scheduling.

    Enabling the Development and Implementation of Digital Twins : Proceedings of the 20th International Conference on Construction Applications of Virtual Reality

    Welcome to the 20th International Conference on Construction Applications of Virtual Reality (CONVR 2020). This year we are meeting online due to the current coronavirus pandemic. The overarching theme for CONVR 2020 is "Enabling the development and implementation of Digital Twins". CONVR is one of the world-leading conferences in the areas of virtual reality, augmented reality, and building information modelling. Each year, more than 100 participants from around the globe meet to discuss and exchange the latest developments and applications of virtual technologies in the architecture, engineering, construction, and operation (AECO) industry. The conference is also known for its unique blend of participants from both academia and industry. This year, with all the difficulties of replicating a real face-to-face meeting, we are carefully planning the conference to ensure that all participants have a rewarding experience. We have a group of leading keynote speakers from industry and academia who are covering current hot topics and are enthusiastic and keen to share their knowledge with you. CONVR participants are very loyal to the conference, and many have attended most of the last eighteen editions. This year we also welcome numerous first-timers, and we aim to help them make the most of the conference by introducing them to other participants.

    Analysis of Remote Diagnosis Architecture for a PLC-Based Automated Assembly System

    To troubleshoot equipment installed in geographically distant locations, equipment manufacturers and system integrators are increasingly resorting to remote diagnosis in order to reduce equipment downtime, thereby achieving savings in cost and time on both the customer and manufacturer sides. Remote diagnosis involves the use of communication technologies to perform fault diagnosis of a system located at a site distant from the troubleshooter. To achieve remote diagnosis, several frameworks have been proposed, incorporating advancements such as automated fault diagnosis, collaborative diagnosis, and mobile communication techniques. Standards exist for the capabilities representative of different levels of remote equipment diagnosis. Several studies have analyzed the ability of human-machine interfaces to assist troubleshooters in local fault diagnosis. However, the ability of a remote diagnosis system architecture to assist the troubleshooter, and the effects of failure types and other factors on remote troubleshooting performance, are rarely addressed. This thesis attempts to understand the factors that affect remote troubleshooting performance: remote diagnosis architecture, nature of failure, skill level of the local operator, and level of expertise of the remote troubleshooter. For this purpose, three hierarchical levels of remote diagnosis architectures for diagnosing failures in a PLC-based automated assembly system were built based on existing standards. Common failures in automated assembly systems were identified and duplicated. Experiments were performed in which expert and novice troubleshooters used these architectures to diagnose different types of failures while working with novice and engineer operators.
The results suggest that when remote expert troubleshooters diagnosed failures related to measured or monitored system variables, remote troubleshooting performance improved as the level of the remote diagnosis architecture increased. In contrast, when novice troubleshooters diagnosed these failures, no significant difference in remote troubleshooting performance was observed among the three architectures, and the novice troubleshooters had trouble managing the increased information available. Failures unrelated to monitored system parameters resulted in significantly reduced remote troubleshooting performance with all three architectures, in comparison to failures related to monitored system parameters, for both expert and novice troubleshooters. The experts exhibited better information-gathering capabilities, spending more time per information source and making fewer transitions between information sources while diagnosing failures. The increase in the capabilities of the architectures reduced operator interaction, to a greater extent with experts. The difference in overall remote troubleshooting performance between engineer and novice operators was not found to be significant.

    Coastal Biophysical Inventory Database for the Point Reyes National Seashore

    The Coastal Biophysical Inventory Database is the repository of the data gathered from a rapid assessment of approximately 161 km of the intertidal habitat managed by the Point Reyes National Seashore and Golden Gate National Recreation Area. The Coastal Biophysical Inventory Database is modeled after the “Alaska Coastal Resources Inventory and Mapping Database” and CoastWalker program of Glacier Bay National Park and Preserve. The protocol and database were adapted for this effort to represent the features of the Point Reyes National Seashore and Golden Gate National Recreation Area located along the northern central coast of California. The database is an integration of spatial data and observation data entered and browsed through an interface designed to complement the methods of the observation protocol. The Coastal Biophysical Inventory (CBI) and Mapping Protocol is the methodology to collect and store repeatable observations of the intertidal zone to create a baseline of information useful for resource management and potentially assist damage assessment in the event of an oil spill. The inventory contributes to the knowledge needed for the conservation of coastal resources managed in the public’s trust. The Coastal Biophysical Inventory Database is a Microsoft Access 2003 format relational database with a customized data entry interface programmed in Microsoft Access Visual Basic for Applications. The interface facilitates the entry, storage and relation of substrate, biology, photographs, and other field observations. Data can be browsed or queried using query tools common to the Microsoft Access software or using custom spatial query tools built into the interface with ESRI MapObjects LT 2.0 ActiveX COM objects. The Coastal Biophysical Inventory’s GIS data set is useful for collecting, analyzing and reporting field observations about the intertidal zone. 
The GIS data set is linked to the observation data set through a unique number, the Segment ID, using the relate tools found in ArcGIS (9.2-10). The Segment ID is a non-repeating number that references a section of coastline delineated by the type and form of the substrate observed. The Segment ID allows connection to the biological observations and other observation records such as photos or the original data sheets. Through ArcGIS connections to the observation database using the Segment ID, summaries of biodiversity or habitat can be made by location. The Coastal Biophysical Inventory has completed its initial goal of assessing the coastline of two National Parks. The data set collected provides a snapshot of information, and the database allows future observations to be recorded. It provides coastal resource managers broad insight into, and orientation to, the intertidal resources managed by the National Park Service.
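    The Segment ID linkage described above can be sketched with a small relational example: spatial segments joined to biological observations through a shared, non-repeating key. The table layout, field names, and species records below are illustrative only, not the actual CBI schema.

```python
# Sketch of the Segment ID relate: each coastline segment row links
# to its observation rows via the shared segment_id key, so biodiversity
# can be summarized per location. Schema and data are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE segments (segment_id INTEGER PRIMARY KEY,
                           substrate  TEXT);
    CREATE TABLE observations (obs_id     INTEGER PRIMARY KEY,
                               segment_id INTEGER,
                               species    TEXT);
    INSERT INTO segments VALUES (101, 'rocky bench'), (102, 'sand beach');
    INSERT INTO observations VALUES
        (1, 101, 'Mytilus californianus'),
        (2, 101, 'Pisaster ochraceus'),
        (3, 102, 'Emerita analoga');
""")

# Summarize observation counts by segment, analogous to relating the
# GIS layer to the observation database on Segment ID in ArcGIS.
rows = con.execute("""
    SELECT s.segment_id, s.substrate, COUNT(o.obs_id) AS n_obs
    FROM segments s LEFT JOIN observations o USING (segment_id)
    GROUP BY s.segment_id
    ORDER BY s.segment_id
""").fetchall()
```

The actual database is Microsoft Access with an ESRI MapObjects interface rather than SQLite, but the join-on-key pattern is the same.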

    Advances in Grid Computing

    This book approaches grid computing from the perspective of the latest achievements in the field, providing insight into current research trends and advances and presenting a broad range of innovative research papers. The topics covered include resource and data management, grid architectures and development, and grid-enabled applications. New ideas employing heuristic methods from swarm intelligence, genetic algorithms, and quantum encryption are considered in order to address two main aspects of grid computing: resource management and data management. The book also addresses aspects of grid computing concerning architecture and development, and includes a diverse range of applications, including a possible human grid computing system, simulation of fusion reactions, ubiquitous healthcare service provisioning, and complex water systems.