9 research outputs found

    What Is a Digital Twin? Experimental Design for a Data-Centric Machine Learning Perspective in Health

    Get PDF
The idea of a digital twin has recently gained widespread attention. While it has so far been used predominantly for problems in engineering and manufacturing, it is believed that a digital twin also holds great promise for applications in medicine and health. However, a problem that severely hampers progress in these fields is the lack of a solid definition of the concept behind a digital twin that would be directly amenable to such big data-driven fields requiring statistical data analysis. In this paper, we address this problem. We will see that the term 'digital twin', as used in the literature, is like a Matryoshka doll. For this reason, we unstack the concept via a data-centric machine learning perspective, allowing us to define its main components. As a consequence, we suggest using the term Digital Twin System instead of digital twin, because this highlights its complex interconnected substructure. In addition, we address ethical concerns that result from treatment suggestions for patients based on simulated data and a possible lack of explainability of the underlying models.

    Ubiquitous Control of a CNC Machine: Proof of Concept for Industrial IoT Applications

    Get PDF
In this paper, an integrated system to control and manage a state-of-the-art industrial computer numerical control (CNC) machine (Studer S33) using a commercially available tablet (Samsung Galaxy Tablet S2) is presented as a proof of concept (PoC) for the ubiquitous control of industrial machines. As a PoC, the proposed system provides useful insights to support the further development of full-fledged systems for Industrial Internet of Things (IIoT) applications. The proposed system allows for the quasi-decentralisation of the control architecture of conventional programmable logic controller (PLC)-based industrial control systems (ICSs) through data and information exchange over the transmission control protocol and internet protocol (TCP/IP) suite using multiple agents. Based on the TCP/IP suite, a network device (Samsung Galaxy Tablet S2) and a process field net (PROFINET) device (Siemens Simatic S7-1200) are interfaced using a single-board computer (Raspberry Pi 4). An override system, mainly comprising emergency stop and acknowledge buttons, is also configured using the single-board computer. The input signals from the override system are transmitted to the PROFINET device (i.e., the industrial control unit (ICU)) over TCP/IP. A fully functional working prototype is realised as a PoC for an integrated system designated for the wireless and ubiquitous control of the CNC machine. The working prototype as an entity mainly comprises a mobile (handheld) touch-sensitive human-machine interface (HMI), a shielded single-board computer, and an override system, all fitted into a compact case with physical dimensions of 300 mm by 180 mm by 175 mm. To avert potential cyber attacks or threats to a reasonable extent and to guarantee the security of the PoC, a multi-factor authentication (MFA) scheme, comprising an administrative password and an IP-address check, is implemented to control access to the web-based ubiquitous HMI provided by the PoC.
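The access-control idea described above, a password factor combined with an IP-address factor gating the web-based HMI, can be sketched as follows. This is a minimal illustration, not the PoC's actual implementation; the placeholder password, the hash choice, and the allowed address are all hypothetical:

```python
import hashlib

# Hypothetical admin credential, stored as a hash rather than in plain text.
ADMIN_HASH = hashlib.sha256(b"example-admin-password").hexdigest()
# Hypothetical allowlist: the tablet's address on the shop-floor network.
ALLOWED_IPS = {"192.168.0.10"}

def authorize(password: str, client_ip: str) -> bool:
    """Grant access to the web HMI only if BOTH factors check out:
    the administrative password and the client's IP address."""
    pw_ok = hashlib.sha256(password.encode()).hexdigest() == ADMIN_HASH
    return pw_ok and client_ip in ALLOWED_IPS
```

A request from an unknown address is rejected even with the correct password, and vice versa, which is the essence of the two-factor gate described in the abstract.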

    Enhancing Digital Twins of Semi-Automatic Production Lines by Digitizing Operator Skills

    Get PDF
In recent years, Industry 4.0 has provided many tools to replicate, monitor, and control physical systems. The purpose is to connect production assets to build cyber-physical systems that ensure the safety, quality, and efficiency of production processes. In particular, the concept of digital twins has been introduced to create virtual representations of physical systems, where both elements are connected to exchange information. This general definition encompasses a series of major challenges for the developers of those functionalities, among them how to introduce the human perspective into the virtual replica. Therefore, this paper presents an approach for incorporating human factors into digital twins. The approach introduces a methodology to offer suggestions about employee rotations based on their previous performance during a shift. This method is then integrated into a digital twin to perform human performance assessments and manage workers' jobs. The presented approach mainly comprises a human skills modelling engine and a human scheduling engine. Finally, to demonstrate the approach, a simulated serial single-product manufacturing assembly line is introduced.

    Complementary Use of Ground-Based Proximal Sensing and Airborne/Spaceborne Remote Sensing Techniques in Precision Agriculture: A Systematic Review

    Get PDF
As the global population continues to increase, projected to reach an estimated 9.7 billion people by 2050, there will be a growing demand for food production and agricultural resources. The transition toward Agriculture 4.0 is expected to enhance agricultural productivity through the integration of advanced technologies, increase resource efficiency, ensure long-term food security by applying more sustainable farming practices, and enhance resilience and climate change adaptation. By integrating technologies such as ground IoT sensing and remote sensing, via both satellites and Unmanned Aerial Vehicles (UAVs), and exploiting data fusion and data analytics, farming can make the transition to a more efficient, productive, and sustainable paradigm. The present work performs a systematic literature review (SLR), identifying the challenges associated with UAV, satellite, and ground sensing in their application to agriculture, comparing them, and discussing their complementary use to facilitate Precision Agriculture (PA) and the transition to Agriculture 4.0.

    Design of a Digital Twin interface for a UR3 robotic arm that performs marking for surface cutting, using Experior

    Get PDF
    This work presents the design of a Digital Twin interface for a UR3 robotic arm that performs marking for surface cutting in the context of flexible production. The interface was achieved through Human-Computer-Machine Interaction (HCMI). The process was initially developed in PolyScope with the acquisition of rectangular pieces as customised production orders. A cut-and-packing algorithm, in this case a guillotine algorithm programmed in Python, then organised and placed the pieces in the best arrangement based on the size of the sheet, which is the available resource; the transmission and reception of data between PolyScope and Python was based on the TCP/IP protocol. Next, the Digital Twin was designed within the Experior software, using the UR3 robotic arm model provided by the same company, and the environment in which it operates was modelled. Data from the real environment were collected by an adaptation on the end effector of the UR3 robotic arm, called the Wrist Camera, and sent to Experior using the MODBUS protocol, so that the virtual environment reflects an exact replica of the real environment. Finally, the entire system was implemented virtually using URSim as an exact copy of the UR3 robotic arm distributed by Universal Robots. In the evaluation of the system, an average delay of 3.64 ms was obtained, considering that the system was tested within a virtual environment, and a movement fidelity of 99.758% was achieved between the UR3 robotic arm and its Digital Twin.
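The guillotine cut-and-packing step mentioned above can be sketched in Python. This is a minimal first-fit guillotine placement under assumed conventions (axis-aligned pieces, no rotation), not the thesis's actual algorithm:

```python
def guillotine_pack(sheet_w, sheet_h, pieces):
    """Place (width, height) pieces on a sheet using guillotine cuts.

    Free space is tracked as a list of (x, y, w, h) rectangles. Each piece
    goes into the first free rectangle it fits in; the remainder is split
    by one horizontal and one vertical (guillotine) cut. Returns a list of
    (x, y, w, h) placements, with None for pieces that did not fit.
    """
    free = [(0, 0, sheet_w, sheet_h)]
    placements = []
    for pw, ph in pieces:
        for i, (x, y, w, h) in enumerate(free):
            if pw <= w and ph <= h:
                placements.append((x, y, pw, ph))
                del free[i]
                # Guillotine split: strip to the right of the piece,
                # then the full-width strip above it.
                free.append((x + pw, y, w - pw, ph))
                free.append((x, y + ph, w, h - ph))
                break
        else:
            placements.append(None)  # piece did not fit on the sheet
    return placements
```

For example, packing three rectangles on a 10 x 10 sheet places the first two side by side along the bottom edge and the third in the strip above them; every cut spans the full remaining sub-rectangle, which is the defining property of guillotine layouts.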

    Cyber-Physical Systems and Digital Twins in the Industrial Internet of Things [Cyber-Physical Systems]

    No full text

    Digital Twins for the built environment: Learning from conceptual and process models in manufacturing

    Get PDF
    The overall aim of this paper is to contribute to a better understanding of the Digital Twin (DT) paradigm in the built environment by drawing inspiration from existing DT research in manufacturing. The DT is a Product Lifecycle Management information construct that has migrated to the built environment, and research on the subject has grown intensely in recent years. As is common in early research phases, DT research in the built environment has developed organically, setting the basis for mature definitions and robust research frameworks. As DT research in manufacturing is the most developed, this paper seeks to advance the understanding of DTs in the built environment by analysing how the DT systems reported in the manufacturing literature are structured and how they function. Firstly, this paper presents a thorough review and comparison of DTs, cyber-physical systems (CPS), and building information modelling (BIM). Then, the results of the review and categorisation of DT structural and functional descriptions are presented. Fifty-four academic publications and industry reports were reviewed, and their structural and functional descriptions were analysed in detail. Three types of structural models (i.e. conceptual models, system architectures, and data models) and two types of functional models (process and communication models) were identified. DT maturity models were reviewed as well. From the reviewed descriptions, four categories of DT conceptual models (prototypical, model-based, interface-oriented, and service-based) and six categories of DT process models (DT creation, DT synchronisation, asset monitoring, prognosis and simulation, optimal operations, and optimised design) were defined, and their applicability to the AECO sector assessed. While model-based and service-based models are the most applicable to the built environment, amendments are still required. Prognosis and simulation process models are the most widely applicable for AECO use cases. The main contribution to knowledge of this study is that it compiles the DT structural and functional descriptions used in manufacturing and provides the basis to develop DT conceptual and process models specific to the requirements of the built environment sectors.

    Establishing model-to-model interoperability in an engineering workflow

    Get PDF
    Modeling tools for engineering design and analysis are traditionally created in isolation, with features and capabilities geared toward a particular domain; as a result, they cannot readily connect to or be integrated into a larger tool set. With the advent of cloud computing and the success of delivering applications built using a microservices architecture, a more modern approach would allow an engineering design application to be composed of smaller, independently developed and contributed applications within an environment capable of executing those applications without modification to the environment. This is somewhat analogous to an application store, with one main difference: in the engineering design and analysis case, the goal is to enable the coupling of the applications together to perform higher-level analysis, whereas in the application store, most applications are used independently. This work introduces an Application Coupling Interface (ACI) for declaring the semantics of the application programming interfaces (APIs) of a modeled subsystem; a central repository providing access to curated, web-enabled engineered subsystems via ACIs; and an extension of an existing cloud-enabled engineering modeling/design environment that incorporates a new messaging system capable of autonomously orchestrating the execution of, and data exchange between, the subsystems. Together, these components provide the basis for an extensible analysis and design platform that accelerates discovery and innovation by promoting the contribution and reuse of web-enabled engineering models.
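The autonomous orchestration described above, where subsystems declare their coupling semantics and a messaging layer executes them in data-dependency order, might be sketched as follows. The tuple-based ACI encoding, the subsystem names, and the formulas are hypothetical simplifications, not the paper's actual design:

```python
def orchestrate(subsystems, initial):
    """Run coupled subsystems once their declared inputs are available.

    subsystems: list of (name, inputs, outputs, fn), where inputs/outputs
    are tuples of data-pool keys (a toy stand-in for an ACI declaration)
    and fn returns a tuple of values matching 'outputs'.
    'initial' seeds the shared data pool.
    """
    data = dict(initial)
    pending = list(subsystems)
    while pending:
        # Subsystems whose declared inputs are all present in the pool.
        ready = [s for s in pending if all(k in data for k in s[1])]
        if not ready:
            raise RuntimeError("coupling cannot be satisfied")
        for sub in ready:
            name, inputs, outputs, fn = sub
            results = fn(*(data[k] for k in inputs))
            data.update(zip(outputs, results))
            pending.remove(sub)
    return data
```

For instance, an "aero" subsystem producing `drag` from `speed` and a "power" subsystem consuming `drag` can be registered in any order; the orchestrator discovers from the declarations alone that "aero" must run first, which is the kind of coupling the ACI is meant to enable without modifying either application.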

    Sustainable vineyard management through Data Science and Big Data Management

    Get PDF
    In recent years, research on viticulture (Vitis vinifera L.) has been strongly influenced by the twofold need to meet the growing demand for products with high quality standards and to mitigate the critical issues arising from the effects of climate change. Addressing these factors requires a recalibration of vineyard management, moving from a conventional approach that treats the vineyard as a homogeneous unit towards one that accounts for its spatial discontinuities, linked to pedoclimatic peculiarities and biotic variables which, having heterogeneous effects on the biological cycle of the vine, lead to a not always rational use of resources. Hence the need to renew monitoring systems, combining technology transfer with existing scientific knowledge towards uses targeted at and calibrated for viticulture, through which forecasting strategies can be implemented that safeguard ecological balances while keeping productivity and quality unchanged. In modern viticulture, the flow of data extracted from the field comes from diverse sources. This information relates to several aspects, ranging from the characterisation of plant physiology and the nature of the pedoclimatic context to crop management data: fertilisation, irrigation, pruning. Clearly, besides offering great opportunities for investigating the vineyard system, this abundance and diversity of data imposes the burden of managing large volumes of often unstructured data which, despite their great intrinsic value, must be analysed and synthesised before they can be used profitably for the agronomic management of the vineyard.
Indeed, taken out of context or read individually, these data often provide very limited, barely legible information, loosely connected to operational reality and in some cases leading to errors. The purpose of analysing such data (not coincidentally called Big Data) is therefore to identify correlations, trends, patterns recurring in more or less intuitive ways, and hidden or otherwise hard-to-identify interdependencies, in order to develop simulation models constantly updated on the basis of the biodiversity of the viticultural landscape and of the pedoclimatic contexts, enabling decisions based on data closely connected to field reality rather than on mere empirical speculation or historical series, with the associated management benefits. The objectives of the thesis were to: (i) develop methodologies for acquiring and analysing RGB images of the vineyard, and extract and analyse the related data, in order to better understand the critical issues, advantages, and application prospects of this technology; (ii) develop models for estimating the water status of the vine based on the spatio-temporal analysis of data on the plant-soil-atmosphere system, to acquire useful information for irrigation management; (iii) apply the developed methodologies and simulation models to real case studies to evaluate their performance, comparing them with existing methods and analysing their accuracy in providing information for the sustainable management of the vineyard.