
    A deep learning-enhanced digital twin framework for improving safety and reliability in human-robot collaborative manufacturing

    In Industry 5.0, Digital Twins bring flexibility and efficiency to smart manufacturing. Recently, the success of artificial intelligence techniques such as deep learning has led to their adoption in manufacturing, and especially in human–robot collaboration. Collaborative manufacturing tasks involving human operators and robots pose significant safety and reliability concerns. In response to these concerns, a deep learning-enhanced Digital Twin framework is introduced through which human operators and robots can be detected and their actions classified during the manufacturing process, enabling autonomous decision making by the robot control system. Developed using Unreal Engine 4, our Digital Twin framework complies with the Robot Operating System (ROS) specification and supports synchronous control and communication between the Digital Twin and the physical system. In our framework, a fully-supervised detector based on a faster region-based convolutional neural network (Faster R-CNN) is first trained on synthetic data generated by the Digital Twin and then tested on the physical system to demonstrate the effectiveness of the proposed Digital Twin-based framework. To ensure safety and reliability, a semi-supervised detector is further designed to bridge the gap between the twin system and the physical system; it achieves better performance than a fully-supervised detector trained on either synthetic data or real data alone. The evaluation of the framework in multiple scenarios in which human operators collaborate with a Universal Robots UR10 shows that it can accurately detect the human and robot, and classify their actions under a variety of conditions. The data from this evaluation have been made publicly available and can be widely used for research and operational purposes. Additionally, a semi-automated annotation tool from the Digital Twin framework is published to benefit the collaborative robotics community.
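    The following is a minimal, hedged sketch of the kind of fully-supervised detector described above: a torchvision Faster R-CNN fine-tuned on rendered synthetic frames. The dataset/loader wiring and the class count are illustrative placeholders, not details taken from the paper.

```python
# Minimal sketch: fine-tuning a torchvision Faster R-CNN on synthetic
# detection data (e.g. rendered human/robot frames). Any dataset whose
# loader yields (image_tensor, {"boxes": ..., "labels": ...}) pairs works.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background + human + robot (assumed labelling)

def build_detector(num_classes: int = NUM_CLASSES):
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    # Replace the box-classification head with one sized for our classes.
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

def train_one_epoch(model, loader, optimizer, device="cuda"):
    model.train()
    for images, targets in loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)  # dict of component losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Typical wiring (the synthetic loader is a placeholder):
# model = build_detector().to("cuda")
# optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
# train_one_epoch(model, synthetic_loader, optimizer)
```

    A semi-supervised variant could reuse the same loop on a mix of synthetic labels and pseudo-labels derived from real frames, which is one plausible reading of how the sim-to-real gap described above might be bridged.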

    Digitalization and Development

    This book examines the diffusion of digitalization and Industry 4.0 technologies in Malaysia by focusing on the ecosystem critical for their expansion. The chapters examine digital proliferation in the major sectors of agriculture, manufacturing, e-commerce and services, as well as the intermediary organizations essential for the orderly performance of socioeconomic agents. The book incisively reviews the policy instruments critical for the effective and orderly development of the embedding organizations, and the regulatory framework needed to quicken the appropriation of socioeconomic synergies from digitalization and Industry 4.0 technologies. It highlights the importance of collaboration between government, academic and industry partners, and makes key recommendations on how to encourage the adoption of Industry 4.0 technologies in the short and long term. Bridging the concepts and applications of digitalization and Industry 4.0, this book is a must-read for policy makers seeking to quicken the adoption of these technologies.

    Natural and Technological Hazards in Urban Areas

    Natural hazard events and technological accidents are distinct causes of environmental impacts. Natural hazards are physical phenomena that have been active throughout geological time, whereas technological hazards result from actions or facilities created by humans. In recent times, combined natural and man-made hazards have also emerged. Overpopulation and urban development in areas prone to natural hazards increase the impact of natural disasters worldwide. In addition, urban areas are frequently characterized by intense industrial activity and rapid, poorly planned growth that threatens the environment and degrades the quality of life. Proper urban planning is therefore crucial to minimize fatalities and reduce the environmental and economic impacts that accompany both natural and technological hazardous events.

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Machine learning applications in search algorithms for gravitational waves from compact binary mergers

    Gravitational waves from compact binary mergers are now routinely observed by Earth-bound detectors. These observations enable exciting new science, as they have opened a new window to the Universe. However, extracting gravitational-wave signals from the noisy detector data is a challenging problem. The most sensitive search algorithms for compact binary mergers use matched filtering, an algorithm that compares the data with a set of expected template signals. As detectors are upgraded and more sophisticated signal models become available, the number of required templates will increase, which can make some sources computationally prohibitive to search for. The computational cost is of particular concern when low-latency alerts must be issued to maximize the time for electromagnetic follow-up observations. One potential solution for reducing computational requirements, explored over the last decade, is machine learning. However, different proposed deep learning searches target varying parameter spaces and use metrics that are not always comparable to the existing literature; consequently, a clear picture of the capabilities of machine learning searches has been missing. In this thesis, we closely examine the sensitivity of various deep learning gravitational-wave search algorithms and introduce new methods to detect signals from binary black hole and binary neutron star mergers at previously untested statistical confidence levels. By using the sensitive distance as our core metric, we allow for a direct comparison of our algorithms to state-of-the-art search pipelines. As part of this thesis, we organized a global mock data challenge to create a benchmark for machine learning search algorithms targeting compact binaries, and the tools developed in this thesis are made available to the wider community as open-source software. Our studies show that, depending on the parameter space, deep learning gravitational-wave search algorithms are already competitive with current production search pipelines. We also find that strategies developed for traditional searches can be effectively adapted to their machine learning counterparts. In regions where matched filtering becomes computationally expensive, however, available deep learning algorithms are also limited in their capability: we find reduced sensitivity to long-duration signals compared with the excellent results for short-duration binary black hole signals.
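    As a point of reference for the matched-filtering step discussed above, the sketch below computes a noise-weighted correlation of a data segment against a single template using FFTs. The input arrays (data, template, one-sided PSD sampled at the rfft frequency bins) are placeholders, and the normalization is a simplified whitened cross-correlation rather than any specific pipeline's convention.

```python
# Simplified matched filter: whiten data and template with the noise PSD,
# then correlate them at every time lag via FFTs. Inputs are placeholders:
# `data` and `template` are real time series of equal length; `psd` is the
# one-sided noise power spectral density sampled at the np.fft.rfft bins.
import numpy as np

def matched_filter_snr(data, template, psd):
    n = len(data)
    white_d = np.fft.rfft(data) / np.sqrt(psd)      # whitened data
    white_t = np.fft.rfft(template) / np.sqrt(psd)  # whitened template
    # Circular cross-correlation of the whitened series at all time lags.
    corr = np.fft.irfft(white_d * np.conj(white_t), n)
    # Zero-lag autocorrelation of the whitened template = its squared norm.
    norm = np.sqrt(np.fft.irfft(np.abs(white_t) ** 2, n)[0])
    return corr / norm

# The peak of the returned series over time (and over a bank of templates)
# acts as the detection statistic that a search thresholds on.
```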

    Investigation of the metabolism of rare nucleotides in plants

    Nucleotides are metabolites involved in primary and specialized metabolism and have a regulatory role in various biochemical reactions in all forms of life. While the nucleotide metabolome has been characterized extensively in other organisms, comparatively little is known about the cellular concentrations of nucleotides in plants. The aim of this dissertation was to investigate the nucleotide metabolome and the enzymes influencing the composition and quantities of nucleotides in plants. For this purpose, a method for the analysis of nucleotides and nucleosides in plants and algae was developed (Chapter 2.1), which comprises efficient quenching of enzymatic activity, liquid-liquid extraction and solid-phase extraction employing a weak anion-exchange resin. This method allowed the analysis of the nucleotide metabolome of plants in great depth, including the quantification of low-abundance deoxyribonucleotides and deoxyribonucleosides. The details of the method were summarized in an article serving as a laboratory protocol (Chapter 2.2). Furthermore, we contributed a review article (Chapter 2.3) that summarizes the literature on nucleotide analysis and recent technological advances, with a focus on plants and on the factors that influence or hinder the analysis of nucleotides in plants, i.e., a complex metabolic matrix, highly stable phosphatases and the physicochemical properties of nucleotides. To analyze the sub-cellular concentrations of metabolites, a protocol for the rapid isolation of highly pure mitochondria utilizing affinity chromatography was developed (Chapter 2.4). The method for the purification of nucleotides furthermore contributed to the comprehensive analysis of the nucleotide metabolome in germinating seeds and in establishing seedlings of A. thaliana, with a focus on genes involved in the synthesis of thymidylates (Chapter 2.5), and to the characterization of a novel enzyme of purine nucleotide degradation, the XANTHOSINE MONOPHOSPHATE PHOSPHATASE (Chapter 2.6). Protein homology analysis comparing A. thaliana, S. cerevisiae and H. sapiens led to the identification and characterization of an enzyme involved in the metabolite damage repair system of plants, the INOSINE TRIPHOSPHATE PYROPHOSPHATASE (Chapter 2.7). It was shown that this enzyme dephosphorylates deaminated purine nucleotide triphosphates and thus prevents their incorporation into nucleic acids. Loss-of-function mutants senesce early and have a constitutively increased content of salicylic acid. Finally, the source of deaminated purine nucleotides in plants was investigated, and it was shown that abiotic factors contribute to nucleotide damage.

    Medical Image Analysis using Deep Relational Learning

    Over the past ten years, with the help of deep learning and especially the rapid development of deep neural networks, medical image analysis has made remarkable progress. However, how to effectively use the relational information between various tissues or organs in medical images remains a challenging and not yet fully studied problem. In this thesis, we propose two novel solutions to this problem based on deep relational learning. First, we propose a context-aware fully convolutional network that effectively models implicit relational information between features to perform medical image segmentation. The network achieves state-of-the-art segmentation results on the Multimodal Brain Tumor Segmentation 2017 (BraTS 2017) and 2018 (BraTS 2018) datasets. Subsequently, we propose a new hierarchical homography estimation network to achieve accurate medical image mosaicing by learning the explicit spatial relationship between adjacent frames. We conduct experiments on the UCL Fetoscopy Placenta dataset, where our hierarchical homography estimation network outperforms other state-of-the-art mosaicing methods while generating robust and meaningful mosaicing results on unseen frames. (arXiv admin note: substantial text overlap with arXiv:2007.0778)
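    For context on the frame-to-frame alignment task that the learned hierarchical network addresses, the sketch below estimates a homography between two adjacent frames using a classical feature-matching baseline (ORB features with RANSAC in OpenCV). It is an illustrative stand-in, not the deep method proposed in the thesis.

```python
# Classical baseline for frame-to-frame alignment: estimate the homography
# that maps coordinates of frame_a into frame_b from matched ORB features.
import cv2
import numpy as np

def estimate_homography(frame_a, frame_b, max_features=1000):
    orb = cv2.ORB_create(max_features)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC rejects outlier matches; H maps frame_a points to frame_b.
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```

    In a mosaicing pipeline, the estimated H for each new frame could then be composed with previous estimates and applied with cv2.warpPerspective to place the frame into the running mosaic.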

    Neural Architecture Search for Image Segmentation and Classification

    Deep learning (DL) is a class of machine learning algorithms that relies on deep neural networks (DNNs) for computation. Unlike traditional machine learning algorithms, DL can learn from raw data directly and effectively; hence, it has been successfully applied to tackle many real-world problems. When applying DL to a given problem, the primary task is designing the optimal DNN. This task relies heavily on human expertise, is time-consuming, and requires many trial-and-error experiments. This thesis aims to automate the laborious task of designing the optimal DNN by exploring the neural architecture search (NAS) approach. Here, we propose two new NAS algorithms for two real-world problems: pedestrian lane detection for assistive navigation and hyperspectral image segmentation for biosecurity scanning. Additionally, we introduce a new dataset-agnostic predictor of neural network performance, which can be used to speed up NAS algorithms that require the evaluation of candidate DNNs.
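    As a rough illustration of how a performance predictor can be plugged into a search, the sketch below runs a predictor-assisted random search: candidate architectures are ranked cheaply by a surrogate model and only the most promising ones are fully trained. All helper functions (sample_architecture, encode, train_and_evaluate) and the predictor interface are hypothetical placeholders, not the algorithms proposed in the thesis.

```python
# Hypothetical predictor-assisted random search over a NAS search space.
# `sample_architecture` draws a random candidate, `encode` turns it into a
# feature vector, `train_and_evaluate` performs a full (expensive) training
# run, and `predictor` is any regressor with a scikit-learn-style predict().
def nas_random_search(sample_architecture, encode, train_and_evaluate,
                      predictor, budget=100, keep_top=10):
    # Draw a pool of random candidate architectures from the search space.
    candidates = [sample_architecture() for _ in range(budget)]
    # Rank candidates cheaply with the predictor instead of training them all.
    ranked = sorted(candidates,
                    key=lambda arch: predictor.predict([encode(arch)])[0],
                    reverse=True)
    results = []
    for arch in ranked[:keep_top]:
        accuracy = train_and_evaluate(arch)  # expensive: full training + eval
        results.append((accuracy, arch))
    # Return the best fully evaluated (accuracy, architecture) pair.
    return max(results, key=lambda pair: pair[0])
```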

    ABC: Adaptive, Biomimetic, Configurable Robots for Smart Farms - From Cereal Phenotyping to Soft Fruit Harvesting

    Currently, numerous factors, such as demographics, migration patterns and economics, are leading to a critical labour shortage in the low-skilled and physically demanding parts of agriculture. Robotics can therefore be developed for the agricultural sector to address these shortages. This study aims to develop an adaptive, biomimetic and configurable modular robotics architecture that can be applied to multiple tasks (e.g., phenotyping, cutting and picking), various crop varieties (e.g., wheat, strawberry and tomato) and different growing conditions. These robotic solutions cover the entire perception–action–decision-making loop, targeting the phenotyping of cereals and the harvesting of fruit in natural environments. The primary contributions of this thesis are as follows. a) A high-throughput method for imaging field-grown wheat in three dimensions is presented, along with an accompanying unsupervised measuring method for obtaining individual wheat spike data. The unsupervised method analyses the 3D point cloud of each trial plot, containing hundreds of wheat spikes, and calculates the average size of the wheat spikes and the total spike volume per plot. Experimental results reveal that the proposed algorithm can effectively identify spikes in wheat crops and isolate individual spikes. b) Unlike cereal, soft fruit is typically harvested by manual selection and picking. To enable robotic harvesting, the initial perception system uses conditional generative adversarial networks to identify ripe fruits using synthetic data. To determine whether a strawberry is surrounded by obstacles, a cluster-complexity-based perception system is further developed to classify the harvesting complexity of ripe strawberries. c) Once the harvest-ready fruit is localised using point cloud data generated by a stereo camera, the platform's action system coordinates the arm to reach and cut the stem using the passive motion paradigm framework, inspired by studies on the neural control of movement in the brain. Results from field trials are presented for strawberry detection, for reaching and cutting the stem of the fruit with a mean error of less than 3 mm, and for extensions to analysing complex canopy structures and bimanual (searching/picking) coordination. Although this thesis focuses on strawberry harvesting, ongoing research is heading toward adapting the architecture to other crops. The agricultural food industry remains a labour-intensive sector with low margins and a cost- and time-efficiency-driven business model. The concepts presented herein can serve as a reference for future agricultural robots that are adaptive, biomimetic and configurable.
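    To make the unsupervised per-plot spike measurement step concrete, here is a minimal sketch under assumed tooling (scikit-learn DBSCAN and a SciPy convex hull): cluster a pre-filtered plot point cloud into spike candidates and summarize their volumes. The parameter values and summary statistics are illustrative; the thesis pipeline may differ.

```python
# Cluster a (pre-filtered) 3D point cloud of a trial plot into individual
# spike candidates and summarize their sizes. Illustrative parameters only.
import numpy as np
from scipy.spatial import ConvexHull
from sklearn.cluster import DBSCAN

def summarize_spikes(points, eps=0.01, min_points=30):
    """points: (N, 3) array of spike-region points, assumed to be in metres."""
    labels = DBSCAN(eps=eps, min_samples=min_points).fit_predict(points)
    volumes = []
    for label in set(labels) - {-1}:          # -1 marks DBSCAN noise points
        cluster = points[labels == label]
        if len(cluster) >= 4:                 # ConvexHull needs >= 4 points in 3D
            volumes.append(ConvexHull(cluster).volume)
    return {
        "spike_count": len(volumes),
        "mean_spike_volume": float(np.mean(volumes)) if volumes else 0.0,
        "total_spike_volume": float(np.sum(volumes)),
    }
```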