
    Machine learning and deep learning based methods toward Industry 4.0 predictive maintenance in induction motors: A state-of-the-art survey

    Purpose: Developments in Industry 4.0 technologies and Artificial Intelligence (AI) have enabled data-driven manufacturing. Predictive maintenance (PdM) has therefore become the prominent approach for fault detection and diagnosis (FD/D) of induction motors (IMs). The maintenance and early FD/D of IMs are critical processes, considering that they constitute the main power source in the industrial production environment. Machine learning (ML) methods have enhanced the performance and reliability of PdM. Various deep learning (DL) based FD/D methods have emerged in recent years, providing automatic feature engineering and learning and thereby alleviating the drawbacks of traditional ML based methods. This paper presents a comprehensive survey of ML and DL based FD/D methods for IMs that have emerged since 2015. An overview of the main DL architectures used for this purpose is also presented, together with a discussion of recent trends and future directions for research. Design/methodology/approach: A comprehensive survey has been carried out across all available publication databases using related keywords. The reviewed works have been classified according to the main ML and DL techniques and algorithms. Findings: DL based PdM methods have mainly been introduced and implemented for IM fault diagnosis in recent years. Novel DL FD/D methods are based on single DL techniques as well as hybrid techniques. DL methods have also been used for signal preprocessing and, moreover, have been combined with traditional ML algorithms to enhance FD/D performance in feature engineering. Publicly available datasets have mostly been used to test the performance of the developed methods; industrial datasets should become available as well. Multi-agent system (MAS) based PdM employing ML classifiers has been explored.
    Several methods have investigated multiple IM faults; however, the presence of multiple faults occurring simultaneously has rarely been investigated. Originality/value: The paper presents a comprehensive review of the recent advances in PdM of IMs based on ML and DL methods that have emerged since 2015. Peer reviewed

    Enhancing smart home energy efficiency through accurate load prediction using deep convolutional neural networks

    Intelligent home load prediction based on deep convolutional neural networks is a method of forecasting a home's electricity load using deep learning techniques. It analyses data from sources such as weather, time of day, and other factors to accurately predict the household load, with the aim of optimising energy usage and reducing energy costs. The article proposes a deep learning based approach for non-permanent residential electrical load forecasting that employs temporal convolutional networks (TCN) to model historical load series with time-series characteristics and to learn the highly dynamic patterns of variation among the attribute parameters of electricity consumption. The method accounts for the time-series properties of the data and the intricate inherent correlations between variables, and supports parallelised large-scale data processing with good operational efficiency. Experiments were conducted on the UCI public dataset, and the findings validate the method's efficacy, which has clear practical implications for intelligent power grid dispatching.
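    The core building block of the TCN architecture described above is a causal dilated 1-D convolution. The sketch below (plain NumPy; the kernel and load series are illustrative, not the paper's configuration) shows how left-padding keeps each output dependent only on past inputs, and how dilation spaces the filter taps across time:

```python
import numpy as np

def causal_dilated_conv(x, kernel, dilation=1):
    """Causal dilated 1-D convolution: the output at time t depends only
    on inputs at times <= t, with taps spaced `dilation` steps apart."""
    k = len(kernel)
    pad = (k - 1) * dilation  # left-pad so the output is causal and same-length
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    out = np.zeros(len(x))
    for t in range(len(x)):
        for i in range(k):
            out[t] += kernel[i] * xp[t + pad - i * dilation]
    return out

# Hypothetical hourly load series; each output mixes the current value
# with the value `dilation` steps in the past.
load = [1.0, 2.0, 3.0, 4.0, 5.0]
smooth = causal_dilated_conv(load, kernel=[0.5, 0.5], dilation=2)
```

    Stacking such layers with dilations 1, 2, 4, … grows the receptive field exponentially with depth, which is what lets a TCN cover long load histories while processing all time steps in parallel.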

    Bayesian gravitation based classification for hyperspectral images

    Integration of spectral and spatial information is extremely important for the classification of high-resolution hyperspectral images (HSIs). Gravitation describes the interaction among celestial bodies, and it can be applied to measure similarity between data for image classification. However, gravitation is hard to combine with spatial information and has rarely been applied in HSI classification. This paper proposes a Bayesian Gravitation based Classification (BGC) method to integrate the spectral and spatial information of local neighbors and training samples. In BGC, each testing pixel is first treated as a massive object with unit volume and a particular density, where the density is taken as the data mass. Specifically, the data mass is formulated, via the Bayesian theorem, as an exponential function of the spectral distribution of the pixel's neighbors and the spatial prior distribution of its surrounding training samples. A joint data gravitation model is then developed as the classification measure, in which the data mass weighs the contribution of different neighbors in a local region. Four benchmark HSI datasets, i.e. Indian Pines, Pavia University, Salinas, and Grss_dfc_2014, are used to verify the BGC method. The experimental results are compared with those of several well-known HSI classification methods, including support vector machines, sparse representation, and eight other state-of-the-art HSI classifiers. BGC shows clear superiority in the classification of high-resolution HSIs and also flexibility for HSIs with limited samples.
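    As an illustration of the gravitation analogy only (not the full BGC method: the Bayesian spatial prior and local-neighbor terms are omitted, and the data and labels are made up), a toy classifier can let every training sample pull on a test pixel with a Newtonian force whose mass decays exponentially with spectral distance:

```python
import numpy as np

def gravitation_classify(x, train_X, train_y, eps=1e-9):
    """Toy gravitation-based classifier: each training sample exerts a
    'force' mass / distance^2 on the test pixel; the class with the
    largest total pull wins. The exponentially decaying mass loosely
    mirrors BGC's exponential data-mass term."""
    scores = {}
    for xi, yi in zip(train_X, train_y):
        d2 = float(np.sum((np.asarray(x) - np.asarray(xi)) ** 2)) + eps
        mass = np.exp(-np.sqrt(d2))                   # data mass of the sample
        scores[yi] = scores.get(yi, 0.0) + mass / d2  # Newtonian form
    return max(scores, key=scores.get)

# Hypothetical 2-band "spectra" with two classes.
X = [[0.1, 0.2], [0.0, 0.1], [0.9, 1.0], [1.0, 0.8]]
y = ["soil", "soil", "crop", "crop"]
label = gravitation_classify([0.05, 0.15], X, y)
```

    Because force falls off with squared distance, nearby training samples dominate the decision, which is the intuition behind using gravitation as a similarity measure.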

    Identifying and Detecting Attacks in Industrial Control Systems

    The integrity of industrial control systems (ICS) found in utilities, oil and natural gas pipelines, manufacturing plants and transportation is critical to national wellbeing and security. Such systems depend on hundreds of field devices to manage and monitor a physical process. Previously, these devices were specific to ICS, but they are now being replaced by general-purpose computing technologies and, increasingly, are being augmented with Internet of Things (IoT) nodes. Whilst there are benefits to this approach in terms of cost and flexibility, it has attracted a wider community of adversaries. These include attackers with significant domain knowledge, such as those responsible for the attacks on Iran’s nuclear facilities, a steel mill in Germany, and Ukraine’s power grid; however, non-specialist attackers are becoming increasingly interested in the physical damage it is possible to cause. At the same time, the approach increases the number and range of vulnerabilities to which ICS are subject; regrettably, conventional techniques for analysing such a large attack space are inadequate, a cause of major national concern. In this thesis we introduce a generalisable approach based on evolutionary multiobjective algorithms to assist in identifying vulnerabilities in complex heterogeneous ICS. This is both challenging and an area currently lacking research. Our approach has been to review the security of currently deployed ICS and then to make use of an internationally recognised ICS simulation testbed for experiments, assuming that the attacking community largely lacks specific ICS knowledge. Using the simulator, we identified vulnerabilities in individual components and then used these to generate attacks. Defences against these attacks, in the form of novel intrusion detection systems based on a range of machine learning models, were developed.
    Finally, these were further subjected to attacks created using the evolutionary multiobjective algorithms, demonstrating, for the first time, the feasibility of creating sophisticated attacks against a well-protected adversary using automated mechanisms.
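    At the heart of any evolutionary multiobjective search of the kind the thesis uses is ranking candidates by Pareto dominance. A minimal sketch of that core idea, with hypothetical objective scores standing in for the real attack metrics:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives maximised here)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the non-dominated candidates: the trade-off surface an
    evolutionary multiobjective search converges toward."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical attack candidates scored as (process damage, stealthiness).
candidates = [(0.9, 0.1), (0.5, 0.5), (0.2, 0.9), (0.4, 0.4)]
front = pareto_front(candidates)
```

    The dominated candidate (0.4, 0.4) is discarded; a real algorithm such as NSGA-II wraps this ranking in selection, crossover, and mutation to evolve the front over generations.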

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to afford a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Attentive monitoring of multiple video streams driven by a Bayesian foraging strategy

    In this paper we consider the problem of deploying attention to subsets of video streams in order to collate the most relevant data and information related to a given task. We formalize this monitoring problem as a foraging problem and propose a probabilistic framework that models the observer's attentive behavior as the behavior of a forager. The forager, moment to moment, focuses its attention on the most informative stream/camera, detects interesting objects or activities, or switches to a more profitable stream. The proposed approach is well suited to multi-stream video summarization, and can also serve as a preliminary step for more sophisticated video surveillance, e.g. activity and behavior analysis. Experimental results achieved on the UCR Videoweb Activities Dataset, a publicly available dataset, are presented to illustrate the utility of the proposed technique. Comment: Accepted to IEEE Transactions on Image Processing.
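    In the foraging framing, the observer's two basic decisions are which stream to attend to and when to leave it. A schematic sketch of those decisions (the gain estimates and the patch-leaving rule here are illustrative simplifications, not the paper's Bayesian model):

```python
def pick_stream(gain_estimates):
    """Greedy forager step: attend to the stream whose current expected
    information gain is highest."""
    return max(range(len(gain_estimates)), key=lambda i: gain_estimates[i])

def should_switch(recent_gains, global_mean):
    """Patch-leaving rule in the spirit of the marginal value theorem:
    abandon the current stream once its average recent gain drops below
    the mean gain available across all streams."""
    return sum(recent_gains) / len(recent_gains) < global_mean

gains = [0.2, 0.7, 0.4]   # hypothetical per-camera gain estimates
best = pick_stream(gains)  # index of the most informative stream
leave = should_switch([0.1, 0.2], global_mean=0.4)
```

    The full framework replaces these point estimates with posterior beliefs updated as frames arrive, but the attend-or-switch structure is the same.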

    Adaptive Chemical Agent Detection in Dynamic, Changing Environments

    In this thesis a framework for adaptive chemical detection is developed, considering the application scenario of autonomous, robot-mounted chemical agent detection in dynamic, changing environments. Chemical detection is performed by the Receptor Density Algorithm (RDA), a previously developed immune-inspired anomaly detection algorithm, whose performance degrades when the background environment changes. Focusing on the software part of the system, the goal of this thesis is to adapt the RDA quickly and autonomously, without requiring user feedback. The approach adopted is to first detect a change in the background data-generating distribution, also known as concept drift, and then adapt in response to this detected change. Statistical hypothesis testing is used to determine whether there has been concept drift between consecutive time windows of the incoming sensor data. Five different statistical methods are tested on mass spectrometry data, enhanced with artificial concept drift that signifies a changing environment. The results show that, while no one method is universally best, statistical hypothesis testing can detect concept drift in the context of chemical sensing and can differentiate between anomalies and concept drift. The adaptation of the system, triggered by the detection of concept drift, is achieved by switching to an ensemble (a set) of RDAs created from a pool of pre-existing candidates. A novel mechanism for evaluating and selecting the members of the ensemble from this pool is proposed; it uses implicit performance information, extracted from the dynamics of the RDA, and does not require new user input to evaluate the candidates for the new environment. An ensemble of five members selected in this way is found to be significantly better than a single RDA, the previously known best, reducing both the false detections and the number of missed anomalies.
    This method for selecting the ensemble members is also found to be significantly better than populating the ensemble based on the candidates' performance in the environment before the concept drift. Finally, it is found that the ensemble can be created online, with its performance converging to that of the offline variant.
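    One concrete instance of the windowed hypothesis-testing idea above (the thesis compares five statistical methods; the two-sample Kolmogorov-Smirnov statistic below is just a representative example, with an illustrative fixed threshold rather than a proper critical value):

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDFs of samples a and b."""
    values = sorted(set(a) | set(b))
    def ecdf(s, v):
        # fraction of samples in s that are <= v
        return sum(1 for x in s if x <= v) / len(s)
    return max(abs(ecdf(a, v) - ecdf(b, v)) for v in values)

def drift_detected(prev_window, curr_window, threshold=0.5):
    """Flag concept drift when consecutive sensor windows differ by more
    than `threshold` (in practice the cutoff would come from the KS
    sampling distribution at a chosen significance level)."""
    return ks_statistic(prev_window, curr_window) > threshold

stable = drift_detected([1, 2, 3, 4], [1, 2, 3, 4])       # same background
shifted = drift_detected([1, 2, 3, 4], [11, 12, 13, 14])  # changed background
```

    Because the test compares whole window distributions rather than single readings, a lone anomalous sample moves the statistic far less than a sustained background shift, which is what lets the framework separate anomalies from drift.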