    Ensemble deep learning: A review

    Ensemble learning combines several individual models to obtain better generalization performance. Currently, deep learning models with multilayer processing architectures are showing better performance than shallow or traditional classification models. Deep ensemble learning models combine the advantages of both deep learning and ensemble learning, so that the final model has better generalization performance. This paper reviews state-of-the-art deep ensemble models and hence serves as an extensive summary for researchers. The ensemble models are broadly categorised as bagging, boosting and stacking ensembles; negative-correlation-based deep ensemble models; explicit/implicit ensembles; homogeneous/heterogeneous ensembles; decision fusion strategies; and unsupervised, semi-supervised, reinforcement learning, online/incremental, and multilabel-based deep ensemble models. Applications of deep ensemble models in different domains are also briefly discussed. Finally, we conclude the paper with some future recommendations and research directions.
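The bagging idea this review categorises can be sketched in a few lines: several weak models are trained on bootstrap resamples of the data and their predictions are combined by majority vote. The threshold-stump "model" below is a toy stand-in for illustration only, not a method from the paper.

```python
import random

def train_stump(data):
    """Fit a 1-D threshold classifier: predict 1 when x >= threshold."""
    best_t, best_err = None, float("inf")
    for t, _ in data:
        err = sum((x >= t) != y for x, y in data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_predict(x, stumps):
    """Majority vote over the ensemble of stumps."""
    votes = sum(x >= t for t in stumps)
    return 1 if votes * 2 >= len(stumps) else 0

# Toy 1-D data: label 1 when x >= 5.
random.seed(0)
data = [(x, int(x >= 5)) for x in range(10)]

stumps = []
for _ in range(7):
    boot = [random.choice(data) for _ in data]  # bootstrap resample
    stumps.append(train_stump(boot))

print(bagged_predict(7, stumps), bagged_predict(2, stumps))
```

Boosting and stacking follow the same combine-many-learners pattern, but weight the training samples sequentially (boosting) or feed the base predictions into a meta-learner (stacking).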

    A Comprehensive Survey of Deep Learning in Remote Sensing: Theories, Tools and Challenges for the Community

    In recent years, deep learning (DL), a re-branding of neural networks (NNs), has risen to the top in numerous areas, namely computer vision (CV), speech recognition, natural language processing, etc. Whereas remote sensing (RS) possesses a number of unique challenges, primarily related to sensors and applications, RS inevitably draws from many of the same theories as CV; e.g., statistics, fusion, and machine learning, to name a few. This means that the RS community should be aware of, if not at the leading edge of, advancements like DL. Herein, we provide the most comprehensive survey of state-of-the-art RS DL research. We also review recent new developments in the DL field that can be used in DL for RS. Namely, we focus on theories, tools and challenges for the RS community. Specifically, we focus on unsolved challenges and opportunities as they relate to (i) inadequate data sets, (ii) human-understandable solutions for modelling physical phenomena, (iii) Big Data, (iv) non-traditional heterogeneous data sources, (v) DL architectures and learning algorithms for spectral, spatial and temporal data, (vi) transfer learning, (vii) an improved theoretical understanding of DL systems, (viii) high barriers to entry, and (ix) training and optimizing the DL. (64 pages, 411 references. To appear in Journal of Applied Remote Sensing.)

    Evaluation of Machine Learning approach in flood prediction scenarios and its input parameters: A systematic review

    Flood disasters happen frequently around the globe and bring serious impacts to lives, property, infrastructure and the environment. Stopping floods is difficult, but preventing the serious damage they cause is possible. Thus, implementing flood prediction could help in flood preparation and possibly reduce the impact of flooding. This study aims to evaluate existing machine learning (ML) approaches for flood prediction, as well as the parameters used for predicting floods, based on a review of previous research articles. To achieve this aim, the study is twofold: the first part identifies flood prediction approaches that specifically use ML methods, and the second part identifies the parameters that have been used as inputs to flood prediction models. The main contribution of this paper is to determine the most recent ML techniques in flood prediction and to identify the notable parameters used as model inputs, so that researchers and/or flood managers can refer to the prediction results as a guideline when considering ML methods for early flood prediction.

    Scalable and Efficient Network Anomaly Detection on Connection Data Streams

    Every day, security experts and analysts must face the huge increase in cyber security threats that propagate very quickly over the Internet and threaten the security of hundreds of millions of users worldwide. Detecting such threats and attacks is of paramount importance to these experts in order to prevent the threats and mitigate their effects in the future. Thus, the need for security solutions that can prevent, detect, and mitigate such threats is pressing and must be addressed with scalable and efficient solutions. To this end, we propose a scalable framework, called Daedalus, to analyze streams of NIDS (network-based intrusion detection system) logs in near real time and to extract useful threat-security intelligence. The proposed system pre-processes massive amounts of connection stream logs received from different participating organizations and applies an elaborate anomaly detection technique to distinguish between normal and anomalous network behaviors. As such, Daedalus detects network traffic anomalies by extracting a set of significant pre-defined features from the connection logs and then applying a time-series-based technique to detect abnormal behavior in near real time. Moreover, we correlate IP blocks extracted from the logs with external signature-based security feeds that detect factual malicious activities (e.g., malware families and hashes, ransomware distribution, and command and control centers) in order to validate the proposed approach. The experiments performed demonstrate that Daedalus accurately identifies malicious activities with an average F_1 score of 92.88%. We further compare our approach with existing K-Means and deep learning (LSTM) approaches and demonstrate the accuracy and efficiency of our system.
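The time-series-based detection step described above can be sketched minimally: flag a point in a per-feature stream (e.g. connections per minute) when it deviates strongly from a rolling baseline. The window size and z-score threshold below are illustrative choices, not parameters from Daedalus.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=5, threshold=3.0):
    """Return indices whose z-score vs. the preceding window exceeds threshold."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]          # rolling baseline
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A steady stream of connection counts with one burst at index 10.
stream = [50, 52, 49, 51, 50, 48, 52, 50, 49, 51, 400, 50, 52, 49, 51]
print(detect_anomalies(stream))  # → [10]
```

A production pipeline would run such a detector per extracted feature and per organization, then correlate the flagged IP blocks against the external signature feeds, as the abstract describes.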

    Overflow control in sewer networks using different modeling techniques and the Internet of Things

    Increased urbanization and extreme rainfall events are causing more frequent instances of sewer overflow, leading to the pollution of water resources and negative environmental, health, and fiscal impacts. At the same time, the treatment capacity of wastewater treatment plants is seriously affected. The main aim of this Ph.D. thesis is to use the Internet of Things and various modeling techniques to investigate the use of real-time control on existing sewer systems to mitigate overflow. The role of the Internet of Things is to provide continuous monitoring and real-time control of sewer systems. Data collected by the Internet of Things are also useful for model development and calibration. Models are useful for various purposes in real-time control, and they can be distinguished as those suitable for simulation and those suitable for prediction. Models suitable for simulation, which describe the important phenomena of a system in a deterministic way, are useful for developing and analyzing different control strategies. Meanwhile, models suitable for prediction are usually employed to predict future system states; they use measurement information about the system and must have a high computational speed. To demonstrate how real-time control can be used to manage sewer systems, a case study was conducted for this thesis in Drammen, Norway. In this study, a hydraulic model was used as a model suitable for simulation to test the feasibility of different control strategies. Considering the recent advances in artificial intelligence and the large amount of data collected through the Internet of Things, the study also explored the possibility of using artificial intelligence as a model suitable for prediction. A summary of the results of this work is presented through five papers. Paper I demonstrates that one mainstream artificial intelligence technique, long short-term memory, can accurately predict time series data from the Internet of Things.
Indeed, the Internet of Things and long short-term memory can be powerful tools for sewer system managers or engineers, who can take advantage of real-time data and predictions to improve decision-making. In Paper II, a hydraulic model and artificial intelligence are used to investigate an optimal in-line storage control strategy that uses the temporal storage volumes in pipes to reduce overflow. Simulation results indicate that during heavy rainfall events, the response behavior of the sewer system differs with respect to location. Overflows at a wastewater treatment plant under different control scenarios were simulated and compared. The results from the hydraulic model show that overflows were reduced dramatically through the intentional control of pipes with in-line storage capacity. To determine available in-line storage capacity, recurrent neural networks were employed to predict the upcoming flow coming into the pipes that were to be controlled. Paper III and Paper IV describe a novel inter-catchment wastewater transfer solution. The inter-catchment wastewater transfer method aims at redistributing spatially mismatched sewer flows by transferring wastewater from a wastewater treatment plant to its neighboring catchment. In Paper III, the hydraulic behaviors of the sewer system under different control scenarios are assessed using the hydraulic model. Based on the simulations, inter-catchment wastewater transfer could efficiently reduce total overflow from a sewer system and wastewater treatment plant. Artificial intelligence was used to predict inflow to the wastewater treatment plant to improve inter-catchment wastewater transfer functioning. The results from Paper IV indicate that inter-catchment wastewater transfer might result in an extra burden for a pump station. To enhance the operation of the pump station, long short-term memory was employed to provide multi-step-ahead water level predictions. 
    Paper V proposes a DeepCSO model based on a large set of high-resolution sensors and multi-task learning techniques. Experiments demonstrated that the multi-task approach is generally better than single-task approaches. Furthermore, the gated recurrent unit and long short-term memory-based multi-task learning models are especially suitable for capturing the temporal and spatial evolution of combined sewer overflow events and are superior to other methods. The DeepCSO model could help guide the real-time operation of sewer systems at a citywide level.
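The multi-step-ahead prediction pattern used for the water-level forecasts above can be sketched with the common recursive strategy: a one-step model is applied repeatedly, feeding each prediction back in as input. A trivial persistence-plus-trend forecaster stands in for the LSTM/GRU here; the horizon and data are illustrative, not from the thesis.

```python
def one_step(history):
    """Toy one-step forecaster: last value plus the most recent change."""
    return history[-1] + (history[-1] - history[-2])

def multi_step_forecast(history, horizon):
    """Recursive multi-step strategy: reuse each prediction as input."""
    window = list(history)
    preds = []
    for _ in range(horizon):
        y = one_step(window)
        preds.append(y)
        window.append(y)
    return preds

# Water levels rising by 0.5 per step; forecast three steps ahead.
levels = [2.0, 2.5, 3.0]
print(multi_step_forecast(levels, 3))  # → [3.5, 4.0, 4.5]
```

With a trained recurrent network in place of `one_step`, the same loop yields the multi-step-ahead water-level predictions used to anticipate pump-station load.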