
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.
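
    To make the "Pareto-optimal" notion in the title concrete: a network configuration is Pareto-optimal when no other configuration is at least as good in every objective and strictly better in at least one. The sketch below illustrates this on hypothetical (throughput, latency) pairs; the numbers are invented purely for illustration.

```python
# Minimal illustration of Pareto optimality for wireless trade-offs.
# Hypothetical candidates: (throughput in Mbps, latency in ms).
candidates = [(120, 30), (100, 10), (80, 8), (150, 50), (90, 12)]

def dominates(a, b):
    """a dominates b if a is no worse in both objectives (maximize
    throughput, minimize latency) and strictly better in at least one."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

pareto_front = [c for c in candidates
                if not any(dominates(o, c) for o in candidates if o != c)]
print(pareto_front)  # (90, 12) is dominated by (100, 10) and drops out
```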

    DxNAT - Deep Neural Networks for Explaining Non-Recurring Traffic Congestion

    Non-recurring traffic congestion is caused by temporary disruptions such as accidents, sports games, and adverse weather. We use data on real-time traffic speed, jam factors (a traffic congestion indicator), and events, collected over a year from Nashville, TN, to train a multi-layered deep neural network. The traffic dataset contains over 900 million records. The network is then used to classify real-time data and identify anomalous operations. Compared with traditional approaches based on statistical or machine learning techniques, our model reaches an accuracy of 98.73 percent when identifying traffic congestion caused by football games. Our approach first encodes the traffic across a region as a scaled image; the image data from different timestamps is then fused with event- and time-related data. A crossover operator is then used as a data augmentation method to generate training datasets with more balanced classes. Finally, we use receiver operating characteristic (ROC) analysis to tune the sensitivity of the classifier. We present the analysis of the training time and the inference time separately.
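
    The crossover-based augmentation step lends itself to a short sketch. The idea, as described, is to recombine samples to rebalance rare congestion classes; the operator below, which splices two same-class encoded traffic images at a random cut point, is a plausible reconstruction rather than the authors' exact implementation.

```python
import numpy as np

def crossover_augment(x1, x2, rng=None):
    """Recombine two same-class feature arrays at a random cut point
    to synthesize a new training sample (illustrative sketch only)."""
    rng = rng or np.random.default_rng()
    assert x1.shape == x2.shape
    cut = int(rng.integers(1, x1.size))  # random crossover point
    child = np.concatenate([x1.ravel()[:cut], x2.ravel()[cut:]])
    return child.reshape(x1.shape)

# Example: synthesize an extra sample for a minority class such as
# "game-day congestion" from two encoded traffic images.
a = np.random.rand(8, 8)   # hypothetical encoded traffic image, time t1
b = np.random.rand(8, 8)   # hypothetical encoded traffic image, time t2
synthetic = crossover_augment(a, b)
```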

    Deep Learning Approach for Intrusion Detection System (IDS) in the Internet of Things (IoT) Network using Gated Recurrent Neural Networks (GRU)

    The Internet of Things (IoT) is a complex paradigm in which billions of devices are connected to a network. These connected devices form an intelligent system of systems that shares data without human-to-computer or human-to-human interaction. Such systems extract meaningful data that can transform human lives, businesses, and the world in significant ways. However, in reality IoT is prone to countless cyber-attacks in an extremely hostile environment like the internet. The recent hacks of a 2014 Jeep Cherokee, the iStan pacemaker, and a German steel plant are a few notable security breaches. Traditional high-end security solutions are not suitable for securing an IoT system, as IoT devices have limited storage capacity and processing power. Moreover, IoT devices stay connected for long periods without human intervention. This raises the need for smart security solutions that are lightweight, distributed, and offer a long service life. Rather than per-device security for numerous IoT devices, it is more feasible to implement security solutions for the network data. Artificial intelligence techniques such as Machine Learning and Deep Learning have already proven their significance when dealing with heterogeneous data of various sizes. To substantiate this, in this research we apply concepts of Deep Learning and the Transmission Control Protocol/Internet Protocol (TCP/IP) stack to build a lightweight, distributed security solution with high durability for IoT network security. First, we examine ways of improving the IoT architecture and propose a lightweight, multi-layered design for an IoT network. Second, we analyze the existing applications of Machine Learning and Deep Learning to IoT and cyber-security. Third, we evaluate deep learning's gated recurrent neural networks (LSTM and GRU) on the DARPA/KDD Cup '99 intrusion detection dataset for each layer in the designed architecture. Finally, from the evaluated metrics, we propose the neural network design best suited for an IoT intrusion detection system. With an accuracy of 98.91% and a false alarm rate of 0.76%, this research outperforms the published results of existing methods on the KDD Cup '99 dataset. For the first time in IoT research, gated recurrent neural networks are applied to IoT security.
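
    A minimal sketch of a GRU-based binary intrusion detector of the kind evaluated here might look as follows in TensorFlow/Keras. The 41 features per record match the standard KDD Cup '99 layout, but the window length of 10 timesteps and all layer sizes are assumptions for illustration, not the thesis's tuned design.

```python
import tensorflow as tf

# Sketch: GRU classifier over windows of preprocessed KDD Cup '99
# records. Output is the probability that a window is an attack.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 41)),            # (timesteps, features)
    tf.keras.layers.GRU(64),                   # gated recurrent layer
    tf.keras.layers.Dropout(0.2),              # regularization
    tf.keras.layers.Dense(1, activation="sigmoid"),  # attack vs. normal
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```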

    Malware Detection in Internet of Things (IoT) Devices Using Deep Learning

    Internet of Things (IoT) device usage is increasing exponentially with the spread of the internet. With the increasing volume of data on IoT devices, these devices are becoming vulnerable to malware attacks; therefore, malware detection has become an important issue for IoT devices. An effective, reliable, and time-efficient mechanism is required for the identification of sophisticated malware. Researchers have proposed multiple methods for malware detection in recent years; however, accurate detection remains a challenge. We propose a deep learning-based ensemble classification method for the detection of malware in IoT devices. It uses a three-step approach: in the first step, data is preprocessed using scaling, normalization, and de-noising; in the second step, features are selected and one-hot encoding is applied; and in the third step, an ensemble classifier based on CNN and LSTM outputs detects the malware. We have compared our results with state-of-the-art methods, and our proposed method outperforms the existing methods on standard datasets with an average accuracy of 99.5%.
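
    The described fusion of CNN and LSTM outputs can be sketched with the Keras functional API. The input shape, branch widths, and single-layer fusion head below are assumptions; the abstract does not specify the authors' actual architecture.

```python
import tensorflow as tf

# Sketch: two branches over the same preprocessed feature sequence,
# one convolutional and one recurrent, fused for a malware verdict.
inp = tf.keras.Input(shape=(100, 1))           # assumed sequence length

cnn = tf.keras.layers.Conv1D(32, 3, activation="relu")(inp)
cnn = tf.keras.layers.GlobalMaxPooling1D()(cnn)

lstm = tf.keras.layers.LSTM(32)(inp)

merged = tf.keras.layers.Concatenate()([cnn, lstm])
out = tf.keras.layers.Dense(1, activation="sigmoid")(merged)  # malware?

model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```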

    Congestion Prediction in Internet of Things Network using Temporal Convolutional Network: A Centralized Approach

    The unprecedented ballooning of network traffic flow, specifically Internet of Things (IoT) network traffic, has placed great congestion stress on today's Internet. Non-recurring network traffic flow may be caused by temporary disruptions, such as packet drops, poor quality of service, delay, etc. Hence, network traffic flow estimation is important in IoT networks for predicting congestion. Because the data in IoT networks is collected from a large number of diverse devices with dissimilar data formats, and also exhibits complex correlations, the generated data is heterogeneous and nonlinear in nature. Conventional machine learning approaches are unable to deal with such nonlinear datasets and suffer from misclassification of real network traffic due to overfitting. It is therefore very hard for conventional machine learning tools like shallow neural networks to predict congestion accurately. The accuracy of congestion prediction algorithms plays an important role in controlling congestion by regulating the send rate of the source. Various deep learning methods (LSTM, CNN, GRU, etc.) have been considered in designing network traffic flow predictors and have shown promising results. In this work, we propose a novel congestion predictor for IoT that uses a Temporal Convolutional Network (TCN). Furthermore, we use the Taguchi method to optimize the TCN model, which reduces the number of experimental runs. We compare the TCN with four other deep learning-based models in terms of Mean Absolute Error (MAE) and Mean Relative Error (MRE). The experimental results show that the TCN-based deep learning framework achieves improved performance, with 95.52% accuracy in predicting network congestion. Further, we design a home IoT network testbed to capture real network traffic flows, as no standard dataset is available.
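
    A TCN in this setting is typically a stack of dilated causal 1-D convolutions whose receptive field grows exponentially with depth. The sketch below shows such a predictor trained against MAE, the paper's reported metric; the window length, channel counts, and dilation schedule are illustrative guesses, not the Taguchi-optimized configuration.

```python
import tensorflow as tf

# Sketch: TCN-style regressor over a window of past traffic-flow
# measurements, predicting the next flow / congestion level.
model = tf.keras.Sequential([tf.keras.Input(shape=(64, 1))])
for dilation in (1, 2, 4, 8):      # exponentially growing receptive field
    model.add(tf.keras.layers.Conv1D(32, kernel_size=3,
                                     padding="causal",
                                     dilation_rate=dilation,
                                     activation="relu"))
model.add(tf.keras.layers.GlobalAveragePooling1D())
model.add(tf.keras.layers.Dense(1))            # predicted flow value
model.compile(optimizer="adam", loss="mae")    # MAE matches the paper
```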

    A deep learning approach for intrusion detection in Internet of Things using bi-directional long short-term memory recurrent neural network

    The Internet of Things connects every ‘thing’ to the Internet and allows these ‘things’ to communicate with each other. IoT comprises innumerable interconnected devices of diverse complexities and trends. This fundamental nature of the IoT structure increases the number of attack targets, which might affect the sustainable growth of IoT. Thus, security becomes a crucial factor to be addressed. A novel deep learning approach is proposed in this thesis for performing real-time detection of security threats in IoT systems using a Bi-directional Long Short-Term Memory Recurrent Neural Network (BLSTM RNN). The proposed approach has been implemented with the Google TensorFlow framework and the Python programming language. To train and test the proposed approach, the UNSW-NB15 dataset has been employed, an up-to-date benchmark dataset with sequential samples and contemporary attack patterns. This thesis work employs binary classification of attack and normal patterns. The experimental results demonstrate the proficiency of the introduced model with respect to recall, precision, FAR and F1 score. The model attains over 97% detection accuracy. The test results demonstrate that the BLSTM RNN is highly effective for building an efficient intrusion detection model and offers a novel research methodology.
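
    A minimal BLSTM binary classifier consistent with this description could be written in TensorFlow/Keras as below. The 8-step window and 42 input features are assumptions (UNSW-NB15 feature counts vary with preprocessing), as are the layer sizes; the metrics mirror those reported in the abstract.

```python
import tensorflow as tf

# Sketch: bidirectional LSTM over windows of preprocessed UNSW-NB15
# records, classifying each window as attack or normal.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 42)),             # (timesteps, features)
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # attack vs. normal
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])
```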