
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential for supporting a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers understand the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios of future wireless networks.
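
    As a concrete illustration of the reinforcement-learning paradigm surveyed above, the sketch below applies a stateless (multi-armed-bandit-style) Q-learning update to a toy cognitive-radio channel-selection task. The channel count and idle probabilities are hypothetical values chosen for illustration, not taken from the article.

```python
# Stateless (bandit-style) Q-learning sketch for a toy cognitive-radio
# channel-selection task. NUM_CHANNELS and IDLE_PROB are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

NUM_CHANNELS = 4                              # assumed number of sensed channels
IDLE_PROB = np.array([0.2, 0.5, 0.7, 0.9])    # assumed per-channel idle probabilities

q = np.zeros(NUM_CHANNELS)    # value estimate (expected reward) per channel
alpha, epsilon = 0.1, 0.1     # learning rate and exploration probability

for step in range(5000):
    # Epsilon-greedy action selection: explore occasionally, otherwise exploit.
    if rng.random() < epsilon:
        action = int(rng.integers(NUM_CHANNELS))
    else:
        action = int(np.argmax(q))
    # Reward is 1 when the chosen channel is idle this slot, else 0.
    reward = float(rng.random() < IDLE_PROB[action])
    # Q-learning update (no next state in this single-state formulation).
    q[action] += alpha * (reward - q[action])

print("Learned channel values:", np.round(q, 2))   # should approach IDLE_PROB
```

    After enough iterations the learned values approach the true idle probabilities, so the greedy choice settles on the best channel; this is the kind of interactive decision making the survey attributes to reinforcement learning.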

    Machine Learning Meets Communication Networks: Current Trends and Future Challenges

    The growing network density and the unprecedented increase in network traffic, caused by the massively expanding number of connected devices and online services, require intelligent network operations. Machine Learning (ML) has been applied in this regard in different types of networks and networking technologies to meet the requirements of future communicating devices and services. In this article, we provide a detailed account of current research on the application of ML in communication networks and shed light on future research challenges. Research on the application of ML in communication networks is described across: i) the three layers, i.e., physical, access, and network layers; and ii) novel computing and networking concepts such as Multi-access Edge Computing (MEC), Software Defined Networking (SDN), and Network Functions Virtualization (NFV); a brief overview of ML-based network security is also given. Important future research challenges are identified and presented to help spur further research in key areas in this direction.

    User mobility prediction and management using machine learning

    Next-generation mobile networks (NGMNs) are envisioned to overcome current user-mobility limitations while improving network performance. Key challenges for mobility management in future mobile networks include addressing the bottlenecks caused by massive traffic growth; providing better quality of experience to end users; supporting ultra-high data rates; and ensuring ultra-low latency and seamless handovers (HOs) from one base station (BS) to another. Thus, for future networks to manage user mobility under all of these stringent requirements, artificial intelligence (AI) is expected to play a key role in automating the end-to-end process through machine learning (ML). The objective of this thesis is to explore user-mobility prediction and management use cases using ML. First, a background and literature review is presented, covering an overview of current mobile networks and ML-driven applications for user mobility and its management. Next, use cases of mobility prediction in dense mobile networks are analysed and optimised using ML algorithms. The overall framework achieved a test accuracy of 91.17% with an artificial neural network (ANN), outperforming the other mobility prediction algorithms considered. Furthermore, a mobility-prediction-based energy-consumption scheme is discussed to automate the classification of user mobility and reduce carbon emissions in smart-city transportation, achieving a best classification accuracy of 98.82% with a k-nearest neighbour (KNN) classifier, along with a 31.83% energy-savings gain. Finally, a context-aware handover (HO) skipping scenario is analysed in order to improve the overall quality of service (QoS) within a mobility-management framework for next-generation networks (NGNs). The framework relies on passenger mobility, train trajectories, travelling time and frequency, network load, and signal-ratio data in the cardinal directions, i.e., North, East, West, and South (NEWS), achieving a best result of 94.51% with a support vector machine (SVM) classifier. These results were fed into HO skipping techniques to analyse coverage probability, throughput, and HO cost. The work is extended with a blockchain-enabled privacy-preservation mechanism to provide an end-to-end secure platform throughout train passengers' mobility.
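
    To make the classification step concrete, the sketch below trains a k-nearest neighbour classifier on synthetic mobility features (speed, cell dwell time, handover count). The features, label rule, and data ranges are illustrative assumptions; they do not reproduce the thesis dataset or its reported 98.82% accuracy.

```python
# Toy k-nearest neighbour mobility-state classifier on synthetic features.
# Feature names and labels are illustrative assumptions, not the thesis data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
n = 1000

speed = rng.uniform(0, 30, n)           # user speed in m/s (assumed range)
dwell = rng.uniform(10, 600, n)         # cell dwell time in seconds (assumed range)
handovers = rng.poisson(speed / 3)      # handovers per hour, loosely tied to speed

X = np.column_stack([speed, dwell, handovers])
y = (speed > 10).astype(int)            # toy labels: 0 = pedestrian, 1 = vehicular

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print(f"Toy test accuracy: {clf.score(X_te, y_te):.3f}")
```

    The same fit/predict pattern would apply to the SVM used for the NEWS handover-skipping framework, with the classifier swapped for sklearn.svm.SVC.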

    Predicting Internet of Things Data Traffic Through LSTM and Autoregressive Spectrum Analysis

    The rapid increase of Internet of Things (IoT) applications and services has led to massive amounts of heterogeneous data. Hence, we need to rethink how IoT data influences the network. In this paper, we study the characteristics of IoT data traffic in the context of smart cities. To analyze the influence of IoT data traffic on the access and core networks, we generate various IoT data traffic according to the characteristics of different IoT applications. Based on the analysis of the inherent features of the aggregated IoT data traffic, we propose a Long Short-Term Memory (LSTM) model combined with autoregressive spectrum analysis to predict the IoT data traffic. In this model, the autoregressive spectrum analysis is used to estimate the minimum length of the historical data needed for predicting future traffic, which alleviates the LSTM's performance deterioration as the sequence length increases. A sliding window enables prediction of the long-term tendency of IoT data traffic while preserving its inherent features. The evaluation results show that the proposed model converges quickly and can predict the variations of IoT traffic more accurately than other methods and a general LSTM model.
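
    The sketch below illustrates the sliding-window LSTM forecasting idea on a synthetic traffic series. In the paper the window length is estimated by autoregressive spectrum analysis; here it is a fixed assumed value (WINDOW = 24), and the model is a minimal one-step-ahead predictor rather than the authors' implementation.

```python
# Minimal sliding-window LSTM forecaster on a synthetic traffic series.
# WINDOW is a fixed assumed value; in the paper it would be estimated
# by autoregressive spectrum analysis.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

WINDOW = 24   # assumed history length (e.g., 24 time slots)

# Synthetic aggregated-traffic series: daily periodicity plus noise.
t = torch.arange(0, 500, dtype=torch.float32)
series = torch.sin(2 * math.pi * t / 24) + 0.1 * torch.randn_like(t)

# Sliding windows: X[i] holds WINDOW past samples, y[i] is the next sample.
X = torch.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
y = series[WINDOW:]

class TrafficLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                              # x: (batch, WINDOW)
        out, _ = self.lstm(x.unsqueeze(-1))            # (batch, WINDOW, hidden)
        return self.head(out[:, -1, :]).squeeze(-1)    # one-step-ahead prediction

model = TrafficLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):                 # full-batch training for simplicity
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"Final training MSE: {loss.item():.4f}")
```

    Each training example pairs the previous WINDOW samples with the next sample, and the window then slides forward one step; this is how long-term tendencies can be tracked without feeding ever-longer sequences to the LSTM.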
