
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, in which users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML, elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling wireless-network applications, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. (46 pages, 22 figures)
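    As a concrete taste of the reinforcement-learning branch surveyed above, the sketch below casts cognitive channel selection as a stateless multi-armed bandit learned with an epsilon-greedy Q update. It is a minimal illustration only: the channel idle probabilities, learning rate and exploration rate are assumed toy values, not figures from the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy environment: 4 channels, each idle with a fixed but unknown probability.
    # The agent earns reward 1 for transmitting on an idle channel, 0 otherwise.
    idle_prob = np.array([0.2, 0.5, 0.8, 0.3])   # assumed values
    n_channels = len(idle_prob)

    q = np.zeros(n_channels)     # action-value estimate per channel
    alpha, epsilon = 0.1, 0.1    # learning rate and exploration rate

    for t in range(5000):
        # epsilon-greedy action selection
        a = rng.integers(n_channels) if rng.random() < epsilon else int(np.argmax(q))
        reward = float(rng.random() < idle_prob[a])
        # incremental update (stateless, bandit form of Q-learning)
        q[a] += alpha * (reward - q[a])

    print("Estimated channel values:", np.round(q, 2))
    print("Preferred channel:", int(np.argmax(q)))   # converges to channel 2
    ```

    The same epsilon-greedy update pattern extends to the full stateful Q-learning setting once channel occupancy is correlated over time.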

    A Review of Wireless Sensor Networks with Cognitive Radio Techniques and Applications

    The advent of Wireless Sensor Networks (WSNs) has inspired applications across many sciences and in telecommunications, and there is a growing demand for robust methodologies that can ensure an extended network lifetime. Sensor nodes are small devices that hold limited electrical energy, which must be conserved while data is relayed toward the network's destination. The main concern is therefore to carry out the routing process alongside data transfer: choosing the best route for each transmission is necessary both to reach the destination and to conserve energy. Clustering is considered an effective method for gathering data and routing it through the nodes of a WSN, the primary requirement being to extend network lifetime by minimizing energy consumption. Further, integrating cognitive radio techniques into sensor networks, so that nodes can make smart choices based on knowledge acquisition, reasoning, and information sharing, may support the network's overall goals amid numerous constraints and competing optimization targets. This review focuses on routing and clustering using metaheuristic techniques and machine learning, because these operations have a decisive impact on the lifetime of cognitive radio wireless sensor nodes.
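    The review above does not commit to one clustering protocol, so as an illustrative baseline the sketch below implements the classic LEACH probabilistic cluster-head election rule, T(n) = p / (1 - p * (r mod 1/p)); the node count and head fraction p are assumed values.

    ```python
    import random

    random.seed(1)

    def leach_cluster_heads(nodes, p=0.05, round_no=0, been_head=None):
        """LEACH-style cluster-head election for one round.

        nodes: node ids; p: desired fraction of heads per round;
        been_head: ids that already served as head in the current epoch
        (an epoch lasts 1/p rounds, after which the set is reset).
        """
        been_head = been_head or set()
        # Eligible nodes volunteer with probability T(n).
        t = p / (1 - p * (round_no % int(1 / p)))
        return [n for n in nodes if n not in been_head and random.random() < t]

    nodes = list(range(100))
    print("Cluster heads this round:", leach_cluster_heads(nodes, round_no=3))
    ```

    Metaheuristic variants typically replace the random volunteering step with a fitness score (residual energy, distance to the sink), but the round structure stays the same.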

    Optimal Resource Allocation Using Deep Learning-Based Adaptive Compression for mHealth Applications

    In recent years the number of patients with chronic diseases that require constant monitoring has increased rapidly, which motivates researchers to develop scalable remote health applications. Nevertheless, transmitting big real-time data over a dynamic network constrained by bandwidth, end-to-end delay and transmission energy is an obstacle to efficient delivery of the data. The problem can be addressed by applying data reduction techniques to the vital signs at the transmitter side and reconstructing the data at the receiver side (i.e., the m-Health center). However, this introduces a new problem: receiving the vital signs at the server side with an acceptable distortion rate (i.e., deformation of the vital signs caused by overly aggressive data reduction). In this thesis, we integrate efficient data reduction with wireless networking to deliver adaptive compression with acceptable distortion while reacting to wireless network dynamics such as channel fading and user mobility. A Deep Learning (DL) approach is used to implement an adaptive compression technique that compresses and reconstructs the vital signs in general, and the electroencephalogram (EEG) signal in particular, with minimum distortion. A resource allocation framework is then introduced to minimize the transmission energy along with the distortion of the reconstructed signal.
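    As a minimal sketch of the deep-learning compression step described above, the code below trains a small fully connected autoencoder on synthetic sine-plus-noise segments standing in for windowed EEG, then reports the percentage RMS difference (PRD), a common biosignal distortion metric. The architecture, the 8:1 compression ratio and the training settings are illustrative assumptions, not the thesis design.

    ```python
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Synthetic stand-in for windowed EEG: 512 segments of 256 samples each
    # (sine plus noise; the real thesis works on recorded EEG).
    x = torch.sin(torch.linspace(0, 50, 256)).repeat(512, 1)
    x = x + 0.05 * torch.randn_like(x)

    class AutoEncoder(nn.Module):
        def __init__(self, n=256, k=32):   # k/n gives the assumed 8:1 ratio
            super().__init__()
            self.enc = nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, k))
            self.dec = nn.Sequential(nn.Linear(k, 64), nn.ReLU(), nn.Linear(64, n))
        def forward(self, x):
            return self.dec(self.enc(x))

    model = AutoEncoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for epoch in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), x)   # reconstruction distortion
        loss.backward()
        opt.step()

    # Percentage RMS difference (PRD): lower means less distortion.
    with torch.no_grad():
        prd = 100 * torch.norm(x - model(x)) / torch.norm(x)
    print(f"PRD after training: {prd.item():.2f}%")
    ```

    An adaptive scheme would then pick the bottleneck size k, and hence the rate and energy operating point, per transmission window according to channel state, which is roughly where the resource allocation framework enters.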