Machine Learning in Wireless Sensor Networks: Algorithms, Strategies, and Applications
Wireless sensor networks monitor dynamic environments that change rapidly
over time. This dynamic behavior is either caused by external factors or
initiated by the system designers themselves. To adapt to such conditions,
sensor networks often adopt machine learning techniques to eliminate the need
for unnecessary redesign. Machine learning also inspires many practical
solutions that maximize resource utilization and prolong the lifespan of the
network. In this paper, we present an extensive literature review over the
period 2002-2013 of machine learning methods that were used to address common
issues in wireless sensor networks (WSNs). The advantages and disadvantages of
each proposed algorithm are evaluated against the corresponding problem. We
also provide a comparative guide to aid WSN designers in developing suitable
machine learning solutions for their specific application challenges.
Comment: Accepted for publication in IEEE Communications Surveys and Tutorials
A survey of machine learning techniques applied to self organizing cellular networks
In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome current limitations and address the issues of today's cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self-Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each surveyed paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also outlines future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.
Reinforcement Learning in Self Organizing Cellular Networks
Self-organization is a key feature as cellular networks densify and become more heterogeneous through additional small cells such as picocells and femtocells. Self-organizing networks (SONs) can perform self-configuration, self-optimization, and self-healing. These operations can cover basic tasks such as the configuration of a newly installed base station, resource management, and fault management in the network. In other words, SONs attempt to minimize human intervention by using measurements from the network to minimize the cost of installation, configuration, and maintenance. In fact, SONs aim to bring two main factors into play: intelligence and autonomous adaptability. One of the main requirements for achieving these goals is to learn from sensory data and signal measurements in networks. Therefore, machine learning techniques can play a major role in processing underutilized sensory data to enhance the performance of SONs.
In the first part of this dissertation, we focus on reinforcement learning as a viable approach for learning from signal measurements. We develop a general framework for heterogeneous cellular networks that is agnostic to the learning approach. We design multiple reward functions and study the effects of the reward function, the Markov state model, the learning rate, and cooperation methods on the performance of reinforcement learning in cellular networks. Further, we look into the optimality of reinforcement learning solutions and provide insights into how to achieve optimal solutions.
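As a rough illustration of the reinforcement learning framing described above, a tabular Q-learning loop for a power control agent might look like the following sketch. The state space, action set, reward, and hyperparameters are illustrative assumptions, not the dissertation's actual design.

```python
import random

# Minimal tabular Q-learning sketch for a SON power control agent.
# States, actions, and hyperparameters are illustrative assumptions.

STATES = range(5)        # e.g. quantized SINR levels (assumed)
ACTIONS = [-1, 0, +1]    # decrease / hold / increase transmit power (dB step)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def choose_action(state):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """One Q-learning step: Q <- Q + alpha*(r + gamma*max_a' Q' - Q)."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

In practice the reward would encode a network objective (e.g. a SINR target), and the studied design choices above (reward shape, state model, learning rate, cooperation) would all plug into this same loop.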
In the second part of the dissertation, we propose a novel architecture based on spatial indexing for system-level evaluation of heterogeneous 5G cellular networks. We develop an open-source platform based on the proposed architecture that can be used to study large-scale directional cellular networks. The proposed platform is used to generate training data sets of accurate signal-to-interference-plus-noise-ratio (SINR) values in millimeter-wave communications for machine learning purposes. Then, taking advantage of the developed platform, we look into dense millimeter-wave networks as one of the key technologies in 5G cellular networks. We focus on topology management of millimeter-wave backhaul networks and provide multiple insights on the evaluation and selection of proper performance metrics in dense millimeter-wave networks. Finally, we conclude this part by proposing a self-organizing solution that achieves k-connectivity via reinforcement learning in the topology management of wireless networks.
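The spatial-indexing idea can be sketched with a simple uniform grid index: interference and SINR computations only need the transmitters within some radius, so the index limits each query to nearby cells instead of scanning every base station. The cell size and API below are assumptions for illustration; the dissertation's platform may well use a different index structure.

```python
from collections import defaultdict
from math import hypot

# Grid-based spatial index sketch for range queries over base stations.
# Cell size and method names are illustrative assumptions.

class GridIndex:
    def __init__(self, cell=100.0):
        self.cell = cell
        self.buckets = defaultdict(list)   # (i, j) -> [(id, x, y), ...]

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, bs_id, x, y):
        self.buckets[self._key(x, y)].append((bs_id, x, y))

    def neighbors(self, x, y, radius):
        """Return base stations within `radius`, scanning only nearby cells."""
        r = int(radius // self.cell) + 1
        cx, cy = self._key(x, y)
        found = []
        for i in range(cx - r, cx + r + 1):
            for j in range(cy - r, cy + r + 1):
                for bs_id, bx, by in self.buckets[(i, j)]:
                    if hypot(bx - x, by - y) <= radius:
                        found.append(bs_id)
        return found
```

With such an index, computing SINR for one user touches only the interferers returned by `neighbors`, which is what makes system-level evaluation of large directional networks tractable.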
Traffic engineering multi-layer optimization for wireless mesh network transmission: a campus network routing protocol transmission performance enhancement
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.
The wireless mesh network is a promising network for the future due to its excellent inherent characteristics of dynamic self-healing, self-configuration and self-organization. It also has the advantage of easy interoperable networking and the ability to form multi-linked ad-hoc networks. It has a decentralized topology, is cheap and highly scalable; its ease of deployment and maintenance are further inherent networking qualities. These qualities bring advantages to the transmission capability of heterogeneous networks. However, transmission in wireless mesh networks creates comparative performance challenges such as congestion, load-balancing, scalability over growing networks and coverage capacity. Consequently, these challenges and problems in the routing and switching of packets in wireless mesh network routing protocols led to a proposal to resolve these failures with a combination algorithm and a management-based security scheme for the network and its transmitted packets. There are equally contentious services, such as network reliability and quality of service for real-time multimedia traffic flows, alongside further challenges such as path computation and selection in the wireless mesh network.
This thesis is therefore a cumulative proposal for the resolution of the outlined challenges and open research areas posed by the wireless mesh network routing protocol. It advances the resolution of these challenges in the mesh environment using hybrid optimization with traffic engineering, to increase the effectiveness and reliability of the network. It also offers a cumulative resolution of the diverse contributions on wireless mesh network routing protocols and transmission. Adaptation and optimization are carried out on the designed wireless mesh network using traffic engineering mechanisms and techniques. The research examines the patterns of mesh packet transmission and evaluates the challenges and failures in mesh network packet transmission. It develops a solution-based algorithm and proposes the traffic engineering based solution. The resulting performance and analysis are tested and compared over wireless mesh IEEE 802.11n and other older documented solutions.
This thesis used a carefully designed campus mesh network for a comparative evaluation of the optimal performance of the mesh nodes and routers over a normal IEEE 802.11n based wireless domain network, showing the differentiation achieved by optimization using the created algorithms. Furthermore, performance indexes are used as the metrics to measure utility and reliability, including capacity and throughput at the destination during traffic-engineered transmission. In addition, the security of the transmitted data and packets is optimized under a traffic-engineered technique. Finally, this thesis offers an understanding of the security contribution, using traffic engineering to create a management algorithm for processing and computing the security needs of wireless mesh networks. The results of this thesis confirmed, completed and extended the existing predictions with real measurements.
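The path computation and selection challenge mentioned above can be illustrated with a traffic-engineered shortest-path search: instead of hop count alone, each link is costed by a composite metric that penalizes loaded links. The cost function below is a hypothetical example, not the thesis's actual algorithm.

```python
import heapq

# Illustrative traffic-engineered path selection for a mesh network:
# Dijkstra over a composite link cost combining delay and load.
# The cost weighting is a hypothetical example.

def te_cost(link):
    delay, load = link            # delay in ms, utilization in [0, 1)
    return delay / (1.0 - load)   # loaded links cost progressively more

def best_path(graph, src, dst):
    """graph: {node: {neighbor: (delay_ms, load)}}. Returns (cost, path)."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, link in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(pq, (cost + te_cost(link), nbr, path + [nbr]))
    return float("inf"), []
```

A cost of this shape makes the route computation load-balancing by construction: as a link fills up, its effective cost grows without bound and traffic shifts to alternative paths.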
Machine Learning for Unmanned Aerial System (UAS) Networking
Fueled by the advancement of 5G new radio (5G NR), rapid development has occurred in many fields. Compared with conventional approaches, beamforming and network slicing enable 5G NR to achieve a tenfold improvement in latency, connection density, and experienced throughput over 4G long term evolution (4G LTE). These advantages pave the way for the evolution of Cyber-Physical Systems (CPS) on a large scale. Reductions in power consumption, advances in control engineering, and the simplification of the Unmanned Aircraft System (UAS) make large-scale UAS networking deployment feasible. UAS networks can carry out multiple complex missions simultaneously. However, the limitations of conventional approaches still pose a major challenge in trading off massive management against efficient networking on a large scale.
With 5G NR and machine learning, the contributions of this dissertation can be summarized as follows. I proposed a novel Optimized Ad-hoc On-demand Distance Vector (OAODV) routing protocol to improve the throughput of intra-UAS networking; the protocol reduces system overhead while remaining efficient. To improve security, I proposed a blockchain scheme to mitigate malicious base stations for cellular-connected UAS networking, and a proof-of-traffic (PoT) scheme to improve the efficiency of blockchain for UAS networking on a large scale. Inspired by the biological cell paradigm, I proposed cell wall routing protocols for heterogeneous UAS networking. With 5G NR, the interconnections between UAS networks can strengthen the throughput and elasticity of UAS networking. With machine learning, the routing schedules for intra- and inter-UAS networking can enhance throughput on a large scale. Inter-UAS networking can achieve globally max-min throughput via edge coloring, and I leveraged upper and lower bounds to accelerate the edge coloring optimization.
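The edge coloring view of inter-UAS scheduling can be sketched as follows: each link is an edge, each color is a transmission slot, and links sharing a UAS must get different colors. The greedy pass below uses at most max-degree + 1 colors (the classical bound); the dissertation's tightened upper and lower bounds for max-min throughput are not reproduced here.

```python
# Greedy edge coloring sketch for inter-UAS link scheduling.
# Links incident to the same node (UAS) receive different colors (slots).

def greedy_edge_coloring(edges):
    """edges: list of (u, v) pairs. Returns {edge: color} with
    no two edges sharing a node getting the same color."""
    incident = {}                 # node -> set of colors already used there
    coloring = {}
    for u, v in edges:
        used = incident.setdefault(u, set()) | incident.setdefault(v, set())
        color = 0
        while color in used:      # smallest color free at both endpoints
            color += 1
        coloring[(u, v)] = color
        incident[u].add(color)
        incident[v].add(color)
    return coloring
```

The number of distinct colors used corresponds to the number of slots needed; bounding that number from above and below is what allows the scheduling optimization to be accelerated.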
This dissertation paves the way for UAS networking at the intersection of CPS and machine learning. UAS networking can achieve outstanding performance in a decentralized architecture, and this dissertation gives insights into UAS networking on a large scale. These insights are fundamental to integrating UAS into the National Airspace System (NAS), which is critical to aviation in both the manned and unmanned fields. The dissertation provides novel approaches for promoting UAS networking on a large scale; the proposed approaches extend the state of the art of UAS networking in a decentralized architecture. All of these contributions support the establishment of UAS networking with CPS.
Capacity Enhancement Approaches for Long Term Evolution networks: Capacity Enhancement-Inspired Self-Organized Networking to Enhance Capacity and Fairness of Traffic in Long Term Evolution Networks by Utilising Dynamic Mobile Base-Stations
The long-term evolution (LTE) network has been proposed to provide better network capacity than the earlier 3G network. Driven by the market, the conventional LTE network standard could not meet the expectations of the international mobile telecommunications advanced (IMT-Advanced) standard. To bridge this gap, LTE-Advanced was introduced with additional network functionalities to meet the IMT-Advanced standard. In addition, due to the need to minimize operational expenditure (OPEX) and reduce human intervention, wireless cellular networks are required to be self-aware, self-reconfigurable, self-adaptive and smart. An example of such a network involves base transceiver stations (BTSs) within a self-organizing network (SON).
Despite these breakthroughs, the conventional LTE and LTE-Advanced networks have not been designed with the intelligence for scalable capacity, especially under sudden demographic changes, such as football events, gatherings at malls and worship centres, or religious and cultural festivals. Since most of these events cannot be predicted, modern cellular networks must be scalable in terms of capacity and coverage during such unpredictable demographic surges. Thus, the use of dynamic BTSs is proposed for modern and future cellular networks for crowd and demographic change management.
Dynamic BTSs complement the capability of SONs to search for, determine and deploy less crowded or idle BTSs to densely crowded cells for scalable capacity management. The mobile BTSs discover areas of dark coverage and fill the gap by providing cellular services. The proposed network relieves the LTE network of overloading, thus reducing packet loss and delay and improving fair load sharing.
In order to find the best (least-cost) path, a bio-inspired optimization algorithm based on swarm-particle optimization is proposed over the dynamic BTS network. It uses the ant-colony optimization algorithm (ACOA) to find the least-cost path. A comparison between an optimized path and the un-optimized path showed substantial gains in terms of delay, fair load sharing and the percentage of packet loss.
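The ant-colony idea can be sketched in a few lines: ants walk the graph probabilistically, favoring edges with high pheromone and low cost, and the best path found so far is reinforced while all pheromone evaporates. The parameter values and graph structure below are illustrative assumptions, not the proposed ACOA's actual configuration.

```python
import random

# Minimal ant-colony optimization (ACO) sketch for least-cost path
# selection over a dynamic-BTS graph. Parameters are illustrative.

def aco_shortest_path(graph, src, dst, n_ants=20, n_iter=30,
                      evaporation=0.5, seed=0):
    """graph: {node: {neighbor: cost}}. Returns best (cost, path) found."""
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best = (float("inf"), [])
    for _ in range(n_iter):
        for _ in range(n_ants):
            node, path, cost = src, [src], 0.0
            while node != dst:
                # weight = pheromone / cost; avoid revisiting nodes
                choices = [(v, pheromone[(node, v)] / graph[node][v])
                           for v in graph[node] if v not in path]
                if not choices:
                    break                       # dead end: abandon this ant
                r = rng.random() * sum(w for _, w in choices)
                for v, w in choices:            # roulette-wheel selection
                    r -= w
                    if r <= 0:
                        break
                cost += graph[node][v]
                node = v
                path.append(v)
            if node == dst and cost < best[0]:
                best = (cost, path)
        for e in pheromone:                     # evaporate everywhere
            pheromone[e] *= evaporation
        for u, v in zip(best[1], best[1][1:]):  # reinforce the best path
            pheromone[(u, v)] += 1.0 / max(best[0], 1e-9)
    return best
```

Evaporation prevents early paths from locking in, which is what lets the colony re-route when the dynamic BTS topology changes between runs.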
Improving next-generation wireless network performance and reliability with deep learning
A rudimentary question is often raised: can machine learning in general, or deep learning in particular, add anything to the well-established field of wireless communications, which has been evolving for close to a century? While deep learning based methods are likely to help build intelligent wireless solutions, their use becomes particularly challenging in the lower layers of the wireless communication stack. The introduction of the fifth generation of wireless communications (5G) has triggered the demand for "network intelligence" to support its promises of very high data rates and extremely low latency. Consequently, 5G wireless operators face the challenges of network complexity, diversification of services, and personalized user experience. Industry standards have created enablers (such as the network data analytics function), but these enablers focus on post-mortem analysis at higher stack layers and operate with a periodicity on the time scale of seconds (or larger). The goal of this dissertation is to show a solution to these challenges and how a data-driven approach using deep learning could add to the field of wireless communications. In particular, I propose intelligent predictive and prescriptive abilities to boost reliability and eliminate performance bottlenecks in 5G cellular networks and beyond, show contributions that justify the value of deep learning in wireless communications across several different layers, and offer in-depth analysis and comparisons with baselines and industry standards. First, to improve multi-antenna network reliability against wireless impairments with power control and interference coordination for both packetized voice and beamformed data bearers, I propose a joint beamforming, power control, and interference coordination algorithm based on deep reinforcement learning.
This algorithm uses a string of bits and logic operations to enable simultaneous actions to be performed by the reinforcement learning agent, and a joint reward function is proposed accordingly. I compare the performance of my proposed algorithm with the brute force approach and show that similar performance is achievable but with faster run-time as the number of transmit antennas increases. Second, to enhance the performance of coordinated multipoint, I propose using deep learning binary classification to learn a surrogate function that triggers a second transmission stream, instead of depending on the popular signal-to-interference-plus-noise measurement quantity. This surrogate function improves the users' sum-rate by focusing on the pre-logarithmic terms in the sum-rate formula, which have the larger impact on this rate. Third, the performance of band switching can be improved without the need for full channel estimation. My proposal to use deep learning to classify the quality of two frequency bands prior to granting a band switch leads to a significant improvement in users' throughput, due to the elimination of the industry-standard measurement gap requirement: a period of silence in which no data is sent to the users so that they can measure the frequency bands before switching. In this dissertation, a group of algorithms for downlink wireless network performance and reliability is proposed. My results show that introducing user coordinates enhances the accuracy of the predictions made with deep learning. Also, the choice of signal-to-interference-plus-noise ratio as the optimization objective may not always be the best way to improve user throughput rates. Further, exploiting the spatial correlation of channels in different frequency bands can improve certain network procedures without perfect knowledge of the per-band channel state information.
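The band switching idea above reduces to a binary classifier: predict from features observable on the serving band whether the target band will be good, and grant the switch without a measurement gap when the prediction is positive. As a hedged sketch, a tiny logistic model stands in for the deep classifier, and the single feature below is an illustrative assumption (the dissertation's model uses richer inputs such as user coordinates).

```python
import math

# Sketch of gap-free band switching as binary classification.
# A from-scratch logistic model stands in for the deep classifier.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=200):
    """samples: list of feature vectors; labels: 0/1. Returns weights,
    trained by plain per-sample gradient descent."""
    w = [0.0] * (len(samples[0]) + 1)      # last entry is the bias
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])
            g = p - y                       # gradient of the log loss
            for i, xi in enumerate(x):
                w[i] -= lr * g * xi
            w[-1] -= lr * g
    return w

def predict(w, x):
    """1 = grant the band switch without a measurement gap."""
    return int(sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1]) >= 0.5)
```

The payoff of this formulation is exactly the one stated above: the classifier replaces the measurement gap, so no silent period is needed before switching.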
Hence, an understanding of these results helps develop novel solutions for enhancing these wireless networks at a much smaller time scale than today's industry standards.
Interference Aware Cognitive Femtocell Networks
Femtocell Access Points (FAPs) are low-power, plug-and-play home base stations designed to extend cellular radio range in indoor environments where macrocell coverage is generally poor. They offer significant increases in data rates over a short range, enabling high-speed wireless and mobile broadband services, with the femtocell network overlaid onto the macrocell in a dual-tier arrangement. In contrast to conventional cellular systems, which are well planned, FAPs are arbitrarily installed by end users, and this can create harmful interference to both collocated femtocell and macrocell users. The interference becomes particularly serious in high-FAP-density scenarios and compromises the ensuing data rate. Consequently, effective management of both cross-tier and co-tier interference is a major design challenge in dual-tier networks.
Since traditional radio resource management techniques and architectures for single-tier systems are either not applicable or operate inefficiently, innovative dual-tier approaches to intelligently manage interference are required. This thesis presents a number of original contributions to fulfill this objective, including a new hybrid cross-tier spectrum sharing model that builds upon an existing fractional frequency reuse technique to ensure minimal impact on the macro-tier resource allocation. A new flexible and adaptive virtual clustering framework is then formulated to alleviate co-tier interference in high-FAP-density situations, and finally, an intelligent coverage extension algorithm is developed to mitigate excessive femto-macrocell handovers while upholding the required quality of service provision.
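The fractional frequency reuse (FFR) technique underlying the hybrid sharing model can be sketched as a band partition: cell-centre macro users reuse the full band, cell-edge users in each sector get one protected subband, and femtocells avoid their sector's macro edge subband. The thresholds and band labels below are illustrative assumptions, not the thesis's actual scheme.

```python
# Sketch of FFR-style cross-tier band partitioning.
# Threshold and subband labels are illustrative assumptions.

CENTRE_THRESHOLD = 300.0   # metres from the macro base station (assumed)
EDGE_SUBBANDS = ["B1", "B2", "B3"]

def macro_subband(sector, distance):
    """Full band in the cell centre; one protected edge subband per sector."""
    if distance < CENTRE_THRESHOLD:
        return ["B1", "B2", "B3"]          # full reuse near the centre
    return [EDGE_SUBBANDS[sector % 3]]     # protected edge subband

def femto_subbands(sector):
    """Femtocells avoid the macro edge subband of their own sector,
    so they never collide with the protected edge users."""
    edge = EDGE_SUBBANDS[sector % 3]
    return [b for b in EDGE_SUBBANDS if b != edge]
```

The cross-tier property falls out of the partition: a femtocell can never transmit on the subband its sector reserves for macro edge users, which is what keeps the impact on macro-tier resource allocation minimal.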
This thesis contends that to exploit the undoubted potential of dual-tier, macro-femtocell architectures, an interference-aware solution is necessary. Rigorous evidence confirms that noteworthy improvements in received signal quality and throughput can be achieved by applying cognitive methods to manage interference.