
    Adaptive stochastic radio access selection scheme for cellular-WLAN heterogeneous communication systems

    This study proposes a novel adaptive stochastic radio access selection scheme for mobile users in heterogeneous cellular-wireless local area network (WLAN) systems. In this scheme, a mobile user located in the dual-coverage area randomly selects the WLAN with probability ω whenever it needs to download a chunk of data. The value of ω is optimised according to the network load and signal quality of both the cellular and WLAN networks. An analytical model based on a continuous-time Markov chain is proposed to optimise ω and to evaluate the performance of the proposed scheme in terms of energy efficiency, throughput, and call blocking probability. Both analytical and simulation results demonstrate the superiority of the proposed scheme over mainstream network selection schemes, namely WLAN-first and load balancing.
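
    As a minimal sketch of the selection rule described above, the snippet below draws the WLAN-versus-cellular decision with probability ω; the mapping from network state to ω is a simple illustrative heuristic, not the paper's CTMC-based optimisation.

```python
import random


def choose_access_network(omega: float) -> str:
    """Stochastic selection: pick the WLAN with probability omega,
    otherwise stay on the cellular network."""
    return "WLAN" if random.random() < omega else "cellular"


def heuristic_omega(wlan_load: float, cell_load: float,
                    wlan_snr_db: float, cell_snr_db: float) -> float:
    """Illustrative mapping from network state to omega (NOT the paper's
    CTMC-optimised value): favour the WLAN when it is lightly loaded and
    its signal quality is good relative to the cellular link."""
    load_term = max(0.0, cell_load - wlan_load)                     # in [0, 1]
    snr_term = 1.0 / (1.0 + 10 ** ((cell_snr_db - wlan_snr_db) / 10))
    return min(1.0, max(0.0, 0.5 * load_term + 0.5 * snr_term))


# Example: lightly loaded WLAN with a strong signal -> omega close to 1.
omega = heuristic_omega(wlan_load=0.2, cell_load=0.7,
                        wlan_snr_db=25.0, cell_snr_db=15.0)
print(round(omega, 2), choose_access_network(omega))
```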

    4G and Beyond - Exploiting Heterogeneity in Mobile Networks

    Optimization of Mobility Parameters using Fuzzy Logic and Reinforcement Learning in Self-Organizing Networks

    In this thesis, several optimization techniques for next-generation wireless networks are proposed to solve different problems in the field of Self-Organizing Networks and heterogeneous networks. The common basis of these problems is that network parameters are automatically tuned to deal with the specific problem. As the set of network parameters is extremely large, this work mainly focuses on parameters involved in mobility management. The proposed self-tuning schemes are based on Fuzzy Logic Controllers (FLC), whose potential lies in the capability to express knowledge in a way similar to human perception and reasoning. In those cases in which a mathematical approach has been required to optimize the behavior of the FLC, the selected solution has been Reinforcement Learning, since this methodology is especially appropriate for learning from interaction, which is essential in complex systems such as wireless networks. Taking this into account, firstly, a new Mobility Load Balancing (MLB) scheme is proposed to solve persistent congestion problems in next-generation wireless networks, in particular those due to an uneven spatial traffic distribution, which typically leads to an inefficient usage of resources. A key feature of the proposed algorithm is that not only the parameters but also the parameter-tuning strategy are optimized. Secondly, a novel MLB algorithm for enterprise femtocell scenarios is proposed. Such scenarios are characterized by the lack of a thorough deployment of these low-cost nodes, meaning that a more efficient use of radio resources can be achieved by applying effective MLB schemes. As in the previous problem, the optimization of the self-tuning process is also studied in this case. Thirdly, a new self-tuning algorithm for Mobility Robustness Optimization (MRO) is proposed. This study includes the impact of context factors such as system load and user speed, as well as a proposal for coordination between the designed MLB and MRO functions. Fourthly, a novel self-tuning algorithm for Traffic Steering (TS) in heterogeneous networks is proposed. The main features of the proposed algorithm are its flexibility to support different operator policies and its capability to adapt to network variations. Finally, with the aim of validating the proposed techniques, a dynamic system-level simulator for Long-Term Evolution (LTE) networks has been designed.
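
    The kind of self-tuning loop the thesis builds from an FLC corrected by Reinforcement Learning can be illustrated with a minimal Q-learning sketch; the state labels, the 1 dB action step, and the reward below are placeholder assumptions, not the thesis' actual design.

```python
import random
from collections import defaultdict

# Epsilon-greedy Q-learning for tuning a handover margin (in dB).
# States, actions, and the reward below are placeholders; in the thesis the
# inputs and outputs come from an FLC and a dynamic LTE system-level simulator.

ACTIONS = (-1.0, 0.0, +1.0)           # decrease, keep, or increase the margin
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q_table = defaultdict(float)          # (state, action) -> estimated value


def select_action(state: str) -> float:
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table[(state, a)])


def update(state: str, action: float, reward: float, next_state: str) -> None:
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    q_table[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                         - q_table[(state, action)])


# One tuning step: the state could be a discretised cell-load imbalance and
# the reward a measured drop in call blocking ratio.
action = select_action("high_imbalance")
update("high_imbalance", action, reward=0.3, next_state="medium_imbalance")
```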

    Distributed Cognitive RAT Selection in 5G Heterogeneous Networks: A Machine Learning Approach

    The leading role of the HetNet (Heterogeneous Networks) strategy as the key Radio Access Network (RAN) architecture for future 5G networks poses serious challenges to the current cell selection mechanisms used in cellular networks. The max-SINR algorithm, although historically effective for performing the most essential networking function of wireless networks, is inefficient at best and obsolete at worst in 5G HetNets. The foreseen embarrassment of riches and the diversified propagation characteristics of network attachment points spanning multiple Radio Access Technologies (RATs) require novel and creative context-aware system designs. The association and routing decisions, in the context of single-RAT or multi-RAT connections, need to be optimized to efficiently exploit the benefits of the architecture. However, the high computational complexity required for multi-parametric optimization of utility functions, the difficulty of modeling and solving Markov Decision Processes, the lack of stability guarantees for Game Theory algorithms, and the rigidness of simpler methods such as Cell Range Expansion and operator policies managed by the Access Network Discovery and Selection Function (ANDSF) make none of these state-of-the-art approaches a clear favorite. This thesis proposes a framework that relies on Machine Learning techniques at the terminal-device level for Cognitive RAT Selection. The use of cognition allows the terminal device to learn both a multi-parametric state model and effective decision policies, based on the experience of the device itself. This implies that a terminal, after observing its environment during a learning period, may formulate a system characterization and optimize its own association decisions without any external intervention. In our proposal, this is achieved through clustering of appropriately defined feature vectors to build a system state model, supervised classification to obtain the current system state, and reinforcement learning to learn good policies. This thesis describes the above framework in detail and recommends adaptations based on experimentation with the X-means, k-Nearest Neighbors, and Q-learning algorithms, the building blocks of the solution. The network performance of the proposed framework is evaluated in a multi-agent environment implemented in MATLAB, where it is compared with alternative RAT selection mechanisms.
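
    A minimal sketch of the building blocks named above: a state model learned offline by clustering, k-NN classification of the current state, and a Q-table for the association policy. The feature vectors, RAT names, and parameters are illustrative assumptions rather than the thesis' exact configuration.

```python
import math
import random
from collections import Counter, defaultdict

# Sketch of the cognitive RAT-selection pipeline: a state model learned
# offline (e.g. by X-means clustering of observed feature vectors), k-NN
# classification of the current state, and an epsilon-greedy policy over
# learned Q-values. Features such as (SINR, load, speed) are assumptions.

RATS = ("LTE_macro", "LTE_small", "WiFi")
q_table = defaultdict(float)          # (state, rat) -> learned value


def knn_state(sample, labelled_states, k=3):
    """Classify the current feature vector into a previously learned state."""
    nearest = sorted(labelled_states,
                     key=lambda item: math.dist(item[0], sample))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]


def choose_rat(state, epsilon=0.1):
    """Epsilon-greedy association decision based on the learned Q-values."""
    if random.random() < epsilon:
        return random.choice(RATS)
    return max(RATS, key=lambda rat: q_table[(state, rat)])


# labelled_states would come from the clustering stage; here they are mocked.
labelled_states = [((20.0, 0.3, 1.0), "s0"), ((18.0, 0.4, 2.0), "s0"),
                   ((5.0, 0.8, 15.0), "s1"), ((4.0, 0.9, 20.0), "s1")]
state = knn_state((19.0, 0.35, 1.5), labelled_states)
print(state, choose_rat(state))
```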

    Access network selection schemes for multiple calls in next generation wireless networks

    There is an increasing demand for internet services by mobile subscribers over wireless access networks, which have limited radio resources and capacity constraints. A viable solution to this capacity crunch is the deployment of heterogeneous networks. However, in this wireless environment, the choice of the most appropriate Radio Access Technology (RAT) that can sustain or meet the quality of service (QoS) requirements of users' applications requires careful planning and cost-efficient radio resource management methods. Previous research on access network selection has focused on selecting a suitable RAT for a user's single call request. With the present demand for multiple calls over wireless access networks, where each call has different QoS requirements and the available networks exhibit dynamic channel conditions, the choice of a suitable RAT capable of providing the "Always Best Connected" (ABC) experience for the user becomes a challenge. In this thesis, the problem of selecting a suitable RAT capable of meeting the QoS requirements of multiple call requests by mobile users in access networks is investigated. In addressing this problem, we propose the use of COmplex PRoportional ASsessment (COPRAS) and consensus-based Multi-Attribute Group Decision Making (MAGDM) techniques as novel and viable RAT selection methods for grouped multiple calls. The performance of the proposed COPRAS multi-attribute decision-making approach to RAT selection for a grouped call has been evaluated through simulations in different network scenarios. The results show that the COPRAS method, which is simple and flexible, is more efficient in selecting an appropriate RAT for grouped multiple calls. The COPRAS method reduces handoff frequency and is computationally inexpensive compared with other methods such as the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), Simple Additive Weighting (SAW), and Multiplicative Exponent Weighting (MEW). The application of the proposed consensus-based algorithm to the selection of a suitable RAT for grouped multiple calls, comprising voice, video-streaming, and file-downloading, has also been investigated. This algorithm aggregates the QoS requirements of the individual applications into a collective QoS for the grouped calls. This novel approach to RAT selection for a grouped call measures and compares the consensus degree of the collective solution and the individual solutions against a predefined threshold value. Using the methods of coincidence among preferences and coincidence among solutions with a predefined consensus threshold of 0.9, we evaluated the performance of the consensus-based RAT selection scheme through simulations under different network scenarios. The results show that both methods of coincidence are capable of selecting the most suitable RAT for a group of multiple calls. However, the method of coincidence among solutions achieves better accuracy, is less complex, and requires fewer iterations to reach the predefined consensus threshold. A utility-based RAT selection method for parallel traffic-streaming in an overlapped heterogeneous wireless network has also been developed. The RAT selection method was modeled with constraints on terminal battery power, service cost, and network congestion to select a specified number of RATs that optimizes the terminal interface utility.
The results show an optimal RAT selection strategy that maximizes the terminal utility and selects the best RAT combinations for the user's parallel streaming of voice, video, and file-download traffic.
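
    For illustration, the standard COPRAS ranking procedure used above can be sketched as follows; the candidate RATs, criteria, and weights are assumed values, not the thesis' evaluation setup.

```python
# Minimal COPRAS (COmplex PRoportional ASsessment) scoring sketch for ranking
# candidate RATs for a grouped call. Criteria values, weights, and the
# benefit/cost split are illustrative assumptions.

RATS = ["LTE", "WiMAX", "WLAN"]
# Columns: throughput (benefit), delay (cost), cost per MB (cost)
matrix = [
    [50.0, 30.0, 0.8],   # LTE
    [35.0, 60.0, 0.5],   # WiMAX
    [20.0, 90.0, 0.1],   # WLAN
]
weights = [0.5, 0.3, 0.2]
benefit = [True, False, False]

# 1) Sum-normalise each criterion column and apply the weights.
col_sums = [sum(row[j] for row in matrix) for j in range(len(weights))]
weighted = [[weights[j] * row[j] / col_sums[j] for j in range(len(weights))]
            for row in matrix]

# 2) Split each alternative's score into benefit (S+) and cost (S-) parts.
s_plus = [sum(v for v, b in zip(row, benefit) if b) for row in weighted]
s_minus = [sum(v for v, b in zip(row, benefit) if not b) for row in weighted]

# 3) Relative significance: Q_i = S+_i + sum(S-) / (S-_i * sum(1 / S-)).
sum_minus = sum(s_minus)
sum_inv_minus = sum(1.0 / s for s in s_minus)
q = [sp + sum_minus / (sm * sum_inv_minus) for sp, sm in zip(s_plus, s_minus)]

# 4) Rank: the RAT with the highest Q is selected for the grouped call.
print(sorted(zip(RATS, q), key=lambda item: item[1], reverse=True))
```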

    An Innovative RAN Architecture for Emerging Heterogeneous Networks: The Road to the 5G Era

    The global demand for mobile-broadband data services has experienced phenomenal growth over the last few years, driven by the rapid proliferation of smart devices such as smartphones and tablets. This growth is expected to continue unabated, as mobile data traffic is predicted to grow anywhere from 20 to 50 times over the next 5 years. Exacerbating the problem is that such an unprecedented surge in smartphone usage, which is characterized by frequent short on/off connections and mobility, generates heavy signaling traffic load in the network (signaling storms). This consumes a disproportionate amount of network resources, compromising network throughput and efficiency, and in extreme cases can cause Third-Generation (3G) or 4G (Long-Term Evolution (LTE) and LTE-Advanced (LTE-A)) cellular networks to crash. As the conventional approaches of improving spectral efficiency and/or allocating additional spectrum are fast approaching their theoretical limits, there is a growing consensus that current 3G and 4G (LTE/LTE-A) cellular radio access technologies (RATs) won't be able to meet the anticipated growth in mobile traffic demand. To address these challenges, the wireless industry and standardization bodies have initiated a roadmap for the transition from 4G to 5G cellular technology with a key objective to increase capacity by 1000x by 2020. Even though the technology hasn't been invented yet, the hype around 5G networks has begun to bubble. The emerging consensus is that 5G is not a single technology, but rather a synergistic collection of interworking technical innovations and solutions that collectively address the challenge of traffic growth. The core emerging ingredients that are widely considered the key enabling technologies to realize the envisioned 5G era, listed in order of importance, are: 1) heterogeneous networks (HetNets); 2) flexible backhauling; 3) efficient traffic offload techniques; and 4) Self-Organizing Networks (SONs). The anticipated solutions delivered by efficient interworking/integration of these enabling technologies are not simply about throwing more resources and/or spectrum at the challenge. The envisioned solution, rather, requires radically different cellular RAN and mobile core architectures that efficiently and cost-effectively deploy and manage radio resources as well as offload mobile traffic from the overloaded core network. The main objective of this thesis is to address the key techno-economic challenges facing the transition from current Fourth-Generation (4G) cellular technology to the 5G era by proposing a novel, high-risk, revolutionary direction for the design and implementation of the envisioned 5G cellular networks. The ultimate goal is to explore the potential and viability of cost-effectively meeting the 1000x capacity challenge while continuing to provide an adequate mobile broadband experience to users.
Specifically, this work proposes and devises a novel PON-based HetNet mobile backhaul RAN architecture that: 1) holistically addresses the key techno-economic hurdles facing the implementation of the envisioned 5G cellular technology, specifically the backhauling and signaling challenges; and 2) enables, for the first time to the best of our knowledge, support of efficient ground-breaking mobile data and signaling offload techniques, which significantly enhance the performance of both the HetNet-based RAN and LTE-A's core network (Evolved Packet Core (EPC) per the 3GPP standard), ensure that core network equipment is used more productively, and moderate the evolving 5G signaling growth and optimize its impact. To address the backhauling challenge, we propose a cost-effective fiber-based small cell backhaul infrastructure, which leverages existing fibered and powered facilities associated with a PON-based fiber-to-the-node/home (FTTN/FTTH) residential access network. Due to the sharing of existing valuable fiber assets, the proposed PON-based backhaul architecture, in which the small cells are collocated with existing FTTN remote terminals (optical network units (ONUs)), is much more economical than conventional point-to-point (PTP) fiber backhaul designs. A fully distributed ring-based EPON architecture is utilized here as the fiber-based HetNet backhaul. The techno-economic merits of the proposed PON-based FTTx access HetNet RAN architecture versus those of a traditional 4G LTE-A RAN are thoroughly examined and quantified. Specifically, we quantify the techno-economic merits of the proposed PON-based HetNet backhaul by comparing its performance against a conventional fiber-based PTP backhaul architecture as a benchmark. It is shown that the purposely selected ring-based PON architecture, along with the supporting distributed control plane, enables the proposed PON-based FTTx RAN architecture to support several salient networking features that collectively and significantly enhance the overall performance of both the HetNet-based RAN and the 4G LTE-A core (EPC) compared with the typical fiber-based PTP backhaul architecture in terms of handoff capability, signaling overhead, overall network throughput and latency, and QoS support. It is also shown that the proposed HetNet-based RAN architecture is not only capable of providing the typical macro-cell offloading gain (RAN gain) but can also provide ground-breaking EPC offloading gain. The simulation results indicate that the overall capacity of the proposed HetNet scales with the number of deployed small cells, thanks to LTE-A's advanced interference management techniques. For example, if there are 10 deployed outdoor small cells for every macrocell in the network, the overall capacity gain will be approximately 10-11x over a macro-only network. To reach the 1000x capacity goal, numerous small cells including 3G, 4G, and WiFi (femtos, picos, metros, relays, remote radio heads, distributed antenna systems) need to be deployed indoors and outdoors, at all possible venues (residences and enterprises).
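
    As a back-of-the-envelope check of the capacity claim above, the sketch below assumes each well-isolated outdoor small cell contributes roughly one macro-cell's worth of capacity; the thesis derives its 10-11x figure from system-level simulation, not from this simplification.

```python
# Simple capacity-scaling check for the HetNet claim above. The per-small-cell
# efficiency factor is an assumption used only for illustration.

def hetnet_capacity_gain(small_cells_per_macro: int,
                         small_cell_efficiency: float = 1.0) -> float:
    """Aggregate capacity relative to a macro-only deployment, assuming each
    small cell adds roughly one macro-cell's worth of capacity when
    interference is well managed."""
    return 1.0 + small_cells_per_macro * small_cell_efficiency


print(hetnet_capacity_gain(10))        # ~11x with ideal spatial reuse
print(hetnet_capacity_gain(10, 0.9))   # ~10x with some interference loss
```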

    Comparison of vertical handover decision-based techniques in heterogeneous networks

    Industry leaders are currently setting out standards for 5G networks projected for 2020 or even sooner. Future-generation networks will be heterogeneous in nature because no single network type is capable of optimally meeting all the rapid changes in customer demands. Heterogeneous networks are typically characterized by their network architecture, base stations of varying transmission power, diverse transmission solutions, and the deployment of a mix of technologies (multiple radio access technologies). In heterogeneous networks, the process by which a mobile node successfully switches from one radio access technology to another for the purpose of quality-of-service continuity is termed vertical handover or vertical handoff. Dropped active calls, or discontinuity of service experienced by mobile users, can be attributed to delayed handover or to an outright unsuccessful handover procedure. This dissertation analyses the performance of a fuzzy-based vertical handover (VHO) algorithm in an integrated Wi-Fi, WiMAX, UMTS, and LTE network using the OMNeT++ discrete event simulator. A loose-coupling network architecture is adopted, and the simulation results are analysed and compared for the two major categories of handover decision making: multiple-criteria and single-criteria based methods. The key performance indices from the simulations showed better overall throughput, a lower call-drop rate, and a shorter handover duration for the multiple-criteria decision method compared with the single-criteria technique. This work also touches on current trends, challenges in the area of seamless handover, and initiatives for future networks (next-generation heterogeneous networks).
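
    A minimal sketch contrasting a single-criterion (RSSI-only) handover decision with a fuzzy multiple-criteria one, in the spirit of the comparison above; the membership functions, weights, and candidate values are illustrative assumptions, not the dissertation's OMNeT++ model.

```python
# Single-criterion vs. fuzzy multiple-criteria vertical-handover decision.
# Membership functions, weights, and candidate networks are illustrative.

def tri(x, lo, mid, hi):
    """Triangular fuzzy membership in [0, 1]."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (mid - lo) if x <= mid else (hi - x) / (hi - mid)


def fuzzy_score(rssi_dbm, bandwidth_mbps, cost, weights=(0.4, 0.4, 0.2)):
    good_rssi = tri(rssi_dbm, -100, -60, -30)      # stronger signal is better
    good_bw = tri(bandwidth_mbps, 0, 50, 100)      # higher bandwidth is better
    low_cost = 1.0 - min(1.0, cost)                # cheaper is better
    w1, w2, w3 = weights
    return w1 * good_rssi + w2 * good_bw + w3 * low_cost


candidates = {
    "LTE":   {"rssi": -70, "bw": 40, "cost": 0.8},
    "Wi-Fi": {"rssi": -65, "bw": 60, "cost": 0.1},
    "UMTS":  {"rssi": -60, "bw": 10, "cost": 0.6},
}

rssi_only = max(candidates, key=lambda n: candidates[n]["rssi"])
multi = max(candidates,
            key=lambda n: fuzzy_score(candidates[n]["rssi"],
                                      candidates[n]["bw"],
                                      candidates[n]["cost"]))
print("single-criterion pick:", rssi_only, "| multi-criteria pick:", multi)
```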