82 research outputs found

    UE-Initiated Cell Reselection Game for Cell Load Balancing in a Wireless Network

    Get PDF

    Load balancing using cell range expansion in LTE advanced heterogeneous networks

    Get PDF
The use of heterogeneous networks is increasing, fueled by consumer demand for more data. The main objective of heterogeneous networks is to increase capacity. They offer solutions for efficient use of spectrum, load balancing and improvement of cell edge coverage, amongst others. However, these solutions have inherent challenges such as inter-cell interference and poor mobility management. In heterogeneous networks there is a transmit power disparity between the macro cell and pico cell tiers, which causes load imbalance between the tiers. Under the conventional user-cell association strategy, whereby users associate with the base station with the strongest received signal strength, few users associate with small cells compared to macro cells. To counter the effects of transmit power disparity, cell range expansion is used instead of the conventional strategy. The focus of our work is on load balancing using cell range expansion (CRE) and network utility optimization techniques to ensure fair sharing of load in a macro and pico cell LTE Advanced heterogeneous network. The aim is to investigate how an adaptive cell range expansion bias can be used to optimize pico cell coverage for load balancing. The reviewed literature points out several approaches to the load balancing problem in heterogeneous networks, including cell range expansion and utility function optimization. We use cell range expansion and logarithmic utility functions to design a load balancing algorithm in which user and base station associations are optimized by adapting the CRE bias to the pico base station load status. A price update mechanism based on a suboptimal solution of a network utility optimization problem is used to adapt the CRE bias; the price is derived from the load status of each pico base station. The performance of the algorithm was evaluated by means of an LTE MATLAB toolbox.
Simulations were conducted according to 3GPP and ITU guidelines for modelling heterogeneous networks and the propagation environment, respectively. Compared to a static CRE configuration, the algorithm achieved fairer load distribution and a better trade-off between cell edge and cell centre user throughputs. [Please note: this thesis file has been deferred until December 2016.]
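The association-plus-pricing loop described in this abstract can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm: the function names, the linear bias-update rule, the load target and the step size are all assumptions made for the example.

```python
# Sketch of CRE-based load balancing: each UE attaches to the cell with the
# highest biased RSRP; each pico cell's bias is adapted by a price that
# reflects its load. All names and the update rule are illustrative.

def associate(ue_rsrp, bias):
    """ue_rsrp: {cell_id: RSRP in dBm}; bias: {cell_id: CRE bias in dB}."""
    return max(ue_rsrp, key=lambda c: ue_rsrp[c] + bias.get(c, 0.0))

def update_bias(bias, load, target=0.5, step=2.0, max_bias=12.0):
    """Raise a pico cell's bias while it is under-loaded, lower it when
    over-loaded (a simple price-like update, not the thesis's exact rule)."""
    new = {}
    for cell, b in bias.items():
        price = load.get(cell, 0.0) - target      # positive when overloaded
        new[cell] = min(max_bias, max(0.0, b - step * price))
    return new

# A cell-edge UE flips from the macro to the pico cell once the adapted
# bias exceeds the 4 dB RSRP gap.
bias = {"pico1": 0.0}
rsrp = {"macro": -80.0, "pico1": -84.0}
load = {"pico1": 0.1}                             # lightly loaded pico
for _ in range(6):                                # bias climbs 0.8 dB per step
    bias = update_bias(bias, load)
print(associate(rsrp, bias))                      # pico1
```

The design point the abstract makes is visible here: the bias is not a static configuration but a control variable driven by the measured load of each pico cell.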

    Increased energy efficiency in LTE networks through reduced early handover

    Get PDF
“A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.” Long Term Evolution (LTE) has been widely adopted by mobile operators as a solution to fulfil the ever-growing data requirements of users (UEs) in cellular networks. Larger data demands occupy resource blocks over prolonged time intervals, resulting in higher dynamic power consumption on the downlink at the base station. The realisation of UE requests therefore comes at the cost of increased power consumption, which directly affects operators' operational expenditure and also contributes to increased CO2 emissions, and hence to global warming. According to research, global Information and Communication Technology (ICT) systems consume approximately 1200 to 1800 terawatt-hours of electricity annually. The mobile communication industry is accountable for more than one third of this ICT power consumption, due to increased data requirements, numbers of UEs and coverage area. In terms of global warming, telecommunication is responsible for 0.3 to 0.4 percent of worldwide CO2 emissions. Moreover, user data volume is expected to increase by a factor of 10 every five years, resulting in a 16 to 20 percent increase in associated energy consumption and a correspondingly larger environmental impact. This research work focuses on the importance of energy saving in LTE and initially proposes a bandwidth expansion based energy saving scheme which combines two resource blocks into a single super resource block, thereby reducing the Physical Downlink Control Channel (PDCCH) overhead. The decreased PDCCH overhead reduces dynamic power consumption by up to 28 percent. Subsequently, a novel reduced early handover (REHO) scheme is proposed and combined with bandwidth expansion to form an enhanced energy saving scheme.
System level simulations were performed to investigate the performance of the REHO scheme; reduced early handover provided around 35% improved energy saving compared to the LTE standard in a 3rd Generation Partnership Project (3GPP) based scenario. Since there is a direct relationship between energy consumption, CO2 emissions and vendors' operational expenditure (OPEX), the reduced power consumption and increased energy efficiency of REHO make it a step towards greener communication, with a smaller CO2 footprint and reduced operational expenditure. The main idea of REHO is that it initiates handovers earlier and turns off the freed resource blocks sooner than the LTE standard. The time difference (in Transmission Time Intervals) between a REHO-based early handover and a standard LTE handover is therefore the key component of the energy saving achieved, and is estimated through an axiom of Euclidean geometry. Moreover, overall system efficiency is investigated through the analysis of numerous performance related parameters in REHO and the LTE standard. This led to a key finding to guide vendors on the choice of energy saving in relation to radio link failure and other important parameters.
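The energy accounting implied by REHO can be illustrated with a toy calculation: the freed resource blocks stop drawing dynamic power for the TTIs between the early handover and the point where standard LTE would have handed over. The per-RB power figure and the TTI gap below are invented placeholders, not values from the thesis.

```python
# Toy estimate of the energy saved by handing over early. All numbers are
# illustrative, not measurements from the thesis.

TTI_S = 0.001            # one LTE TTI is 1 ms

def reho_energy_saved(freed_rbs, power_per_rb_w, tti_gap):
    """Energy (joules) saved while `freed_rbs` resource blocks are off
    for `tti_gap` transmission time intervals."""
    return freed_rbs * power_per_rb_w * tti_gap * TTI_S

# e.g. 10 RBs at 0.8 W each, handed over 50 TTIs early:
saved = reho_energy_saved(10, 0.8, 50)
print(f"{saved:.3f} J")  # 10 * 0.8 * 50 * 0.001 = 0.400 J
```

The saving scales linearly with the TTI gap, which is why estimating that gap accurately (via the geometric argument the abstract mentions) is central to the scheme.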

    Optimisation de la gestion des interférences inter-cellulaires et de l'attachement des mobiles dans les réseaux cellulaires LTE

    Get PDF
Driven by an exponential growth in mobile broadband-enabled devices and a continued increase in individual data consumption, mobile data traffic has grown 4000-fold over the past 10 years and almost 400-million-fold over the past 15 years. Homogeneous cellular networks have been facing limitations in handling soaring mobile data traffic and in meeting the growing end-user demand for more bandwidth and better quality of experience. These limitations are mainly related to the available spectrum and the capacity of the network. The telecommunication industry has to address these challenges and meet exploding demand, while guaranteeing a healthy economic model and reducing the carbon footprint caused by mobile communications. Heterogeneous Networks (HetNets), composed of macro base stations and low power base stations of different types, are seen as the key solution to improve spectral efficiency per unit area and to eliminate coverage holes. In such networks, intelligent user association and interference management schemes are needed to achieve gains in performance. Due to the large imbalance in transmission power between macro and small cells, user association based on the strongest received signal is not suitable in HetNets, as only a few users would attach to low power nodes. A technique based on a Cell Individual Offset (CIO) is therefore required to perform load balancing and to favor Small Cell (SC) attraction against the Macro Cell (MC). This offset is added to users' Reference Signal Received Power (RSRP) measurements, thereby inducing handovers towards different eNodeBs. As Long Term Evolution (LTE) cellular networks use the same frequency sub-bands, mobile users may experience strong inter-cell interference, especially at the cell edge. Therefore, there is a need to coordinate resource allocation among the cells and minimize inter-cell interference.
To mitigate strong inter-cell interference, resources in the time, frequency and power domains should be allocated efficiently. A pattern for each dimension is computed to allow cell edge users in particular to benefit from higher throughput and quality of experience. The optimization of all these parameters can also yield gains in energy use. In this thesis, we propose a concrete, versatile, dynamic solution that optimizes user association and resource allocation in LTE cellular networks, maximizing a network utility function that can be chosen adequately. Our solution, based on game theory, computes the Cell Individual Offset and a pattern of transmission power over the frequency and time domains for each cell. We present numerical simulations to illustrate the important performance gains brought by this optimization: significant benefits in average throughput and in cell edge user throughput, with gains of 40% and 55% respectively, as well as a meaningful improvement in energy efficiency. This work addresses industrial research challenges and, as such, a prototype acting on emulated HetNets traffic has been implemented.
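The CIO mechanism described above can be sketched as a biased handover check in the style of the LTE A3 measurement event: a neighbour qualifies when its RSRP plus its Cell Individual Offset exceeds the serving cell's RSRP by a hysteresis margin. The function name and the parameter values are illustrative, not the thesis's optimized outputs.

```python
# Sketch of a CIO-driven handover decision (A3-event style). Values are
# illustrative defaults, not results from the thesis.

def a3_handover_target(serving_rsrp, neighbours, cio, hysteresis_db=2.0):
    """neighbours: {cell: RSRP in dBm}; cio: {cell: offset in dB}.
    Returns the best qualifying neighbour, or None."""
    best, best_margin = None, 0.0
    for cell, rsrp in neighbours.items():
        margin = rsrp + cio.get(cell, 0.0) - (serving_rsrp + hysteresis_db)
        if margin > best_margin:
            best, best_margin = cell, margin
    return best

# A pico cell 5 dB weaker than the serving macro is still selected once its
# CIO (9 dB) outweighs the RSRP gap plus the 2 dB hysteresis.
print(a3_handover_target(-85.0, {"pico": -90.0}, {"pico": 9.0}))  # pico
```

With the CIO set to 0 dB this degenerates to the strongest-signal rule the abstract argues against; raising the offset is what expands the small cell's effective footprint.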

    Green Cellular Networks: A Survey, Some Research Issues and Challenges

    Full text link
Energy efficiency in cellular networks is a growing concern for cellular operators, not only to maintain profitability but also to reduce the overall environmental effects. This emerging trend of achieving energy efficiency in cellular networks is motivating standardization authorities and network operators to continuously explore future technologies in order to bring improvements to the entire network infrastructure. In this article, we present a brief survey of methods to improve the power efficiency of cellular networks, explore some research issues and challenges, and suggest some techniques to enable an energy efficient or "green" cellular network. Since base stations consume the largest portion of the total energy used in a cellular system, we first provide a comprehensive survey of techniques to obtain energy savings in base stations. Next, we discuss how heterogeneous network deployment based on micro, pico and femto-cells can be used to achieve this goal. Since cognitive radio and cooperative relaying are undisputed future technologies in this regard, we propose a research vision to make these technologies more energy efficient. Lastly, we explore some broader perspectives in realizing a "green" cellular network technology. Comment: 16 pages, 5 figures, 2 tables

    A Cognitive Routing framework for Self-Organised Knowledge Defined Networks

    Get PDF
This study investigates the applicability of machine learning methods to routing protocols for achieving rapid convergence in self-organized knowledge-defined networks. The research explores the constituents of the Self-Organized Networking (SON) paradigm for 5G and beyond, aiming to design a routing protocol that complies with SON requirements. It also exploits a contemporary discipline called Knowledge-Defined Networking (KDN) to extend routing capability by calculating the "most reliable" path rather than the shortest one. The research identifies the potential key areas and possible techniques to meet these objectives by surveying the state of the art of the relevant fields, such as QoS aware routing, hybrid SDN architectures, intelligent routing models, and service migration techniques. The design phase focuses primarily on the mathematical modelling of the routing problem and approaches the solution by optimizing at the structural level. The work contributes the Stochastic Temporal Edge Normalization (STEN) technique, which fuses link and node utilization for cost calculation; MRoute, a hybrid routing algorithm for SDN that leverages STEN to provide constant-time convergence; and Most Reliable Route First (MRRF), which uses a Recurrent Neural Network (RNN) to approximate route reliability as its metric. Additionally, the research outcomes include a cross-platform SDN integration framework (SDN-SIM) and a secure migration technique for containerized services in a Multi-access Edge Computing environment using Distributed Ledger Technology. The research work now looks towards the development of 6G standards and compliance with Industry 5.0, extending the present outcomes in the light of Deep Reinforcement Learning and Quantum Computing.
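The "most reliable path" objective that MRRF targets reduces to a standard shortest-path problem: maximizing the product of per-link reliabilities is equivalent to minimizing the sum of their negative logarithms. The sketch below assumes given per-link reliability values (in the thesis these would be approximated by an RNN) and uses a plain Dijkstra search; it is an illustration of the reduction, not the thesis's MRRF implementation.

```python
import heapq
import math

def most_reliable_path(graph, src, dst):
    """graph: {node: {neighbour: reliability in (0, 1]}}.
    Returns (path, end-to-end reliability)."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                         # stale heap entry
        for v, rel in graph.get(u, {}).items():
            nd = d - math.log(rel)           # -log turns products into sums
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:                       # walk predecessors back to src
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[dst])

g = {"A": {"B": 0.9, "C": 0.99}, "B": {"D": 0.9}, "C": {"D": 0.95}}
path, rel = most_reliable_path(g, "A", "D")
print(path, round(rel, 4))                   # the A-C-D route wins: 0.9405 > 0.81
```

Here the two-hop route through C beats the route through B even though both have the same hop count, which is exactly the distinction between "most reliable" and "shortest" that the abstract draws.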

    Energy-Efficient Solutions For Green Mobile Networks

    Get PDF

    Quantifying Potential Energy Efficiency Gain in Green Cellular Wireless Networks

    Full text link
Conventional cellular wireless networks were designed to provide high throughput for the user and high capacity for the service provider, without any provisions for energy efficiency. As a result, these networks have an enormous carbon footprint. In this paper, we describe the sources of the inefficiencies in such networks. First we present results of studies on how large a carbon footprint such networks generate. We also discuss how much mobile traffic is expected to increase, so that this carbon footprint will grow tremendously further. We then discuss specific sources of inefficiency and potential sources of improvement at the physical layer as well as at higher layers of the communication protocol hierarchy. In particular, considering that most of the energy inefficiency in cellular wireless networks is at the base stations, we discuss multi-tier networks and point to the potential of exploiting mobility patterns in order to use base station energy judiciously. We then investigate potential methods to reduce this inefficiency and quantify their individual contributions. By considering the combination of all potential gains, we conclude that an improvement in energy consumption in cellular wireless networks by two orders of magnitude, or even more, is possible. Comment: arXiv admin note: text overlap with arXiv:1210.843
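The "combination of all potential gains" argument is multiplicative: independent efficiency improvements compound. A toy illustration follows; the gain factors and their labels are invented placeholders, not the paper's estimates.

```python
# Independent energy-efficiency improvements compound multiplicatively.
# The factors below are invented placeholders, not the paper's figures.
from functools import reduce

gains = {
    "base station sleep modes": 3.0,
    "multi-tier / small-cell deployment": 7.0,
    "improved power amplifiers": 5.0,
}

combined = reduce(lambda a, b: a * b, gains.values(), 1.0)
print(f"combined improvement: {combined:.0f}x")  # 3 * 7 * 5 = 105x
```

Even modest single-digit factors, when they act on independent sources of waste, multiply into the two-orders-of-magnitude shape of improvement the abstract concludes is possible.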