
    A survey of self organisation in future cellular networks

    This article surveys the literature of the last decade on the emerging field of self organisation as applied to wireless cellular communication networks. Self organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; however, in the context of wireless cellular networks, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self organising cellular networks. Additionally, we aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self organisation in wireless cellular communication networks.

    Traffic-Driven Energy Efficient Operational Mechanisms in Cellular Access Networks

    Recent explosive growth in mobile data traffic is increasing the energy consumption of cellular networks at an incredible rate. Moreover, as a direct result of the conventional static network provisioning approach, a significant amount of electrical energy is wasted in existing networks. The issue of designing energy efficient cellular networks has therefore drawn significant attention in recent times, which is also the foremost motivation behind this research. The proposed research is particularly focused on the design of self-organizing, traffic-sensitive dynamic network reconfiguration mechanisms for energy efficiency in cellular systems. Under the proposed techniques, radio access networks (RANs) are adaptively reconfigured using less equipment, leading to reduced energy utilization. Several energy efficient cellular network frameworks employing inter-base-station (BS) cooperation in RANs are proposed. Under these frameworks, based on the instantaneous traffic demand, BSs are dynamically switched between active and sleep modes by redistributing traffic among them, and energy savings are thus achieved. The focus is then extended to exploiting the availability of multiple cellular networks to extract energy savings through inter-RAN cooperation. Mathematical models for both the single-RAN and the multi-RAN cooperation mechanisms are formulated. An alternative energy saving technique using dynamic sectorization (DS), under which some of the sectors in underutilized BSs are switched into sleep mode, is also proposed. Algorithms for both distributed and centralized implementations are developed. Finally, a two-dimensional energy efficient network provisioning mechanism is proposed by jointly applying both DS and dynamic BS switching. Extensive simulations are carried out, which demonstrate the capability of the proposed mechanisms to substantially enhance the energy efficiency of cellular networks.
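
    To make the traffic-driven switching idea concrete, the Python sketch below shows one simple way a controller could put under-utilised base stations to sleep after redistributing their load onto neighbours with spare capacity. The thresholds, class layout and proportional redistribution rule are illustrative assumptions, not the algorithms developed in the thesis.

        # Illustrative sketch of traffic-driven BS active/sleep switching.
        # Thresholds and the redistribution rule are assumptions, not thesis results.
        from dataclasses import dataclass

        SLEEP_THRESHOLD = 0.2   # assumed: load fraction below which a BS may sleep
        WAKE_THRESHOLD = 0.8    # assumed: load fraction a neighbour must stay under

        @dataclass
        class BaseStation:
            name: str
            load: float          # current traffic as a fraction of capacity
            active: bool = True

        def reconfigure(bss):
            """Switch under-utilised BSs to sleep if neighbours can absorb their traffic."""
            for bs in bss:
                if not bs.active or bs.load >= SLEEP_THRESHOLD:
                    continue
                neighbours = [n for n in bss
                              if n is not bs and n.active and n.load < WAKE_THRESHOLD]
                spare = sum(WAKE_THRESHOLD - n.load for n in neighbours)
                if spare >= bs.load:
                    # redistribute proportionally to spare capacity, then sleep
                    for n in neighbours:
                        n.load += bs.load * (WAKE_THRESHOLD - n.load) / spare
                    bs.load, bs.active = 0.0, False

        if __name__ == "__main__":
            cells = [BaseStation("BS1", 0.15), BaseStation("BS2", 0.40), BaseStation("BS3", 0.30)]
            reconfigure(cells)
            print([(c.name, c.active, round(c.load, 2)) for c in cells])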

    Energy efficiency in cellular wireless networks

    Energy efficiency of Long Term Evolution (LTE) cellular communication networks has become a major concern for network operators, not only to reduce operational costs, but also to reduce their environmental effects. Within LTE cellular networks, base stations are responsible for most of the energy consumption, consuming 70-95% or more of the network power depending on the network topology, configuration, radio technology and data rates that are used. Power control is an important function in cellular wireless networks and refers to setting the output power levels of transmitters, termed eNodeB in the downlink and user equipment (UE) in the uplink. LTE utilizes two different mechanisms for uplink power control: Open Loop Power Control (OLPC) and Closed Loop Power Control (CLPC). Uplink OLPC is performed by the UE following eNodeB configuration and can compensate for long term channel variation such as path loss and shadowing. The uplink CLPC mechanism attempts to improve power control performance by compensating for fast channel variations due to multipath fading. In CLPC the eNodeB sends Transmit Power Control (TPC) commands to the UE to adjust the UE's transmit power. This thesis focuses on an OLPC scheme for the LTE uplink that uses the Okumura-Hata propagation path loss model to set the UE uplink transmit power control parameters in order to reduce UE energy consumption. In general, the UE requires more power to connect to distant base stations than to closer ones, and this thesis therefore analyses the required power levels using the Okumura-Hata propagation path loss model. Estimation of path loss is very important in the initial deployment of a wireless network and in cell planning. This thesis analyses the Okumura-Hata propagation path loss for different receiver antenna heights.
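
    The two ingredients combined in the abstract are standard: the Okumura-Hata median path loss model and the LTE open-loop PUSCH power setting. A minimal Python sketch of both is given below; the closed-loop and MCS-dependent terms of the power formula are omitted, and the frequency, antenna heights, P0, alpha and bandwidth values are illustrative assumptions rather than the parameters used in the thesis.

        # Hedged sketch: Okumura-Hata urban path loss feeding a simplified LTE
        # open-loop uplink power setting. Parameter values are illustrative only.
        import math

        def okumura_hata_pl(f_mhz, d_km, h_bs_m, h_ue_m):
            """Median urban path loss in dB (valid roughly for 150-1500 MHz, 1-10 km)."""
            a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_ue_m - (1.56 * math.log10(f_mhz) - 0.8)
            return (69.55 + 26.16 * math.log10(f_mhz) - 13.82 * math.log10(h_bs_m) - a_hm
                    + (44.9 - 6.55 * math.log10(h_bs_m)) * math.log10(d_km))

        def olpc_tx_power(pl_db, p0_dbm=-90.0, alpha=0.7, n_prb=10, p_max_dbm=23.0):
            """Simplified open-loop PUSCH power in dBm (closed-loop and MCS terms omitted)."""
            return min(p_max_dbm, p0_dbm + 10 * math.log10(n_prb) + alpha * pl_db)

        if __name__ == "__main__":
            for d_km in (1.0, 3.0, 5.0):
                pl = okumura_hata_pl(f_mhz=900, d_km=d_km, h_bs_m=50, h_ue_m=1.5)
                print(f"d={d_km} km  PL={pl:.1f} dB  Ptx={olpc_tx_power(pl):.1f} dBm")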

    Energy Management in LTE Networks

    Wireless cellular networks have seen dramatic growth in the number of mobile users. As a result, data requirements, and hence base-station power consumption, have increased significantly. This in turn adds to operational expenditure and also contributes to global warming. Base station power consumption in Long Term Evolution (LTE) has therefore become a major challenge for vendors seeking to stay green and profitable in the competitive cellular industry, and it necessitates novel methods to devise energy efficient communication in LTE. The importance of the topic has attracted huge research interest worldwide. Energy saving (ES) approaches proposed in the literature can be broadly classified into the categories of energy efficient resource allocation, load balancing, carrier aggregation, and bandwidth expansion. Each of these methods has its own pros and cons, leading to a trade-off between ES and other performance metrics and resulting in open research questions. This paper discusses various ES techniques for LTE systems and critically analyses their usability through a comprehensive comparative study.

    Hard handover for load balancing in long term evolution network

    This thesis presents a hard handover scheme for load balancing in a Long Term Evolution (LTE) network. LTE is a cellular self-organizing network (SON) standardized by the Third Generation Partnership Project (3GPP) to provide high data rates and high quality of service to end users. However, the huge data requirements of the diverse multimedia services used by LTE subscribers are rapidly degrading the network's quality of service (QoS). At the same time, the need for an optimized energy consumption algorithm that reduces network access cost and extends the battery life of the user equipment (UE) is also on the increase. Therefore, the main aim of this thesis is to provide a new solution for load control as well as an energy efficient solution for both the network and the mobile devices. In the first contribution, a new network-energy efficient handover decision algorithm for load balancing is developed. The algorithm uses load information and reference signal received power (RSRP) as decision parameters for the handover decision scheme. The second contribution focuses on the development of an optimized handover decision algorithm for load balancing and ping-pong control. The algorithm uses the cell load information, the received signal strength (RSS) and an adaptive timer as inputs for the handover decision procedure. The third contribution is the development of a handover decision algorithm that optimizes UE energy consumption as well as load balancing. Overall, key performance indicators such as load distribution index (LDI), number of unsatisfied users (NUU), cumulative number of ping-pong handover requests (CNPHR), cumulative number of non-ping-pong handover requests (CNNPHR), average throughput of the cell (ATC), handover blocking rate (HBR), new call blocking rate (NCBR) and number of handover calls (NHC) were evaluated through simulations. The results were compared with other works in the literature. In particular, the proposed algorithm achieved over 10% higher LDI, 50% lower NUU, 30% higher CNPHR and 5% lower CNNPHR when compared with works in the literature. Other results are 10% higher ATC, 75% lower HBR and 40% lower NCBR. In general, the proposed handover decision algorithm for energy efficient load balancing management in LTE has proven its ability for energy consumption optimization, load balancing management and ping-pong handover control.
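
    A handover decision rule of the kind described in the first contribution (combining cell load with received signal power) can be sketched in a few lines of Python; the hysteresis margin, load margin and tie-breaking choice below are illustrative assumptions, not the algorithm evaluated in the thesis.

        # Illustrative sketch of a load- and RSRP-aware handover decision.
        # Threshold values are assumptions for illustration only.
        from dataclasses import dataclass
        from typing import List, Optional

        RSRP_HYSTERESIS_DB = 3.0   # assumed margin by which the target must exceed the serving RSRP
        LOAD_MARGIN = 0.1          # assumed margin by which the target must be less loaded

        @dataclass
        class Cell:
            cell_id: int
            rsrp_dbm: float   # RSRP measured by the UE for this cell
            load: float       # fraction of occupied resource blocks

        def handover_target(serving: Cell, neighbours: List[Cell]) -> Optional[Cell]:
            """Return a neighbour to hand over to, or None to stay on the serving cell."""
            candidates = [
                c for c in neighbours
                if c.rsrp_dbm > serving.rsrp_dbm + RSRP_HYSTERESIS_DB   # strong enough signal
                and c.load + LOAD_MARGIN < serving.load                 # meaningfully less loaded
            ]
            # prefer the least-loaded candidate; break ties on stronger RSRP
            return min(candidates, key=lambda c: (c.load, -c.rsrp_dbm), default=None)

        if __name__ == "__main__":
            serving = Cell(1, rsrp_dbm=-100.0, load=0.9)
            target = handover_target(serving, [Cell(2, -95.0, 0.5), Cell(3, -94.0, 0.7)])
            print("handover to", target.cell_id if target else "no handover")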

    Increased energy efficiency in LTE networks through reduced early handover

    “A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy”. Long Term Evolution (LTE) has been widely adopted by mobile operators and was introduced as a solution to fulfil the ever-growing data requirements of users (UEs) in cellular networks. Enlarged data demands engage resource blocks over prolonged time intervals and thus result in higher dynamic power consumption at the downlink in the base station. Therefore, serving UE requests comes at the cost of increased power consumption, which directly affects operators' operational expenditure. Moreover, it also contributes to increased CO2 emissions and hence to global warming. According to research, global Information and Communication Technology (ICT) systems consume approximately 1200 to 1800 terawatt-hours of electricity annually. Importantly, the mobile communication industry is accountable for more than one third of this ICT power consumption due to increased data requirements, numbers of UEs and coverage area. In terms of global warming, telecommunication is responsible for 0.3 to 0.4 percent of worldwide CO2 emissions. Moreover, user data volume is expected to increase by a factor of 10 every five years, which results in a 16 to 20 percent increase in associated energy consumption and thus further contributes to global warming. This research work focuses on the importance of energy saving in LTE and initially proposes a bandwidth expansion based energy saving scheme which combines two resource blocks into a single super RB, thereby reducing the Physical Downlink Control Channel (PDCCH) overhead. The decreased PDCCH overhead helps reduce dynamic power consumption by up to 28 percent. Subsequently, a novel reduced early handover (REHO) based idea is proposed and combined with bandwidth expansion to form an enhanced energy saving scheme. System level simulations are performed to investigate the performance of the REHO scheme; it was found that reduced early handover provided around 35% improved energy saving compared to the LTE standard in a 3rd Generation Partnership Project (3GPP) based scenario. Since there is a direct relationship between energy consumption, CO2 emissions and vendors' operational expenditure (OPEX), the reduced power consumption and increased energy efficiency mean that REHO is a step towards greener communication with a smaller CO2 footprint and reduced operational expenditure. The main idea of REHO lies in the fact that it initiates handovers earlier and turns off the freed resource blocks compared to the LTE standard. Therefore, the time difference (in Transmission Time Intervals) between a REHO based early handover and the LTE standard handover is a key component of the energy saving achieved, and it is estimated through axioms of Euclidean geometry. Moreover, overall system efficiency is investigated through the analysis of numerous performance related parameters in REHO and the LTE standard. This led to a key finding that can guide vendors about the choice of energy saving in relation to radio link failure and other important parameters.
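
    The energy saving argument behind REHO (resource blocks released a number of Transmission Time Intervals earlier stop drawing dynamic power) can be illustrated with a back-of-envelope estimate in Python; every numerical value below is an assumption chosen for illustration, not a result from the thesis.

        # Back-of-envelope sketch of the early-handover energy saving idea:
        # resource blocks freed earlier no longer draw dynamic downlink power.
        # All numbers are illustrative assumptions, not thesis results.

        TTI_S = 0.001                # one LTE Transmission Time Interval = 1 ms
        P_DYNAMIC_PER_RB_W = 0.8     # assumed dynamic power per active resource block (W)

        def energy_saved_joules(ttis_earlier, n_rb_freed, handovers_per_hour):
            """Energy saved per hour if each handover frees n_rb_freed RBs ttis_earlier TTIs sooner."""
            per_handover = ttis_earlier * TTI_S * n_rb_freed * P_DYNAMIC_PER_RB_W
            return per_handover * handovers_per_hour

        if __name__ == "__main__":
            # e.g. handovers triggered 50 TTIs earlier, freeing 10 RBs, 2000 handovers per hour
            print(f"{energy_saved_joules(50, 10, 2000):.0f} J saved per hour")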

    Energy efficiency comparison between 2.1 GHz and 28 GHz based communication networks

    Mobile communications have revolutionized the way we communicate around the globe, making communication easier, faster and cheaper. In the first three generations of mobile networks, the primary focus was on voice calls, and as such, the traffic on the networks was not as heavy as it currently is. Towards the fourth generation, however, there was an explosive increase in mobile data traffic, driven in part by the heavy use of smartphones, tablets and cloud services, which in turn is driving up the energy consumption of mobile networks as they meet the increased demand. The addition of power conditioning equipment adds to the overall energy consumption of the base stations, necessitating the deployment of energy efficient solutions to deal with the impacts and costs of heavy energy consumption. This thesis investigates the energy efficiency performance of mobile networks in various scenarios in a dense urban environment. Consideration is given to the future deployment of 5G networks, and simulations are carried out at 2.1 GHz and 28 GHz with a channel bandwidth of 20 MHz in both the 2.1 GHz and the 28 GHz scenarios. The channel bandwidth of the 28 GHz system is then increased ten-fold and another system performance evaluation is done. Parameters used for evaluating system performance include received signal strength, signal-to-interference-plus-noise ratio, spectral efficiency and power efficiency. The results suggest that deploying networks using mmWave frequencies with the same parameters as the 2.1 GHz system does not improve the overall performance, but throughput improves when a 200 MHz bandwidth is allocated. The use of antenna masking with downtilting improves the gains in all three systems. The conclusion drawn is that if all other factors are the same, mmWave systems can be installed in the same site locations as 2.1 GHz systems. However, to achieve better performance, some significant modifications would need to be considered, such as the use of antenna arrays and beam steering techniques. The simulations consider outdoor users only; indoor users are excluded. The parameters in a real network deployment might differ and the results could change, which in turn could change the performance of the system.
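
    A first-order sense of why a 28 GHz deployment needs extra bandwidth or antenna gain to match a 2.1 GHz one can be obtained from free-space path loss and Shannon capacity alone; the Python sketch below ignores interference, antenna arrays and indoor penetration, and the distance, transmit power and noise figure are illustrative assumptions rather than the simulation settings used in the thesis.

        # Illustrative first-order comparison of 2.1 GHz and 28 GHz links using
        # free-space path loss and Shannon capacity; values are rough assumptions.
        import math

        def fspl_db(d_m, f_hz):
            """Free-space path loss in dB for distance d_m (m) and frequency f_hz (Hz)."""
            return 20 * math.log10(d_m) + 20 * math.log10(f_hz) - 147.55

        def shannon_mbps(tx_dbm, pl_db, bw_hz, noise_figure_db=7.0):
            """Single-link Shannon throughput in Mbit/s (interference ignored)."""
            noise_dbm = -174 + 10 * math.log10(bw_hz) + noise_figure_db  # thermal noise floor
            snr_db = tx_dbm - pl_db - noise_dbm
            return bw_hz * math.log2(1 + 10 ** (snr_db / 10)) / 1e6

        if __name__ == "__main__":
            d_m, tx_dbm = 200.0, 43.0   # assumed: 200 m link, 43 dBm transmit power
            for f_hz, bw_hz in ((2.1e9, 20e6), (28e9, 20e6), (28e9, 200e6)):
                pl = fspl_db(d_m, f_hz)
                print(f"{f_hz / 1e9:4.1f} GHz, {bw_hz / 1e6:5.0f} MHz: "
                      f"PL={pl:.1f} dB, throughput≈{shannon_mbps(tx_dbm, pl, bw_hz):.0f} Mbit/s")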