157 research outputs found

    Mobility Analysis and Management for Heterogeneous Networks

    Global mobile data traffic has increased tremendously over the last decade due to technological advances in smartphones. Their pervasive use and bandwidth-intensive applications will saturate current 4G technologies, motivating concrete research to sustain the mounting data traffic demand. In this regard, network densification has been shown to be a promising direction for coping with the capacity demands of future 5G wireless networks. The basic idea is to deploy several low-power radio access nodes, called small cells, closer to the users within the existing large radio footprint of macrocells; together these constitute a heterogeneous network (HetNet). However, operators face many challenges with dense HetNet deployments. Mobility management becomes a challenging task because frequent handovers are triggered as a user moves across the network's coverage areas. Small cells serving few users can also significantly increase energy consumption; intelligently switching them to low-energy modes, or turning them off without seriously degrading user performance, is desirable to improve energy savings in HetNets. This dynamic power-level switching in the small cells, however, may cause unnecessary handovers, so it is important to ensure energy savings without compromising handover performance. Finally, mobility management schemes must be evaluated in real network deployments in order to find any problems affecting users' quality of service (QoS). The research presented in this dissertation aims to address these challenges. First, to tackle the mobility management issue, we develop a closed-form analytical model to study handover and ping-pong performance as a function of small-cell network parameters, and verify its performance using simulations.
    Secondly, we develop a fuzzy-logic-based game-theoretic framework to examine energy efficiency improvements in HetNets. In addition, we design fuzzy inference rules for handover decisions, and target base station selection is performed through a fuzzy ranking technique to enhance mobility robustness while also considering energy and spectral efficiency. Finally, we evaluate mobility performance by carrying out drive tests in an existing 4G Long Term Evolution (LTE) network deployment using software-defined radios (SDRs). This helps to obtain network quality information in order to find any problems affecting the QoS of the users.
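    The handover behaviour studied in this dissertation is commonly modelled around the LTE A3 measurement event, in which a handover is triggered when the target cell's RSRP exceeds the serving cell's by a hysteresis margin for a time-to-trigger (TTT) window. The sketch below is an illustrative implementation of that generic mechanism, not the author's analytical model; the function name and parameter defaults are hypothetical.

```python
import numpy as np

def a3_handover_decision(rsrp_serving, rsrp_target, hysteresis_db=3.0,
                         ttt_samples=4):
    """Evaluate the LTE A3 event over a sequence of RSRP measurements (dBm).

    A handover is triggered once the target cell exceeds the serving cell
    by the hysteresis margin for `ttt_samples` consecutive measurements
    (a discretized time-to-trigger window).
    """
    counter = 0
    for k, (s, t) in enumerate(zip(rsrp_serving, rsrp_target)):
        if t > s + hysteresis_db:
            counter += 1
            if counter >= ttt_samples:
                return k  # handover triggered at this sample index
        else:
            counter = 0  # entering condition broken: restart the TTT window
    return None  # no handover triggered
```

    Larger hysteresis and TTT values suppress ping-pong handovers between small cells at the cost of delayed handover, which is exactly the trade-off the closed-form model captures.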

    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy. Wireless communication systems have undergone fast development in recent years. Based on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands on capacity, coverage, and data rate. To achieve this goal, several key techniques have been adopted by LTE, such as Multiple-Input Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). However, these techniques have some inherent drawbacks. Direct conversion architecture is adopted to provide a simple, low-cost transmitter solution, but the problem of I/Q imbalance arises due to the imperfection of circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and the doubly selective channel can severely deteriorate receiver performance. In addition, the deployment of HetNets, which permit the co-existence of macro and pico cells, incurs inter-cell interference for cell-edge users. Together these factors significantly degrade system performance. This dissertation investigates key techniques that can mitigate the above problems. First, I/Q imbalance in the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. It combats FD I/Q imbalance by using the internal diode of the transmitter and a specially designed test signal, without any external calibration instruments or an internal low-IF feedback path. Instrument test results show that the proposed scheme can enhance signal quality by 10 dB in terms of image rejection ratio (IRR).
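    The IRR metric quoted above can be made concrete with the standard frequency-flat I/Q imbalance model y = mu*x + nu*conj(x), where the image term nu grows with the gain and phase mismatch between the I and Q branches. The sketch below is a minimal illustration of that textbook model and of how IRR is computed from it; it does not reproduce the thesis's frequency-dependent compensation scheme, and the function names and mismatch values are hypothetical.

```python
import numpy as np

def apply_iq_imbalance(x, gain_mismatch=1.05, phase_mismatch_rad=0.05):
    """Apply a frequency-flat transmitter I/Q imbalance: y = mu*x + nu*conj(x)."""
    g, phi = gain_mismatch, phase_mismatch_rad
    mu = 0.5 * (1 + g * np.exp(1j * phi))  # desired-signal coefficient
    nu = 0.5 * (1 - g * np.exp(1j * phi))  # image coefficient
    return mu * x + nu * np.conj(x)

def image_rejection_ratio_db(gain_mismatch, phase_mismatch_rad):
    """IRR = |mu|^2 / |nu|^2 in dB; larger mismatch -> lower IRR."""
    g, phi = gain_mismatch, phase_mismatch_rad
    mu = 0.5 * (1 + g * np.exp(1j * phi))
    nu = 0.5 * (1 - g * np.exp(1j * phi))
    return 10 * np.log10(abs(mu) ** 2 / abs(nu) ** 2)
```

    Applying the imbalance to a single complex tone produces a mirror-frequency image whose power relative to the tone is exactly the (negated) IRR, which is what the compensation scheme improves by about 10 dB.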
    In addition to I/Q imbalance, the system suffers from CFO, SFO and the frequency-time selective channel. To mitigate these, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed to cope with the CFO, SFO and doubly selective channel. The algorithm first estimates the CFO and channel frequency response (CFR) in a coarse estimation stage, with the help of hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial interpolation channel estimation, combined with a low-complexity DFE scheme based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel. This provides prior knowledge for the MMSE-DFE and for adaptive modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves system performance. In order to speed up the algorithm verification process, an FPGA-based co-simulation is developed. Inter-cell interference caused by the co-existence of macro and pico cells has a large impact on system performance. Although the almost blank subframe (ABS) has been proposed to mitigate this problem, the residual control signals in the ABS still inevitably cause interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed. First, the timing and carrier frequency offset of the interference signal are compensated by utilizing the cross-correlation properties of the synchronization signal. Afterwards, the reference signal is generated locally and the channel response is estimated by making use of channel statistics. Then, the interference signal is reconstructed based on the previous estimates of the channel, timing and carrier frequency offset.
    The interference is mitigated by subtracting the estimate of the interference signal and by LLR puncturing. According to simulation results for different channel scenarios, the block error rate (BLER) performance is notably improved by this algorithm. The proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems. The simulations and measurements show that good overall system performance can be achieved.
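    The reconstruct-and-subtract step at the heart of the CRS cancellation can be sketched in a few lines: because the interfering cell's reference symbols are known once its cell ID is detected, the interferer's channel can be estimated on the CRS resource elements and the interference regenerated and removed. This is a deliberately simplified, frequency-flat least-squares version of that idea, not the thesis's algorithm (which also handles timing/CFO compensation and channel statistics); all names are hypothetical.

```python
import numpy as np

def cancel_crs_interference(rx, crs_symbols, crs_positions):
    """Reconstruct-and-subtract cancellation of a known reference signal.

    rx            : received frequency-domain symbols (1-D complex array).
    crs_symbols   : locally regenerated reference symbols of the interferer.
    crs_positions : indices of rx occupied by those reference symbols.

    A least-squares estimate of the interferer's (assumed flat) channel is
    formed on the CRS positions, then the interference is reconstructed
    and subtracted from the received signal.
    """
    y = rx[crs_positions]
    # LS channel estimate on the CRS resource elements
    h_hat = np.vdot(crs_symbols, y) / np.vdot(crs_symbols, crs_symbols)
    cleaned = rx.copy()
    cleaned[crs_positions] -= h_hat * crs_symbols
    return cleaned, h_hat
```

    In the noiseless, flat-channel case the residual on the CRS positions is driven to zero; in practice the cancellation quality is limited by the accuracy of the channel, timing and CFO estimates, which is why those are compensated first.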

    System capacity enhancement for 5G network and beyond

    A thesis submitted to the University of Bedfordshire, in fulfilment of the requirements for the degree of Doctor of Philosophy. The demand for wireless digital data is increasing dramatically year over year. Wireless communication devices such as laptops, smartphones, tablets, smartwatches and virtual reality headsets are becoming an important part of people's daily lives. The number of mobile devices is growing very fast, as are the requirements placed on them, such as super-high-resolution image/video, fast download speeds, very low latency and high reliability, which raise challenges for the existing wireless communication networks. Unlike the previous four generations, the fifth-generation (5G) wireless communication network encompasses many technologies, such as millimetre-wave communication, massive multiple-input multiple-output (MIMO), visible light communication (VLC), and the heterogeneous network (HetNet). Although 5G has not yet been standardised, these technologies have been studied in both academia and industry. The goal of this research is to enhance system capacity for 5G networks and beyond by studying key problems in the above technologies, and providing effective solutions, from the perspective of system implementation and hardware impairments. The key problems studied in this thesis include interference cancellation in HetNets, impairment calibration for massive MIMO, channel state estimation for VLC, and a low-latency parallel Turbo decoding technique. First, inter-cell interference in HetNets is studied and a cell-specific reference signal (CRS) interference cancellation method is proposed to mitigate the performance degradation in enhanced inter-cell interference coordination (eICIC). This method takes the carrier frequency offset (CFO) and timing offset (TO) of the user's received signal into account.
    By reconstructing the interfering signal and cancelling it afterwards, the capacity of the HetNet is enhanced. Secondly, for massive MIMO systems, the radio frequency (RF) impairments of the hardware degrade the beamforming performance. When operated in time division duplex (TDD) mode, a massive MIMO system relies on the reciprocity of the channel, which can be broken by the transmitter and receiver RF impairments. Impairment calibration has been studied and a closed-loop reciprocity calibration method is proposed in this thesis. A test device (TD) is introduced that can estimate the transmitters' impairments over-the-air and feed the results back to the base station via the Internet. The uplink pilots sent by the TD assist the BS receivers' impairment estimation. With both the uplink and downlink impairment estimates, the reciprocity calibration coefficients can be obtained. The performance of the proposed method is evaluated by computer simulation and lab experiments. Channel coding is an essential part of a wireless communication system, helping to combat noise and deliver information correctly. Turbo codes are among the most reliable codes and have been used in many standards such as WiMAX and LTE. However, the decoding process of turbo codes is time-consuming, and the decoding latency must be reduced to meet the requirements of future networks. A reverse interleave address generator is proposed that can reduce the decoding time, and a low-latency parallel turbo decoder has been implemented on an FPGA platform. The simulation and experiment results prove the effectiveness of the address generator and show that there is a trade-off between latency and throughput under limited hardware resources. Apart from the above contributions, this thesis also investigates multi-user precoding for MIMO VLC systems.
    As a green and secure technology, VLC is attracting more and more attention and could become part of the 5G network, especially for indoor communication. In indoor scenarios, the MIMO VLC channel can easily be ill-conditioned. Hence, it is important to study the impact of the channel state on precoding performance. A channel state estimation method is proposed based on the signal-to-interference-plus-noise ratio (SINR) of the users' received signals. Simulation results show that it can enhance the capacity of indoor MIMO VLC systems.
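    The interleave address generation that the parallel turbo decoder above must perform can be illustrated with the LTE quadratic permutation polynomial (QPP) interleaver, pi(i) = (f1*i + f2*i^2) mod K, which is usually computed recursively so that no multiplications are needed per address. The sketch below shows that standard recursive form (forward direction only, as a simplified stand-in for the thesis's reverse address generator); the parameters f1 = 3, f2 = 10 are, to the best of our knowledge, the 3GPP TS 36.212 values for block length K = 40.

```python
def qpp_interleaver_addresses(K, f1, f2):
    """Generate LTE turbo-code QPP interleaver addresses recursively.

    pi(i) = (f1*i + f2*i^2) mod K, computed without multiplications via
    pi(i+1) = (pi(i) + g(i)) mod K,  g(i+1) = (g(i) + 2*f2) mod K,
    with pi(0) = 0 and g(0) = (f1 + f2) mod K.
    """
    addresses = []
    pi, g = 0, (f1 + f2) % K
    for _ in range(K):
        addresses.append(pi)
        pi = (pi + g) % K      # next interleaved address
        g = (g + 2 * f2) % K   # update the first-difference term
    return addresses
```

    Because each address depends only on the previous one and a running increment, the generator maps naturally onto a small hardware datapath, which is why address generation, rather than the permutation itself, dominates the latency budget of parallel decoders.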

    Algorithm-Architecture Co-Design for Digital Front-Ends in Mobile Receivers

    The methodology behind this work has been to use the concept of algorithm-hardware co-design to achieve efficient solutions for the digital front-end in mobile receivers. It has been shown that, by considering algorithms and hardware architectures together, more efficient solutions can be found; that is, efficient with respect to some design measure. In this thesis the main focus has been placed on two such measures: first, reduced-complexity algorithms that lower energy consumption at limited performance degradation; second, handling the increasing number of wireless standards that should preferably run on the same hardware platform. To perform this task it is crucial to understand both sides of the table, i.e., both the algorithms and concepts of wireless communication and the implications arising on the hardware architecture. It is easier to handle the high complexity by separating those disciplines through layered abstraction. However, this representation is imperfect, since many interconnected "details" belonging to different layers are lost in the attempt to manage the complexity. This results in poor implementations, and the design of mobile terminals is no exception. Wireless communication standards are often designed based on mathematical algorithms with theoretical boundaries, with little consideration of actual implementation constraints such as energy consumption and silicon area. This thesis does not try to remove the layered abstraction model, given its undeniable advantages, but rather recovers those cross-layer "details" that went missing during the abstraction. This is done in three ways. In the first part, the cross-layer optimization is carried out from the algorithm perspective. Important circuit design parameters, such as quantization, are taken into consideration when designing the algorithm for OFDM symbol timing, CFO, and SNR estimation with a single bit, namely, the Sign-Bit.
    Proof-of-concept circuits were fabricated and showed high potential for low-end receivers. In the second part, the cross-layer optimization is approached from the opposite side, i.e., the hardware-architectural side. An SDR architecture is known for its flexibility and scalability over many applications. In this work a filtering application is mapped into software instructions on the SDR architecture in order to make filtering-specific modules redundant and thus save silicon area. In the third and last part, the optimization is done from an intermediate point within the algorithm-architecture spectrum. Here, a heterogeneous architecture with a combination of highly efficient and highly flexible modules is used to accomplish initial synchronization in at least two concurrent OFDM standards. A demonstrator was built, capable of performing synchronization in any two standards, including LTE, WiFi, and DVB-H.
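    Single-bit (Sign-Bit) estimation as described above can be illustrated with a CFO estimator that quantizes each I/Q sample to its sign before correlating the two identical halves of a repeated preamble, in the style of Schmidl & Cox. This is a minimal floating-point sketch of the principle, not the fabricated circuit; the function name and preamble construction are hypothetical.

```python
import numpy as np

def sign_bit_cfo_estimate(rx, half_len):
    """Fractional CFO estimate (cycles/sample) from a repeated preamble,
    using only the sign bits of the received I/Q samples.

    rx       : complex baseband samples; the preamble starts at index 0 and
               its second half repeats the first (Schmidl & Cox style).
    half_len : length of each identical half; valid only while the total
               rotation 2*pi*cfo*half_len stays below pi (no phase wrap).
    """
    # 1-bit quantization of I and Q, mimicking a Sign-Bit front-end
    q = np.sign(rx.real) + 1j * np.sign(rx.imag)
    # correlate the two halves; the CFO shows up as a common phase rotation
    corr = np.vdot(q[:half_len], q[half_len:2 * half_len])
    return np.angle(corr) / (2 * np.pi * half_len)
```

    The quantization costs some estimation variance, but the hardware needs only sign comparisons and counters instead of multipliers, which is the complexity/performance trade the Sign-Bit approach exploits.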

    D4.3 Final Report on Network-Level Solutions

    Research activities in METIS reported in this document focus on proposing solutions to the network-level challenges of future wireless communication networks. A large variety of scenarios is considered and a set of technical concepts is proposed to serve the needs envisioned for 2020 and beyond. This document provides the final findings on several network-level aspects and groups of solutions that are considered essential for designing future 5G systems. Specifically, it elaborates on: interference management and resource allocation schemes; mobility management and robustness enhancements; context-aware approaches; D2D and V2X mechanisms; technology components focused on clustering; and dynamic reconfiguration enablers. These novel network-level technology concepts are evaluated against the requirements defined by METIS for future 5G systems. Moreover, functional enablers which can support the solutions mentioned above are proposed. We find that the network-level solutions and technology components developed during the course of METIS complement the lower-layer technology components and thereby effectively contribute to meeting 5G requirements and targets. Aydin, O.; Valentin, S.; Ren, Z.; Botsov, M.; Lakshmana, TR.; Sui, Y.; Sun, W.... (2015). D4.3 Final Report on Network-Level Solutions. http://hdl.handle.net/10251/7675

    Building upon NB-IoT networks : a roadmap towards 5G new radio networks

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology standardized by the 3rd Generation Partnership Project (3GPP) and based on long-term evolution (LTE) functionalities. NB-IoT has attracted significant interest from the research community due to its support for massive machine-type communication (mMTC) and various IoT use cases that have stringent specifications in terms of connectivity, energy efficiency, reachability, reliability, and latency. However, as the capacity requirements for different IoT use cases continue to grow, the various functionalities of the LTE evolved packet core (EPC) system may become overloaded and inevitably suboptimal. Several research efforts are ongoing to meet these challenges; consequently, we present an overview of these efforts, mainly focusing on the Open Systems Interconnection (OSI) layers of the NB-IoT framework. We present an optimized architecture of the LTE EPC functionalities, as well as further discussion of the 3GPP NB-IoT standardization and its releases. Furthermore, the possible 5G architectural design for NB-IoT integration, the enabling technologies required for 5G NB-IoT, 5G NR coexistence with NB-IoT, and the potential architectural deployment schemes of NB-IoT with cellular networks are introduced. A description of cloud-assisted relay with backscatter communication and a comprehensive review of the technical performance properties and channel communication characteristics of NB-IoT, from the perspective of the physical (PHY) and medium-access control (MAC) layers and with a focus on 5G, are also presented. The different limitations associated with simulating these systems are discussed. The enabling market for NB-IoT, the benefits for a few use cases, and possible critical challenges related to their deployment are also included.
    Finally, present challenges and open research directions on the PHY and MAC properties, as well as a strengths, weaknesses, opportunities, and threats (SWOT) analysis of NB-IoT, are presented to foster prospective research activities.

    A Tutorial on Beam Management for 3GPP NR at mmWave Frequencies

    The millimeter wave (mmWave) frequencies offer huge bandwidths that can provide unprecedented data rates to next-generation cellular mobile terminals. However, mmWave links are highly susceptible to rapid channel variations and suffer from severe free-space path loss and atmospheric absorption. To address these challenges, base stations and mobile terminals will use highly directional antennas to achieve sufficient link budget in wide area networks. The consequence is the need for precise alignment of the transmitter and receiver beams, an operation which may increase the latency of establishing a link and has important implications for control-layer procedures such as initial access, handover and beam tracking. This tutorial provides an overview of recently proposed measurement techniques for beam and mobility management in mmWave cellular networks, and gives insights into the design of accurate, reactive and robust control schemes suitable for a 3GPP NR cellular network. We illustrate that the best strategy depends on the specific environment in which the nodes are deployed, and give guidelines to inform the optimal choice as a function of the system parameters. Comment: 22 pages, 19 figures, 10 tables; published in IEEE Communications Surveys & Tutorials. Please cite it as: M. Giordani, M. Polese, A. Roy, D. Castor and M. Zorzi, "A Tutorial on Beam Management for 3GPP NR at mmWave Frequencies," in IEEE Communications Surveys & Tutorials, vol. 21, no. 1, pp. 173-196, First quarter 201
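    The beam alignment problem the tutorial discusses reduces, in its simplest exhaustive-sweep form, to measuring every transmit/receive beam-pair from predefined codebooks and keeping the pair with the largest beamforming gain. The sketch below is a generic illustration of that baseline procedure, not a 3GPP NR implementation; the function name and DFT-codebook setup are hypothetical.

```python
import numpy as np

def exhaustive_beam_sweep(channel, tx_codebook, rx_codebook):
    """Select the best (tx, rx) beam pair by exhaustive sweep.

    channel     : complex matrix H, shape (n_rx_ant, n_tx_ant).
    tx_codebook : candidate unit-norm precoders, shape (n_tx_beams, n_tx_ant).
    rx_codebook : candidate unit-norm combiners, shape (n_rx_beams, n_rx_ant).

    Returns the index pair maximizing the beamforming gain |w^H H f|^2,
    as would be measured over sweep reference-signal transmissions.
    """
    best, best_gain = (0, 0), -np.inf
    for i, f in enumerate(tx_codebook):
        for j, w in enumerate(rx_codebook):
            gain = abs(np.vdot(w, channel @ f)) ** 2  # |w^H H f|^2
            if gain > best_gain:
                best_gain, best = gain, (i, j)
    return best, best_gain
```

    The cost of this baseline is one measurement per beam pair, which is exactly the initial-access latency that the hierarchical and context-aware schemes surveyed in the tutorial try to reduce.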

    Novel Wake-up Scheme for Energy-Efficient Low-Latency Mobile Devices in 5G Networks

    Improved mobile device battery lifetime and minimized latency are critical requirements for enhancing mobile broadband services and user experience. Long-term evolution (LTE) networks have adopted discontinuous reception (DRX) as the baseline solution for prolonging battery lifetime. However, in every DRX cycle the mobile device baseband processing unit monitors and decodes the control signaling, so every instance without any actual data allocation leads to unnecessary energy consumption. This fact, together with the long start-up and power-down times, can prevent the adoption of frequent wake-up instants, which in turn leads to considerable latency. In this work, a novel wake-up scheme is described and studied to tackle the trade-off between latency and battery lifetime in future 5G networks, seeking to facilitate an always-available experience rather than an always-on one. Analytical and simulation-based results show that the proposed scheme is a promising approach to controlling user-plane latency and energy consumption when the device is operating in power-saving mode. The aim of this article is to describe the overall wake-up system operating principle and the associated signaling methods, receiver processing solutions and essential implementation aspects. Additionally, the advantages over DRX-based systems are demonstrated through analysis of the system's energy efficiency and latency characteristics, with special emphasis on future 5G-grade mobile devices.
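    The DRX trade-off described above can be captured with a back-of-the-envelope power/latency model: under DRX the main receiver wakes every cycle whether or not data arrives, whereas with a dedicated low-power wake-up receiver the main radio powers on only when data is actually pending. The following is a rough illustrative model under stated simplifying assumptions (flat power levels, uniform packet arrivals, start-up cost folded into the on-duration); all function names and parameter values are hypothetical and not taken from the article.

```python
def drx_model(cycle_s, on_s, p_on_w, p_sleep_w):
    """Average power (W) and mean packet latency (s) for a simple DRX model.

    The device wakes for on_s seconds every cycle regardless of traffic;
    a packet arriving uniformly within a cycle waits on average half a
    cycle for the next on-duration.
    """
    avg_power = (on_s * p_on_w + (cycle_s - on_s) * p_sleep_w) / cycle_s
    avg_latency = cycle_s / 2
    return avg_power, avg_latency

def wake_up_model(cycle_s, on_s, p_on_w, p_sleep_w, p_wur_w, arrival_prob):
    """Same quantities when a low-power wake-up receiver (WUR) listens
    continuously and the main receiver powers on only when a wake-up
    signal indicates data (probability arrival_prob per cycle)."""
    avg_power = p_wur_w + p_sleep_w + arrival_prob * on_s * p_on_w / cycle_s
    # the main radio can be triggered at any time, so latency is roughly
    # just its start-up/on time rather than half a DRX cycle (assumption)
    avg_latency = on_s
    return avg_power, avg_latency
```

    With representative numbers (1.28 s cycle, 10 ms on-duration, a WUR drawing a fraction of the sleep-mode power), the wake-up scheme wins on both axes, which matches the qualitative conclusion of the article.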

    Evaluation of handover performance between 450 MHz and 2600 MHz LTE networks

    This thesis evaluates handover performance between two different LTE frequency bands, band 31 and band 38, which are operated on 450 MHz and 2600 MHz frequencies, respectively. Mobile network operators are deploying multiple LTE frequency bands within the same geographical areas in order to meet the demand created by continuously growing mobile data usage. This creates additional challenges for network design, performance optimization and mobility management. The studied bands 31 and 38 differ in their propagation characteristics as well as in their specified transmission capabilities. The bands also use different duplex methods: Frequency Division Duplex and Time Division Duplex. The performance evaluation was conducted in order to allow efficient usage of both bands. It is based on information obtained from 3GPP specifications and on laboratory measurements conducted with commercially available equipment. The current handover parameters of the studied network have been optimized for 450 MHz cells only and mostly use the default configurations introduced by the device manufacturer. This configuration is evaluated and a more suitable handover strategy is proposed. The proposed strategy is then compared with the default strategy through measurements conducted in a laboratory environment. The measurements confirm that, with proper handover parameter optimization, the 2600 MHz frequency band can be prioritized over the less capable 450 MHz band, which is likely to improve user-perceived service quality. By utilizing the collected results, the associated network operator could improve its offered services and achieve savings in network equipment costs.
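    Inter-frequency handover prioritization of the kind evaluated above is typically configured through LTE measurement event A5 (serving cell drops below one threshold while the inter-frequency target rises above another) together with a cell-individual offset that biases the comparison toward the preferred band. The sketch below is a generic illustration of the A5 entering condition in the spirit of 3GPP TS 36.331, not the thesis's measured parameter set; the threshold values used in the example are hypothetical.

```python
def a5_event(ms_dbm, mn_dbm, thresh1, thresh2, hys=2.0, ocn=0.0):
    """LTE measurement event A5 entering condition (all values in dBm/dB).

    ms_dbm : serving-cell RSRP;   mn_dbm : inter-frequency neighbour RSRP.
    Triggers when the serving cell becomes worse than thresh1 AND the
    neighbour becomes better than thresh2, with hysteresis hys applied on
    both sides and a cell-individual offset ocn biasing the neighbour
    (a positive ocn prioritizes that cell/band).
    """
    return (ms_dbm + hys < thresh1) and (mn_dbm + ocn - hys > thresh2)
```

    Setting a positive offset for 2600 MHz cells makes a moderately weaker 2600 MHz signal win the comparison against a 450 MHz serving cell, which is one way the proposed strategy can prioritize the more capable band.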