
    A Robust Maximum Likelihood Scheme for PSS Detection and Integer Frequency Offset Recovery in LTE Systems

    Before establishing a communication link in a cellular network, the user terminal must activate a synchronization procedure called initial cell search in order to acquire specific information about the serving base station. To accomplish this task, the primary synchronization signal (PSS) and secondary synchronization signal (SSS) are periodically transmitted in the downlink of a long term evolution (LTE) network. Since SSS detection can be performed only after successful identification of the primary signal, in this work, we present a novel algorithm for joint PSS detection, sector index identification, and integer frequency offset (IFO) recovery in an LTE system. The proposed scheme relies on the maximum likelihood (ML) estimation criterion and exploits a suitable reduced-rank representation of the channel frequency response, which proves robust against multipath distortions and residual timing errors. We show that a number of PSS detection methods that were originally introduced through heuristic reasoning can be derived from our ML framework by simply selecting an appropriate model for the channel gains over the PSS subcarriers. Numerical simulations indicate that the proposed scheme can be effectively applied in the presence of severe multipath propagation, where existing alternatives provide unsatisfactory performance.
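
    As a rough illustration of the kind of search the abstract describes, the sketch below generates the three standard LTE PSS Zadoff-Chu sequences and jointly picks a sector index and IFO hypothesis by maximising a simple differential correlation metric. It is not the reduced-rank ML metric of the paper; the function names, the IFO search range, and the assumed placement of the PSS subcarriers around DC are illustrative assumptions.

```python
import numpy as np

def pss_sequence(root):
    """Length-62 Zadoff-Chu PSS sequence as defined in 3GPP TS 36.211.
    root is 25, 29 or 34 for sector index N_ID^(2) = 0, 1, 2."""
    n = np.arange(31)
    first = np.exp(-1j * np.pi * root * n * (n + 1) / 63)
    n = np.arange(31, 62)
    second = np.exp(-1j * np.pi * root * (n + 1) * (n + 2) / 63)
    return np.concatenate([first, second])

ROOTS = (25, 29, 34)  # sector indices 0, 1, 2

def detect_pss_and_ifo(rx_grid, ifo_range=range(-5, 6)):
    """Jointly pick the sector index and integer frequency offset (IFO)
    that maximise a non-coherent differential correlation metric.

    rx_grid : 1-D array of received frequency-domain samples (FFT output),
              long enough that the 62 PSS subcarriers can be read out at
              any hypothesised IFO shift around the assumed DC position.
    Returns (sector_index, ifo, metric) of the best hypothesis."""
    centre = len(rx_grid) // 2          # assumed nominal DC position
    best = (None, None, -np.inf)
    for q, root in enumerate(ROOTS):
        d = pss_sequence(root)
        for ifo in ifo_range:
            # PSS occupies 62 subcarriers around DC (DC itself excluded);
            # an IFO of `ifo` subcarriers shifts that window
            idx = centre + ifo + np.concatenate([np.arange(-31, 0), np.arange(1, 32)])
            y = rx_grid[idx]
            # differential metric: robust to an unknown, slowly varying
            # channel across the PSS subcarriers
            metric = np.abs(np.sum(y[1:] * np.conj(y[:-1]) * np.conj(d[1:]) * d[:-1]))
            if metric > best[2]:
                best = (q, ifo, metric)
    return best
```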

    ML estimation of timing, integer frequency and primary sequence index in LTE systems

    This paper addresses the problem of maximum likelihood (ML) estimation of slot timing, integer carrier frequency offset and primary sequence index for the downlink of Long Term Evolution (LTE) systems. The proposed algorithm is designed to exploit the knowledge of the pilot Zadoff-Chu sequence embedded in the primary synchronization signal (PSS). The estimation process is affected by the presence of a large set of nuisance parameters, which need to be estimated jointly with the parameters of interest. As a consequence, the exact ML solution is extremely complex, and we have developed a suboptimal algorithm designed to provide a good balance between estimation accuracy and complexity. In particular, a key finding is a reduced-rank representation of the channel frequency response, which is required by the ML estimator but is not available at the receiver before synchronization has been acquired. Compared to existing alternatives, the resulting scheme exhibits improved accuracy in the estimation of all three parameters of interest.
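
    The coarse timing and primary-sequence-index search can likewise be pictured with a plain matched-filter sketch, far simpler than the ML estimator with its reduced-rank channel representation described above; the FFT size, function names and the normalised correlation metric are assumptions made for illustration only.

```python
import numpy as np

N_FFT = 128          # small FFT size, enough for the 62 PSS subcarriers
ROOTS = (25, 29, 34) # Zadoff-Chu roots for primary sequence index 0, 1, 2

def pss_time_domain(root, n_fft=N_FFT):
    """Time-domain replica of one PSS OFDM symbol (no cyclic prefix)."""
    n = np.arange(31)
    d = np.concatenate([
        np.exp(-1j * np.pi * root * n * (n + 1) / 63),
        np.exp(-1j * np.pi * root * (n + 31 + 1) * (n + 31 + 2) / 63),
    ])
    grid = np.zeros(n_fft, dtype=complex)
    grid[1:32] = d[31:]           # subcarriers +1 .. +31
    grid[-31:] = d[:31]           # subcarriers -31 .. -1
    return np.fft.ifft(grid)

def coarse_timing_and_index(rx, n_fft=N_FFT):
    """Sliding correlation of the received samples against the three PSS
    replicas; returns (timing_estimate, primary_sequence_index)."""
    best = (0, 0, -np.inf)
    for q, root in enumerate(ROOTS):
        ref = pss_time_domain(root, n_fft)
        for start in range(len(rx) - n_fft):
            window = rx[start:start + n_fft]
            # normalised correlation magnitude, insensitive to received power
            metric = np.abs(np.vdot(ref, window)) / (np.linalg.norm(window) + 1e-12)
            if metric > best[2]:
                best = (start, q, metric)
    return best[0], best[1]
```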

    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
    Wireless communication systems have undergone rapid development in recent years. Building on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands on capacity, coverage, and data rate. To achieve this goal, LTE adopts several key techniques, such as Multiple-Input and Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). However, these techniques have some inherent drawbacks. A direct-conversion architecture is adopted to provide a simple, low-cost transmitter solution, but I/Q imbalance arises from the imperfection of circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and the doubly selective channel can severely degrade receiver performance. In addition, the deployment of HetNets, which permit the co-existence of macro and pico cells, incurs inter-cell interference for cell-edge users. Together, these factors result in significant degradation of system performance. This dissertation investigates key techniques that can mitigate the above problems. First, I/Q imbalance in the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. It combats FD I/Q imbalance by using the internal diode of the transmitter and a specially designed test signal, without any external calibration instruments or an internal low-IF feedback path. Instrument test results show that the proposed scheme can enhance signal quality by 10 dB in terms of image rejection ratio (IRR). In addition to I/Q imbalance, the system suffers from CFO, SFO and the frequency-time selective channel. To mitigate this, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed to cope with the CFO, SFO and doubly selective channel. The algorithm first estimates the CFO and channel frequency response (CFR) in a coarse estimation stage, with the help of hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial interpolation channel estimation, combined with a low-complexity DFE scheme based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel; this provides prior knowledge for the MMSE-DFE and for adaptive modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves system performance. To speed up the algorithm verification process, an FPGA-based co-simulation is developed. Inter-cell interference caused by the co-existence of macro and pico cells also has a large impact on system performance. Although the almost blank subframe (ABS) has been proposed to mitigate this problem, the residual control signals in the ABS still inevitably cause interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed.
    First, the timing and carrier frequency offset of the interference signal are estimated and compensated by exploiting the cross-correlation properties of the synchronization signal. Afterwards, the reference signal is generated locally and the channel response is estimated by making use of channel statistics. The interference signal is then reconstructed from the estimated channel, timing and carrier frequency offset, and the interference is mitigated by subtracting this estimate and by LLR puncturing. According to simulation results for different channel scenarios, the block error rate (BLER) performance is notably improved by this algorithm. The proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems, and the simulations and measurements show that good overall system performance can be achieved.
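
    The hybrid receiver above builds on classical timing and frequency synchronization. As a minimal sketch of that classical building block only (not the thesis's hybrid scheme, its MMSE-DFE, or the CRS cancellation stage), a cyclic-prefix correlation estimator of coarse symbol timing and fractional CFO might look as follows; the function name and the simplified, energy-normalised metric are assumptions.

```python
import numpy as np

def cp_sync(rx, n_fft, n_cp):
    """Coarse OFDM symbol timing and fractional CFO from cyclic-prefix
    correlation (a simplified van de Beek-style estimator).

    rx    : complex baseband samples containing at least one CP + symbol
    n_fft : FFT size
    n_cp  : cyclic prefix length in samples
    Returns (timing_estimate, cfo_in_subcarrier_spacings)."""
    best_theta, best_metric, best_gamma = 0, -np.inf, 0j
    for theta in range(len(rx) - n_fft - n_cp + 1):
        head = rx[theta:theta + n_cp]                   # CP samples
        tail = rx[theta + n_fft:theta + n_fft + n_cp]   # their copies, one FFT length later
        gamma = np.sum(head * np.conj(tail))            # CP correlation
        energy = 0.5 * np.sum(np.abs(head) ** 2 + np.abs(tail) ** 2)
        metric = np.abs(gamma) / (energy + 1e-12)       # simplified, energy-normalised metric
        if metric > best_metric:
            best_theta, best_metric, best_gamma = theta, metric, gamma
    # the CP correlation phase rotates by -2*pi*CFO over one FFT length
    cfo = -np.angle(best_gamma) / (2 * np.pi)
    return best_theta, cfo
```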

    System capacity enhancement for 5G network and beyond

    A thesis submitted to the University of Bedfordshire, in fulfilment of the requirements for the degree of Doctor of Philosophy.
    The demand for wireless digital data is increasing dramatically year over year. Wireless devices such as laptops, smartphones, tablets, smart watches and virtual reality headsets are becoming an important part of people’s daily lives. The number of mobile devices is growing very quickly, as are the requirements placed on them, such as super-high-resolution image and video, fast download speeds, very low latency and high reliability, all of which pose challenges to existing wireless communication networks. Unlike the previous four generations of communication networks, the fifth-generation (5G) wireless network encompasses many technologies, such as millimetre-wave communication, massive multiple-input multiple-output (MIMO), visible light communication (VLC) and the heterogeneous network (HetNet). Although 5G has not yet been standardised, these technologies have been studied in both academia and industry, and the goal of this research is to enhance and improve system capacity for 5G networks and beyond by studying key problems in these technologies and providing effective solutions from the perspective of system implementation and hardware impairments. The key problems studied in this thesis include interference cancellation in HetNets, impairment calibration for massive MIMO, channel state estimation for VLC, and a low-latency parallel turbo decoding technique. Firstly, inter-cell interference in HetNets is studied and a cell-specific reference signal (CRS) interference cancellation method is proposed to mitigate the performance degradation in enhanced inter-cell interference coordination (eICIC). This method takes the carrier frequency offset (CFO) and timing offset (TO) of the user’s received signal into account. By reconstructing the interfering signal and then cancelling it, the capacity of the HetNet is enhanced. Secondly, for massive MIMO systems, the radio frequency (RF) impairments of the hardware degrade beamforming performance. When operated in time division duplex (TDD) mode, a massive MIMO system relies on the reciprocity of the channel, which can be broken by the transmitter and receiver RF impairments. Impairment calibration is studied and a closed-loop reciprocity calibration method is proposed in this thesis. A test device (TD) is introduced that estimates the transmitters’ impairments over the air and feeds the results back to the base station via the Internet, while uplink pilots sent by the TD assist the BS receivers’ impairment estimation. With both the uplink and downlink impairment estimates, the reciprocity calibration coefficients can be obtained. The performance of the proposed method is evaluated by computer simulation and lab experiment. Channel coding is an essential part of a wireless communication system, helping to combat noise and ensure correct information delivery. Turbo codes are among the most reliable codes and have been used in many standards such as WiMAX and LTE. However, the decoding process of turbo codes is time-consuming, and the decoding latency must be reduced to meet the requirements of future networks. A reverse interleave address generator is proposed that can reduce the decoding time, and a low-latency parallel turbo decoder has been implemented on an FPGA platform.
The simulation and experiment results prove the effectiveness of the address generator and show that there is a trade-off between latency and throughput under limited hardware resources. Apart from the above contributions, this thesis also investigates multi-user precoding for MIMO VLC systems. As a green and secure technology, VLC is attracting more and more attention and could become part of the 5G network, especially for indoor communication. In indoor scenarios, the MIMO VLC channel can easily be ill-conditioned, so it is important to study the impact of the channel state on precoding performance. A channel state estimation method is proposed based on the signal-to-interference-plus-noise ratio (SINR) of the users’ received signals. Simulation results show that it can enhance the capacity of the indoor MIMO VLC system.
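
    The closed-loop reciprocity calibration idea can be illustrated with a small, noise-free numerical sketch: a test device measures the downlink from each BS antenna and sends uplink pilots, and the ratio of the two estimates yields per-antenna calibration coefficients that make the measured uplink channel usable for downlink beamforming. This is only a toy model of the general principle under idealised assumptions; all variable names and the noise-free channel model are illustrative, not the thesis's protocol.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 8  # number of base-station antennas

# unknown RF impairments (complex gains) of the BS transmit/receive chains,
# of the test device (TD) and of a user terminal
t_bs, r_bs = rng.normal(size=(2, M)) + 1j * rng.normal(size=(2, M))
t_td, r_td = rng.normal(size=2) + 1j * rng.normal(size=2)
t_ue, r_ue = rng.normal(size=2) + 1j * rng.normal(size=2)

# reciprocal propagation channels BS<->TD and BS<->user
c_td = rng.normal(size=M) + 1j * rng.normal(size=M)
c_ue = rng.normal(size=M) + 1j * rng.normal(size=M)

# calibration phase: the TD measures the downlink from each BS antenna and
# sends uplink pilots, giving the BS both directions of the BS<->TD link
h_dl_td = t_bs * c_td * r_td          # estimated at the TD, fed back to the BS
h_ul_td = t_td * c_td * r_bs          # estimated at the BS from the TD pilots
calib = h_dl_td / h_ul_td             # per-antenna calibration coefficients

# operation phase: only the uplink user channel is measured in TDD
h_ul_ue = t_ue * c_ue * r_bs
h_dl_ue = t_bs * c_ue * r_ue          # what the BS would like to know

# applying the coefficients turns the uplink estimate into something
# proportional to the true downlink channel; the remaining common scalar
# is identical on every antenna and therefore harmless for beamforming
ratio = (calib * h_ul_ue) / h_dl_ue
print(np.allclose(ratio, ratio[0]))   # True: same scalar across antennas
```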

    Lightly synchronized Multipacket Reception in Machine-Type Communications Networks

    Machine Type Communication (MTC) applications were designed to monitor and control elements of our surroundings and environment. MTC applications have a different set of requirements compared to traditional communication devices, with Machine to Machine (M2M) data being mostly short, asynchronous and bursty, and sometimes requiring end-to-end delays below 1 ms. With the growth of MTC, the new generation of mobile communications has to provide very different types of services over the same network: it must supply connectivity to the user who simply wants to download a video or use social media while at the same time supporting MTC with completely different requirements, without degrading either experience. The challenges associated with the implementation of MTC require disruptive changes at the Physical (PHY) and Medium Access Control (MAC) layers that lead to better use of the available spectrum. The orthogonality and synchronization requirements of the PHY layer of the current Long Term Evolution Advanced (LTE-A) radio access network (based on OFDM and Single Carrier Frequency Domain Equalization (SC-FDE)) are obstacles for this new 5th Generation (5G) architecture. Generalized Frequency Division Multiplexing (GFDM) and other modulation techniques were proposed as candidates for the 5G PHY layer; however, they also suffer from visible degradation when the transmitter and receiver are not synchronized, leading to poor performance when collisions occur under an asynchronous MAC layer. This dissertation addresses the requirements of M2M traffic at the MAC layer, applying multipacket reception (MPR) techniques to handle the bursty nature of the traffic, together with synchronization tones and optimized back-off approaches to reduce the delay. It proposes a new MAC protocol and analyses its performance analytically considering an SC-FDE modulation. The models are validated using a system-level cross-layer simulator developed in MATLAB, which implements the MAC protocol and applies PHY layer performance models. The results show that the MAC’s latency depends mainly on the number of users and the load of each user, and can be controlled using these two parameters.
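
    To make the final observation concrete, the toy slotted-access simulation below models a receiver that can decode a limited number of simultaneous packets (MPR) and applies a random back-off to the rest, showing how the mean packet delay is driven by the number of users and the per-user load. It is a deliberately simplified stand-in, not the dissertation's MAC protocol or its SC-FDE PHY model; all parameter names and defaults are assumptions.

```python
import numpy as np

def simulate_mpr_mac(n_users=20, load=0.05, mpr=4, n_slots=20000,
                     max_backoff=8, seed=0):
    """Toy slotted MAC with multipacket reception (MPR).

    Each user generates a new packet per slot with probability `load`
    (at most one outstanding packet per user).  Up to `mpr` simultaneous
    packets are decoded per slot; the rest back off for a random number
    of slots.  Returns the mean packet delay in slots."""
    rng = np.random.default_rng(seed)
    wait = np.full(n_users, -1)     # -1: idle, 0: ready to transmit, >0: backing off
    age = np.zeros(n_users, dtype=int)
    delays = []
    for _ in range(n_slots):
        # new arrivals at idle users
        new = (wait == -1) & (rng.random(n_users) < load)
        wait[new], age[new] = 0, 0
        age[wait >= 0] += 1
        # count down back-off timers
        wait[wait > 0] -= 1
        # every user whose timer reached zero transmits in this slot
        tx = np.flatnonzero(wait == 0)
        if tx.size:
            decoded = rng.permutation(tx)[:mpr]      # MPR: up to `mpr` successes
            delays.extend(age[decoded])
            wait[decoded] = -1
            failed = np.setdiff1d(tx, decoded)
            wait[failed] = rng.integers(1, max_backoff + 1, size=failed.size)
    return float(np.mean(delays)) if delays else float("nan")

# latency grows with both the number of users and the per-user load
for n in (10, 20, 40):
    print(n, simulate_mpr_mac(n_users=n))
```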