    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.

    Wireless communication systems have undergone rapid development in recent years. Building on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands on capacity, coverage, and data rate. To achieve this goal, LTE adopts several key techniques, such as Multiple-Input Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). These techniques, however, have inherent drawbacks. The direct conversion architecture, adopted to provide a simple, low-cost transmitter solution, suffers from I/Q imbalance caused by imperfect circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and a doubly selective channel can severely degrade receiver performance. In addition, the deployment of HetNet, which permits the co-existence of macro and pico cells, introduces inter-cell interference for cell-edge users. Together, these factors significantly degrade system performance. This dissertation investigates techniques that mitigate the above problems.

    First, I/Q imbalance in the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. The scheme combats FD I/Q imbalance using the transmitter's internal diode and a specially designed test signal, without any external calibration instruments or an internal low-IF feedback path. Instrument tests show that the proposed scheme improves signal quality by 10 dB in terms of image rejection ratio (IRR).

    Beyond I/Q imbalance, the system suffers from CFO, SFO, and the frequency-time selective channel. To mitigate these, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed. The algorithm first obtains coarse estimates of the CFO and the channel frequency response (CFR) using hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial-interpolation channel estimator, combined with a low-complexity DFE based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel; this provides prior knowledge for the MMSE-DFE and for adaptive modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves system performance. To speed up the algorithm verification process, an FPGA-based co-simulation is developed.
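    The coarse synchronization stage is not spelled out in the abstract, but the classical cyclic-prefix correlation estimator is a representative way to obtain a fractional CFO estimate. The sketch below (Python/NumPy, illustrative only; the function and parameter names are ours, not the thesis implementation) exploits the fact that the CP is a copy of the symbol tail, rotated by the CFO:

        import numpy as np

        def cp_cfo_estimate(rx, n_fft, n_cp):
            """Fractional CFO (in subcarrier spacings) from one OFDM symbol,
            using the correlation between the cyclic prefix and the symbol
            tail it duplicates (van de Beek-style estimator)."""
            cp = rx[:n_cp]                     # cyclic prefix samples
            tail = rx[n_fft:n_fft + n_cp]      # the samples the CP copies
            # A CFO of eps subcarrier spacings rotates the tail by
            # exp(j*2*pi*eps) relative to the CP copy.
            corr = np.vdot(cp, tail)           # sum(conj(cp) * tail)
            return np.angle(corr) / (2 * np.pi)

        # Self-test: inject a known CFO on a random OFDM symbol
        rng = np.random.default_rng(0)
        n_fft, n_cp, eps_true = 256, 32, 0.08
        sym = np.fft.ifft(rng.standard_normal(n_fft) + 1j * rng.standard_normal(n_fft))
        tx = np.concatenate([sym[-n_cp:], sym])          # prepend cyclic prefix
        rx = tx * np.exp(2j * np.pi * eps_true * np.arange(len(tx)) / n_fft)
        print(cp_cfo_estimate(rx, n_fft, n_cp))          # ~0.08

    Estimators of this family only resolve the CFO within half a subcarrier spacing; the integer part is typically recovered separately, for example from a frequency-domain preamble correlation.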
    Inter-cell interference caused by the co-existence of macro and pico cells has a large impact on system performance. Although the almost blank subframe (ABS) has been proposed to mitigate this problem, the residual control signalling in the ABS still inevitably causes interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed. First, the timing and carrier frequency offset of the interference signal are compensated by exploiting the cross-correlation properties of the synchronization signal. The reference signal is then generated locally, and the channel response is estimated using channel statistics. Next, the interference signal is reconstructed from the previous estimates of the channel, timing, and carrier frequency offset, and the interference is mitigated by subtracting this estimate and by LLR puncturing. Simulation results for different channel scenarios show that the algorithm notably improves the block error rate (BLER) performance of the signal. Overall, the proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems, and the simulations and measurements show that good overall system performance can be achieved.
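    The reconstruct-and-subtract step can be illustrated with a toy frequency-domain sketch (Python/NumPy; all names are ours, and the statistics-based channel estimator of the thesis is replaced here by simple smoothing, so this is a sketch of the idea rather than the proposed algorithm):

        import numpy as np

        def cancel_crs_interference(rx_grid, crs_pos, crs_sym, smooth_len=5):
            """Toy reconstruct-and-subtract canceller for one frequency-domain
            OFDM symbol. In an ABS the aggressor transmits (almost) only its
            CRS, so cancelling it on the CRS resource elements removes most of
            the residual interference; any remainder is left to LLR puncturing.

            rx_grid : received symbol (victim data plus aggressor CRS)
            crs_pos : subcarrier indices carrying the aggressor's CRS
            crs_sym : locally regenerated CRS symbols at those indices
            """
            # Raw least-squares estimate of the aggressor channel on its CRS REs
            h_ls = rx_grid[crs_pos] / crs_sym
            # Smooth across frequency: a crude stand-in for the statistics-based
            # estimator in the abstract. Smoothing averages out the victim's own
            # signal and the noise, so the reconstruction tracks the aggressor
            # channel instead of simply zeroing the received samples.
            kern = np.ones(smooth_len) / smooth_len
            h_hat = np.convolve(h_ls, kern, mode="same")
            # Rebuild the aggressor's CRS contribution and subtract it
            interference = np.zeros_like(rx_grid)
            interference[crs_pos] = h_hat * crs_sym
            return rx_grid - interference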

    Algorithms for channel impairment mitigation in broadband wireless communications

    Doctor of Philosophy (Ph.D.) thesis.

    Timing and Carrier Synchronization in Wireless Communication Systems: A Survey and Classification of Research in the Last 5 Years

    Timing and carrier synchronization is a fundamental requirement for any wireless communication system to work properly. Timing synchronization is the process by which a receiver node determines the correct instants of time at which to sample the incoming signal. Carrier synchronization is the process by which a receiver aligns the frequency and phase of its local carrier oscillator with those of the received signal. In this paper, we survey the literature of the last five years (2010–2014) and present a comprehensive review and classification of the recent research progress in achieving timing and carrier synchronization in single-input single-output (SISO), multiple-input multiple-output (MIMO), cooperative relaying, and multiuser/multicell interference networks. Considering both single-carrier and multi-carrier communication systems, we survey and categorize the timing and carrier synchronization techniques proposed for the different communication systems, focusing on the system model assumptions for synchronization, the synchronization challenges, and the state-of-the-art synchronization solutions and their limitations. Finally, we envision some future research directions.
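    As a concrete point of reference for the timing-synchronization literature such a survey covers, the Schmidl & Cox metric for OFDM is a widely used baseline: a preamble with two identical time-domain halves yields a correlation metric that peaks at the symbol boundary. A minimal sketch (Python/NumPy, our naming):

        import numpy as np

        def schmidl_cox_metric(rx, half_len):
            """Schmidl & Cox timing metric M(d) for an OFDM preamble whose
            two time-domain halves of length `half_len` are identical.
            M(d) plateaus/peaks where the preamble starts."""
            L = half_len
            metric = np.zeros(len(rx) - 2 * L)
            for d in range(len(metric)):
                p = np.vdot(rx[d:d + L], rx[d + L:d + 2 * L])  # half-symbol correlation
                r = np.sum(np.abs(rx[d + L:d + 2 * L]) ** 2)   # second-half energy
                metric[d] = np.abs(p) ** 2 / (r ** 2 + 1e-12)
            return metric

        # Usage: d_hat = np.argmax(schmidl_cox_metric(samples, n_fft // 2))

    The angle of the half-symbol correlation at the detected instant also yields a fractional CFO estimate, which is why metrics of this kind appear in many joint timing/CFO schemes.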

    CELLULAR-ENABLED MACHINE TYPE COMMUNICATIONS: RECENT TECHNOLOGIES AND COGNITIVE RADIO APPROACHES

    The scarcity of bandwidth has always been the main obstacle to providing the reliable, high data-rate wireless links that are in great demand for today's and near-future wireless applications. In addition, recent reports have shown inefficient usage and under-utilization of the available bandwidth. Cognitive radio (CR) has recently emerged as a promising solution for enhancing spectrum utilization, as it allows unlicensed users to access the licensed spectrum opportunistically. Opportunistic spectrum access, the main concept behind the interweave network model, can improve overall spectrum utilization; it requires cognitive radio networks (CRNs) to treat spectrum sensing and monitoring as essential enabling processes. Machine-to-machine (M2M) communication, the basic enabler of the Internet-of-Things (IoT), has emerged as a key element in future networks. Machines are expected to communicate with each other, exchanging information and data without human intervention. The ultimate objective of M2M communications is to construct comprehensive connections among all machines distributed over an extensive coverage area. Given the radical change in the number of users, the network has to utilize the available resources carefully in order to maintain a reasonable quality-of-service (QoS). One of the most important resources in wireless communications is the frequency spectrum, and for utilizing it in an IoT environment, the cognitive radio concept is a plausible solution from both cost and performance perspectives. A large number of machines can thus be supported by dual-mode base stations that apply the cognitive radio concept alongside legacy licensed frequency assignment. This thesis presents a detailed review of the state of the art in the application of spectrum sensing to CR communications, covering the latest advances in the implementation of the legacy spectrum sensing approaches as well as the implementation challenges for cognitive radios in spectrum sensing and monitoring. We propose a novel algorithm to address the throughput reduction caused by scheduled spectrum sensing and monitoring. Further, two new architectures, both relying on 1-bit quantization at the receiver side, are proposed to significantly reduce the power consumption the CR requires for wideband sensing; their performance, complexity, and power consumption are analyzed and simulated. Furthermore, we address the challenges expected of the next-generation M2M network as an integral part of the future IoT, chiefly the design of low-power, low-cost machines with reduced bandwidth; the trade-offs between cost, feasibility, and performance are also discussed. Because frequency and spatial diversity are relaxed, and because of the extended coverage mode, initial synchronization and cell search face new challenges in cellular-enabled M2M systems. We study conventional solutions, with their pros and cons, including timing acquisition, cell detection, and frequency offset estimation algorithms, and we provide a technique to enhance detection performance for LTE-based machines in harsh environments.
    Furthermore, we present a frequency tracking algorithm for cellular M2M systems that utilizes the new repetitive structure of the broadcast channel symbols in next-generation Long Term Evolution (LTE) systems. Towards narrowband IoT support, we propose a cell search and initial synchronization algorithm that utilizes the new set of narrowband synchronization signals. The proposed algorithms have been simulated at very low signal-to-noise ratios and in different fading environments.
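    Among the legacy spectrum sensing approaches such a review covers, energy detection is the simplest: compare the average received energy to a threshold chosen for a target false-alarm rate. A minimal sketch (Python/NumPy with SciPy; naming is ours, and the Gaussian approximation assumes a large sample count):

        import numpy as np
        from scipy.stats import norm

        def energy_detector(x, noise_var, pfa=0.01):
            """Classical energy detection for spectrum sensing: declare the
            band occupied if the average sample energy exceeds a threshold
            set for false-alarm probability `pfa` under noise-only input."""
            n = len(x)
            stat = np.mean(np.abs(x) ** 2)
            # Under H0 (complex Gaussian noise of variance noise_var), the
            # statistic is approximately N(noise_var, noise_var**2 / n).
            threshold = noise_var * (1 + norm.isf(pfa) / np.sqrt(n))
            return stat > threshold

    Its main practical weakness is sensitivity to noise-variance uncertainty: any error in `noise_var` shifts the threshold directly, which degrades detection at low SNR.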

    Blind Estimation of OFDM System Parameters for Automatic Signal Identification

    Orthogonal frequency division multiplexing (OFDM) has recently gained worldwide popularity in broadband wireless communications due to its high spectral efficiency and robust performance in multipath fading channels. A growing trend of smart receivers that can automatically support and adapt to multiple OFDM-based standards brings the need to identify different standards by estimating OFDM system parameters without a priori information. Consequently, blind estimation and identification of OFDM system parameters has received considerable research attention. Many techniques have been developed for blind estimation of various OFDM parameters, whereas estimation of the sampling frequency is often ignored. Furthermore, the estimated sampling frequency of an OFDM signal has to be very accurate for data recovery, owing to the high sensitivity of OFDM signals to sampling clock offset. To address these problems, we propose a two-step cyclostationarity-based algorithm with low computational complexity to precisely estimate the sampling frequency of a received oversampled OFDM signal. With the estimated sampling frequency and oversampling ratio, the other OFDM system parameters, i.e., the number of subcarriers, the symbol duration, and the cyclic prefix (CP) length, can be estimated sequentially based on the cyclic property of the CP. In addition, the modulation scheme used in the OFDM signal can be classified based on higher-order statistics (HOS) of the frequency-domain OFDM signal. All the proposed algorithms are verified with a lab testing system comprising a vector signal generator, a spectrum analyzer, and a high-speed digitizer. The evaluation results confirm the high precision and efficacy of the proposed algorithms in realistic scenarios.
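    The CP-based step can be illustrated as follows: because the cyclic prefix repeats the tail of each symbol at a lag equal to the useful symbol length, the autocorrelation of the received signal is elevated only at the correct lag. A minimal sketch of this idea (Python/NumPy; the candidate set and naming are ours, and this shows only the FFT-size step, not the full estimator chain):

        import numpy as np

        def estimate_fft_size(rx, candidates=(64, 128, 256, 512, 1024, 2048)):
            """Blind FFT-size estimation from the cyclic prefix: score each
            candidate lag by its normalized autocorrelation and pick the
            largest. The CP repetition boosts the score only at the true
            useful-symbol length."""
            scores = {}
            for n in candidates:
                a, b = rx[:-n], rx[n:]
                scores[n] = np.abs(np.vdot(a, b)) / np.sqrt(
                    np.sum(np.abs(a) ** 2) * np.sum(np.abs(b) ** 2))
            return max(scores, key=scores.get)

    Once the useful symbol length is known, the spacing between correlation peaks along time gives the total symbol duration, from which the CP length follows by subtraction.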

    On Low-Pass Phase Noise Mitigation in OFDM System for mmWave Communications
