
    Scattered Pilots and Virtual Carriers Based Frequency Offset Tracking for OFDM Systems: Algorithms, Identifiability, and Performance Analysis

    In this paper, we propose a novel carrier frequency offset (CFO) tracking algorithm for orthogonal frequency division multiplexing (OFDM) systems by exploiting the scattered pilot carriers and virtual carriers embedded in existing OFDM standards. Assuming that the channel remains constant over two consecutive OFDM blocks and that timing is perfect, a CFO tracking algorithm is proposed that uses the limited number of pilot carriers in each OFDM block. The identifiability of this pilot-based algorithm is fully discussed in the noise-free case, and a constellation rotation strategy is proposed to eliminate the c-ambiguity for arbitrary constellations. A weighted algorithm is then proposed that considers both scattered pilots and virtual carriers. We find that the pilots improve the estimation accuracy of the algorithm, while the virtual carriers reduce the chance of CFO outliers. The proposed tracking algorithm is therefore able to achieve full-range CFO estimation, can be used before channel estimation, and provides improved performance compared to existing algorithms. The asymptotic mean square error (MSE) of the proposed algorithm is derived, and simulation results agree with the theoretical analysis.
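The tracking step described above can be sketched in a few lines: when the channel is constant over two consecutive blocks, a residual CFO rotates every pilot subcarrier by the same phase from one block to the next, so the CFO can be read off the angle of the pilot correlation. The sketch below is a minimal noise-free simulation under that assumption; the FFT size, CP length, and pilot positions are illustrative, and ICI is neglected.

```python
import numpy as np

rng = np.random.default_rng(0)

N, Ncp = 64, 16              # FFT size and cyclic prefix length (illustrative)
pilots = np.arange(0, N, 8)  # hypothetical scattered-pilot positions
eps = 0.05                   # true CFO, normalized to the subcarrier spacing

# Frequency-domain symbols and a per-subcarrier channel, identical over two blocks
d = np.exp(2j * np.pi * rng.random(N))                            # unit-modulus symbols
h = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)   # static channel

# A small CFO mainly rotates each subcarrier by a common phase that grows by
# 2*pi*eps*(N+Ncp)/N from one OFDM block to the next (ICI neglected in this sketch)
phi = 2 * np.pi * eps * (N + Ncp) / N
Y1 = h * d
Y2 = h * d * np.exp(1j * phi)

# Tracking estimate: angle of the pilot correlation between consecutive blocks
corr = np.sum(Y2[pilots] * np.conj(Y1[pilots]))
eps_hat = np.angle(corr) * N / (2 * np.pi * (N + Ncp))
```

This angle-based estimate is unambiguous only while the inter-block rotation stays below π; the paper's weighted use of virtual carriers is what extends the usable range toward full-band CFOs.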

    MIMO-OFDM channel estimation in the presence of carrier frequency offset

    A multiple-input multiple-output (MIMO) wireless communication system with orthogonal frequency division multiplexing (OFDM) is a promising scheme. However, estimating the carrier frequency offset (CFO) and the channel parameters is a challenging task. In this paper, a maximum-likelihood- (ML-) based algorithm is proposed to jointly estimate the frequency-selective channels and the CFO in MIMO-OFDM by using a block-type pilot. The proposed algorithm can handle a CFO range of nearly ±1/2 of the useful OFDM signal bandwidth. Furthermore, the cases of timing error and unknown channel order are discussed. The Cramér-Rao bound (CRB) for the problem is derived to evaluate the performance of the algorithm. Computer simulations show that the proposed algorithm can exploit the multi-antenna gain to effectively improve the estimation performance, and that it achieves the CRB at high signal-to-noise ratio (SNR). © 2005 Hindawi Publishing Corporation
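The joint estimation idea, derotate the received pilot block by a candidate CFO, fit the channel by least squares, and keep the candidate with the smallest residual, can be sketched for a single antenna and a noise-free block-type pilot. This is a simplified stand-in for the paper's MIMO formulation; the block length, channel order, CFO normalization, and grid resolution are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, L = 64, 4                       # pilot block length and channel order (illustrative)
n = np.arange(N)
# Known time-domain pilot block and an unknown frequency-selective channel
x = rng.choice([-1.0, 1.0], size=N) + 1j * rng.choice([-1.0, 1.0], size=N)
h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2 * L)
eps_true = 3.145                   # true CFO in subcarrier spacings (several bins)

# Convolution matrix X so that X @ h is the channel output for the pilot block
X = np.zeros((N, L), dtype=complex)
for l in range(L):
    X[l:, l] = x[:N - l]
y = np.exp(2j * np.pi * eps_true * n / N) * (X @ h)   # noise-free received block

def nll(eps):
    # ML cost: derotate, least-squares-fit the channel, return residual energy
    z = np.exp(-2j * np.pi * eps * n / N) * y
    h_hat = np.linalg.lstsq(X, z, rcond=None)[0]
    return np.linalg.norm(z - X @ h_hat) ** 2

grid = np.linspace(-8.0, 8.0, 3201)                   # coarse search grid
eps_hat = grid[np.argmin([nll(e) for e in grid])]
```

Because the pilot block is known in the time domain, the cost has a unique minimum even for CFOs of several subcarrier spacings, which is the intuition behind the wide estimation range.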

    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
    Wireless communication systems have undergone fast development in recent years. Based on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands on capacity, coverage, and data rate. To achieve this goal, several key techniques have been adopted by LTE, such as Multiple-Input and Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). However, these techniques have some inherent drawbacks. A direct conversion architecture is adopted to provide a simple, low-cost transmitter solution, but I/Q imbalance arises from the imperfection of circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and the doubly selective channel can severely deteriorate receiver performance. In addition, the deployment of HetNet, which permits the co-existence of macro and pico cells, incurs inter-cell interference for cell-edge users. These factors result in significant degradation of system performance. This dissertation investigates key techniques to mitigate the above problems. First, I/Q imbalance in the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. It combats the FD I/Q imbalance by using the internal diode of the transmitter and a specially designed test signal, without any external calibration instruments or an internal low-IF feedback path. Instrument test results show that the proposed scheme can enhance signal quality by 10 dB in terms of image rejection ratio (IRR).
In addition to I/Q imbalance, the system suffers from CFO, SFO, and the frequency-time selective channel. To mitigate these, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed to cope with the CFO, SFO, and doubly selective channel. The algorithm first estimates the CFO and the channel frequency response (CFR) in a coarse estimation stage, with the help of hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial interpolation channel estimation, combined with a low-complexity DFE scheme based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel. This provides prior knowledge for the MMSE-DFE and for adaptive modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves system performance. To speed up the algorithm verification process, an FPGA-based co-simulation is developed. Inter-cell interference caused by the co-existence of macro and pico cells has a significant impact on system performance. Although the almost blank subframe (ABS) has been proposed to mitigate this problem, the residual control signals in the ABS still inevitably cause interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed. First, the timing and carrier frequency offset of the interference signal are compensated by utilizing the cross-correlation properties of the synchronization signal. Afterwards, the reference signal is generated locally and the channel response is estimated by making use of channel statistics. Then, the interference signal is reconstructed based on the previous estimates of the channel, timing, and carrier frequency offset.
The interference is mitigated by subtracting the estimated interference signal and by LLR puncturing. The block error rate (BLER) performance is notably improved by this algorithm, according to simulation results for different channel scenarios. The proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems. Simulations and measurements show that good overall system performance can be achieved.
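The reconstruct-and-subtract step at the heart of the CRS cancellation can be illustrated with a heavily simplified sketch: the interferer channel is assumed flat, timing and CFO are assumed already compensated, and the CRS positions and numerology are hypothetical rather than LTE's actual layout.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 120                            # subcarriers in the sketch (not LTE numerology)
crs = np.arange(0, K, 6)           # hypothetical CRS resource-element positions
# Locally generated CRS sequence of the interfering cell (QPSK phases)
r = np.exp(1j * 2 * np.pi * rng.integers(0, 4, crs.size) / 4)

g0 = 0.8 * np.exp(1j * 0.3)        # interferer channel, assumed flat in this sketch
noise = 0.01 * (rng.normal(size=K) + 1j * rng.normal(size=K))

Y = noise.copy()
Y[crs] += g0 * r                   # residual CRS interference inside the ABS

# LS channel estimate on the CRS REs, then reconstruct and subtract
g_hat = np.mean(Y[crs] * np.conj(r))
Y_clean = Y.copy()
Y_clean[crs] -= g_hat * r

before = np.mean(np.abs(Y[crs]) ** 2)        # interference power before cancellation
after = np.mean(np.abs(Y_clean[crs]) ** 2)   # residual power after cancellation
```

In the thesis's algorithm the channel estimate additionally uses channel statistics and the timing/CFO of the interferer must first be recovered; this sketch only shows why subtracting a reconstructed CRS removes most of the interference power.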

    Timing and Carrier Synchronization in Wireless Communication Systems: A Survey and Classification of Research in the Last 5 Years

    Timing and carrier synchronization is a fundamental requirement for any wireless communication system to work properly. Timing synchronization is the process by which a receiver node determines the correct instants of time at which to sample the incoming signal. Carrier synchronization is the process by which a receiver adapts the frequency and phase of its local carrier oscillator to those of the received signal. In this paper, we survey the literature of the last 5 years (2010–2014) and present a comprehensive review and classification of recent research progress in achieving timing and carrier synchronization in single-input single-output (SISO), multiple-input multiple-output (MIMO), cooperative relaying, and multiuser/multicell interference networks. Considering both single-carrier and multi-carrier communication systems, we survey and categorize the timing and carrier synchronization techniques proposed for the different communication systems, focusing on the system model assumptions for synchronization, the synchronization challenges, and the state-of-the-art synchronization solutions and their limitations. Finally, we envision some future research directions.

    Waveform Advancements and Synchronization Techniques for Generalized Frequency Division Multiplexing

    To enable a new level of connectivity among machines as well as between people and machines, future wireless applications will place higher demands on data rates, response time, and reliability of the communication system. This will lead to a different system design, comprising a wide range of deployment scenarios. One important aspect is the evolution of the physical layer (PHY), specifically the waveform modulation. The novel generalized frequency division multiplexing (GFDM) technique is a prominent proposal for a flexible block-filtered multicarrier modulation. This thesis introduces an advanced GFDM concept that enables the emulation of other prominent waveform candidates in the scenarios where they perform best. Hence, a unified modulation framework is presented that is capable of addressing a wide range of scenarios and of upgrading the PHY for 5G networks. In particular, for a subset of the system parameters of the modulation framework, the problem of symbol time offset (STO) and carrier frequency offset (CFO) estimation is investigated, and synchronization approaches that can operate in burst and continuous transmissions are designed. The first part of this work presents the modulation principles of prominent 5G candidate waveforms and then focuses on the basic and advanced attributes of GFDM. The GFDM concept is extended towards the use of OQAM, introducing the novel frequency-shift OQAM-GFDM, and a new low-complexity model based on signal processing carried out in the time domain. A new prototype filter proposal highlights the benefits obtained in terms of reduced out-of-band (OOB) radiation and a more attractive hardware implementation cost. With proper parameterization of the advanced GFDM, the achieved gains are applicable to other filtered OFDM waveforms. In the second part, a search approach for estimating the STO and CFO in GFDM is evaluated.
A self-interference metric is proposed to quantify the effective SNR penalty caused by residual time and frequency misalignment or by the intrinsic inter-symbol interference (ISI) and inter-carrier interference (ICI) of arbitrary pulse shape designs in GFDM. In particular, the ICI can be used as a non-data-aided approach for frequency estimation. Then, GFDM training sequences, defined either as an isolated preamble or embedded as a midamble or pseudo-circular pre/post-amble, are designed. Simulations show better OOB emission and estimation results comparable or superior to state-of-the-art OFDM systems in wireless channels.
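For reference, the GFDM transmit signal discussed throughout follows the standard block expression x[n] = Σ_k Σ_m d[k,m] · g[(n − mK) mod KM] · exp(j2πkn/K), with K subcarriers and M subsymbols sharing one circularly shifted prototype filter g. A direct, unoptimized modulator can be sketched as follows; with a rectangular prototype and M = 1 it collapses to plain OFDM, which makes a convenient sanity check.

```python
import numpy as np

def gfdm_modulate(d, g):
    """Modulate a K x M GFDM data block d with prototype filter g of length K*M.

    Subsymbol m uses the prototype circularly shifted by m*K samples;
    subcarrier k modulates it with exp(j*2*pi*k*n/K).
    """
    K, M = d.shape
    N = K * M
    n = np.arange(N)
    x = np.zeros(N, dtype=complex)
    for m in range(M):
        gm = np.roll(g, m * K)                       # circularly shifted prototype
        for k in range(K):
            x += d[k, m] * gm * np.exp(2j * np.pi * k * n / K)
    return x
```

With g rectangular and M = 1, the output equals a scaled IDFT of the data column, i.e., an ordinary OFDM symbol without a cyclic prefix.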

    Algorithm-Architecture Co-Design for Digital Front-Ends in Mobile Receivers

    The methodology behind this work has been to use the concept of algorithm-hardware co-design to achieve efficient solutions for the digital front-end in mobile receivers. It has been shown that, by considering algorithms and hardware architectures together, more efficient solutions can be found, i.e., efficient with respect to some design measure. In this thesis, the main focus has been placed on two such parameters: first, reduced-complexity algorithms that lower energy consumption at limited performance degradation; second, handling the increasing number of wireless standards that should preferably run on the same hardware platform. To perform this task it is crucial to understand both sides of the table, i.e., both the algorithms and concepts of wireless communication and the implications arising for the hardware architecture. It is easier to handle the high complexity by separating those disciplines through layered abstraction. However, this representation is imperfect, since many interconnected "details" belonging to different layers are lost in the attempt to handle the complexity. This results in poor implementations, and the design of mobile terminals is no exception. Wireless communication standards are often designed based on mathematical algorithms with theoretical boundaries, with few considerations of actual implementation constraints such as energy consumption, silicon area, etc. This thesis does not try to remove the layered abstraction model, given its undeniable advantages, but rather uses those cross-layer "details" that went missing during the abstraction. This is done in three ways. In the first part, the cross-layer optimization is carried out from the algorithm perspective. Important circuit design parameters, such as quantization, are taken into consideration when designing the algorithms for OFDM symbol timing, CFO, and SNR estimation with a single bit, namely, the Sign-Bit.
Proof-of-concept circuits were fabricated and showed high potential for low-end receivers. In the second part, the cross-layer optimization is accomplished from the opposite side, i.e., the hardware-architectural side. An SDR architecture is known for its flexibility and scalability over many applications. In this work, a filtering application is mapped into software instructions in the SDR architecture in order to make filtering-specific modules redundant and thus save silicon area. In the third and last part, the optimization is done from an intermediate point within the algorithm-architecture spectrum. Here, a heterogeneous architecture with a combination of highly efficient and highly flexible modules is used to accomplish initial synchronization in at least two concurrent OFDM standards. A demonstrator was built, capable of performing synchronization in any two standards, including LTE, WiFi, and DVB-H.
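One way to see why sign-bit samples still support correlation-based timing, CFO, and SNR estimation is the arcsine law for Gaussian inputs: the correlation of the sign bits equals (2/π)·arcsin(ρ), so the underlying correlation ρ survives 1-bit quantization in recoverable form. A small numerical sketch (real-valued Gaussians and an illustrative ρ; not the thesis's actual estimator):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
rho = 0.8                                  # true correlation between the two signals

# Two correlated Gaussian signals, stand-ins for a signal and its delayed
# or repeated copy inside a correlation-based estimator
u = rng.normal(size=n)
v = rho * u + np.sqrt(1 - rho**2) * rng.normal(size=n)

# Single-bit (sign-bit) quantization of both signals
su, sv = np.sign(u), np.sign(v)

# Sample correlation of the sign bits vs. the arcsine-law prediction
rho_sign = np.mean(su * sv)
rho_pred = (2 / np.pi) * np.arcsin(rho)
```

Inverting the law, sin(π·rho_sign/2) recovers ρ from the 1-bit data, which is the statistical basis for building cheap sign-bit estimators.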

    Timing synchronization in MIMO-OFDM systems

    OFDM (Orthogonal Frequency Division Multiplexing) provides a promising physical layer for 4G and 3GPP LTE systems in terms of efficient use of bandwidth and high data rates. It is used in several applications such as WiFi (IEEE 802.11n), WiMax (IEEE 802.16), Digital Audio Broadcasting (DAB), and Digital Video Broadcasting (DVB). OFDM suffers from inter-symbol interference and inter-carrier interference in wireless fading environments, so it is important to estimate and correct the start of the OFDM symbol efficiently to reduce timing and frequency offset errors. Synchronization issues in OFDM are crucial and can lead to a certain amount of information loss if they are not properly addressed. There are two modes of implementation for Digital Video Broadcasting-Terrestrial (DVB-T), and this thesis implements the 2K mode. It highlights the implementation of OFDM in DVB-T according to the European Telecommunications Standards Institute (ETSI). It mainly focuses on the timing offset problem in OFDM systems and a proposed solution that uses the Cyclic Prefix (CP) in a modified Schmidl and Cox (SC) algorithm. Simulations were performed to compare the different synchronization methods under different amounts of timing offset and in different channel environments.
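One common CP-correlation form of the timing metric studied here (not necessarily the thesis's exact modification of Schmidl and Cox) exploits the fact that the cyclic prefix reappears N samples later as the tail of the symbol body: correlating each candidate window with the window N samples ahead peaks at the symbol start. A minimal sketch with toy sizes (DVB-T 2K mode would use N = 2048), unit-modulus time samples, and noise only outside the symbol:

```python
import numpy as np

rng = np.random.default_rng(5)
N, Ncp = 64, 16        # toy FFT and CP sizes; DVB-T 2K mode uses N = 2048
offset = 50            # true start of the OFDM symbol (including its CP)

# Unit-modulus time-domain body keeps the metric's peak value exactly Ncp here
body = np.exp(2j * np.pi * rng.random(N))
symbol = np.concatenate([body[-Ncp:], body])   # cyclic prefix + body

rx = 0.1 * (rng.normal(size=200) + 1j * rng.normal(size=200))  # low-level noise floor
rx[offset:offset + N + Ncp] = symbol           # noise on the symbol itself omitted

# CP correlation: the first Ncp samples of a symbol reappear N samples later
metric = np.array([np.abs(np.sum(rx[d:d + Ncp] * np.conj(rx[d + N:d + N + Ncp])))
                   for d in range(len(rx) - N - Ncp)])
d_hat = int(np.argmax(metric))                 # estimated symbol start
```

In a fading channel with noise on the symbol, the metric forms a short plateau rather than a sharp peak, which is why the thesis combines the CP with an SC-style refinement.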