
    Timing and Carrier Synchronization in Wireless Communication Systems: A Survey and Classification of Research in the Last 5 Years

    Timing and carrier synchronization is a fundamental requirement for any wireless communication system to work properly. Timing synchronization is the process by which a receiver node determines the correct instants of time at which to sample the incoming signal. Carrier synchronization is the process by which a receiver adapts the frequency and phase of its local carrier oscillator to those of the received signal. In this paper, we survey the literature over the last 5 years (2010–2014) and present a comprehensive literature review and classification of the recent research progress in achieving timing and carrier synchronization in single-input single-output (SISO), multiple-input multiple-output (MIMO), cooperative relaying, and multiuser/multicell interference networks. Considering both single-carrier and multi-carrier communication systems, we survey and categorize the timing and carrier synchronization techniques proposed for the different communication systems, focusing on the system model assumptions for synchronization, the synchronization challenges, and the state-of-the-art synchronization solutions and their limitations. Finally, we envision some future research directions.
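    To make the two tasks concrete, the sketch below implements one classic approach that is not taken from the survey itself: a Schmidl-Cox style half-symbol correlator whose metric peak gives the timing estimate and whose correlator phase gives the fractional carrier frequency offset. The preamble length, noise level, and function name are illustrative assumptions.

```python
import numpy as np

def schmidl_cox_sync(r, L):
    """Schmidl-Cox style timing metric and fractional CFO estimate.

    r : complex baseband samples containing a preamble whose two halves
        (each L samples long) are identical at the transmitter.
    Returns (d_hat, cfo_hat): estimated preamble start index and carrier
    frequency offset in cycles per sample.
    """
    N = len(r)
    P = np.zeros(N - 2 * L, dtype=complex)
    M = np.zeros(N - 2 * L)
    for d in range(N - 2 * L):
        seg1 = r[d:d + L]
        seg2 = r[d + L:d + 2 * L]
        P[d] = np.sum(np.conj(seg1) * seg2)            # half-symbol correlation
        R = np.sum(np.abs(seg2) ** 2)                  # energy normalisation
        M[d] = np.abs(P[d]) ** 2 / (R ** 2 + 1e-12)
    d_hat = int(np.argmax(M))                          # timing: peak of the metric
    cfo_hat = np.angle(P[d_hat]) / (2 * np.pi * L)     # CFO from correlator phase
    return d_hat, cfo_hat

# Toy usage: a repeated-half preamble with a known CFO and additive noise.
rng = np.random.default_rng(0)
L = 64
half = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2)
preamble = np.concatenate([half, half])
true_cfo = 1e-3                                        # cycles per sample
n = np.arange(2 * L)
rx = np.concatenate([np.zeros(30, dtype=complex),
                     preamble * np.exp(2j * np.pi * true_cfo * n)])
rx = rx + 0.05 * (rng.standard_normal(len(rx)) + 1j * rng.standard_normal(len(rx)))
print(schmidl_cox_sync(rx, L))                         # roughly (30, 0.001)
```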

    Near-Instantaneously Adaptive HSDPA-Style OFDM Versus MC-CDMA Transceivers for WIFI, WIMAX, and Next-Generation Cellular Systems

    Burst-by-burst (BbB) adaptive high-speed downlink packet access (HSDPA) style multicarrier systems are reviewed, identifying their most critical design aspects. These systems exhibit numerous attractive features, rendering them eminently eligible for employment in next-generation wireless systems. It is argued that BbB-adaptive or symbol-by-symbol adaptive orthogonal frequency division multiplex (OFDM) modems counteract the near-instantaneous channel quality variations and hence attain an increased throughput or robustness in comparison to their fixed-mode counterparts. Although they act quite differently, various diversity techniques, such as Rake receivers and space-time block coding (STBC), are also capable of mitigating the channel quality variations in their effort to reduce the bit error ratio (BER), provided that the individual antenna elements experience independent fading. By contrast, in the presence of correlated fading imposed by shadowing or time-variant multiuser interference, the benefits of space-time coding erode and it is unrealistic to expect that a fixed-mode space-time coded system remains capable of maintaining a near-constant BER.
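    The essence of burst-by-burst adaptation is a mode-switching rule that matches the modulation order to the instantaneous channel quality. The sketch below illustrates the principle with an assumed SNR-threshold table; the thresholds and modes are illustrative and are not taken from the HSDPA-style transceivers reviewed above.

```python
import numpy as np

# Burst-by-burst adaptive modulation in miniature: pick the highest-order mode
# whose SNR threshold the current channel-quality estimate exceeds. The mode
# table and thresholds (in dB) are illustrative assumptions only.
MODES = [
    ("no transmission", 0, -np.inf),
    ("BPSK",  1,  3.0),
    ("QPSK",  2,  6.0),
    ("16QAM", 4, 12.0),
    ("64QAM", 6, 18.0),
]

def select_mode(snr_est_db):
    """Return (mode name, bits per symbol) for the estimated burst SNR."""
    chosen = MODES[0]
    for mode in MODES:                     # MODES is ordered by threshold
        if snr_est_db >= mode[2]:
            chosen = mode
    return chosen[0], chosen[1]

# Crude throughput comparison over a fluctuating channel-quality trace:
# bits are only counted when the transmitted mode's SNR threshold is met.
rng = np.random.default_rng(1)
snr_trace_db = 12 + 8 * rng.standard_normal(1000)      # bursty channel quality
adaptive_bits = sum(select_mode(s)[1] for s in snr_trace_db)
fixed_qpsk_bits = sum(2 if s >= 6.0 else 0 for s in snr_trace_db)
print(f"adaptive: {adaptive_bits} bits, fixed QPSK: {fixed_qpsk_bits} bits")
```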

    Multiuser MIMO-OFDM for Next-Generation Wireless Systems

    This overview portrays the 40-year evolution of orthogonal frequency division multiplexing (OFDM) research. The amelioration of powerful multicarrier OFDM arrangements with multiple-input multiple-output (MIMO) systems has numerous benefits, which are detailed in this treatise. We continue by highlighting the limitations of conventional detection and channel estimation techniques designed for multiuser MIMO-OFDM systems in the so-called rank-deficient scenarios, where the number of users supported or the number of transmit antennas employed exceeds the number of receiver antennas. This is often encountered in practice, unless we limit the number of users granted access in the base station’s or radio port’s coverage area. Following a historical perspective on the associated design problems and their state-of-the-art solutions, the second half of this treatise details a range of classic multiuser detectors (MUDs) designed for MIMO-OFDM systems and characterizes their achievable performance. A further section aims to identify novel cutting-edge genetic algorithm (GA)-aided detector solutions, which have found numerous applications in wireless communications in recent years. In an effort to stimulate the cross-pollination of ideas across the machine learning, optimization, signal processing, and wireless communications research communities, we will review the broadly applicable principles of various GA-assisted optimization techniques, which were recently proposed also for employment in multiuser MIMO-OFDM. In order to stimulate new research, we demonstrate that the family of GA-aided MUDs is capable of achieving near-optimum performance at the cost of a significantly lower computational complexity than that imposed by their optimum maximum-likelihood (ML) MUD counterparts. The paper is concluded by outlining a range of future research options that may find their way into next-generation wireless systems.
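    As a concrete, greatly simplified illustration of the GA-aided MUD idea (not the specific detectors characterized in the treatise), the sketch below uses a plain generational genetic algorithm to search for the BPSK symbol vector of K users that minimizes the maximum-likelihood metric ||y - Hs||^2 in a rank-deficient setting; all GA parameters are assumptions.

```python
import numpy as np

def ga_mud(y, H, pop_size=40, generations=60, p_mut=0.05, rng=None):
    """Toy GA-aided multiuser detector for K BPSK users on one subcarrier.

    Searches s in {-1, +1}^K to minimise the ML metric ||y - H s||^2 using a
    plain generational GA (elitism, tournament selection, uniform crossover,
    bit-flip mutation). All GA parameters are illustrative assumptions.
    """
    rng = rng or np.random.default_rng()
    K = H.shape[1]
    pop = rng.choice([-1.0, 1.0], size=(pop_size, K))

    def cost(s):
        return np.linalg.norm(y - H @ s) ** 2

    for _ in range(generations):
        costs = np.array([cost(s) for s in pop])
        new_pop = [pop[np.argmin(costs)].copy()]        # keep the best (elitism)
        while len(new_pop) < pop_size:
            a, b = rng.integers(pop_size, size=2)       # tournament selection
            p1 = pop[a] if costs[a] < costs[b] else pop[b]
            a, b = rng.integers(pop_size, size=2)
            p2 = pop[a] if costs[a] < costs[b] else pop[b]
            mask = rng.random(K) < 0.5                  # uniform crossover
            child = np.where(mask, p1, p2)
            flip = rng.random(K) < p_mut                # bit-flip mutation
            new_pop.append(np.where(flip, -child, child))
        pop = np.array(new_pop)

    costs = np.array([cost(s) for s in pop])
    return pop[np.argmin(costs)]

# Toy usage: 6 users, 4 receive antennas (a rank-deficient scenario).
rng = np.random.default_rng(2)
K, Nr = 6, 4
H = (rng.standard_normal((Nr, K)) + 1j * rng.standard_normal((Nr, K))) / np.sqrt(2)
s_true = rng.choice([-1.0, 1.0], size=K)
y = H @ s_true + 0.1 * (rng.standard_normal(Nr) + 1j * rng.standard_normal(Nr))
print("true:    ", s_true)
print("detected:", ga_mud(y, H, rng=rng))
```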

    Wavelet-based multi-carrier code division multiple access systems

    EThOS - Electronic Theses Online Service, United Kingdom

    A General Framework for Analyzing, Characterizing, and Implementing Spectrally Modulated, Spectrally Encoded Signals

    Fourth generation (4G) communications will support many capabilities while providing universal, high-speed access. One potential enabler for these capabilities is software defined radio (SDR). When controlled by cognitive radio (CR) principles, the required waveform diversity is achieved via a synergistic union called CR-based SDR. Research is rapidly progressing in SDR hardware and software venues, but current CR-based SDR research lacks the theoretical foundation and analytic framework to permit efficient implementation. This limitation is addressed here by introducing a general framework for analyzing, characterizing, and implementing spectrally modulated, spectrally encoded (SMSE) signals within CR-based SDR architectures. Given that orthogonal frequency division multiplexing (OFDM) is a 4G candidate signal, OFDM-based signals are collectively classified as SMSE since modulation and encoding are spectrally applied. The proposed framework provides analytic commonality and unification of SMSE signals. Applicability is first shown for candidate 4G signals, and the resultant analytic expressions agree with published results. Implementability is then demonstrated in multiple coexistence scenarios via modeling and simulation to reinforce practical utility.
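    As a minimal illustration of the SMSE notion rather than the framework itself, the sketch below applies QPSK modulation and an optional per-subcarrier code in the frequency domain, notches a band reserved for another user, and obtains the time-domain waveform via an IFFT; the subcarrier count, notch location, and code are assumed for the example.

```python
import numpy as np

def smse_symbol(data_bits, spectral_mask, code=None):
    """Minimal sketch of spectrally modulated, spectrally encoded signal design.

    Modulation (QPSK here) and an optional per-subcarrier code are applied in
    the frequency domain, a spectral-availability mask notches subcarriers the
    cognitive radio must avoid, and an IFFT yields the time-domain waveform.
    The mask, code, and sizes are illustrative assumptions.
    """
    N = len(spectral_mask)
    bits = np.asarray(data_bits).reshape(N, 2)
    qpsk = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)
    code = np.ones(N) if code is None else code
    spectrum = qpsk * code * spectral_mask          # spectral modulation + encoding
    return np.fft.ifft(spectrum) * np.sqrt(N)       # time-domain symbol

# Toy usage: 64 subcarriers with a coexistence notch over subcarriers 20-27.
rng = np.random.default_rng(3)
N = 64
mask = np.ones(N)
mask[20:28] = 0.0                                   # band reserved for another user
x = smse_symbol(rng.integers(0, 2, size=2 * N), mask)
print(np.round(np.abs(np.fft.fft(x))[18:30], 2))    # zeros appear in the notch
```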

    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.

    Wireless communication systems have undergone fast development in recent years. Based on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands, including capacity, coverage, and data rate. To achieve this goal, several key techniques have been adopted by LTE, such as Multiple-Input and Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). However, there are some inherent drawbacks regarding these techniques. Direct conversion architecture is adopted to provide a simple, low-cost transmitter solution, but the problem of I/Q imbalance arises due to the imperfection of circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and the doubly selective channel can severely deteriorate the receiver performance. In addition, the deployment of HetNets, which permit the co-existence of macro and pico cells, incurs inter-cell interference for cell-edge users. The impact of these factors results in significant degradation of system performance. This dissertation aims to investigate the key techniques which can be used to mitigate the above problems.

    First, I/Q imbalance for the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. This scheme combats the FD I/Q imbalance by using the internal diode of the transmitter and a specially designed test signal, without any external calibration instruments or internal low-IF feedback path. The instrument test results show that the proposed scheme can enhance signal quality by 10 dB in terms of image rejection ratio (IRR).

    In addition to the I/Q imbalance, the system suffers from CFO, SFO, and the frequency-time selective channel. To mitigate this, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed to cope with the CFO, SFO, and doubly selective channel. The algorithm first estimates the CFO and channel frequency response (CFR) in a coarse estimation stage, with the help of hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial interpolation channel estimation, combined with a low-complexity DFE scheme based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel; this provides prior knowledge for the MMSE-DFE and for adaptive modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves the system performance. In order to speed up the algorithm verification process, an FPGA-based co-simulation is developed.

    Inter-cell interference caused by the co-existence of macro and pico cells has a significant impact on system performance. Although the almost blank subframe (ABS) is intended to mitigate this problem, the residual control signals in the ABS still inevitably cause interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed. First, the timing and carrier frequency offset of the interference signal are compensated by utilizing the cross-correlation properties of the synchronization signal. Afterwards, the reference signal is generated locally and the channel response is estimated by making use of channel statistics. Then, the interference signal is reconstructed based on the previous estimates of the channel, timing, and carrier frequency offset. The interference is mitigated by subtracting the estimate of the interference signal and by LLR puncturing. The block error rate (BLER) performance is notably improved by this algorithm, according to simulation results for different channel scenarios.

    The proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems. The simulations and measurements show that good overall system performance can be achieved.
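    As one small illustration of the receiver-side processing described above, the sketch below performs pilot-aided channel estimation with interpolation across frequency followed by a one-tap MMSE equalizer per subcarrier; it is a simplification of the MMSE-DFE stage, with pilot spacing, channel length, and noise level assumed for the example.

```python
import numpy as np

def pilot_aided_equalize(Y, pilot_idx, pilot_sym, noise_var):
    """Sketch of pilot-aided channel estimation plus one-tap MMSE equalization.

    Least-squares channel estimates at the pilot subcarriers are interpolated
    across frequency (piecewise-linear here, a low-order stand-in for the
    polynomial interpolation described above) and used in a per-subcarrier
    MMSE equalizer. Pilot spacing, sizes, and noise level are assumptions.
    """
    N = len(Y)
    H_ls = Y[pilot_idx] / pilot_sym                        # LS estimates at pilots
    k = np.arange(N)
    H_hat = (np.interp(k, pilot_idx, H_ls.real)
             + 1j * np.interp(k, pilot_idx, H_ls.imag))    # interpolate over band
    W = np.conj(H_hat) / (np.abs(H_hat) ** 2 + noise_var)  # one-tap MMSE weights
    return W * Y, H_hat

# Toy usage: 64 subcarriers, 3-tap channel, QPSK data, pilots on every 4th subcarrier.
rng = np.random.default_rng(4)
N, noise_var = 64, 0.01
h = (rng.standard_normal(3) + 1j * rng.standard_normal(3)) / np.sqrt(6)
H = np.fft.fft(h, N)
X = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)
pilot_idx = np.append(np.arange(0, N, 4), N - 1)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
Y = H * X + noise
X_eq, H_hat = pilot_aided_equalize(Y, pilot_idx, X[pilot_idx], noise_var)
print("mean squared channel estimation error:",
      np.round(np.mean(np.abs(H_hat - H) ** 2), 4))
```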