
    Frequency Domain Independent Component Analysis Applied To Wireless Communications Over Frequency-selective Channels

    In wireless communications, frequency-selective fading is a major source of impairment. In this research, a novel Frequency-Domain Independent Component Analysis (ICA-F) approach is proposed to blindly separate and deconvolve signals traveling through frequency-selective, slow-fading channels. Compared with existing time-domain approaches, the ICA-F is computationally efficient and converges quickly. Simulation results confirm the effectiveness of the proposed ICA-F.
    Orthogonal Frequency Division Multiplexing (OFDM) systems are widely used in today's wireless communications. However, OFDM systems are very sensitive to Carrier Frequency Offset (CFO), so an accurate CFO compensation technique is required to achieve acceptable performance. In this dissertation, two novel blind approaches are proposed to estimate and compensate for CFO within half the subcarrier spacing: a Maximum Likelihood CFO Correction approach (ML-CFOC) and a high-performance, low-computation Blind CFO Estimator (BCFOE). The Bit Error Rate (BER) improvement of the ML-CFOC comes at the expense of a modest increase in computational requirements, without sacrificing system bandwidth or increasing hardware complexity. The BCFOE outperforms the existing blind CFO estimator [25, 128], referred to as the YG-CFO estimator, in terms of BER and Mean Square Error (MSE), without increasing computational complexity, sacrificing system bandwidth, or adding hardware complexity. While both proposed techniques outperform the YG-CFO estimator, the BCFOE performs better than the ML-CFOC. Extensive simulation results illustrate the performance of the ML-CFOC and BCFOE approaches.
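As a minimal illustration of why CFO correction matters in OFDM, the sketch below applies a normalized offset to one OFDM symbol and removes it by derotating with the estimate before the FFT. This assumes a perfect CFO estimate; the subcarrier count, offset value, and QPSK mapping are illustrative assumptions, not details from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                      # number of subcarriers (illustrative)
eps = 0.3                   # normalized CFO, within half the subcarrier spacing

# Random QPSK symbols on all subcarriers
bits = rng.integers(0, 2, (N, 2))
X = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

x = np.fft.ifft(X)                        # time-domain OFDM symbol
n = np.arange(N)
r = x * np.exp(2j * np.pi * eps * n / N)  # channel imposes the CFO rotation

# Compensation: derotate with the estimated CFO (here assumed exact),
# then the FFT recovers the subcarrier symbols without inter-carrier interference
eps_hat = eps
y = np.fft.fft(r * np.exp(-2j * np.pi * eps_hat * n / N))

residual = np.max(np.abs(y - X))          # ~ 0 after perfect compensation
```

With an imperfect estimate, the residual offset shows up as a common phase rotation plus inter-carrier interference, which is what the blind estimators above are designed to minimize.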

    SYNCHRONIZATION AND RESOURCE ALLOCATION IN DOWNLINK OFDM SYSTEMS

    The next-generation (4G) wireless systems are expected to provide universal personal and multimedia communications with seamless connectivity and very high transmission rates, regardless of the users' mobility and location. OFDM is recognized as one of the leading candidates to provide the wireless signalling for 4G systems. The major challenges in downlink multiuser OFDM-based 4G systems include the wireless channel, synchronization, and radio resource management. Algorithms are therefore required to achieve accurate timing and frequency offset estimation and efficient utilization of radio resources such as subcarrier, bit, and power allocation. The thesis has two objectives. First, we present frequency offset estimation algorithms for OFDM systems. Building on the classic single-user OFDM architecture, we propose two FFT-based frequency offset estimation algorithms with low computational complexity. Computer simulation results and comparisons show that the proposed algorithms provide smaller error variance than a previous well-known algorithm. Second, we present resource allocation algorithms for OFDM systems. Building on the downlink multiuser OFDM architecture, we aim to minimize the total transmit power by exploiting system diversity through the management of subcarrier allocation, adaptive modulation, and power allocation. In particular, we focus on dynamic resource allocation algorithms for multiuser OFDM and multiuser MIMO-OFDM systems. For the multiuser OFDM system, we propose a low-complexity subcarrier allocation algorithm based on channel gain differences. For the multiuser MIMO-OFDM system, we propose a unit-power based subcarrier allocation algorithm. These algorithms are all combined with the optimal bit allocation algorithm to achieve the minimal total transmit power.
The numerical results and comparisons with various conventional non-adaptive and adaptive algorithmic approaches show that the proposed resource allocation algorithms improve system efficiency and performance while guaranteeing the Quality of Service (QoS) for each user. The simulation work of this project is based on hand-written code in MATLAB R2007b.
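Optimal bit allocation of the kind combined with the subcarrier assignment above is commonly realized as a greedy (Hughes-Hartogs-style) loading: at each step, one bit is added to the subcarrier whose incremental transmit power is smallest. The sketch below illustrates that general idea, not the thesis's exact algorithm; the QAM power model and channel gains are assumptions for the example.

```python
import numpy as np

def greedy_bit_loading(gains, total_bits, max_bits=8):
    """Greedy bit loading: repeatedly add one bit to the subcarrier
    where the incremental power cost is smallest (Hughes-Hartogs style)."""
    def power(b, g):
        # Approximate power to carry b bits of QAM over power gain g
        return (2**b - 1) / g

    bits = np.zeros(len(gains), dtype=int)
    for _ in range(total_bits):
        inc = np.array([power(bits[i] + 1, gains[i]) - power(bits[i], gains[i])
                        if bits[i] < max_bits else np.inf
                        for i in range(len(gains))])
        bits[int(np.argmin(inc))] += 1
    return bits

gains = np.array([4.0, 1.0, 0.25, 2.0])   # hypothetical channel power gains
b = greedy_bit_loading(gains, 8)          # load 8 bits in total
```

The strongest subcarrier ends up carrying the most bits, which is exactly the diversity-exploiting behaviour the adaptive allocation relies on.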

    Timing and Carrier Synchronization in Wireless Communication Systems: A Survey and Classification of Research in the Last 5 Years

    Timing and carrier synchronization is a fundamental requirement for any wireless communication system to work properly. Timing synchronization is the process by which a receiver node determines the correct instants of time at which to sample the incoming signal. Carrier synchronization is the process by which a receiver adapts the frequency and phase of its local carrier oscillator to those of the received signal. In this paper, we survey the literature over the last 5 years (2010–2014) and present a comprehensive literature review and classification of the recent research progress in achieving timing and carrier synchronization in single-input single-output (SISO), multiple-input multiple-output (MIMO), cooperative relaying, and multiuser/multicell interference networks. Considering both single-carrier and multi-carrier communication systems, we survey and categorize the timing and carrier synchronization techniques proposed for the different communication systems, focusing on the system model assumptions for synchronization, the synchronization challenges, and the state-of-the-art synchronization solutions and their limitations. Finally, we envision some future research directions.

    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
    Wireless communication systems have undergone fast development in recent years. Based on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands on capacity, coverage, and data rate. To achieve this goal, several key techniques have been adopted by LTE, such as Multiple-Input and Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). However, these techniques have some inherent drawbacks. A direct conversion architecture is adopted to provide a simple, low-cost transmitter solution, but I/Q imbalance arises due to imperfections in the circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and the doubly selective channel can severely deteriorate receiver performance. In addition, the deployment of HetNet, which permits the co-existence of macro and pico cells, incurs inter-cell interference for cell-edge users. Together, these factors result in significant degradation of system performance. This dissertation investigates key techniques to mitigate the above problems. First, I/Q imbalance in the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. It combats the FD I/Q imbalance by using the internal diode of the transmitter and a specially designed test signal, without any external calibration instruments or internal low-IF feedback path. Instrument test results show that the proposed scheme can enhance signal quality by 10 dB in terms of image rejection ratio (IRR).
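For intuition, the simpler frequency-flat case (not the frequency-dependent case treated in the thesis) of I/Q imbalance is commonly modelled as r = α·s + β·s*, where β ≠ 0 leaks a scaled conjugate "image" into the signal. Once α and β are known (in practice they must be estimated, e.g. from a calibration/test signal), the 2×2 mixing of (s, s*) can be inverted. A minimal sketch with hypothetical gain and phase mismatch values:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Frequency-flat I/Q imbalance model: gain mismatch g, phase mismatch phi
# (one common parameterization; values here are illustrative)
g, phi = 1.05, np.deg2rad(3.0)
alpha = (1 + g * np.exp(-1j * phi)) / 2
beta = (1 - g * np.exp(1j * phi)) / 2
r = alpha * s + beta * np.conj(s)     # imbalanced signal with image leakage

# Compensation: invert the mixing of (s, s*) using the known alpha, beta
s_hat = (np.conj(alpha) * r - beta * np.conj(r)) / (abs(alpha)**2 - np.abs(beta)**2)

err = np.max(np.abs(s_hat - s))       # ~ 0 with exact alpha, beta
```

The frequency-dependent case replaces the scalars α and β with filters, so the compensation becomes a per-frequency (or FIR) version of the same inversion.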
    In addition to I/Q imbalance, the system suffers from CFO, SFO, and the frequency-time selective channel. To mitigate these, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed to cope with the CFO, SFO, and doubly selective channel. The algorithm first estimates the CFO and channel frequency response (CFR) in a coarse estimation stage, with the help of hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial interpolation channel estimation, combined with a low-complexity DFE scheme based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel; this provides prior knowledge for the MMSE-DFE and for adaptive modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves system performance. To speed up the algorithm verification process, an FPGA-based co-simulation is developed.
    Inter-cell interference caused by the co-existence of macro and pico cells has a large impact on system performance. Although the almost blank subframe (ABS) has been proposed to mitigate this problem, the residual control signals in the ABS still inevitably cause interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed. First, the timing and carrier frequency offset of the interference signal are compensated by exploiting the cross-correlation properties of the synchronization signal. Afterwards, the reference signal is generated locally and the channel response is estimated using channel statistics. Then, the interference signal is reconstructed based on the previous estimates of the channel, timing, and carrier frequency offset.
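To see why an SNR estimate is needed as prior knowledge for MMSE equalization, consider the simplest relative of the MMSE-DFE: a per-subcarrier one-tap MMSE equalizer, which weights each subcarrier by H*/(|H|² + 1/SNR). The sketch below uses hypothetical channel, symbol, and noise values, not the thesis's receiver.

```python
import numpy as np

N = 8
# A fixed block of QPSK symbols (illustrative)
X = np.array([1+1j, 1-1j, -1+1j, -1-1j, 1+1j, -1-1j, 1-1j, -1+1j]) / np.sqrt(2)
# Toy per-subcarrier channel frequency response (illustrative)
H = np.linspace(0.5, 2.0, N) * np.exp(1j * np.linspace(0, np.pi / 2, N))
snr = 100.0                              # linear SNR, assumed supplied by an SNR estimator
noise = 0.01 * (1 - 1j) * np.ones(N)     # small fixed perturbation for the example
Y = H * X + noise

# One-tap MMSE equalizer per subcarrier: regularized by the inverse SNR,
# so deep fades are not blown up the way zero-forcing (Y/H) would
X_hat = np.conj(H) * Y / (np.abs(H)**2 + 1 / snr)

# Hard QPSK decisions recover the transmitted symbols
dec = (np.sign(X_hat.real) + 1j * np.sign(X_hat.imag)) / np.sqrt(2)
```

An overestimated SNR pushes this toward zero-forcing (noise enhancement in fades); an underestimated SNR over-smooths, which is why the subspace-based SNR estimator matters.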
The interference is mitigated by subtracting the estimate of the interference signal and by LLR puncturing. The block error rate (BLER) performance is notably improved by this algorithm, according to simulation results for different channel scenarios. The proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems. The simulations and measurements show that good overall system performance can be achieved.

    Near-Instantaneously Adaptive HSDPA-Style OFDM Versus MC-CDMA Transceivers for WIFI, WIMAX, and Next-Generation Cellular Systems

    Burst-by-burst (BbB) adaptive high-speed downlink packet access (HSDPA)-style multicarrier systems are reviewed, identifying their most critical design aspects. These systems exhibit numerous attractive features, rendering them eminently eligible for employment in next-generation wireless systems. It is argued that BbB-adaptive or symbol-by-symbol adaptive orthogonal frequency division multiplex (OFDM) modems counteract the near-instantaneous channel quality variations and hence attain increased throughput or robustness in comparison to their fixed-mode counterparts. Although they act quite differently, various diversity techniques, such as Rake receivers and space-time block coding (STBC), are also capable of mitigating the channel quality variations in their effort to reduce the bit error ratio (BER), provided that the individual antenna elements experience independent fading. By contrast, in the presence of correlated fading imposed by shadowing or time-variant multiuser interference, the benefits of space-time coding erode, and it is unrealistic to expect that a fixed-mode space-time coded system remains capable of maintaining a near-constant BER.
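The best-known STBC is Alamouti's two-antenna scheme, whose linear combining collapses each symbol to (|h1|² + |h2|²)·s when the two paths fade independently, which is the diversity gain the passage refers to. A noise-free sketch (the symbol and channel values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)        # two QPSK symbols
h1, h2 = rng.standard_normal(2) + 1j * rng.standard_normal(2)  # flat-fading gains

# Alamouti transmission over two symbol periods (noise omitted):
# period 1 sends (s1, s2), period 2 sends (-s2*, s1*)
r1 = h1 * s1 + h2 * s2
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)

# Linear combining at the receiver, normalized by the total channel power
g = abs(h1)**2 + abs(h2)**2
s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / g
s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / g
```

With correlated fading (h1 ≈ h2), g no longer averages out deep fades across the two paths, which is the erosion of the STBC benefit described above.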