897 research outputs found

    Performance evaluation of channel estimation techniques for MIMO-OFDM systems with adaptive sub-carrier allocation


    Performance Analysis and Enhancement of Multiband OFDM for UWB Communications

    In this paper, we analyze the frequency-hopping orthogonal frequency-division multiplexing (OFDM) system known as Multiband OFDM for high-rate wireless personal area networks (WPANs) based on ultra-wideband (UWB) transmission. Beyond the standard itself, we also propose and study system performance enhancements through the application of Turbo and Repeat-Accumulate (RA) codes, as well as OFDM bit-loading. Our methodology consists of (a) a study of the channel model developed under IEEE 802.15 for UWB from a frequency-domain perspective suited for OFDM transmission, (b) development and quantification of appropriate information-theoretic performance measures, (c) comparison of these measures with simulation results for the Multiband OFDM standard proposal as well as our proposed extensions, and (d) consideration of the influence of practical, imperfect channel estimation on performance. We find that the current Multiband OFDM standard sufficiently exploits the frequency selectivity of the UWB channel, and that the system performs in the vicinity of the channel cutoff rate. Turbo codes and a reduced-complexity clustered bit-loading algorithm improve the system power efficiency by over 6 dB at a data rate of 480 Mbps.
    Comment: 32 pages, 10 figures, 1 table. Submitted to the IEEE Transactions on Wireless Communications (Sep. 28, 2005). Minor revisions based on reviewers' comments (June 23, 2006).
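    The reduced-complexity clustered bit-loading mentioned above can be illustrated with a short sketch: subcarriers are grouped into equal-size clusters, and every tone in a cluster receives the same constellation, chosen from the average cluster SNR, which cuts both loading complexity and signalling overhead relative to per-tone loading. The Python function below is only an illustrative sketch under these assumptions, not the algorithm from the paper; the greedy selection rule, the constellation set, and the name clustered_bit_loading are invented for the example.

    import numpy as np

    def clustered_bit_loading(channel_gains, noise_power, cluster_size, total_bits,
                              bits_per_tone=(0, 1, 2, 4, 6)):
        # Illustrative clustered bit-loading sketch (not the paper's algorithm).
        # All tones in a cluster share one constellation, chosen greedily from
        # the average cluster SNR until total_bits per OFDM symbol are allocated.
        gains = np.asarray(channel_gains, dtype=float)
        n_clusters = len(gains) // cluster_size
        used = gains[:n_clusters * cluster_size].reshape(n_clusters, cluster_size)
        cluster_snr = used.mean(axis=1) / noise_power   # unit transmit power assumed

        level = np.zeros(n_clusters, dtype=int)         # index into bits_per_tone
        loaded = 0
        while loaded < total_bits:
            # Clusters that can still move to a larger constellation.
            candidates = [c for c in range(n_clusters)
                          if level[c] < len(bits_per_tone) - 1]
            if not candidates:
                break
            # Prefer the cluster with the largest SNR margin for its next level
            # (required SNR grows roughly like 2**bits for QAM).
            best = max(candidates,
                       key=lambda c: cluster_snr[c] / 2 ** bits_per_tone[level[c] + 1])
            loaded += (bits_per_tone[level[best] + 1]
                       - bits_per_tone[level[best]]) * cluster_size
            level[best] += 1
        # Expand cluster decisions back to a per-subcarrier bit map.
        per_tone = np.repeat([bits_per_tone[l] for l in level], cluster_size)
        return per_tone, loaded

    As a usage illustration, clustered_bit_loading(np.abs(h)**2, 0.01, cluster_size=8, total_bits=640) over 128 tones would load 640 bits per OFDM symbol in clusters of eight tones sharing one constellation each.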

    Optimum Averaging of Superimposed Training Schemes in OFDM under Realistic Time-Variant Channels

    The current global bandwidth shortage in orthogonal frequency division multiplexing (OFDM)-based systems motivates the use of more spectrally efficient techniques. Superimposed training (ST) is a candidate in this regard because it incurs no information rate loss. Additionally, it is flexible to deploy and has a low computational cost. However, the data symbols sent together with the training sequences cause an intrinsic interference. Previous studies, based on an oversimplified quasi-static channel model, have mitigated this interference by averaging the received signal over the coherence time. In this paper, the mean square error (MSE) of the channel estimation is minimized in a realistic time-variant scenario. The optimization problem is stated and theoretical derivations are presented to obtain the optimum number of OFDM symbols to average. The derived optimal averaging length depends on the signal-to-noise ratio (SNR) and yields an MSE up to two orders of magnitude lower than that obtained by averaging over the coherence time. Moreover, in most cases the optimal number of OFDM symbols to average is much shorter than the coherence time (a reduction of about 90%), which also decreases the system delay. These results therefore improve the channel estimation error while providing better energy efficiency and reduced delay.
    This work was supported by the Spanish National Project Hybrid Terrestrial/Satellite Air Interface for 5G and Beyond - Areas of Difficult Access (TERESA-ADA) [Ministerio de Economía y Competitividad (MINECO)/Agencia Estatal de Investigación (AEI)/Fondo Europeo de Desarrollo Regional (FEDER), Unión Europea (UE)] under Grant TEC2017-90093-C3-2-R.
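    The averaging trade-off described above can be made concrete with a small numerical sketch (an illustration, not the estimator or the closed-form optimum derived in this work): a least-squares channel estimate is formed by averaging N received frequency-domain OFDM symbols that carry data plus a superimposed pilot, and its MSE is measured against the channel in the middle of the window. Sweeping N and taking the minimizer at each SNR mimics the optimization described in the abstract. The function names, array shapes, and mid-window reference channel are assumptions.

    import numpy as np

    def st_channel_estimate(rx_symbols, pilot, n_avg):
        # rx_symbols: (num_ofdm_symbols, num_subcarriers) received frequency-domain
        #             symbols, each carrying data plus the superimposed pilot.
        # pilot:      (num_subcarriers,) known superimposed training sequence.
        # Averaging over n_avg symbols suppresses the zero-mean data interference
        # and noise roughly as 1/n_avg, but the channel drifts during the window.
        avg = rx_symbols[:n_avg].mean(axis=0)
        return avg / pilot                      # per-subcarrier LS estimate

    def averaging_mse(h_true, rx_symbols, pilot, n_avg):
        # MSE of the averaged estimate against the channel at mid-window (assumption).
        h_hat = st_channel_estimate(rx_symbols, pilot, n_avg)
        return np.mean(np.abs(h_hat - h_true[n_avg // 2]) ** 2)

    def best_window(h_true, rx_symbols, pilot, candidates):
        # Numerically pick the averaging length with the smallest estimation MSE;
        # the paper derives this optimum analytically as a function of the SNR.
        return min(candidates, key=lambda n: averaging_mse(h_true, rx_symbols, pilot, n))

    Running best_window over a simulated time-variant channel at several SNR values is one way to reproduce, numerically, the SNR-dependent optimum averaging length summarized above.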