
    Iterative Joint Channel Estimation and Multi-User Detection for Multiple-Antenna Aided OFDM Systems

    Multiple-Input-Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems have recently attracted substantial research interest. However, compared to Single-Input-Single-Output (SISO) systems, channel estimation in the MIMO scenario becomes more challenging, owing to the increased number of independent transmitter-receiver links to be estimated. In the context of the Bell LAyered Space-Time architecture (BLAST) or Space Division Multiple Access (SDMA) multi-user MIMO OFDM systems, none of the known channel estimation techniques allows the number of users to be higher than the number of receiver antennas, which is often referred to as a “rank-deficient” scenario, owing to the constraint imposed by the rank of the MIMO channel matrix. Against this background, in this paper we propose a new Genetic Algorithm (GA) assisted iterative Joint Channel Estimation and Multi-User Detection (GA-JCEMUD) approach for multi-user MIMO SDMA-OFDM systems, which provides an effective solution to the multi-user MIMO channel estimation problem in the above-mentioned rank-deficient scenario. Furthermore, the GAs invoked in the data detection literature can only provide a hard-decision output for the Forward Error Correction (FEC) or channel decoder, which inevitably limits the system’s achievable performance. By contrast, our proposed GA is capable of providing “soft” outputs and hence it becomes capable of achieving an improved performance with the aid of FEC decoders. A range of simulation results are provided to demonstrate the superiority of the proposed scheme.
    Index Terms: channel estimation, genetic algorithm, multiple-input-multiple-output, multi-user detection, orthogonal frequency division multiplexing, space division multiple access.
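    To make the GA-assisted detection step more concrete, the following is a minimal sketch of a GA searching over candidate multi-user symbol vectors for a single subcarrier, given a (possibly GA-refined) channel estimate. It assumes a per-subcarrier model y = Hx + n with BPSK users and a Euclidean-distance fitness; all names (ga_mud, fitness, population sizes) are illustrative rather than taken from the paper, and the joint channel-estimation loop is omitted.
```python
# Minimal sketch of a GA-based multi-user detector for one OFDM subcarrier.
# Assumed model: y = H x + n, with U users sending BPSK symbols and P receive
# antennas; H is the (estimated) P x U channel matrix. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def fitness(pop, y, H):
    """Negative squared Euclidean distance ||y - H x||^2 for each candidate."""
    residual = y[:, None] - H @ pop.T            # shape (P, pop_size)
    return -np.sum(np.abs(residual) ** 2, axis=0)

def ga_mud(y, H, pop_size=40, generations=50, p_mut=0.05):
    U = H.shape[1]
    pop = rng.choice([-1.0, 1.0], size=(pop_size, U))     # random BPSK candidates
    for _ in range(generations):
        f = fitness(pop, y, H)
        # Tournament selection: keep the fitter of two randomly drawn candidates.
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
        # Uniform crossover between each parent and its neighbour in the array.
        mask = rng.random((pop_size, U)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Bit-flip mutation (sign flip for BPSK).
        children *= np.where(rng.random((pop_size, U)) < p_mut, -1.0, 1.0)
        pop = children
    return pop[np.argmax(fitness(pop, y, H))]    # hard-decision symbol estimate

# Toy usage: 4 users, 2 receive antennas (an overloaded / rank-deficient case).
U, P = 4, 2
H = (rng.standard_normal((P, U)) + 1j * rng.standard_normal((P, U))) / np.sqrt(2)
x_true = rng.choice([-1.0, 1.0], size=U)
y = H @ x_true + 0.1 * (rng.standard_normal(P) + 1j * rng.standard_normal(P))
print(ga_mud(y, H), x_true)
```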

    A Linear Multi-User Detector for STBC MC-CDMA Systems based on the Adaptive Implementation of the Minimum-Conditional Bit-Error-Rate Criterion and on Genetic Algorithm-assisted MMSE Channel Estimation

    The implementation of efficient baseband receivers characterized by an affordable computational load is a crucial point in the development of transmission systems exploiting diversity in different domains. In this paper, we propose a linear multi-user detector for MIMO MC-CDMA systems with Alamouti’s Space-Time Block Coding, inspired by the concept of the Minimum Conditional Bit-Error-Rate (MCBER) and relying on Genetic Algorithm (GA)-assisted MMSE channel estimation. The MCBER combiner is implemented in an adaptive way by using Least-Mean-Square (LMS) optimization. First, we analyze the proposed adaptive MCBER MUD receiver with ideal knowledge of the Channel State Information (CSI). Afterwards, we consider the complete receiver structure, which also encompasses the non-ideal GA-assisted channel estimation. Simulation results show that the proposed MCBER receiver always outperforms state-of-the-art receiver schemes based on the EGC and MMSE criteria that exploit the same degree of channel knowledge (i.e., ideal or estimated CSI).
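    As an illustration of the adaptive combiner update, the sketch below uses a standard complex LMS recursion driven by reference symbols; the paper’s MCBER cost would replace the plain MSE gradient used here, and all identifiers are hypothetical. During a training-aided phase the reference symbols are known pilots, while in a decision-directed phase they would be replaced by tentative symbol decisions.
```python
# Sketch of an LMS-adapted linear combiner for one user (illustrative names).
# Assumptions: R[k] is the received per-subcarrier sample vector at step k,
# d[k] is the reference (training or decision-directed) symbol, w the weights.
# This is the standard MSE-driven complex LMS, not the exact MCBER gradient.
import numpy as np

def lms_combiner(R, d, mu=0.01):
    """R: (K, N) received vectors, d: (K,) reference symbols. Returns weights."""
    K, N = R.shape
    w = np.zeros(N, dtype=complex)
    for k in range(K):
        y = np.vdot(w, R[k])            # combiner output w^H r
        e = d[k] - y                    # error against the reference symbol
        w += mu * np.conj(e) * R[k]     # complex LMS weight update
    return w
```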

    Iterative Joint Channel Estimation and Symbol Detection for Multi-User MIMO OFDM

    Multiple-Input-Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems have recently attracted substantial research interest. However, compared to Single-Input-Single-Output (SISO) systems, channel estimation in the MIMO scenario becomes more challenging, owing to the increased number of independent transmitter-receiver links to be estimated. In the context of the Bell LAyered Space-Time architecture (BLAST) or Space Division Multiple Access (SDMA) multi-user MIMO OFDM literature, no channel estimation technique allows the number of users to be higher than the number of receiver antennas, which is often referred to as an “overloaded” scenario. In this contribution we propose a new Genetic Algorithm (GA) assisted iterative joint channel estimation and multiuser detection approach for MIMO SDMA-OFDM systems, which exhibits a robust performance in the above-mentioned overloaded scenario. Furthermore, GA-aided Multi-User Detection (MUD) techniques found in the literature can only provide a hard-decision output, while the proposed GA is capable of providing “soft” outputs, hence achieving an improved performance with the aid of channel decoders. Finally, a range of simulation results are provided to demonstrate the superiority of the proposed scheme.
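    The “soft” outputs mentioned above can be illustrated by forming bit log-likelihood ratios (LLRs) from the GA’s final population rather than returning only the fittest individual. The sketch below uses a max-log style approximation under a Gaussian noise assumption; the variable names and the exact LLR scaling are illustrative, not the paper’s notation.
```python
# Sketch: max-log LLRs computed from a GA's final population of BPSK candidates.
# Assumptions: pop is (pop_size, U) with entries +/-1, the metric is
# ||y - H x||^2, and sigma2 is the complex noise variance (illustrative names).
import numpy as np

def soft_outputs(pop, y, H, sigma2):
    metrics = np.sum(np.abs(y[:, None] - H @ pop.T) ** 2, axis=0)  # one per candidate
    U = pop.shape[1]
    llr = np.zeros(U)
    for u in range(U):
        plus = metrics[pop[:, u] > 0]       # candidates hypothesising bit u = +1
        minus = metrics[pop[:, u] < 0]      # candidates hypothesising bit u = -1
        # Max-log approximation; the LLR saturates (becomes +/-inf) if one
        # hypothesis is absent from the surviving population.
        m_plus = plus.min() if plus.size else np.inf
        m_minus = minus.min() if minus.size else np.inf
        llr[u] = (m_minus - m_plus) / sigma2
    return llr   # these LLRs can be passed on to the channel decoder
```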

    A genetic algorithm-assisted semi-adaptive MMSE multi-user detection for MC-CDMA mobile communication systems

    In this work, a novel Minimum Mean-Squared-Error (MMSE) multi-user detector is proposed for MC-CDMA transmission systems operating over mobile radio channels characterized by time-varying multipath fading. The proposed MUD algorithm is based on a Genetic Algorithm (GA)-assisted per-carrier MMSE criterion. The GA block works in two successive steps: a training-aided step aimed at computing the optimal receiver weights using a very short training sequence, and a decision-directed step aimed at dynamically updating the weight vector during a channel coherence period. Numerical results show a BER performance almost coincident with that yielded by the ideal MMSE-MUD based on perfect knowledge of the channel impulse response. The proposed GA-assisted MMSE-MUD clearly outperforms state-of-the-art adaptive MMSE receivers based on deterministic gradient algorithms, especially for a high number of transmitting users.
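    As a reference point, the ideal per-carrier MMSE weights that the numerical results compare against (i.e., given perfect knowledge of the channel) can be written in closed form. The sketch below computes them per subcarrier, assuming unit-power symbols and a P x U effective channel/spreading matrix on each subcarrier; H_list, sigma2 and the function name are illustrative.
```python
# Sketch: ideal per-subcarrier MMSE combining weights for an MC-CDMA uplink.
# Assumptions: H_list[k] is the (P x U) effective channel/spreading matrix on
# subcarrier k, sigma2 is the noise variance, symbols have unit power.
import numpy as np

def mmse_weights(H_list, sigma2):
    """Return a list of (P x U) MMSE weight matrices, one per subcarrier."""
    weights = []
    for H in H_list:
        P = H.shape[0]
        # W = (H H^H + sigma2 I)^{-1} H ; user u is detected as W[:, u]^H r.
        W = np.linalg.solve(H @ H.conj().T + sigma2 * np.eye(P), H)
        weights.append(W)
    return weights
```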

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks.

    Multiuser MIMO-OFDM for Next-Generation Wireless Systems

    This overview portrays the 40-year evolution of orthogonal frequency division multiplexing (OFDM) research. The combination of powerful multicarrier OFDM arrangements with multiple-input multiple-output (MIMO) systems has numerous benefits, which are detailed in this treatise. We continue by highlighting the limitations of conventional detection and channel estimation techniques designed for multiuser MIMO OFDM systems in the so-called rank-deficient scenarios, where the number of users supported or the number of transmit antennas employed exceeds the number of receiver antennas. This is often encountered in practice, unless we limit the number of users granted access in the base station’s or radio port’s coverage area. Following a historical perspective on the associated design problems and their state-of-the-art solutions, the second half of this treatise details a range of classic multiuser detectors (MUDs) designed for MIMO-OFDM systems and characterizes their achievable performance. A further section identifies novel cutting-edge genetic algorithm (GA)-aided detector solutions, which have found numerous applications in wireless communications in recent years. In an effort to stimulate the cross-pollination of ideas across the machine learning, optimization, signal processing, and wireless communications research communities, we review the broadly applicable principles of various GA-assisted optimization techniques, which were recently also proposed for employment in multiuser MIMO OFDM. In order to stimulate new research, we demonstrate that the family of GA-aided MUDs is capable of achieving a near-optimum performance at the cost of a significantly lower computational complexity than that imposed by their optimum maximum-likelihood (ML) MUD counterparts. The paper is concluded by outlining a range of future research options that may find their way into next-generation wireless systems.
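    To make the complexity comparison concrete: the optimum ML MUD evaluates every one of the M^U multi-user symbol combinations per subcarrier (M-ary modulation, U users), whereas a GA-aided MUD evaluates only on the order of population-size times generations candidates. A brute-force ML sketch for the BPSK case, with hypothetical names, is shown below.
```python
# Sketch: brute-force ML multi-user detection over all BPSK combinations.
# The cost grows as 2^U per subcarrier; a GA-aided MUD instead evaluates only
# (population size x generations) candidates. Illustrative, not the paper's code.
import itertools
import numpy as np

def ml_mud(y, H):
    """Exhaustive ML search: y is (P,), H is (P, U); returns the best BPSK vector."""
    U = H.shape[1]
    best_x, best_metric = None, np.inf
    for bits in itertools.product([-1.0, 1.0], repeat=U):   # 2^U hypotheses
        x = np.array(bits)
        metric = np.sum(np.abs(y - H @ x) ** 2)
        if metric < best_metric:
            best_x, best_metric = x, metric
    return best_x
```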

    Implementable Wireless Access for B3G Networks - III: Complexity Reducing Transceiver Structures

    This article presents a comprehensive overview of some of the research conducted within Mobile VCE’s Core Wireless Access Research Programme, a key focus of which has naturally been on MIMO transceivers. The series of articles offers a coherent view of how the work was structured and comprises a compilation of material that has been presented in detail elsewhere (see references within the article). In this article, MIMO channel measurements, analysis, and modeling, which were presented previously in the first article of this series of four, are utilized to develop compact and distributed antenna arrays. Parallel activities led to research into low-complexity MIMO single-user space-time coding techniques, as well as SISO and MIMO multi-user CDMA-based transceivers for B3G systems. As well as feeding into the industry’s in-house research program, significant extensions of this work are now in hand within Mobile VCE’s own core activity, aiming toward securing major improvements in the delivery efficiency of future wireless systems through cross-layer operation.