
    Numerical Simulation and Design Assessment of Limited Feedback Channel Estimation in Massive MIMO Communication System

    The Internet of Things (IoT) has attracted great interest from governments, industry, and academia as an evolving technology that aims to make everything connected and able to communicate and exchange data. Massive connectivity, stringent energy restrictions, and ultra-reliable transmission are its most distinctive requirements. Massive multiple-input multiple-output (MIMO) is a natural enabling technology for the IoT: coherent processing of the signals from a large-scale antenna array yields enormous spectral/energy-efficiency gains and dramatically improves transmission reliability. However, this coherent processing relies on accurate channel state information (CSI) between the base station (BS) and the users. This research proposes an adaptive sparse channel estimation scheme with limited feedback to obtain accurate and timely CSI for massive MIMO systems operating in frequency-division duplex (FDD) mode. In a conventional limited-feedback scheme, the feedback burden grows linearly with the number of BS antennas. This work offers a limited-feedback algorithm that alleviates that burden by exploiting a dual-directional (DD) representation of the MIMO channel, using uniform dictionaries indexed by the angle of arrival (AoA) and angle of departure (AoD). Even when the number of BS transmit antennas is large, the algorithm achieves acceptable channel estimation accuracy with a limited number of feedback bits, making it suitable for 5G massive MIMO. Simulation results indicate that the proposed algorithm approaches the performance limit.
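The idea behind angular-domain limited feedback can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a half-wavelength uniform linear array, a hypothetical few-path geometric channel, and a plain unitary DFT as the angular dictionary. The user transforms the channel into the angular domain, where it is approximately sparse, and feeds back only the strongest coefficients instead of all antenna-domain entries.

```python
import numpy as np

def ula_steering(n, theta):
    # Steering vector of an n-element half-wavelength ULA for angle theta.
    k = np.arange(n)
    return np.exp(1j * np.pi * k * np.sin(theta)) / np.sqrt(n)

rng = np.random.default_rng(0)
n_tx, n_paths = 64, 3

# Hypothetical geometric channel: a sum of a few propagation paths.
angles = rng.uniform(-np.pi / 2, np.pi / 2, n_paths)
h = sum(rng.standard_normal() * ula_steering(n_tx, a) for a in angles)

# Angular-domain representation via a unitary DFT dictionary.
D = np.fft.fft(np.eye(n_tx)) / np.sqrt(n_tx)
h_ang = D.conj().T @ h

# Limited feedback: keep (and feed back) only the k strongest coefficients.
k = 8
idx = np.argsort(np.abs(h_ang))[-k:]
h_fb = np.zeros_like(h_ang)
h_fb[idx] = h_ang[idx]

# BS reconstructs the channel from the k fed-back coefficients.
h_hat = D @ h_fb
nmse = np.linalg.norm(h - h_hat) ** 2 / np.linalg.norm(h) ** 2
```

Because the paths are off-grid, some power leaks across angular bins, so the reconstruction is approximate; the paper's uniform AoA/AoD dictionaries and adaptive scheme are aimed at improving exactly this trade-off between feedback bits and accuracy.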

    Sparsifying Dictionary Learning for Beamspace Channel Representation and Estimation in Millimeter-Wave Massive MIMO

    Millimeter-wave massive multiple-input-multiple-output (mmWave mMIMO) is reported as a key enabler in fifth-generation communication and beyond. It is customary to use a lens antenna array to transform a mmWave mMIMO channel into a beamspace where the channel exhibits sparsity. Exploiting this sparsity enables the applicability of hybrid precoding and achieves pilot reduction. This beamspace transformation is equivalent to performing a Fourier transformation of the channel. A motivation for the Fourier character of this transformation is the fact that the steering response vectors in antenna arrays are Fourier basis vectors. Still, a Fourier transformation is not necessarily the optimal one, for several reasons. Accordingly, this paper proposes using a learned sparsifying dictionary as the transformation operator, leading to another beamspace. Since the dictionary is obtained by training over actual channel measurements, this transformation is shown to yield two immediate advantages. First, it enhances channel sparsity, thereby leading to more efficient pilot reduction. Second, it improves the channel representation quality, and thus reduces the underlying power leakage phenomenon. Consequently, this allows for both improved channel estimation and facilitated beam selection in mmWave mMIMO. This is especially the case when the antenna array is not perfectly uniform. In addition, a learned dictionary is also used as the precoding operator for the same reasons. Extensive simulations under various operating scenarios and environments validate the added benefits of using learned dictionaries in improving the channel estimation quality and the beam selectivity, thereby improving the spectral efficiency. Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
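The pilot-reduction argument can be made concrete with a compressed-sensing sketch. This is an illustration under simplifying assumptions, not the paper's method: the dictionary here is the unitary DFT that the paper takes as its baseline (a learned dictionary would replace `D` with one trained on channel measurements), the beamspace channel is taken as exactly sparse on the grid, and the pilots are modeled as random complex combining. With far fewer pilot observations than antennas, the sparse beamspace coefficients are recovered by orthogonal matching pursuit (OMP).

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, s = 64, 20, 3  # antennas, pilot observations (m << n), sparsity

# Baseline beamspace dictionary: unitary DFT. A learned sparsifying
# dictionary, as proposed in the paper, would be substituted here.
D = np.fft.fft(np.eye(n)) / np.sqrt(n)

# Exactly s-sparse beamspace channel (on-grid assumption, for illustration).
x = np.zeros(n, complex)
supp = rng.choice(n, s, replace=False)
x[supp] = rng.standard_normal(s) + 1j * rng.standard_normal(s)
h = D @ x

# Compressed pilot observations y = A h with random pilot combining.
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(m)
y = A @ h
Phi = A @ D  # effective sensing matrix acting on the sparse coefficients

def omp(Phi, y, s):
    # Orthogonal matching pursuit: greedily pick the most correlated atom,
    # then re-fit all selected coefficients by least squares.
    r, support = y.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(Phi.conj().T @ r))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1], complex)
    x_hat[support] = coef
    return x_hat

h_hat = D @ omp(Phi, y, s)
nmse = np.linalg.norm(h - h_hat) ** 2 / np.linalg.norm(h) ** 2
```

The sparser the channel is in the chosen beamspace, the fewer pilots are needed for a given accuracy, which is why a dictionary trained to sparsify real measurements can reduce pilot overhead relative to the fixed Fourier basis.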