20 research outputs found

    Design Simulation and Performance Assessment of Improved Channel Estimation for Millimeter Wave Massive MIMO Systems

    Get PDF
    In this paper, we investigate improved channel estimation for massive MIMO in 5G systems. Massive MIMO uses a large number of low-cost, low-power antennas at the base station. These antennas provide benefits such as improved spectral efficiency, which allows the base station to serve more users, as well as reduced latency due to reduced fading and lower power consumption. By employing a lens antenna array, beamspace MIMO can use beam selection to reduce the number of required RF chains in mmWave massive MIMO systems without obvious performance loss. However, to achieve capacity-approaching performance, beam selection requires accurate information about the large-dimensional beamspace channel, which is challenging to obtain, especially when the number of RF chains is limited. To solve this problem, we propose a reliable support detection (SD)-based channel estimation scheme. We first design an adaptive selecting network for mmWave massive MIMO systems with a lens antenna array and, based on this network, formulate the beamspace channel estimation problem as a sparse signal recovery problem. Then, by fully utilizing the structural characteristics of the mmWave beamspace channel, we propose an SD-based channel estimation scheme with reliable performance and low pilot overhead. Finally, performance and complexity analyses are provided to show that the proposed SD-based scheme can estimate the support of the sparse beamspace channel with comparable or higher accuracy than conventional schemes. Simulation results verify that the proposed SD-based channel estimation scheme outperforms conventional schemes and retains satisfying accuracy even in the low-SNR region, since the structural characteristics of the beamspace channel can be exploited.
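    A minimal numerical sketch of the general support-detection idea described above (not the authors' exact algorithm): each path's energy is assumed to concentrate in a small window of beamspace entries around its strongest beam, so the support can be detected from correlations with the residual and the channel refit by least squares. The sensing matrix `Phi`, the window size, and the path count are illustrative assumptions.

```python
import numpy as np

def sd_channel_estimate(y, Phi, num_paths=3, window=4):
    """Sketch of support-detection-based sparse beamspace channel recovery.

    y   : received pilot measurements, shape (M,)
    Phi : known sensing matrix mapping the beamspace channel to measurements, shape (M, N)
    Assumes each path's energy is concentrated around one strong beamspace entry.
    """
    N = Phi.shape[1]
    residual = y.copy()
    h_hat = np.zeros(N, dtype=complex)
    for _ in range(num_paths):
        # Correlate the residual with each column to find the strongest beam index.
        corr = np.abs(Phi.conj().T @ residual)
        center = int(np.argmax(corr))
        # Structural prior: the path's support is a small window around the peak.
        support = np.unique(np.clip(np.arange(center - window, center + window + 1), 0, N - 1))
        # Least-squares estimate restricted to the detected support.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], residual, rcond=None)
        h_hat[support] += coeffs
        # Remove this path's contribution before detecting the next one.
        residual = y - Phi @ h_hat
    return h_hat
```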

    Numerical Simulation and Design Assessment of Limited Feedback Channel Estimation in Massive MIMO Communication System

    Get PDF
    The Internet of Things (IoT) has attracted a great deal of interest in fields including government, business, and academia as an evolving technology that aims to make anything connected, able to communicate, and able to exchange data. Massive connectivity, stringent energy restrictions, and ultra-reliable transmission requirements are among the most distinctive features of IoT. Massive multiple-input multiple-output (MIMO) is a natural supporting technology for IoT, since coherent processing of the signals of a large-scale antenna array yields substantial spectral/energy efficiency gains and dramatically boosts IoT transmission reliability. However, this coherent processing relies on accurate estimation of the channel state information (CSI) between the BS and the users. Massive MIMO is thus a powerful supporting technology that fulfils the energy/spectral efficiency and reliability needs of the IoT, but its benefit depends on the availability of CSI. This research proposes an adaptive sparse channel estimation scheme with limited feedback to obtain accurate and timely CSI for massive MIMO systems operating in frequency division duplex (FDD) mode. A limited feedback scheme must relieve the feedback burden, which otherwise grows in linear proportion to the number of base station antennas. This work offers a limited feedback algorithm that alleviates the burden by means of a dual-directional (DD) representation of the MIMO channel using uniform dictionaries linked to the angle of arrival (AoA) and angle of departure (AoD). Although the number of transmit antennas at the BS is large, the algorithm offers acceptable channel estimation accuracy using a limited number of feedback bits, making it suitable for 5G massive MIMO. Simulation results indicate that the proposed algorithm can approach the performance limit.
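    A rough sketch of the dictionary-based limited feedback idea, under simplifying assumptions that are not taken from the paper: uniform DFT dictionaries serve as the AoA/AoD grids, the channel is narrowband, and the bit allocation (grid indices plus coarsely quantised gains) is purely illustrative.

```python
import numpy as np

def angular_dictionary(n):
    """Uniform DFT dictionary whose columns are steering vectors on a grid of angles."""
    k = np.arange(n)
    return np.exp(-2j * np.pi * np.outer(k, k) / n) / np.sqrt(n)

def limited_feedback(H, num_paths=2, gain_bits=4):
    """Sketch of dictionary-based limited CSI feedback for an Nr x Nt channel H.

    The user projects H onto AoA/AoD dictionaries, keeps the strongest entries,
    and feeds back only their grid indices plus coarsely quantised gains.
    Parameter names and bit allocation are illustrative, not from the paper.
    """
    Nr, Nt = H.shape
    A_r, A_t = angular_dictionary(Nr), angular_dictionary(Nt)
    # Angular-domain representation: approximately sparse when paths align with the grid.
    G = A_r.conj().T @ H @ A_t
    flat = np.abs(G).ravel()
    idx = np.argsort(flat)[-num_paths:]        # strongest AoA/AoD grid pairs
    gains = G.ravel()[idx]
    # Quantise each complex gain's magnitude and phase with gain_bits each
    # (in practice the magnitude scaling reference would also have to be shared).
    mag_q = np.round(np.abs(gains) / np.abs(gains).max() * (2**gain_bits - 1))
    phase_q = np.round((np.angle(gains) + np.pi) / (2 * np.pi) * (2**gain_bits - 1))
    feedback_bits = num_paths * (int(np.ceil(np.log2(Nr * Nt))) + 2 * gain_bits)
    return idx, mag_q, phase_q, feedback_bits
```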

    A Learnable Optimization and Regularization Approach to Massive MIMO CSI Feedback

    Get PDF
    Channel state information (CSI) plays a critical role in achieving the potential benefits of massive multiple-input multiple-output (MIMO) systems. In frequency division duplex (FDD) massive MIMO systems, the base station (BS) relies on sustained and accurate CSI feedback from the users. However, due to the large number of antennas and users being served in massive MIMO systems, the feedback overhead can become a bottleneck. In this paper, we propose a model-driven deep learning method for CSI feedback, called the learnable optimization and regularization algorithm (LORA). Instead of using the l1-norm as the regularization term, a learnable regularization module is introduced in LORA to automatically adapt to the characteristics of the CSI. We unfold the conventional iterative shrinkage-thresholding algorithm (ISTA) into a neural network and learn both the optimization process and the regularization term by end-to-end training. We show that LORA improves CSI feedback accuracy and speed. In addition, a novel learnable quantization method and the corresponding training scheme are proposed, and it is shown that LORA can operate successfully at different bit rates, providing flexibility in terms of the CSI feedback overhead. Various realistic scenarios are considered to demonstrate the effectiveness and robustness of LORA through numerical simulations.
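    For reference, a minimal sketch of the conventional ISTA baseline that LORA unfolds, including the hand-crafted l1 soft-thresholding step that LORA replaces with a learnable regularization module. Real-valued signals, the step-size choice, and the variable names are illustrative assumptions, not details from the paper.

```python
import numpy as np

def soft_threshold(x, theta):
    """ISTA's proximal step for the l1 regulariser; LORA swaps this
    hand-crafted shrinkage for a small learnable module."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def ista(y, A, num_iters=50, theta=0.01):
    """Conventional ISTA for y ~ A @ x with sparse x (the baseline that is unfolded).

    A is the measurement/compression matrix; the step size is set from the
    largest singular value of A. All names here are illustrative.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - y)                    # gradient of the data-fit term
        x = soft_threshold(x - step * grad, step * theta)
    return x
```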

    Performance Prediction of Underwater Acoustic Communications Based on Channel Impulse Responses

    Get PDF
    Featured Application: Convolutional neural networks are used on channel impulse response data to predict the performance of underwater acoustic communications.
    Abstract: Predicting the channel quality of an underwater acoustic communication link is not a straightforward task. Previous approaches have focused on either physical observations of weather or engineered signal features, some of which require substantial processing to obtain. This work applies a convolutional neural network to the channel impulse responses, allowing the network to learn the features that are useful in predicting the channel quality. Results obtained are comparable to or better than those of conventional supervised learning models, depending on the dataset. The universality of the learned features is also demonstrated by strong prediction performance when transferring from a more complex underwater acoustic channel to a simpler one.
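    A small illustrative sketch (written in PyTorch, which is an assumption here; the paper's actual architecture and hyperparameters are not reproduced) of a 1-D convolutional network that maps a channel impulse response to a channel-quality label.

```python
import torch
import torch.nn as nn

class CIRQualityNet(nn.Module):
    """Sketch of a small 1-D CNN that maps a channel impulse response
    (magnitude over delay taps) to a channel-quality class, e.g. good/bad link.
    Layer sizes and the two-class output are illustrative assumptions."""

    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.AdaptiveAvgPool1d(1),            # pools to a fixed-length descriptor
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, cir):                     # cir: (batch, 1, num_delay_taps)
        return self.classifier(self.features(cir).flatten(1))

# Usage sketch: predict quality logits for a batch of 8 CIRs with 256 delay taps each.
model = CIRQualityNet()
logits = model(torch.randn(8, 1, 256))
```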

    Context-Tree-Based Lossy Compression and Its Application to CSI Representation

    Full text link
    We propose novel compression algorithms for time-varying channel state information (CSI) in wireless communications. The proposed scheme combines (lossy) vector quantisation and (lossless) compression. First, the new vector quantisation technique is based on a class of parametrised companders applied to each component of the normalised CSI vector. Our algorithm chooses a suitable compander in an intuitively simple way whenever empirical data are available. Then, the sequences of quantisation indices are compressed using a context-tree-based approach. Essentially, we update the estimate of the conditional distribution of the source at each instant and encode the current symbol with the estimated distribution. The algorithms have low complexity, are linear-time in both the spatial dimension and time duration, and can be implemented in an online fashion. We run simulations to demonstrate the effectiveness of the proposed algorithms in such scenarios.
    Comment: 12 pages, 9 figures. Accepted for publication in the IEEE Transactions on Communications
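    A toy sketch of the two ingredients, under stated assumptions: a mu-law-style compander stands in for the paper's parametrised companders, and a fixed-order adaptive conditional model with Laplace smoothing stands in for the full context-tree estimator; it only measures the ideal code length of a quantisation-index sequence rather than producing an actual bitstream.

```python
import numpy as np
from collections import defaultdict

def mu_law_quantise(x, mu=50.0, levels=16):
    """Compander-based scalar quantisation of normalised CSI components in [-1, 1].
    The mu parameter is an illustrative stand-in for a parametrised compander."""
    compressed = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    return np.round((compressed + 1) / 2 * (levels - 1)).astype(int)

def context_costs(indices, order=1, levels=16):
    """Ideal code length of a quantisation-index sequence under an order-`order`
    adaptive model: the conditional distribution for each context is updated
    online and the current symbol is 'encoded' with the current estimate."""
    counts = defaultdict(lambda: np.ones(levels))   # per-context symbol counts (Laplace prior)
    bits = 0.0
    for t in range(order, len(indices)):
        ctx = tuple(indices[t - order:t])
        p = counts[ctx] / counts[ctx].sum()
        bits += -np.log2(p[indices[t]])             # ideal code length for this symbol
        counts[ctx][indices[t]] += 1                # update the estimate after coding
    return bits

# Usage sketch: quantise a normalised CSI trace, then estimate its coding cost.
csi_trace = np.clip(np.random.randn(1000) * 0.3, -1, 1)
print(context_costs(mu_law_quantise(csi_trace)))
```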