594 research outputs found

    Modeling and Performance of Uplink Cache-Enabled Massive MIMO Heterogeneous Networks

    The uploading of user-generated content to the Internet through applications such as social media places a significant burden on wireless networks. To cope with this mobile data tsunami, we develop a novel multiple-input multiple-output (MIMO) network architecture with randomly located base stations (BSs), each equipped with a large number of antennas and employing cache-enabled uplink transmission. In particular, we formulate a scenario where the users upload their content to their strongest BSs, whose locations follow a Poisson point process. In addition, the BSs, exploiting the benefits of massive MIMO, forward their contents to the core network by means of a finite-rate backhaul. After proposing the caching policies, in which the modified von Mises distribution serves as the file-popularity distribution, we derive the outage probability and the average delivery rate by taking advantage of tools from deterministic-equivalent and stochastic-geometry analyses. Numerical results investigate the realistic performance gains of the proposed cache-enabled heterogeneous uplink in terms of the cardinal operating parameters. For example, insights regarding the BS storage size are exposed. Moreover, the impacts of key parameters such as the file-popularity distribution and the target bitrate are investigated. Specifically, the outage probability decreases as the storage size increases, while the average delivery rate increases. In addition, the concentration parameter, which determines how the popularity mass is spread over the files stored at the intermediate nodes, directly affects the proposed metrics. Furthermore, a higher target rate results in higher outage because fewer users satisfy this constraint. Also, we demonstrate that a denser network decreases the outage and increases the delivery rate. Hence, introducing caching at the uplink of the system design ameliorates the network performance. (Peer reviewed)
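The caching policy sketched in the abstract pairs a popularity model (the modified von Mises distribution, with its concentration parameter) with a finite BS storage size. A minimal illustrative sketch of that idea, under the assumption that files are mapped onto angles and the most popular files are cached (the paper's exact policy and parameterization are not reproduced here):

```python
import numpy as np

def vonmises_popularity(num_files, kappa=3.0, mu=0.0):
    """Hypothetical sketch: map file indices onto [-pi, pi) and use the
    von Mises density as a file-popularity distribution; kappa is the
    concentration parameter discussed in the abstract."""
    theta = np.linspace(-np.pi, np.pi, num_files, endpoint=False)
    pdf = np.exp(kappa * np.cos(theta - mu)) / (2 * np.pi * np.i0(kappa))
    return pdf / pdf.sum()  # normalize over the finite catalogue

def cache_most_popular(popularity, storage_size):
    """Simple policy: cache the `storage_size` most popular files."""
    return np.argsort(popularity)[::-1][:storage_size]

pop = vonmises_popularity(num_files=100, kappa=3.0)
cached = cache_most_popular(pop, storage_size=10)
hit_prob = pop[cached].sum()  # probability a requested file is cached
```

Increasing `storage_size` raises `hit_prob`, matching the abstract's observation that a larger storage size lowers outage; a larger `kappa` concentrates popularity on fewer files, so the same storage captures more requests.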

    Nuts and Bolts of a Realistic Stochastic Geometric Analysis of mmWave HetNets: Hardware Impairments and Channel Aging

    © 2019 IEEE. Motivated by heterogeneous network (HetNet) design in improving coverage and by millimeter-wave (mmWave) transmission offering an abundance of extra spectrum, we present a general analytical framework shedding light on the downlink of realistic mmWave HetNets consisting of K tiers of randomly located base stations. Specifically, we model, by virtue of stochastic geometry tools, the multi-tier multi-user (MU) multiple-input multiple-output (MIMO) mmWave network degraded by the inevitable residual additive transceiver hardware impairments (RATHIs) and channel aging. Given this setting, we derive the coverage probability and the area spectral efficiency (ASE), and we subsequently evaluate the impact of residual transceiver hardware impairments and channel aging on these metrics. Different path-loss laws for line-of-sight and non-line-of-sight links, which are among the distinguishing features of mmWave systems, are accounted for in the analysis. Among the findings, we show that the RATHIs have a meaningful impact in the high-signal-to-noise-ratio regime, and that the transmit additive distortion degrades the system performance more than the receive distortion. Moreover, serving fewer users proves to be preferable, and the more directive the mmWaves are, the higher the ASE becomes. (Peer reviewed; final accepted version)
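The coverage probability derived analytically in this abstract can also be checked numerically. A minimal Monte Carlo sketch, assuming a single-tier Poisson field of base stations, nearest-BS association, Rayleigh fading, and an interference-limited regime (the paper's K-tier, impairment-aware model is far richer than this):

```python
import numpy as np

rng = np.random.default_rng(0)

def coverage_probability(lam, sinr_thresh_db, alpha=4.0, radius=50.0, trials=2000):
    """Monte Carlo sketch (not the paper's closed form): drop PPP base
    stations of density `lam` in a disc, associate the origin user with
    the nearest BS, and count how often its SINR exceeds the threshold."""
    thresh = 10 ** (sinr_thresh_db / 10)
    covered = 0
    for _ in range(trials):
        n = rng.poisson(lam * np.pi * radius**2)
        if n == 0:
            continue
        r = radius * np.sqrt(rng.random(n))    # distances, uniform in the disc
        h = rng.exponential(1.0, n)            # Rayleigh fading powers
        p = h * r ** (-alpha)                  # received powers at the origin
        i = np.argmin(r)                       # serve from the nearest BS
        sinr = p[i] / (p.sum() - p[i] + 1e-9)  # interference-limited regime
        covered += sinr > thresh
    return covered / trials

pc = coverage_probability(lam=1e-2, sinr_thresh_db=0.0)
```

Raising `sinr_thresh_db` lowers the estimate, mirroring the usual trade-off between target rate and coverage in stochastic-geometry analyses.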

    Design Simulation and Performance Assessment of Improved Channel Estimation for Millimeter Wave Massive MIMO Systems

    In this paper, we optimize specific aspects of massive MIMO use in 5G systems. Massive MIMO deploys a large number of low-cost, low-power antennas at the base stations. These antennas provide benefits such as improved spectral efficiency, which allows the base station to serve more users, as well as reduced latency owing to less fading and lower power consumption. By employing a lens antenna array, beamspace MIMO can use beam selection to reduce the number of required RF chains in mmWave massive MIMO systems without obvious performance loss. However, to achieve capacity-approaching performance, beam selection requires accurate information about the large beamspace channel, which is challenging to acquire, especially when the number of RF chains is limited. To solve this problem, we propose a reliable support detection (SD)-based channel estimation scheme. We first design an adaptive selecting network for mmWave massive MIMO systems with a lens antenna array, and based on this network, we formulate beamspace channel estimation as a sparse signal recovery problem. Then, by fully utilizing the structural characteristics of the mmWave beamspace channel, we propose an SD-based channel estimation scheme with reliable performance and low pilot overhead. Finally, performance and complexity analyses show that the proposed SD-based scheme estimates the support of the sparse beamspace channel with comparable or higher accuracy than conventional schemes. Simulation results verify that the proposed scheme outperforms conventional schemes and retains satisfactory accuracy even in the low-SNR region, since the structural characteristics of the beamspace channel are exploited
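The core of the support-detection idea above is that the beamspace channel is sparse, so estimating which few entries are active (the support) is most of the work. A minimal illustrative sketch, assuming a noisy beamspace observation and a simple keep-the-k-strongest rule (the paper's SD algorithm exploits additional structure that is not modeled here):

```python
import numpy as np

def support_detection(y, k):
    """Hypothetical sketch of support detection: keep the k strongest
    beamspace entries of the noisy observation y and zero out the rest."""
    support = np.argsort(np.abs(y))[::-1][:k]
    h_hat = np.zeros_like(y)
    h_hat[support] = y[support]          # retain only the detected support
    return h_hat, np.sort(support)

rng = np.random.default_rng(1)
n, k = 64, 4                             # beams vs. dominant path components
h = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
h[idx] = rng.uniform(2.0, 4.0, size=k)   # sparse beamspace channel
y = h + rng.normal(0.0, 0.05, size=n)    # observation with mild noise
h_hat, supp = support_detection(y, k)
```

Because only `k` of the `n` beams carry energy, recovering the support lets the estimator concentrate its limited pilot resources on those entries, which is what keeps the pilot overhead low.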

    Numerical Simulation and Design Assessment of Limited Feedback Channel Estimation in Massive MIMO Communication System

    The Internet of Things (IoT) has attracted a great deal of interest across governments, business, and academia as an evolving technology that aims to make anything connect, communicate, and exchange data. Massive connectivity, stringent energy restrictions, and ultra-reliable transmission requirements are the most distinctive features of IoT. Massive multiple-input multiple-output (MIMO) is a natural enabling technology for IoT, as it yields enormous spectral/energy-efficiency gains and dramatically boosts IoT transmission reliability through coherent processing of the large-scale antenna array signals. However, this coherent processing relies on accurate estimation of the channel state information (CSI) between the BS and the users. In short, massive MIMO is a powerful supporting technology that fulfils the energy/spectral-efficiency and reliability needs of the IoT, but its benefit depends on the availability of CSI. This research proposes an adaptive sparse channel estimation scheme with limited feedback to obtain accurate and timely CSI for massive MIMO systems operating in frequency-division duplex (FDD) mode. In conventional limited-feedback schemes, the feedback overhead grows in linear proportion to the number of base-station antennas. This work offers a limited feedback algorithm that alleviates this burden by means of a double-directional (DD) MIMO channel representation using uniform dictionaries tied to the angle of arrival (AoA) and angle of departure (AoD). Although the number of transmit antennas at the BS is large, the algorithm achieves acceptable channel estimation accuracy using a limited number of feedback bits, making it suitable for 5G massive MIMO. Simulation results indicate that the proposed algorithm approaches the performance limit
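The limited-feedback setting above trades CSI accuracy against the number of feedback bits. A minimal illustrative sketch of that trade-off, assuming a random vector codebook shared by the BS and the user, with only the codeword index fed back (the paper's dictionary-based AoA/AoD scheme is more structured than this):

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_channel(h, bits):
    """Limited-feedback sketch: pick the best-aligned entry of a shared
    random codebook of size 2**bits and feed back only its index."""
    n = h.shape[0]
    codebook = rng.normal(size=(2**bits, n)) + 1j * rng.normal(size=(2**bits, n))
    codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)
    corr = np.abs(codebook @ h.conj()) / np.linalg.norm(h)
    idx = int(np.argmax(corr))
    return idx, corr[idx]  # feedback index and channel alignment in [0, 1]

h = rng.normal(size=8) + 1j * rng.normal(size=8)  # toy 8-antenna channel
idx4, a4 = quantize_channel(h, bits=4)
idx8, a8 = quantize_channel(h, bits=8)  # more bits: typically better alignment
```

With `bits` feedback bits the codebook has `2**bits` entries, which is exactly why naive schemes scale poorly as the antenna count grows: keeping the quantization loss fixed requires the bit budget to grow with the channel dimension, motivating the structured dictionaries in the abstract.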

    DeepTx: Deep Learning Beamforming with Channel Prediction

    Machine learning algorithms have recently been considered for many tasks in the field of wireless communications. Previously, we proposed the use of a deep fully convolutional neural network (CNN) for receiver processing and showed that it provides considerable performance gains. In this study, we focus on machine learning algorithms for the transmitter. In particular, we consider beamforming and propose a CNN which, for a given uplink channel estimate as input, outputs downlink channel information to be used for beamforming. The CNN is trained in a supervised manner considering both uplink and downlink transmissions with a loss function that is based on UE receiver performance. The main task of the neural network is to predict the channel evolution between uplink and downlink slots, but it can also learn to handle inefficiencies and errors in the whole chain, including the actual beamforming phase. The provided numerical experiments demonstrate the improved beamforming performance. (27 pages; submitted to the IEEE for possible publication)
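The prediction task described above (map a stale uplink channel estimate to the downlink channel it will have evolved into) can be illustrated without a CNN. A minimal sketch, assuming a synthetic first-order channel-aging model and a learned linear predictor standing in for the paper's network:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_channel_pairs(num, dim, rho=0.7):
    """Synthetic uplink/downlink pairs: the downlink channel is a correlated
    evolution of the uplink one; rho models how fast the channel ages."""
    ul = rng.normal(size=(num, dim))
    dl = rho * ul + np.sqrt(1 - rho**2) * rng.normal(size=(num, dim))
    return ul, dl

# A least-squares linear map stands in for the paper's CNN: learn W
# minimizing ||UL @ W - DL||^2, then beamform toward the prediction.
ul, dl = make_channel_pairs(num=5000, dim=8)
W, *_ = np.linalg.lstsq(ul, dl, rcond=None)
dl_hat = ul @ W

mse_pred = np.mean((dl_hat - dl) ** 2)   # learned predictor
mse_naive = np.mean((ul - dl) ** 2)      # reuse stale uplink estimate as-is
```

Even this linear stand-in beats naively reusing the uplink estimate; the CNN in the paper goes further by learning nonlinear evolution and absorbing chain imperfections end-to-end through the UE-performance loss.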