
    Fronthaul data compression for Uplink CoMP in cloud radio access network (C-RAN)

    The design of efficient wireless fronthaul connections for future heterogeneous networks incorporating emerging paradigms such as the cloud radio access network has become a challenging task that requires the most effective utilisation of fronthaul network resources. In this paper, we propose to use distributed compression to reduce the fronthaul traffic in uplink Coordinated Multi-Point for the cloud radio access network. Unlike the conventional approach, in which each coordinating point quantises and forwards its own observation to the processing centre, the observations are compressed before forwarding. At the processing centre, the decompression of the observations and the decoding of the user message are conducted in a successive manner. The essence of this approach is the optimisation of the distributed compression using an iterative algorithm to achieve the maximal user rate for a given fronthaul rate; in other words, for a target user rate the generated fronthaul traffic is minimised. Moreover, joint decompression and decoding is studied and an iterative optimisation algorithm is devised accordingly. Finally, the analysis is extended to the multi-user case, and our results reveal that, in both dense and ultra-dense urban deployment scenarios, distributed compression can efficiently reduce the required fronthaul rate, with a further reduction obtained through joint operation.
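
    The abstract above contrasts the proposal with the conventional approach in which each point simply quantises and forwards its observation. Purely as an illustration of that baseline, and not the paper's model or algorithm, the following Python sketch assumes a toy real-valued Gaussian uplink with one user, two coordinating points, unit receiver noise and a per-link fronthaul rate C; all names and numbers here are hypothetical.

    import math

    def quantise_and_forward_rate(h, P, C):
        """User rate (bits/symbol) when each coordinating point quantises its
        observation y_i = h_i*x + n_i independently and forwards it over a
        fronthaul link of C bits/symbol; the processing centre then combines
        the quantised observations (toy real-Gaussian model, unit noise)."""
        snr = 0.0
        for h_i in h:
            rx_var = h_i**2 * P + 1.0        # variance of the observation y_i
            Q_i = rx_var / (2**(2 * C) - 1)  # quantisation noise s.t. I(y_i; y_i + q_i) = C
            snr += h_i**2 * P / (1.0 + Q_i)  # contribution after combining at the centre
        return 0.5 * math.log2(1.0 + snr)

    # As the fronthaul rate grows, the user rate approaches that of ideal
    # (uncompressed) uplink CoMP reception.
    for C in (1, 2, 4, 8):
        print(C, round(quantise_and_forward_rate([1.0, 0.8], P=10.0, C=C), 3))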

    How to Solve the Fronthaul Traffic Congestion Problem in H-CRAN?

    The design of efficient wireless fronthaul connections for future heterogeneous networks incorporating emerging paradigms such as the heterogeneous cloud radio access network (H-CRAN) has become a challenging task that requires the most effective utilization of fronthaul network resources. In this paper, we propose and analyze possible solutions to alleviate fronthaul traffic congestion in the Coordinated Multi-Point (CoMP) scenario for 5G cellular traffic, which is expected to reach the zettabyte scale by 2017. In particular, we propose to use distributed compression to reduce the fronthaul traffic in H-CRAN. Unlike the conventional approach, in which each coordinating point quantizes and forwards its own observation to the processing centre, the observations are compressed before forwarding. At the processing centre, the decompression of the observations and the decoding of the user messages are conducted in a joint manner. Our results reveal that, in both dense and ultra-dense urban small-cell deployment scenarios, distributed compression combined with joint operation can reduce the required fronthaul rate by more than 50%.
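
    As a companion to the baseline sketch above, and again only as a hypothetical illustration rather than the paper's optimization, the following Python sketch contrasts independent quantization with distributed (Wyner-Ziv style) compression in the same toy two-point Gaussian model: the second observation is compressed treating the first recovered observation as decoder side information, so the same fronthaul rate buys a finer quantizer and hence a higher user rate (equivalently, less fronthaul for a target rate).

    import math

    def user_rate(h, P, Q):
        """Rate after the processing centre combines the quantized observations."""
        snr = sum(h_i**2 * P / (1.0 + Q_i) for h_i, Q_i in zip(h, Q))
        return 0.5 * math.log2(1.0 + snr)

    def quantizer_noises(h, P, C, distributed):
        """Quantization-noise variances achievable with C bits/symbol per link.
        distributed=True compresses point 2 using point 1's recovered observation
        as side information at the decoder (Wyner-Ziv style)."""
        h1, h2 = h
        Q1 = (h1**2 * P + 1.0) / (2**(2 * C) - 1)
        var_y2 = h2**2 * P + 1.0
        if distributed:
            # residual variance of y_2 once the decoder already knows y_1 + q_1
            var_y1_hat = h1**2 * P + 1.0 + Q1
            var_y2 -= (h1 * h2 * P) ** 2 / var_y1_hat
        Q2 = var_y2 / (2**(2 * C) - 1)
        return (Q1, Q2)

    # Same per-link fronthaul rate; distributed compression yields a higher rate.
    h, P, C = (1.0, 0.8), 10.0, 2.0
    for dist in (False, True):
        Q = quantizer_noises(h, P, C, dist)
        print("distributed" if dist else "independent", round(user_rate(h, P, Q), 3))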

    Fundamental Limits of Cloud and Cache-Aided Interference Management with Multi-Antenna Edge Nodes

    In fog-aided cellular systems, content delivery latency can be minimized by jointly optimizing edge caching and transmission strategies. To account for the cache capacity limitations at the Edge Nodes (ENs), transmission generally involves both fronthaul transfer from a cloud processor with access to the content library to the ENs and wireless delivery from the ENs to the users. In this paper, the resulting problem is studied from an information-theoretic viewpoint under the following practically relevant assumptions: 1) the ENs have multiple antennas; 2) only uncoded fractional caching is allowed; 3) the fronthaul links are used to send fractions of contents; and 4) the ENs are constrained to use one-shot linear precoding on the wireless channel. Assuming offline proactive caching and focusing on a high signal-to-noise ratio (SNR) latency metric, the optimal information-theoretic performance is investigated under both serial and pipelined fronthaul-edge transmission modes. The analysis characterizes the minimum high-SNR latency in terms of the Normalized Delivery Time (NDT) for worst-case user demands. The characterization is exact for a subset of system parameters and is otherwise optimal within a multiplicative factor of 3/2 for the serial case and of 2 for the pipelined case. The results bring insights into the optimal interplay between edge and cloud processing in fog-aided wireless networks as a function of system resources, including the number of antennas at the ENs, the ENs' cache capacity, and the fronthaul capacity.
    Comment: 34 pages, 15 figures, submitted
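
    For context only: the Normalized Delivery Time used as the latency metric in this line of work measures the worst-case delivery time T, for fractional cache size \mu and fronthaul capacity C_F = r log(SNR), relative to the time L / log(SNR) needed to deliver a file of L bits over an ideal interference-free link at high SNR. In the notation commonly used for this metric, which may differ in details from the paper's exact definition, a rough statement is

    \[
      \delta(\mu, r) \;=\; \lim_{\mathrm{SNR}\to\infty}\ \limsup_{L\to\infty}\
      \frac{\mathbb{E}\big[T(\mu, r, \mathrm{SNR}, L)\big]}{L/\log \mathrm{SNR}},
    \]

    so a smaller NDT means delivery closer to the ideal, interference-free reference.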