2,217 research outputs found

    Echo State Networks for Proactive Caching in Cloud-Based Radio Access Networks with Mobile Users

    In this paper, the problem of proactive caching is studied for cloud radio access networks (CRANs). In the studied model, the baseband units (BBUs) can predict the content request distribution and mobility pattern of each user and determine which content to cache at the remote radio heads (RRHs) and the BBUs. This problem is formulated as an optimization problem that jointly incorporates backhaul and fronthaul loads and content caching. To solve this problem, an algorithm that combines the machine learning framework of echo state networks with sublinear algorithms is proposed. Using echo state networks (ESNs), the BBUs can predict each user's content request distribution and mobility pattern while having only limited information on the network's and users' states. In order to predict each user's periodic mobility pattern with minimal complexity, the memory capacity of the corresponding ESN is derived for a periodic input. This memory capacity is shown to record the maximum amount of user information for the proposed ESN model. Then, a sublinear algorithm is proposed to determine which content to cache while using only a limited number of content request distribution samples. Simulation results using real data from Youku and the Beijing University of Posts and Telecommunications show that the proposed approach yields significant gains in terms of sum effective capacity, reaching up to 27.8% and 30.7%, respectively, compared to random caching with clustering and random caching without clustering.
    Comment: Accepted in the IEEE Transactions on Wireless Communications
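    As a rough illustration of the ESN prediction step described above, the sketch below trains a tiny echo state network whose fixed random reservoir feeds a ridge-regression readout mapping a user's current content request distribution to a prediction of the next one. The reservoir size, spectral radius, regularizer, and the toy Dirichlet-sampled request data are illustrative assumptions, not the paper's model or parameters.

    import numpy as np

    rng = np.random.default_rng(0)

    n_in, n_res = 10, 200                        # content classes, reservoir units (assumed sizes)
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius below 1 (echo state property)

    def run_reservoir(inputs):
        # Drive the fixed reservoir with a sequence of request-distribution vectors.
        x = np.zeros(n_res)
        states = []
        for u in inputs:
            x = np.tanh(W_in @ u + W @ x)
            states.append(x.copy())
        return np.array(states)

    # Toy data: a sequence of request distributions; targets are the next-step distributions.
    U = rng.dirichlet(np.ones(n_in), size=500)
    X, Y = run_reservoir(U[:-1]), U[1:]

    # Ridge-regression readout: the only trained part of an echo state network.
    W_out = Y.T @ X @ np.linalg.inv(X.T @ X + 1e-6 * np.eye(n_res))

    pred = W_out @ run_reservoir(U)[-1]          # predicted next request distribution
    print(pred.round(3))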

    The edge cloud: A holistic view of communication, computation and caching

    The evolution of communication networks shows a clear shift of focus from merely improving the communication aspects to enabling important new services, from Industry 4.0 to automated driving, virtual/augmented reality, the Internet of Things (IoT), and so on. This trend is evident in the roadmap planned for the deployment of fifth generation (5G) communication networks. This ambitious goal requires a paradigm shift towards a vision that treats communication, computation and caching (3C) resources as three components of a single holistic system. A further step is to bring these 3C resources closer to the mobile user, at the edge of the network, to enable very low latency and highly reliable services. The scope of this chapter is to show that signal processing techniques can play a key role in this new vision. In particular, we motivate the joint optimization of 3C resources. We then show how graph-based representations can play a key role in building effective learning methods and devising innovative resource allocation techniques.
    Comment: to appear in the book "Cooperative and Graph Signal Processing: Principles and Applications", P. Djuric and C. Richard Eds., Academic Press, Elsevier, 201
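    As a minimal hint of what a graph-based representation looks like in this setting, the sketch below models a handful of edge nodes as a graph and scores how smoothly a per-node resource-demand signal varies over that graph via the Laplacian quadratic form. The topology and demand values are purely illustrative assumptions, not taken from the chapter.

    import numpy as np

    # Adjacency matrix of four edge nodes (toy topology).
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian

    x = np.array([0.9, 0.8, 0.85, 0.2])   # per-node demand (a toy graph signal)
    smoothness = x @ L @ x                # sum of squared differences across edges; small => smooth signal
    print(smoothness)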

    Fundamental Limits of Cloud and Cache-Aided Interference Management with Multi-Antenna Edge Nodes

    In fog-aided cellular systems, content delivery latency can be minimized by jointly optimizing edge caching and transmission strategies. In order to account for the cache capacity limitations at the Edge Nodes (ENs), transmission generally involves both fronthaul transfer from a cloud processor, which has access to the content library, to the ENs, as well as wireless delivery from the ENs to the users. In this paper, the resulting problem is studied from an information-theoretic viewpoint by making the following practically relevant assumptions: 1) the ENs have multiple antennas; 2) only uncoded fractional caching is allowed; 3) the fronthaul links are used to send fractions of contents; and 4) the ENs are constrained to use one-shot linear precoding on the wireless channel. Assuming offline proactive caching and focusing on a high signal-to-noise ratio (SNR) latency metric, the optimal information-theoretic performance is investigated under both serial and pipelined fronthaul-edge transmission modes. The analysis characterizes the minimum high-SNR latency in terms of the Normalized Delivery Time (NDT) for worst-case user demands. The characterization is exact for a subset of system parameters, and is generally optimal within a multiplicative factor of 3/2 for the serial case and of 2 for the pipelined case. The results bring insights into the optimal interplay between edge and cloud processing in fog-aided wireless networks as a function of the system resources, including the number of antennas at the ENs, the ENs' cache capacity and the fronthaul capacity.
    Comment: 34 pages, 15 figures, submitted
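    For context, the Normalized Delivery Time used as the high-SNR latency metric in this line of work is commonly defined as the worst-case delivery time normalized by the time an interference-free system would need to deliver one file of L bits at the full point-to-point rate log(SNR); in standard (not necessarily this paper's exact) notation:

    \delta^{*} \;=\; \lim_{\mathrm{SNR}\to\infty}\ \lim_{L\to\infty}\ \max_{\text{demands}}\ \frac{\mathbb{E}[T]}{L/\log \mathrm{SNR}},

    so that \delta^{*} = 1 corresponds to ideal interference-free delivery and larger values indicate a multiplicative latency penalty.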