Online Reinforcement Learning of X-Haul Content Delivery Mode in Fog Radio Access Networks
We consider a Fog Radio Access Network (F-RAN) with a Base Band Unit (BBU) in
the cloud and multiple cache-enabled enhanced Remote Radio Heads (eRRHs). The
system aims at delivering contents on demand with minimal average latency from
a time-varying library of popular contents. Information about uncached
requested files can be transferred from the cloud to the eRRHs by following
either backhaul or fronthaul modes. The backhaul mode transfers fractions of
the requested files, while the fronthaul mode transmits quantized baseband
samples as in Cloud-RAN (C-RAN). The backhaul mode allows the caches of the
eRRHs to be updated, which may lower future delivery latencies. In contrast,
the fronthaul mode enables cooperative C-RAN transmissions that may reduce the
current delivery latency. Taking into account the trade-off between current and
future delivery performance, this paper proposes an adaptive selection method
between the two delivery modes to minimize the long-term delivery latency.
Assuming an unknown and time-varying popularity model, the method is based on
model-free Reinforcement Learning (RL). Numerical results confirm the
effectiveness of the proposed RL scheme.
Comment: 5 pages, 2 figures
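The mode-selection problem described above lends itself to a tabular RL sketch. The toy environment below (the latency model, the cache-fill state, and all constants) is an illustrative assumption, not the paper's actual system model; it only shows the shape of a model-free Q-learning loop that trades current latency (fronthaul) against future latency (backhaul cache updates).

```python
import random

# Two x-haul delivery modes, chosen per time slot.
ACTIONS = ("backhaul", "fronthaul")

def simulate_latency(action, fill):
    """Toy latency model (illustrative, not the paper's): cached content
    reduces latency for either mode; backhaul costs more in the current
    slot but raises the cache fill level, helping future slots."""
    if action == "fronthaul":
        return 1.0 * (1.5 - fill), fill
    return 1.4 * (1.5 - fill), min(1.0, fill + 0.1)

def q_learning(episodes=200, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    # Discretize cache fill into 11 states: 0.0, 0.1, ..., 1.0.
    q = {(s, a): 0.0 for s in range(11) for a in ACTIONS}
    for _ in range(episodes):
        fill = 0.0
        for _ in range(20):  # time slots per episode
            s = round(fill * 10)
            # Epsilon-greedy action selection.
            a = (rng.choice(ACTIONS) if rng.random() < eps
                 else max(ACTIONS, key=lambda x: q[(s, x)]))
            latency, fill = simulate_latency(a, fill)
            s2 = round(fill * 10)
            best_next = max(q[(s2, x)] for x in ACTIONS)
            # Reward is negative latency: minimize long-term latency.
            q[(s, a)] += alpha * (-latency + gamma * best_next - q[(s, a)])
    return q

q = q_learning()
```

Because the reward is the negative per-slot latency, the learned Q-values directly rank the two modes by their estimated long-term latency in each cache state.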
Fundamental Limits of Cloud and Cache-Aided Interference Management with Multi-Antenna Edge Nodes
In fog-aided cellular systems, content delivery latency can be minimized by
jointly optimizing edge caching and transmission strategies. In order to
account for the cache capacity limitations at the Edge Nodes (ENs),
transmission generally involves both fronthaul transfer from a cloud processor
with access to the content library to the ENs, as well as wireless delivery
from the ENs to the users. In this paper, the resulting problem is studied from
an information-theoretic viewpoint by making the following practically relevant
assumptions: 1) the ENs have multiple antennas; 2) only uncoded fractional
caching is allowed; 3) the fronthaul links are used to send fractions of
contents; and 4) the ENs are constrained to use one-shot linear precoding on
the wireless channel. Assuming offline proactive caching and focusing on a high
signal-to-noise ratio (SNR) latency metric, the optimal information-theoretic
performance is investigated under both serial and pipelined fronthaul-edge
transmission modes. The analysis characterizes the minimum high-SNR latency in
terms of Normalized Delivery Time (NDT) for worst-case users' demands. The
characterization is exact for a subset of system parameters, and is generally
optimal within a multiplicative factor of 3/2 for the serial case and of 2 for
the pipelined case. The results bring insights into the optimal interplay
between edge and cloud processing in fog-aided wireless networks as a function
of system resources, including the number of antennas at the ENs, the ENs'
cache capacity, and the fronthaul capacity.
Comment: 34 pages, 15 figures, submitted
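The serial vs. pipelined distinction above concerns how fronthaul and edge-segment latencies compose into an overall NDT. A minimal sketch, where the per-segment NDT inputs are placeholders and treating the pipelined NDT as the maximum of the two segments is a simplification of the paper's actual analysis:

```python
def serial_ndt(fronthaul_ndt, edge_ndt):
    # Serial mode: the fronthaul transfer completes before edge delivery
    # starts, so the two segment latencies add up.
    return fronthaul_ndt + edge_ndt

def pipelined_ndt(fronthaul_ndt, edge_ndt):
    # Pipelined mode: the segments operate concurrently, so the slower
    # segment dominates (a simplified, lower-bound-style composition).
    return max(fronthaul_ndt, edge_ndt)
```

The multiplicative optimality factors quoted above (3/2 serial, 2 pipelined) bound how far achievable schemes can be from the minimum NDT under each composition rule.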
Fundamental limits of memory-latency tradeoff in fog radio access networks under arbitrary demands
We consider a fog radio access network (F-RAN) with multiple transmitters and receivers, where each transmitter is connected to the cloud via a fronthaul link. Each network node has a finite cache, which it fills with portions of the library files during off-peak hours. In the delivery phase, each receiver requests a library file according to an arbitrary popularity distribution, and the cloud and the transmitters are responsible for satisfying the requests. This paper aims to design content placement and coded delivery schemes that minimize both the expected normalized delivery time (NDT) and the peak NDT, which measure the transmission latency. We propose achievable transmission policies and derive an information-theoretic bound on the expected NDT under a uniform popularity distribution. The analytical results show that the proposed scheme is within a gap of 2.58 from the derived bound for both the expected NDT under a uniform popularity distribution and the peak NDT. Next, we investigate the expected NDT under an arbitrary popularity distribution for an F-RAN with transmitter-side caches only. Achievable and information-theoretic bounds on the expected NDT are derived, and we analytically prove that our proposed scheme is optimal within a gap of two, independent of the popularity distribution.
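The expected-NDT objective in this abstract is a popularity-weighted average of per-demand delivery times. A minimal sketch, assuming a Zipf popularity law and toy per-file NDT values (neither taken from the paper):

```python
# Sketch: expected NDT as a popularity-weighted average of per-demand
# NDTs. The Zipf law and the per-file NDT values are illustrative
# assumptions, not the paper's model.
def zipf_popularity(n_files, s=1.0):
    """Zipf distribution: Pr[file i] proportional to 1 / i^s."""
    weights = [1.0 / (i ** s) for i in range(1, n_files + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def expected_ndt(popularity, ndt_per_file):
    # E[NDT] = sum over files f of Pr[f requested] * NDT(f).
    return sum(p * d for p, d in zip(popularity, ndt_per_file))

pop = zipf_popularity(4, s=1.0)
# Toy values: the two most popular files are cached, hence lower NDT.
ndts = [1.0, 1.0, 2.0, 2.0]
avg = expected_ndt(pop, ndts)
```

Skewing the popularity (larger `s`) concentrates requests on the cached files and drives the expected NDT toward the cached-file latency, which is why placement should track the popularity profile.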
Online Edge Caching and Wireless Delivery in Fog-Aided Networks with Dynamic Content Popularity
Fog Radio Access Network (F-RAN) architectures can leverage both cloud
processing and edge caching for content delivery to the users. To this end,
F-RAN utilizes caches at the edge nodes (ENs) and fronthaul links connecting a
cloud processor to ENs. Assuming time-invariant content popularity, existing
information-theoretic analyses of content delivery in F-RANs rely on offline
caching with separate content placement and delivery phases. In contrast, this
work focuses on the scenario in which the set of popular content is
time-varying, hence necessitating the online replenishment of the ENs' caches
along with the delivery of the requested files. The analysis is centered on the
characterization of the long-term Normalized Delivery Time (NDT), which
captures the temporal dependence of the coding latencies accrued across
multiple time slots in the high signal-to-noise ratio regime. Online edge
caching and delivery schemes are investigated for both serial and pipelined
transmission modes across fronthaul and edge segments. Analytical results
demonstrate that, in the presence of a time-varying content popularity, the
rate of the fronthaul links sets a fundamental limit on the long-term NDT of the
F-RAN system. Analytical results are further verified by numerical simulations,
yielding important design insights.
Comment: 33 pages, 8 figures. Accepted for publication in the IEEE Journal on
Selected Areas in Communications, Special Issue on Caching for Communication
Systems and Networks. arXiv admin note: text overlap with arXiv:1701.0618
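The claim that the fronthaul rate limits long-term performance under time-varying popularity can be illustrated with a toy online-caching simulation. The churn model, the eviction policy, and the reading of `fronthaul_rate` as "at most one file fetched per slot" are all illustrative assumptions, not the paper's scheme:

```python
import random

def simulate(slots=200, library=50, cache_size=10, churn=5,
             fronthaul_rate=1, seed=0):
    """Toy online edge cache under popularity churn (illustrative only)."""
    rng = random.Random(seed)
    popular = list(range(library))       # ids of currently popular files
    cache = set(popular[:cache_size])    # initial (offline) placement
    next_id = library
    hits = 0
    for _ in range(slots):
        # Popularity drift: `churn` popular files are replaced by new ones.
        for _ in range(churn):
            popular[rng.randrange(library)] = next_id
            next_id += 1
        request = rng.choice(popular)
        if request in cache:
            hits += 1
        elif fronthaul_rate > 0:
            # Online replenishment on a miss: prefer evicting a stale
            # cached file (one that fell out of the popular set), then
            # fetch the requested file over the fronthaul.
            stale = [f for f in cache if f not in popular]
            victim = stale[0] if stale else next(iter(cache))
            cache.remove(victim)
            cache.add(request)
        # With fronthaul_rate == 0 the cache can never be refreshed,
        # so the hit rate decays as the popular set drifts away.
    return hits / slots

rate = simulate()
```

Even in this crude model, cutting the replenishment budget to zero freezes the cache at its initial placement, mirroring the abstract's point that the fronthaul rate bounds achievable long-term performance.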