On the Benefits of Edge Caching for MIMO Interference Alignment
In this contribution, we jointly investigate the benefits of caching and
interference alignment (IA) in the multiple-input multiple-output (MIMO)
interference channel under limited backhaul capacity. In particular, the total
average transmission rate is derived as a function of various system parameters
such as backhaul link capacity, cache size, and number of active
transmitter-receiver pairs, as well as the number of quantization bits for
channel state information (CSI). Given that base stations are equipped with
both caching and IA capabilities and know the content popularity profile,
we then characterize an operational regime in which caching is beneficial.
Subsequently, we find the optimal number of transmitter-receiver pairs that
maximizes the total average transmission rate. When the popularity profile of
requested contents falls into the operational regime, it turns out that caching
substantially improves the throughput as it mitigates the backhaul usage and
allows IA methods to make the most of the limited backhaul.
Comment: 20 pages, 5 figures. A shorter version is to be presented at the 16th
IEEE International Workshop on Signal Processing Advances in Wireless
Communications (SPAWC'2015), Stockholm, Sweden.
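The trade-off described above can be sketched with a toy model. This is a hypothetical illustration, not the paper's derivation: the function `total_rate`, the air-interface rate, and all numbers below are assumptions. The idea is only that each pair's rate is capped by the smaller of its over-the-air rate and its share of the backhaul, that cache hits bypass the backhaul, and that consequently some number of pairs K maximises the total.

```python
# Toy sketch (hypothetical model, not the paper's derivation): a total
# backhaul capacity C_B is shared by K transmitter-receiver pairs, and a
# fraction p_hit of requests is served from the local cache, bypassing
# the backhaul entirely.
def total_rate(K, C_B, p_hit, air_rate_per_pair=2.0):
    # Cache hits effectively stretch the shared backhaul capacity by a
    # factor of 1 / (1 - p_hit).
    backhaul_per_pair = C_B / (K * (1.0 - p_hit))
    # Each pair is limited by the smaller of its air-interface rate and
    # its effective backhaul rate.
    return K * min(air_rate_per_pair, backhaul_per_pair)

# Hypothetical numbers: C_B = 10, half of all requests served from cache.
best_K = max(range(1, 21), key=lambda K: total_rate(K, C_B=10.0, p_hit=0.5))
print("rate-maximising number of pairs:", best_K)
```

Under these assumed numbers the total rate grows linearly in K until the shared backhaul becomes the bottleneck, which is the qualitative shape of the optimal-K result the abstract refers to.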
How Much Can D2D Communication Reduce Content Delivery Latency in Fog Networks with Edge Caching?
A Fog-Radio Access Network (F-RAN) is studied in which cache-enabled Edge
Nodes (ENs) with dedicated fronthaul connections to the cloud aim at delivering
contents to mobile users. Using an information-theoretic approach, this work
tackles the problem of quantifying the potential latency reduction that can be
obtained by enabling Device-to-Device (D2D) communication over out-of-band
broadcast links. Following prior work, the Normalized Delivery Time (NDT) --- a
metric that captures the high signal-to-noise ratio worst-case latency --- is
adopted as the performance criterion of interest. Joint edge caching, downlink
transmission, and D2D communication policies based on compress-and-forward are
proposed that are shown to be information-theoretically optimal to within a
constant multiplicative factor of two for all values of the problem parameters,
and to achieve the minimum NDT for a number of special cases. The analysis
provides insights on the role of D2D cooperation in improving the delivery
latency.
Comment: Submitted to the IEEE Transactions on Communications.
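The NDT metric above can be illustrated with a generic fog-RAN latency accounting. This sketch is not the paper's exact expressions; `ndt_serial`, the fractional cache size `mu`, and the normalized fronthaul rate `r` are assumed quantities, and the serial edge-plus-fronthaul model is a deliberately simple stand-in.

```python
# Illustrative sketch (generic F-RAN accounting, not the paper's exact NDT
# expressions): a fraction mu of each requested file is cached at the edge
# nodes; the remainder must cross the fronthaul at normalized rate r before
# it can be transmitted over the edge, and delivery time accrues serially.
def ndt_serial(mu, r, delta_edge=1.0):
    # Cached part: edge transmission only.  Uncached part: fronthaul
    # transfer (1/r) followed by edge retransmission (hypothetical model).
    return mu * delta_edge + (1.0 - mu) * (1.0 / r + delta_edge)

for mu in (0.0, 0.5, 1.0):
    print(f"mu={mu}: NDT ~ {ndt_serial(mu, r=2.0):.2f}")
```

Even in this toy version, the NDT falls as the cache fraction grows, which matches the intuition behind the delivery-latency reductions the abstract quantifies.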
Modeling and Performance of Uplink Cache-Enabled Massive MIMO Heterogeneous Networks
A significant burden on wireless networks is brought by the uploading of user-generated content to the Internet by means of applications such as social media. To cope with this mobile data tsunami, we develop a novel multiple-input multiple-output (MIMO) network architecture with randomly located base stations (BSs) equipped with a large number of antennas, employing cache-enabled uplink transmission. In particular, we formulate a scenario where the users upload their content to their strongest BSs, which are distributed according to a Poisson point process. In addition, the BSs, exploiting the benefits of massive MIMO, upload their contents to the core network by means of a finite-rate backhaul. After introducing caching policies that use the modified von Mises distribution as the popularity distribution function, we derive the outage probability and the average delivery rate by taking advantage of tools from deterministic equivalent and stochastic geometry analyses. Numerical results investigate the realistic performance gains of the proposed heterogeneous cache-enabled uplink in terms of the key operating parameters. For example, insights regarding the BSs' storage size are provided. Moreover, the impacts of key parameters such as the file popularity distribution and the target bitrate are investigated. Specifically, the outage probability decreases as the storage size increases, while the average delivery rate increases. In addition, the concentration parameter, which shapes the popularity of the files stored at the intermediate nodes, directly affects the proposed metrics. Furthermore, a higher target rate results in higher outage because fewer users satisfy this constraint. Also, we demonstrate that a denser network decreases the outage and increases the delivery rate. Hence, the introduction of caching in the uplink of the system design ameliorates the network performance.
Peer reviewed
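The role of the concentration parameter can be sketched numerically. This is a hedged, discretised illustration rather than the paper's continuous model: the mapping of file indices to angles, the 200-file catalogue, and the most-popular-first caching rule are all assumptions made for the example.

```python
import numpy as np

# Hedged sketch (discretised, not the paper's model): a von Mises-shaped
# popularity profile over N files.  The concentration parameter kappa
# controls how strongly requests focus on a few popular files.
def von_mises_popularity(n_files, kappa, mu=0.0):
    # Map file indices to angles on [-pi, pi) and normalise discretely,
    # so the Bessel-function normalising constant cancels out.
    theta = np.linspace(-np.pi, np.pi, n_files, endpoint=False)
    weights = np.exp(kappa * np.cos(theta - mu))
    return weights / weights.sum()

def hit_prob(popularity, cache_size):
    # Cache the most popular files; a request hits iff its file is cached.
    return np.sort(popularity)[::-1][:cache_size].sum()

for kappa in (0.0, 2.0, 8.0):
    pop = von_mises_popularity(200, kappa)
    print(f"kappa={kappa}: hit prob with 20-file cache = {hit_prob(pop, 20):.3f}")
```

With kappa = 0 the profile is uniform and the hit probability equals the cache-to-catalogue ratio; raising kappa concentrates the popularity mass and raises the hit probability, mirroring the abstract's observation that the concentration parameter directly affects the outage and delivery-rate metrics.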