Living on the Edge: The Role of Proactive Caching in 5G Wireless Networks
This article explores one of the key enablers of beyond G wireless
networks leveraging small cell network deployments, namely proactive caching.
Endowing base stations with predictive capabilities and harnessing recent
developments in storage, context-awareness and social networks makes it
possible to substantially reduce peak traffic demands by proactively serving
predictable user demands via caching at base stations and users' devices. In
order to show the effectiveness
of proactive caching, we examine two case studies which exploit the spatial and
social structure of the network, where proactive caching plays a crucial role.
Firstly, in order to alleviate backhaul congestion, we propose a mechanism
whereby files are proactively cached during off-peak demands based on file
popularity and correlations among user and file patterns. Secondly,
leveraging social networks and device-to-device (D2D) communications, we
propose a procedure that exploits the social structure of the network by
predicting the set of influential users to (proactively) cache strategic
contents and disseminate them to their social ties via D2D communications.
Exploiting this proactive caching paradigm, numerical results show that
important gains can be obtained for each case study, with backhaul savings and
a higher ratio of satisfied users of up to and , respectively.
Higher gains can be further obtained by increasing the storage capability at
the network edge.
Comment: accepted for publication in IEEE Communications Magazine
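As a rough illustration of the off-peak cache-filling idea above, a base station could greedily cache the most popular files that fit its storage budget. This is a minimal sketch, not the article's mechanism; the file names, sizes, popularity values and storage budget are all invented:

```python
# Hedged sketch of popularity-based proactive caching: during off-peak
# hours, greedily cache the most popular files that fit the storage budget.
# All names and numbers below are invented for illustration.

def proactive_cache(popularity, sizes, storage):
    """Cache files in decreasing popularity until storage runs out."""
    cached, used = [], 0
    for f in sorted(popularity, key=popularity.get, reverse=True):
        if used + sizes[f] <= storage:
            cached.append(f)
            used += sizes[f]
    return cached

popularity = {"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}  # request shares
sizes = {"a": 2, "b": 2, "c": 1, "d": 1}                 # e.g. in GB
cache = proactive_cache(popularity, sizes, storage=4)
# Share of requests served locally, i.e. traffic kept off the backhaul:
hit_ratio = sum(popularity[f] for f in cache)
```

Under this toy profile the two most popular files fill the cache, so most requests would be served at the edge; a real deployment must additionally estimate popularity and user-file correlations, as the article discusses.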
A Transfer Learning Approach for Cache-Enabled Wireless Networks
Locally caching contents at the network edge constitutes one of the most
disruptive approaches in G wireless networks. Reaping the benefits of edge
caching hinges on solving a myriad of challenges such as how, what and when to
strategically cache contents subject to storage constraints, traffic load,
unknown spatio-temporal traffic demands and data sparsity. Motivated by this,
we propose a novel transfer learning-based caching procedure carried out at
each small cell base station. This is done by exploiting the rich contextual
information (i.e., users' content viewing history, social ties, etc.) extracted
from device-to-device (D2D) interactions, referred to as source domain. This
prior information is incorporated in the so-called target domain where the goal
is to optimally cache strategic contents at the small cells as a function of
storage, estimated content popularity, traffic load and backhaul capacity. It
is shown that the proposed approach overcomes the notorious data sparsity and
cold-start problems, yielding significant gains in terms of users'
quality-of-experience (QoE) and backhaul offloading, with gains reaching up to
in a setting consisting of four small cell base stations.
Comment: some small fixes in notation
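The transfer of source-domain knowledge can be pictured, very loosely, as blending a popularity prior learned from D2D/social interactions with the sparse request counts observed locally at the small cell. This convex-combination sketch is illustrative only, not the paper's estimator; the weight, content names and counts are invented:

```python
# Illustrative transfer-learning-flavoured blend (not the paper's method):
# a data-rich "source domain" prior is mixed with sparse "target domain"
# observations to mitigate data sparsity and cold-start effects.

def blended_popularity(source_prior, target_counts, weight=0.7):
    """Convex combination of a prior and an empirical popularity estimate."""
    total = sum(target_counts.values()) or 1
    return {
        f: weight * source_prior.get(f, 0.0)
           + (1 - weight) * target_counts.get(f, 0) / total
        for f in source_prior
    }

source_prior = {"a": 0.6, "b": 0.3, "c": 0.1}  # rich contextual estimate
target_counts = {"b": 3, "c": 1}               # only 4 local requests so far
est = blended_popularity(source_prior, target_counts)
```

With few local observations the prior dominates, while the empirical term gradually corrects it as requests accumulate, which is the intuition behind using D2D context to bootstrap the cache decision.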
On the Delay of Geographical Caching Methods in Two-Tiered Heterogeneous Networks
We consider a hierarchical network that consists of mobile users, a
two-tiered cellular network (namely small cells and macro cells) and central
routers, each of which follows a Poisson point process (PPP). In this scenario,
small cells with limited-capacity backhaul are able to cache content under a
given set of randomized caching policies and storage constraints. Moreover, we
consider three different content popularity models, namely fixed content
popularity, distance-dependent and load-dependent, in order to model the
spatio-temporal behavior of users' content request patterns. We derive
expressions for the average delay of users assuming perfect knowledge of
content popularity distributions and randomized caching policies. Although the
trend of the average delay for all three content popularity models is
essentially identical, our results show that the overall performance of
cache-enabled heterogeneous networks can be substantially improved, especially
under the load-dependent content popularity model.
Comment: to be presented at IEEE SPAWC'2016, Edinburgh, UK
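A back-of-the-envelope version of the delay metric can be written down directly: with known content popularity and a randomized caching policy, the mean delay is a popularity-weighted mix of cache-hit and backhaul delays. This sketch uses a Zipf popularity profile and invented delay values, and does not reproduce the paper's PPP-based stochastic-geometry derivation:

```python
# Toy average-delay model (not the paper's analysis): each file f is
# requested with probability p_f and cached with probability q_f;
# E[delay] = sum_f p_f * (q_f * d_cache + (1 - q_f) * d_backhaul).
# Delay values and parameters are invented for illustration.

def zipf_popularity(n, alpha):
    w = [1 / (r ** alpha) for r in range(1, n + 1)]
    s = sum(w)
    return [x / s for x in w]

def mean_delay(pop, cache_prob, d_cache, d_backhaul):
    return sum(p * (q * d_cache + (1 - q) * d_backhaul)
               for p, q in zip(pop, cache_prob))

pop = zipf_popularity(n=100, alpha=1.0)
# Deterministically cache the 10 most popular files, leave the rest uncached.
cache_prob = [1.0] * 10 + [0.0] * 90
d = mean_delay(pop, cache_prob, d_cache=5.0, d_backhaul=50.0)
```

Even this crude model shows the qualitative trend in the abstract: concentrating storage on popular content pulls the average delay well below the backhaul-only baseline.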
On the Benefits of Edge Caching for MIMO Interference Alignment
In this contribution, we jointly investigate the benefits of caching and
interference alignment (IA) in multiple-input multiple-output (MIMO)
interference channel under limited backhaul capacity. In particular, total
average transmission rate is derived as a function of various system parameters
such as backhaul link capacity, cache size, number of active
transmitter-receiver pairs as well as the quantization bits for channel state
information (CSI). Given the fact that base stations are equipped both with
caching and IA capabilities and have knowledge of content popularity profile,
we then characterize an operational regime where the caching is beneficial.
Subsequently, we find the optimal number of transmitter-receiver pairs that
maximizes the total average transmission rate. When the popularity profile of
requested contents falls into the operational regime, it turns out that caching
substantially improves the throughput as it mitigates the backhaul usage and
allows IA methods to benefit from such limited backhaul.
Comment: 20 pages, 5 figures. A shorter version is to be presented at 16th
IEEE International Workshop on Signal Processing Advances in Wireless
Communications (SPAWC'2015), Stockholm, Sweden
Flexible Cache-Aided Networks with Backhauling
Caching at the edge is a promising technique to cope with the increasing data
demand in wireless networks. This paper analyzes the performance of cellular
networks consisting of a tier of macro-cell wireless backhaul nodes overlaid
with a tier of cache-aided small cells. We consider both static and dynamic
association policies for content delivery to the user terminals and analyze
their performance. In particular, we derive closed-form expressions for the
area spectral efficiency and the energy efficiency, which are used to optimize
relevant design parameters such as the density of cache-aided small cells and
the storage size. By means of this approach, we are able to draw useful design
insights for the deployment of highly performing cache-aided tiered networks.
Comment: 5 pages, 5 figures, to be presented at 18th IEEE International
Workshop on Signal Processing Advances in Wireless Communications
(SPAWC'2017), Sapporo, Japan, 2017
Big Data Caching for Networking: Moving from Cloud to Edge
In order to cope with the relentless data tsunami in wireless networks,
current approaches such as acquiring new spectrum, deploying more base stations
(BSs) and increasing nodes in mobile packet core networks are becoming
ineffective in terms of scalability, cost and flexibility. In this regard,
context-aware G networks with edge/cloud computing and exploitation of
\emph{big data} analytics can yield significant gains to mobile operators. In
this article, proactive content caching in G wireless networks is
investigated in which a big data-enabled architecture is proposed. In this
practical architecture, a vast amount of data is harnessed for content popularity
estimation and strategic contents are cached at the BSs to achieve higher
users' satisfaction and backhaul offloading. To validate the proposed solution,
we consider a real-world case study where several hours of mobile data traffic
is collected from a major telecom operator in Turkey and a big data-enabled
analysis is carried out leveraging tools from machine learning. Based on the
available information and storage capacity, numerical studies show that several
gains are achieved both in terms of users' satisfaction and backhaul
offloading. For example, in the case of BSs with of content ratings
and Gbyte of storage size ( of total library size), proactive
caching yields of users' satisfaction and offloads of the
backhaul.
Comment: accepted for publication in IEEE Communications Magazine, Special
Issue on Communications, Caching, and Computing for Content-Centric Mobile
Networks
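The estimation-then-caching pipeline described above can be mimicked in a few lines: count requests in a trace to estimate content popularity, cache the most requested items, and measure how much traffic stays off the backhaul. The request log here is invented, standing in for the operator traces used in the article:

```python
# Minimal sketch of the big-data caching pipeline: popularity estimation
# from a raw request log, top-k caching, and a backhaul-offload metric.
# The log and the cache size are hypothetical placeholders.

from collections import Counter

log = ["v1", "v2", "v1", "v3", "v1", "v2", "v4", "v1"]  # invented trace
popularity = Counter(log)                          # content popularity estimate
cache = {c for c, _ in popularity.most_common(2)}  # storage for 2 items
offloaded = sum(1 for r in log if r in cache) / len(log)
```

In the article the same idea runs at scale on a big data platform over real operator traces, where both the estimation step and the storage budget become the binding constraints.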
Big Data Meets Telcos: A Proactive Caching Perspective
Mobile cellular networks are becoming increasingly complex to manage while
classical deployment/optimization techniques and current solutions (i.e., cell
densification, acquiring more spectrum, etc.) are cost-ineffective and thus
seen as stopgaps. This calls for the development of novel approaches that
leverage recent advances in storage/memory, context-awareness, and edge/cloud
computing, and that fall within the framework of big data. However, big data
is itself yet another complex phenomenon to handle and comes with its
notorious four Vs: velocity, veracity, volume and variety. In this work, we
address these issues in
optimization of 5G wireless networks via the notion of proactive caching at the
base stations. In particular, we investigate the gains of proactive caching in
terms of backhaul offloading and request satisfaction, while tackling the
large amount of available data for content popularity estimation. In order to
estimate the content popularity, we first collect users' mobile traffic data
from several base stations of a Turkish telecom operator over a time
interval of several hours. Then, an analysis is carried out locally on a big
data platform and
the gains of proactive caching at the base stations are investigated via
numerical simulations. It turns out that several gains are possible depending
on the level of available information and storage size. For instance, with 10%
of content ratings and 15.4 Gbyte of storage size (87% of total catalog size),
proactive caching achieves 100% of request satisfaction and offloads 98% of the
backhaul when considering 16 base stations.
Comment: 8 pages, 5 figures
Reconfigurable cognitive transceiver for opportunistic networks
In this work, we provide the implementation and analysis of a cognitive
transceiver for opportunistic networks. We focus on a previously introduced
dynamic spectrum access (DSA) - cognitive radio (CR) solution for
primary-secondary coexistence in opportunistic orthogonal frequency division
multiplexing (OFDM) networks, called cognitive interference alignment (CIA).
The implementation is based on software-defined radio (SDR) and uses GNU
Radio and the universal software radio peripheral (USRP) as the
implementation toolkit. The proposed flexible transceiver architecture
allows efficient on-the-fly reconfiguration of the physical layer into OFDM,
CIA or a combination of both. Remarkably, its responsiveness is such that
the uplink and downlink channel reciprocity from the medium perspective,
inherent to time division duplex (TDD) communications, can be effectively
verified and exploited. We show that CIA provides approximately 10 dB of
interference isolation towards the OFDM receiver with respect to a fully
random precoder. This result is obtained under suboptimal conditions, which
indicates that further gains are possible with a better optimization of the
system. Our findings point towards the usefulness of a practical CIA
implementation, as it yields a non-negligible performance for the secondary
system, while providing interference shielding to the primary receiver.
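The core idea behind CIA can be illustrated numerically: the secondary transmitter precodes its signal onto the null space of the channel toward the primary receiver, so the primary sees essentially no interference, unlike with a random precoder. This toy numpy sketch is not the GNU Radio/USRP implementation, and the channel dimensions are invented:

```python
# Toy null-space precoding in the spirit of CIA (hypothetical dimensions):
# any column of V_cia satisfies H @ v == 0, so the secondary transmission
# leaks no interference onto the primary receiver's channel H.

import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((2, 4))       # 2x4 channel toward the primary receiver

# Rows of vh beyond rank(H) span null(H); use them as the precoder columns.
_, _, vh = np.linalg.svd(H)
V_cia = vh[2:].T                      # 4x2 null-space basis
V_rand = rng.standard_normal((4, 2))  # baseline: fully random precoder

x = rng.standard_normal((2, 1))       # secondary symbols
leak_cia = np.linalg.norm(H @ V_cia @ x)   # interference at the primary
leak_rand = np.linalg.norm(H @ V_rand @ x)
```

In the implemented system the isolation is finite (around 10 dB rather than ideal) because CSI errors and hardware impairments keep the precoder from lying exactly in the null space.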