A Literature Survey of Cooperative Caching in Content Distribution Networks
Content distribution networks (CDNs), which serve to deliver web objects
(e.g., documents, applications, music, video, etc.), have seen tremendous
growth since their emergence. To minimize the retrieval delay experienced by a
user requesting a web object, caching strategies are often applied: contents
are replicated at edges of the network closer to the user, so that the network
distance between the user and the object is reduced. In this literature
survey, the evolution of caching is studied. A recent research paper [15] in
the field of large-scale caching for CDNs was chosen as the anchor paper,
serving as a guide to the topic. Research studies published after and relevant
to the anchor paper are also analyzed to better evaluate the statements and
results of the anchor paper and, more importantly, to obtain an unbiased view
of large-scale collaborative caching systems as a whole.
Comment: 5 pages, 5 figures
A Transfer Learning Approach for Cache-Enabled Wireless Networks
Locally caching contents at the network edge constitutes one of the most
disruptive approaches in 5G wireless networks. Reaping the benefits of edge
caching hinges on solving a myriad of challenges, such as how, what and when
to strategically cache contents subject to storage constraints, traffic load,
unknown spatio-temporal traffic demands and data sparsity. Motivated by this,
we propose a novel transfer learning-based caching procedure carried out at
each small cell base station. This is done by exploiting the rich contextual
information (i.e., users' content viewing history, social ties, etc.)
extracted from device-to-device (D2D) interactions, referred to as the source
domain. This prior information is incorporated in the so-called target domain,
where the goal is to optimally cache strategic contents at the small cells as
a function of storage, estimated content popularity, traffic load and backhaul
capacity. It is shown that the proposed approach overcomes the notorious data
sparsity and cold-start problems, yielding significant gains in terms of
users' quality-of-experience (QoE) and backhaul offloading in a setting
consisting of four small cell base stations.
Comment: some small fixes in notation
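As a minimal sketch of the cold-start idea, one might blend a source-domain popularity prior (e.g. estimated from D2D interactions) with sparse target-domain demand counts before ranking contents for caching. The blending weight `alpha`, the toy catalog and all numbers below are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

def blended_popularity(target_counts, source_prior, alpha=0.5):
    """Blend sparse target-domain demand counts with a source-domain
    popularity prior (illustrative transfer-learning flavour: the prior
    carries information when target observations are scarce)."""
    target = target_counts / max(target_counts.sum(), 1.0)
    return alpha * source_prior + (1.0 - alpha) * target

def cache_top_k(popularity, cache_size):
    """Cache the cache_size contents with the highest blended popularity."""
    return set(int(i) for i in np.argsort(popularity)[::-1][:cache_size])

# Toy example: 6 contents, almost no target-domain observations (cold start).
target_counts = np.array([0, 1, 0, 0, 0, 0], dtype=float)
source_prior  = np.array([0.4, 0.05, 0.25, 0.15, 0.1, 0.05])
pop = blended_popularity(target_counts, source_prior, alpha=0.8)
print(cache_top_k(pop, 2))  # content 0 is cached despite zero local demand
```

With a strong prior (`alpha=0.8`), content 0 is cached even though it has never been requested locally, which is the point of transferring source-domain knowledge to the cold-start target domain.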
Big Data Meets Telcos: A Proactive Caching Perspective
Mobile cellular networks are becoming increasingly complex to manage, while
classical deployment/optimization techniques and current solutions (i.e., cell
densification, acquiring more spectrum, etc.) are cost-ineffective and thus
seen as stopgaps. This calls for the development of novel approaches that
leverage recent advances in storage/memory, context-awareness and edge/cloud
computing, and fall into the framework of big data. However, big data is
itself yet another complex phenomenon to handle and comes with its notorious
4Vs: velocity, veracity, volume and variety. In this work, we address these
issues in the optimization of 5G wireless networks via the notion of proactive
caching at the base stations. In particular, we investigate the gains of
proactive caching in terms of backhaul offloading and request satisfaction,
while tackling the large amount of available data for content popularity
estimation. To estimate the content popularity, we first collect users' mobile
traffic data from several base stations of a Turkish telecom operator over a
time interval of several hours. Then, an analysis is carried out locally on a
big data platform and the gains of proactive caching at the base stations are
investigated via numerical simulations. It turns out that several gains are
possible depending on the level of available information and storage size. For
instance, with 10% of content ratings and 15.4 Gbyte of storage size (87% of
total catalog size), proactive caching achieves 100% request satisfaction and
offloads 98% of the backhaul when considering 16 base stations.
Comment: 8 pages, 5 figures
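The kind of backhaul-offloading gain described above can be illustrated with a toy simulation. Everything below is an illustrative assumption, not the paper's dataset: a Zipf-shaped popularity over a small catalog, one base station, and a cache holding the estimated most popular contents.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: Zipf(1)-shaped popularity over a small catalog.
n_contents, n_requests, cache_size = 100, 10_000, 20
ranks = np.arange(1, n_contents + 1)
popularity = (1.0 / ranks) / np.sum(1.0 / ranks)

# Proactive caching: store the estimated most popular contents.
# Contents are indexed by popularity rank, so the top-k are simply 0..k-1.
cached = np.arange(cache_size)

requests = rng.choice(n_contents, size=n_requests, p=popularity)
hits = int(np.isin(requests, cached).sum())

# Every cache hit is served locally, offloading the backhaul.
print(f"backhaul offload ratio: {hits / n_requests:.2f}")
```

Under Zipf(1) with this catalog, caching the top 20% of contents already serves roughly two-thirds of requests locally, which is why estimated popularity drives such large offloading gains in the paper's setting.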
Big Data Caching for Networking: Moving from Cloud to Edge
In order to cope with the relentless data tsunami in wireless networks,
current approaches such as acquiring new spectrum, deploying more base
stations (BSs) and increasing the number of nodes in mobile packet core
networks are becoming ineffective in terms of scalability, cost and
flexibility. In this regard, context-aware 5G networks with edge/cloud
computing and exploitation of \emph{big data} analytics can yield significant
gains for mobile operators. In this article, proactive content caching in 5G
wireless networks is investigated and a big data-enabled architecture is
proposed. In this practical architecture, a vast amount of data is harnessed
for content popularity estimation, and strategic contents are cached at the
BSs to achieve higher users' satisfaction and backhaul offloading. To validate
the proposed solution, we consider a real-world case study in which several
hours of mobile data traffic is collected from a major telecom operator in
Turkey and a big data-enabled analysis is carried out leveraging tools from
machine learning. Based on the available information and storage capacity,
numerical studies show that gains are achieved both in terms of users'
satisfaction and backhaul offloading. For example, depending on the fraction
of content ratings available and the storage size relative to the total
library size at the BSs, proactive caching yields high users' satisfaction and
offloads a large fraction of the backhaul.
Comment: accepted for publication in IEEE Communications Magazine, Special
Issue on Communications, Caching, and Computing for Content-Centric Mobile
Networks
Learning-Based Optimization of Cache Content in a Small Cell Base Station
Optimal cache content placement in a wireless small cell base station (sBS)
with limited backhaul capacity is studied. The sBS has a large cache memory
and provides content-level selective offloading by delivering high data rate
contents to users in its coverage area. The goal of the sBS content controller
(CC) is to store the most popular contents in the sBS cache memory such that
the maximum amount of data can be fetched directly from the sBS, without
relying on the limited backhaul resources during peak traffic periods. If the
popularity profile is known in advance, the problem reduces to a knapsack
problem. However, it is assumed in this work that the popularity profile of
the files is not known by the CC, which can only observe the instantaneous
demand for the cached content. Hence, the cache content placement is optimised
based on the demand history. By refreshing the cache content at regular time
intervals, the CC tries to learn the popularity profile while exploiting the
limited cache capacity in the best way possible. Three algorithms are studied
for this cache content placement problem, leading to different
exploitation-exploration trade-offs. We provide extensive numerical
simulations in order to study the time evolution of these algorithms and the
impact of the system parameters, such as the number of files, the number of
users, the cache size and the skewness of the popularity profile, on the
performance. It is shown that the proposed algorithms quickly learn the
popularity profile for a wide range of system parameters.
Comment: Accepted to IEEE ICC 2014, Sydney, Australia. Minor typos corrected.
Algorithm MCUCB corrected.
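A generic sketch of such an exploitation-exploration cache refresh loop is given below. This is an epsilon-greedy rule, not the paper's MCUCB algorithm, and the catalog size, cache size and demand model are all illustrative assumptions. The key constraint from the abstract is preserved: the controller observes demand only for currently cached files.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative epsilon-greedy cache refresh (not the paper's MCUCB).
n_files, cache_size, n_rounds, eps = 50, 10, 200, 0.1
true_pop = rng.dirichlet(np.ones(n_files) * 0.3)  # unknown to the controller
demand_counts = np.zeros(n_files)

for t in range(n_rounds):
    # Exploit: cache the files with the highest observed demand so far;
    # explore: occasionally replace a cache slot with a random file.
    cache = list(np.argsort(demand_counts)[::-1][:cache_size])
    for i in range(cache_size):
        if rng.random() < eps:
            cache[i] = rng.integers(n_files)
    # The CC only observes demand for files that are currently cached.
    requests = rng.choice(n_files, size=100, p=true_pop)
    for f in requests:
        if f in cache:
            demand_counts[f] += 1

learned_top = set(int(i) for i in np.argsort(demand_counts)[::-1][:cache_size])
true_top = set(int(i) for i in np.argsort(true_pop)[::-1][:cache_size])
overlap = len(learned_top & true_top)
print(f"overlap with true top-{cache_size}: {overlap}")
```

The exploration rate `eps` controls the trade-off the abstract refers to: with `eps = 0` the controller can get stuck never observing a popular but initially uncached file, while large `eps` wastes cache slots on random files.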
Content Placement in Cache-Enabled Sub-6 GHz and Millimeter-Wave Multi-antenna Dense Small Cell Networks
This paper studies the performance of cache-enabled dense small cell networks
consisting of multi-antenna sub-6 GHz and millimeter-wave (mmWave) base
stations. In contrast to existing works, which consider only a single antenna
at each base station, the optimal content placement is unknown when the base
stations have multiple antennas. We first derive the successful content
delivery probability by accounting for the key channel features at sub-6 GHz
and mmWave frequencies. Maximizing the successful content delivery probability
is a challenging problem. To tackle it, we first propose a constrained
cross-entropy algorithm which achieves a near-optimal solution with moderate
complexity. We then develop another simple yet effective heuristic
probabilistic content placement scheme, termed the two-stair algorithm, which
strikes a balance between caching the most popular contents and achieving
content diversity. Numerical results demonstrate the superior performance of
the constrained cross-entropy method and show that the two-stair algorithm
yields significantly better performance than caching only the most popular
contents. The comparisons between the sub-6 GHz and mmWave systems reveal an
interesting tradeoff between caching capacity and density for the mmWave
system to achieve performance similar to that of the sub-6 GHz system.
Comment: 14 pages; Accepted to appear in IEEE Transactions on Wireless
Communications
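One way to read the two-stair idea is as a two-level probabilistic placement over contents sorted by popularity: the first stair caches the top contents with probability one, and the second stair spreads the remaining cache budget uniformly over a window of less popular contents, which is where the content diversity comes from. The split parameters (m, w) below are free illustrative choices, not the paper's optimized design:

```python
import numpy as np

def two_stair_probabilities(n_contents, cache_size, m, w):
    """Two-level ("two-stair") caching probabilities over contents
    sorted by decreasing popularity. The first m contents are always
    cached; the leftover budget (cache_size - m) is spread uniformly
    over the next w contents. Illustrative sketch only."""
    assert m <= cache_size and m + w <= n_contents
    p = np.zeros(n_contents)
    p[:m] = 1.0                        # first stair: always cached
    p[m:m + w] = (cache_size - m) / w  # second stair: partial caching
    return p

p = two_stair_probabilities(n_contents=20, cache_size=8, m=5, w=10)
print(p.sum())  # expected cache occupancy equals the cache size (8.0)
```

Setting `m = cache_size` recovers plain most-popular caching, while shrinking `m` and widening `w` trades cache hits on the head of the popularity distribution for diversity across neighbouring cells.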