Many-to-Many Matching Games for Proactive Social-Caching in Wireless Small Cell Networks
In this paper, we address the caching problem in small cell networks from a
game theoretic point of view. In particular, we formulate the caching problem
as a many-to-many matching game between small base stations and service
providers' servers. The servers store a set of videos and aim to cache these
videos at the small base stations in order to reduce the experienced delay by
the end-users. On the other hand, small base stations cache the videos
according to their local popularity, so as to reduce the load on the backhaul
links. We propose a new matching algorithm for the many-to-many problem and
prove that it reaches a pairwise stable outcome. Simulation results show that
the number of satisfied requests at the small base stations under the proposed
caching algorithm can reach up to three times that of a random caching policy.
Moreover, the expected download time of all the videos is significantly
reduced.
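The proposal-and-rejection dynamic behind such a many-to-many matching can be sketched as follows. This is an illustrative deferred-acceptance-style loop, not the paper's exact algorithm; the preference lists, quota names, and eviction rule are all assumptions for the sketch.

```python
# Illustrative many-to-many deferred acceptance between videos and small
# base stations (SBSs). Videos "propose" to SBSs in preference order; each
# SBS keeps only its most-preferred videos up to its cache quota.

def many_to_many_da(video_prefs, sbs_prefs, video_quota, sbs_quota):
    """video_prefs[v]: SBSs ordered best-first; sbs_prefs[s]: videos ordered
    best-first. Each video may be cached at up to video_quota[v] SBSs and
    each SBS holds up to sbs_quota[s] videos."""
    rank = {s: {v: i for i, v in enumerate(p)} for s, p in sbs_prefs.items()}
    next_prop = {v: 0 for v in video_prefs}    # next SBS index to propose to
    matched = {v: set() for v in video_prefs}  # current matches per video
    held = {s: set() for s in sbs_prefs}       # current matches per SBS
    active = True
    while active:
        active = False
        for v, prefs in video_prefs.items():
            while len(matched[v]) < video_quota[v] and next_prop[v] < len(prefs):
                s = prefs[next_prop[v]]
                next_prop[v] += 1
                active = True
                held[s].add(v)
                matched[v].add(s)
                if len(held[s]) > sbs_quota[s]:
                    # SBS over quota: evict its least-preferred held video
                    worst = max(held[s], key=lambda u: rank[s][u])
                    held[s].remove(worst)
                    matched[worst].discard(s)
    return held
```

Each video's proposal pointer only moves forward, so the loop terminates; rejected videos simply continue down their preference lists, which is the mechanism the pairwise-stability argument in such games typically rests on.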
Matching Theory for Future Wireless Networks: Fundamentals and Applications
The emergence of novel wireless networking paradigms such as small cell and
cognitive radio networks has forever transformed the way in which wireless
systems are operated. In particular, the need for self-organizing solutions to
manage the scarce spectral resources has become a prevalent theme in many
emerging wireless systems. In this paper, the first comprehensive tutorial on
the use of matching theory, a Nobel-Prize-winning framework, for resource
management in wireless networks is developed. To cater for the unique features
of emerging wireless networks, a novel, wireless-oriented classification of
matching theory is proposed. The key solution concepts and algorithmic
implementations of this framework are then presented, and the developed
concepts are applied in three important wireless networking areas in order to
demonstrate the usefulness of this analytical tool. Results show how matching
theory can effectively improve the performance of resource allocation in all
three discussed applications.
A Transfer Learning Approach for Cache-Enabled Wireless Networks
Locally caching contents at the network edge constitutes one of the most
disruptive approaches in 5G wireless networks. Reaping the benefits of edge
caching hinges on solving a myriad of challenges such as how, what and when to
strategically cache contents subject to storage constraints, traffic load,
unknown spatio-temporal traffic demands and data sparsity. Motivated by this,
we propose a novel transfer learning-based caching procedure carried out at
each small cell base station. This is done by exploiting the rich contextual
information (i.e., users' content viewing history, social ties, etc.) extracted
from device-to-device (D2D) interactions, referred to as source domain. This
prior information is incorporated in the so-called target domain where the goal
is to optimally cache strategic contents at the small cells as a function of
storage, estimated content popularity, traffic load and backhaul capacity. It
is shown that the proposed approach overcomes the notorious data sparsity and
cold-start problems, yielding significant gains in terms of users'
quality-of-experience (QoE) and backhaul offloading in a setting consisting of
four small cell base stations.
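The core transfer-learning idea described above can be sketched as blending a popularity prior from the source domain (D2D interactions) with the sparse request observations of the target domain. The mixing weight `alpha`, the function names, and the greedy caching rule are illustrative assumptions, not the paper's exact estimator.

```python
# Hedged sketch: combine a source-domain popularity prior with sparse
# target-domain request counts to sidestep the cold-start problem, then
# cache greedily under a storage constraint.

def estimate_popularity(source_prior, target_counts, alpha=0.7):
    """Blend source-domain prior with normalized target-domain counts."""
    total = sum(target_counts.values()) or 1
    contents = set(source_prior) | set(target_counts)
    return {c: alpha * source_prior.get(c, 0.0)
               + (1 - alpha) * target_counts.get(c, 0) / total
            for c in contents}

def cache_decision(popularity, sizes, capacity):
    """Greedily cache the most popular contents that fit in storage."""
    cached, used = [], 0
    for c in sorted(popularity, key=popularity.get, reverse=True):
        if used + sizes[c] <= capacity:
            cached.append(c)
            used += sizes[c]
    return cached
```

With `alpha = 1` the cache relies entirely on the source-domain prior (pure transfer, useful when the target cell has seen no requests yet); with `alpha = 0` it degenerates to target-only estimation, which suffers from the data sparsity the abstract mentions.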
Big Data Meets Telcos: A Proactive Caching Perspective
Mobile cellular networks are becoming increasingly complex to manage while
classical deployment/optimization techniques and current solutions (i.e., cell
densification, acquiring more spectrum, etc.) are cost-ineffective and thus
seen as stopgaps. This calls for the development of novel approaches that
leverage recent advances in storage/memory, context-awareness, and edge/cloud
computing, all of which fall into the framework of big data. However, big data
is itself yet another complex phenomenon to handle and comes with its notorious
four Vs: velocity, veracity, volume, and variety. In this work, we address
these issues in
optimization of 5G wireless networks via the notion of proactive caching at the
base stations. In particular, we investigate the gains of proactive caching in
terms of backhaul offloadings and request satisfactions, while tackling the
large-amount of available data for content popularity estimation. In order to
estimate the content popularity, we first collect users' mobile traffic data
from several base stations of a Turkish telecom operator over a multi-hour
time interval. An analysis is then carried out locally on a big data platform
and
the gains of proactive caching at the base stations are investigated via
numerical simulations. It turns out that several gains are possible depending
on the level of available information and storage size. For instance, with 10%
of content ratings and 15.4 Gbyte of storage size (87% of total catalog size),
proactive caching achieves 100% of request satisfaction and offloads 98% of the
backhaul when considering 16 base stations.
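The two metrics in the experiment above reduce, in the simplest view, to a cache-hit ratio: requests served from the proactively filled cache are both satisfied locally and offloaded from the backhaul. The toy trace and the popularity-ranking rule below are illustrative assumptions, not the operator data or estimator used in the paper.

```python
# Minimal sketch of popularity-based proactive caching: fill the cache
# with the most-requested contents seen in a training trace, then measure
# the fraction of later requests served locally (i.e., offloaded from the
# backhaul).
from collections import Counter

def proactive_cache(training_trace, storage):
    """Cache the `storage` most requested contents from a training trace."""
    return {c for c, _ in Counter(training_trace).most_common(storage)}

def local_hit_ratio(cache, test_trace):
    """Fraction of test requests served from the cache."""
    hits = sum(1 for c in test_trace if c in cache)
    return hits / len(test_trace)
```

The paper's gains come from doing this estimation well at scale; the sketch only shows why better popularity estimates and larger storage both raise the hit ratio.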
Distributed Caching in Small Cell Networks
The dense deployment of small cells in indoor and outdoor areas contributes significantly to increasing the capacity of cellular networks. On the other hand, the high number of deployed base stations, coupled with the increasing growth of data traffic, has prompted the emergence of base stations fitted with storage capacity to avoid network saturation. The storage devices are used as caching units to overcome the limited backhaul capacity in small cell networks (SCNs). Extending the concept of storage to SCNs gives rise to many new challenges related to the specific characteristics of these networks, such as the heterogeneity of the base stations. Formulating the caching problem while taking into account all these specific characteristics, with the aim of satisfying the users' expectations, results in combinatorial optimization problems. However, classical optimization tools do not ensure the optimality of the provided solutions, and the proposed algorithms often have exponential complexity. While most of the existing works are based on classical optimization tools, in this thesis we explore another approach to provide a practical solution to the caching problem. In particular, we focus on matching theory, a game-theoretic approach that provides mathematical tools to formulate, analyze, and understand scenarios between sets of players. We model the caching problem as a one-to-one matching game between a set of files and a set of base stations, and we then propose an iterative extension of the deferred acceptance algorithm that reaches a stable and optimal matching between the two sets. The experimental results show that the proposed algorithm reduces the backhaul load by 10-15% compared to a random caching algorithm.
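The deferred acceptance algorithm that the thesis extends can be stated compactly in its classic one-to-one form. The file/SBS names and preference lists below are hypothetical; the thesis's iterative extension adds machinery on top of this basic building block.

```python
# Classic one-to-one deferred acceptance between files and base stations:
# files "propose" to SBSs in preference order; each SBS holds on to the
# best proposer seen so far and rejects the rest.

def deferred_acceptance(file_prefs, sbs_prefs):
    rank = {s: {f: i for i, f in enumerate(p)} for s, p in sbs_prefs.items()}
    free = list(file_prefs)            # files not yet matched
    next_idx = {f: 0 for f in file_prefs}
    match = {}                         # SBS -> file
    while free:
        f = free.pop()
        if next_idx[f] >= len(file_prefs[f]):
            continue                   # f has exhausted its list
        s = file_prefs[f][next_idx[f]]
        next_idx[f] += 1
        if s not in match:
            match[s] = f               # s was empty: tentatively accept
        elif rank[s][f] < rank[s][match[s]]:
            free.append(match[s])      # s prefers the newcomer; old file freed
            match[s] = f
        else:
            free.append(f)             # s rejects f; f tries its next choice
    return match
```

The resulting matching is stable: no file and SBS both prefer each other to their assigned partners, which is the property the thesis builds its iterative extension upon.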
Echo State Networks for Proactive Caching in Cloud-Based Radio Access Networks with Mobile Users
In this paper, the problem of proactive caching is studied for cloud radio
access networks (CRANs). In the studied model, the baseband units (BBUs)
predict the content request distribution and mobility pattern of each user,
and determine which content to cache at remote radio heads and BBUs. This
problem
is formulated as an optimization problem which jointly incorporates backhaul
and fronthaul loads and content caching. To solve this problem, an algorithm
that combines the machine learning framework of echo state networks with
sublinear algorithms is proposed. Using echo state networks (ESNs), the BBUs
can predict each user's content request distribution and mobility pattern while
having only limited information on the network's and user's state. In order to
predict each user's periodic mobility pattern with minimal complexity, the
memory capacity of the corresponding ESN is derived for a periodic input. This
memory capacity is shown to be able to record the maximum amount of user
information for the proposed ESN model. Then, a sublinear algorithm is proposed
to determine which content to cache while using limited content request
distribution samples. Simulation results using real data from Youku and the
Beijing University of Posts and Telecommunications show that the proposed
approach yields significant gains, in terms of sum effective capacity, that
reach up to 27.8% and 30.7%, respectively, compared to random caching with
clustering and random caching without clustering. (Accepted in the IEEE
Transactions on Wireless Communications.)
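The sublinear part of the approach, deciding what to cache from only a limited number of request-distribution samples, can be sketched in a few lines. The ESN prediction step is omitted here, and the trace, sample size, and seed are illustrative assumptions rather than the paper's algorithm.

```python
# Hedged sketch of sample-based content selection: estimate the top-k
# contents from a small random sample of the request stream instead of
# scanning it fully, in the spirit of a sublinear algorithm.
import random
from collections import Counter

def sampled_top_k(requests, k, sample_size, seed=0):
    """Return an estimate of the k most popular contents using only
    sample_size randomly drawn requests."""
    random.seed(seed)
    sample = random.sample(requests, min(sample_size, len(requests)))
    return [c for c, _ in Counter(sample).most_common(k)]
```

The accuracy/cost trade-off is controlled by `sample_size`: a heavily skewed popularity profile (as in real request traces) lets a small sample recover the head of the distribution reliably, which is why limited samples suffice for the caching decision.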
On the Benefits of Edge Caching for MIMO Interference Alignment
In this contribution, we jointly investigate the benefits of caching and
interference alignment (IA) in multiple-input multiple-output (MIMO)
interference channel under limited backhaul capacity. In particular, total
average transmission rate is derived as a function of various system parameters
such as backhaul link capacity, cache size, number of active
transmitter-receiver pairs as well as the quantization bits for channel state
information (CSI). Given the fact that base stations are equipped both with
caching and IA capabilities and have knowledge of content popularity profile,
we then characterize an operational regime where the caching is beneficial.
Subsequently, we find the optimal number of transmitter-receiver pairs that
maximizes the total average transmission rate. When the popularity profile of
requested contents falls into the operational regime, it turns out that caching
substantially improves the throughput as it mitigates the backhaul usage and
allows IA methods to benefit from such limited backhaul. (A shorter version is
to be presented at the 16th IEEE International Workshop on Signal Processing
Advances in Wireless Communications (SPAWC 2015), Stockholm, Sweden.)
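The role of the content popularity profile in the operational regime above can be illustrated with a standard Zipf model: the cache-hit probability is simply the popularity mass of the cached (most popular) contents, and caching relieves the backhaul in proportion to that mass. The parameters are illustrative; the paper derives the exact rate expression and regime analytically.

```python
# Toy Zipf cache-hit model: with popularity rank r weighted 1/r^gamma,
# the hit probability of a cache holding the top cache_size contents is
# the normalized weight of those contents.

def zipf_hit_probability(catalog_size, cache_size, gamma):
    weights = [1.0 / (r ** gamma) for r in range(1, catalog_size + 1)]
    return sum(weights[:cache_size]) / sum(weights)
```

A larger skew exponent `gamma` concentrates requests on fewer contents, raising the hit probability for the same cache size; this is the qualitative mechanism by which a favorable popularity profile pushes the system into the regime where caching is beneficial.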
A Content-based Centrality Metric for Collaborative Caching in Information-Centric Fogs
Information-Centric Fog Computing enables a multitude of nodes near the
end-users to provide storage, communication, and computing, rather than in the
cloud. In a fog network, nodes connect with each other directly to get content
locally whenever possible. As the topology of the network directly influences
the nodes' connectivity, there has been some work to compute the graph
centrality of each node within that network topology. The centrality is then
used to distinguish nodes in the fog network, or to prioritize some nodes over
others to participate in the caching fog. We argue that, for an
Information-Centric Fog Computing approach, graph centrality is not an
appropriate metric. Indeed, a node with low connectivity that caches a lot of
content may play a very valuable role in the network.
To capture this, we introduce a content-based centrality (CBC) metric which
takes into account how well a node is connected to the content the network is
delivering, rather than to the other nodes in the network. To illustrate the
validity of considering content-based centrality, we use this new metric for a
collaborative caching algorithm. We compare the performance of the proposed
collaborative caching with typical centrality based, non-centrality based, and
non-collaborative caching mechanisms. Our simulation implements CBC on three
instances of a large-scale realistic network topology comprising 2,896 nodes
with three content replication levels. Results show that CBC outperforms
benchmark caching schemes and yields a roughly 3x improvement in the average
cache hit rate.
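One plausible reading of a content-based centrality is to score a node by the popularity-weighted content it can serve nearby, rather than by its graph connectivity. The hop budget, weighting, and function names below are illustrative assumptions, not the paper's exact CBC definition.

```python
# Hedged sketch of a content-based centrality: sum the popularity of the
# distinct contents a node can reach (its own cache plus caches within a
# hop budget), so a poorly connected node holding valuable content still
# scores highly.

def content_based_centrality(node, adjacency, cache, popularity, hops=1):
    frontier, seen = {node}, {node}
    for _ in range(hops):
        frontier = {n for f in frontier for n in adjacency[f]} - seen
        seen |= frontier
    contents = set().union(*(cache[n] for n in seen))
    return sum(popularity.get(c, 0.0) for c in contents)
```

Under this sketch, a leaf node caching popular content can outrank a well-connected hub caching nothing, which is exactly the failure mode of pure graph centrality that the abstract argues against.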
A Survey of Anticipatory Mobile Networking: Context-Based Classification, Prediction Methodologies, and Optimization Techniques
A growing trend in information technology is not just to react to changes, but to anticipate them as much as possible. This paradigm has made modern solutions, such as recommendation systems, a ubiquitous presence in today's digital transactions. Anticipatory networking extends the idea to communication technologies by studying patterns and periodicity in human behavior and network dynamics to optimize network performance. This survey collects and analyzes recent papers leveraging context information to forecast the evolution of network conditions and, in turn, to improve network performance. In particular, we identify the main prediction and optimization tools adopted in this body of work and link them with the objectives and constraints of the typical applications and scenarios. Finally, we consider open challenges and research directions to make anticipatory networking part of next generation networks.