A Secure Approach for Caching Contents in Wireless Ad Hoc Networks
Caching stores data locally at some nodes in the network so that contents can be
retrieved in shorter time. However, in-network caching has not always considered
secure storage, owing to the trade-off between time performance and security. In
this paper, a novel decentralized secure coded caching approach is proposed. In
this solution, nodes transmit only coded files to prevent eavesdropper
wiretapping and protect user contents. In this technique, random vectors are
used to combine the contents using the XOR operation.
We modeled the proposed coded caching scheme by a Shannon cipher system to show
that coded caching achieves asymptotic perfect secrecy. The proposed coded
caching scheme significantly simplifies the routing protocol in cached networks
while reducing over-caching and achieving a higher throughput capacity than
uncoded caching under reactive routing. It is shown that, with the proposed
coded caching scheme, any content can be retrieved by selecting a random path
while achieving an asymptotically optimal solution. We have also studied the
cache hit probability and shown that the coded cache hit probability is
significantly higher than that of uncoded caching. A secure caching update
algorithm is also presented. Comment: To appear in IEEE Transactions on Vehicular Technology
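The XOR-combination idea can be sketched as follows. This is a minimal, generic illustration of XOR coding for secrecy, not the paper's full scheme; the random-vector selection and the Shannon-cipher analysis are omitted, and the fragment names are hypothetical:

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings element-wise."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two equal-length content fragments held at a caching node
# (f2 plays the role of a randomly chosen combining fragment).
f1 = b"SECRETDATA"
f2 = os.urandom(len(f1))

# The node transmits only the coded (XOR-combined) file; an
# eavesdropper who wiretaps just this transmission learns nothing
# about f1 without also obtaining f2 (one-time-pad-style secrecy).
coded = xor_bytes(f1, f2)

# A legitimate receiver that holds f2 recovers f1 exactly,
# since XOR is its own inverse.
recovered = xor_bytes(coded, f2)
assert recovered == f1
```

Because XOR is self-inverse, the same operation serves for both encoding and decoding, which is what keeps the per-node computation cheap.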
Energy Efficiency of Downlink Networks with Caching at Base Stations
Caching popular contents at base stations (BSs) can reduce the backhaul cost
and improve the network throughput. Yet whether locally caching at the BSs can
improve the energy efficiency (EE), a major goal for 5th generation cellular
networks, remains unclear. Due to the entangled impact of various factors on EE
such as interference level, backhaul capacity, BS density, power consumption
parameters, BS sleeping, content popularity, and cache capacity, another
important question is which key factors contribute most to the EE gain from
caching. In this paper, we explore the EE potential of cache-enabled wireless
access networks and identify those key factors. By deriving a closed-form
expression for the approximate EE, we provide the condition under which the EE
can benefit from caching, find the optimal cache capacity
that maximizes the network EE, and analyze the maximal EE gain brought by
caching. We show that caching at the BSs can improve the network EE when power
efficient cache hardware is used. When local caching has EE gain over not
caching, caching more contents at the BSs may not provide higher EE. Numerical
and simulation results show that the caching EE gain is large when the backhaul
capacity is stringent, interference level is low, content popularity is skewed,
and when caching at pico BSs instead of macro BSs. Comment: Accepted by Journal on Selected Areas in Communications (JSAC),
Special Issue on Energy-Efficient Techniques for 5G Wireless Communication
System
Coded Caching based on Combinatorial Designs
We consider the standard broadcast setup with a single server broadcasting
information to a number of clients, each of which has local storage (called a
cache) of some size that can store parts of the files available at the server.
The centralized coded caching framework consists of a caching phase and a
delivery phase, both carefully designed to use the cache and the channel
together optimally. In prior
literature, various combinatorial structures have been used to construct coded
caching schemes. In this work, we propose a binary matrix model to construct
the coded caching scheme. The ones in such a caching matrix indicate
uncached subfiles at the users. Identity submatrices of the caching matrix
represent transmissions in the delivery phase. Using this model, we then
propose several novel constructions for coded caching based on the various
types of combinatorial designs. While most of the schemes constructed in this
work (based on existing designs) have a high cache requirement, they provide a
rate that is either constant or decreasing with the number of users, and
moreover require competitively small levels of subpacketization, an
extremely important parameter in practical applications of coded caching. We
mark this work as another attempt to exploit the well-developed theory of
combinatorial designs for the problem of constructing caching schemes,
utilizing the binary caching model we develop. Comment: 10 pages, Appeared in Proceedings of IEEE ISIT 201
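The role of identity submatrices can be illustrated with a toy caching matrix. The matrix below is a hypothetical example, not one of the paper's design-based constructions: rows are subfiles, columns are users, and a 1 marks a subfile the user has not cached.

```python
import numpy as np

# Toy caching matrix C: rows = subfiles, cols = users;
# C[i][j] = 1 means user j has NOT cached subfile i.
C = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
])

# Rows 0-2 over users 0-2 form an identity submatrix: each user
# misses exactly one distinct subfile and has cached the other two.
sub = C[np.ix_([0, 1, 2], [0, 1, 2])]
assert np.array_equal(sub, np.eye(3, dtype=int))

# One delivery-phase transmission, the XOR of subfiles 0, 1, and 2,
# therefore serves all three users at once: each user XORs out the
# two subfiles it already caches and recovers the one it misses.
s = [0b011, 0b101, 0b110]          # toy subfile contents
broadcast = s[0] ^ s[1] ^ s[2]
user0_decodes = broadcast ^ s[1] ^ s[2]
assert user0_decodes == s[0]
```

This is why identity submatrices map directly to transmissions: each one identifies a group of users whose missing subfiles can be coded together into a single broadcast.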
General Caching Is Hard: Even with Small Pages
Caching (also known as paging) is a classical problem concerning page
replacement policies in two-level memory systems. General caching is the
variant with pages of different sizes and fault costs. We give the first
NP-hardness result for general caching with small pages: General caching is
(strongly) NP-hard even when page sizes are limited to {1, 2, 3}. It holds
already in the fault model (each page has unit fault cost) as well as in the
bit model (each page has the same fault cost as size). We also give a very
short proof of the strong NP-hardness of general caching with page sizes
restricted to {1, 2, 3} and arbitrary costs. Comment: 19 pages, 8 figures, an extended abstract appeared in the proceedings
of MAPSP 2015 (www.mapsp2015.com), a conference version has been submitted
Content Delivery Latency of Caching Strategies for Information-Centric IoT
In-network caching is a central aspect of Information-Centric Networking
(ICN). It enables the rapid distribution of content across the network,
alleviating strain on content producers and reducing content delivery
latencies. ICN has emerged as a promising candidate for use in the Internet of
Things (IoT). However, IoT devices operate under severe constraints, most
notably limited memory. This means that nodes cannot indiscriminately cache all
content; instead, there is a need for a caching strategy that decides what
content to cache. Furthermore, many applications in the IoT space are
time-sensitive; therefore, finding a caching strategy that minimises the latency
between content request and delivery is desirable. In this paper, we evaluate a
number of ICN caching strategies with regard to latency and hop-count reduction
using IoT devices in a physical testbed. We find that the topology of the
network, and thus the routing algorithm used to generate forwarding
information, has a significant impact on the performance of a given caching
strategy. To the best of our knowledge, this is the first study that focuses on
latency effects in ICN-IoT caching while using real IoT hardware, and the first
to explicitly discuss the link between routing algorithm, network topology, and
caching effects. Comment: 10 pages, 9 figures, journal paper
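A caching strategy of the kind discussed above can be sketched as follows. This is a generic probabilistic-admission strategy with LRU eviction, a common baseline in the ICN literature; it is an assumption for illustration, not necessarily one of the strategies the paper evaluates, and all names are hypothetical:

```python
import random
from collections import OrderedDict

class ProbabilisticCache:
    """Toy ICN-style caching strategy for a memory-constrained IoT
    node: admit incoming content with probability p, evicting the
    least-recently-used entry when the small capacity is exceeded."""

    def __init__(self, capacity: int, p: float, seed=None):
        self.capacity = capacity
        self.p = p
        self.rng = random.Random(seed)
        self.store = OrderedDict()  # name -> content, in LRU order

    def request(self, name, fetch):
        if name in self.store:            # cache hit: serve locally,
            self.store.move_to_end(name)  # refresh LRU position
            return self.store[name]
        content = fetch(name)             # miss: forward toward producer
        if self.rng.random() < self.p:    # probabilistic admission
            self.store[name] = content
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict LRU entry
        return content

# Usage: with p = 1.0 every item is admitted, so the second request
# for "/a" is served from the cache without contacting the producer.
cache = ProbabilisticCache(capacity=2, p=1.0, seed=0)
fetched = []
def fetch(name):
    fetched.append(name)
    return name.upper()

cache.request("/a", fetch)
cache.request("/b", fetch)
cache.request("/a", fetch)
assert fetched == ["/a", "/b"]  # second "/a" was a cache hit
```

Lowering p trades hit ratio for reduced cache churn, which matters on flash-backed IoT nodes where writes are costly.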
Caching at the Edge with LT codes
We study the performance of caching schemes based on LT codes under the peeling
(iterative) decoding algorithm. We assume that users request content from
multiple cache-aided transmitters. The transmitters are connected through a
backhaul link to a master node, while no direct link exists between the users
and the master node. Each content is fragmented and coded with an LT code.
Cache placement at each transmitter is optimized such that transmission over
the backhaul link is minimized. We derive a closed-form expression for the
backhaul transmission rate. We compare the performance of a caching scheme
based on LT codes with that of a caching scheme based on maximum distance
separable (MDS) codes. Finally, we show that caching with LT codes performs as
well as caching with MDS codes.
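The peeling decoder at the heart of such schemes can be sketched in a few lines. The coded symbols below are constructed by hand to keep the example deterministic; a real LT code would draw each symbol's degree from a distribution such as the robust soliton, which is omitted here:

```python
def peel(coded, k):
    """Toy peeling (iterative) decoder for XOR-coded symbols.
    coded: list of (set_of_source_indices, xor_value) pairs.
    Repeatedly resolves degree-1 symbols and subtracts each
    recovered block from the remaining symbols."""
    symbols = [[set(idx), val] for idx, val in coded]
    known = {}
    progress = True
    while progress and len(known) < k:
        progress = False
        for s in symbols:
            idx = s[0]
            for i in [j for j in idx if j in known]:
                idx.discard(i)        # subtract already-known blocks
                s[1] ^= known[i]
            if len(idx) == 1:         # degree-1: block revealed
                i = idx.pop()
                if i not in known:
                    known[i] = s[1]
                    progress = True
    return known

# Four toy source blocks and a hand-built set of coded symbols.
blocks = [3, 14, 15, 9]
coded = [
    ({0}, blocks[0]),                  # degree-1 symbol starts the peeling
    ({0, 1}, blocks[0] ^ blocks[1]),
    ({1, 2}, blocks[1] ^ blocks[2]),
    ({2, 3}, blocks[2] ^ blocks[3]),
]
recovered = peel(coded, len(blocks))
assert recovered == {0: 3, 1: 14, 2: 15, 3: 9}
```

Peeling succeeds whenever resolving one block keeps exposing a new degree-1 symbol, which is why the degree distribution of the LT code is the critical design choice.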
