Capacity of Cellular Networks with Femtocache
The capacity of next-generation cellular networks using femtocaches is
studied when multihop communications and decentralized cache placement are
considered. We show that the storage capability of future network User
Terminals (UTs) can be effectively used to increase the capacity in random
decentralized uncoded caching. We further propose a random decentralized coded
caching scheme that achieves a higher capacity than random decentralized
uncoded caching. The results show that coded caching, which suits systems with
limited storage capability, can improve the capacity of cellular networks by a
factor of log(n), where n is the number of nodes served by the femtocache.
Comment: 6 pages, 2 figures, presented at Infocom Workshops on 5G and beyond,
San Francisco, CA, April 201
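As a rough illustration of the random decentralized uncoded placement described above, the sketch below (a toy model with assumed parameters, not the paper's analysis) computes the probability that at least one of n UTs in a femtocache cluster holds a requested file when each UT independently caches a fixed fraction of the library:

```python
def cluster_hit_probability(n, cache_frac):
    """Probability that a requested file is cached by at least one of n
    user terminals, each independently caching a random fraction
    cache_frac of the content library (toy model of random
    decentralized uncoded placement)."""
    return 1.0 - (1.0 - cache_frac) ** n

# The aggregate storage of many UTs quickly covers the library:
for n in (1, 4, 16, 64):
    print(n, round(cluster_hit_probability(n, 0.1), 4))
```

The point of the toy is only that aggregate UT storage grows the cluster-wide hit probability toward one; the log(n) capacity gain of coded placement requires the multihop analysis in the paper.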
Optimizing Pilot Overhead for Ultra-Reliable Short-Packet Transmission
In this paper we optimize the pilot overhead for ultra-reliable short-packet
transmission and investigate the dependence of this overhead on packet size and
error probability. In particular, we consider a point-to-point communication in
which one sensor sends messages to a central node, or base-station, over AWGN
with Rayleigh fading channel. We formalize the optimization in terms of
approximate achievable rates at a given block length, pilot length, and error
probability. This leads to more accurate pilot overhead optimization.
Simulation results show that it is important to account for the packet
size and the error probability when optimizing the pilot overhead.
Comment: To be published in the IEEE ICC 2017 Communication Theory Symposium
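The trade-off described above, where more pilots improve channel estimation but consume channel uses a short packet cannot spare, can be sketched numerically. The snippet below is an illustrative normal-approximation model, not the paper's exact formulation; in particular, the pilot-dependent effective-SNR penalty is an assumption introduced here:

```python
import math
from statistics import NormalDist

def approx_rate(n, n_p, snr, eps):
    """Normal-approximation achievable rate (bits per channel use) for a
    block of n symbols containing n_p pilots, at the given SNR and target
    error probability eps. The pilot-dependent effective SNR is a crude
    stand-in for channel-estimation loss (an assumption, not the paper's
    model)."""
    n_d = n - n_p                          # data symbols
    snr_eff = snr * n_p / (n_p + 1.0)      # estimation-error penalty (toy)
    cap = math.log2(1.0 + snr_eff)
    disp = (snr_eff * (snr_eff + 2.0)) / (2.0 * (snr_eff + 1.0) ** 2)
    disp *= math.log2(math.e) ** 2         # AWGN channel dispersion
    q_inv = NormalDist().inv_cdf(1.0 - eps)
    return (n_d / n) * (cap - math.sqrt(disp / n_d) * q_inv)

def best_pilot_length(n, snr, eps):
    """Pilot length maximizing the approximate rate; short blocklengths
    and strict eps shift the optimum, as the abstract argues."""
    return max(range(1, n - 1), key=lambda n_p: approx_rate(n, n_p, snr, eps))
```

For example, `best_pilot_length(200, 10.0, 1e-5)` balances estimation quality against lost data symbols under this toy model; repeating the search for different `n` and `eps` shows the overhead's dependence on packet size and reliability target.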
QoE-driven Cache Placement for Adaptive Video Streaming: Minding the Viewport
To handle the increasing demand for video streaming, ISPs and service
providers use edge servers to cache video content to reduce the rush on
origin servers, balance the load among them and across the network, and
smooth out traffic variability. The Dynamic Adaptive Streaming over HTTP
(DASH) protocol makes videos available in multiple representations, and
end-users can switch video resolution as a function of their network
conditions and terminal display capacity (e.g., bandwidth, screen
resolution). In this context, we study a viewport-aware caching optimization
problem for dynamic adaptive video streaming that appropriately considers
the client viewport size and access speed, the join time, and the
characteristics of videos. We formulate the proposed optimization problem as
an Integer Linear Program (ILP) that balances minimal join time and maximal
visual experience, subject to the cache storage capacity. Our framework
sheds light on optimal caching performance. Our proposed heuristic provides
guidelines on which videos, and which representations of each video, to
cache based on video popularity, encoding information, and the distribution
of end-user display capacity and access speed, in a way that maximizes the
overall end-user QoE.
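The flavor of the cache-placement problem can be conveyed by a small greedy stand-in: rank (video, representation) pairs by popularity-weighted QoE gain per unit of storage and fill the cache. This is a hypothetical simplification for illustration, not the authors' ILP or heuristic:

```python
def greedy_cache_placement(items, capacity):
    """items: iterable of (video_id, repr_id, size, gain) tuples, where
    gain is the popularity-weighted QoE improvement from caching that
    representation. Greedy knapsack by gain density; a toy stand-in for
    the ILP described above, not the paper's method."""
    chosen, used = [], 0
    for vid, rep, size, gain in sorted(items, key=lambda x: x[3] / x[2],
                                       reverse=True):
        if used + size <= capacity:
            chosen.append((vid, rep))
            used += size
    return chosen

# Hypothetical catalogue: sizes in GB, gains in arbitrary QoE units.
catalogue = [("a", "720p", 4, 8), ("a", "1080p", 6, 9), ("b", "720p", 4, 6)]
print(greedy_cache_placement(catalogue, capacity=8))
# → [('a', '720p'), ('b', '720p')]
```

The density ordering is why a cheaper representation of a popular video can beat a higher-gain but bulkier one, mirroring the trade-off between join time, visual quality, and cache capacity described above.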
Power Allocation and Cooperative Diversity in Two-Way Non-Regenerative Cognitive Radio Networks
In this paper, we investigate the performance of a dual-hop block-fading
cognitive radio network with underlay spectrum sharing over independent but not
necessarily identically distributed (i.n.i.d.) Nakagami-m fading channels.
The primary network consists of a source and a destination. Two cases are
considered, depending on whether the secondary network, which consists of two
source nodes, has a single relay or multiple relays employing opportunistic
relay selection for cooperation, and on whether the two source nodes suffer
from primary-user (PU) interference; these cases are referred to as Scenario
(a) and Scenario (b), respectively. For the considered underlay spectrum
sharing, the transmit power constraint of the proposed system is set by the
interference limit on the primary network and by the interference imposed by
the PU. We derive new analytical results for the outage capacity (OC) and
average symbol
error probability (ASEP). In particular, for Scenario (a), tight lower bounds
on the OC and ASEP of the secondary network are derived in closed-form. In
addition, a closed-form expression for the end-to-end OC of Scenario (a) is
obtained. Regarding Scenario (b), a tight lower bound on the OC of the
secondary network is derived in closed-form. All analytical results are
corroborated using Monte Carlo simulations.
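The Monte Carlo corroboration mentioned above can be sketched for a heavily simplified variant of the system: a two-hop amplify-and-forward link whose transmit power is capped by an underlay interference limit, with Nakagami-m fading modeled via Gamma-distributed power gains. All parameter values and the single-constraint power model are illustrative assumptions, not the paper's exact setup:

```python
import math
import random

def outage_probability(trials=20000, m=2.0, rate=1.0,
                       p_max=1.0, i_limit=0.5, seed=1):
    """Monte Carlo outage estimate for a toy two-hop non-regenerative
    (amplify-and-forward) link under an underlay interference constraint.
    Nakagami-m power gains are Gamma(m, 1/m) distributed (unit mean)."""
    rng = random.Random(seed)

    def gain():
        return rng.gammavariate(m, 1.0 / m)   # Nakagami-m power gain, E = 1

    outages = 0
    for _ in range(trials):
        g_sp = gain()                          # secondary-tx -> primary-rx
        p_tx = min(p_max, i_limit / g_sp)      # underlay power constraint
        g1, g2 = gain(), gain()                # the two hops
        snr1, snr2 = p_tx * g1, p_tx * g2
        snr_e2e = snr1 * snr2 / (snr1 + snr2 + 1.0)   # exact AF e2e SNR
        if math.log2(1.0 + snr_e2e) < rate:
            outages += 1
    return outages / trials
```

Tightening `i_limit` lowers the permitted transmit power and raises the outage probability, which is the qualitative behavior the closed-form OC bounds capture analytically.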
Spatial and Social Paradigms for Interference and Coverage Analysis in Underlay D2D Network
The homogeneous Poisson point process (PPP) is widely used to model the
spatial distribution of base stations and mobile terminals. The same process
can be used to model an underlay device-to-device (D2D) network; however,
neglecting the homophilic relation in D2D pairing yields underestimated
system insights. In
this paper, we model both the spatial and social distributions of interfering
D2D nodes as a proximity-based, independently marked homogeneous Poisson point
process. Proximity captures the physical distance between D2D nodes, whereas
the social relationship is modeled by Zipf-distributed marks. We apply these two
paradigms to analyze the effect of interference on the coverage probability
of a distance-proportional, power-controlled cellular user. Effectively, we
apply two types of functional mappings (physical distance, social marks) to
the Laplace
functional of the PPP. The resulting coverage probability has no closed-form
expression; however, for a subset of social marks, the mark summation converges
to digamma and polygamma functions. This subset yields upper and lower bounds
on the coverage probability. We present a numerical evaluation of these bounds
while varying several parameters. The results
show that, by applying simple power control to the cellular user, an
ultra-dense underlay D2D network can be realized without compromising the
coverage probability of the cellular user.
Comment: 10 pages, 10 figures
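A Monte Carlo check of mark-weighted PPP interference is straightforward to sketch. The toy below drops interferers as a homogeneous PPP in a disk, scales each one's power by an independent Zipf mark, and estimates the SIR coverage of a unit-power link at the origin. It uses bounded path loss and omits the paper's distance-proportional power control, so it only illustrates the modeling ingredients (PPP plus Zipf marks), not the analysis itself; all numbers are assumptions:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def coverage_probability(lam=0.05, radius=10.0, sir_th=1.0, alpha=4.0,
                         zipf_s=2.0, n_marks=5, trials=2000, seed=7):
    """SIR coverage of a receiver at the origin amid interferers forming
    an independently marked PPP of intensity lam in a disk. Zipf-law
    marks scale interferer power; path loss is bounded below r = 1."""
    rng = random.Random(seed)
    norm = sum(k ** -zipf_s for k in range(1, n_marks + 1))
    weights = [k ** -zipf_s / norm for k in range(1, n_marks + 1)]
    covered = 0
    for _ in range(trials):
        n = poisson(rng, lam * math.pi * radius ** 2)
        interference = 0.0
        for _ in range(n):
            r = radius * math.sqrt(rng.random())   # uniform point in disk
            mark = rng.choices(range(1, n_marks + 1), weights=weights)[0]
            # Rayleigh fading + bounded power-law path loss
            interference += mark * rng.expovariate(1.0) * max(r, 1.0) ** -alpha
        signal = rng.expovariate(1.0)              # unit-distance tagged link
        if signal > sir_th * interference:
            covered += 1
    return covered / trials
```

Sweeping `zipf_s` or `n_marks` shows how the social-mark distribution shifts the interference field, which is what the digamma/polygamma bounds quantify in closed form for a subset of marks.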
Secure and Private Cloud Storage Systems with Random Linear Fountain Codes
An information-theoretic approach to security and privacy called Secure And
Private Information Retrieval (SAPIR) is introduced. SAPIR is applied to
distributed data storage systems. In this approach, random combinations of all
contents are stored across the network. Our coding approach is based on Random
Linear Fountain (RLF) codes. To retrieve a content, a group of servers
collaborate with each other to form a Reconstruction Group (RG). SAPIR achieves
asymptotic perfect secrecy if at least one of the servers within an RG is not
compromised. Further, a Private Information Retrieval (PIR) scheme based on
random queries is proposed. The PIR scheme ensures that users privately
download their desired contents without the servers learning the indices of
the requested contents. The proposed scheme is adaptive and can provide
privacy against a significant number of colluding servers.
Comment: 8 pages, 2 figures
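The RLF storage idea above, storing random XOR combinations of all content chunks and recovering once enough linearly independent combinations are gathered from a reconstruction group, can be prototyped over GF(2). This is a minimal sketch of fountain-coded storage only; it has none of SAPIR's secrecy or private-query machinery:

```python
import random

def rlf_encode(chunks, n_symbols, rng):
    """Random Linear Fountain over GF(2): each stored symbol is the XOR
    of a uniformly random subset of the k source chunks, kept together
    with the subset mask (a k-bit integer)."""
    k = len(chunks)
    symbols = []
    for _ in range(n_symbols):
        mask = rng.getrandbits(k)
        value = 0
        for i in range(k):
            if mask >> i & 1:
                value ^= chunks[i]
        symbols.append((mask, value))
    return symbols

def rlf_decode(symbols, k):
    """Gaussian elimination over GF(2). Returns the k chunks, or None if
    the collected symbols do not have full rank."""
    pivots = {}                                   # pivot bit -> (mask, value)
    for mask, value in symbols:
        for bit in sorted(pivots, reverse=True):  # forward elimination
            if mask >> bit & 1:
                pm, pv = pivots[bit]
                mask ^= pm
                value ^= pv
        if mask:
            pivots[mask.bit_length() - 1] = (mask, value)
        if len(pivots) == k:
            break
    if len(pivots) < k:
        return None
    bits = sorted(pivots)                         # back-substitution
    for i, bit in enumerate(bits):
        mask, value = pivots[bit]
        for low in bits[:i]:
            if mask >> low & 1:
                lm, lv = pivots[low]
                mask ^= lm
                value ^= lv
        pivots[bit] = (mask, value)
    return [pivots[i][1] for i in range(k)]

# A few symbols beyond k suffice with high probability.
rng = random.Random(0)
stored = rlf_encode([3, 7, 12, 9], 12, rng)
print(rlf_decode(stored, 4))
```

Because every stored symbol mixes all contents, no single server's holdings reveal a particular file, which is the intuition the SAPIR secrecy analysis makes precise.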