Big Data Caching for Networking: Moving from Cloud to Edge
In order to cope with the relentless data tsunami in wireless networks,
current approaches such as acquiring new spectrum, deploying more base stations
(BSs) and increasing nodes in mobile packet core networks are becoming
ineffective in terms of scalability, cost and flexibility. In this regard,
context-aware 5G networks with edge/cloud computing and exploitation of
\emph{big data} analytics can yield significant gains to mobile operators. In
this article, proactive content caching in 5G wireless networks is
investigated in which a big data-enabled architecture is proposed. In this
practical architecture, vast amount of data is harnessed for content popularity
estimation and strategic contents are cached at the BSs to achieve higher
users' satisfaction and backhaul offloading. To validate the proposed solution,
we consider a real-world case study where several hours of mobile data traffic
is collected from a major telecom operator in Turkey and a big data-enabled
analysis is carried out leveraging tools from machine learning. Based on the
available information and storage capacity, numerical studies show that several
gains are achieved both in terms of users' satisfaction and backhaul
offloading. For example, in the case of BSs with of content ratings
and Gbyte of storage size ( of total library size), proactive
caching yields of users' satisfaction and offloads of the
backhaul.
Comment: accepted for publication in IEEE Communications Magazine, Special Issue on Communications, Caching, and Computing for Content-Centric Mobile Networks
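The abstract describes popularity-driven proactive caching at the BSs without giving the policy itself; a minimal sketch of the underlying idea is to estimate content popularity from the observed request trace and greedily fill each cache up to its storage budget. The greedy rule and all names below are illustrative assumptions, not the paper's algorithm:

```python
from collections import Counter

def proactive_cache(requests, sizes, capacity):
    """Greedily cache the most-requested contents at a base station.

    requests: iterable of content IDs observed in the traffic trace
    sizes:    dict of content ID -> size (e.g. in GB)
    capacity: total cache storage budget (same unit as sizes)
    """
    popularity = Counter(requests)            # empirical popularity estimate
    cache, used = set(), 0.0
    for content, _count in popularity.most_common():
        if used + sizes[content] <= capacity:
            cache.add(content)
            used += sizes[content]
    return cache

# Toy trace: 'a' is requested most; the budget fits two unit-size items.
trace = ['a', 'a', 'a', 'b', 'b', 'c']
print(sorted(proactive_cache(trace, {'a': 1, 'b': 1, 'c': 1}, 2)))  # ['a', 'b']
```

In practice the popularity estimate would come from the machine-learning analysis the paper describes rather than raw counts, but the cache-filling step has the same shape.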
Big Data Meets Telcos: A Proactive Caching Perspective
Mobile cellular networks are becoming increasingly complex to manage while
classical deployment/optimization techniques and current solutions (i.e., cell
densification, acquiring more spectrum, etc.) are cost-ineffective and thus
seen as stopgaps. This calls for the development of novel approaches that
leverage recent advances in storage/memory, context-awareness, and edge/cloud
computing, and that fall within the framework of big data. However, big data is
itself yet another complex phenomenon to handle, and comes with its notorious
four Vs: volume, velocity, variety and veracity. In this work, we address these issues in
optimization of 5G wireless networks via the notion of proactive caching at the
base stations. In particular, we investigate the gains of proactive caching in
terms of backhaul offloading and request satisfaction, while tackling the
large amount of available data for content popularity estimation. In order to
estimate the content popularity, we first collect users' mobile traffic data
from several base stations of a Turkish telecom operator over several hours.
Then, an analysis is carried out locally on a big data platform and
the gains of proactive caching at the base stations are investigated via
numerical simulations. It turns out that several gains are possible depending
on the level of available information and storage size. For instance, with 10%
of content ratings and 15.4 Gbyte of storage size (87% of total catalog size),
proactive caching achieves 100% of request satisfaction and offloads 98% of the
backhaul when considering 16 base stations.
Comment: 8 pages, 5 figures
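The reported figures combine two metrics. As a toy illustration, request satisfaction and backhaul offloading might be computed from a request trace and a cache as below; the definitions are simplified assumptions, not necessarily the paper's exact ones:

```python
def request_satisfaction(requests, cache):
    """Fraction of requests answered directly from the cache."""
    return sum(1 for r in requests if r in cache) / len(requests)

def backhaul_offload(requests, sizes, cache):
    """Fraction of requested bytes served locally, i.e. traffic that
    never has to traverse the backhaul link."""
    total = sum(sizes[r] for r in requests)
    local = sum(sizes[r] for r in requests if r in cache)
    return local / total if total else 0.0

trace = ['a', 'a', 'b', 'c']
sizes = {'a': 4, 'b': 2, 'c': 2}
print(request_satisfaction(trace, {'a', 'b'}))     # 0.75
print(backhaul_offload(trace, sizes, {'a', 'b'}))  # 0.8333...
```

With these definitions, larger caches and more skewed popularity both push the two metrics toward the 100%/98% figures quoted above.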
Storage Solutions for Big Data Systems: A Qualitative Study and Comparison
Big data systems development is full of challenges in view of the variety of
application areas and domains that this technology promises to serve.
Typically, fundamental design decisions involved in big data systems design
include choosing appropriate storage and computing infrastructures. In this age
of heterogeneous systems that integrate different technologies for an optimized
solution to a specific real-world problem, big data systems are no exception.
As far as the storage aspect of any big data system is concerned, the primary
facet is the storage infrastructure, and NoSQL seems to be the right technology
to fulfill its requirements. However,
every big data application has variable data characteristics and thus, the
corresponding data fits into a different data model. This paper presents
feature and use case analysis and comparison of the four main data models
namely document oriented, key value, graph and wide column. Moreover, a feature
analysis of 80 NoSQL solutions has been provided, elaborating on the criteria
and points that a developer must consider while making a possible choice.
Typically, big data storage needs to communicate with the execution engine and
other processing and visualization technologies to create a comprehensive
solution. This brings the second facet of big data storage, big data file
formats, into the picture. The second half of the paper compares the
advantages, shortcomings and possible use cases of available big data file
formats for Hadoop, which is the foundation for most big data computing
technologies. Decentralized storage and blockchain are seen as the next
generation of big data storage, and their challenges and future prospects are
also discussed.
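As a rough illustration of how the four data models compared above shape the same record differently (purely schematic Python dicts; real stores such as MongoDB, Redis, Cassandra, or Neo4j each have their own storage formats and APIs):

```python
# One user-profile record as each of the four NoSQL data models would shape it.

# Document-oriented: one self-contained (e.g. JSON) document per entity.
document = {"_id": "user:42", "name": "Ada", "follows": ["user:7", "user:99"]}

# Key-value: an opaque value, retrievable only by its key.
key_value = {"user:42": '{"name": "Ada", "follows": ["user:7", "user:99"]}'}

# Wide column: row key -> column family -> columns.
wide_column = {
    "user:42": {
        "profile": {"name": "Ada"},
        "edges": {"follows:user:7": "", "follows:user:99": ""},
    }
}

# Graph: entities as nodes, relationships as first-class edges.
graph_edges = [("user:42", "FOLLOWS", "user:7"),
               ("user:42", "FOLLOWS", "user:99")]

print(document["name"], len(graph_edges))  # Ada 2
```

The choice among them hinges on exactly the access patterns the paper analyzes: documents favor whole-entity reads, key-value favors point lookups, wide column favors sparse per-column access, and graphs favor relationship traversal.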
Mitigating Interference in Content Delivery Networks by Spatial Signal Alignment: The Approach of Shot-Noise Ratio
Multimedia content, especially video, is expected to dominate data traffic in
next-generation mobile networks. Caching popular content at the network edge
has emerged to be a solution for low-latency content delivery. Compared with
traditional wireless communication, content delivery has a key characteristic:
many signals coexisting in the air carry identical popular
content. They, however, can interfere with each other at a receiver if their
modulation-and-coding (MAC) schemes are adapted to individual channels
following the classic approach. To address this issue, we present a novel idea
of content adaptive MAC (CAMAC) where adapting MAC schemes to content ensures
that all signals carrying identical content are encoded using an identical MAC
scheme, achieving spatial MAC alignment. Consequently, interference can be
harnessed as signals, to improve the reliability of wireless delivery. In the
remaining part of the paper, we focus on quantifying the gain CAMAC can bring
to a content-delivery network using a stochastic-geometry model. Specifically,
content helpers are distributed as a Poisson point process, each of which
transmits a file from a content database based on a given popularity
distribution. It is discovered that the successful content-delivery probability
is closely related to the distribution of the ratio of two independent shot
noise processes, named a shot-noise ratio. The distribution itself is an open
mathematical problem that we tackle in this work. Using stable-distribution
theory and tools from stochastic geometry, the distribution function is derived
in closed form. Extending the result in the context of content-delivery
networks with CAMAC yields the content-delivery probability in different closed
forms. In addition, the gain in the probability due to CAMAC is shown to grow
with the level of skewness in the content popularity distribution.
Comment: 32 pages, to appear in IEEE Trans. on Wireless Communications
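While the paper derives the shot-noise-ratio distribution in closed form, the quantity is easy to explore numerically. Below is a hedged Monte Carlo sketch: helpers form a Poisson point process on an annulus, each shot-noise sample sums path-loss terms r^(-alpha) over the points, and the ratio of two independent samples is drawn repeatedly. The density, path-loss exponent, and exclusion radius are illustrative assumptions, not the paper's settings:

```python
import math
import random

def poisson_sample(lam):
    # Knuth's method; adequate for the small lambda used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def shot_noise(density, radius, alpha=4.0, r_min=1.0):
    """One realization of sum_i r_i^(-alpha) over a Poisson point process
    on the annulus r_min <= r <= radius (the exclusion region around the
    receiver keeps the sum finite)."""
    area = math.pi * (radius ** 2 - r_min ** 2)
    n = poisson_sample(density * area)
    total = 0.0
    for _ in range(n):
        r = math.sqrt(random.uniform(r_min ** 2, radius ** 2))  # r^2 uniform
        total += r ** (-alpha)
    return total

def shot_noise_ratio_samples(m, density, radius):
    """Monte Carlo samples of the ratio of two independent shot noises."""
    return [shot_noise(density, radius) / max(shot_noise(density, radius), 1e-12)
            for _ in range(m)]

random.seed(0)
samples = shot_noise_ratio_samples(1000, density=0.02, radius=10.0)
print(len(samples), min(samples) >= 0.0)
```

A histogram of `samples` would approximate the distribution that the paper obtains exactly via stable-distribution theory.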
NSSDC Conference on Mass Storage Systems and Technologies for Space and Earth Science Applications, volume 1
Papers and viewgraphs from the conference are presented. This conference served as a broad forum for the discussion of a number of important issues in the field of mass storage systems. Topics include magnetic disk and tape technologies, optical disks and tape, software storage and file management systems, and experiences with the use of a large, distributed storage system. The technical presentations describe, among other things, integrated mass storage systems that are expected to be available commercially. Also included is a series of presentations from Federal Government organizations and research institutions covering their mass storage requirements for the 1990s.
Knowledge-infused and Consistent Complex Event Processing over Real-time and Persistent Streams
Emerging applications in Internet of Things (IoT) and Cyber-Physical Systems
(CPS) present novel challenges to Big Data platforms for performing online
analytics. Ubiquitous sensors from IoT deployments are able to generate data
streams at high velocity, that include information from a variety of domains,
and accumulate to large volumes on disk. Complex Event Processing (CEP) is
recognized as an important real-time computing paradigm for analyzing
continuous data streams. However, existing work on CEP is largely limited to
relational query processing, exposing two distinctive gaps for query
specification and execution: (1) infusing the relational query model with
higher level knowledge semantics, and (2) seamless query evaluation across
temporal spaces that span past, present and future events. Closing these gaps
enables accessible analytics over data streams with properties from different
disciplines, and helps span the velocity (real-time) and volume (persistent)
dimensions. In this article, we introduce a Knowledge-infused CEP (X-CEP)
framework that provides domain-aware knowledge query constructs along with
temporal operators that allow end-to-end queries to span across real-time and
persistent streams. We translate this query model to efficient query execution
over online and offline data streams, proposing several optimizations to
mitigate the overheads introduced by evaluating semantic predicates and in
accessing high-volume historic data streams. The proposed X-CEP query model and
execution approaches are implemented in our prototype semantic CEP engine,
SCEPter. We validate our query model using domain-aware CEP queries from a
real-world Smart Power Grid application, and experimentally analyze the
benefits of our optimizations for executing these queries, using event streams
from a campus-microgrid IoT deployment.
Comment: 34 pages, 16 figures, accepted in Future Generation Computer Systems, October 27, 201
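The temporal sequence operators such queries rely on can be sketched minimally: match an event of one type followed by another within a time window, over a merged stream that may combine persistent (past) and real-time (present) events. This is a toy SEQ matcher, not SCEPter's actual query engine; the event types and window are invented for illustration:

```python
from collections import deque

def detect_sequence(events, first, second, window):
    """Minimal CEP SEQ operator: yield (t1, t2) whenever an event of type
    `second` follows an event of type `first` within `window` time units.

    events: iterable of (timestamp, type) pairs in time order, e.g. a
    replayed persistent stream concatenated with a live one.
    """
    pending = deque()  # timestamps of unmatched `first` events
    for ts, kind in events:
        while pending and ts - pending[0] > window:
            pending.popleft()          # expire candidates outside the window
        if kind == first:
            pending.append(ts)
        elif kind == second and pending:
            yield (pending.popleft(), ts)

stream = [(0, "spike"), (3, "trip"), (10, "spike"), (20, "trip")]
print(list(detect_sequence(stream, "spike", "trip", window=5)))  # [(0, 3)]
```

A knowledge-infused engine would additionally evaluate semantic predicates on each event (the overheads the paper's optimizations target), but the windowed sequence matching itself has this shape.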