872 research outputs found
Performance Prediction of Cloud-Based Big Data Applications
Big data analytics have become widespread as a means to extract knowledge from large datasets. Yet, the heterogeneity and irregularity usually associated with big data applications often overwhelm the existing software and hardware infrastructures. In such context, the flexibility and elasticity provided by the cloud computing paradigm offer a natural approach to cost-effectively adapting the allocated resources to the application's current needs. However, these same characteristics impose extra challenges to predicting the performance of cloud-based big data applications, a key step to proper management and planning. This paper explores three modeling approaches for performance prediction of cloud-based big data applications. We evaluate two queuing-based analytical models and a novel fast ad hoc simulator in various scenarios based on different applications and infrastructure setups. The three approaches are compared in terms of prediction accuracy, finding that our best approaches can predict average application execution times with 26% relative error in the very worst case and about 7% on average.
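As a back-of-the-envelope illustration of the accuracy metric quoted above, relative error compares a predicted execution time against the measured one. A minimal sketch (the function name and the sample run times are illustrative, not values from the paper):

```python
def relative_error(predicted: float, measured: float) -> float:
    """Absolute relative error of a predicted execution time."""
    return abs(predicted - measured) / measured

# Hypothetical measured vs. predicted execution times (seconds)
measured = [120.0, 95.0, 210.0, 160.0]
predicted = [128.0, 90.0, 225.0, 150.0]

errors = [relative_error(p, m) for p, m in zip(predicted, measured)]
avg_error = sum(errors) / len(errors)      # "about 7% on average"-style figure
worst_error = max(errors)                  # "worst case"-style figure
```

The paper's headline numbers (26% worst case, ~7% average) are aggregates of exactly this kind of per-run comparison.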
Breaking Computational Barriers to Perform Time Series Pattern Mining at Scale and at the Edge
Uncovering repeated behavior in time series is an important problem in many domains such as medicine, geophysics, meteorology, and many more. With the continuing surge of smart/embedded devices generating time series data, there is an ever-growing need to perform analysis on datasets of increasing size. Additionally, there is an increasing need for analysis at low-power edge devices due to latency problems inherent to the speed of light and the sheer amount of data being recorded. The matrix profile has proven to be a tool highly suitable for pattern mining in time series; however, a naive approach to computing the matrix profile makes it impossible to use effectively in both the cloud and at the edge. This dissertation shows how, through the use of GPUs and machine learning, the matrix profile can be computed more feasibly, both at cloud scale and at sensor scale. In addition, it illustrates why both of these types of computation are important and what new insights they can provide to practitioners working with time series data.
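The "naive approach" mentioned above is easy to see in code: the matrix profile stores, for each subsequence, the z-normalized Euclidean distance to its nearest non-trivial neighbor, and a brute-force computation is O(n² · m). The sketch below is a minimal illustration of that quadratic baseline, not the GPU-accelerated method the dissertation develops; the function name and the exclusion-zone size are assumptions:

```python
import numpy as np

def naive_matrix_profile(ts, m):
    """Brute-force matrix profile: for every length-m subsequence of ts,
    the z-normalized Euclidean distance to its nearest neighbor outside
    a small exclusion zone (to skip trivial self-matches)."""
    n = len(ts) - m + 1
    subs = np.array([ts[i:i + m] for i in range(n)], dtype=float)
    # z-normalize each subsequence so matching is shape-based
    subs = (subs - subs.mean(axis=1, keepdims=True)) / subs.std(axis=1, keepdims=True)
    profile = np.full(n, np.inf)
    excl = max(1, m // 4)  # exclusion zone half-width (a common heuristic)
    for i in range(n):          # O(n^2 * m): the cost GPUs help amortize
        for j in range(n):
            if abs(i - j) < excl:
                continue
            d = np.linalg.norm(subs[i] - subs[j])
            if d < profile[i]:
                profile[i] = d
    return profile
```

Low values in the returned profile mark repeated patterns (motifs); high values mark discords. The double loop is what makes this formulation infeasible at scale without the acceleration techniques the dissertation describes.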
Scalable system for smart urban transport management
Efficient management of smart transport systems requires the integration of various sensing technologies, as well as fast processing of a high volume of heterogeneous data, in order to perform smart analytics of urban networks in real time. However, dynamic response that relies on intelligent demand-side transport management is particularly challenging due to the increasing flow of transmitted sensor data. In this work, a novel smart service-driven, adaptable middleware architecture is proposed to acquire, store, manipulate, and integrate information from heterogeneous data sources in order to deliver smart analytics aimed at supporting strategic decision-making. The architecture offers adaptive and scalable data integration services for acquiring and processing dynamic data, delivering fast response time, and offering data mining and machine learning models for real-time prediction, combined with advanced visualisation techniques. The proposed solution has been implemented and validated, demonstrating its ability to provide real-time performance on the existing, operational, and large-scale bus network of a European capital city.
Street Smart in 5G : Vehicular Applications, Communication, and Computing
Recent advances in information technology have revolutionized the automotive industry, paving the way for next-generation smart vehicular mobility. Specifically, vehicles, roadside units, and other road users can collaborate to deliver novel services and applications that leverage, for example, big vehicular data and machine learning. Relatedly, fifth-generation cellular networks (5G) are being developed and deployed for low-latency, high-reliability, and high-bandwidth communications. Meanwhile, 5G-adjacent technologies such as edge computing allow for data offloading and computation at the edge of the network, ensuring even lower latency and context-awareness. Overall, these developments provide a rich ecosystem for the evolution of vehicular applications, communications, and computing. Therefore, in this work, we aim at providing a comprehensive overview of the state of research on vehicular computing in the emerging age of 5G and big data. In particular, this paper highlights several vehicular applications, investigates their requirements, details the enabling communication technologies and computing paradigms, and studies data analytics pipelines and the integration of these enabling technologies in response to application requirements.
Peer reviewed
Data-Driven Approach based on Deep Learning and Probabilistic Models for PHY-Layer Security in AI-enabled Cognitive Radio IoT.
PhD Theses.
Cognitive Radio Internet of Things (CR-IoT) has revolutionized almost every field of life and reshaped the technological world. Several tiny devices are seamlessly connected in a CR-IoT network to perform various tasks in many applications. Nevertheless, CR-IoT suffers from malicious attacks that pulverize communication and perturb network performance. Therefore, it has recently been envisaged to introduce higher-level Artificial Intelligence (AI) by incorporating Self-Awareness (SA) capabilities into CR-IoT objects, enabling CR-IoT networks to establish secure transmission against vicious attacks autonomously. In this context, sub-band information from the Orthogonal Frequency Division Multiplexing (OFDM) modulated transmission in the spectrum has been extracted from the radio device receiver terminal, and a generalized state vector (GS) is formed containing low-dimension in-phase and quadrature components. Accordingly, a probabilistic method based on learning a switching Dynamic Bayesian Network (DBN) from OFDM transmission with no abnormalities has been proposed to statistically model signal behaviors inside the CR-IoT spectrum. A Bayesian filter, the Markov Jump Particle Filter (MJPF), is implemented to perform state estimation and capture malicious attacks. Subsequently, a GS containing a higher number of subcarriers has been investigated. In this connection, a Variational Autoencoder (VAE) is used as a deep learning technique to extract features from high-dimension radio signals into a low-dimension latent space z, and the DBN is learned based on the GS containing latent-space data. Afterward, to perform state estimation and capture abnormalities in a spectrum, an Adapted Markov Jump Particle Filter (A-MJPF) is deployed. The proposed method can capture anomalies that appear due to either jammer attacks in transmission or cognitive devices in a network experiencing different transmission sources that have not been observed previously. The performance is assessed using the receiver
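To make the filtering idea concrete, below is a minimal bootstrap particle filter on a 1-D random-walk state, with a per-step "surprise" score (negative log mean likelihood) standing in for the abnormality measures that the MJPF/A-MJPF compute. The dynamics model, noise levels, and score are simplifying assumptions for illustration, not the thesis's actual filter:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.1, obs_std=0.2):
    """Minimal bootstrap particle filter for a 1-D random-walk state.
    Returns per-step state estimates and a 'surprise' score that spikes
    when an observation is unlikely under the learned dynamics, e.g. a
    sudden jammer-like jump in the signal."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates, surprise = [], []
    for z in observations:
        # predict: propagate particles through random-walk dynamics
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # update: Gaussian likelihood of the observation per particle
        lik = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        surprise.append(-np.log(lik.mean() + 1e-300))
        if lik.sum() > 0:
            w = lik / lik.sum()
        else:  # all particles inconsistent with z: fall back to uniform
            w = np.full(n_particles, 1.0 / n_particles)
        estimates.append(np.sum(w * particles))
        # resample (multinomial) to concentrate on plausible states
        idx = rng.choice(n_particles, n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates), np.array(surprise)
```

Feeding this filter a flat signal that jumps abruptly makes the surprise score spike at the jump, which is the general mechanism by which a Bayesian filter flags an abnormality such as a jammer attack or a previously unobserved transmission source.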