High-Performance Modelling and Simulation for Big Data Applications
This open access book was prepared as the Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predicting and analysing natural and complex systems in science and engineering. As their level of abstraction rises to afford a better discernment of the domain at hand, their representations become increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction between High Performance Computing and Modelling and Simulation is therefore required to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest to these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.
Entrega de conteúdos multimédia em over-the-top: caso de estudo das gravações automáticas (Over-the-top multimedia content delivery: a case study of catch-up TV recordings)
Doutoramento em Engenharia Eletrotécnica (Doctorate in Electrical Engineering)

Over-The-Top (OTT) multimedia delivery is a very appealing approach for providing ubiquitous, flexible, and globally accessible services capable of low-cost and unrestrained device targeting. In spite of its appeal, the underlying delivery architecture must be carefully planned and optimized to maintain a high Quality-of-Experience (QoE) and rational resource usage, especially when migrating from
services running on managed networks with established quality guarantees. To address
the lack of holistic research works on OTT multimedia delivery systems, this
Thesis focuses on an end-to-end optimization challenge, considering a migration
use-case of a popular Catch-up TV service from managed IP Television (IPTV)
networks to OTT. A global study is conducted on the importance of Catch-up
TV and its impact on today's society, demonstrating the growing popularity of
this time-shift service, its relevance in the multimedia landscape, and its fitness as
an OTT migration use-case. Catch-up TV consumption logs are obtained from
a Pay-TV operator's live production IPTV service with over 1 million subscribers
to characterize demand and extract insights from service utilization at a
scale and scope not yet addressed in the literature. This characterization is used
to build demand forecasting models relying on machine learning techniques to enable
static and dynamic optimization of OTT multimedia delivery solutions. These models produce accurate forecasts of bandwidth and storage requirements, and may be used to achieve considerable power and cost savings whilst maintaining a high QoE. A novel caching algorithm, Most Popularly Used (MPU), is proposed,
implemented, and shown to outperform established caching algorithms in both
simulation and experimental scenarios. The need for accurate QoE measurements
in OTT scenarios supporting HTTP Adaptive Streaming (HAS) motivates the creation
of a new QoE model capable of taking into account the impact of key HAS
aspects. By addressing the complete content delivery pipeline in the envisioned
content-aware OTT Content Delivery Network (CDN), this Thesis demonstrates
that significant improvements are possible in next-generation multimedia delivery solutions.
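The abstract names the MPU caching algorithm but does not describe its internals, so the following is only a generic illustration of the broader idea of popularity-driven cache eviction: a fixed-capacity cache that evicts the item with the lowest accumulated request count. The class name and interface are assumptions for illustration, not the thesis's actual MPU algorithm.

```python
# Minimal sketch of popularity-count-based cache eviction.
# NOTE: a generic illustration only, NOT the MPU algorithm from the
# thesis; the abstract does not specify MPU's internals.

class PopularityCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}   # item id -> cached content
        self.hits = {}    # item id -> request count (popularity)

    def request(self, item_id, fetch):
        """Return the item, counting the request; fetch() loads it on a miss."""
        self.hits[item_id] = self.hits.get(item_id, 0) + 1
        if item_id in self.store:
            return self.store[item_id]  # cache hit
        if len(self.store) >= self.capacity:
            # Evict the cached item with the lowest popularity count.
            victim = min(self.store, key=lambda k: self.hits[k])
            del self.store[victim]
        self.store[item_id] = fetch()
        return self.store[item_id]
```

Under such a policy, repeatedly requested content stays resident while one-off requests are the first candidates for eviction, which matches the heavy-tailed demand typically reported for catch-up TV workloads.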
Technologies and Applications for Big Data Value
This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions to major industrial areas. The book starts with an introductory chapter that provides an overview of the book by positioning the following chapters in terms of their contributions to the technology frameworks that are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, “Technologies and Methods”, contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, “Processes and Applications”, details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the nucleus of the European data community, bringing together businesses and leading researchers to harness the value of data for the benefit of society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, and machine learning and AI; and second, practitioners and industry experts engaged in data-driven systems, software design, and deployment projects who are interested in employing these advanced methods to address real-world problems.