Multimedia delivery in the future internet
The term “Networked Media” implies that all kinds of media, including text, images, 3D graphics, audio and video, are produced, distributed, shared, managed and consumed on-line through various networks, like the Internet, fibre, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the challenges of Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been confronted with a bewildering range of media, services and applications, and with technological innovations concerning media formats, wireless networks, and terminal types and capabilities, and there is little evidence that the pace of this innovation is slowing. Today, over one billion users access the Internet on a regular basis, more than 100 million users have downloaded at least one (multi)media file, and over 47 million of them do so regularly, searching in more than 160 Exabytes of content. In the near future these numbers are expected to rise exponentially. Internet content is expected to grow by at least a factor of six, rising to more than 990 Exabytes before 2012, fuelled mainly by the users themselves. Moreover, it is envisaged that in the near- to mid-term future, the Internet will provide the means to share and distribute (new) multimedia content and services with superior quality and striking flexibility, in a trusted and personalised way, improving citizens’ quality of life, working conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer in-network adaptation, machine-to-machine communication (including RFIDs), rich 3D content, as well as community networks and the use of peer-to-peer (P2P) overlays, are expected to generate new models of interaction and cooperation, and to support enhanced perceived quality of experience (PQoE) and innovative applications “on the move”, such as virtual collaboration environments, personalised services/media, virtual sport groups, on-line gaming and edutainment. In this context, interaction with content, combined with interactive multimedia search capabilities across distributed repositories, opportunistic P2P networks and dynamic adaptation to the characteristics of diverse mobile terminals, is expected to contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects in Framework Programme 6 (FP6) and Framework Programme 7 (FP7), a group of experts and technology visionaries have voluntarily contributed to this white paper, which aims to describe the status, the state of the art, the challenges and the way ahead in the area of content-aware media delivery platforms.
Chemical Treatment Methods Pilot (CTMP) System for Treatment of Urban Runoff – Phase I: Feasibility and Design
(pdf contains 418 pages)
Integrating perceptual, device and location characteristics for wireless multimedia transmission
In this paper, we describe an investigation exploring user experiences of accessing streamed multimedia content when that content is tailored according to perceptual, device and location characteristics. To this end, we have created pre-defined transmission profiles and streamed perceptually tailored multimedia content to three different locations, each characterised by different infotainment requirements. In the light of our results, we propose that multimedia transmission to mobile and wireless devices should be based on pre-defined profiles, which contain a combination of static information (perceptual characteristics, device type, CPU speed, and display specifications) and dynamic information (streamed content type, location of the device/user, and context of the device/user). The evaluation of such a system showed that users and service providers can gain considerably from such an approach, as user perceptions of quality were not detrimentally affected by QoS degradations. Consequently, service providers can utilise this information to effectively manage local network traffic and bandwidth.
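The profile idea above can be sketched in code. This is a minimal, hypothetical illustration, not the paper's system: the field names, the bitrate heuristic, and all numeric thresholds are our own assumptions, chosen only to show how static and dynamic characteristics might combine into one transmission decision.

```python
# Hypothetical sketch of profile-based stream selection. Field names and the
# bitrate heuristic are illustrative assumptions, not taken from the paper.
from dataclasses import dataclass

@dataclass
class TransmissionProfile:
    # static characteristics
    display_width: int
    cpu_mhz: int
    perceptual_tolerance: float  # 0..1: how much QoS degradation users accept
    # dynamic characteristics
    location: str                # e.g. "office", "street", "transit"
    content_type: str            # e.g. "news", "sport"

def select_bitrate_kbps(p: TransmissionProfile, available_kbps: int) -> int:
    """Pick a stream bitrate the device can decode and the user will tolerate."""
    device_cap = min(available_kbps, p.cpu_mhz // 2)      # crude device ceiling
    if p.location == "transit":                           # noisy context: degrade
        device_cap = int(device_cap * (1.0 - 0.5 * p.perceptual_tolerance))
    return max(64, device_cap)                            # never below a floor

profile = TransmissionProfile(800, 600, 0.4, "transit", "news")
print(select_bitrate_kbps(profile, 512))
```

The point of the sketch is that the server's decision is a pure function of the profile, so the provider can precompute or cache decisions per profile rather than negotiating per stream.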
Social Networks through the Prism of Cognition
Human relations are driven by social events - people interact, exchange information, share knowledge and emotions, or gather news from mass media. These events leave traces in human memory. The initial strength of a trace depends on cognitive factors such as emotions or attention span. Each trace continuously weakens over time unless another related event strengthens it. Here, we introduce a novel Cognition-driven Social Network (CogSNet) model that accounts for cognitive aspects of social perception and explicitly represents human memory dynamics. For validation, we apply our model to NetSense data on social interactions among university students. The results show that CogSNet significantly improves the quality of modeling human interactions in social networks.
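The decay-and-reinforce dynamic described above can be sketched with an exponential forgetting curve. This is our own simplification: the parameter names (mu, lambda, theta) and the saturating reinforcement rule are illustrative assumptions; the CogSNet paper defines its own forgetting functions and thresholds.

```python
# Toy sketch of a memory-trace tie weight with exponential forgetting.
# Parameters and the reinforcement rule are illustrative, not CogSNet's own.
import math

MU = 0.3      # strength added by one interaction event
LAM = 0.01    # forgetting rate per hour
THETA = 0.05  # below this the trace is considered forgotten

def decayed(w: float, dt_hours: float) -> float:
    """Weight after dt_hours with no reinforcing event."""
    w = w * math.exp(-LAM * dt_hours)
    return w if w >= THETA else 0.0   # trace dropped once it falls below theta

def reinforce(w: float) -> float:
    """A new related event pushes the trace up, saturating at 1."""
    return w + MU * (1.0 - w)

w = reinforce(0.0)     # first contact
w = decayed(w, 24.0)   # a day of silence weakens the trace
w = reinforce(w)       # the pair meets again
print(round(w, 4))
```

The threshold matters: a tie that is never reinforced eventually vanishes from the network entirely, which is what lets the model forget stale relations.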
Stream Learning in Energy IoT Systems: A Case Study in Combined Cycle Power Plants
The prediction of electrical power produced in combined cycle power plants is a key challenge in the electrical power and energy systems field. This power production can vary depending on environmental variables such as temperature, pressure, and humidity. Thus, the business problem is how to predict the power production as a function of these environmental conditions, in order to maximize profit. The research community has solved this problem by applying Machine Learning techniques, and has managed to reduce the computational and time costs in comparison with the traditional thermodynamical analysis. Until now, this challenge has been tackled from a batch learning perspective, in which data are assumed to be at rest, and where models do not continuously integrate new information into already constructed models. We present an approach closer to the Big Data and Internet of Things paradigms, in which data are continuously arriving and models learn incrementally, achieving significant enhancements in terms of data processing (time, memory and computational costs) while obtaining competitive performance. This work compares and examines the hourly electrical power prediction of several streaming regressors, and discusses the best technique, in terms of processing time and predictive performance, to be applied in this streaming scenario.
This work has been partially supported by the EU project iDev40. This project has received funding from the ECSEL Joint Undertaking (JU) under grant agreement No 783163. The JU receives support from the European Union’s Horizon 2020 research and innovation programme and Austria, Germany, Belgium, Italy, Spain, and Romania. It has also been supported by the Basque Government (Spain) through the project VIRTUAL (KK-2018/00096), and by the Ministerio de Economía y Competitividad of Spain (Grant Ref. TIN2017-85887-C2-2-P).
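The batch-versus-stream distinction above can be illustrated with an online linear regressor updated one sample at a time, so the model never holds the full dataset in memory. The feature relationship and all constants below are synthetic assumptions for illustration, not the plant data or the regressors evaluated in the paper.

```python
# Illustrative stream-learning sketch: online linear regression via SGD,
# updated one observation at a time. The synthetic data generator below
# (power falls with temperature) is our assumption, not the paper's dataset.
import random

random.seed(0)

weights = [0.0, 0.0, 0.0]  # temperature, pressure, humidity (standardised)
bias = 0.0
LR = 0.01                  # learning rate

def predict(x):
    return sum(w * xi for w, xi in zip(weights, x)) + bias

def learn_one(x, y):
    """One SGD step on the squared error: the model learns as data arrives."""
    global bias
    err = predict(x) - y
    for i, xi in enumerate(x):
        weights[i] -= LR * err * xi
    bias -= LR * err

# Simulated hourly stream with a hypothetical linear power relationship.
for _ in range(5000):
    t = random.uniform(-1, 1)
    p = random.uniform(-1, 1)
    h = random.uniform(-1, 1)
    power = 450 - 10 * t + 3 * p   # humidity deliberately irrelevant here
    learn_one([t, p, h], power)

print(round(weights[0]))  # should approach the true temperature coefficient, -10
```

Each update costs constant time and memory regardless of how much data has already streamed past, which is the practical advantage the abstract claims over batch retraining.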
Streaming visualisation of quantitative mass spectrometry data based on a novel raw signal decomposition method
As data rates rise, there is a danger that informatics for high-throughput LC-MS becomes more opaque and inaccessible to practitioners. It is therefore critical that efficient visualisation tools are available to facilitate quality control, verification, validation, interpretation, and sharing of raw MS data and the results of MS analyses. Currently, MS data is stored as contiguous spectra. Recall of individual spectra is quick, but panoramas, zooming and panning across whole datasets necessitate processing/memory overheads impractical for interactive use. Moreover, visualisation is challenging if significant quantification data is missing due to data-dependent acquisition of MS/MS spectra. In order to tackle these issues, we leverage our seaMass technique for novel signal decomposition. LC-MS data is modelled as a 2D surface through selection of a sparse set of weighted B-spline basis functions from an over-complete dictionary. By ordering and spatially partitioning the weights with an R-tree data model, efficient streaming visualisations are achieved. In this paper, we describe the core MS1 visualisation engine and the overlay of MS/MS annotations. This enables the mass spectrometrist to quickly inspect whole runs for ionisation/chromatographic issues, MS/MS precursors for coverage problems, or putative biomarkers for interferences, for example. The open-source software is available from http://seamass.net/viz/
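The sparse-decomposition idea can be shown with a toy one-dimensional version. This is our own simplification of the concept, not seaMass itself: a signal is stored as a few (centre, weight) basis functions instead of dense spectra, and keeping the centres sorted lets a viewer evaluate any point by touching only the bases whose support overlaps it (a 1D stand-in for the paper's R-tree range queries).

```python
# Toy sketch of sparse weighted-basis signal storage with range queries.
# A simplification of the decomposition idea; not the seaMass implementation.
import bisect

def hat(x: float) -> float:
    """Linear B-spline (order 2) with support (-1, 1)."""
    return max(0.0, 1.0 - abs(x))

# Sparse model: sorted basis centres with weights (unit spacing in m/z).
centres = [100.0, 101.0, 102.0, 500.0, 501.0]
weights = [2.0, 5.0, 1.0, 4.0, 4.0]

def evaluate(mz: float) -> float:
    """Sum only the bases whose support contains mz (binary-search range query)."""
    lo = bisect.bisect_left(centres, mz - 1.0)
    hi = bisect.bisect_right(centres, mz + 1.0)
    return sum(weights[i] * hat(mz - centres[i]) for i in range(lo, hi))

print(evaluate(100.5))  # blend of the bases at 100.0 and 101.0
```

Because evaluation cost depends on how many bases overlap the query window rather than on the dataset size, panning and zooming stay cheap even for very large runs, which is the streaming-visualisation property the abstract describes.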
Energy-Efficient Transmission Scheduling with Strict Underflow Constraints
We consider a single source transmitting data to one or more receivers/users
over a shared wireless channel. Due to random fading, the wireless channel
conditions vary with time and from user to user. Each user has a buffer to
store received packets before they are drained. At each time step, the source
determines how much power to use for transmission to each user. The source's
objective is to allocate power in a manner that minimizes an expected cost
measure, while satisfying strict buffer underflow constraints and a total power
constraint in each slot. The expected cost measure is composed of costs
associated with power consumption from transmission and packet holding costs.
The primary application motivating this problem is wireless media streaming.
For this application, the buffer underflow constraints prevent the user buffers
from emptying, so as to maintain playout quality. In the case of a single user
with linear power-rate curves, we show that a modified base-stock policy is
optimal under the finite horizon, infinite horizon discounted, and infinite
horizon average expected cost criteria. For a single user with piecewise-linear
convex power-rate curves, we show that a finite generalized base-stock policy
is optimal under all three expected cost criteria. We also present the
sequences of critical numbers that complete the characterization of the optimal
control laws in each of these cases when some additional technical conditions
are satisfied. We then analyze the structure of the optimal policy for the case
of two users. We conclude with a discussion of methods to identify
implementable near-optimal policies for the most general case of M users.
Comment: 109 pages, 11 pdf figures, template.tex is main file. We have significantly revised the paper from version 1. Additions include the case of a single receiver with piecewise-linear convex power-rate curves, the case of two receivers, and the infinite horizon average expected cost problem.
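The single-user, linear power-rate case above admits a simple worked illustration of a base-stock policy: at each slot, transmit just enough to bring the buffer back up to a critical level, so playout never underflows. The constants below are made-up assumptions for illustration; the paper derives the critical numbers from the cost structure, and its "modified" policy also respects the per-slot power constraint.

```python
# Illustrative base-stock transmission sketch for one receiver with a linear
# power-rate curve. All constants are assumptions, not the paper's values.
DRAIN_PER_SLOT = 2      # packets consumed by playout each slot
BASE_STOCK = 10         # critical buffer level ("order up to" target)
POWER_PER_PACKET = 0.5  # linear power-rate curve: power = 0.5 * packets sent

def slot(buffer: int):
    """One time slot: top the buffer up to BASE_STOCK, then drain for playout."""
    send = max(0, BASE_STOCK - buffer)       # base-stock decision rule
    power = POWER_PER_PACKET * send
    buffer = buffer + send - DRAIN_PER_SLOT  # stays at BASE_STOCK - drain
    return buffer, power

buf, total_power = 4, 0.0
for _ in range(3):
    buf, p = slot(buf)
    total_power += p
print(buf, total_power)
```

After the first catch-up slot the buffer settles into a steady state, so transmission power becomes constant per slot: the policy trades a brief power burst for a guarantee that the playout buffer never empties.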