
    Beyond multimedia adaptation: Quality of experience-aware multi-sensorial media delivery

    Multiple sensorial media (mulsemedia) combines multiple media elements that engage three or more of the human senses and, like most other media content, requires support for delivery over existing networks. This paper proposes an adaptive mulsemedia framework (ADAMS) for delivering scalable video and sensorial data to users. Unlike existing two-dimensional joint source-channel adaptation solutions for video streaming, the ADAMS framework includes three joint adaptation dimensions: video source, sensorial source, and network optimization. Using an MPEG-7 description scheme, ADAMS recommends the integration of multiple sensorial effects (e.g., haptic, olfaction, air motion) as metadata into multimedia streams. The ADAMS design includes both coarse- and fine-grained adaptation modules on the server side: mulsemedia flow adaptation and packet priority scheduling. Feedback from subjective quality evaluation and network conditions is used to develop the two modules. The subjective evaluation investigated users' enjoyment levels when exposed to mulsemedia and multimedia sequences, respectively, and studied users' preference levels for individual sensorial effects in the context of mulsemedia sequences with video components at different quality levels. Results of the subjective study inform guidelines for an adaptive strategy that selects the optimal combination of video segments and sensorial data for a given bandwidth constraint and user requirement. User perceptual tests show how ADAMS outperforms existing multimedia delivery solutions in terms of both user-perceived quality and user enjoyment during adaptive streaming of various mulsemedia content. In doing so, it highlights the case for tailored, adaptive mulsemedia delivery over traditional multimedia adaptive transport mechanisms.
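    The abstract does not detail the adaptive strategy itself, so the following is a minimal, hypothetical sketch of the kind of bandwidth-constrained selection such a server-side module might perform: enumerate candidate video quality layers and subsets of sensorial streams, and keep the highest-utility combination that fits the bandwidth budget. The stream names, bitrates, and utility scores are illustrative assumptions, not values from the ADAMS paper.

```python
from itertools import combinations

# Candidate video quality layers: (label, bitrate in kbps, utility score) -- assumed values
VIDEO_LAYERS = [("low", 800, 2.0), ("medium", 1500, 3.5), ("high", 3000, 4.5)]

# Optional sensorial streams: (label, bitrate in kbps, utility score) -- assumed values
SENSORIAL = [("haptic", 50, 1.2), ("air-flow", 20, 0.8), ("olfaction", 10, 0.5)]


def select_mulsemedia(bandwidth_kbps):
    """Return the (utility, video layer, sensorial subset, total rate) combination
    with the highest total utility that fits within the bandwidth budget."""
    best = None
    for v_label, v_rate, v_util in VIDEO_LAYERS:
        for k in range(len(SENSORIAL) + 1):
            for subset in combinations(SENSORIAL, k):
                rate = v_rate + sum(s[1] for s in subset)
                if rate > bandwidth_kbps:
                    continue  # combination does not fit the budget
                util = v_util + sum(s[2] for s in subset)
                if best is None or util > best[0]:
                    best = (util, v_label, [s[0] for s in subset], rate)
    return best


if __name__ == "__main__":
    # At ~1.6 Mbps the medium video layer plus all three effects fits the budget.
    print(select_mulsemedia(1600))
```

    With only a handful of layers and effects, brute-force enumeration is adequate; a real scheduler would re-run such a decision as bandwidth and quality feedback arrive.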

    Mulsemedia: State of the art, perspectives, and challenges

    Mulsemedia (multiple sensorial media) captures a wide variety of research efforts and applications. This article presents a historic perspective on mulsemedia work and reviews current developments in the area. These take place across the traditional multimedia spectrum, from virtual reality applications to computer games, as well as in efforts in the arts, gastronomy, and therapy, to mention a few. We also describe standardization efforts, via the MPEG-V standard, and identify future developments and exciting challenges the community needs to overcome.

    Quality of experience study for multiple sensorial media delivery

    Traditional video sequences make use of both visual images and audio tracks, which are perceived by the human eyes and ears, respectively. In order to present a better ultra-realistic virtual experience, more comprehensive human sensations (e.g. olfaction, haptic, gustatory) need to be exploited. In this paper, a multiple sensorial media (mulsemedia) delivery system is introduced to deliver multimedia sequences integrated with multiple media components that engage three or more of the human senses, such as sight, hearing, olfaction, haptics, and gustation. Three sensorial effects (i.e. haptic, olfaction, and air-flow) are selected for the purpose of demonstration. A subjective test is conducted to analyze the user-perceived quality of experience of the mulsemedia service. It is concluded that the mulsemedia sequences can partly mask decreased movie quality. Additionally, the most preferred sensorial effect is haptic, followed by air-flow and olfaction. This work was supported in part by the Enterprise Ireland Innovation Partnership programme.

    The sweet smell of success: Enhancing multimedia applications with olfaction

    Olfaction, or smell, is one of the last challenges which multimedia applications have to conquer. As far as computerized smell is concerned, there are several difficulties to overcome, particularly those associated with the ambient nature of smell. In this article, we present results from an empirical study exploring users' perception of olfaction-enhanced multimedia displays. Findings show that olfaction significantly adds to the user's multimedia experience. Moreover, the use of olfaction leads to an increased sense of reality and relevance. Our results also show that users are tolerant of the interference and distortion effects caused by olfactory effects in multimedia.

    MediaSync: Handbook on Multimedia Synchronization

    This book provides an approachable overview of the most recent advances in the fascinating field of media synchronization (mediasync), gathering contributions from the most representative and influential experts. Understanding the challenges of this field in the current multi-sensory, multi-device, and multi-protocol world is not an easy task. The book revisits the foundations of mediasync, including theoretical frameworks and models, highlights ongoing research efforts, like hybrid broadband broadcast (HBB) delivery and users' perception modeling (i.e., Quality of Experience or QoE), and paves the way for the future (e.g., towards the deployment of multi-sensory and ultra-realistic experiences). Although many advances around mediasync have been devised and deployed, this area of research is getting renewed attention to overcome remaining challenges in the next-generation (heterogeneous and ubiquitous) media ecosystem. Given the significant advances in this research area, its current relevance, and the multiple disciplines it involves, the availability of a reference book on mediasync becomes necessary. This book fills that gap. In particular, it addresses key aspects and reviews the most relevant contributions within the mediasync research space, from different perspectives. MediaSync: Handbook on Multimedia Synchronization is the perfect companion for scholars and practitioners who want to acquire strong knowledge about this research area, and also approach the challenges of ensuring the best mediated experiences by providing adequate synchronization between the media elements that constitute those experiences.

    360° mulsemedia experience over next generation wireless networks - a reinforcement learning approach

    The next generation of wireless networks targets ambitious key performance indicators, like very low latency, higher data rates, and more capacity, paving the way for new generations of video streaming technologies, such as 360° or omnidirectional videos. One possible application that could revolutionize streaming technology is 360° MULtiple SEnsorial MEDIA (MULSEMEDIA), which enriches the 360° video content with other media objects like olfactory, haptic, or even thermoceptive ones. However, the adoption of 360° Mulsemedia applications might be hindered by their strict Quality of Service (QoS) requirements, like very large bandwidth and low latency for fast responsiveness to the users' inputs, which could impact their Quality of Experience (QoE). To this end, this paper introduces the new concept of 360° Mulsemedia and proposes the use of Reinforcement Learning to enable QoS provisioning over next generation wireless networks, which in turn influences the QoE of end-users.
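    The abstract does not spell out the learning formulation, so below is a minimal, hypothetical Q-learning sketch of how an agent could learn a bandwidth-share policy whose reward serves as a crude QoE proxy. The state space, action set, reward, and environment dynamics are illustrative assumptions, not the method from the paper.

```python
import random

STATES = range(5)                   # discretized network load levels (0 = idle .. 4 = congested)
ACTIONS = [0.25, 0.5, 0.75, 1.0]    # fraction of available bandwidth granted to the mulsemedia flow

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}


def reward(state, action):
    """Toy QoE proxy: more bandwidth helps, but over-allocating under congestion is penalized."""
    return action * 4.0 - state * action * 1.5


def step(state, action):
    """Toy environment transition: network load drifts randomly between levels."""
    return max(0, min(4, state + random.choice([-1, 0, 1])))


state = 2
for _ in range(10_000):
    # Epsilon-greedy action selection
    if random.random() < EPSILON:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    r = reward(state, action)
    nxt = step(state, action)
    best_next = max(Q[(nxt, a)] for a in ACTIONS)
    # Standard Q-learning update
    Q[(state, action)] += ALPHA * (r + GAMMA * best_next - Q[(state, action)])
    state = nxt

# Learned policy: preferred bandwidth share per load level
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES})
```

    Under these assumptions the agent learns to claim the full share when the network is lightly loaded and to back off under congestion; a real 360° mulsemedia controller would use richer state (latency, buffer levels, viewport data) and a QoE-driven reward.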