
    Perceived synchronization of mulsemedia services

    Multimedia synchronization involves the temporal relationship between audio and visual media components. The presentation of "in-sync" data streams is essential to achieve a natural impression, as "out-of-sync" effects are often associated with a decrease in user quality of experience (QoE). Recently, multi-sensory media (mulsemedia) has been shown to provide a highly immersive experience for its users. Unlike traditional multimedia, mulsemedia consists of other media types (i.e., haptic, olfaction, taste, etc.) in addition to audio and visual content. Achieving high-quality mulsemedia transmission therefore requires presenting little or no synchronization error between the multiple media components. To achieve this ideal synchronization, comprehensive knowledge of the synchronization requirements at the user interface is needed. This paper presents the results of a subjective study carried out to explore the temporal boundaries within which haptic and air-flow media objects can be successfully synchronized with video media. Results show that skews between sensorial media and multimedia may still give the impression that the mulsemedia sequence is "in-sync", and they identify constraints under which synchronization errors can be tolerated. The outcomes of the paper are used to provide recommendations for mulsemedia service providers so that their services are associated with acceptable user experience levels, e.g., haptic media could be presented with a delay of up to 1 s behind video content, while air-flow media could be released either 5 s ahead of or 3 s behind video content.
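    The recommended tolerance windows can be captured in a simple skew check. The sketch below is illustrative only: the haptic window of (0 s, 1 s) and air-flow window of (-5 s, 3 s) follow the recommendations quoted above (the lower haptic bound is an assumption, since only the lag tolerance is stated), and the function and field names are hypothetical.

    # Tolerated skew windows in seconds, expressed as (earliest, latest) offsets
    # of the sensorial effect relative to the video instant it accompanies.
    # Negative = effect ahead of video, positive = effect behind video.
    SKEW_TOLERANCE = {
        "haptic": (0.0, 1.0),    # haptic may lag video by up to 1 s (lower bound assumed)
        "airflow": (-5.0, 3.0),  # air-flow may lead by 5 s or lag by 3 s
    }

    def is_in_sync(effect_type: str, effect_time: float, video_time: float) -> bool:
        """Return True if the effect's presentation time falls inside the
        tolerated skew window relative to the associated video timestamp."""
        low, high = SKEW_TOLERANCE[effect_type]
        skew = effect_time - video_time
        return low <= skew <= high

    # Example: an air-flow effect released 2 s before its video cue is tolerated.
    assert is_in_sync("airflow", effect_time=10.0, video_time=12.0)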

    Beyond multimedia adaptation: Quality of experience-aware multi-sensorial media delivery

    Multiple sensorial media (mulsemedia) combines multiple media elements which engage three or more human senses and, like most other media content, requires support for delivery over existing networks. This paper proposes an adaptive mulsemedia framework (ADAMS) for delivering scalable video and sensorial data to users. Unlike existing two-dimensional joint source-channel adaptation solutions for video streaming, the ADAMS framework includes three joint adaptation dimensions: video source, sensorial source, and network optimization. Using an MPEG-7 description scheme, ADAMS recommends the integration of multiple sensorial effects (i.e., haptic, olfaction, air motion, etc.) as metadata into multimedia streams. The ADAMS design includes both coarse- and fine-grained adaptation modules on the server side: mulsemedia flow adaptation and packet priority scheduling. Feedback from subjective quality evaluation and network conditions is used to develop the two modules. The subjective evaluation investigated users' enjoyment levels when exposed to mulsemedia and multimedia sequences, respectively, and users' preference levels for certain sensorial effects in the context of mulsemedia sequences with video components at different quality levels. Results of the subjective study inform guidelines for an adaptive strategy that selects the optimal combination of video segments and sensorial data for a given bandwidth constraint and user requirement. User perceptual tests show how ADAMS outperforms existing multimedia delivery solutions in terms of both user-perceived quality and user enjoyment during adaptive streaming of various mulsemedia content. In doing so, it highlights the case for tailored, adaptive mulsemedia delivery over traditional multimedia adaptive transport mechanisms.
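    As a rough illustration of the kind of coarse-grained decision such a flow-adaptation module has to make, the sketch below picks the highest-utility combination of one video quality layer plus a subset of sensorial streams that fits an available bandwidth budget. The bitrates, utility scores, and names are invented for illustration and are not values or logic taken from the paper.

    from itertools import combinations

    VIDEO_LAYERS = [  # (name, bitrate_kbps, utility) - illustrative values only
        ("video_low", 800, 2.0),
        ("video_med", 2000, 3.5),
        ("video_high", 4500, 4.5),
    ]

    SENSORIAL_STREAMS = [  # (name, bitrate_kbps, utility) - illustrative values only
        ("haptic", 64, 0.8),
        ("airflow", 16, 0.5),
        ("olfaction", 8, 0.4),
    ]

    def select_combination(bandwidth_kbps: float):
        """Exhaustively choose the video layer plus sensorial subset with the
        highest total utility that fits within the bandwidth budget."""
        best, best_utility = None, -1.0
        for v_name, v_rate, v_util in VIDEO_LAYERS:
            for k in range(len(SENSORIAL_STREAMS) + 1):
                for subset in combinations(SENSORIAL_STREAMS, k):
                    rate = v_rate + sum(s[1] for s in subset)
                    util = v_util + sum(s[2] for s in subset)
                    if rate <= bandwidth_kbps and util > best_utility:
                        best, best_utility = (v_name, [s[0] for s in subset]), util
        return best

    # Example: at 2100 kbps the medium video layer plus all sensorial streams fits.
    print(select_combination(2100))  # ('video_med', ['haptic', 'airflow', 'olfaction'])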

    User quality of experience of mulsemedia applications

    User Quality of Experience (QoE) is of fundamental importance in multimedia applications and has been extensively studied for decades. However, user QoE in the context of the emerging multiple-sensorial media (mulsemedia) services, which involve different media components than traditional multimedia applications, has not been comprehensively studied. This article presents the results of subjective tests which investigated user perception of mulsemedia content. In particular, the impact of the intensity of certain mulsemedia components, including haptic and airflow, on user-perceived experience is studied. Results demonstrate that by making use of mulsemedia, overall user enjoyment levels increased by up to 77%.

    User perception of media content association in olfaction-enhanced multimedia

    Olfaction is an exciting challenge facing multimedia applications. In this article we investigate user perception of the association between olfactory media content and video media content in olfaction-enhanced multimedia. Results show that the association between scent and content has a significant impact on the user-perceived experience of olfaction-enhanced multimedia.

    MediaSync: Handbook on Multimedia Synchronization

    This book provides an approachable overview of the most recent advances in the fascinating field of media synchronization (mediasync), gathering contributions from the most representative and influential experts. Understanding the challenges of this field in the current multi-sensory, multi-device, and multi-protocol world is not an easy task. The book revisits the foundations of mediasync, including theoretical frameworks and models, highlights ongoing research efforts, such as hybrid broadband broadcast (HBB) delivery and users' perception modeling (i.e., Quality of Experience or QoE), and paves the way for the future (e.g., towards the deployment of multi-sensory and ultra-realistic experiences). Although many advances around mediasync have been devised and deployed, this area of research is receiving renewed attention to overcome the remaining challenges in the next-generation (heterogeneous and ubiquitous) media ecosystem. Given the significant advances in this research area, its current relevance, and the multiple disciplines it involves, a reference book on mediasync has become necessary, and this book fills that gap. In particular, it addresses key aspects and reviews the most relevant contributions within the mediasync research space from different perspectives. MediaSync: Handbook on Multimedia Synchronization is the perfect companion for scholars and practitioners who want to acquire strong knowledge about this research area, and who want to approach the challenges behind ensuring the best mediated experiences by providing adequate synchronization between the media elements that constitute those experiences.

    The art of digital scent - people, space and time

    The sense of smell is closely tied to people across time and space. The aesthetic, affective, and evocative aspects of smell are widely portrayed in art practices. Olfactory art has a unique expression that other modalities hardly have. Yet this aesthetic medium seems to be underestimated when it comes to the digital age. Current digital olfaction research mainly focuses on meeting tasks and solving problems; the aesthetic experience and the meaning behind it are seldom discussed. This paper proposes a potential area in which people from digital art and digital olfaction can contribute together. The paper first gives an overview of the affective and evocative impacts of scent on people across time and space, reviews how scent is treated as an aesthetic medium in art, then examines current usages of olfactory displays, and lastly discusses the opportunities lying ahead. It provides a way to appreciate the world aesthetically through this affective and evocative medium.

    The influence of human factors on 360° mulsemedia QoE

    Quality of Experience (QoE) is indelibly linked to the human side of the multimedia experience. Surprisingly, however, there is a paucity of research exploring the impact that human factors have in determining QoE. Whilst this is true of multimedia, it is even more starkly so for mulsemedia - applications that involve media engaging three or more human senses. Hence, in the study reported in this paper, we focus on an exciting subset of mulsemedia applications - 360° mulsemedia - particularly important given that the upcoming 5G technology is foreseen to be a key enabler for the proliferation of immersive Virtual Reality (VR) applications. Accordingly, we study the impact that human factors such as gender, age, prior computing experience, and smell sensitivity have on 360° mulsemedia QoE. Results gave insight into the potential of 360° mulsemedia to inspire and enrich experiences for Generation Z - a generation empowered by rapidly advancing technology. Patterns of prior media usage and smell sensitivity also play an important role in influencing the QoE evaluation: users who have a preference for dynamic videos enjoy 360° mulsemedia experiences and find them realistic.

    Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios

    Traditionally, TV media content has exclusively involved 2D or 3D audiovisual streams consumed by using a simple TV device. However, in order to generate more immersive media consumption experiences, other new types of content (e.g., omnidirectional video), consumption devices (e.g., Head Mounted Displays or HMDs) and solutions that stimulate senses beyond the traditional ones of sight and hearing can be used. Multi-sensorial media content (a.k.a. mulsemedia) facilitates additional sensory effects that stimulate other senses during media consumption, with the aim of providing consumers with a more immersive and realistic experience. Besides a greater degree of realism and immersion, such effects can also provide greater social integration (e.g., for people with AV deficiencies or attention span problems) and even contribute to creating better educational programmes (e.g., for learning through the senses in educational content or scientific dissemination). Examples of sensory effects that can be used are olfactory effects (scents), tactile effects (e.g., vibration, wind or pressure effects), and ambient effects (e.g., temperature or lighting). In this paper, a solution for providing multi-sensorial and immersive hybrid (broadcast/broadband) TV content consumption experiences, including omnidirectional video and sensory effects, is presented. It has been designed, implemented, and subjectively evaluated (by 32 participants) in an end-to-end platform for hybrid content generation, delivery and synchronised consumption. The satisfactory results obtained regarding the perception of fine synchronisation between sensory effects and multimedia content, and regarding the users' perceived QoE, are summarised and discussed. This work was supported in part by the "Vicerrectorado de Investigacion de la Universitat Politecnica de Valencia" under Project PAID-11-21 and Project PAID-12-21.
    Marfil, D.; Boronat, F.; González-Salinas, J.; Sapena Piera, A. (2022). Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios. IEEE Access, 10, 79071-79089. https://doi.org/10.1109/ACCESS.2022.3194170
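    On the consumption side, keeping sensory effects aligned with the media timeline essentially means firing each effect command when the playout clock reaches its timestamp. The sketch below is a minimal, assumed player-side scheduler for that idea, not the authors' implementation; the effect metadata and the render callback are hypothetical placeholders.

    import heapq
    from itertools import count

    class EffectScheduler:
        def __init__(self):
            self._queue = []     # min-heap of (trigger_time_s, seq, effect)
            self._seq = count()  # tiebreaker so equal trigger times never compare dicts

        def add_effect(self, trigger_time_s: float, effect: dict):
            """Register an effect (e.g. {'type': 'wind', 'intensity': 0.6})
            to be rendered at the given media time."""
            heapq.heappush(self._queue, (trigger_time_s, next(self._seq), effect))

        def on_playout_tick(self, media_time_s: float, render):
            """Call on every playout clock update; fires all effects whose
            trigger time has been reached, keeping them aligned with the video."""
            while self._queue and self._queue[0][0] <= media_time_s:
                _, _, effect = heapq.heappop(self._queue)
                render(effect)  # hand off to the actuator driver

    # Usage: trigger a wind effect 12.5 s into the omnidirectional video.
    scheduler = EffectScheduler()
    scheduler.add_effect(12.5, {"type": "wind", "intensity": 0.6})
    scheduler.on_playout_tick(12.6, render=print)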