994 research outputs found

    Mulsemedia: State of the art, perspectives, and challenges

    Mulsemedia (multiple sensorial media) captures a wide variety of research efforts and applications. This article presents a historic perspective on mulsemedia work and reviews current developments in the area. These take place across the traditional multimedia spectrum, from virtual reality applications to computer games, as well as in efforts in the arts, gastronomy, and therapy, to mention a few. We also describe standardization efforts, via the MPEG-V standard, and identify future developments and exciting challenges the community needs to overcome.
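
    The MPEG-V standard mentioned above defines metadata for describing sensory effects alongside audiovisual content. As a rough illustration of the idea only (not the normative MPEG-V syntax), the sketch below models a simplified sensory-effect track and looks up which effects are active at a given media time; all field names, effect types and timing values are assumptions.

```python
# Illustrative sketch: a simplified, non-normative representation of a
# sensory-effect track of the kind MPEG-V standardizes (wind, light,
# vibration, scent). Field names and values are assumptions.
from dataclasses import dataclass

@dataclass
class SensoryEffect:
    kind: str          # e.g. "wind", "light", "vibration", "scent"
    start_ms: int      # effect start relative to the media timeline
    duration_ms: int   # how long the effect is rendered
    intensity: float   # normalized 0.0-1.0 intensity

# A toy effect track authored against a video timeline.
effect_track = [
    SensoryEffect("light",     0,    5000, 0.6),
    SensoryEffect("wind",      2000, 3000, 0.8),
    SensoryEffect("vibration", 4500, 500,  1.0),
]

def active_effects(track, playback_ms):
    """Return the effects that should be rendered at the given media time."""
    return [e for e in track
            if e.start_ms <= playback_ms < e.start_ms + e.duration_ms]

print(active_effects(effect_track, 2500))  # light + wind are active at 2.5 s
```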

    Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios

    [EN] Traditionally, TV media content has exclusively involved 2D or 3D audiovisual streams consumed on a simple TV device. However, in order to generate more immersive media consumption experiences, other new types of content (e.g., omnidirectional video), consumption devices (e.g., Head Mounted Displays, or HMDs) and solutions that stimulate senses beyond the traditional ones of sight and hearing can be used. Multi-sensorial media content (a.k.a. mulsemedia) adds sensory effects that stimulate other senses during media consumption, with the aim of providing consumers with a more immersive and realistic experience. Beyond a greater degree of realism and immersion, mulsemedia can also foster social integration (e.g., for people with audiovisual deficiencies or attention span problems) and even contribute to better educational programs (e.g., learning through the senses in educational content or scientific outreach). Examples of sensory effects that can be used are olfactory effects (scents), tactile effects (e.g., vibration, wind or pressure effects), and ambient effects (e.g., temperature or lighting). In this paper, a solution for providing multi-sensorial and immersive hybrid (broadcast/broadband) TV content consumption experiences, including omnidirectional video and sensory effects, is presented. It has been designed, implemented, and subjectively evaluated (by 32 participants) in an end-to-end platform for hybrid content generation, delivery and synchronised consumption. The satisfactory results obtained regarding the perception of fine synchronisation between sensory effects and multimedia content, and regarding the users' perceived QoE, are summarised and discussed.
    This work was supported in part by the "Vicerrectorado de Investigacion de la Universitat Politecnica de Valencia" under Project PAID-11-21 and Project PAID-12-21.
    Marfil, D.; Boronat, F.; González-Salinas, J.; Sapena Piera, A. (2022). Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios. IEEE Access, 10:79071-79089. https://doi.org/10.1109/ACCESS.2022.3194170
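
    The abstract emphasises fine synchronisation between sensory effects and the media content. The sketch below shows one minimal way such triggering could work, assuming a hypothetical asynchrony threshold and simple effect records; the paper's actual end-to-end platform, protocols and devices are not reproduced here.

```python
# Minimal sketch of fine-grained synchronisation between sensory effects and
# media playback. The threshold value is a hypothetical assumption; the
# paper's actual platform and signalling differ.
SYNC_THRESHOLD_MS = 100  # assumed maximum tolerable effect/AV asynchrony (ms)

def due_effects(effect_track, media_clock_ms, already_fired):
    """Return indices of effects whose scheduled start matches the current
    media clock within the threshold and that have not fired yet."""
    due = []
    for idx, effect in enumerate(effect_track):
        if idx in already_fired:
            continue
        skew = media_clock_ms - effect["start_ms"]
        if 0 <= skew <= SYNC_THRESHOLD_MS:
            due.append(idx)
    return due

# Example: a wind effect scheduled at 2.0 s fires when the clock reads 2.05 s.
track = [{"kind": "wind", "start_ms": 2000}]
fired = set()
print(due_effects(track, 2050, fired))  # -> [0]
```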

    Integration of Multisensorial Stimuli and Multimodal Interaction in a Hybrid 3DTV System

    This article proposes the integration of multisensorial stimuli and multimodal interaction components into a sports multimedia asset under two dimensions: immersion and interaction. The first dimension comprises a binaural audio system and a set of sensory effects synchronized with the audiovisual content, whereas the second explores interaction through the insertion of interactive 3D objects into the main screen and the on-demand presentation of additional information on a second touchscreen. We present an end-to-end solution integrating these components into a hybrid (internet-broadcast) television system using current 3DTV standards. Results from an experimental study analyzing the perceived quality of these stimuli and their influence on the Quality of Experience are presented.
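
    For the interaction dimension, the second touchscreen has to stay aligned with the main screen's playback. The sketch below is a hedged illustration, assuming a hypothetical periodic sync message from the TV, of how a companion device could estimate the main screen's media time; the system's actual hybrid-TV signalling is not shown.

```python
# Hypothetical sketch of second-screen alignment: the companion touchscreen
# maps the main screen's reported media time onto its own clock so on-demand
# extra information appears at the right moment. Message shape is assumed.
import time

class CompanionClock:
    def __init__(self):
        self.offset_s = 0.0  # main-screen media time minus local clock

    def on_sync_message(self, media_time_s, local_receive_time_s):
        """Update the offset from a periodic sync report sent by the TV."""
        self.offset_s = media_time_s - local_receive_time_s

    def media_time_now(self):
        """Estimate the current main-screen media time on the companion."""
        return time.monotonic() + self.offset_s

clock = CompanionClock()
clock.on_sync_message(media_time_s=120.0, local_receive_time_s=time.monotonic())
print(round(clock.media_time_now(), 1))  # ~120.0 right after the sync message
```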

    Mulsemedia Communication Research Challenges for Metaverse in 6G Wireless Systems

    Although humans have five basic senses (sight, hearing, touch, smell, and taste), most current multimedia systems only capture two of them, namely sight and hearing. With the development of the metaverse and related technologies, there is a growing need for a more immersive media format that leverages all human senses. Multisensory media (mulsemedia) that can stimulate multiple senses will play a critical role in the near future. This paper provides an overview of the history, background, use cases, existing research, devices, and standards of mulsemedia. Emerging mulsemedia technologies such as Extended Reality (XR) and Holographic-Type Communication (HTC) are introduced. Additionally, the challenges in mulsemedia research from the perspective of wireless communication and networking are discussed. The potential of 6G wireless systems to address these challenges is highlighted, and several research directions that can advance mulsemedia communications are identified.

    MediaSync: Handbook on Multimedia Synchronization

    This book provides an approachable overview of the most recent advances in the fascinating field of media synchronization (mediasync), gathering contributions from the most representative and influential experts. Understanding the challenges of this field in the current multi-sensory, multi-device, and multi-protocol world is not an easy task. The book revisits the foundations of mediasync, including theoretical frameworks and models, highlights ongoing research efforts, like hybrid broadband broadcast (HBB) delivery and users' perception modeling (i.e., Quality of Experience or QoE), and paves the way for the future (e.g., towards the deployment of multi-sensory and ultra-realistic experiences). Although many advances around mediasync have been devised and deployed, this area of research is getting renewed attention to overcome remaining challenges in the next-generation (heterogeneous and ubiquitous) media ecosystem. Given the significant advances in this research area, its current relevance and the multiple disciplines it involves, a reference book on mediasync has become necessary. This book fills that gap. In particular, it addresses key aspects and reviews the most relevant contributions within the mediasync research space, from different perspectives. MediaSync: Handbook on Multimedia Synchronization is the perfect companion for scholars and practitioners who want to acquire strong knowledge about this research area, and also to approach the challenges behind ensuring the best mediated experiences, by providing adequate synchronization between the media elements that constitute these experiences.

    A QoE Model for Mulsemedia TV in a Smart Home Environment

    The provision of realistic media contents to users is one of the main goals of future media services. The sense of reality perceived by the user can be enhanced by adding various sensorial effects to the conventional audio-visual content, through the stimulation of the five senses (sight, hearing, touch, smell and taste), the so-called multi-sensorial media (mulsemedia). To deliver the additional effects within a smart home (SH) environment, custom devices (e.g., air conditioning, lights) that provide suitable smart features are preferred to ad-hoc devices, which are often deployed in a specific context such as gaming consoles. In the present study, a prototype mulsemedia TV application, implemented in a real smart home scenario, allowed the authors to assess the user's Quality of Experience (QoE) through a test measurement campaign. The impact of specific sensory effects (i.e., light, airflow, vibration) on the user experience, regarding the enhancement of the sense of reality and the annoyance and intensity of the effects, was investigated through subjective assessment. The need for multi-sensorial QoE models is an important challenge for future research in this field, considering the time and cost of subjective quality assessments. Therefore, based on the subjective assessment results, this paper instantiates and validates a parametric QoE model for multi-sensorial TV in a SH scenario, which indicates the relationship between the quality of audiovisual contents and the user-perceived QoE for sensory effect applications.
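
    The abstract does not give the model's functional form or fitted coefficients, so the sketch below only illustrates the general shape such a parametric QoE model can take: a bounded combination of audiovisual quality and sensory-effect intensities. The formula and all weights are assumptions, not the paper's validated model.

```python
# Hypothetical illustration of a parametric QoE model for mulsemedia TV.
# The functional form and all coefficients below are assumptions; the paper's
# validated model and its fitted parameters are not reproduced here.
def mulsemedia_qoe(av_mos, light=0.0, airflow=0.0, vibration=0.0):
    """Predict overall QoE (1-5 MOS scale) from audiovisual quality and the
    normalized (0-1) intensities of light, airflow and vibration effects."""
    w_light, w_air, w_vib = 0.3, 0.4, 0.2   # made-up effect weights
    effect_boost = w_light * light + w_air * airflow + w_vib * vibration
    qoe = av_mos + effect_boost * (5.0 - av_mos) / 5.0  # diminishing returns
    return max(1.0, min(5.0, qoe))

print(mulsemedia_qoe(av_mos=3.8, light=0.5, airflow=1.0, vibration=0.2))
```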

    Personalization in object-based audio for accessibility: a review of advancements for hearing impaired listeners

    Hearing loss is widespread and significantly impacts an individual’s ability to engage with broadcast media. Access can be improved through new object-based audio personalization methods. Utilizing the literature on hearing loss and intelligibility, this paper develops three dimensions which are evidenced to improve intelligibility: spatial separation, speech-to-noise ratio and redundancy. These can be personalized, individually or concurrently, using object-based audio. A systematic review of all work in object-based audio personalization is then undertaken. These dimensions are utilized to evaluate each project’s approach to personalization, identifying successful approaches, commercial challenges and the next steps required to ensure continuing improvements to broadcast audio for hard-of-hearing individuals.
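
    Of the three dimensions identified, the speech-to-noise ratio is the most direct to illustrate: with object-based audio, non-dialogue objects can simply be attenuated relative to dialogue at render time. The sketch below is a hedged illustration of that idea only; the object names, levels and target ratio are assumptions, and real object-based workflows (e.g., ADM or MPEG-H) carry far richer metadata.

```python
# Simplified sketch of one personalization dimension from the review: raising
# the speech-to-noise ratio by rebalancing object gains. Objects, levels and
# the target ratio are illustrative assumptions.
def personalize_gains(objects, target_snr_db=9.0):
    """Scale non-dialogue objects down so dialogue sits target_snr_db above
    the loudest background object. Gains are linear amplitude factors."""
    dialogue_level = max(o["level_db"] for o in objects if o["role"] == "dialogue")
    gains = {}
    for o in objects:
        if o["role"] == "dialogue":
            gains[o["name"]] = 1.0
        else:
            excess_db = o["level_db"] - (dialogue_level - target_snr_db)
            cut_db = max(0.0, excess_db)             # only attenuate, never boost
            gains[o["name"]] = 10 ** (-cut_db / 20)  # dB cut -> linear gain
    return gains

mix = [
    {"name": "commentary", "role": "dialogue",   "level_db": -23.0},
    {"name": "crowd",      "role": "background", "level_db": -20.0},
    {"name": "music",      "role": "background", "level_db": -28.0},
]
print(personalize_gains(mix))  # crowd is cut ~12 dB, music ~4 dB
```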