
    Beyond multimedia adaptation: Quality of experience-aware multi-sensorial media delivery

    Multiple sensorial media (mulsemedia) combines multiple media elements that engage three or more human senses and, like most other media content, requires support for delivery over existing networks. This paper proposes an adaptive mulsemedia framework (ADAMS) for delivering scalable video and sensorial data to users. Unlike existing two-dimensional joint source-channel adaptation solutions for video streaming, the ADAMS framework includes three joint adaptation dimensions: video source, sensorial source, and network optimization. Using an MPEG-7 description scheme, ADAMS recommends the integration of multiple sensorial effects (e.g., haptic, olfaction, and air motion) as metadata into multimedia streams. The ADAMS design includes both coarse- and fine-grained adaptation modules on the server side: mulsemedia flow adaptation and packet priority scheduling. Feedback from subjective quality evaluation and network conditions is used to drive the two modules. The subjective evaluation investigated users' enjoyment levels when exposed to mulsemedia and multimedia sequences, respectively, and users' preference levels for various sensorial effects in mulsemedia sequences with video components at different quality levels. The results of the subjective study inform guidelines for an adaptive strategy that selects the optimal combination of video segments and sensorial data for a given bandwidth constraint and user requirement. User perceptual tests show that ADAMS outperforms existing multimedia delivery solutions in terms of both user-perceived quality and user enjoyment during adaptive streaming of various mulsemedia content. In doing so, it highlights the case for tailored, adaptive mulsemedia delivery over traditional multimedia adaptive transport mechanisms.
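    The adaptive strategy the abstract describes, selecting a video quality level plus a set of sensorial-effect streams under a bandwidth constraint, can be pictured as a greedy selection. The sketch below is an illustrative toy, not the actual ADAMS algorithm; the layer names, bitrates, and preference scores are invented for the example.

    ```python
    # Toy sketch of bandwidth-constrained mulsemedia selection (not the real
    # ADAMS implementation): pick the best video layer that fits the budget,
    # then add sensorial streams in order of user preference.

    def select_streams(video_layers, effects, budget_kbps):
        """video_layers: list of (name, kbps) in ascending quality order.
        effects: list of (name, kbps, preference_score).
        Returns (chosen_layer, chosen_effect_names)."""
        chosen_layer = None
        for name, kbps in video_layers:      # keep upgrading while it fits
            if kbps <= budget_kbps:
                chosen_layer = (name, kbps)
        if chosen_layer is None:
            return None, []                  # budget too small for any video
        remaining = budget_kbps - chosen_layer[1]
        chosen_effects = []
        # Highest-preference effects first, while leftover bandwidth allows.
        for name, kbps, _pref in sorted(effects, key=lambda e: -e[2]):
            if kbps <= remaining:
                chosen_effects.append(name)
                remaining -= kbps
        return chosen_layer, chosen_effects
    ```

    For example, with a 2700 kbps budget, hypothetical layers of 1200/2500/4800 kbps, and three effects, the selection keeps the 2500 kbps layer and fits the two most-preferred effects into the leftover 200 kbps.
    
    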

    Do I smell coffee? The tale of a 360° Mulsemedia experience

    One of the main challenges in current multimedia networking environments is to find solutions that help accommodate the next generation of mobile application classes with stringent Quality of Service (QoS) requirements whilst enabling Quality of Experience (QoE) provisioning for users. One such application class, featured in this paper, is 360° mulsemedia (multiple sensorial media), which enriches 360° video by adding sensory effects that stimulate human senses beyond sight and hearing, such as the tactile and olfactory ones. In this paper, we present a conceptual framework for 360° mulsemedia delivery and a 360° mulsemedia-based prototype that enables users to experience 360° mulsemedia content. User evaluations revealed that higher video resolutions do not necessarily lead to the highest QoE levels in our experimental setup. Therefore, bandwidth savings can be leveraged with no detrimental impact on QoE.

    Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios

    Traditionally, TV media content has exclusively involved 2D or 3D audiovisual streams consumed on a simple TV device. However, in order to generate more immersive media consumption experiences, new types of content (e.g., omnidirectional video), consumption devices (e.g., Head Mounted Displays, or HMDs) and solutions that stimulate senses beyond the traditional ones of sight and hearing can be used. Multi-sensorial media content (a.k.a. mulsemedia) adds sensory effects that stimulate other senses during media consumption, with the aim of providing consumers with a more immersive and realistic experience. Beyond a greater degree of realism and immersion, mulsemedia can also foster social integration (e.g., for people with audiovisual impairments or attention difficulties) and even contribute to better educational programmes (e.g., learning through the senses in educational or science-outreach content). Examples of sensory effects are olfactory effects (scents), tactile effects (e.g., vibration, wind or pressure), and ambient effects (e.g., temperature or lighting). In this paper, a solution for providing multi-sensorial and immersive hybrid (broadcast/broadband) TV content consumption experiences, including omnidirectional video and sensory effects, is presented. It has been designed, implemented, and subjectively evaluated (by 32 participants) in an end-to-end platform for hybrid content generation, delivery and synchronised consumption.
    The satisfactory results obtained regarding both the perceived fine synchronisation between sensory effects and multimedia content and the users' perceived QoE are summarised and discussed. This work was supported in part by the "Vicerrectorado de Investigacion de la Universitat Politecnica de Valencia" under Project PAID-11-21 and Project PAID-12-21. Marfil, D.; Boronat, F.; González-Salinas, J.; Sapena Piera, A. (2022). Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios. IEEE Access, 10:79071-79089. https://doi.org/10.1109/ACCESS.2022.3194170
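    The fine synchronisation evaluated in this paper can be pictured as firing timestamped sensory-effect commands against the media playback clock. The following is a minimal sketch of that idea under assumed names and an assumed sync tolerance; it is not the paper's actual mechanism.

    ```python
    # Illustrative sketch: sensory-effect commands carry timestamps on the
    # media timeline and are considered "due" when the playback position
    # reaches them within a sync tolerance (values here are hypothetical).

    def due_effects(effect_timeline, playback_pos_ms, tolerance_ms=80):
        """effect_timeline: list of (trigger_time_ms, effect_name).
        Returns the effects whose trigger time has just been reached,
        i.e. lies within tolerance_ms behind the playback position."""
        return [name for t, name in effect_timeline
                if 0 <= playback_pos_ms - t <= tolerance_ms]
    ```

    A player loop would call this on every clock tick and forward the returned commands to the effect-rendering devices, which keeps effects aligned with the audiovisual stream even after seeks or stalls, since everything is anchored to the media clock rather than wall time.
    
    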

    Mulsemedia in Telecommunication and Networking Education: A Novel Teaching Approach that Improves the Learning Process

    The advent and increased use of new technologies, such as innovative mulsemedia and multi-modal content distribution mechanisms, have brought new challenges and diverse opportunities for technology-enhanced learning (TEL). NEWTON is a Horizon 2020 European project that aims to revolutionize the educational process through innovative TEL methodologies and tools, integrated in a pan-European STEM-related learning network platform. This article focuses on one of these novel TEL methodologies (i.e., mulsemedia) and presents how NEWTON enables mulsemedia-enhanced teaching and learning of STEM subjects, with a particular focus on telecommunication- and networking-related modules. The article also discusses the very promising results of NEWTON case studies carried out with engineering students at two universities in Spain and Ireland, respectively. The case studies focused on analyzing the impact of mulsemedia-enhanced teaching on the learning process in the context of telecommunication and networking modules. The main conclusion of the article is that mulsemedia-enhanced education significantly enhances students' learning experience and improves their knowledge gain.

    MulseOnto: a Reference Ontology to Support the Design of Mulsemedia Systems

    Designing a mulsemedia (multiple sensorial media) system entails first and foremost comprehending what it is, beyond the ordinary understanding that it engages users in digital multisensory experiences that stimulate senses in addition to sight and hearing, such as smell, touch, and taste. A myriad of programs that comprise a software system, several output devices to deliver sensory effects, computer media, and more dwell deep in the realm of mulsemedia systems, making it a complex task for newcomers to get acquainted with their concepts and terms. Although there have been many technological advances in this field, especially in multisensory devices, there is a shortage of work that tries to establish common ground in terms of a formal and explicit representation of what mulsemedia systems encompass. Such a representation might be useful to avoid the design of feeble mulsemedia systems that can barely be reused owing to misconception. In this paper, we extend our previous work by proposing to establish a common conceptualization of mulsemedia systems through a domain reference ontology named MulseOnto to aid their design. We applied ontology verification and validation techniques to evaluate it, including assessment by humans and a data-driven approach, whereby the outcome is three successful instantiations of MulseOnto for distinct cases, making evident its ability to accommodate heterogeneous mulsemedia scenarios.
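    To make the idea of instantiating such a conceptualization concrete, the toy object model below captures the entities the abstract mentions: media content, sensory effects, and the output devices that render them. The class and field names are hypothetical illustrations, not MulseOnto's actual terms.

    ```python
    # Hypothetical, heavily simplified model of the concepts above; the names
    # are illustrative only and do not reflect MulseOnto's real vocabulary.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class OutputDevice:
        name: str        # e.g. a fan or a scent diffuser
        stimulus: str    # the sense it targets: "olfactory", "tactile", ...

    @dataclass
    class SensoryEffect:
        kind: str                  # e.g. "wind", "scent"
        stimulus: str
        rendered_by: OutputDevice  # the device that delivers this effect

    @dataclass
    class MulsemediaSystem:
        media: List[str] = field(default_factory=list)   # audiovisual streams
        effects: List[SensoryEffect] = field(default_factory=list)

        def senses_engaged(self):
            # Sight and hearing come from the AV media; further senses come
            # from the attached sensory effects.
            return {"sight", "hearing"} | {e.stimulus for e in self.effects}
    ```

    Even a sketch like this shows the kind of misconception an ontology guards against, for instance conflating an effect (wind) with the device that renders it (a fan).
    
    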

    360° Mulsemedia: A Way to Improve Subjective QoE in 360° Videos

    Previous research has shown that adding multisensory media (mulsemedia) to traditional audiovisual content has a positive effect on user Quality of Experience (QoE). However, the QoE impact of employing mulsemedia in 360° videos has remained unexplored. Accordingly, in this paper, a QoE study for watching a 360° video, with and without multisensory effects, in a full free-viewpoint VR setting is presented. The parametric space we considered to influence the QoE consists of the encoding quality and the motion level of the transmitted media. To achieve our research aim, we propose a wearable VR system that provides multisensory enhancement of 360° videos. We then utilise its capabilities to systematically evaluate the effects of multisensory stimulation on perceived quality degradation for videos with different motion levels and encoding qualities. Our results make a strong case for the inclusion of multisensory effects in 360° videos, as they reveal that both user-perceived quality and enjoyment are significantly higher when mulsemedia (as opposed to traditional multimedia) is employed in this context. Moreover, these observations hold true independent of the underlying 360° video encoding quality; thus, QoE can be significantly enhanced with a minimal impact on networking resources.