520 research outputs found
Thermal and wind devices for multisensory human-computer interaction: an overview
In order to create immersive experiences in virtual worlds, we need to explore different human senses (sight, hearing, smell, taste, and touch). Many devices have been developed by both industry and academia towards this aim. In this paper, we focus our attention on thermal and wind devices that deliver sensations of heat and cold against people's skin, and on their application to human-computer interaction (HCI). First, we present a review of the devices and features identified as relevant. Then, we examine users' experience with thermal and wind devices, highlighting limitations either found or inferred by the authors of the studies selected for this survey. From the current literature, we can infer that, in wind- and temperature-based haptic systems, (i) users experience wind effects produced by fans that move air at room temperature, and (ii) no devices integrate thermal components to produce both cold and hot airflows. We then analyse why thermal wind devices have not yet been devised, highlighting the challenges of creating such devices.

This work was supported by the Espírito Santo Research and Innovation Foundation (FAPES, Brazil) - Finance Code 2021-GL60J, the Coordination for the Improvement of Higher Education Personnel (CAPES, Brazil) - Finance Codes 88881.187844/2018-01 and 88887.570688/2020-00, and the National Council for Scientific and Technological Development (CNPq, Brazil) - Finance Code 307718/2020-4. The work was also funded by the European Union's Horizon 2020 Research and Innovation programme under Grant Agreement no. 688503. E. B. Saleme additionally acknowledges aid from the Federal Institute of Espírito Santo.
Multisensory 360 videos under varying resolution levels enhance presence
Omnidirectional videos have become a leading multimedia format for Virtual Reality applications. While live 360° videos offer a unique immersive experience, streaming omnidirectional content at high resolutions is not always feasible in bandwidth-limited networks. Whereas flat videos scale well to lower resolutions, 360° video quality is seriously degraded because of the viewing distances involved in head-mounted displays. Hence, in this paper, we first investigate how quality degradation impacts the sense of presence in immersive Virtual Reality applications. We then push the boundaries of 360° technology through enhancement with multisensory stimuli. 48 participants experienced both 360° scenarios (with and without multisensory content) and were randomly divided among four conditions characterised by different encoding qualities (HD, FullHD, 2.5K, 4K). The results showed that presence is not mediated by streaming at a higher bitrate. However, the trend we identified revealed that presence is positively and significantly impacted by the enhancement with multisensory content. This shows that multisensory technology is crucial in creating more immersive experiences.
360° Mulsemedia: A Way to Improve Subjective QoE in 360° Videos
Previous research has shown that adding multisensory media (mulsemedia) to traditional audiovisual content has a positive effect on user Quality of Experience (QoE). However, the QoE impact of employing mulsemedia in 360° videos has remained unexplored. Accordingly, in this paper, a QoE study for watching a 360° video (with and without multisensory effects) in a full free-viewpoint VR setting is presented. The parametric space we considered to influence the QoE consists of the encoding quality and the motion level of the transmitted media. To achieve our research aim, we propose a wearable VR system that provides multisensory enhancement of 360° videos. Then, we utilise its capabilities to systematically evaluate the effects of multisensory stimulation on perceived quality degradation for videos with different motion levels and encoding qualities. Our results make a strong case for the inclusion of multisensory effects in 360° videos, as they reveal that both user-perceived quality and enjoyment are significantly higher when mulsemedia (as opposed to traditional multimedia) is employed in this context. Moreover, these observations hold true independent of the underlying 360° video encoding quality; thus, QoE can be significantly enhanced with a minimal impact on networking resources.
Multimodality in VR: A Survey
Virtual reality has the potential to change the way we create and consume content in our everyday life. Entertainment, training, design and manufacturing, communication, and advertising are all applications that already benefit from this new medium reaching the consumer level. VR is inherently different from traditional media: it offers a more immersive experience and has the ability to elicit a sense of presence through the place and plausibility illusions. It also gives the user unprecedented capabilities to explore their environment, in contrast with traditional media. In VR, as in the real world, users integrate the multimodal sensory information they receive to create a unified perception of the virtual world. Therefore, the sensory cues available in a virtual environment can be leveraged to enhance the final experience. This may include increasing realism or the sense of presence; predicting or guiding the user's attention through the experience; or improving their performance if the experience involves the completion of certain tasks. In this state-of-the-art report, we survey the body of work addressing multimodality in virtual reality, and its role and benefits in the final user experience. The works reviewed here thus encompass several fields of research, including computer graphics, human-computer interaction, and psychology and perception. Additionally, we give an overview of different applications that leverage multimodal input in areas such as medicine, training and education, and entertainment; we include works in which the integration of multiple sources of sensory information yields significant improvements, demonstrating how multimodality can play a fundamental role in the way VR systems are designed, and VR experiences created and consumed.
The influence of scent on virtual reality experiences: The role of aroma-content congruence
We live in a multisensory world. Our experiences are constructed by the stimulation of all our senses. Nevertheless, digital interactions are mainly based on audiovisual elements, while other sensory stimuli have been less explored. Virtual reality (VR) is a sensory-enabling technology that facilitates the integration of sensory inputs to enhance multisensory digital experiences. This study analyses how the addition of ambient scent to a VR experience affects digital pre-experiences in a service context (tourism). Results from a laboratory experiment confirmed that embodied VR devices, together with pleasant and congruent ambient scents, enhance sensory stimulation, which directly (and indirectly, through ease of imagination) influences affective and behavioral reactions. These enriched multisensory experiences strengthen the link between the affective and conative images of destinations. We make recommendations for researchers and service providers aiming to deliver ambient scents, especially those congruent with the displayed content, to enhance the sensorialization of digital VR experiences.
Multimodality in VR: A survey
Virtual reality (VR) is rapidly growing, with the potential to change the way we create and consume content. In VR, users integrate the multimodal sensory information they receive to create a unified perception of the virtual world. In this survey, we review the body of work addressing multimodality in VR, its role and benefits in user experience, and the different applications that leverage multimodality across many disciplines. These works encompass several fields of research and demonstrate that multimodality plays a fundamental role in VR: enhancing the experience, improving overall performance, and yielding unprecedented abilities in skill and knowledge transfer.
Distributed multimedia quality: The user perspective
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.

Distributed multimedia supports a symbiotic infotainment duality, i.e. the ability to transfer information to the user while also providing the user with a level of satisfaction. As multimedia is ultimately produced for the education and/or enjoyment of viewers, the user's perspective on presentation quality is surely as important to defining distributed multimedia quality as objective Quality of Service (QoS) technical parameters. In order to measure the user perspective of multimedia video quality extensively, we introduce an extended model of distributed multimedia quality that segregates quality into three discrete levels: the network-level, the media-level and the content-level, using two distinct quality perspectives: the user-perspective and the technical-perspective.
Since experimental questionnaires do not provide continuous monitoring of user attention, eye tracking was used in our study to provide a better understanding of the role the human element plays in the reception, analysis and synthesis of multimedia data. Results showed that video content adaptation results in disparity in users' video eye-paths when: (i) no single, obvious point of focus exists; or (ii) the point of attention changes dramatically.
Accordingly, appropriate technical- and user-perspective parameter adaptation was implemented for all quality abstractions of our model, i.e. the network-level (via simulated delay and jitter), the media-level (via a technical- and user-perspective manipulated region-of-interest attentive display) and the content-level (via display type and video clip type). Our work has shown that user-perceived distributed multimedia quality cannot be ensured by means of purely technical-perspective QoS parameter adaptation.
Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios
Traditionally, TV media content has exclusively involved 2D or 3D audiovisual streams consumed on a simple TV device. However, in order to create more immersive media consumption experiences, new types of content (e.g., omnidirectional video), consumption devices (e.g., Head-Mounted Displays, or HMDs) and solutions that stimulate senses beyond the traditional ones of sight and hearing can be used. Multi-sensorial media content (a.k.a. mulsemedia) adds sensory effects that stimulate other senses during media consumption, with the aim of providing consumers with a more immersive and realistic experience. Beyond a greater degree of realism and immersion, such effects can also foster social integration (e.g., for people with audiovisual impairments or attention-span problems) and even contribute to creating better educational programmes (e.g., for learning through the senses in educational or science-dissemination content). Examples of sensory effects include olfactory effects (scents), tactile effects (e.g., vibration, wind or pressure) and ambient effects (e.g., temperature or lighting). In this paper, a solution for providing multi-sensorial and immersive hybrid (broadcast/broadband) TV content consumption experiences, including omnidirectional video and sensory effects, is presented. It has been designed, implemented and subjectively evaluated (by 32 participants) on an end-to-end platform for hybrid content generation, delivery and synchronised consumption.
The satisfactory results obtained regarding the perception of fine synchronisation between sensory effects and multimedia content, and regarding the users' perceived QoE, are summarised and discussed.

This work was supported in part by the "Vicerrectorado de Investigación de la Universitat Politècnica de València" under Project PAID-11-21 and Project PAID-12-21. Marfil, D.; Boronat, F.; González-Salinas, J.; Sapena Piera, A. (2022). Integration of multi-sensorial effects in synchronised immersive hybrid TV scenarios. IEEE Access. 10:79071-79089. https://doi.org/10.1109/ACCESS.2022.3194170