Using eye tracking and heart-rate activity to examine crossmodal correspondences QoE in Mulsemedia
Different senses provide us with information at varying levels of precision and enable us to construct a more accurate representation of the world. Rich multisensory stimulation is thus beneficial for comprehension, memory reinforcement, and retention of information. Cross-modal mappings refer to the systematic associations often made between different sensory modalities (e.g., high pitch is matched with angular shapes) and govern multisensory processing. A great deal of research effort has been devoted to exploring cross-modal correspondences in the field of cognitive science; however, the possibilities they open up in the digital world remain relatively unexplored. Multiple sensorial media (mulsemedia) provides users with a highly immersive experience and enhances their Quality of Experience (QoE) in the digital world. We therefore consider that studying the plasticity and effects of cross-modal correspondences in a mulsemedia setup can yield interesting insights into improving human-computer dialogue and experience. In our experiments, we exposed users to videos with particular visual dimensions (brightness, color, and shape), and we investigated whether pairing them with a cross-modally matching sound (high or low pitch) and the corresponding auto-generated vibrotactile effects (produced by a haptic vest) leads to an enhanced QoE. To this end, we captured users' eye gaze and heart rate while they experienced mulsemedia, and at the end of the experiment we asked them to answer a set of questions targeting their enjoyment and perception. Results showed differences in eye-gaze patterns and heart rate between the experimental and control groups, indicating changes in participants' engagement when videos were accompanied by matching cross-modal sounds (this effect was strongest for the video displaying angular shapes paired with high-pitch audio) and transitively generated cross-modal vibrotactile effects.
MulseOnto: a Reference Ontology to Support the Design of Mulsemedia Systems
Designing a mulsemedia (multiple sensorial media) system entails first and foremost comprehending what it is, beyond the ordinary understanding that it engages users in digital multisensory experiences that stimulate senses in addition to sight and hearing, such as smell, touch, and taste. The myriad programs that comprise a software system, the several output devices that deliver sensory effects, the computer media involved, among other elements, dwell deep in the realm of mulsemedia systems, making it a complex task for newcomers to become acquainted with their concepts and terms. Although there have been many technological advances in this field, especially in multisensory devices, there is a shortage of work that tries to establish common ground through a formal and explicit representation of what mulsemedia systems encompass. Such a representation could help avoid the design of feeble mulsemedia systems that can barely be reused owing to misconceptions. In this paper, we extend our previous work by proposing a common conceptualization of mulsemedia systems through a domain reference ontology, named MulseOnto, to aid their design. We applied ontology verification and validation techniques to evaluate it, including assessment by humans and a data-driven approach; the outcome is three successful instantiations of MulseOnto in distinct cases, demonstrating its ability to accommodate heterogeneous mulsemedia scenarios.
Recommended from our members
Enhancing User Experience with Olfaction in Virtual Reality
Human experiences in the physical world are inherently multi-modal, in that we rely on all our senses to perceive our environment, yet experiences within virtual reality (VR) are mainly restricted to our primary senses of vision and audition. The sense of smell (olfaction) has been shown to strongly affect human emotions, memories, and behaviour, but there have been only a few attempts to integrate olfactory stimuli into virtual environments. This thesis investigates the addition of olfaction as a modality for VR, enhancing user experiences through odour-emitting virtual objects and olfactory notifications. As part of this research, I introduce a systematic methodology for odour selection, and develop an affordable, off-the-shelf device for odour display (an olfactory display) for VR head-mounted displays. My research begins with a preliminary study examining the effect of olfactory stimuli on participants' emotional perception of digital images, which served as a test-bed for gaining insights into the use of olfactory displays and olfaction in an HCI setting. I then report on three empirical studies that examine how olfactory cues can enhance user experience in VR in terms of three key metrics: quality of experience, task performance, and the sense of presence, i.e., the feeling of 'being there' in the virtual environment. The results of these three studies indicated that congruent, pleasant odours could significantly enhance quality of experience, improve task performance, and, to varying degrees, increase the sense of presence in VR. Incongruent, pleasant odours, however, often caused confusion among participants and appeared to have no significant effect on the sense of presence, but were still able to improve task performance. The third of these studies also examined the use of odour notifications to enhance user experiences in VR.
Participants were able to perceive and understand the olfaction-based notifications, which produced an increase in the sense of presence, quality of experience, and task performance. Overall, this thesis' findings support the notion that olfaction can enhance user experience in VR, and they also draw attention to the importance of a systematic odour selection methodology.
MediaSync: Handbook on Multimedia Synchronization
This book provides an approachable overview of the most recent advances in the fascinating field of media synchronization (mediasync), gathering contributions from the most representative and influential experts. Understanding the challenges of this field in the current multi-sensory, multi-device, and multi-protocol world is not an easy task. The book revisits the foundations of mediasync, including theoretical frameworks and models; highlights ongoing research efforts, such as hybrid broadband broadcast (HBB) delivery and the modelling of users' perception (i.e., Quality of Experience, or QoE); and paves the way for the future (e.g., towards the deployment of multi-sensory and ultra-realistic experiences). Although many advances around mediasync have been devised and deployed, this area of research is receiving renewed attention to overcome the remaining challenges in the next-generation (heterogeneous and ubiquitous) media ecosystem. Given the significant advances in this research area, its current relevance, and the multiple disciplines it involves, a reference book on mediasync has become necessary; this book fills that gap. In particular, it addresses key aspects of, and reviews the most relevant contributions within, the mediasync research space from different perspectives. MediaSync: Handbook on Multimedia Synchronization is the perfect companion for scholars and practitioners who want to acquire strong knowledge of this research area, and to approach the challenges of ensuring the best mediated experiences by providing adequate synchronization between the media elements that constitute those experiences.