4 research outputs found

    Corrigendum: May I Smell Your Attention: Exploration of Smell and Sound for Visuospatial Attention in Virtual Reality

    In the published article, there was an error regarding the affiliations for Dario Pittera. Instead of affiliations 2, 3, and 4, they should only have 2 and 4. The authors apologize for this error and state that it does not change the scientific conclusions of the article in any way. The original article has been updated.

    Analysis of Energy Consumption in an Electric Vehicle through Virtual Reality Set-Up

    The aim of this work is to present an algorithm for analyzing the energy consumption of an electric vehicle in a Virtual Reality (VR) scenario. The analysis was used to identify driving-behavior patterns in relation to users and their psychological traits. The main contribution is the description of the experimental setup used to run tests with 26 users. Using Unity and Matlab, a VR scenario was developed that recreates, along a single route, both urban and highway segments under realistic traffic conditions, driven with a Battery Electric Vehicle (BEV). The data-acquisition methodology is illustrated for two different cases: a disturbance and a non-disturbance scenario. Moreover, the population was divided by gender to establish a characterization linking energy consumption and the associated analysis to the psychological traits of the driver.
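    The abstract does not give the algorithm itself, but the gender/scenario characterization it describes could, as a minimal sketch, look like grouping per-drive energy readings by driver group and scenario and averaging them. The field names and data below are illustrative only, not from the paper:

    ```python
    # Hypothetical sketch: compare mean energy consumption (kWh) across
    # driver groups and scenarios. Records and field names are invented
    # for illustration, not taken from the study's data.
    from collections import defaultdict
    from statistics import mean

    # Each record: (gender, scenario, energy_kwh) -- fabricated example data
    drives = [
        ("F", "disturbance", 2.1), ("F", "no_disturbance", 1.8),
        ("M", "disturbance", 2.4), ("M", "no_disturbance", 2.0),
        ("F", "disturbance", 2.3), ("M", "no_disturbance", 1.9),
    ]

    def mean_consumption(records):
        """Group energy readings by (gender, scenario) and average each group."""
        groups = defaultdict(list)
        for gender, scenario, kwh in records:
            groups[(gender, scenario)].append(kwh)
        return {key: mean(vals) for key, vals in groups.items()}

    summary = mean_consumption(drives)
    ```

    Each (gender, scenario) pair then maps to its mean consumption, which is the kind of aggregate the characterization described above would compare.
    
    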

    CalD3r and MenD3s: Spontaneous 3D Facial Expression Databases

    In the last couple of decades, research on 3D facial expression recognition has been fostered by the creation of tailored databases containing prototypical expressions of different individuals and by advances in cost-effective acquisition technologies. However, most of the currently available databases consist of exaggerated facial expressions, owing to the imitation principle on which they rely. This makes these databases only partially suitable for real-world applications such as human-computer interaction for smart products and environments, health, and Industry 4.0, because algorithms learn from these ‘inflated’ data, which do not meet ecological-validity requirements. In this work, we present two novel 2D+3D spontaneous facial expression databases of young adults of different geographical origins, in which emotions were evoked using affective images from the well-established IAPS and GAPED databases and verified against participants’ self-reports. To the best of our knowledge, these are the first three-dimensional facial databases with emotions elicited by validated affective stimuli.

    Affective Virtual Reality: How to Design Artificial Experiences Impacting Human Emotions

    Computer graphics is, in many cases, about visualizing what you cannot see. However, virtual reality (VR), from its beginnings, aimed at stimulating all human senses, not just the visual channel. This set of multisensory stimuli allows users to feel present and able to interact with the virtual environment. In this way, VR aims to deliver experiences that are comparable to real-life ones in their level of detail, stimulation, intensity, and impact. Hence, VR is not only a means to see, but also to feel, differently. With the spread of VR technologies, there is a growing interest in using VR to evoke emotions, both positive and negative. This article discusses the current possibilities and the authors’ experience collected in the field in trying to elicit emotions through VR. It explores how different design aspects and features can be used, describing their contributions and benefits in the development of affective VR experiences. This work aims at raising awareness of the need to consider and explore the full design space that VR technology provides in comparison to traditional media. Additionally, it outlines possible directions for affective VR applications, illustrating how they could impact our emotions and improve our lives, and it provides guidelines for their development.