7 research outputs found

    Bimodal perception of audio-visual material properties for virtual environments

    INRIA Research Report 6687. High-quality rendering of both audio and visual material properties is very important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the levels of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on perception of material discrimination when varying the levels of detail of modal synthesis for sound and of Bidirectional Reflectance Distribution Functions for graphics. We performed an experiment with two different models (a Dragon and a Bunny model) and two material types (Plastic and Gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity when comparing approximate levels of detail to a high-quality audio-visual reference rendering. We show how this result can contribute to significant savings in computation time in an interactive audio-visual rendering system. To our knowledge, this is the first study to show an interaction between audio and graphics representations in a material perception task.
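
    In modal synthesis, the audio level of detail is typically the number of modes retained: a struck object's sound is a sum of exponentially decaying sinusoids, and truncating that sum trades fidelity for computation. The sketch below is a minimal illustration of that idea, not the authors' implementation; the mode frequencies, dampings, and amplitudes are placeholder values.

        import numpy as np

        def modal_synthesis(freqs, dampings, amps, duration=1.0, sr=44100, n_modes=None):
            """Sum of damped sinusoids; n_modes truncates the mode set (the audio LOD)."""
            if n_modes is not None:
                freqs, dampings, amps = freqs[:n_modes], dampings[:n_modes], amps[:n_modes]
            t = np.arange(int(duration * sr)) / sr
            signal = np.zeros_like(t)
            for f, d, a in zip(freqs, dampings, amps):
                signal += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
            return signal

        # Placeholder modal data for a small struck object (illustrative only).
        freqs = np.array([440.0, 1020.0, 1730.0, 2500.0])   # mode frequencies, Hz
        damps = np.array([8.0, 15.0, 25.0, 40.0])           # decay rates, 1/s
        amps  = np.array([1.0, 0.5, 0.3, 0.15])             # excitation amplitudes

        reference = modal_synthesis(freqs, damps, amps)              # full level of detail
        approx    = modal_synthesis(freqs, damps, amps, n_modes=2)   # reduced level of detail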

    Real-Time Physically Based Sound Synthesis and Application in Multimodal Interaction

    An immersive experience in virtual environments requires realistic auditory feedback that is closely coupled with other modalities, such as vision and touch. This is particularly challenging for real-time applications because of their stringent computational requirements. In this dissertation, I present and evaluate effective real-time physically based sound synthesis models that integrate visual and touch data, and apply them to create richly varying multimodal interaction. I first propose an efficient contact sound synthesis technique that accounts for the texture information used in visual rendering and greatly reinforces cross-modal perception. Secondly, I present both empirical and psychoacoustic approaches that formally study the geometry-invariant property of the material model commonly used in real-time sound synthesis. Based on this property, I design a novel example-based material parameter estimation framework that automatically creates synthetic sound effects naturally controlled by complex geometry and dynamics in visual simulation. Lastly, I translate user touch input captured on commodity multi-touch devices into physical performance models that drive both visual and auditory rendering. This novel multimodal interaction is demonstrated in a virtual musical instrument application on both a large tabletop and mobile tablet devices, and evaluated through pilot studies. Such an application offers capabilities for intuitive and expressive music playing, rapid prototyping of virtual instruments, and active exploration of sound effects determined by various physical parameters.
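
    A common way to make a modal-sound material model geometry-invariant (one plausible reading of the property studied here, not necessarily the dissertation's exact formulation) is Rayleigh damping: each mode's damping ratio depends only on two material coefficients and on that mode's own frequency, so the same coefficients transfer across shapes whose modal analyses yield different frequencies. A minimal sketch with invented coefficient values:

        import numpy as np

        def rayleigh_mode_damping(omegas, alpha, beta):
            """Per-mode damping ratio under Rayleigh damping C = alpha*M + beta*K:
            xi_i = (alpha / omega_i + beta * omega_i) / 2.
            alpha and beta characterise the material; the angular frequencies
            omegas come from a modal analysis of a specific geometry."""
            return 0.5 * (alpha / omegas + beta * omegas)

        # Hypothetical material coefficients shared by two different shapes.
        alpha, beta = 6.0, 1e-7
        omegas_shape_a = 2 * np.pi * np.array([600.0, 1400.0, 2300.0])
        omegas_shape_b = 2 * np.pi * np.array([450.0, 1100.0, 3000.0])

        print(rayleigh_mode_damping(omegas_shape_a, alpha, beta))
        print(rayleigh_mode_damping(omegas_shape_b, alpha, beta))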

    Developing knowledge for real world problem scenarios : using 3D gaming technology within a problem-based learning framework

    Problem-based learning is an instructional strategy that emphasises active and experiential learning through problem-solving activity. Using gaming technologies to embed this approach in a three-dimensional (3D) simulation environment provides users with a dynamic, responsive, visually engaging, and cost-effective learning experience. Representing real-world problems in 3D simulation environments develops knowledge and skills that are applicable to their resolution. The Simulation, User, and Problem-based Learning (SUPL) Design Framework was developed to inform the design of learning environments that develop problem-solving knowledge for real-world application. This framework identifies design factors, relative to the user, the problem-solving task, and the 3D simulation environment, that facilitate the transfer, development, and application of problem-solving knowledge. To assess the validity of the SUPL Design Framework, the Fires in Underground Mines Evacuation Simulator (FUMES) was developed to train mining personnel in emergency evacuation procedures at the Challenger gold mine in South Australia. Two groups of participants, representing experienced and novice personnel, took part in order to ascertain the effectiveness of FUMES as a training platform. Findings demonstrated that FUMES accurately represented emergency evacuation scenarios in the Challenger mine. Participants were able to apply existing real-world knowledge in FUMES to resolve emergency evacuation problem-solving tasks and to develop new knowledge. The effectiveness of the SUPL Design Framework was also demonstrated, as was the need to design learning environments that meet the learning needs of users rather than merely providing static simulations of real-world problems. A series of generalisable design guidelines was also established from these findings, which could be applied to the design of problem-based learning simulations in other training contexts.

    Material Visualisation for Virtual Reality: The Perceptual Investigations

    Material representation plays a significant role in design visualisation and evaluation. On one hand, simulated material properties determine the appearance of product prototypes in digitally rendered scenes. On the other hand, those properties are perceived by viewers in order to make important design decisions. As an approach to simulating a more realistic environment, Virtual Reality (VR) gives users a vivid impression of depth and embeds them in an immersive environment. However, the scientific understanding of material perception and its applications in VR is still fairly limited. This leads to this thesis's research question: whether material perception in VR differs from that in traditional 2D displays, and whether VR has potential as a design tool to facilitate material evaluation.

    This thesis begins by studying the perceptual difference of rendered materials between VR and traditional 2D viewing modes. First, a pilot study confirms that users have different perceptual experiences of the same material in the two viewing modes. Following that initial finding, the research investigates the perceptual difference in more detail with psychophysical methods, which quantify users' perceptual responses. Using the perceptual scale as a measuring instrument, the research analyses users' judgment and recognition of material properties under VR and traditional 2D display environments. In addition, the research elicits perceptual evaluation criteria to analyse the emotional aspects of materials. The six perceptual criteria are in semantic form: rigidity, formality, fineness, softness, modernity, and irregularity.

    The results showed that VR can support users in making more refined judgments of material properties; that is, users better perceive minute changes in material properties under immersive viewing conditions. In terms of emotional aspects, VR is advantageous in conveying the effects induced by visual textures, while the 2D viewing mode is more effective for expressing the characteristics of plain surfaces. This thesis contributes to a deeper understanding of users' perception of material appearance in Virtual Reality, which is critical for achieving effective design visualisation with such a display medium.
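
    Perceptual scales of this kind are commonly derived from paired-comparison judgments. One standard recipe, offered here only as an illustration and not necessarily the method used in the thesis, is Thurstone Case V scaling, which converts the proportion of times each stimulus wins a comparison into z-scores on an interval scale. A minimal sketch with invented comparison counts:

        import numpy as np
        from scipy.stats import norm

        def thurstone_case_v(wins):
            """Thurstone Case V scaling from a paired-comparison win-count matrix.
            wins[i, j] = number of times stimulus i was judged greater than stimulus j."""
            n = wins + wins.T                        # total comparisons per pair
            p = np.where(n > 0, wins / np.where(n > 0, n, 1), 0.5)
            np.fill_diagonal(p, 0.5)
            p = np.clip(p, 0.01, 0.99)               # avoid infinite z-scores
            z = norm.ppf(p)                          # pairwise differences in z units
            scale = z.mean(axis=1)                   # Case V estimate: row means
            return scale - scale.min()               # anchor the scale at zero

        # Invented data: 4 material renderings, 20 comparisons per pair.
        wins = np.array([[ 0, 14, 17, 19],
                         [ 6,  0, 13, 16],
                         [ 3,  7,  0, 12],
                         [ 1,  4,  8,  0]])
        print(thurstone_case_v(wins))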

    Putting It Into Words: The Impact of Visual Impairment on Perception, Experience and Presence

    The experience of being “present” in a mediated environment, such that it appears real, is known to be affected by deficits in perception, yet little research has been devoted to disabled audiences. People with a visual impairment access audiovisual media by means of Audio Description (AD), which gives visual information in verbal form. The AD user plays an active role, engaging their own perceptual processing systems and bringing real-world experiences to the mediated environment. In exploring visual impairment and presence, this thesis addresses a question fundamental to psychology: whether propositional and experiential knowledge are equivalent. It casts doubt on current models of sensory compensation in the blind and puts forward an alternative hypothesis of linguistic compensation. Qualitative evidence from Study 1 suggests that, in the absence of bimodal (audio-visual) cues, words can compensate for missing visual information. The role of vision in multisensory integration is explored experimentally in Studies 2 and 3: crossmodal associations arising both from direct perception and from imagery are shown to be altered by visual experience. Study 4 tests presence in an auditory environment; non-verbal sound is shown to enhance presence in the sighted but not the blind. Both Studies 3 and 4 support neuroimaging evidence that words are processed differently in the absence of sight. Study 5, comparing mental spatial models, suggests this is explained by explicit verbal encoding by people with a visual impairment. Study 6 tests the effect of words on presence and emotion elicitation in an audiovisual environment. In the absence of coherent information from the dialogue, additional verbal information significantly improves understanding. Moreover, in certain circumstances, Audio Description significantly enhances presence and successfully elicits a target emotion. A model of Audio Description is presented, and implications are discussed for theoretical models of perceptual processing and presence in those with and without sight.