388 research outputs found

    Benchmarking of mobile phone cameras


    The Role of Prosodic Stress and Speech Perturbation on the Temporal Synchronization of Speech and Deictic Gestures

    Gestures and speech converge during spoken language production. Although the temporal relationship of gestures and speech is thought to depend upon factors such as prosodic stress and word onset, the effects of controlled alterations in the speech signal upon the degree of synchrony between manual gestures and speech are uncertain. Thus, the precise nature of the interactive mechanism of speech-gesture production, or whether one exists at all, remains unsettled. In Experiment 1, syllable position and contrastive stress were manipulated during sentence production to investigate the synchronization of speech and pointing gestures. The aim of Experiment 2 was to investigate the temporal relationship of speech and pointing gestures when speech is perturbed with delayed auditory feedback (DAF). Comparisons between the time of gesture apex and vowel midpoint (GA-VM) for each of the conditions were made for both Experiment 1 and Experiment 2. Additional comparisons of the interval between gesture launch midpoint and vowel midpoint (GLM-VM), total gesture time, gesture launch time, and gesture return time were made for Experiment 2. The results of the first experiment indicated that gestures were more synchronized with first-position syllables and neutral syllables, as measured by GA-VM intervals. The first-position syllable effect was also found in the second experiment. However, the results from Experiment 2 supported an effect of contrastive pitch accent: GLM-VM was shorter for first-position targets and accented syllables. In addition, gesture launch times and total gesture times were longer for contrastive pitch-accented syllables, especially when in the second position of words. Contrary to the predictions, significantly longer GA-VM and GLM-VM intervals were observed when individuals responded under DAF.
    Vowel and sentence durations increased both with DAF and when a contrastive accented syllable was produced. Vowels were longest for accented, second-position syllables. These findings provide evidence that the timing of gesture is adjusted based upon manipulations of the speech stream. A potential mechanism of entrainment of the speech and gesture systems is offered as an explanation for the observed effects.

    Animated storytelling in 360 degrees

    Master's thesis in digital communication and culture, Inland Norway University of Applied Sciences (Høgskolen i Innlandet), 2019. This Master's thesis examines how film form and film style are utilized to tell stories within VR movies: animated shorts aimed at screening through head-mounted displays (HMDs). Mainly, these shorts have a linear narrative, but simultaneously leave some decisions to the spectator, e.g. where to turn their head at any moment, consequently leaving the framing of shots to the audience. Since they all have a certain degree of interactivity, these productions have inherited features from both movies and games. Thus, the theoretical basis will include works from both academic fields. The objective is to study what parts of film form and film style are still applicable within a ubiquitous 360-degree view. I will also study how the potential sense of immersion and spatial presence achieved by the 360-degree view influences the narration. The thesis is conducted as a qualitative analysis of the following study objects: Fan, M. (Producer) & Darnell, E. (Director). (2016). Invasion! [Animated short for 360-degree screening]. USA: Baobab Studios. Eisenmann, D. (Producer) & Osborne, P. (Director). (2016). Pearl. [Animated short for 360-degree screening]. USA: Evil Eye Pictures. Cellucci, C. (Producer), Pinkava, J. (Director) & Oftedal, M. (Director). (2018). Piggy. [Animated short for 360-degree screening]. USA: Google Spotlight Stories. The first selection, Invasion! (Fan & Darnell, 2016), is one of the earliest attempts at transferring animated movies to HMDs, and in this one the spectator is additionally granted an avatar. The second selection, Pearl (Eisenmann & Osborne, 2016), is considered a milestone within the medium as the first 360-degree animated short (and the only one to date) to receive a nomination for an Academy Award.
    The final selection, Piggy (Cellucci, Pinkava & Oftedal, 2018), includes narrative elements proceeding in real time, so the pacing and order of events can partly be influenced by where the spectator is turning their gaze. Under the headlines Film Form, Film Style, and Immersion/Presence, I will examine how a ubiquitous visual display affects the way a narrative pattern (film form) is constructed, and how the technical and aesthetic devices (film style) are utilized to convey this pattern. Finally, I will look at whether these movies are able to give a sense of immersion/presence and how this might influence the narrative.

    Sonic Interactions in Virtual Environments

    This open access book tackles the design of 3D spatial interactions from an audio-centered and audio-first perspective, providing the fundamental notions related to the creation and evaluation of immersive sonic experiences. The key elements that enhance the sensation of place in a virtual environment (VE) are:
    - Immersive audio: the computational aspects of the acoustical-space properties of Virtual Reality (VR) technologies
    - Sonic interaction: the human-computer interplay through auditory feedback in VEs
    - VR systems: natural support for multimodal integration, impacting different application domains
    Sonic Interactions in Virtual Environments features state-of-the-art research on real-time auralization, sonic interaction design in VR, quality of the experience in multimodal scenarios, and applications. Contributors and editors include interdisciplinary experts from the fields of computer science, engineering, acoustics, psychology, design, humanities, and beyond. Their mission is to shape an emerging new field of study at the intersection of sonic interaction design and immersive media, embracing an archipelago of existing research spread across different audio communities and increasing, among VR communities, researchers, and practitioners, the awareness of the importance of sonic elements when designing immersive environments.
