    Building Cognition-Aware Systems: A Mobile Toolkit for Extracting Time-of-Day Fluctuations of Cognitive Performance

    People’s alertness fluctuates across the day: at some times we are highly focused, while at others we feel unable to concentrate. So far, extracting fluctuation patterns has been time- and cost-intensive. Using an in-the-wild approach with 12 participants, we evaluated three cognitive tasks regarding their adequacy as a mobile and economical assessment tool for diurnal changes in mental performance. Participants completed the five-minute test battery on their smartphones multiple times a day for a period of 1-2 weeks. Our results show that people’s circadian rhythm can be obtained under unregulated, non-laboratory conditions. Along with this validation study, we release our test battery as an open-source library, both for future work towards cognition-aware systems and as a tool for psychological and medical research. We discuss ways of integrating the toolkit and possibilities for implicitly measuring performance variations in common applications. The ability to detect systematic patterns in alertness levels will allow cognition-aware systems to provide in-situ assistance in accordance with users’ current cognitive capabilities and limitations.
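    The released library's actual API is not shown in this listing. As a purely illustrative sketch, the following Python snippet shows one way an application could aggregate reaction times from repeated short test sessions into an hourly performance profile; all names and values here are hypothetical.

```python
# Hypothetical sketch (not the paper's actual API): bin reaction times
# from repeated micro-tests by hour of day to expose diurnal fluctuations
# in cognitive performance.
from collections import defaultdict
from datetime import datetime
from statistics import mean


class PerformanceProfile:
    """Aggregates task results per hour of day."""

    def __init__(self):
        self._by_hour = defaultdict(list)  # hour -> reaction times (ms)

    def record(self, timestamp: datetime, reaction_time_ms: float) -> None:
        self._by_hour[timestamp.hour].append(reaction_time_ms)

    def hourly_means(self) -> dict:
        return {h: mean(v) for h, v in sorted(self._by_hour.items())}

    def best_hour(self) -> int:
        """Hour with the lowest mean reaction time, i.e. highest alertness."""
        means = self.hourly_means()
        return min(means, key=means.get)


profile = PerformanceProfile()
profile.record(datetime(2023, 5, 1, 9, 15), 412.0)   # morning session
profile.record(datetime(2023, 5, 1, 14, 30), 538.0)  # afternoon session
profile.record(datetime(2023, 5, 2, 9, 40), 398.0)   # morning session
print(profile.best_hour())  # -> 9
```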

    The Consistency of Crossmodal Synchrony Perception across the Visual, Auditory, and Tactile Senses

    Machulla T-K, Di Luca M, Ernst MO. The Consistency of Crossmodal Synchrony Perception across the Visual, Auditory, and Tactile Senses. Journal of Experimental Psychology: Human Perception and Performance. 2016;42(7):1026-1038

    Recalibration of multisensory simultaneity: Cross-modal transfer coincides with a change in perceptual latency

    Di Luca M, Machulla T-K, Ernst MO. Recalibration of multisensory simultaneity: Cross-modal transfer coincides with a change in perceptual latency. Journal of Vision. 2009;9(12):7

    After exposure to asynchronous sound and light stimuli, perceived audio-visual synchrony changes to compensate for the asynchrony. Here we investigate to what extent this audio-visual recalibration effect transfers to visual-tactile and audio-tactile simultaneity perception, in order to infer the mechanisms responsible for temporal recalibration. Results indicate that audio-visual recalibration of simultaneity can transfer to audio-tactile and visual-tactile stimuli, depending on the way in which the multisensory stimuli are presented. With presentation of co-located multisensory stimuli, we found a change in the perceptual latency of the visual stimuli. Presenting auditory stimuli through headphones, on the other hand, induced a change in the perceptual latency of the auditory stimuli. We argue that the difference in transfer depends on the relative trust in the auditory and visual estimates. Interestingly, these findings were confirmed by showing that audio-visual recalibration influences simple reaction time to visual and auditory stimuli. Presenting co-located stimuli during asynchronous exposure induced a change in reaction time to visual stimuli, while with headphones the change in reaction time occurred for the auditory stimuli. These results indicate that the perceptual latency is altered with repeated exposure to asynchronous audio-visual stimuli in order to compensate (at least in part) for the presented asynchrony.
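    The reasoning in this abstract can be summarized with a simple latency model (notation ours, for illustration; the paper itself may formalize this differently):

```latex
% A stimulus presented at physical time t_V (visual) or t_A (auditory)
% is perceived at t + \ell, where \ell is its perceptual latency, so the
% perceived audio-visual asynchrony is
\[
  \Delta t_{\text{perceived}} = (t_A + \ell_A) - (t_V + \ell_V)
\]
% Repeated exposure to a fixed asynchrony shifts one of the latencies to
% compensate, at least in part, for the exposed asynchrony:
\[
  \ell_V \;\to\; \ell_V + \delta
  \quad\text{(co-located presentation: visual latency changes)}
\]
\[
  \ell_A \;\to\; \ell_A + \delta'
  \quad\text{(headphone presentation: auditory latency changes)}
\]
% Because \ell enters perception itself, the same shift should also appear
% in simple reaction times to the recalibrated modality, which is what the
% reaction-time results confirm.
```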

    Multisensory simultaneity recalibration: storage of the aftereffect in the absence of counterevidence

    Machulla T-K, Di Luca M, Fröhlich E, Ernst MO. Multisensory simultaneity recalibration: Storage of the aftereffect in the absence of counterevidence. Experimental Brain Research. 2012;217(1):89-97

    VRsneaky: Stepping into an audible virtual world with gait-aware auditory feedback

    New VR experiences allow users to walk extensively in the virtual space. Larger tracking spaces, treadmills, and redirected walking solutions are now available. Yet certain connections to the user's movement are still not made; we specifically see a shortcoming in the representation of locomotion feedback in state-of-the-art VR setups. As shown in our paper, providing synchronized step sounds is important for further involving the user in the experience and the virtual world, but this is often neglected. VRsneaky detects the user's gait via force-sensing resistors (FSRs) and accelerometers attached to the user's shoe and plays synchronized, gait-aware step sounds accordingly. In an exciting bank robbery scenario, the user tries to rob a bank behind a guard's back. Tension rises as the user must be aware of each step in this atmospheric experience: every step is rendered with adaptive step sounds at varying noise levels, reminding the user to pay attention to every movement.
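    As an illustration of the sensing approach described above (the function names below are hypothetical stand-ins, not VRsneaky's actual code), a heel strike can be detected as a rising edge in the FSR signal and used to trigger an adaptively loud step sound:

```python
# Illustrative sketch only: sensor-reading and audio functions are
# hypothetical placeholders. Detect heel strikes from a force-sensing
# resistor (FSR) under the heel and play a step sound whose loudness
# adapts to the impact force.

HEEL_STRIKE_THRESHOLD = 0.35  # normalized FSR reading; tune per shoe/user


def detect_heel_strike(prev_reading: float, reading: float) -> bool:
    """A heel strike is a rising edge through the force threshold."""
    return prev_reading < HEEL_STRIKE_THRESHOLD <= reading


def step_volume(reading: float) -> float:
    """Map impact force to playback volume (0..1): harder steps are louder."""
    return min(1.0, reading)


def gait_loop(read_fsr, play_step_sound):
    """read_fsr() -> float in [0, 1]; play_step_sound(volume) plays audio."""
    prev = 0.0
    while True:
        current = read_fsr()
        if detect_heel_strike(prev, current):
            play_step_sound(step_volume(current))
        prev = current
```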

    Towards a haptic taxonomy of emotions: Exploring vibrotactile stimulation in the dorsal region

    The implicit communication of emotional states between persons is a key use case for novel assistive and augmentation technologies. It can serve to expand individuals' perceptual capabilities and to assist neurodivergent individuals. Notably, vibrotactile rendering is a promising method for delivering emotional information with minimal interference with visual or auditory perception. To date, the subjective individual association between vibrotactile properties and emotional states remains unclear. Previous approaches relied on analogies or arbitrary variations, limiting generalization. To address this, we conducted a study with 40 participants, analyzing associations between attributes of self-generated vibrotactile patterns (amplitude, frequency, spatial location of stimulation) and four emotional states (Anger, Happiness, Neutral, Sadness). We find a preference for symmetrically arranged patterns, as well as distinct amplitude and frequency profiles for different emotions.
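    To make the design space concrete, a rendering system could represent each emotion as a tuple of the studied attributes. A minimal Python sketch follows; the parameter values are illustrative placeholders and do not reproduce the study's measured profiles:

```python
# Illustrative data structure only; the values below are placeholders,
# NOT the associations measured in the study.
from dataclasses import dataclass


@dataclass(frozen=True)
class VibrotactilePattern:
    amplitude: float      # normalized intensity, 0..1
    frequency_hz: float   # vibration frequency
    actuator_rows: tuple  # symmetric rows of actuators on the back


# A renderer could look up one pattern per target emotion:
EMOTION_PATTERNS = {
    "Anger":     VibrotactilePattern(0.9, 250.0, actuator_rows=(0, 1)),
    "Happiness": VibrotactilePattern(0.6, 180.0, actuator_rows=(1, 2)),
    "Neutral":   VibrotactilePattern(0.3, 120.0, actuator_rows=(2,)),
    "Sadness":   VibrotactilePattern(0.2, 80.0,  actuator_rows=(3, 4)),
}


def render(emotion: str) -> VibrotactilePattern:
    """Select the vibrotactile pattern associated with an emotion label."""
    return EMOTION_PATTERNS[emotion]
```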

    Impact of reviewing lifelogging photos on recalling episodic memories

    Photos are a rich and popular medium for preserving memories and are thus widely used as cues to augment human memory. Near-continuous capture and sharing of photos have generated a need to summarize and review relevant photos to revive important events. However, there is limited work exploring how regularly reviewing selected photos influences the overall recall of past events. In this paper, we present an experiment investigating the effect of regularly reviewing egocentric lifelogging photos on the formation and retrieval of autobiographical memories. Our approach protects the privacy of the participants and provides improved validation of their memory performance compared to existing approaches. The results of our experiment are a step towards developing memory-shaping algorithms that accentuate or attenuate memories on demand.

    There Is No First- or Third-Person View in Virtual Reality: Understanding the Perspective Continuum

    Modern games make creative use of first-person and third-person perspectives (FPP and TPP) to allow the player to explore virtual worlds. Traditionally, FPP and TPP are seen as distinct concepts, yet Virtual Reality (VR) allows for flexibility in choosing perspectives. We introduce the notion of a perspective continuum in VR, which relates technically to the camera position and conceptually to how users perceive their environment in VR. A perspective continuum enables adapting and manipulating the sense of agency and involvement in the virtual world. This flexibility of perspectives broadens the design space of VR experiences through the deliberate manipulation of perception. In a study, we explore users’ attitudes, experiences, and perceptions while controlling a virtual character from the two known perspectives. Statistical analysis of the empirical results shows the existence of a perspective continuum in VR. Our findings can be used to design experiences based on shifts of perception.
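    The technical core of the perspective continuum, as described above, is a camera position that varies continuously between a first-person and a third-person pose. A minimal sketch of that idea (our illustration, not the paper's implementation):

```python
# Hypothetical sketch: interpolate the camera between a first-person pose
# (at the avatar's head) and a third-person pose (behind and above the
# avatar) with a single continuum parameter t in [0, 1].

def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b."""
    return a + (b - a) * t


def camera_position(head: tuple, t: float,
                    back_offset: float = 2.0, up_offset: float = 0.5) -> tuple:
    """t = 0.0 -> pure first-person; t = 1.0 -> conventional third-person;
    intermediate values yield perspectives along the continuum."""
    x, y, z = head
    first_person = (x, y, z)
    third_person = (x, y + up_offset, z - back_offset)  # behind and above
    return tuple(lerp(f, p, t) for f, p in zip(first_person, third_person))


print(camera_position((0.0, 1.7, 0.0), t=0.5))  # halfway along the continuum
```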