
    Effect before cause: supramodal recalibration of sensorimotor timing.

    Background: Our motor actions normally generate sensory events, but how do we know which events were self-generated and which have external causes? Here we use temporal adaptation to investigate the processing stage and generality of our sensorimotor timing estimates. Methodology/Principal Findings: Adaptation to artificially induced delays between action and event can produce a startling percept: upon removal of the delay, it feels as if the sensory event precedes its causative action. This temporal recalibration of action and event occurs in a quantitatively similar manner across the sensory modalities. Critically, it is robust to the replacement of one sense during the adaptation phase with another sense during the test judgment. Conclusions/Significance: Our findings suggest a high-level, supramodal recalibration mechanism. The effects are well described by a simple model which attempts to preserve the expected synchrony between action and event, but only when causality indicates it is reasonable to do so. We further demonstrate that this model successfully characterises related adaptation data from outside the sensorimotor domain.
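    The abstract only sketches this model in words. As one possible reading, a minimal numerical illustration might look like the following sketch; the adaptation gain, the causality window, and all function names are assumptions for illustration, not values or code from the paper.

        # Hypothetical sketch of a synchrony-preserving recalibration model.
        # The parameters (gain, causality window) are illustrative assumptions,
        # not values reported in the paper above.

        def recalibrated_onset(true_lag_ms, adapted_lag_ms,
                               gain=0.35, causality_window_ms=300):
            """Return the perceived action-to-event lag after adaptation.

            During adaptation the observer repeatedly experiences a delay of
            `adapted_lag_ms` between an action and its sensory consequence.
            If that delay is plausibly causal (the event follows the action
            within `causality_window_ms`), perceived simultaneity shifts
            toward the adapted delay by a fraction `gain`; otherwise no
            recalibration occurs.
            """
            causal = 0 <= adapted_lag_ms <= causality_window_ms
            shift = gain * adapted_lag_ms if causal else 0.0
            return true_lag_ms - shift

        # After adapting to a 100 ms delay, a physically simultaneous event
        # (true lag 0 ms) is perceived ~35 ms *before* the action.
        print(recalibrated_onset(true_lag_ms=0, adapted_lag_ms=100))   # -35.0
        # An implausibly long adapting delay violates causality, so no shift.
        print(recalibrated_onset(true_lag_ms=0, adapted_lag_ms=1000))  # 0.0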

    Obstacle detection display for visually impaired: Coding of direction, distance, and height on a vibrotactile waist band

    Electronic travel aids (ETAs) can potentially increase the safety and comfort of blind users by detecting and displaying obstacles outside the range of the white cane. In a series of experiments, we aim to balance the amount of information displayed against the comprehensibility of that information, taking into account the risk of information overload. In Experiment 1, we investigate the perception of compound signals displayed on a tactile vest while walking. The results confirm that the threat of information overload is clear and present: tactile coding parameters that are sufficiently discriminable in isolation may not be so in compound signals and while walking and using the white cane. Horizontal tactor location is a strong coding parameter, and temporal pattern is the preferred secondary coding parameter. Vertical location is also possible as a coding parameter, but it requires additional tactors and makes the display hardware more complex, more expensive, and less user-friendly. In Experiment 2, we investigate how we can off-load the tactile modality by shifting part of the information to an auditory display. Off-loading the tactile modality through auditory presentation is possible, but this off-loading is limited and may result in a new threat of auditory overload. In addition, taxing the auditory channel may in turn interfere with other auditory cues from the environment. In Experiment 3, we off-load the tactile sense by reducing the amount of displayed information using several filter rules. The resulting design was evaluated in Experiment 4 with visually impaired users. Although they acknowledge the potential of the display, the added value of the ETA as a whole also depends on its sensor and object recognition capabilities. We recommend using no more than two coding parameters in a tactile compound message and applying filter rules to reduce the number of obstacles displayed in an obstacle-avoidance ETA.
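    To make the recommendation concrete, the sketch below shows one hypothetical way a two-parameter compound message (horizontal tactor location plus temporal pattern) and a simple filter rule could be implemented; the tactor layout, pulse patterns, thresholds, and names are assumptions for illustration, not the coding scheme used in the study.

        # Hypothetical two-parameter tactile coding scheme and filter rule.
        # All layout choices and thresholds are illustrative assumptions.

        from dataclasses import dataclass

        @dataclass
        class Obstacle:
            bearing_deg: float   # direction relative to the user, -90 (left) .. +90 (right)
            distance_m: float    # distance from the user

        def encode(obstacle, n_tactors=8):
            """Map direction to a horizontal tactor index and distance to a pulse pattern."""
            # Direction -> horizontal tactor location on the waist band
            fraction = (obstacle.bearing_deg + 90) / 180           # 0.0 .. 1.0
            tactor = min(int(fraction * n_tactors), n_tactors - 1)
            # Distance -> temporal pattern (closer obstacles pulse faster)
            if obstacle.distance_m < 1.0:
                pattern = "fast pulses"
            elif obstacle.distance_m < 2.5:
                pattern = "slow pulses"
            else:
                pattern = "single pulse"
            return tactor, pattern

        def filter_obstacles(obstacles, max_displayed=2, max_range_m=4.0):
            """Filter rule: display only the nearest few obstacles within range."""
            in_range = [o for o in obstacles if o.distance_m <= max_range_m]
            return sorted(in_range, key=lambda o: o.distance_m)[:max_displayed]

        scene = [Obstacle(-30, 0.8), Obstacle(10, 2.0), Obstacle(60, 5.0)]
        for o in filter_obstacles(scene):
            print(encode(o))   # (2, 'fast pulses') then (4, 'slow pulses')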

    7th Tübingen Perception Conference: TWK 2004


    Investigating the effect of sensory concurrency on learning haptic spatiotemporal signals

    A new generation of multimodal interfaces and interactions is emerging. Drawing on the principles of Sensory Substitution and Augmentation Devices (SSADs), these new interfaces offer the potential for rich, immersive human-computer interactions, but they are difficult to design well and take time to master, creating significant barriers to wider adoption. Following a review of the literature surrounding existing SSADs, their metrics for success, and their growing influence on interface design in Human-Computer Interaction, we present a medium-term (4-day) study comparing the effectiveness of various combinations of visual and haptic feedback (sensory concurrencies) in preparing users to perform a virtual maze navigation task using haptic feedback alone. Participants navigated 12 mazes in each of 3 separate sessions under a specific combination of visual and haptic feedback, before performing the same task using haptic feedback alone. Visual sensory deprivation was shown to be inferior to visual and haptic concurrency in enabling haptic signal comprehension, while a new hybridized condition combining reduced visual feedback with the haptic signal was shown to be superior. Potential explanations for the effectiveness of the hybrid mechanism are explored, and the scope and implications of its generalization to new sensory interfaces are presented.

    Your place or mine: Shared sensory experiences elicit a remapping of peripersonal space

    Our perceptual systems integrate multisensory information about objects that are close to our bodies, allowing us to respond quickly and appropriately to potential threats, as well as to act upon and manipulate useful tools. Intriguingly, the representation of this area close to our body, known as the multisensory 'peripersonal space' (PPS), can expand or contract during social interactions. However, it is not yet known how different social interactions can alter the representation of PPS. In particular, shared sensory experiences, such as those elicited by bodily illusions like the enfacement illusion, can induce feelings of ownership over the other's body, and this has also been shown to increase the remapping of the other's sensory experiences onto our own bodies. The current study investigated whether such shared sensory experiences between two people, induced by the enfacement illusion, could alter the way PPS is represented, and whether this alteration is best described as an expansion of one's own PPS towards the other or as a remapping of the other's PPS onto one's own. An audio-tactile integration task allowed us to measure the extent of the PPS before and after a shared sensory experience with a confederate. Our results showed a clear increase in audio-tactile integration in the space close to the confederate's body after the shared experience. Importantly, this increase did not extend across the space between participant and confederate, as would be expected if the participant's PPS had expanded. Thus, the pattern of results is more consistent with a partial remapping of the confederate's PPS onto the participant's own PPS. These results have important consequences for our understanding of interpersonal space during different kinds of social interactions.