29 research outputs found

    Visual detail about the body modulates tactile localisation biases

    The localisation of tactile stimuli requires the integration of visual and somatosensory inputs within an internal representation of the body surface, and is prone to consistent bias. Joints may play a role in segmenting such internal body representations, and may therefore influence tactile localisation biases, although the nature of this influence remains unclear. Here, we investigate the relationship between conceptual knowledge of joint locations and tactile localisation biases on the hand. In one task, participants localised tactile stimuli applied to the dorsum of their hand. A distal localisation bias was observed in all participants, consistent with previous results. We also manipulated the availability of visual information during this task, to determine whether the absence of this information could account for the distal bias observed here and by Mancini and colleagues (2011). The observed distal bias increased in magnitude when visual information was restricted, without a corresponding decrease in precision. In a separate task, the same participants indicated, from memory, knuckle locations on a silhouette image of their hand. Analogous distal biases were also seen in the knuckle localisation task. The accuracy of conceptual joint knowledge was not correlated with tactile localisation bias magnitude, although a similarity in observed bias direction suggests that both tasks may rely on a common, higher-order body representation. These results also suggest that distortions of conceptual body representation may be more common in healthy individuals than previously thought.
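
    As a minimal illustration of the bias measure described above (a sketch, not the authors' analysis; the coordinate frame and data are invented), a distal localisation bias can be quantified as the mean displacement of judged relative to actual stimulus locations, with the positive y-axis pointing toward the fingertips:

```python
# Hypothetical sketch: quantifying a distal localisation bias.
# Each trial pairs an actual stimulus site on the hand dorsum with the
# participant's judged site (coordinates in mm; +y points distally,
# toward the fingers). All numbers below are invented for illustration.

def mean_bias(actual, judged):
    """Mean (x, y) displacement of judged minus actual locations, in mm."""
    n = len(actual)
    dx = sum(j[0] - a[0] for a, j in zip(actual, judged)) / n
    dy = sum(j[1] - a[1] for a, j in zip(actual, judged)) / n
    return dx, dy

actual = [(0, 0), (10, 0), (0, 20), (10, 20)]   # stimulated sites
judged = [(1, 5), (9, 6), (0, 24), (11, 25)]    # reported sites, shifted distally

dx, dy = mean_bias(actual, judged)
print(dx, dy)  # dy = 5.0: a 5 mm mean shift toward the fingertips, i.e. a distal bias
```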

    A brain-computer interface with vibrotactile biofeedback for haptic information

    Background: It has been suggested that Brain-Computer Interfaces (BCIs) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of a BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy.
    Methods: A Mu-rhythm-based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance.
    Results and Conclusion: Subjects were able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subjects had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality for operating a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
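
    The haptic channel described in the Methods, conveying cursor position by modulating vibration intensity, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the linear mapping and the `lo`/`hi` intensity range are assumptions.

```python
# Hypothetical sketch (not the study's code): conveying a 1-D cursor
# position haptically by modulating vibrotactile stimulus intensity.

def cursor_to_intensity(pos, lo=0.2, hi=1.0):
    """Map a cursor position in [-1, 1] to a vibration intensity in [lo, hi].

    lo > 0 keeps the vibration perceptible even at the far end of the range,
    so the user always receives feedback; the linear form is an assumption.
    """
    pos = max(-1.0, min(1.0, pos))         # clamp out-of-range positions
    return lo + (hi - lo) * (pos + 1) / 2  # linear mapping

print(cursor_to_intensity(-1.0))  # 0.2 -> weakest vibration at one extreme
print(cursor_to_intensity(1.0))   # 1.0 -> strongest vibration at the other
```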

    Tactile localization biases are modulated by gaze direction

    Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localized percepts, tactile localization can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations based on different coordinate systems. Recent reports provide evidence for systematic biases in tactile localization tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and of gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localization tasks remain largely unexplored. To address this question, participants performed a tactile localization task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localization of the touches towards the tips of the fingers (distal bias) and the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localization. Moreover, vision of the hand modulates the internal configuration of the points' locations by elongating it in the radio-ulnar axis.

    Learning new sensorimotor contingencies: Effects of long-term use of sensory augmentation on the brain and conscious perception

    Theories of embodied cognition propose that perception is shaped by sensory stimuli and by the actions of the organism. Following sensorimotor contingency theory, the mastery of lawful relations between one's own behavior and the resulting changes in sensory signals, called sensorimotor contingencies, is constitutive of conscious perception. Sensorimotor contingency theory predicts that, after training, knowledge relating to new sensorimotor contingencies develops, leading to changes in the activation of sensorimotor systems and concomitant changes in perception. In the present study, we spell out this hypothesis in detail and investigate whether it is possible to learn new sensorimotor contingencies by sensory augmentation. Specifically, we designed an fMRI-compatible sensory augmentation device, the feelSpace belt, which gives orientation information about the direction of magnetic north via vibrotactile stimulation on the waist of participants. In a longitudinal study, participants trained with this belt for seven weeks in a natural environment. Our EEG results indicate that training with the belt leads to changes in sleep architecture early in the training phase, compatible with the consolidation of procedural learning as well as with increased sensorimotor processing and motor programming. The fMRI results suggest that training entails activity in sensory as well as higher motor centers and in brain areas known to be involved in navigation. These neural changes are accompanied by changes in how space and the belt signal are perceived, as well as by increased trust in navigational ability. Thus, our data on physiological processes and subjective experiences are compatible with the hypothesis that new sensorimotor contingencies can be acquired using sensory augmentation.
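
    The belt's north-signalling principle can be sketched as a heading-to-motor mapping: whichever waist motor currently points toward magnetic north vibrates. This is a hypothetical reconstruction, not the feelSpace design; the 16-motor count, indexing convention, and function name are illustrative assumptions.

```python
# Hypothetical sketch of a north-pointing vibrotactile belt: given the
# wearer's compass heading, select the motor closest to magnetic north.
# Motor 0 sits at the front of the belt; indices increase clockwise.

N_MOTORS = 16  # assumed motor count, evenly spaced around the waist

def north_motor(heading_deg):
    """Index of the belt motor currently pointing toward magnetic north.

    heading_deg is the wearer's heading in degrees clockwise from north.
    North lies at -heading relative to the wearer's front, so we negate,
    wrap into [0, 360), and snap to the nearest motor position.
    """
    bearing = (-heading_deg) % 360
    return round(bearing / (360 / N_MOTORS)) % N_MOTORS

print(north_motor(0))    # 0  -> facing north: the front motor vibrates
print(north_motor(90))   # 12 -> facing east: north is to the wearer's left
```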

    Centralizing Bias and the Vibrotactile Funneling Illusion on the Forehead

    This paper provides a novel psychophysical investigation of head-mounted vibrotactile interfaces for sensory augmentation. A 1-by-7 headband vibrotactile display was used to provide stimuli on each participant's forehead. Experiment I investigated the ability to identify the location of a vibrotactile stimulus presented to a single tactor in the display; results indicated that localization error is uniform but biased towards the forehead midline. In Experiment II, two tactors were activated simultaneously, and participants were asked to indicate whether they experienced one or two stimulus locations. Participants reported the funneling illusion (experiencing one stimulus when two tactors were activated) mainly for the shortest inter-tactor distance. We discuss the significance of these results for the design of head-mounted vibrotactile displays and in relation to research on localization and funneling on different body surfaces.
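
    One simple way to quantify the centralizing bias reported in Experiment I (a sketch with invented data, not the paper's analysis) is to compare each response's distance from the midline with that of the stimulated tactor:

```python
# Hypothetical sketch: measuring a centralizing bias for a 1-by-7 tactor
# array. Tactor positions are indexed -3..3 with 0 at the forehead midline;
# a response closer to 0 than the stimulated tactor counts as centralizing.
# The trial data below are invented for illustration.

def centralizing_shift(stimulus, response):
    """Positive values mean the response shifted toward the midline (index 0)."""
    return abs(stimulus) - abs(response)

trials = [(-3, -2), (3, 2), (-2, -2), (2, 1)]  # (stimulated tactor, response)
shifts = [centralizing_shift(s, r) for s, r in trials]
mean_shift = sum(shifts) / len(shifts)
print(mean_shift)  # 0.75 -> mean shift > 0 indicates a bias toward the midline
```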

    The Vibrotactile Experience of the HOME Button on Smartphones

    No full text
    The vibration of the virtual HOME button is very important for smartphone users. To understand the user experience of different vibration modes of the HOME button, we designed two experiments to study this issue. Study 1 compared four different HOME buttons that were experienced either in or out of visual sight. The results showed that the perceived intensity was the key factor related to the tactile experience of the HOME button, regardless of the particular vibration mode. Study 2 explored the influence of vibration intensity on users' tactile experiences. The results showed that the frequency and amplitude of the vibration had a significant positive relationship with the overall evaluation of the tactile experience. More importantly, this effect was mediated by the perceived intensity. These results have implications for designing vibration modes that satisfy the needs of smartphone users. © 2019, Springer Nature Switzerland AG.
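
    The mediation claim above (vibration amplitude affects the overall evaluation through perceived intensity) can be illustrated with a product-of-coefficients sketch. This is a toy example with invented data, not the study's statistical analysis, and it omits the significance testing a real mediation analysis would require.

```python
# Hypothetical sketch of the mediation logic: amplitude (X) -> perceived
# intensity (M) -> overall evaluation (Y). The indirect effect is the
# product of the X->M and M->Y slopes. All data below are invented.

def slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

amplitude = [1, 2, 3, 4, 5]              # X: vibration amplitude levels
perceived = [1.2, 2.1, 2.9, 4.2, 5.0]    # M: perceived intensity ratings
rating    = [1.0, 2.0, 3.1, 4.0, 5.1]    # Y: overall evaluation ratings

a = slope(amplitude, perceived)  # X -> M path
b = slope(perceived, rating)     # M -> Y path
indirect = a * b
print(indirect)  # positive -> amplitude's effect runs through perceived intensity
```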