114 research outputs found

    Categorical perception of tactile distance

    The tactile surface forms a continuous sheet covering the body. And yet, the perceived distance between two touches varies across stimulation sites. Perceived tactile distance is larger when stimuli cross the wrist than when both fall on either the hand or the forearm. This effect could reflect a categorical distortion of tactile space across body-part boundaries (in which stimuli crossing the wrist boundary are perceptually elongated) or may simply reflect a localised increase in acuity surrounding anatomical landmarks (in which stimuli near the wrist are perceptually elongated). We tested these two interpretations by comparing a well-documented bias to perceive mediolateral tactile distances across the forearm/hand as larger than proximodistal ones along the forearm/hand at three different sites (hand, wrist, and forearm). According to the ‘categorical’ interpretation, tactile distances should be elongated selectively in the proximodistal axis, thus reducing the anisotropy. According to the ‘localised acuity’ interpretation, distances should be perceptually elongated in the vicinity of the wrist regardless of orientation, increasing overall perceived size without affecting the anisotropy. Consistent with the categorical account, we found a reduction in the magnitude of anisotropy at the wrist, with no evidence of a corresponding specialised increase in precision. These findings demonstrate that touch is referenced to a representation of the body that is categorically segmented into discrete parts, which consequently influences the perception of tactile distance.

    Songbird dynamics under the sea: acoustic interactions between humpback whales suggest song mediates male interactions

    © The Author(s), 2018. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Royal Society Open Science 5 (2018): 171298, doi:10.1098/rsos.171298. The function of song has been well studied in numerous taxa and plays a role in mediating both intersexual and intrasexual interactions. Humpback whales are among the few mammals that sing, but the role of sexual selection on song in this species is poorly understood. While one predominant hypothesis is that song mediates male–male interactions, the mechanism by which this may occur has never been explored. We applied metrics typically used to assess songbird interactions to examine song sequences and movement patterns of humpback whale singers. We found that males altered their song presentation in the presence of other singers; focal males increased the rate at which they switched between phrase types (p = 0.005), and tended to increase the overall evenness of their song presentation (p = 0.06) after a second male began singing. Two-singer dyads overlapped their song sequences significantly more than expected by chance. Spatial analyses revealed that the change in distance between singers was related to whether both males kept singing (p = 0.012), with close approaches leading to song cessation. Overall, these acoustic interactions resemble known mechanisms of mediating intrasexual interactions in songbirds. Future work should focus on more precisely resolving how changes in song presentation may be used in competition between singing males. D.M.C. was supported by an EPA Science to Achieve Results (STAR) Fellowship for PhD research.
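    The phrase-switching rate and presentation evenness reported above are diversity-style metrics borrowed from songbird research. A minimal sketch of how such measures could be computed from a sequence of phrase-type labels (assuming a simple transition count and a Pielou-style normalised entropy; the paper's exact formulas are not given here):

```python
from collections import Counter
from math import log

def switch_rate(phrases):
    """Fraction of transitions in which the phrase type changes."""
    switches = sum(a != b for a, b in zip(phrases, phrases[1:]))
    return switches / (len(phrases) - 1)

def pielou_evenness(phrases):
    """Shannon entropy of phrase-type frequencies, normalised by log(k)."""
    counts = Counter(phrases)
    n = len(phrases)
    h = -sum((c / n) * log(c / n) for c in counts.values())
    k = len(counts)
    return h / log(k) if k > 1 else 0.0

song = ["A", "A", "B", "C", "A", "B"]   # hypothetical phrase-type sequence
print(switch_rate(song))                # 4 changes over 5 transitions -> 0.8
print(pielou_evenness(song))            # near 1 when phrase types are used evenly
```

    A focal male raising his switching rate or evenness after a second singer starts would register as increases in these two numbers across recording windows.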

    A supramodal representation of the body surface

    The ability to accurately localize both tactile and painful sensations on the body is one of the most important functions of the somatosensory system. Most accounts of localization refer to the systematic spatial relation between skin receptors and cortical neurons. The topographic organization of somatosensory neurons in the brain provides a map of the sensory surface. However, systematic distortions in perceptual localization tasks suggest that localizing a somatosensory stimulus involves more than simply identifying specific active neural populations within a somatotopic map. Thus, perceptual localization may depend on both afferent inputs and other unknown factors. In four experiments, we investigated whether localization biases vary according to the specific skin regions and subset of afferent fibers stimulated. We represented localization errors as a ‘perceptual map’ of skin locations. We compared the perceptual maps of stimuli that activate Aβ (innocuous touch), Aδ (pinprick pain), and C fibers (non-painful heat) on both the hairy and glabrous skin of the left hand. Perceptual maps exhibited systematic distortions that strongly depended on the skin region stimulated. We found systematic distal and radial (i.e., towards the thumb) biases in localization of touch, pain, and heat on the hand dorsum. A less consistent proximal bias was found on the palm. These distortions were independent of the population of afferent fibers stimulated, and also independent of the response modality used to report localization. We argue that these biases are likely to have a central origin, and result from a supramodal representation of the body surface.
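    A perceptual map of the kind described can be reduced to a systematic bias by averaging the displacement between actual and perceived stimulus positions; a nonzero mean displacement is exactly a distal or radial bias. A sketch with made-up coordinates (the axes and values are illustrative, not the paper's data):

```python
# Quantify a systematic localisation bias as the mean displacement between
# actual and perceived stimulus positions on the hand dorsum.
# x = radial axis (towards the thumb, positive), y = proximo-distal axis
# (distal positive). All coordinates below are hypothetical.
actual    = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
perceived = [(0.2, 0.5), (1.1, 0.6), (0.3, 1.4), (1.2, 1.5)]

n = len(actual)
bias_x = sum(p[0] - a[0] for a, p in zip(actual, perceived)) / n
bias_y = sum(p[1] - a[1] for a, p in zip(actual, perceived)) / n
print(bias_x, bias_y)  # a positive pair indicates a radial and a distal bias
```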

    Visual detail about the body modulates tactile localisation biases

    The localisation of tactile stimuli requires the integration of visual and somatosensory inputs within an internal representation of the body surface, and is prone to consistent bias. Joints may play a role in segmenting such internal body representations, and may therefore influence tactile localisation biases, although the nature of this influence remains unclear. Here, we investigate the relationship between conceptual knowledge of joint locations and tactile localisation biases on the hand. In one task, participants localised tactile stimuli applied to the dorsum of their hand. A distal localisation bias was observed in all participants, consistent with previous results. We also manipulated the availability of visual information during this task, to determine whether the absence of this information could account for the distal bias observed here and by Mancini and colleagues (2011). The observed distal bias increased in magnitude when visual information was restricted, without a corresponding decrease in precision. In a separate task, the same participants indicated, from memory, knuckle locations on a silhouette image of their hand. Analogous distal biases were also seen in the knuckle localisation task. The accuracy of conceptual joint knowledge was not correlated with tactile localisation bias magnitude, although a similarity in observed bias direction suggests that both tasks may rely on a common, higher-order body representation. These results also suggest that distortions of conceptual body representation may be more common in healthy individuals than previously thought.

    No correlation between distorted body representations underlying tactile distance perception and position sense

    Both tactile distance perception and position sense are believed to require that immediate afferent signals be referenced to a stored representation of body size and shape (the body model). For both of these abilities, recent studies have reported that the stored body representations involved are highly distorted, at least in the case of the hand, with the hand dorsum represented as wider and squatter than it actually is. Here, we investigated whether individual differences in the magnitude of these distortions are shared between tactile distance perception and position sense, as would be predicted by the hypothesis that a single distorted body model underlies both tasks. We used established tasks to measure distortions of the represented shape of the hand dorsum. Consistent with previous results, in both cases there were clear biases to overestimate distances oriented along the medio-lateral axis of the hand compared to the proximo-distal axis. Moreover, within each task there were clear split-half correlations, demonstrating that both tasks show consistent individual differences. Critically, however, there was no correlation between the magnitudes of distortion in the two tasks. This casts doubt on the proposal that a common body model underlies both tactile distance perception and position sense.
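    A split-half correlation of the sort reported here checks whether a task measures a stable individual trait: distortion estimates from one half of each participant's trials should correlate with estimates from the other half. A sketch with simulated data (the trait and noise magnitudes are illustrative assumptions, not the paper's values):

```python
import random
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

random.seed(0)
# Each participant's distortion (e.g. a medio-lateral/proximo-distal
# overestimation ratio) is a stable trait plus per-half measurement noise.
traits = [random.gauss(1.4, 0.2) for _ in range(30)]
half1 = [t + random.gauss(0, 0.05) for t in traits]  # estimate from odd trials
half2 = [t + random.gauss(0, 0.05) for t in traits]  # estimate from even trials
print(pearson(half1, half2))  # high: the task picks up a consistent trait
```

    A high within-task split-half correlation alongside a near-zero between-task correlation is the pattern that argues against a single shared body model.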

    Mind the gap: the effects of temporal and spatial separation in localization of dual touches on the hand

    In this study, we aimed to relate the findings from two predominantly separate streams of literature, one reporting on the localisation of single touches on the skin, and the other on the distance perception of dual touches. Participants were touched with two points, delivered either simultaneously or separated by a short delay, at various locations on their left hand dorsum. They then indicated on a size-matched hand silhouette the perceived locations of the tactile stimuli. We quantified the deviations between the actual stimulus grid and the corresponding perceptual map constructed from the perceived tactile locations, and we calculated the precision of tactile localisation (i.e. the variability across localisation attempts). The evidence showed that the dual touches, akin to single-touch stimulations, were mislocalised distally and that their variable localisation error was reduced near joints, particularly near the knuckles. However, contrary to the single-touch localisation literature, we observed the dual touches to be mislocalised towards the ulnar side of the hand, particularly when they were presented sequentially. Further, touches presented in a sequential order were slightly ‘repelled’ from each other and their perceived distance increased, while simultaneous tactile pairs were localised closer to each other and their distance was compressed. Whereas the sequential touches may have been localised with reference to the body, the compression of tactile perceptual space for simultaneous touches has been related in previous literature to signal summation and inhibition and to low-level factors, including the innervation density and receptive-field properties of somatosensory neurons.

    A critical experimental study of the classical tactile threshold theory

    Background: The tactile sense is used in a variety of applications involving tactile human-machine interfaces. In a significant number of publications the classical threshold concept plays a central role in modelling and explaining psychophysical experimental results, such as stochastic resonance (SR) phenomena. In SR, noise enhances detection of sub-threshold stimuli, and the phenomenon is explained by stating that the amplitude required to exceed the sensory threshold barrier can be reached by adding noise to a sub-threshold stimulus. We designed an experiment to test the validity of the classical vibrotactile threshold. Using a second-choice experiment, we show that individuals can order sensorial events below the level known as the classical threshold. If the observer's sensorial system were not activated by stimuli below the threshold, a second choice could not be above chance level. Nevertheless, our experimental results are above that chance level, contradicting the definition of the classical tactile threshold. Results: We performed a three-alternative forced-choice detection experiment on 6 subjects, asking them for first and second choices. In each trial, only one of the intervals contained a stimulus and the others contained only noise. According to the classical threshold assumptions, a correct second-choice response corresponds to a guess attempt with a statistical frequency of 50%. Results show an average of 67.35% (STD = 1.41%) for the second-choice response, which is not explained by the classical threshold definition. Additionally, for low stimulus amplitudes, correct second-choice detection is above chance level for any detectability level. Conclusions: Using a second-choice experiment, we show that individuals can order sensorial events below the level known as the classical threshold. If the observer's sensorial system were not activated by stimuli below the threshold, a second choice could not be above chance level. Nevertheless, our experimental results are above that chance level. Therefore, if detection exists below the classical threshold level, then models explaining the SR phenomenon, or any other tactile perception phenomenon, on the basis of the classical psychophysical threshold are not valid. We conclude that a more suitable model of the tactile sensory system is needed.
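    The 50% second-choice prediction follows directly from the classical threshold model: if a sub-threshold stimulus conveys no information, then after an incorrect first choice the second choice is a pure guess between the two remaining intervals. A quick simulation of that null model (trial count and seed are illustrative):

```python
import random

random.seed(1)
trials = 100_000
correct_second = 0
asked = 0
for _ in range(trials):
    target = random.randrange(3)       # interval containing the stimulus
    first = random.randrange(3)        # uninformative guess under the threshold model
    if first == target:
        continue                       # second choice is only scored after a miss
    remaining = [i for i in range(3) if i != first]
    second = random.choice(remaining)  # again a pure guess between two intervals
    asked += 1
    correct_second += (second == target)
print(correct_second / asked)  # ~0.50: the classical-threshold prediction
```

    The observed 67.35% second-choice rate sits well above this simulated null, which is the core of the paper's argument against the classical threshold.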

    A brain-computer interface with vibrotactile biofeedback for haptic information

    Background: It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of a BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Here, operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. Methods: A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Results and Conclusion: Subjects were able to control the BCI using only vibrotactile feedback with an average accuracy of 56%, and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subjects had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality for operating a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.

    Tactile localization biases are modulated by gaze direction

    Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations with different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participant's hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tips of the fingers (distal bias) and the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulates the internal configuration of perceived locations, elongating it along the radio-ulnar axis.