
    EyePACT: eye-based parallax correction on touch-enabled interactive displays

    The parallax effect describes the displacement between the perceived and detected touch locations on a touch-enabled surface. Parallax is a key usability challenge for interactive displays, particularly for those that require thick layers of glass between the screen and the touch surface to protect them from vandalism. To address this challenge, we present EyePACT, a method that compensates for input error caused by parallax on public displays. Our method uses a display-mounted depth camera to detect the user's 3D eye position in front of the display and the detected touch location to predict the perceived touch location on the surface. We evaluate our method in two user studies in terms of parallax correction performance as well as multi-user support. Our evaluations demonstrate that EyePACT (1) significantly improves accuracy even with varying gap distances between the touch surface and the display, (2) adapts to different levels of parallax by resulting in significantly larger corrections with larger gap distances, and (3) maintains a significantly large distance between two users' fingers when interacting with the same object. These findings are promising for the development of future parallax-free interactive displays.
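    The correction the abstract describes reduces to a line–plane intersection: the intended target is where the sight line from the user's eye through the detected touch point meets the display plane. A minimal sketch of that geometry (hypothetical coordinates and function name, not the authors' implementation; the display is taken as the plane z = 0 and the protective glass as the plane z = gap, with the eye at z > gap):

    ```python
    def correct_parallax(eye, touch, gap):
        """Project the sight line from the 3D eye position through the
        detected touch point (on the glass plane z = gap) onto the
        display plane z = 0, returning the perceived (x, y) target."""
        ex, ey, ez = eye               # eye position, e.g. from a depth camera
        tx, ty = touch                 # detected touch location on the glass
        t = ez / (ez - gap)            # ray parameter at which z reaches 0
        return (ex + t * (tx - ex), ey + t * (ty - ey))
    ```

    Under these assumptions, with the eye 500 mm from the display and a 10 mm gap, a touch detected at x = 100 mm projects to roughly x ≈ 102 mm on the display; the displacement vanishes when the eye is directly above the touch and grows with the gap.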

    Chronic Use of a Sensitized Bionic Hand Does Not Remap the Sense of Touch

    Electrical stimulation of tactile nerve fibers that innervated an amputated hand results in vivid sensations experienced at a specific location on the phantom hand, a phenomenon that can be leveraged to convey tactile feedback through bionic hands. Ideally, electrically evoked sensations would be experienced on the appropriate part of the hand: touch with the bionic index fingertip, for example, would elicit a sensation experienced on the index fingertip. However, the perceived locations of sensations are determined by the idiosyncratic position of the stimulating electrode in the nerve and thus are difficult to predict or control. This problem could be circumvented if perceived sensations shifted over time to become consistent with the position of the sensor that triggers them. We show that, after long-term use of a neuromusculoskeletal prosthesis that featured a mismatch between the sensor location and the resulting tactile experience, the perceived location of the touch did not change.

    Seeing the body distorts tactile size perception

    Vision of the body modulates somatosensation, even when entirely non-informative about stimulation. For example, seeing the body increases tactile spatial acuity, but reduces acute pain. While previous results demonstrate that vision of the body modulates somatosensory sensitivity, it is unknown whether vision also affects metric properties of touch, and if so how. This study investigated how non-informative vision of the body modulates tactile size perception. We used the mirror box illusion to induce the illusion that participants were directly seeing their stimulated left hand, though they actually saw their reflected right hand. We manipulated whether participants: (a) had the illusion of directly seeing their stimulated left hand, (b) had the illusion of seeing a non-body object at the same location, or (c) looked directly at their non-stimulated right hand. Participants made verbal estimates of the perceived distance between two tactile stimuli presented simultaneously to the dorsum of the left hand, either 20, 30, or 40 mm apart. Vision of the body significantly reduced the perceived size of touch, compared to vision of the object or of the contralateral hand. In contrast, no apparent changes of perceived hand size were found. These results show that seeing the body distorts tactile size perception.

    Effects of Gaze Position on Touch Localization

    Previous research has shown that the direction of gaze relative to the body affects the perceived location of touch, and has argued that these effects indicate that a gaze-centered reference frame is used for touch localization. In this dissertation I examine a discrepancy in the existing literature: why do different studies report opposite directions of effects when eye and head positions are manipulated separately? I resolve this discrepancy by showing that it is not due to whether eye or head position is manipulated (chapter 2) but is in fact due to the nature of the task (chapter 3). I also find that the effect occurs on the back of the body (chapter 4), a body part that is not normally in view and thus would be less likely to use gaze as a reference point. I test theories for why these effects occur (chapter 5), and find that results are compatible with the perceived location of a touch being attracted towards the location of gaze, at least for perceptual measures. When location was reported by pointing, an action-based measure, I find no effect of gaze direction on touch localization, suggesting that a gaze-independent reference frame is used for action. These behavioral results are complementary to recent neurophysiological and neuroimaging findings indicating that spatial locations are coded in a range of different reference frames, and indicate that gaze-related reference frames are behaviorally relevant in tactile localization.

    Vicarious ratings of social touch reflect the anatomical distribution & velocity tuning of C-tactile afferents: A Hedonic Homunculus?

    A subclass of C-fibres, C-tactile afferents (CTs), have been discovered which respond preferentially to low force/velocity stroking touch, that is typically perceived as pleasant. Molecular genetic visualization of these low-threshold mechanosensitive C-fibres (CLTMs) in mice revealed a denser distribution in dorsal than ventral thoracic sites, scattered distal limb innervation and a complete absence from glabrous paw skin (Liu et al., 2007). Here we used third-party ratings to examine whether affective responses to social touch reflect the anatomical distribution and velocity tuning of CTs. Participants viewed and rated a sequence of video clips depicting one individual being touched by another at different skin sites and at 3 different velocities (static, 3 cm/s, 30 cm/s). Immediately after viewing each clip participants were asked to rate how pleasant they perceived the touch to be. Vicarious preferences matched the previously reported anatomical innervation density of rodent CLTMs, with touch on the back being rated significantly more pleasant than any other location. Furthermore, in contrast to all other skin sites, CT optimal (3 cm/s) touch on the palm of the hand was not preferred to static touch, consistent with the anatomical absence of CTs in glabrous skin. Our findings demonstrate that humans recognise the specific rewarding value of CT optimal caressing touch and their preferences reflect the hypothesised anatomical distribution of CTs.

    More than skin-deep: integration of skin-based and musculo-skeletal reference frames in localisation of touch

    The skin of the forearm is, in one sense, a flat 2D sheet, but in another sense approximately cylindrical, mirroring the 3D volumetric shape of the arm. The role of frames of reference based on the skin as a 2D sheet versus based on the musculo-skeletal structure of the arm remains unclear. When we rotate the forearm from a pronated to a supinated posture, the skin on its surface is displaced. Thus, a marked location will slide with the skin across the underlying flesh, and the touch perceived at this location should follow this displacement if it is localised within a skin-based reference frame. We investigated, however, if the perceived tactile locations were also affected by the rearrangement in underlying musculo-skeletal structure, i.e. displaced medially and laterally on a pronated and supinated forearm, respectively. Participants pointed to perceived touches (Experiment 1), or marked them on a three-dimensional size-matched forearm on a computer screen (Experiment 2). The perceived locations were indeed displaced medially after forearm pronation in both response modalities. This misperception was reduced (Experiment 1), or absent altogether (Experiment 2), in the supinated posture when the actual stimulus grid moved laterally with the displaced skin. The grid was perceptually stretched along the medial-lateral axis and displaced distally, which suggests the influence of skin-based factors. Our study extends the tactile localisation literature focused on the skin-based reference frame and on the effects of spatial positions of body parts by implicating musculo-skeletal factors in the localisation of touch on the body.

    Perception of 3-D Location Based on Vision, Touch, and Extended Touch

    Perception of the near environment gives rise to spatial images in working memory that continue to represent the spatial layout even after cessation of sensory input. As the observer moves, these spatial images are continuously updated. This research is concerned with (1) whether spatial images of targets are formed when they are sensed using extended touch (i.e., using a probe to extend the reach of the arm) and (2) the accuracy with which such targets are perceived. In Experiment 1, participants perceived the 3-D locations of individual targets from a fixed origin and were then tested with an updating task involving blindfolded walking followed by placement of the hand at the remembered target location. Twenty-four target locations, representing all combinations of two distances, two heights, and six azimuths, were perceived by vision or by blindfolded exploration with the bare hand, a 1-m probe, or a 2-m probe. Systematic errors in azimuth were observed for all targets, reflecting errors in representing the target locations and updating. Overall, updating after visual perception was best, but the quantitative differences between conditions were small. Experiment 2 demonstrated that auditory information signifying contact with the target was not a factor. Overall, the results indicate that 3-D spatial images can be formed of targets sensed by extended touch and that perception by extended touch, even out to 1.75 m, is surprisingly accurate.

    The Neural Basis of Somatosensory Remapping Develops in Human Infancy

    When we sense a touch, our brains take account of our current limb position to determine the location of that touch in external space [1, 2]. Here we show that changes in the way the brain processes somatosensory information in the first year of life underlie the origins of this ability [3]. In three experiments we recorded somatosensory evoked potentials (SEPs) from 6.5-, 8-, and 10-month-old infants while presenting vibrotactile stimuli to their hands across uncrossed- and crossed-hands postures. At all ages we observed SEPs over central regions contralateral to the stimulated hand. Somatosensory processing was influenced by arm posture from 8 months onward. At 8 months, posture influenced mid-latency SEP components, but by 10 months effects were observed at early components associated with feed-forward stages of somatosensory processing. Furthermore, sight of the hands was a necessary prerequisite for somatosensory remapping at 10 months. Thus, the cortical networks [4] underlying the ability to dynamically update the location of a perceived touch across limb movements become functional during the first year of life. Up until at least 6.5 months of age, it seems that human infants’ perceptions of tactile stimuli in the external environment are heavily dependent upon limb position.

    Object-Guided Spatial Selection in Touch Without Concurrent Changes in the Perceived Location of the Hands

    In an endogenous cueing paradigm with central visual cues, observers made speeded responses to tactile targets at the hands, which were either close together or far apart, and holding either two separate objects or one common object between them. When the hands were far apart, the response time costs associated with attending to the wrong hand were reduced when attention had to be shifted along one object jointly held by both hands compared to when it was shifted over the same distance but across separate objects. Similar reductions in attentional costs were observed when the hands were placed closer together, suggesting that processing at one hand is less prioritized over that at another when the hands can be “grouped” by virtue of arising from the same spatial location or from the same object. Probes of perceived hand locations throughout the task showed that holding a common object decreased attentional separability without decreasing the perceived separation between the hands. Our findings suggest that tactile events at the hands may be represented in a spatial framework that flexibly adapts to (object-guided) attentional demands, while their relative coordinates are simultaneously preserved.

    “Pulling Telescoped Phantoms Out of the Stump”: Manipulating the Perceived Position of Phantom Limbs Using a Full-Body Illusion

    Most amputees experience phantom limbs, or the sensation that their amputated limb is still attached to the body. Phantom limbs can be perceived in the location previously occupied by the intact limb, or they can gradually retract inside the stump, a phenomenon referred to as “telescoping”. Telescoping is relevant from a clinical point of view, as it tends to be related to increased levels of phantom pain. In the current study we demonstrate how a full-body illusion can be used to temporarily revoke telescoping sensations in upper limb amputees. During this illusion participants view the body of a mannequin from a first person perspective while being subjected to synchronized visuo-tactile stimulation through stroking, which makes them experience the mannequin’s body as their own. In Experiment 1 we used an intact mannequin, and showed that amputees can experience ownership of an intact body as well as referral of touch from both hands of the mannequin. In Experiments 2 and 3 we used an amputated mannequin, and demonstrated that depending on the spatial location of the strokes applied to the mannequin, participants experienced their phantom hand either to remain telescoped or to actually be located below the stump. The effects were supported by subjective data from questionnaires, as well as verbal reports of the perceived location of the phantom hand in a visual judgment task. These findings are of particular interest, as they show that the temporary revoking of telescoping sensations does not necessarily have to involve the visualization of an intact hand or illusory movement of the phantom (as in the rubber hand illusion or mirror visual feedback therapy), but that it can also be obtained through mere referral of touch from the stump to the spatial location corresponding to that previously occupied by the intact hand. Moreover, our study also provides preliminary evidence that these manipulations can have an effect on phantom pain sensations.