14 research outputs found

    Where is my hand?: Proprioceptive position sense and its applications in haptics

    Smeets, J.B.J. [Promotor], Brenner, E.M. [Copromotor]

    Proprioception is Robust under External Forces

    Information from cutaneous, muscle and joint receptors is combined with efferent information to create a reliable percept of the configuration of our body (proprioception). We exposed the hand to several horizontal force fields to examine whether external forces influence this percept. In an end-point task, subjects reached visually presented positions with their unseen hand. In a vector reproduction task, subjects had to judge a distance and direction visually and reproduce the corresponding vector by moving the unseen hand. We found systematic individual errors in the reproduction of the end-points and vectors, but these errors did not vary systematically with the force fields. This suggests that human proprioception accounts for external forces applied to the hand when sensing the position of the hand in the horizontal plane.
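    A minimal sketch of the kind of per-condition comparison this abstract describes: if proprioception accounts for the external force, a subject's systematic end-point error should be roughly the same in every force field. The condition names and error values below are simulated placeholders, not data or analysis code from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated end-point errors (cm, x/y in the horizontal plane) for one subject
    # under three hypothetical force-field conditions; placeholder values only.
    conditions = ["null field", "leftward field", "rightward field"]
    errors = {c: rng.normal(loc=[1.5, -0.8], scale=0.5, size=(20, 2)) for c in conditions}

    # If proprioception accounts for the applied force, the mean (systematic) error
    # should be similar across conditions rather than varying with the field.
    for c in conditions:
        mx, my = errors[c].mean(axis=0)
        print(f"{c}: mean error = ({mx:.2f}, {my:.2f}) cm")
    ```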

    Errors in visuo-haptic and haptic-haptic location matching are stable over long periods of time

    People make systematic errors when they move their unseen dominant hand to a visual target (visuo-haptic matching) or to their other unseen hand (haptic-haptic matching). Why they make such errors is still unknown. A key question in determining the reason is to what extent individual participants' errors are stable over time. To examine this, we developed a method to quantify the consistency. With this method, we studied the stability of systematic matching errors across time intervals of at least a month. Within this time period, individual subjects' matches were as consistent as one could expect on the basis of the variability in the individual participants' performance within each session. Thus, individual participants make quite different systematic errors, but in similar circumstances they make the same errors across long periods of time.
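    The abstract does not spell out the consistency measure, so the sketch below only illustrates the comparison it alludes to: the shift in a participant's mean matching error between two sessions, set against the shift one would expect from within-session variability alone. All numbers and names are assumed for the example and are not the authors' method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated matching errors (cm) for one participant in two sessions a month
    # apart: an assumed stable personal bias plus within-session noise.
    bias = np.array([2.0, -1.0])
    session_a = bias + rng.normal(scale=0.7, size=(30, 2))
    session_b = bias + rng.normal(scale=0.7, size=(30, 2))

    # Shift of the mean error between the two sessions ...
    between = np.linalg.norm(session_a.mean(axis=0) - session_b.mean(axis=0))

    # ... compared with the shift expected from within-session variability alone
    # (standard errors of the two session means, combined per axis).
    sem_a = session_a.std(axis=0, ddof=1) / np.sqrt(len(session_a))
    sem_b = session_b.std(axis=0, ddof=1) / np.sqrt(len(session_b))
    expected = np.linalg.norm(np.hypot(sem_a, sem_b))

    print(f"between-session shift {between:.2f} cm vs. {expected:.2f} cm expected from noise")
    ```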

    Haptic guidance needs to be intuitive not just informative to improve human motor accuracy.

    Humans make both random and systematic errors when reproducing learned movements. Intuitive haptic guidance that assists one in making the movements reduces such errors. Our study examined whether any additional haptic information about the location of the target reduces errors in a position reproduction task, or whether the haptic guidance needs to be assistive to do so. Holding a haptic device, subjects made reaches to visible targets without time constraints. They did so in a no-guidance condition, and in guidance conditions in which the direction of the force with respect to the target differed, but the force scaled with the distance to the target in the same way. We examined whether guidance forces directed towards the target would reduce subjects' errors in reproducing a prior position to the same extent as forces rotated by 90 or 180 degrees, as they might because the forces provide the same information in all three cases. Without vision of the arm, both the accuracy and precision were significantly better with guidance directed towards the target than in all other conditions. The errors with rotated guidance did not differ from those without guidance. Not surprisingly, the movements tended to be faster when guidance forces directed the reaches to the target. This study shows that haptic guidance significantly improved motor performance when using it was intuitive, while non-intuitively presented information did not lead to any improvements and seemed to be ignored even in our simple paradigm with static targets and no time constraints.
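    A minimal sketch of the force scheme described in this abstract: a guidance force that scales with the distance to the target but whose direction relative to the target is rotated by 0, 90 or 180 degrees. The gain value and positions are assumptions for illustration, not parameters reported by the study.

    ```python
    import numpy as np

    def guidance_force(hand, target, angle_deg, gain=10.0):
        """Guidance force that scales with the distance to the target; its direction
        is the target direction rotated by angle_deg (gain in N/m is an assumed value)."""
        error = np.asarray(target, float) - np.asarray(hand, float)
        a = np.deg2rad(angle_deg)
        rot = np.array([[np.cos(a), -np.sin(a)],
                        [np.sin(a),  np.cos(a)]])
        return gain * rot @ error   # same magnitude in every condition, different direction

    hand, target = [0.10, 0.00], [0.20, 0.05]   # positions in metres
    for angle in (0, 90, 180):                  # towards the target, rotated, opposed
        print(angle, np.round(guidance_force(hand, target, angle), 3))
    ```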

    Evaluation of Haptic and Visual Cues for Repulsive or Attractive Guidance in Nonholonomic Steering Tasks.

    Remote control of vehicles is a difficult task for operators. Support systems that present additional task information may assist operators, but their usefulness is expected to depend on several factors such as 1) the nature of the conveyed information, 2) what modality it is conveyed through, and 3) the task difficulty. In an exploratory experiment, these three factors were manipulated to quantify their effects on operator behavior. Subjects (n = 15) used a haptic manipulator to steer a virtual nonholonomic vehicle through abstract environments, in which obstacles needed to be avoided. Both a simple support conveying near-future predictions of the trajectory of the vehicle and a more elaborate support that continuously suggests the path to be taken were designed (factor 1). These types of information were offered either with visual or haptic cues (factor 2). These four support systems were tested in four different abstracted environments with a decreasing amount of allowed variability in realized trajectories (factor 3). The results show improvements for the simple support only when this information was presented visually, but not when offered haptically. For the elaborate support, equally large improvements for both modalities were found. This suggests that the elaborate support is better: additional information is key in improving performance in nonholonomic steering tasks.
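    The abstract does not specify the vehicle model, so the sketch below uses a unicycle-type model as an assumed stand-in to show what "nonholonomic" means in this task: the vehicle can only move along its heading and turn, never translate sideways, which is what makes obstacle avoidance through steering difficult. All parameter values are assumptions.

    ```python
    import numpy as np

    def step(state, v, omega, dt=0.02):
        """One Euler step of a unicycle-type nonholonomic model (an assumed stand-in
        for the virtual vehicle): forward speed v along the heading, turn rate omega,
        and no sideways motion."""
        x, y, theta = state
        return np.array([x + v * np.cos(theta) * dt,
                         y + v * np.sin(theta) * dt,
                         theta + omega * dt])

    state = np.array([0.0, 0.0, 0.0])      # x (m), y (m), heading (rad)
    for _ in range(100):                   # constant speed, gentle steering input
        state = step(state, v=1.0, omega=0.3)
    print(np.round(state, 3))
    ```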

    Matching locations is not just matching sensory representations

    People make systematic errors when matching locations of an unseen index finger with the index finger of the other hand, or with a visual target. In this study, we present two experiments that test the consistency of such matching errors across different combinations of matching methods. In the first experiment, subjects had to move their unseen index fingers to visually presented targets. We examined the consistency between matching errors for the two hands and for different postures (hand above a board or below it). We found very little consistency: the matching error depends on the posture and differs between the hands. In the second experiment, we designed sets of tasks that involved the same matching configurations. For example, we compared matching errors when moving with the unseen index finger to a visual target with errors when moving a visual target to the unseen index finger. We found that matching errors are not invertible. Furthermore, moving both index fingers to the same visual target results in a different mismatch between the hands than directly matching the two index fingers. We conclude that the errors that we make when matching locations cannot arise only from systematic mismatches between sensory representations of the positions of the fingers and of visually perceived space. We discuss how these results can be interpreted in terms of sensory transformations that depend on the movement that needs to be made.

    Quantifying temporal ventriloquism in audio-visual rhythm perception

    We used a rhythm perception paradigm to quantify the effects of small temporal discrepancies between audio-visual stimulus pairs. In this paradigm, observers had to align the onset of a target stimulus (position 3) within a rhythmic sequence of four markers (positions 1, 2, 4, and 5). In Experiment 1, the modalities of the markers and targets were crossed in a 2×2 design. In unimodal conditions, the target was placed accurately for both audio (click) and visual (flash) conditions, but in bimodal conditions, there was a consistent 25–30 ms bias in target placement. In Experiment 2, the markers were bimodal with various stimulus onset asynchronies (SOAs) between the audio and visual components, and the targets were visual flashes. The results demonstrated temporal ventriloquism, in which adjustment of the visual target was affected by the timing of the audio components of the bimodal markers, even when observers were told to use the visual components only.
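    A minimal sketch of how a placement bias could be quantified in this paradigm: markers at positions 1, 2, 4 and 5 define an isochronous rhythm, the veridical onset of position 3 is its midpoint, and the bias is the mean deviation of the adjusted onsets from that midpoint. The inter-onset interval and the simulated settings are assumed values for illustration, not data from the study.

    ```python
    import numpy as np

    # Markers at rhythm positions 1, 2, 4 and 5; the 600 ms inter-onset interval
    # is an assumed value for illustration, not taken from the abstract.
    ioi = 600.0
    marker_times = np.array([0, 1, 3, 4]) * ioi   # onsets of positions 1, 2, 4, 5 (ms)
    veridical_target = 2 * ioi                    # isochronous onset of position 3

    # Simulated adjusted onsets for one bimodal condition, shifted by roughly the
    # 25-30 ms bias reported in the abstract; placeholder data only.
    rng = np.random.default_rng(2)
    adjusted = veridical_target + rng.normal(loc=28.0, scale=15.0, size=40)

    bias = adjusted.mean() - veridical_target     # positive = target placed too late
    print(f"mean placement bias: {bias:.1f} ms")
    ```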

    The intimacy of heartbeat communication

    Heartbeat communication is hypothesized to be an intimate cue. Moreover, as with other nonverbal cues, we expect that hearing someone’s heartbeat triggers unconscious nonverbal compensation strategies such as increasing interpersonal distance. In line with this, we found that hearing someone’s heartbeat increases the interpersonal distance that people keep. We conclude that heartbeat communication increases the feeling of intimacy and can therefore be employed in connectedness devices.
