    Serial search for fingers of the same hand but not for fingers of different hands

    In most haptic search tasks, tactile stimuli are presented to the fingers of both hands. In such tasks, the search pattern for some object features, such as the shape of raised line symbols, has been found to be serial. The question is whether this search is serial over all fingers irrespective of the hand, or whether it is serial over the fingers of each hand and parallel over the two hands. To investigate this issue, we determined the speed of static haptic search when two items are presented to two fingers of the same hand and when two items are presented to two fingers of different hands. We compared the results with predictions for parallel and serial search based on the results of a previous study using the same items and a similar task. The results indicate that two fingers of the same hand process information in a serial manner, while two fingers of two different hands process information in parallel. Thus, considering the individual fingers as independent units in haptic search may not be justified, because the hand that they belong to matters. © 2009 Springer-Verlag
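
    The serial/parallel contrast drawn here has a standard quantitative signature: in a self-terminating serial search, expected search time grows with the number of items to be inspected, whereas in a parallel search it does not. The Python sketch below illustrates the kind of predictions being compared against; the baseline and per-item times are hypothetical placeholders, not values from the study or its predecessor.

        # Textbook search-model predictions, as a minimal illustration.
        # t_base and t_item are hypothetical parameters, not fitted values.

        def serial_prediction(n_items, t_base=0.4, t_item=0.8):
            """Self-terminating serial search: on average the target is
            found after inspecting (n_items + 1) / 2 items."""
            return t_base + t_item * (n_items + 1) / 2

        def parallel_prediction(n_items, t_base=0.4, t_item=0.8):
            """Parallel search: items are processed simultaneously, so
            predicted time does not grow with the number of items."""
            return t_base + t_item

        for n in (1, 2):
            print(f"{n} item(s): serial {serial_prediction(n):.2f} s, "
                  f"parallel {parallel_prediction(n):.2f} s")

    On this account, adding a second item should lengthen search times for fingers of the same hand (serial) but not for fingers of different hands (parallel), which is the pattern reported above.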

    Combining eye and hand in search is suboptimal

    When performing everyday tasks, we often move our eyes and hand together: we look where we are reaching in order to better guide the hand. This coordinated pattern, with the eye leading the hand, is presumably optimal behaviour. But eyes and hands can move to different locations if they are involved in different tasks. To find out whether this leads to optimal performance, we studied the combination of visual and haptic search. We asked ten participants to perform a combined visual and haptic search for a target that was present in both modalities, and compared their search times to those on visual-only and haptic-only search tasks. Without distractors, search times were faster for visual search than for haptic search. With many visual distractors, search times were longer for visual than for haptic search. For the combined search, performance was poorer than predicted by the optimal strategy, in which each modality searches a different part of the display. The results are consistent with several alternative accounts, for instance with vision and touch searching independently at the same time.
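
    The optimal strategy mentioned above can be made concrete with a simple coverage model: if each modality scans a disjoint part of the display at its own rate, total time is minimised by splitting the items in proportion to the two rates, so that both modalities finish together. A rough Python sketch, with hypothetical rates and set size (a real self-terminating search would add detail this model omits):

        # Coverage model of divided search. All numbers are hypothetical.

        def divided_search_time(n_items, share_visual,
                                rate_visual=8.0, rate_haptic=2.0):
            """Time to cover the display when vision searches a fraction
            share_visual of the items and touch searches the rest, in
            parallel (items-per-second rates are assumptions)."""
            t_visual = n_items * share_visual / rate_visual
            t_haptic = n_items * (1.0 - share_visual) / rate_haptic
            return max(t_visual, t_haptic)  # done when both parts are covered

        n = 20
        best_share = 8.0 / (8.0 + 2.0)  # split in proportion to the rates
        print(divided_search_time(n, best_share))  # 2.0 s: n / (rate_v + rate_h)
        print(divided_search_time(n, 1.0))         # 2.5 s: vision covers everything
        print(divided_search_time(n, 0.0))         # 10.0 s: touch covers everything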

    True as Touch? An Investigation in Tactile Information Processing

    Smeets, J.B.J. [Promotor]; Brenner, E. [Copromotor]

    Rewarding imperfect performance reduces adaptive changes

    Could a pat on the back affect motor adaptation? Recent studies indeed suggest that rewards can boost motor adaptation. However, the rewards used were typically reward gradients that carried quite detailed information about performance. We investigated whether simple binary rewards affected how participants learned to correct for a visual rotation of performance feedback in a 3D pointing task. To do so, we asked participants to align their unseen hand with virtual target cubes in alternating blocks with and without spatial performance feedback. Forty participants were assigned to one of two groups: a ‘spatial only’ group, in which the feedback consisted of showing the (perturbed) endpoint of the hand, or a ‘spatial & reward’ group, in which a reward could be received in addition to the spatial feedback. In addition, six participants were tested in a ‘reward only’ group. Binary reward was given when the participants’ hand landed in a virtual ‘hit area’ that was adapted to individual performance so as to reward about half the trials. The results show a typical pattern of adaptation in both the ‘spatial only’ and the ‘spatial & reward’ groups, whereas the ‘reward only’ group was unable to adapt. The rewards did not affect the overall pattern of adaptation in the ‘spatial & reward’ group. However, on a trial-by-trial basis, the rewards reduced adaptive changes to spatial errors. Electronic supplementary material: the online version of this article (doi:10.1007/s00221-015-4540-1) contains supplementary material, which is available to authorized users.
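
    The hit area that was adapted to individual performance to reward about half the trials implies some adaptive procedure; the abstract does not give the rule, but a one-up one-down staircase, which converges on a 50% hit rate, is a minimal stand-in. In the Python sketch below, the pointing-error distribution, starting radius, and step size are all hypothetical.

        # One-up one-down staircase on the hit-area radius (assumed rule,
        # not the paper's documented procedure).
        import random

        def run_staircase(n_trials=200, radius=3.0, step=0.2):
            hits = 0
            for _ in range(n_trials):
                error = abs(random.gauss(0.0, 3.0))  # hypothetical error (cm)
                if error <= radius:
                    hits += 1
                    radius -= step  # hit: shrink the area, harder to reward
                else:
                    radius += step  # miss: grow the area, easier to reward
            return hits / n_trials, radius

        rate, final_radius = run_staircase()
        print(f"reward rate ~ {rate:.2f}, final radius ~ {final_radius:.2f} cm")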

    Haptic search with finger movements: using more fingers does not necessarily reduce search times

    Two haptic serial search tasks were used to investigate how the separations between items, and the number of fingers used to scan them, influence the search time and search strategy. In both tasks, participants had to search for a target (cross) among a fixed number of non-targets (circles). The items were placed in a straight line. The target’s position was varied within blocks, and inter-item separation was varied between blocks. In the first experiment, participants used their index finger to scan the display. As expected, search time depended on target position as well as on item separation. For larger separations, participants’ movements were jerky, resembling ‘saccades’ and ‘fixations’, while for the shortest separation the movements were smooth. When only considering time in contact with an item, search times were the same for all separation conditions. Furthermore, participants never continued their movement after they encountered the target. These results suggest that participants did not use the time during which they were moving between the items to process information about the items. The search times were a little shorter than those in a static search experiment (Overvliet et al. in Percept Psychophys, 2007a), where multiple items were presented to the fingertips simultaneously. To investigate whether this is because the finger was moving or because only one finger was stimulated, we conducted a second experiment in which we asked participants to put three fingers in line and use them together to scan the items. Doing so increased the time in contact with the items for all separations, so search times were presumably longer in the static search experiment because multiple fingers were involved. This may be caused by the time that it takes to switch from one finger to the other.
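
    The inference that only time in contact with the items counts as processing amounts to decomposing total search time into a contact term plus a travel term that scales with separation. A Python sketch under that assumption (all parameter values are hypothetical):

        # Contact-plus-travel decomposition of a moving haptic search.
        # t_contact and scanning speed are assumptions, not fitted values.

        def moving_search_time(target_position, separation_cm,
                               t_contact=0.5, speed_cm_per_s=10.0):
            """Self-terminating scan along a line of items: the finger
            touches target_position items and crosses target_position - 1
            gaps between them."""
            contact = target_position * t_contact
            travel = (target_position - 1) * separation_cm / speed_cm_per_s
            return contact, contact + travel

        for sep in (1.0, 4.0):
            contact, total = moving_search_time(target_position=4,
                                                separation_cm=sep)
            print(f"separation {sep} cm: total {total:.2f} s, "
                  f"contact only {contact:.2f} s")

    Widening the separation stretches the total time through the travel term while leaving the contact-only component unchanged, which matches the first experiment's finding that contact-only search times were the same across separations.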

    Child development and the role of visual experience in the use of spatial and non-spatial features in haptic object perception

    Previous work has suggested a different developmental timeline and role of visual experience for the use of spatial and non-spatial features in haptic object recognition. To investigate this conjecture, we used a haptic ambiguous odd-one-out task in which one object needed to be selected as being different from two other objects. The odd-one-out could be selected based on four characteristics: size and shape (spatial), and texture and weight (non-spatial). We tested sighted children from 4 to 12 years of age; congenitally blind, late blind, and low-vision adults; and normally sighted adults. Given the protracted developmental time course for spatial perception, we expected a shift from a preference for non-spatial features toward spatial features during typical development. Due to the dominant influence of vision on spatial perception, we expected congenitally blind adults to show a preference for non-spatial features similar to that of the youngest children. The results confirmed our first hypothesis: the 4-year-olds demonstrated a lower dominance of spatial features for object classification compared with older children and sighted adults. In contrast, our second hypothesis was not confirmed; congenitally blind adults’ preferred categorization criteria were indistinguishable from those of sighted controls. These findings suggest an early development, but late maturation, of spatial processing in haptic object recognition, independent of visual experience.