13 research outputs found
Keeping Safe: Intra-individual Consistency in Obstacle Avoidance Behaviour Across Grasping and Locomotion Tasks
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Jutta Billino was supported by grants from the German Research Foundation, Collaborative Research Centre SFB/TRR 135: Cardinal Mechanisms of Perception.
Decision making in slow and rapid reaching: Sacrificing success to minimize effort
Acknowledgement: This work was supported by the James S. McDonnell Foundation (Scholar Award to ARH). Supplementary material: Data available at https://zenodo.org/record/3604284
Decision making in slow and rapid reaching: Sacrificing success to minimize effort
Current studies on visuomotor decision making come to inconsistent conclusions regarding the optimality with which these decisions are made. When executing rapid reaching movements under uncertainty, humans seem to automatically select optimal movement paths that take into account the positions of all potential targets (spatial averaging). In contrast, humans rarely employ optimal strategies when deciding whether to pursue two action goals simultaneously or prioritise one goal over another. Here, we manipulated whether spatial averaging or pre-selection of a single target would provide the optimal strategy by varying the spatial separation between two potential movement targets as well as the time available for movement execution. Experiment 1 provided a generous amount of time to reach the final target. Participants tended to pre-select the target that was easiest to reach and correct their movement path in-flight if required. In Experiment 2, a strict time limit was set, such that the optimal strategy depended on the separation between the potential targets: for small separations, there was sufficient time to employ averaging strategies, but higher success for larger separations required pre-selecting the final target instead. However, none of our participants adjusted their movement strategies to the spatial separation. In Experiment 3, we confirmed the bias towards targets that are easiest to reach and showed that it comes at the expense of overall task success. The results suggest a strong tendency to minimize immediate movement effort, and a failure to flexibly adjust movement strategies to maximize the probability of success.
The effect of visuohaptic discrepancy on perceived surface roughness: Partial replication
This work was supported by a doctoral scholarship from the Biotechnology and Biological Sciences Research Council (BBSRC) awarded to Karina Kangur [grant number RT10106-10].
Crossmodal texture perception is illumination-dependent
Data files for: Kangur, K., Giesel, M., Harris, J. M., & Hesse, C. (2022). Crossmodal texture perception is illumination-dependent. Multisensory Research, 36(1), 75-91. https://doi.org/10.1163/22134808-bja1008
Crossmodal texture perception is illumination-dependent
Visually perceived roughness of 3D textures varies with illumination direction. Surfaces appear rougher when the illumination angle is lowered, resulting in a lack of roughness constancy. Here we aimed to investigate whether the visual system also relies on illumination-dependent features when judging roughness in a crossmodal matching task or whether it can access illumination-invariant surface features that can also be evaluated by the tactile system. Participants (N = 32) explored an abrasive paper of medium physical roughness either tactually, or visually under two different illumination conditions (top vs oblique angle). Subsequently, they had to judge whether a comparison stimulus (varying in physical roughness) matched the previously explored standard. Matching was performed either using the same modality as during exploration (intramodal) or using a different modality (crossmodal). In the intramodal conditions, participants performed equally well independent of the modality or illumination employed. In the crossmodal conditions, participants selected rougher tactile matches after exploring the standard visually under oblique illumination than under top illumination. Conversely, after tactile exploration, they selected smoother visual matches under oblique than under top illumination. These findings confirm that visual roughness perception depends on illumination direction and show, for the first time, that this failure of roughness constancy also transfers to judgements made crossmodally.