    Catching a Ball at the Right Time and Place: Individual Factors Matter

    Intercepting a moving object requires accurate spatio-temporal control. Several studies have investigated how the CNS copes with such a challenging task, focusing on the nature of the information used to extract target motion parameters and on the identification of general control strategies. In the present study we provide evidence that the right time and place of the collision are not univocally specified by the CNS for a given target motion; instead, different but equally successful solutions can be adopted by different subjects when task constraints are loose. We characterized the arm kinematics of fourteen subjects and performed a detailed analysis of a subset of six subjects who showed comparable success rates when asked to catch a flying ball in three-dimensional space. Balls were projected by an actuated launching apparatus to obtain different arrival flight time and height conditions. Inter-individual variability was observed in several kinematic parameters, such as wrist trajectory, wrist velocity profile, timing and spatial distribution of the impact point, upper limb posture, trunk motion, and submovement decomposition. Individual idiosyncratic behaviors were consistent across different ball flight time conditions and across two experimental sessions carried out one year apart. These results highlight the importance of a systematic characterization of individual factors in the study of interceptive tasks.
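
    The different arrival flight time and height conditions mentioned above follow directly from the launch parameters. The sketch below is a minimal worked example assuming drag-free ballistic flight and made-up launch values; it is an illustration of that relationship, not a model of the authors' launching apparatus.

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def arrival_conditions(speed, angle_deg, launch_height, distance):
    """Flight time and ball height at a catch plane a given horizontal distance away.

    Assumes drag-free ballistic flight; speed in m/s, angle in degrees,
    heights and distance in metres. All values are illustrative.
    """
    vx = speed * math.cos(math.radians(angle_deg))
    vy = speed * math.sin(math.radians(angle_deg))
    t = distance / vx                              # time to cover the horizontal distance
    h = launch_height + vy * t - 0.5 * G * t ** 2  # ball height at that moment
    return t, h

# Two hypothetical launch settings yielding different arrival time/height conditions
for speed, angle in [(9.0, 30.0), (11.0, 40.0)]:
    t, h = arrival_conditions(speed, angle, launch_height=1.0, distance=6.0)
    print(f"v = {speed} m/s, angle = {angle} deg -> flight time {t:.2f} s, arrival height {h:.2f} m")
```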

    Avoiding moving obstacles

    To successfully move our hand to a target, we must consider how to get there without hitting surrounding objects. In a dynamic environment this involves being able to respond quickly when our relationship with surrounding objects changes. People adjust their hand movements with a latency of about 120 ms when the visually perceived position of their hand or of the target suddenly changes. It is not known whether people can react as quickly when the position of an obstacle changes. Here we show that quick responses of the hand to changes in obstacle position are possible, but that these responses are direct reactions to the motion in the surroundings. True adjustments to the changed position of the obstacle appeared at much longer latencies (about 200 ms). This is so even when the possible change is predictable. Apparently, our brain uses certain information exceptionally quickly for guiding our movements, at the expense of not always responding adequately. For reaching a target that changes position, one must at some time move in the same direction as the target did. For avoiding obstacles that change position, moving in the same direction as the obstacle is not always an adequate response, not only because it may be easier to avoid the obstacle by moving the other way, but also because one wants to hit the target after passing the obstacle. Perhaps subjects nevertheless quickly respond in the direction of motion because this helps avoid collisions when pressed for time. © 2008 Springer-Verlag

    Similarities between digits’ movements in grasping, touching and pushing

    In order to find out whether the movements of single digits are controlled in a special way when grasping, we compared the movements of the digits when grasping an object with their movements in comparable single-digit tasks: pushing or lightly tapping the same object at the same place. The movements of the digits in grasping were very similar to the movements in the single-digit tasks. To determine to what extent the hand transport and grip formation in grasping emerge from a synchronised motion of individual digits, we combined movements of finger and thumb in the single-digit tasks to obtain hypothetical transport and grip components. We found a larger peak grip aperture earlier in the movement for the single-digit tasks. The timing of peak grip aperture depended in the same way on its size for all tasks. Furthermore, the deviations from a straight line of the transport component differed considerably between subjects, but were remarkably similar across tasks. These results support the idea that grasping should be regarded as consisting of moving the digits, rather than transporting the hand and shaping the grip.
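
    Combining two digit trajectories into transport and grip components is conventionally done by taking the midpoint between thumb and index fingertip as the transport component and the distance between them as the grip aperture. The sketch below illustrates that computation on made-up trajectories; the array shapes and sample data are assumptions, not the authors' analysis code.

```python
import numpy as np

# Hypothetical fingertip trajectories: (n_samples, 3) arrays of x, y, z positions in mm
n = 200
t = np.linspace(0.0, 1.0, n)
thumb = np.column_stack([300 * t, 20 * np.sin(np.pi * t), np.zeros(n)])
index = np.column_stack([300 * t, 40 + 50 * np.sin(np.pi * t), np.zeros(n)])

# Transport component: midpoint between the two digit tips at each sample
transport = (thumb + index) / 2.0

# Grip component: distance between the digit tips at each sample
aperture = np.linalg.norm(index - thumb, axis=1)

peak = int(np.argmax(aperture))
print(f"peak grip aperture {aperture[peak]:.1f} mm at {100 * peak / (n - 1):.0f}% of movement time")
```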

    Music Attenuates Excessive Visual Guidance of Skilled Reaching in Advanced but Not Mild Parkinson's Disease

    Parkinson's disease (PD) results in movement and sensory impairments that can be reduced by familiar music. At present, it is unclear whether the beneficial effects of music are limited to lessening the bradykinesia of whole body movement or whether beneficial effects also extend to skilled movements of PD subjects. This question was addressed in the present study, in which control and PD subjects were given a skilled reaching task that was performed with and without accompanying preferred musical pieces. Eye movements and limb use were monitored with biomechanical measures, and limb movements were additionally assessed using a previously described movement element scoring system. Preferred musical pieces did not lessen limb and hand movement impairments as assessed with either the biomechanical measures or movement element scoring. Nevertheless, the PD patients with more severe motor symptoms, as assessed by Hoehn and Yahr (HY) scores, displayed enhanced visual engagement of the target, and this impairment was reduced during trials performed with accompanying preferred musical pieces. The results are discussed in relation to the idea that preferred musical pieces, although not generally beneficial in lessening skilled reaching impairments, may normalize the balance between visual and proprioceptive guidance of skilled reaching.

    Grasping Kinematics from the Perspective of the Individual Digits: A Modelling Study

    Grasping is a prototype of human motor coordination. Nevertheless, it is not known what determines the typical movement patterns of grasping. One way to approach this issue is by building models. We developed a model based on the movements of the individual digits. In our model the following objectives were taken into account for each digit: move smoothly to the preselected goal position on the object without hitting other surfaces, arrive at about the same time as the other digit, and never move too far from the other digit. These objectives were implemented by regarding the tips of the digits as point masses with a spring between them, each attracted to its goal position and repelled from objects' surfaces. Their movements were damped. Using a single set of parameters, our model can reproduce a wider variety of experimental findings than any previous model of grasping. Apart from reproducing known effects (even the angles at which digits approach trapezoidal objects' surfaces, which no other model can explain), our model predicted that the increase in maximum grip aperture with object size should be greater for blocks than for cylinders. A survey of the literature shows that this is indeed how humans behave. The model can also adequately predict how single-digit pointing movements are made. This supports the idea that grasping kinematics follow from the movements of the individual digits.
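
    The abstract specifies the model's ingredients concretely enough to sketch the general structure of this class of model: each digit tip is a damped point mass attracted to its goal position, repelled from object surfaces, and coupled to the other digit by a spring. The following is a minimal two-dimensional toy with made-up gains and a single circular object; it illustrates the class of model, not the authors' implementation or parameter values.

```python
import numpy as np

DT, MASS = 0.001, 0.1                     # time step (s), digit-tip mass (kg); illustrative
K_GOAL, K_COUPLE, DAMP = 5.0, 0.5, 1.0    # made-up attraction, coupling and damping gains
OBJ_CENTRE, OBJ_RADIUS, K_REPEL = np.array([0.30, 0.0]), 0.03, 0.1

def surface_repulsion(pos):
    """Push a digit tip away from the (single, circular) object surface."""
    d = pos - OBJ_CENTRE
    dist = np.linalg.norm(d)
    gap = max(dist - OBJ_RADIUS, 0.0)
    return K_REPEL * np.exp(-gap / 0.01) * d / dist   # grows as the tip nears the surface

def simulate(starts, goals, rest_len=0.08, steps=3000):
    pos, vel = np.array(starts, float), np.zeros((2, 2))
    for _ in range(steps):
        force = np.zeros((2, 2))
        for i in (0, 1):
            force[i] += K_GOAL * (np.asarray(goals[i]) - pos[i])  # attraction to goal position
            force[i] += surface_repulsion(pos[i])                 # repulsion from object surface
            force[i] -= DAMP * vel[i]                             # damping
        sep = pos[1] - pos[0]                                     # spring between the digit tips
        spring = K_COUPLE * (np.linalg.norm(sep) - rest_len) * sep / np.linalg.norm(sep)
        force[0] += spring
        force[1] -= spring
        vel += force / MASS * DT                                  # semi-implicit Euler step
        pos += vel * DT
    return pos

# Thumb and index start close together and move to opposite sides of the object
final = simulate(starts=[[0.0, 0.01], [0.0, 0.05]],
                 goals=[[0.30, -0.04], [0.30, 0.04]])
print("final digit-tip positions (m):\n", final.round(3))
```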

    Measurement of the Tau Polarisation Asymmetry from event acolinearity at OPAL


    The role of visual and nonvisual feedback in a vehicle steering task

    This article investigates vehicle steering control, focusing on the task of lane changing and the role of different sources of sensory feedback. Participants carried out two experiments in a fully instrumented, motion-based simulator. Despite the high level of realism afforded by the simulator, participants were unable to complete a lane change in the absence of visual feedback. When asked to produce the steering movements required to change lanes and to turn a corner, participants produced remarkably similar behavior in each case, revealing a misconception of how a lane-change maneuver is normally executed. Finally, participants were asked to change lanes in a fixed-base simulator in the presence of intermittent visual information. Normal steering behavior could be restored using brief but suitably timed exposure to visual information. The data suggest that vehicle steering control can be characterized as a series of unidirectional, open-loop steering movements, each punctuated by a brief visual update.
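
    The difference between a corner-like and a lane-change steering profile can be made concrete with a toy kinematic sketch. Assuming a simple heading model with made-up speed and steering values (not the simulator or the article's analysis), a single unidirectional steering pulse leaves a residual heading and the car keeps drifting sideways, whereas a lane change needs a second, opposite pulse to straighten out.

```python
import numpy as np

DT, SPEED = 0.01, 20.0   # time step (s) and forward speed (m/s); illustrative values

def drive(heading_rate_profile):
    """Integrate heading and lateral position for a given heading-rate command (toy kinematics)."""
    heading, y, ys = 0.0, 0.0, []
    for rate in heading_rate_profile:
        heading += rate * DT                  # steering input changes heading
        y += SPEED * np.sin(heading) * DT     # heading carries the car sideways
        ys.append(y)
    return np.array(ys)

n = 400                                        # 4 s of simulated driving
pulse = np.zeros(n)
pulse[50:150] = 0.12                           # one unidirectional, open-loop steering pulse

single = drive(pulse)                          # corner-like: heading never returns to zero
biphasic = drive(pulse - np.roll(pulse, 150))  # lane change: an opposite pulse straightens out

print(f"single pulse:       lateral offset {single[-1]:.1f} m and still drifting")
print(f"two opposed pulses: lateral offset {biphasic[-1]:.1f} m with heading restored")
```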

    Analysis of recent empirical challenges to an account of time-to-collision perception

    How do we perceive how long it will be before we reach a certain place when running, driving, or skiing? How do we perceive how long it will be before a moving object reaches us or will arrive at a place where it can be hit or caught? These are questions of how we temporally coordinate our actions with a dynamic environment so as to control collision events. Much of the theoretical work on the control of these interceptive actions has been united in supposing that (1) timing is functionally separable from positioning and the two are controlled using different types of information; (2) timing is controlled using special-purpose time-to-arrival information; (3) the time-to-arrival information used for the timing of fast interceptive actions is a first-order approximation to the actual time-to-arrival, which does not take accelerations into account. Challenges to each of these suppositions have recently emerged, suggesting that a complete rethinking of how interceptions are controlled may be necessary. These challenges are analyzed in detail and it is shown that they are readily accommodated by a recent theory of interceptive timing based on the points just noted.
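
    The first-order approximation referred to in point (3) can be made explicit: with current distance d and closing speed v, the first-order estimate is d/v, while the true arrival time under a constant closing acceleration a solves d = vt + at^2/2. The sketch below compares the two under made-up numbers; it is a worked illustration of the approximation, not part of the theory under analysis.

```python
import math

def first_order_ttc(distance, speed):
    """First-order estimate: remaining distance divided by current closing speed."""
    return distance / speed

def true_ttc(distance, speed, accel):
    """Exact arrival time when the closing speed changes at a constant rate."""
    if abs(accel) < 1e-9:
        return distance / speed
    return (-speed + math.sqrt(speed ** 2 + 2 * accel * distance)) / accel

# Hypothetical approach: object 10 m away, closing at 5 m/s and accelerating at 2 m/s^2
d, v, a = 10.0, 5.0, 2.0
print(f"first-order estimate: {first_order_ttc(d, v):.2f} s")   # 2.00 s
print(f"actual arrival time:  {true_ttc(d, v, a):.2f} s")       # ~1.53 s, earlier than estimated
```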