3 research outputs found

    Gaze direction influences grasping actions towards unseen, haptically explored, objects

    Haptic exploration produces mental object representations that can be memorized for subsequent object-directed behaviour. Storage of haptically-acquired object images (HOIs) engages, besides canonical somatosensory areas, the early visual cortex (EVC). Clear evidence for a causal contribution of EVC to HOI representation is still lacking. Before visual information can be used by the grasping system, it necessarily undergoes a frame-of-reference shift that integrates eye position. We hypothesized that if the motor system uses HOIs stored in a retinotopic code in the visual cortex, then their use is likely to depend, at least in part, on eye position. We measured the kinematics of four fingers of the right hand in 15 healthy participants while they grasped different unseen, previously haptically explored objects behind an opaque panel. The participants never saw the objects and operated exclusively on haptic information. The object's position was fixed in front of the participant, but gaze varied from trial to trial between three possible positions: towards the unseen object or away from it, on either side. Results showed that the kinematics of the middle and little fingers during reaching for the unseen object changed significantly with gaze position. A control experiment showed that intransitive hand movements were not modulated by gaze direction. Manipulating eye position produces small but significant configuration errors (behavioural errors due to shifts in frame of reference), possibly related to an eye-centered frame of reference, despite the absence of visual information, indicating that the haptic and the visual/oculomotor systems share resources in delayed haptic grasping.
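    The frame-of-reference shift at the heart of this hypothesis can be made concrete. A minimal Python sketch follows (the function name and the gaze eccentricities are illustrative assumptions, not the study's actual parameters): a target fixed at the body midline takes on a different retinotopic position for each fixation, so a representation stored in eye-centered coordinates would shift with gaze.

```python
def to_eye_centered(obj_azimuth_deg: float, gaze_azimuth_deg: float) -> float:
    """Express a target's horizontal position relative to the current gaze.

    Both angles are eccentricities (degrees) in a body-centered frame; the
    difference is the target's azimuth in an eye-centered (retinotopic) frame.
    """
    return obj_azimuth_deg - gaze_azimuth_deg

# Object fixed at the body midline; gaze varies across trials (hypothetical
# eccentricities). If the stored object image is retinotopic, each gaze
# shift displaces the remembered target, predicting small grasp errors.
object_azimuth = 0.0
for gaze in (-20.0, 0.0, 20.0):  # leftward, central, rightward fixation
    retinal = to_eye_centered(object_azimuth, gaze)
    print(f"gaze {gaze:+5.1f} deg -> retinotopic target at {retinal:+5.1f} deg")
```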

    Neuropsychological and behavioral studies on object grasping in humans with and without vision

    Sensorimotor transformations translate sensory information about the intrinsic properties of objects (i.e., size, shape, orientation) into motor commands for appropriate hand-object interaction. Hence, the direct result of the sensorimotor transformation for a reach-to-grasp action is hand kinematics (hand shaping) fitting the object's size. We assembled and evaluated a sensor-based glove to measure finger flexion during reaching towards differently sized cylinders. Once we had verified that the tool functioned properly, we adopted the glove in two studies on grasping with and without vision. The first study aimed to causally draw a functional map of the premotor cortex (PMC) for visually-based grasping. Specifically, online TMS was applied over a grid covering the whole precentral gyrus while subjects grasped three differently sized cylinders. Output from our sensor glove was analysed with a hypothesis-independent approach using classification algorithms. Results from the classifiers convincingly suggested a multifocal representation of visually-based grasping in human PMC, involving the ventral PMC and, for the first time in humans, the supplementary motor area. The second study aimed to establish whether gaze direction modulates hand shaping during haptically-based reaching as it does during visually-based reaching. Participants haptically explored and then grasped an object of three possible sizes, aligned with the body midline, while looking in the direction of the object or laterally to it. Results showed that gaze direction asymmetrically affected finger flexion during haptically-based reaching. Despite this asymmetrical effect, the investigation provided evidence for retinotopic coding of haptically-explored objects.
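    The "hypothesis-independent approach using classification algorithms" can be sketched in a few lines of Python. Everything here is an assumption for illustration (synthetic data, the feature dimensions, the linear-SVM choice); the abstract does not name the study's actual classifiers. The idea is to decode object size from glove-recorded hand shaping, so that a TMS-induced drop in decoding accuracy at a stimulation site would implicate that site in visually-based grasping.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: one feature vector of finger-flexion samples per trial
# (e.g., 10 glove sensors x 4 time bins), labelled by cylinder size.
rng = np.random.default_rng(0)
n_trials, n_features = 90, 40
X = rng.normal(size=(n_trials, n_features))
y = np.repeat([0, 1, 2], n_trials // 3)  # small / medium / large cylinder

# Decode object size from hand shaping; chance level is 1/3 for three classes.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```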

    Data-driven Mechanical Design and Control Method of Dexterous Upper-Limb Prosthesis

    With 320,000 people per year suffering impaired upper-limb function due to medical conditions such as stroke and blunt trauma, the demand for highly functional upper-limb prostheses is increasing; however, rejection rates remain high due to factors such as lack of functionality, high cost, weight, and lack of sensory feedback. Modern robotics has led to the development of more affordable and dexterous upper-limb prostheses with mostly anthropomorphic designs. However, because of the highly sophisticated ergonomics of anthropomorphic hands, most are economically prohibitive and suffer from control complexity owing to the increased cognitive load on the user. This thesis therefore aims to design a prosthesis that emulates the kinematics and contact forces involved in grasping tasks with healthy human hands, rather than relying on biomimicry, in order to reduce mechanical complexity and exploit technologically advanced engineering components. This is accomplished by 1) experimentally characterizing human grasp kinematics and kinetics as a basis for data-driven prosthesis design, and then, using the grasp data, 2) developing a data-driven design and control method for an upper-limb prosthesis that reproduces the kinematics and kinetics required for healthy human grasps without adopting an anthropomorphic design. This thesis demonstrates an approach to narrowing the gap between the functionality of the human hand and robotic upper-limb prostheses by introducing a method to optimize the design and control of an upper-limb prosthesis. Grasp data are first collected from human subjects with a motion- and force-capture glove. The collected data are then used to minimize control complexity by reducing the dimensionality of the device while fulfilling the kinematic and kinetic requirements of daily grasping tasks. Using these techniques, a task-oriented upper-limb prosthesis is prototyped and tested in both simulation and a physical environment.
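    The dimensionality-reduction step is the technical core of the proposed control method. Below is a minimal Python sketch under stated assumptions: the data are synthetic, the 16-DOF joint count and three-synergy structure are hypothetical, and PCA stands in for whatever reduction technique the thesis actually employs, which the abstract does not name. The point is that a few coordinated finger patterns can capture most everyday grasp variance, letting a low-dimensional controller drive the device.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical recordings: joint angles (e.g., 16 DOF of the hand) captured
# with a motion/force glove across many everyday grasps, generated here from
# a few underlying coordinated patterns ("synergies") plus sensor noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 16))
joint_angles = latent @ mixing + 0.05 * rng.normal(size=(500, 16))

# Reduce the 16-DOF grasp space to the few components that explain most of
# the variance; each component is a finger pattern a low-dimensional
# prosthesis controller could actuate directly.
pca = PCA(n_components=0.95)  # keep enough components for 95% of the variance
low_dim = pca.fit_transform(joint_angles)
print(f"{pca.n_components_} components retain "
      f"{pca.explained_variance_ratio_.sum():.0%} of the variance")
```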