96 research outputs found

    Closed-loop Control using Electrotactile Feedback Encoded in Frequency and Pulse Width


    Noninvasive Neuroprosthetic Control of Grasping by Amputees

    Smooth coordination and fine temporal control of muscles by the brain allows us to effortlessly pre-shape our hand to grasp different objects. Correlates of motor control for grasping have been found across widespread cortical areas, with diverse signal features. These signals have been harnessed by implanting intracortical electrodes and used by tetraplegics to control the motion of robotic hands, using algorithms called brain-machine interfaces (BMIs). Signatures of the brain's motor control encoding mechanisms in macro-scale signals such as electroencephalography (EEG) are unknown, and could potentially be used to develop noninvasive brain-machine interfaces. Here we show that a) low frequency (0.1 – 1 Hz) time-domain EEG contains information about grasp pre-shaping in able-bodied individuals, and b) this information can be used by amputees to control the pre-shaping motion of a robotic hand. In the first study, we recorded simultaneous EEG and hand kinematics as 5 able-bodied individuals grasped various objects. Linear decoders using low delta band EEG amplitudes accurately predicted hand pre-shaping kinematics during grasping. The correlation coefficients between predicted and actual kinematics were r = 0.59 ± 0.04, 0.47 ± 0.06 and 0.32 ± 0.05 for the first 3 synergies. In the second study, two transradial amputees (A1 and A2) controlled a prosthetic hand to grasp two objects using a closed-loop BMI driven by low delta band EEG. This study was conducted longitudinally in 12 sessions spread over 38 days. A1 achieved a 63% success rate, with 11 sessions significantly above chance. A2 achieved a 32% success rate, with 2 sessions significantly above chance. Previous EEG-based BMIs used frequency-domain features, which were thought to have a signal-to-noise ratio too low for control of dexterous tasks like grasping. Our results demonstrate that time-domain EEG contains information about grasp pre-shaping, which can be harnessed for neuroprosthetic control.
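The decoding approach summarized above can be sketched as a regularized linear regression from lagged EEG amplitudes to kinematic synergies, scored by Pearson correlation. The sketch below runs on synthetic stand-in data; the channel count, lag count, and ridge penalty are illustrative assumptions, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 64-channel "EEG" and 3 kinematic synergies
# (dimensions are illustrative assumptions, not those of the study).
n_samples, n_channels, n_synergies, n_lags = 2000, 64, 3, 10
eeg = rng.standard_normal((n_samples, n_channels))

# Lagged feature matrix: amplitudes at the current and 9 past samples.
X = np.hstack([np.roll(eeg, k, axis=0) for k in range(n_lags)])

# Ground-truth linear mapping plus noise, so the decoder has something to find.
W = rng.standard_normal((n_channels * n_lags, n_synergies)) * 0.1
kin = X @ W + 0.5 * rng.standard_normal((n_samples, n_synergies))

# Train/test split and closed-form ridge regression.
split = n_samples // 2
Xtr, Xte, ytr, yte = X[:split], X[split:], kin[:split], kin[split:]
lam = 1.0
beta = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(X.shape[1]), Xtr.T @ ytr)
pred = Xte @ beta

# Per-synergy Pearson correlation between predicted and actual traces.
r = [np.corrcoef(pred[:, j], yte[:, j])[0, 1] for j in range(n_synergies)]
print([round(v, 2) for v in r])
```

On real data, the features would be bandpass-filtered (0.1 – 1 Hz) EEG amplitudes rather than white noise, but the fitting and scoring steps take the same form.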

    Neuromorphic Computing Systems for Tactile Sensing Perception

    Touch sensing plays an important role in humans' daily life: tasks like exploring, grasping and manipulating objects rely deeply on it. As such, robots and hand prostheses endowed with the sense of touch can manipulate objects better and more easily, and physically collaborate with other agents. Towards this goal, information about touched objects and surfaces has to be inferred from raw sensor data. The orientation of edges, which is employed as a pre-processing stage in both artificial vision and touch, is a key cue for object discrimination. Inspired by the encoding of edges in human first-order tactile afferents, we developed a biologically inspired spiking network architecture that mimics human tactile perception with computational primitives implementable on low-power subthreshold neuromorphic hardware. The network architecture uses three layers of Leaky Integrate and Fire neurons to distinguish different edge orientations of a bar pressed on the artificial skin of the iCub robot. We demonstrated that the network can learn the appropriate connectivity through unsupervised spike-based learning, and that the number and spatial distribution of sensitive areas within receptive fields are important for edge orientation discrimination. The unconstrained, random structure of the connectivity among layers can produce unbalanced activity in the output neurons, which are driven by variable amounts of synaptic input. We explored two different mechanisms of synaptic normalization (weight normalization and homeostasis), defining how each can be useful during the learning and inference phases. With homeostasis and weight normalization, the network successfully discriminates 35 of 36 orientations (0 to 180 degrees in 5-degree increments). Besides edge orientation discrimination, we modified the network architecture to classify six different touch modalities (poke, press, grab, squeeze, push, and rolling a wheel), demonstrating the ability of the network to learn appropriate connectivity patterns for the classification and achieving a total accuracy of 88.3%. Furthermore, another application scenario, tactile object shape recognition, was considered because of its importance in robotic manipulation. We showed that a network with 2 layers of spiking neurons discriminated tactile object shapes with 100% accuracy after integrating an array of 160 piezoresistive tactile sensors to which the object shapes are applied.
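The Leaky Integrate and Fire neuron that forms the building block of architectures like the one above can be sketched in a few lines. The time constant, threshold, and drive below are generic illustrative values, not the dissertation's parameters.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron sketch. Parameters
# (tau, threshold, reset, dt) are illustrative assumptions.
def lif_run(input_current, tau=20e-3, v_thresh=1.0, v_reset=0.0, dt=1e-3):
    """Integrate a current trace; return the time indices of emitted spikes."""
    v, spikes = 0.0, []
    for t, i_in in enumerate(input_current):
        v += dt / tau * (-v + i_in)   # leaky integration toward the input
        if v >= v_thresh:             # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset               # reset membrane potential
    return spikes

# A constant suprathreshold drive produces regular spiking.
spikes = lif_run(np.full(200, 1.5))
print(len(spikes))
```

In a layered network, each neuron's `input_current` would be a weighted sum of presynaptic spike trains, and mechanisms like weight normalization (rescaling each neuron's incoming weights to a fixed sum) keep the output activity balanced during learning.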

    Doctor of Philosophy

    Hands are so central to the human experience, yet we often take for granted the capacity to maneuver objects, to form a gesture, or to caress a loved-one's hand. The effects of hand amputation can be severe, including functional disabilities, chronic phantom pain, and a profound sense of loss which can lead to depression and anxiety. In previous studies, peripheral-nerve interfaces, such as the Utah Slanted Electrode Array (USEA), have shown potential for restoring a sense of touch and prosthesis movement control. This dissertation represents a substantial step forward in the use of USEAs for clinical care, ultimately providing human amputees with widespread hand sensation that is functionally useful and psychologically meaningful. Toward this objective, we report three major advances. First, we performed the first dual-USEA implantations in human amputees, placing one USEA in the residual median nerve and another in the residual ulnar nerve. Chapter 2 of this dissertation shows that USEAs provided full-hand sensory coverage, and that moving the implant site to the upper arm in the second subject, proximal to nerve branch points to extrinsic hand muscles, enabled activation of both proprioceptive and cutaneous sensory percepts. Second, in Chapter 3, we report on the successful use of USEA-evoked sensory percepts for functional discrimination tasks. We provide a comprehensive report of functional discrimination among USEA-evoked sensory percepts in three human subjects, including discrimination among multiple proprioceptive or cutaneous percepts with different hand locations, sensory qualities, and/or intensities. Finally, in Chapter 4, we report on the psychological value of multiple-degree-of-freedom prosthesis control, multisensor prosthesis sensation, and closed-loop control. This chapter represents the first report of prosthesis embodiment during closed-loop and open-loop prosthesis control by an amputee, as well as the most sophisticated closed-loop prosthesis control reported in the literature to date, including 5-degree-of-freedom motor control and sensory feedback from 4 hand locations. Ultimately, we expect that USEA-evoked hand sensations may be used as part of a take-home prosthesis system providing users with both advanced functional capabilities and a meaningful sense of embodiment and limb restoration.

    Sensory mechanisms involved in obtaining frictional information for perception and grip force adjustment during object manipulation

    Sensory signals informing about the frictional properties of a surface are used both for perception, to experience material properties, and for motor control, to handle objects using adequate manipulative forces. There are fundamental differences between these two purposes and scenarios in how sensory information is typically obtained. This thesis aims to explore the mechanisms involved in the perception of the frictional properties of touched surfaces under conditions relevant for object manipulation. Firstly, I show that in the passive touch condition, when the surface is brought into contact with an immobilised finger, humans are unable to use existing friction-related mechanical cues and perceptually associate them with frictional properties. However, a submillimeter-range lateral movement significantly improved subjects' ability to evaluate the frictional properties of two otherwise identical surfaces. It is demonstrated that partial slips within the contact area and fingertip tissue deformation create very potent sensory stimuli, enabling tactile afferents to signal friction-dependent mechanical effects that translate into slipperiness (friction) perception. Further, I demonstrate that natural movement kinematics facilitate the development of such small skin displacements within the contact area and may play a central role in enabling the perception of surface slipperiness and in adjusting grip force to friction when manipulating objects. This demonstrates the intimate interdependence between the motor and sensory systems. This work significantly extends our understanding of the fundamental tactile sensory processes involved in friction signaling in the context of motor control and dexterous object manipulation. This knowledge and the discovered friction-sensing principles may assist in designing haptic rendering devices and artificial tactile sensors, as well as associated control algorithms, for use in robotic grippers and hand prostheses.
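The grip-to-friction adjustment discussed above follows from Coulomb friction: in a two-digit precision grip each contact must supply a tangential force of half the load, so the minimum normal (grip) force is L/(2μ), and humans typically apply a margin above that. The sketch below illustrates this relation; the safety-margin factor is an illustrative assumption.

```python
# Minimum grip force for a two-digit precision grip under Coulomb friction:
# the load L is shared by two contacts, each needing tangential force
# L/2 <= mu * N, so slip begins at N = L / (2 * mu).
# The safety-margin factor of 1.3 is an illustrative assumption.
def min_grip_force(load_n, mu, safety_margin=1.3):
    """Normal force (N) per digit needed to hold `load_n` newtons at friction mu."""
    slip_limit = load_n / (2.0 * mu)   # grip force at which slip begins
    return slip_limit * safety_margin

# Slippery (mu = 0.4) vs. grippy (mu = 1.2) surface for a 4 N object.
print(round(min_grip_force(4.0, 0.4), 2))  # 6.5
print(round(min_grip_force(4.0, 1.2), 2))  # 2.17
```

The slipperier surface demands roughly three times the grip force, which is why rapid friction sensing at contact matters for dexterous manipulation.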

    Tactile mapping of harsh, constrained environments, with an application to oil wells

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2011. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. [110]-114). This work develops a practical approach to exploring harsh environments when time is critical. The harsh environmental conditions prevent the use of range, force/torque or tactile sensors. A representative case is the mapping of oil wells. In these conditions, tactile exploration is appealing. In this work, the environment is mapped tactilely by a manipulator whose only sensors are joint encoders. The robot autonomously explores the environment, collecting few, sparse tactile data and monitoring its free movements. These data are used to create a model of the surface in real time and to choose the robot's movements so as to reduce the mapping time. First, the approach is described and its feasibility demonstrated. Real-time impedance control allows robust robot movement and detection of the surface using a manipulator equipped only with position sensors. A representation based on geometric primitives describes the surface using the few, sparse data available. The robustness of the method is tested against surface roughness and different surrounding fluids. Joint backlash strongly affects the robot's precision and is unavoidable because of thermal expansion in the joints. Here, a new strategy is developed to compensate for backlash positioning errors by simultaneously identifying the surface and the backlash values. Second, an exploration strategy to map a constraining environment with a manipulator is developed. To maximize the use of the acquired data, this work proposes a hybrid approach involving both the workspace and the configuration space. The amount of knowledge of the environment is evaluated with an approach based on information theory, and the robot's movements are chosen to maximize the expected increase of that knowledge. Since the robot only possesses position sensors, the location along the robot where contact with the surface occurs cannot be determined with certainty. Thus a new approach is developed that evaluates the probability of contact with specific parts of the robot and classifies and uses the data according to the different types of contact. This work is validated with simulations and experiments with a prototype manipulator specifically designed for this application. By Francesco Mazzini, Ph.D.
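The information-theoretic exploration idea above can be sketched as uncertainty-driven probing: maintain a belief over the environment and probe where the belief is most uncertain. The toy below uses a 1-D contact map with noiseless probes and a maximum-entropy criterion, a simplified stand-in for the thesis's expected-information-gain strategy; all of it is illustrative, not the thesis's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unknown 1-D contact map (1 = surface present) and uniform prior beliefs.
true_surface = rng.integers(0, 2, size=20)
p = np.full(20, 0.5)                       # P(cell is occupied)

def entropy(q):
    """Bernoulli entropy in bits, elementwise, clipped for stability."""
    q = np.clip(q, 1e-9, 1 - 1e-9)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

for _ in range(20):
    cell = int(np.argmax(entropy(p)))      # probe the most uncertain cell
    hit = true_surface[cell]               # noiseless probe (assumption)
    p[cell] = float(hit)                   # belief collapses after contact

print(float(entropy(p).sum()))             # residual map uncertainty
```

With noisy probes and probabilistic contact localization along the arm (as in the thesis), the belief update would be a Bayesian step rather than a collapse, but the probe-selection loop keeps the same structure.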