346 research outputs found

    Modeling motor control in continuous-time Active Inference: a survey

    The way the brain selects and controls actions is still widely debated. Mainstream approaches based on Optimal Control focus on stimulus-response mappings that optimize cost functions. Ideomotor theory and cybernetics propose a different perspective: they suggest that actions are selected and controlled by activating action effects and by continuously matching internal predictions with sensations. Active Inference offers a modern formulation of these ideas in terms of inferential mechanisms and prediction-error-based control, which can be linked to neural mechanisms of living organisms. This article provides a technical illustration of Active Inference models in continuous time and a brief survey of Active Inference models that solve four kinds of control problems: the control of goal-directed reaching movements, active sensing, the resolution of multisensory conflict during movement, and the integration of decision-making and motor control. Crucially, in Active Inference, all these facets of motor control emerge from the same optimization process, namely the minimization of Free Energy, and do not require designing separate cost functions. Active Inference therefore provides a unitary perspective on motor control that can inform both the study of biological control mechanisms and the design of artificial and robotic systems.
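    The survey's central claim, that perception and action both descend the same free-energy gradient, can be illustrated with a one-dimensional toy sketch (an assumed illustration, not a model taken from the article): a belief about limb position is pulled towards both the sensed position and a goal prior, while action moves the limb to fulfil the prediction.

```python
# Minimal toy sketch of continuous-time Active Inference (assumed for
# illustration; not from the survey). With unit precisions, the free
# energy is F = (s - mu)^2 / 2 + (mu - target)^2 / 2: perception updates
# the belief mu by gradient descent on F, and action moves the limb so
# that sensations match the prediction. No separate cost function exists.

target = 1.0   # goal position, encoded as a prior belief
x = 0.0        # true limb position (hidden state of the world)
mu = 0.0       # internal estimate of the limb position
dt = 0.01      # Euler integration step

for _ in range(2000):
    s = x                                  # proprioceptive observation (noiseless)
    eps_sensory = s - mu                   # sensory prediction error
    eps_prior = mu - target                # deviation of belief from goal prior
    mu += dt * (eps_sensory - eps_prior)   # perception: descend dF/dmu
    x += dt * (mu - s)                     # action: reduce sensory prediction error

# Belief and limb settle on the goal together: the reach emerges from
# free-energy minimization alone.
```

    At equilibrium mu = x = target: the goal prior acts like a set point, but it is reached through inference rather than through a designed cost function.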

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. 
However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I began by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation connecting the user with the external world. This closed loop between EMG control and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface with a strong impact on amputees' daily life. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several methods tested for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be 4 in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
Finally, the online implementation allowed subjects to simultaneously control the DoFs of the Hannes prosthesis in a bioinspired, human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control improve thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis can restore lost sensory functionality, with promising implications for direct translation into clinical practice.
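The offline pipeline, windowed EMG features fed to a logistic-regression-style classifier and scored with F1, can be sketched as follows. This is a hypothetical reconstruction on synthetic data: the thesis's NLR is a non-linear variant, whereas this sketch uses plain linear logistic regression, and the channel layout, window length, and activity patterns are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def mav(windows):
    """Mean absolute value, a standard time-domain EMG feature."""
    return np.mean(np.abs(windows), axis=-1)

def make_windows(n, active):
    """Synthetic stand-in for 4-channel EMG windows (invented layout:
    class 1 'hand close' activates channels 0-1, class 0 is rest)."""
    scale = np.array([1.0, 1.0, 0.2, 0.2]) if active else np.full(4, 0.2)
    return rng.normal(0.0, scale[:, None], size=(n, 4, 60))

X = np.vstack([mav(make_windows(200, False)), mav(make_windows(200, True))])
y = np.concatenate([np.zeros(200), np.ones(200)])

# Plain logistic regression trained by gradient descent (the thesis uses
# a non-linear variant, NLR; this linear version only shows the pipeline).
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

pred = (X @ w + b) > 0.0                   # decision threshold at p = 0.5
tp = np.sum(pred & (y == 1))
fp = np.sum(pred & (y == 0))
fn = np.sum(~pred & (y == 1))
f1 = 2 * tp / (2 * tp + fp + fn)           # offline F1Score metric
```

On an embedded target, only the trained weights and a dot product per window are needed at run time, which is consistent with the low computational burden the thesis reports.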

    Motor learning and sensory plasticity in healthy adults and Parkinson's disease

    We use multiple sources of sensory information to guide goal-directed movements, such as reaching. When information from multiple modalities (i.e. vision, proprioception) is incongruent, one learns to adapt one's movements and recalibrate one sense to more closely match the other; simply put, one begins to perceive one's hand where one sees it. This thesis attempts to better characterize this sensory recalibration (termed 'proprioceptive recalibration') following adaptation to a visuomotor distortion under a variety of contexts, and contributes to the existing literature describing the sensory plasticity associated with motor learning. Specifically, chapter two describes the effect of initial exposure to a visuomotor distortion, and of the dominance of the hand trained, on proprioceptive recalibration. In this study, participants used their dominant right or non-dominant left hand to reach to targets with visual feedback of hand position that was abruptly rotated clockwise relative to their unseen hand. Proprioceptive recalibration was then assessed and found to be comparable in the two hands and consistent with previous studies employing a gradual perturbation; these findings suggest that neither the initial error signal nor the dominance of the hand trained influences recalibration. Chapter three describes how the magnitude of the visuomotor distortion affects the magnitude of recalibration, and how this relates to changes in reach aftereffects. Changes in reach aftereffects and proprioception were measured following adaptation to increasingly misaligned visual hand feedback; these changes increased systematically as a function of the distortion magnitude. However, while these changes were directly correlated with the distortion magnitude, they were not correlated with each other, which suggests that the two processes may be mediated by simultaneous yet separate underlying mechanisms.
Chapter four similarly describes how the magnitude of a cross-sensory error signal (generated in the absence of a visuomotor signal derived from goal-directed movement) affects the magnitude of recalibration, and how this relates to changes in reach aftereffects. Participants moved their unseen hand along a grooved path while viewing a cursor that moved towards a target; the position of the path was gradually rotated counter-clockwise with respect to the cursor. Following this cross-sensory adaptation, changes in reach aftereffects and proprioception were both found to saturate at a small distortion, as no further changes were observed as the training misalignment increased. Furthermore, these changes were not correlated with the magnitude of the misalignment. However, in contrast to the findings in chapter three, they were correlated with each other, suggesting that the cross-sensory discrepancy drives changes in both reach aftereffects (partially) and proprioception. This study helps to characterize the contribution of different error signals to changes in motor and sensory systems. Lastly, chapter five describes how damage to central nervous system structures integral to sensorimotor integration (i.e. the basal ganglia) affects proprioceptive recalibration. Patients with Parkinson's disease were able to learn to reach to targets with gradually rotated and translated visual feedback of hand position comparably to healthy older adults. Patients also recalibrated proprioception comparably to healthy older adults, although a trend towards greater recalibration in patients suggests that they may depend more on salient visual information of hand position than on proprioceptive feedback to guide movement.
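The rotated-feedback paradigm these chapters rely on, and the trial-by-trial adaptation it induces, can be sketched in a few lines. This is a generic error-driven learner with an assumed learning rate, not the thesis's analysis:

```python
import math

# Hypothetical sketch of a visuomotor-rotation trial loop: the cursor
# shows the hand direction rotated clockwise by `rotation`, and a simple
# error-driven update gradually shifts the aim until the cursor lands on
# the target (the adaptation these experiments measure). The learning
# rate and trial count are invented.

rotation = math.radians(-30.0)   # 30 deg clockwise distortion of feedback
target = 0.0                     # target direction (radians)
aim = 0.0                        # motor plan: initially aim straight at target
lr = 0.2                         # assumed trial-by-trial learning rate

for trial in range(50):
    hand = aim                   # executed reach direction
    cursor = hand + rotation     # seen feedback, rotated clockwise
    error = target - cursor      # visual error driving adaptation
    aim += lr * error            # correction applied on the next trial

# After adaptation the hand aims ~30 deg counter-clockwise so the cursor
# hits the target; the mismatch between seen and felt hand position is
# what proprioceptive recalibration measures.
```

The fixed point of the update is aim = -rotation, so the error decays geometrically by a factor (1 - lr) per trial.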

    On the development of a cybernetic prosthetic hand

    The human hand is the end organ of the upper limb, which in humans serves the important function of prehension, as well as being an important organ for sensation and communication. It is a marvellous example of how a complex mechanism can be implemented, capable of realizing very complex and useful tasks through a very effective combination of mechanisms, sensing, actuation and control functions. In this thesis, the road towards the realization of a cybernetic hand is presented. After a detailed analysis of the model, the human hand, a deep review of the state of the art of artificial hands has been carried out. In particular, the performance of prosthetic hands used in clinical practice has been compared with research prototypes, both for prosthetic and for robotic applications. By following a biomechatronic approach, i.e. by comparing the characteristics of these hands with the natural model, the human hand, the limitations of current artificial devices are highlighted, thus outlining the design goals for a new cybernetic device. Three hand prototypes with a high number of degrees of freedom have been realized and tested: the first uses microactuators embedded inside the structure of the fingers, while the second and third exploit the concept of microactuation to increase the dexterity of the hand while maintaining simplicity of control. In particular, a framework for the definition and realization of closed-loop electromyographic control of these devices has been presented and implemented. The results were quite promising, showing that, in the future, there could be two different approaches to the realization of artificial devices. On one side there could be EMG-controlled hands, with compliant fingers but only one active degree of freedom.
On the other side, higher-performance artificial hands could be directly interfaced with the peripheral nervous system, thus establishing bi-directional communication with the human brain.

    Upper Body Coordination and Movement Control in Unconstrained Visually Guided Three-Dimensional Reach Movements.

    Reaching is a basic component of human movement requiring the coordination of the eyes and multiple body segments, including the hand, forearm, arm and torso. Although this movement has been studied extensively, theory bridging the explicit reaching behavior (coordinated movement of body segments) and the implicit reaching strategy (control mechanisms) is limited. Hence, modeling unconstrained reach movements as a result of coordination remains a difficult task. The aims of the present study were to investigate the relationships defining the coordination pattern, control mode composition and movement phase transitions in order to develop a model of coordinated reach movements. This work focuses in particular on the characterization of body segment kinematics in movement phases and on control mode transitions in relation to visual information. A novel approach to determining control mode transitions is proposed, using changes in curvature of the elbow swivel angle (ESA) combined with the content of visual information. The results show that this approach is a good indicator of control mode transitions in reach movements. The relative durations of movement control modes were therefore determined and modeled as a function of reaching requirements. In addition, the use of the swivel angle enables a reduction of the degrees of freedom and contributes to a simplification of arm movement models. Two strategies of movement execution were observed as a function of the availability of visual information. In the absence of vision, movement variability was significantly reduced in order to constrain the system's degrees of freedom. Furthermore, the orientation of the movement errors strongly supports the view that, in the present context, movements are planned in a local coordinate system with the head as the origin of that frame of reference. A coordination model was developed to describe the timing and kinematics of three-dimensional reach movements.
This model also includes the relationship between the eyes and body segment movements. With a generalized hand trajectory, the proposed model generates the sequence of movement phases and drives a multi-linkage system as a function of target locations.
Ph.D. Biomedical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
http://deepblue.lib.umich.edu/bitstream/2027.42/75946/1/syyu_1.pd
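The elbow swivel angle central to this approach can be computed from shoulder, elbow and wrist positions as the rotation of the elbow about the shoulder-wrist axis. The reference direction and the curvature criterion in this sketch are assumed conventions, not necessarily the dissertation's exact choices:

```python
import numpy as np

def swivel_angle(shoulder, elbow, wrist, ref=np.array([0.0, 0.0, -1.0])):
    """Elbow swivel angle (ESA): rotation of the elbow about the
    shoulder-wrist axis, measured from the projection of a reference
    direction (global 'down' here, an assumed convention)."""
    n = wrist - shoulder
    n = n / np.linalg.norm(n)          # shoulder-wrist axis
    p = elbow - shoulder
    p = p - np.dot(p, n) * n           # elbow offset within the swivel plane
    u = ref - np.dot(ref, n) * n       # reference direction in the same plane
    p, u = p / np.linalg.norm(p), u / np.linalg.norm(u)
    return np.arctan2(np.dot(n, np.cross(u, p)), np.dot(u, p))

def curvature_sign_changes(esa):
    """Indices where the discrete second derivative of an ESA trajectory
    changes sign: a crude proxy for curvature-based transition points
    between control modes (a simplified reading of the approach)."""
    dd = np.diff(esa, n=2)
    return np.nonzero(np.diff(np.sign(dd)))[0]
```

Because the swivel angle is a single scalar per frame, scanning it for curvature changes is far simpler than analyzing the full 7-DoF arm posture, which is the dimensionality reduction the abstract refers to.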

    Haptics Rendering and Applications

    There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce and science, would change forever if we learned how to capture, manipulate and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information in a physically-based language that has never been explored before. Owing to constant improvement in haptics technology and increasing levels of research into and development of haptics-related algorithms, protocols and devices, there is a belief that haptics technology has a promising future.

    Advancing the Underactuated Grasping Capabilities of Single Actuator Prosthetic Hands

    The last decade has seen significant advancements in upper limb prosthetics, specifically in the myoelectric control and powered prosthetic hand fields, leading to more active and social lifestyles for the upper limb amputee community. Notwithstanding the improvements in complexity and control of myoelectric prosthetic hands, grasping still remains one of the greatest challenges in robotics. Upper-limb amputees continue to prefer more antiquated body-powered or powered hook terminal devices that are favored for their control simplicity, light weight and low cost; however, these devices are nominally unsightly and lack grasp variety. The varying drawbacks of both complex myoelectric and simple body-powered devices have led to low adoption rates for all upper limb prostheses by amputees, including 35% pediatric and 23% adult rejection for complex devices and 45% pediatric and 26% adult rejection for body-powered devices [1]. My research focuses on progressing the grasping capabilities of prosthetic hands driven by simple control and a single motor, combining the dexterous functionality of the more complex hands with the intuitive control of the more simplistic body-powered devices, with the goal of helping upper limb amputees return to more active and social lifestyles. Optimizing a prosthetic hand driven by a single actuator requires optimizing many facets of the hand: the finger kinematics, underactuated mechanisms, geometry, materials and performance when completing activities of daily living. In my dissertation, I will present chapters dedicated to improving these subsystems of single actuator prosthetic hands to better replicate human hand function from simple control. First, I will present a framework created to optimize precision grasping – which is nominally unstable in underactuated configurations – from a single actuator.
I will then present several novel mechanisms that allow a single actuator to map to higher-degree-of-freedom motion and multiple commonly used grasp types. I will then discuss how fingerpad geometry and materials can improve grasp acquisition and frictional properties within the hand while also providing a method of fabricating lightweight custom prostheses. Last, I will analyze the results of several human subject testing studies to evaluate the optimized hands' performance on activities of daily living, compared to other commercially available prostheses.