
    Biomimetic Manipulator Control Design for Bimanual Tasks in the Natural Environment

    As robots become more prolific in the human environment, it is important that safe operational procedures are introduced at the same time; typical robot control methods are often very stiff to maintain good positional tracking, but this makes contact (purposeful or accidental) with the robot dangerous. In addition, if robots are to work cooperatively with humans, natural interaction between agents will make tasks easier to perform, with less effort and learning time. Stability of the robot is particularly important in this situation, especially as outside forces are likely to affect the manipulator in a close working environment; for example, a user leaning on the arm, or a task-related disturbance at the end-effector. Recent research has uncovered the mechanisms by which humans adapt applied force and impedance during tasks. Studies have applied this adaptation to robots, with promising results showing improved tracking and reduced effort compared with other adaptive methods. The basic algorithm is straightforward to implement and allows the robot to be compliant most of the time and stiff only when required by the task. This allows the robot to work in an environment close to humans, and also suggests that it could support natural work interaction with a human. In addition, no force sensor is needed, which means the algorithm can be implemented on almost any robot. This work develops a stable control method for bimanual robot tasks, which could also be applied to robot-human interactive tasks. A dynamic model of the Baxter robot is created and verified, and then used for controller simulations. The biomimetic control algorithm forms the basis of the controller, which is developed into a hybrid control system to improve both task-space and joint-space control when the manipulator is disturbed in the natural environment. Fuzzy systems are implemented to remove the need for repetitive and time-consuming parameter tuning, and also allow the controller to actively improve performance during the task. Experimental simulations demonstrate that the hybrid task/joint-space controller performs better than either of its component parts under the same conditions. The fuzzy tuning method is then applied to the hybrid controller and is shown to slightly improve performance as well as automate the gain tuning process. In summary, a novel biomimetic hybrid controller is presented, with a fuzzy mechanism to avoid the gain tuning process, finalised with a demonstration of task-suitability in a bimanual-type situation. (EPSRC)
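    The biomimetic adaptation described above can be pictured as an error-driven update of feedforward torque and stiffness with a forgetting term, so the arm stays compliant unless tracking degrades. The Python sketch below is illustrative only: the single-joint form, gains, and damping choice are assumptions, not the thesis's actual controller.

```python
# Minimal sketch of a biomimetic adaptive impedance law of the kind described
# above. All gains, signals, and the single-DOF form are illustrative
# assumptions, not the controller from the thesis.
import numpy as np

def biomimetic_step(q, dq, q_ref, dq_ref, K, tau_ff, dt,
                    kappa=5.0, a_K=40.0, a_f=8.0, gamma=0.02):
    """One control step for a single joint.

    q, dq        : measured joint position / velocity
    q_ref, dq_ref: reference position / velocity
    K, tau_ff    : current adaptive stiffness and feedforward torque
    Returns commanded torque and updated (K, tau_ff).
    """
    e = q_ref - q                      # position tracking error
    de = dq_ref - dq                   # velocity tracking error
    eps = e + kappa * de               # combined tracking error signal

    # Error-driven adaptation with a forgetting term: stiffness grows when
    # tracking degrades (e.g. under an external disturbance) and decays
    # toward a compliant default when tracking is good.
    K = K + dt * (a_K * eps * e - gamma * K)
    tau_ff = tau_ff + dt * (a_f * eps - gamma * tau_ff)

    D = 2.0 * np.sqrt(max(K, 0.0))     # simple critically-damped damping choice
    tau = tau_ff + K * e + D * de      # commanded joint torque
    return tau, K, tau_ff
```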

    Using the Microsoft Kinect to assess human bimanual coordination

    Optical marker-based systems are the gold standard for capturing three-dimensional (3D) human kinematics. However, these systems have various drawbacks: marker placement is time-consuming, soft tissue movement introduces artifact, and the systems are prohibitively expensive and non-portable. The Microsoft Kinect is an inexpensive, portable depth camera that can be used to capture 3D human movement kinematics. Numerous investigations have assessed the Kinect's ability to capture postural control and gait, but to date no study has evaluated its capabilities for measuring spatiotemporal coordination. In order to investigate human coordination and coordination stability with the Kinect, a well-studied bimanual coordination paradigm (Kelso, 1984; Kelso, Scholz, & Schöner, 1986) was adapted. Nineteen participants performed ten trials of coordinated hand movements in either in-phase or anti-phase patterns of coordination to the beat of a metronome that was incrementally sped up and slowed down. Continuous relative phase (CRP) and the standard deviation of CRP were used to assess coordination and coordination stability, respectively. Data from the Kinect were compared to a Vicon motion capture system using a mixed-model, repeated-measures analysis of variance and intraclass correlation coefficients (ICC(2,1)). The Kinect significantly underestimated CRP for the anti-phase coordination pattern (p < .0001) and overestimated it for the in-phase pattern (p < .0001). However, a high ICC value (r = .097) was found between the systems. For the standard deviation of CRP, the Kinect exhibited significantly higher variability than the Vicon (p < .0001) but was able to distinguish significant differences between patterns of coordination, with anti-phase variability being higher than in-phase (p < .0001). Additionally, the Kinect was unable to accurately capture the structure of coordination stability for the anti-phase pattern. Finally, agreement was found between systems using the ICC (r = .37). In conclusion, the Kinect was unable to accurately capture mean CRP. However, the high ICC between the two systems is promising, and the Kinect was able to distinguish between the coordination stability of in-phase and anti-phase coordination. The structure of variability as movement speed increased was nonetheless dissimilar to the Vicon, particularly for the anti-phase pattern. Some aspects of coordination are captured well by the Kinect while others are not. Detecting differences between bimanual coordination patterns and the stability of those patterns can be achieved using the Kinect. However, researchers interested in the structure of coordination stability should exercise caution, since poor agreement was found between systems.
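    As a point of reference for the measures above, continuous relative phase between two hand displacement signals is commonly computed from instantaneous phases obtained with the Hilbert transform. The sketch below follows that standard approach; the centring step and the absence of amplitude normalisation are assumptions and may differ from the exact pipeline used in the study.

```python
# Sketch of a standard CRP computation via the Hilbert transform; details of
# preprocessing are assumptions, not the study's exact pipeline.
import numpy as np
from scipy.signal import hilbert

def continuous_relative_phase(left, right):
    """Return CRP (degrees) between two 1-D hand displacement signals."""
    def phase(x):
        x = x - np.mean(x)                      # centre the signal
        return np.unwrap(np.angle(hilbert(x)))  # instantaneous phase (rad)
    return np.degrees(phase(left) - phase(right))

# Coordination and its stability, as used above:
#   mean CRP near 0 deg   -> in-phase pattern
#   mean CRP near 180 deg -> anti-phase pattern
#   SD of CRP             -> coordination stability
# crp = continuous_relative_phase(kinect_left_x, kinect_right_x)
# print(np.mean(crp), np.std(crp))
```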

    The role of interhemispheric cortico-cortical connections in bimanual coordination in the rat

    Bimanual coordination, in which both hands work together to achieve a goal, is crucial for basic needs of life, such as gathering and feeding. The mammalian body has a left and a right side that are often symmetrically shaped, which raises the question of how the brain organizes the two sides of the body in a coordinated manner. The overall aim of this thesis is to better understand the neural mechanisms of bimanual coordination. Bimanual coordination is highly developed in primates, where it has been most extensively studied. Rodents also exhibit remarkable dexterity and coordination of the forelimbs during food handling and consumption. However, rodents have been less commonly used in the study of bimanual coordination because of limited quantitative measuring techniques. Studying the neural mechanisms of bimanual coordination in rodents therefore first requires a method to measure and classify bimanual movements. In this thesis, I propose a high-resolution tracking system that enables kinematic analysis of rat forelimb movements. The system quantifies forelimb movements bilaterally in head-fixed rats during food handling and consumption. Forelimb movements occurring naturally during feeding were encoded as continuous 3-D trajectories. The trajectories were then automatically segmented and analyzed, using a novel algorithm, according to the laterality of movement speed or the asymmetry of movement direction across the forelimbs. Bilateral forelimb movements were frequently observed during spontaneous food handling. Both symmetry and asymmetry in movement direction were observed, with symmetric bilateral movements quantitatively more common. Using the proposed method, I further investigated the key hypothesis that the corpus callosum, the thickest commissure connecting the two cerebral cortices, mediates bimanual movements. I performed pharmacological blockade of the anterior corpus callosum (aCC), through which the cortical forelimb motor areas are reciprocally connected. The kinematic analysis of bimanual coordination during food handling revealed that the frequency of occurrence of symmetric bimanual movements was reduced by aCC inhibition, while asymmetric bimanual movements increased. Other global measures of motor skill, such as mean food drop rate and mean consumption time, remained unchanged. Bilateral multiunit recordings from corresponding cortical areas showed positively correlated activity patterns in the large majority of interacting pairs. The present study also found that putative excitatory neurons were positively correlated with putative inhibitory neurons in the opposite hemisphere, suggesting interhemispheric inhibition via inhibitory neurons. Collectively, these results suggest that symmetric bimanual movements in rodents are modulated by the anterior corpus callosum via both excitatory and inhibitory connections between the two motor cortices. (Okinawa Institute of Science and Technology Graduate University)
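    The segmentation idea described above (laterality of movement speed, and symmetry or asymmetry of movement direction across the forelimbs) can be illustrated with a simple classifier of one movement segment. The thresholds and the cosine-based direction criterion below are illustrative assumptions, not the thesis's exact algorithm.

```python
# Illustrative classification of a bilateral movement segment, in the spirit
# of the analysis described above; thresholds and criteria are assumptions.
import numpy as np

def classify_bilateral_segment(v_left, v_right, speed_thresh=5.0, cos_thresh=0.5):
    """Classify one movement segment from left/right forelimb velocities.

    v_left, v_right : (T, 3) arrays of 3-D velocity samples (e.g. mm/s)
    Returns 'left-only', 'right-only', 'symmetric', or 'asymmetric'.
    """
    speed_l = np.linalg.norm(v_left, axis=1).mean()
    speed_r = np.linalg.norm(v_right, axis=1).mean()

    # Laterality of movement speed: is the segment effectively unimanual?
    if speed_l >= speed_thresh and speed_r < speed_thresh:
        return "left-only"
    if speed_r >= speed_thresh and speed_l < speed_thresh:
        return "right-only"

    # Asymmetry of movement direction: mean cosine between paired velocity
    # vectors; aligned directions indicate a symmetric bimanual movement.
    cos = np.sum(v_left * v_right, axis=1) / (
        np.linalg.norm(v_left, axis=1) * np.linalg.norm(v_right, axis=1) + 1e-9)
    return "symmetric" if cos.mean() > cos_thresh else "asymmetric"
```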

    Barehand Mode Switching in Touch and Mid-Air Interfaces

    Raskin defines a mode as a distinct setting within an interface where the same user input will produce results different from those it would produce in other settings. Most interfaces have multiple modes in which input is mapped to different actions, and mode-switching is simply the transition from one mode to another. In touch interfaces, the current mode can change how a single touch is interpreted: for example, it could draw a line, pan the canvas, select a shape, or enter a command. In Virtual Reality (VR), a hand-gesture-based 3D modelling application may have different modes for object creation, selection, and transformation; depending on the mode, the movement of the hand is interpreted differently. One of the crucial factors determining the effectiveness of an interface is user productivity, and the mode-switching time of different input techniques, whether in a touch interface or a mid-air interface, affects that productivity. Moreover, when touch and mid-air interfaces such as VR are combined, making informed decisions about mode assignment becomes even more complicated. This thesis provides an empirical investigation that characterizes the mode-switching phenomenon in barehand touch-based and mid-air interfaces. It explores the potential of using these input spaces together for a productivity application in VR, and it concludes with a step toward defining and evaluating the multi-faceted mode concept, its characteristics, and its utility when designing user interfaces more generally.
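    The notion of a mode, as defined above, can be made concrete with a small sketch in which the same touch input is dispatched to different actions depending on the current mode. The mode names and handlers below are illustrative, not taken from the thesis.

```python
# Minimal sketch of mode-dependent input mapping: identical input, different
# action per mode. Modes and actions here are hypothetical examples.
from enum import Enum, auto

class Mode(Enum):
    DRAW = auto()
    PAN = auto()
    SELECT = auto()
    COMMAND = auto()

class Canvas:
    def __init__(self):
        self.mode = Mode.DRAW          # current mode

    def switch_mode(self, mode):       # the transition whose cost is studied
        self.mode = mode

    def on_touch(self, x, y):
        # The same touch is interpreted differently in each mode.
        if self.mode is Mode.DRAW:
            print(f"draw line point at ({x}, {y})")
        elif self.mode is Mode.PAN:
            print(f"pan canvas toward ({x}, {y})")
        elif self.mode is Mode.SELECT:
            print(f"select shape under ({x}, {y})")
        else:
            print(f"enter command at ({x}, {y})")
```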

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotic systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical and industrial settings as well as human–robot interaction and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interaction and collaboration; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in the field of AR and robotics research, offering insights into the recent state of the art and prospects for improvement.

    In Vivo Neuromechanics: Decoding Causal Motor Neuron Behavior with Resulting Musculoskeletal Function.

    Human motor function emerges from the interaction between the neuromuscular and the musculoskeletal systems. Despite knowledge of the mechanisms underlying neural and mechanical function individually, there is still no adequate understanding of the neuro-mechanical interplay in the neuro-musculo-skeletal system. This currently represents the major challenge to the understanding of human movement. We address this challenge by proposing a paradigm for investigating the contribution of spinal motor neurons to skeletal joint mechanical function in the intact human in vivo. We employ multi-muscle spatial sampling and deconvolution of high-density fiber electrical activity to decode accurate α-motor neuron discharges across five lumbosacral segments in the human spinal cord. We use complete α-motor neuron discharge series to drive subject-specific forward models of the musculoskeletal system in open loop, with no corrective feedback. We perform validation tests in which mechanical moments are estimated with no knowledge of reference data over unseen conditions. This enables accurate blinded estimation of ankle function purely from motor neuron information and, remarkably, makes it possible to observe causal associations between spinal motor neuron activity and joint moment control. We provide a new class of neural data-driven musculoskeletal modeling formulations for bridging between the neural and mechanical levels of movement in vivo, with implications for understanding motor physiology, pathology, and recovery.
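    The open-loop, neural data-driven pipeline described above can be sketched as follows: decoded motor neuron discharge times drive a twitch-based muscle model with no corrective feedback, and the resulting muscle forces are combined into a joint moment through moment arms. The twitch shape, parameters, and two-muscle example are illustrative assumptions, not the study's actual musculoskeletal model.

```python
# Hedged sketch of an open-loop, discharge-driven joint moment estimate; the
# twitch model, parameters, and muscle set are illustrative assumptions.
import numpy as np

def twitch(t, peak_force=10.0, t_peak=0.05):
    """Simple twitch response (N) to one discharge at t = 0."""
    return peak_force * (t / t_peak) * np.exp(1.0 - t / t_peak) * (t >= 0)

def muscle_force(discharge_times, duration, fs=1000.0):
    """Superpose twitches from all decoded discharges of one motor unit pool."""
    t = np.arange(0.0, duration, 1.0 / fs)
    f = np.zeros_like(t)
    for td in discharge_times:
        f += twitch(t - td)
    return t, f

def joint_moment(forces_by_muscle, moment_arms):
    """Open-loop joint moment: sum of muscle forces times their moment arms."""
    return sum(moment_arms[m] * f for m, f in forces_by_muscle.items())

# Example with hypothetical discharge series and moment arms (not real data):
# t, f_sol = muscle_force([0.10, 0.15, 0.21], duration=1.0)
# t, f_ta  = muscle_force([0.50, 0.56], duration=1.0)
# m = joint_moment({"soleus": f_sol, "tib_ant": f_ta},
#                  {"soleus": 0.05, "tib_ant": -0.04})
```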