12 research outputs found

    Review of Anthropomorphic Head Stabilisation and Verticality Estimation in Robots

    In many walking, running, flying, and swimming animals, including mammals, reptiles, and birds, the vestibular system plays a central role in verticality estimation and is often associated with a head stabilisation (in rotation) behaviour. Head stabilisation, in turn, subserves gaze stabilisation, postural control, visual-vestibular information fusion, and spatial awareness via the active establishment of a quasi-inertial frame of reference. Head stabilisation helps animals to cope with the computational consequences of angular movements that complicate the reliable estimation of the vertical direction. We suggest that this strategy could also benefit free-moving robotic systems, such as locomoting humanoid robots, which are typically equipped with inertial measurement units. Free-moving robotic systems could gain the full benefits of inertial measurements if the measurement units are placed on independently orientable platforms, such as human-like heads. We illustrate these benefits by analysing recent humanoid robot designs and control approaches.
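
    As an illustration of the verticality-estimation problem discussed above, the following is a minimal sketch (not taken from the review) of how a robot might fuse gyroscope and accelerometer data to track the gravity direction in the body frame; the complementary-filter form, the gain k, and the function name are illustrative assumptions.

        import numpy as np

        def update_vertical(g_est, gyro, accel, dt, k=0.02):
            # One complementary-filter step for the gravity direction expressed
            # in the body (e.g. head) frame.
            #   g_est : current unit estimate of the gravity direction
            #   gyro  : angular velocity of the body [rad/s]
            #   accel : accelerometer reading [m/s^2] (gravity + linear accel.)
            # A world-fixed vector seen from a rotating body evolves as
            # dg/dt = -omega x g, so angular movement continuously perturbs
            # the estimate; the accelerometer correction is trusted only weakly.
            g_pred = g_est - dt * np.cross(gyro, g_est)
            g_meas = accel / (np.linalg.norm(accel) + 1e-9)
            g_new = (1.0 - k) * g_pred + k * g_meas
            return g_new / np.linalg.norm(g_new)

    In this simplified picture, an actively stabilised head keeps the gyroscope term small and exposes the accelerometer to less locomotion-induced contamination, which illustrates the benefit the review attributes to mounting the measurement unit on an independently orientable platform.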

    A comprehensive gaze stabilization controller based on cerebellar internal models

    Gaze stabilization is essential for clear vision; it is the combined effect of two reflexes relying on vestibular inputs: the vestibulocollic reflex (VCR), which stabilizes the head in space, and the vestibulo-ocular reflex (VOR), which stabilizes the visual axis to minimize retinal image motion. The VOR works in conjunction with the opto-kinetic reflex (OKR), a visual feedback mechanism that allows the eye to move at the same speed as the observed scene. Together they keep the image stationary on the retina. In this work, we implement on a humanoid robot a model of gaze stabilization based on the coordination of the VCR, VOR and OKR. The model, inspired by neuroscientific cerebellar theories, is provided with learning and adaptation capabilities based on internal models. We present results for the gaze stabilization model in three sets of experiments conducted on the SABIAN robot and on the iCub simulator, validating the robustness of the proposed control method. The first set of experiments focused on the controller's response to a set of disturbance frequencies along the vertical plane. The second set shows the performance of the system under three-dimensional disturbances. The last set of experiments was carried out to test the capability of the proposed model to stabilize the gaze in locomotion tasks. The results confirm that the proposed model is beneficial in all cases, reducing the retinal slip (the velocity of the image on the retina) and keeping the orientation of the head stable.
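
    As a rough illustration of how the VOR and OKR terms combine, here is a minimal single-axis sketch; the fixed gains and function name are assumptions for illustration and do not reflect the adaptive, cerebellum-inspired internal models used in the paper.

        def gaze_stabilization_step(head_velocity, retinal_slip,
                                    k_vor=1.0, k_okr=0.5):
            # Eye velocity command for one axis (e.g. pan), per control cycle.
            #   head_velocity : head angular velocity from the vestibular
            #                   sensor / IMU [rad/s]
            #   retinal_slip  : image velocity on the retina from vision [rad/s]
            vor = -k_vor * head_velocity   # feedforward: counter-rotate the eye
            okr = -k_okr * retinal_slip    # feedback: null residual image motion
            return vor + okr

    A VCR term of a similar form would drive the neck joints from the head-in-space orientation error, keeping the head itself stable.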

    Combining central pattern generators with the electromagnetism-like algorithm for head motion stabilization during quadruped robot locomotion

    Visually-guided locomotion is important for autonomous robotics. However, there are several difficulties; for instance, the head shaking that results from the robot's locomotion itself constrains stable image acquisition and the possibility of relying on that information to act accordingly. In this article, we propose a controller architecture that is able to generate locomotion for a quadruped robot and to generate head motion that minimizes the head motion induced by locomotion itself. The movement controllers are biologically inspired by the concept of Central Pattern Generators (CPGs). CPGs are modelled as nonlinear dynamical systems, coupled Hopf oscillators. This approach makes it possible to explicitly specify parameters such as amplitude, offset and frequency of movement and to smoothly modulate the generated oscillations according to changes in these parameters. We take advantage of this particularity and propose a combined approach to generate head movement stabilization on a quadruped robot, using CPGs and a global optimization algorithm. The best set of parameters for the head movement is computed by the electromagnetism-like algorithm in order to reduce the head shaking caused by locomotion. Experimental results on a simulated AIBO robot demonstrate that the proposed approach generates head movement that does not eliminate, but does reduce, the motion induced by locomotion.
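
    Since the abstract describes CPGs built from coupled Hopf oscillators with explicit amplitude, offset and frequency parameters, a minimal single-oscillator sketch may help; the convergence gain, step size and example values are illustrative assumptions, and the coupling between oscillators is omitted.

        import numpy as np

        def hopf_step(x, y, dt, mu, omega, offset=0.0, gamma=10.0):
            # One Euler step of a Hopf oscillator; x converges to a limit cycle
            # of amplitude sqrt(mu) around `offset` at angular frequency omega.
            r2 = (x - offset) ** 2 + y ** 2
            dx = gamma * (mu - r2) * (x - offset) - omega * y
            dy = gamma * (mu - r2) * y + omega * (x - offset)
            return x + dt * dx, y + dt * dy

        # Example: a 1.5 Hz joint trajectory of amplitude 0.3 rad around 0.1 rad.
        x, y, trajectory = 0.35, 0.0, []
        for _ in range(2000):
            x, y = hopf_step(x, y, dt=0.005, mu=0.3 ** 2,
                             omega=2 * np.pi * 1.5, offset=0.1)
            trajectory.append(x)

    Because amplitude, frequency and offset enter the equations as explicit parameters, changing them online modulates the trajectory smoothly, which is the property the head-stabilization optimization exploits.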

    Evolution strategies combined with central pattern generators for head motion minimization during quadruped robot locomotion

    In autonomous robotics, the head shaking induced by locomotion is a relevant and still unsolved problem. This problem constrains stable image acquisition and the possibility of relying on that information to act accordingly. In this article, we propose a movement controller to generate locomotion and head movement. Our aim is to generate the head movement required to minimize the head motion induced by locomotion itself. The movement controllers are biologically inspired by the concept of Central Pattern Generators (CPGs). CPGs are modelled as nonlinear dynamical systems, coupled Hopf oscillators. This approach makes it possible to explicitly specify parameters such as amplitude, offset and frequency of movement and to smoothly modulate the generated oscillations according to changes in these parameters. Based on these ideas, we propose a combined approach to generate head movement stabilization on a quadruped robot, using CPGs and an evolution strategy. The best set of parameters for the head movement is computed by an evolution strategy. Experiments were performed on a simulated AIBO robot. The obtained results demonstrate the feasibility of the approach, reducing the overall head movement.
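
    A minimal sketch of how an evolution strategy could search the head-trajectory parameters is given below; the (1+1)-ES with the 1/5th success rule is a generic choice, and evaluate_head_motion is a hypothetical stand-in for running the simulated robot and measuring residual head motion, not the authors' actual fitness function.

        import numpy as np

        def one_plus_one_es(evaluate_head_motion, x0, sigma=0.1, iters=200):
            # (1+1) evolution strategy with 1/5th-rule step-size adaptation.
            # x0 is a vector of head CPG parameters (e.g. amplitude, offset,
            # phase per head joint); lower cost means less residual head motion.
            best_x = np.asarray(x0, dtype=float)
            best_f = evaluate_head_motion(best_x)
            successes = 0
            for i in range(1, iters + 1):
                candidate = best_x + sigma * np.random.randn(best_x.size)
                f = evaluate_head_motion(candidate)
                if f < best_f:
                    best_x, best_f = candidate, f
                    successes += 1
                if i % 20 == 0:                 # adapt the mutation step size
                    sigma *= 1.5 if successes / 20.0 > 0.2 else 0.7
                    successes = 0
            return best_x, best_f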

    Enhancing the Sense of Attention from an Assistance Mobile Robot by Improving Eye-Gaze Contact from Its Iconic Face Displayed on a Flat Screen

    One direct way to express a sense of attention in human interaction is through the gaze. This paper presents the enhancement of the sense of attention conveyed by the face of a human-sized mobile robot during an interaction. This mobile robot was designed as an assistance mobile robot and uses a flat screen at the top of the robot to display an iconic (simplified) face with big round eyes and a single line as a mouth. Implementing eye-gaze contact with this iconic face is difficult because real 3D spherical eyes must be simulated in a 2D image while accounting for the perspective of the person interacting with the mobile robot. The perception of eye-gaze contact has been improved by manually calibrating the gaze of the robot relative to the location of the face of the person interacting with the robot. The sense of attention has been further enhanced by implementing cyclic face explorations with saccades in the gaze and by performing blinking and small movements of the mouth.
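
    One way to picture the problem is the mapping from the tracked position of the person's face to the pupil position drawn on the flat screen; the sketch below is a simple geometric approximation with made-up parameter names, not the manual calibration procedure described in the paper.

        import numpy as np

        def pupil_position(eye_center_px, person_pos_m, iris_travel_px=25):
            # Pixel position of the pupil inside a drawn eye so that the iconic
            # face appears to look at the person.
            #   eye_center_px : (x, y) centre of the eye on the screen [px]
            #   person_pos_m  : (x, y, z) of the person's face in a frame centred
            #                   on the screen, x right, y up, z out of the screen [m]
            d = np.asarray(person_pos_m, dtype=float)
            d = d / (np.linalg.norm(d) + 1e-9)        # unit gaze direction
            dx = iris_travel_px * d[0]
            dy = -iris_travel_px * d[1]               # screen y grows downward
            return eye_center_px[0] + dx, eye_center_px[1] + dy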

    Pattern Generation for Rough Terrain Locomotion with Quadrupedal Robots: Morphed Oscillators & Sensory Feedback

    Animals are able to locomote on rough terrain without any apparent difficulty, but this does not mean that the locomotor system is simple. The locomotor system is actually a complex multi-input multi-output closed-loop control system. This thesis is dedicated to the design of controllers for rough terrain locomotion, for animal-like quadrupedal robots. We choose the problem of blind rough terrain locomotion as the target of experiments. Blind rough terrain locomotion requires continuous and momentary corrections of leg movements and body posture, and provides a proper testbed to observe the interaction of different modules involved in locomotion control. As for the specific case of this thesis, we have to design rough terrain locomotion controllers that do not depend on the torque-control capability, have limited sensing, and have to be computationally light, all due to the properties of the robotics platform that we use. We propose that a robust locomotion controller, taking into account the aforementioned constraints, is constructed from at least three modules: 1) pattern generators providing the nominal patterns of locomotion; 2) a posture controller continuously adjusting the attitude of the body and keeping the robot upright; and 3) quick reflexes to react to unwanted momentary events like stumbling or an external force impulse. We introduce the framework of morphed oscillators to systematize the design of pattern generators realized as coupled nonlinear oscillators. Morphed oscillators are nonlinear oscillators that can encode arbitrary limit cycle shapes and simultaneously have infinitely large basins of attraction. More importantly, they provide dynamical systems that can assume the role of feedforward locomotion controllers known as Central Pattern Generators (CPGs), and accept discontinuous sensory feedback without the risk of producing discontinuous output. On top of the CPG module, we add a kinematic model-based posture controller inspired by virtual model control (VMC), to control the body attitude. Virtual model control produces forces, and through the application of the Jacobian transpose method, generates torques which are added to the CPG torques. However, because our robots do not have a torque-control capability, we adapt the posture controller by producing task-space velocities instead of forces, thus generating joint-space velocity feedback signals. Since the CPG model used for locomotion generates joint velocities and accepts feedback without the fear of instability or discontinuity, the posture control feedback is easily integrated into the CPG dynamics. Moreover, we introduce feedback signals for adjusting the posture by shifting the trunk positions, which directly update the limit cycle shape of the morphed oscillator nodes of the CPG. Reflexes are added, with minimal complexity, to react to momentary events. We implement simple impulse-based feedback mechanisms inspired by animals and successful rough terrain robots to 1) flex the leg if the robot is stumbling (stumbling correction reflex); 2) extend the leg if an expected contact is missing (leg extension reflex); or 3) initiate a lateral stepping sequence in response to a lateral external perturbation. CPG, posture controller, and reflexes are put together in a modular control architecture alongside additional modules that estimate inclination, control speed and direction, maintain timing of feedback signals, etc. [...]
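
    To make the morphed-oscillator idea concrete, here is a minimal single-node sketch under one common formulation: a phase oscillator drives a shape function g(phi) and the output is attracted to that shape, so arbitrary limit-cycle shapes can be encoded and discontinuous feedback added to the output dynamics is filtered rather than propagated. The specific gains and example shape are illustrative assumptions, not the thesis implementation.

        import numpy as np

        def morphed_oscillator_step(phi, x, dt, omega, shape, dshape, gamma=20.0):
            # One Euler step of a single morphed-oscillator node.
            #   phi    : oscillator phase, driven at angular frequency omega
            #   x      : output state (e.g. a joint set-point)
            #   shape  : g(phi), the desired limit-cycle shape
            #   dshape : dg/dphi
            # On the limit cycle x = g(phi) the feedforward term reproduces the
            # shape exactly; away from it, the gamma term pulls x back, so the
            # basin of attraction covers the whole state space.
            dx = gamma * (shape(phi) - x) + dshape(phi) * omega
            return (phi + dt * omega) % (2 * np.pi), x + dt * dx

        # Example shape: a smooth swing "bump" over one gait cycle.
        g = lambda p: 0.2 * np.sin(p) ** 3
        dg = lambda p: 0.6 * np.sin(p) ** 2 * np.cos(p)
        phi, x = 0.0, 0.0
        for _ in range(2000):
            phi, x = morphed_oscillator_step(phi, x, 0.002, 2 * np.pi, g, dg)

    In this sketch, posture or reflex feedback would simply be added to dx (or used to reshape g), loosely mirroring how the abstract describes feedback being integrated into the CPG dynamics.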

    Humanoid Robots

    For many years, human beings have tried in every way to recreate the complex mechanisms that form the human body. This task is extremely complicated and the results are not totally satisfactory. However, with increasing technological advances based on theoretical and experimental research, humans have managed, to some extent, to copy or imitate some systems of the human body. This research is intended not only to create humanoid robots, a great part of them constituting autonomous systems, but also, in some way, to offer deeper knowledge of the systems that form the human body, with a view to possible applications in the technology of rehabilitation of human beings, gathering together studies related not only to Robotics but also to Biomechanics, Biomimetics, Cybernetics, and other areas. This book presents a series of studies inspired by this ideal, carried out by various researchers worldwide, seeking to analyse and discuss diverse subjects related to humanoid robots. The presented contributions explore aspects of robotic hands, learning, language, vision and locomotion.

    Aerospace Medicine and Biology: A continuing bibliography with indexes, supplement 267, January 1985

    This publication is a cumulative index to the abstracts contained in Supplements 255 through 266 of Aerospace Medicine and Biology: A Continuing Bibliography. It includes seven indexes: subject, personal author, corporate source, foreign technology, contract number, report number, and accession number.

    Life Sciences Program Tasks and Bibliography for FY 1997

    This document includes information on all peer-reviewed projects funded by the Office of Life and Microgravity Sciences and Applications, Life Sciences Division, during fiscal year 1997. This document will be published annually and made available to scientists in the space life sciences field both as a hard copy and as an interactive web page on the Internet.