
    Predicting the internal model of a robotic system from its morphology

    The estimation of the internal model of a robotic system results from the interaction of its morphology, sensors and actuators with a particular environment. Model learning techniques based on supervised machine learning are widespread for determining the internal model. An important limitation of such approaches is that once a model has been learnt, it does not behave properly when the robot's morphology is changed. From this it follows that there must exist a relationship between the morphology and the internal model. We propose a model of this correlation between the morphology and the internal model parameters, so that a new internal model can be predicted when the morphological parameters are modified. Different neural network architectures are proposed to address this high-dimensional regression problem. A case study, a pan-tilt robot head executing saccadic movements, is analysed in detail to illustrate and evaluate the performance of the approach. The best results are obtained for an architecture with parallel neural networks, owing to the independence of its outputs. These results can have great significance, since the predicted parameters can dramatically speed up the adaptation process following a change in morphology.
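As a rough illustration of the parallel architecture this abstract refers to, the sketch below trains one small regressor per internal-model parameter, each mapping the morphological parameters to a single output. The data, dimensions and network sizes are placeholders chosen for the example, not values from the paper.

```python
# Minimal sketch: one small MLP per internal-model parameter ("parallel" architecture),
# each mapping morphological parameters to a single output. Placeholder data only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_morph, n_model = 200, 4, 12        # hypothetical dimensions
X = rng.uniform(-1.0, 1.0, size=(n_samples, n_morph))   # morphological parameters
Y = rng.normal(size=(n_samples, n_model))                # internal-model parameters (synthetic)

# Train one independent network per output dimension.
nets = [MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, Y[:, j])
        for j in range(n_model)]

def predict_internal_model(morphology):
    """Predict all internal-model parameters for a new morphology."""
    m = np.asarray(morphology).reshape(1, -1)
    return np.array([net.predict(m)[0] for net in nets])

print(predict_internal_model([0.1, -0.3, 0.5, 0.0]))
```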

    Learning at the Ends: From Hand to Tool Affordances in Humanoid Robots

    One of the open challenges in designing robots that operate successfully in the unpredictable human environment is how to make them able to predict what actions they can perform on objects, and what their effects will be, i.e., the ability to perceive object affordances. Since modeling all the possible world interactions is unfeasible, learning from experience is required, posing the challenge of collecting a large amount of experiences (i.e., training data). Typically, a manipulative robot operates on external objects by using its own hands (or similar end-effectors), but in some cases the use of tools may be desirable. Nevertheless, it is reasonable to assume that while a robot can collect many sensorimotor experiences using its own hands, this cannot happen for all possible human-made tools. Therefore, in this paper we investigate the developmental transition from hand to tool affordances: which sensorimotor skills acquired by a robot with its bare hands can be employed for tool use? By employing a visual and motor imagination mechanism to represent different hand postures compactly, we propose a probabilistic model to learn hand affordances, and we show how this model can generalize to estimate the affordances of previously unseen tools, ultimately supporting planning, decision-making and tool selection tasks in humanoid robots. We present experimental results with the iCub humanoid robot, and we publicly release the collected sensorimotor data in the form of a hand posture affordances dataset.
    Comment: dataset available at https://vislab.isr.tecnico.ulisboa.pt/; IEEE International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob 2017)
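The sketch below is not the paper's model; it only illustrates the general idea of treating affordance learning as learning p(effect | effector shape, object shape, action), so that the same predictor can later be queried with a tool's shape descriptor in place of a hand-posture descriptor. All features, dimensions and data are hypothetical placeholders.

```python
# Loose illustration: a Bayesian linear regressor standing in for a probabilistic
# affordance model. Swapping the hand descriptor for a tool descriptor at query
# time mimics the hand-to-tool generalisation discussed above. Placeholder data.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(1)
n = 300
hand_shape = rng.uniform(0, 1, size=(n, 3))     # hypothetical compact hand-posture descriptor
obj_shape = rng.uniform(0, 1, size=(n, 2))      # hypothetical object roundness / size
action = rng.integers(0, 2, size=(n, 1))        # hypothetical action id (e.g. push vs. pull)
X = np.hstack([hand_shape, obj_shape, action])
effect = X @ rng.normal(size=6) + 0.05 * rng.normal(size=n)   # synthetic "object displacement"

model = BayesianRidge().fit(X, effect)

# Query with an unseen tool: substitute the tool's shape descriptor for the hand's.
tool_shape = np.array([0.9, 0.2, 0.4])
query = np.hstack([tool_shape, [0.5, 0.3], [1]]).reshape(1, -1)
mean, std = model.predict(query, return_std=True)
print(f"predicted effect {mean[0]:.3f} +/- {std[0]:.3f}")
```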

    Flora robotica -- An Architectural System Combining Living Natural Plants and Distributed Robots

    Key to our project flora robotica is the idea of creating a bio-hybrid system of tightly coupled natural plants and distributed robots to grow architectural artifacts and spaces. Our motivation with this ground research project is to lay a principled foundation for the design and implementation of living architectural systems that provide functionalities beyond those of orthodox building practice, such as self-repair, material accumulation and self-organization. Plants and robots work together to create a living organism that is inhabited by human beings. User-defined design objectives help to steer the directional growth of the plants, but the system's interactions with its inhabitants also determine locations where growth is prohibited or desired (e.g., partitions, windows, occupiable space). We report our plant species selection process and aspects of living architecture. A leitmotif of our project is the rich concept of braiding: braids are produced by robots from continuous material and serve as both scaffolds and initial architectural artifacts before plants take over and grow the desired architecture. We use light and hormones as attraction stimuli and far-red light as a repelling stimulus to influence the plants. Applied sensors range from simple proximity sensing to detect the presence of plants to sophisticated sensing technology, such as electrophysiology and measurements of sap flow. We conclude by discussing our anticipated final demonstrator that integrates key features of flora robotica, such as the continuous growth process of architectural artifacts and self-repair of living architecture.
    Comment: 16 pages, 12 figures

    Choosing new ways to chew

    No abstract available.

    Robots as Powerful Allies for the Study of Embodied Cognition from the Bottom Up

    A large body of compelling evidence has been accumulated demonstrating that embodiment – the agent’s physical setup, including its shape, materials, sensors and actuators – is constitutive for any form of cognition, and as a consequence, models of cognition need to be embodied. In contrast to the methods of the empirical sciences for studying cognition, robots can be freely manipulated and virtually all key variables of their embodiment and control programs can be systematically varied. As such, they provide an extremely powerful tool of investigation. We present a robotic bottom-up or developmental approach, focusing on three stages: (a) low-level behaviors like walking and reflexes, (b) learning regularities in sensorimotor spaces, and (c) human-like cognition. We also show that robot-based research is not only a productive path to deepening our understanding of cognition, but that robots can strongly benefit from human-like cognition in order to become more autonomous, robust, resilient, and safe.

    Bending angle prediction and control of soft pneumatic actuators with embedded flex sensors - a data-driven approach

    In this paper, a purely data-driven modelling approach is presented for predicting and controlling the free bending angle response of a typical soft pneumatic actuator (SPA) embedded with a resistive flex sensor. An experimental setup was constructed to test the SPA at different input pressure values and orientations, while recording the resulting feedback from the embedded flex sensor and on-board pressure sensor. A calibrated high-speed camera captures image frames during actuation, which are then analysed using an image-processing program to calculate the actual bending angle and synchronise it with the recorded sensory feedback. Empirical models were derived from the generated experimental data using two common data-driven modelling techniques: regression analysis and artificial neural networks. Both techniques were validated using a new dataset at untrained operating conditions to evaluate their prediction accuracy. Furthermore, the derived empirical model was used as part of a closed-loop PID controller to estimate and control the bending angle of the tested SPA based on the real-time sensory feedback generated. The tuned PID controller allowed the bending SPA to accurately follow stepped and sinusoidal reference signals, even in the presence of pressure leaks in the pneumatic supply. This work demonstrates how purely data-driven models can be effectively used in controlling the bending of SPAs under different operating conditions, avoiding the need for complex analytical modelling and material characterisation. Ultimately, the aim is to create more controllable soft grippers based on such SPAs with embedded sensing capabilities, to be used in applications requiring both a ‘soft touch’ as well as more controllable object manipulation.
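To make the control scheme concrete, the sketch below runs a discrete PID loop in which the feedback signal is a bending angle estimated from the flex-sensor reading through an assumed empirical mapping. The calibration data, plant model and gains are all hypothetical placeholders, not the authors' values or controller.

```python
# Hedged sketch of a flex-sensor-in-the-loop PID controller. The polynomial
# calibration and the toy plant below are placeholders, not measured data.
import numpy as np

# Hypothetical calibration: flex-sensor reading -> bending angle (degrees).
flex_to_angle = np.poly1d(np.polyfit([1.0, 1.5, 2.0, 2.5], [0, 30, 60, 90], deg=2))

def pid_step(error, state, kp=0.8, ki=0.3, kd=0.05, dt=0.01):
    """One PID update; returns the control output and the updated state."""
    integral = state["integral"] + error * dt
    derivative = (error - state["prev_error"]) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, {"integral": integral, "prev_error": error}

state = {"integral": 0.0, "prev_error": 0.0}
pressure = 0.0
target_angle = 45.0                                 # reference bending angle (degrees)
for _ in range(2000):
    flex_reading = 1.0 + 0.015 * pressure           # toy plant standing in for the real SPA
    angle = flex_to_angle(flex_reading)             # empirical model provides the feedback
    u, state = pid_step(target_angle - angle, state)
    pressure = np.clip(pressure + 0.01 * u, 0.0, 120.0)   # command supply pressure (kPa)
print(f"final angle ~ {flex_to_angle(1.0 + 0.015 * pressure):.1f} deg")
```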

    Fast Damage Recovery in Robotics with the T-Resilience Algorithm

    Damage recovery is critical for autonomous robots that need to operate for a long time without assistance. Most current methods are complex and costly because they require anticipating each potential damage in order to have a contingency plan ready. As an alternative, we introduce the T-Resilience algorithm, a new algorithm that allows robots to quickly and autonomously discover compensatory behaviors in unanticipated situations. This algorithm equips the robot with a self-model and discovers new behaviors by learning to avoid those that perform differently in the self-model and in reality. Our algorithm thus does not identify the damaged parts, but it implicitly searches for efficient behaviors that do not use them. We evaluate the T-Resilience algorithm on a hexapod robot that needs to adapt to leg removal, broken legs and motor failures; we compare it to stochastic local search, policy gradient and the self-modeling algorithm proposed by Bongard et al. The behavior of the robot is assessed on-board thanks to an RGB-D sensor and a SLAM algorithm. Using only 25 tests on the robot and an overall running time of 20 minutes, T-Resilience consistently leads to substantially better results than the other approaches.
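The sketch below conveys only the core idea stated in this abstract: prefer behaviours whose predicted performance in the self-model agrees with the few measurements taken on the (possibly damaged) real robot. It is a crude nearest-neighbour stand-in for the transferability estimate, with toy placeholder functions, not the actual T-Resilience algorithm.

```python
# Hedged, simplified sketch of the self-model-vs-reality idea. All functions and
# numbers are toy placeholders; the real algorithm differs substantially.
import random
random.seed(0)

def simulate(b):            # performance predicted by the self-model (placeholder)
    return sum(b) / len(b)

def measure(b):             # costly real-world trial; toy damage: first motor useless
    return sum([0.0] + list(b[1:])) / len(b)

population = [tuple(random.random() for _ in range(6)) for _ in range(200)]
tested = {}                                       # behaviours tried on the robot -> measured score

def transfer_gap(b):
    """Crude transferability estimate: |sim - real| at the nearest tested behaviour."""
    if not tested:
        return 0.0
    nearest = min(tested, key=lambda t: sum((x - y) ** 2 for x, y in zip(t, b)))
    return abs(simulate(nearest) - tested[nearest])

for _ in range(25):                               # small real-test budget, as in the abstract
    candidates = [b for b in population if b not in tested]
    best = max(candidates, key=lambda b: simulate(b) - transfer_gap(b))
    tested[best] = measure(best)

print("best measured performance:", round(max(tested.values()), 3))
```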

    Data-driven bending angle prediction of soft pneumatic actuators with embedded flex sensors

    In this paper, resistive flex sensors have been embedded at the strain-limiting layer of soft pneumatic actuators, in order to provide sensory feedback that can be utilised in predicting their bending angle during actuation. An experimental setup was prepared to test the soft actuators under controllable operating conditions, record the resulting sensory feedback, and synchronise this with the actual bending angles measured using an image-processing program developed for this purpose. Regression analysis and neural networks are the two data-driven modelling techniques implemented and compared in this study, to evaluate their ability to predict the bending angle response of the tested soft actuators at different input pressures and testing orientations. This serves as a step towards controlling this class of soft bending actuators using data-driven empirical models, lifting the need for complex analytical modelling and material characterisation. The aim is to ultimately create a more controllable version of this class of soft pneumatic actuators with embedded sensing capabilities, to act as compliant soft gripper fingers that can be used in applications requiring both a ‘soft touch’ as well as more controllable object manipulation.
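As a minimal sketch of the regression-analysis route mentioned above, the snippet fits the bending angle as a polynomial function of input pressure, testing orientation and flex-sensor reading, then reports the error on held-out conditions. The synthetic data and chosen features are assumptions for illustration only.

```python
# Minimal sketch of data-driven bending-angle regression on placeholder data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(2)
n = 400
pressure = rng.uniform(0, 120, n)                 # input pressure (kPa), hypothetical range
orientation = rng.choice([0, 90, 180], n)         # testing orientation (deg)
flex = 1.0 + 0.012 * pressure + 0.001 * orientation + 0.02 * rng.normal(size=n)
angle = 55.0 * (flex - 1.0) + rng.normal(scale=1.0, size=n)   # synthetic "camera" ground truth

X = np.column_stack([pressure, orientation, flex])
X_tr, X_te, y_tr, y_te = train_test_split(X, angle, test_size=0.25, random_state=0)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_tr, y_tr)
print(f"validation MAE: {mean_absolute_error(y_te, model.predict(X_te)):.2f} deg")
```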