10 research outputs found

    Insect-inspired visual navigation on-board an autonomous robot: real-world routes encoded in a single layer network

    Insect-inspired models of visual navigation, which operate by scanning for familiar views of the world, have been shown to be capable of robust route navigation in simulation. These familiarity-based navigation algorithms operate by training an artificial neural network (ANN) with views from a training route, so that it can then output a familiarity score for any new view. In this paper we show that such an algorithm – with all computation performed on a small low-power robot – is capable of delivering reliable direction information along real-world outdoor routes, even when scenes contain few local landmarks and have high levels of noise (from variable lighting and terrain). Indeed, routes can be precisely recapitulated, and we show that the required computation and storage do not increase with the number of training views. Thus the ANN provides a compact representation of the knowledge needed to traverse a route. In fact, rather than losing information, there are instances where the use of an ANN ameliorates the problems of suboptimal paths caused by tortuous training routes. Our results suggest the feasibility of familiarity-based navigation for long-range autonomous visual homing.
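The scheme described above (train a single-layer network on route views, then score rotated copies of the current view and steer toward the most familiar heading) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the Infomax-style anti-Hebbian update rule, class names, and parameter values are all assumptions.

```python
import numpy as np

class FamiliarityNet:
    """Single-layer familiarity network (illustrative sketch).

    Trained with an assumed Infomax-style anti-Hebbian rule so that
    responses to previously seen views are suppressed; lower total
    activation then signals higher familiarity.
    """

    def __init__(self, n_inputs, n_units=16, lr=2e-3, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 1.0, (n_units, n_inputs))
        self.W /= np.linalg.norm(self.W, axis=1, keepdims=True)
        self.lr = lr

    def train(self, view):
        x = view.ravel().astype(float)
        u = self.W @ x                    # unit activations for this view
        y = np.tanh(u)
        n = len(u)
        # Anti-Hebbian update: suppress strong responses to the trained
        # view while the +W term keeps weights from collapsing to zero.
        self.W += (self.lr / n) * (self.W - np.outer(y + u, u @ self.W))

    def familiarity(self, view):
        # Familiar views produce weak activation; negate the total
        # so that higher score means more familiar.
        x = view.ravel().astype(float)
        return -np.abs(self.W @ x).sum()

def best_heading(net, panorama, n_rotations=36):
    """Scan rotated copies of a panoramic view; return the most familiar heading."""
    _, w = panorama.shape
    scores = [net.familiarity(np.roll(panorama, int(i * w / n_rotations), axis=1))
              for i in range(n_rotations)]
    return int(np.argmax(scores)) * 360.0 / n_rotations
```

Note that the weight matrix has a fixed size regardless of how many views are trained, which is consistent with the fixed-storage claim in the abstract.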

    Recent advances in evolutionary and bio-inspired adaptive robotics: Exploiting embodied dynamics

    This paper explores current developments in evolutionary and bio-inspired approaches to autonomous robotics, concentrating on research from our group at the University of Sussex. These developments are discussed in the context of advances in the wider fields of adaptive and evolutionary approaches to AI and robotics, focusing on the exploitation of embodied dynamics to create behaviour. Four case studies highlight various aspects of such exploitation. The first exploits the dynamical properties of a physical electronic substrate, demonstrating for the first time how component-level analog electronic circuits can be evolved directly in hardware to act as robot controllers. The second develops novel, effective and highly parsimonious navigation methods inspired by the way insects exploit the embodied dynamics of innate behaviours. Combining biological experiments with robotic modeling, it is shown how rapid route learning can be achieved with the aid of navigation-specific visual information that is provided and exploited by the innate behaviours. The third study focuses on the exploitation of neuromechanical chaos in the generation of robust motor behaviours. It is demonstrated how chaotic dynamics can be exploited to power a goal-driven search for desired motor behaviours in embodied systems using a particular control architecture based around neural oscillators. The dynamics are shown to be chaotic at all levels in the system, from the neural to the embodied mechanical. The final study explores the exploitation of the dynamics of brain-body-environment interactions for efficient, agile flapping-wing flight. It is shown how a multi-objective evolutionary algorithm can be used to evolve dynamical neural controllers for a simulated flapping-wing robot with feathered wings. Results demonstrate that robust, stable, agile flight is achieved in the face of random wind gusts by exploiting complex asymmetric dynamics partly enabled by continually changing wing and tail morphologies.
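The multi-objective evolutionary approach mentioned in the final case study can be sketched in its most generic form as a Pareto-dominance selection loop. This is a minimal illustration under assumed toy objectives, not the authors' algorithm; in practice the fitness function would come from simulating the flapping-wing robot (e.g. scoring distance travelled and flight stability).

```python
import random

def dominates(a, b):
    """True if fitness tuple a Pareto-dominates b (maximising every objective)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(population, fitness):
    """Individuals not dominated by any other member of the population."""
    front = []
    for ind in population:
        if not any(dominates(fitness(other), fitness(ind))
                   for other in population if other is not ind):
            front.append(ind)
    return front

def evolve(fitness, n_params=8, pop_size=20, generations=30, sigma=0.1, seed=0):
    """Evolve parameter vectors (e.g. neural controller weights) by
    keeping the non-dominated set and refilling with mutated copies."""
    rng = random.Random(seed)
    pop = [[rng.gauss(0.0, 1.0) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        front = pareto_front(pop, fitness)
        pop = list(front)
        # Refill the population by mutating non-dominated parents.
        while len(pop) < pop_size:
            parent = rng.choice(front)
            pop.append([w + rng.gauss(0.0, sigma) for w in parent])
    return pareto_front(pop, fitness)
```

Because selection keeps the whole non-dominated set rather than a single best individual, the result is a spread of trade-off solutions between the competing objectives.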

    Robust view based navigation through view classification

    Current implementations of view-based navigation on robots have shown success, but are limited to routes of less than 10 m [1], [2]. This is in part because current strategies do not take into account whether a view has been correctly recognised, moving in the most familiar direction given by the rotational familiarity function (RFF) regardless of prediction confidence. We demonstrate that it is possible to use the shape of the RFF to classify whether the current view is from a known position, and thus likely to provide valid navigational information, or from a position which is unknown, aliased or occluded and therefore likely to result in erroneous movement. Our model could classify these four view types with accuracies of 1.00, 0.91, 0.97 and 0.87 respectively. We hope to use these results to extend online view-based navigation and prevent robot loss in complex environments.
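The central idea above, reading off the shape of the RFF rather than only its most familiar direction, might be sketched as follows. The shape features used here (peak prominence, peak width, number of competing peaks) and the thresholds are hypothetical stand-ins chosen for illustration, not the paper's classifier.

```python
import numpy as np

def count_local_maxima(rff, frac=0.8):
    """Count distinct local maxima above frac * global peak (circular array)."""
    rff = np.asarray(rff, dtype=float)
    prev = np.roll(rff, 1)
    nxt = np.roll(rff, -1)
    return int(np.sum((rff > prev) & (rff >= nxt) & (rff >= frac * rff.max())))

def rff_features(rff):
    """Simple shape features of an RFF sampled over 360 degrees of rotation."""
    rff = np.asarray(rff, dtype=float)
    peak = rff.max()
    baseline = np.median(rff)
    prominence = peak - baseline          # how much the best heading stands out
    if prominence > 0:
        above = rff >= baseline + 0.5 * prominence
    else:
        above = rff >= peak
    width = above.mean()                  # fraction of headings near the peak
    return prominence, width, count_local_maxima(rff)

def classify_view(rff):
    """Hypothetical rule-based mapping from RFF shape to the four view types."""
    prominence, width, n_peaks = rff_features(rff)
    if prominence < 0.1:                  # flat RFF: nothing looks familiar
        return "unknown"
    if n_peaks > 1:                       # several similar peaks: ambiguous place
        return "aliased"
    if width > 0.4:                       # broad, smeared peak
        return "occluded"
    return "known"                        # single sharp peak
```

A real classifier would likely be learned from labelled RFFs rather than hand-thresholded, but the sketch shows why a flat, multi-peaked or smeared RFF warrants withholding a movement decision.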
