
    Near range path navigation using LGMD visual neural networks

    In this paper, we propose a method for near-range path navigation for a mobile robot using a pair of biologically inspired visual neural networks – lobula giant movement detectors (LGMDs). In the proposed binocular-style visual system, each LGMD processes images covering part of the wide field of view and extracts relevant visual cues as its output. The outputs of the two LGMDs are compared and translated into executable motor commands that control the wheels of the robot in real time. A stronger signal from the LGMD on one side pushes the robot away from that side step by step, so the robot can navigate a visual environment naturally with the proposed vision system. Our experiments showed that this bio-inspired system worked well in different scenarios.
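    The bilateral comparison described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the change-based excitation proxy, gains and speeds are all hypothetical stand-ins for a full LGMD model.

```python
# Hypothetical sketch: steer a differential-drive robot away from the side
# whose LGMD-like detector responds more strongly.

def lgmd_excitation(frame_prev, frame_curr):
    """Crude LGMD proxy: mean absolute luminance change in a half-image.
    A real LGMD model would add lateral inhibition and temporal dynamics."""
    return sum(abs(c - p) for p, c in zip(frame_prev, frame_curr)) / len(frame_curr)

def wheel_commands(left_out, right_out, base_speed=0.3, gain=0.5):
    """A stronger left signal speeds up the left wheel, veering the robot
    to the right, i.e. away from the more threatening side."""
    diff = gain * (left_out - right_out)
    return base_speed + diff, base_speed - diff  # (v_left, v_right)
```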

    Biologically inspired vision systems in robotics

    In recent years, the International Journal of Advanced Robotic Systems, under the topic of Vision Systems, has especially welcomed papers that cover any aspect of biologically inspired vision in robots. As Guest Editors of the Special Issue on “Biologically Inspired Vision Systems in Robotics,” we feel that living beings still have much to tell us about the design and development of robotics.

    Taking Inspiration from Flying Insects to Navigate inside Buildings

    These days, flying insects are seen as genuinely agile micro air vehicles fitted with smart sensors that are also parsimonious in their use of brain resources. They are able to visually navigate in unpredictable and GPS-denied environments. Understanding how such tiny animals work would help engineers to figure out different issues relating to drone miniaturization and navigation inside buildings. To turn a drone of ~1 kg into a robot, miniaturized conventional avionics can be employed; however, this results in a loss of flight autonomy. On the other hand, turning a drone with a mass between ~1 g (or less) and ~500 g into a robot requires an innovative approach that takes inspiration from flying insects, both with regard to their flapping-wing propulsion system and their sensory system, based mainly on motion vision, in order to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter will provide a snapshot of the current state of the art in the field of bioinspired optic flow sensors and optic flow-based direct feedback loops applied to micro air vehicles flying inside buildings.
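    The "direct feedback loop" idea can be illustrated with a toy optic-flow regulator. This is a hedged sketch, not any specific autopilot from the chapter: holding ventral optic flow (forward speed divided by ground height, in rad/s) at a setpoint by adjusting height; all gains and values are invented for illustration.

```python
# Toy optic-flow hold: ventral flow omega = v / h, regulated to omega_set
# by a proportional adjustment of height h (all parameters hypothetical).

def step(v, h, omega_set=2.0, k=0.5, dt=0.05):
    """One control tick: climb when the measured flow exceeds the setpoint."""
    omega = v / h
    return h + k * (omega - omega_set) * dt

# Starting too low (flow too high), the loop climbs until v / h = omega_set.
h, v = 0.5, 2.0
for _ in range(500):
    h = step(v, h)
# h settles near v / omega_set = 1.0
```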

    Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review

    Motion perception is a critical capability that determines many aspects of insects' lives, including avoiding predators, foraging and so forth. A good number of motion detectors have been identified in insect visual pathways. Computational modelling of these motion detectors has not only provided effective solutions for artificial intelligence but has also benefited the understanding of complicated biological visual systems. These biological mechanisms, shaped by millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models in the literature that originate from biological research on insect visual systems. These motion perception models, or neural networks, comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction selective neurons (DSNs) in fruit flies, bees and locusts, and the small target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modelling studies, we summarise the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multiple-systems integration and hardware realisation of these bio-inspired motion perception models.
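    The direction selectivity of DSN-style models is classically captured by the Hassenstein–Reichardt elementary motion detector: two neighbouring photoreceptors, each arm delaying one input and multiplying it with the undelayed neighbour, with the mirror arm subtracted. The sketch below is a minimal discrete-time version with invented signals, not a model from the review.

```python
# Minimal Hassenstein-Reichardt correlator: positive output for motion
# from receptor A towards receptor B, negative for the reverse direction.

def hr_correlator(signal_a, signal_b, delay=1):
    """signal_a, signal_b: luminance time series at two adjacent receptors."""
    out = []
    for t in range(delay, len(signal_a)):
        preferred = signal_a[t - delay] * signal_b[t]  # A's delayed copy meets B
        null = signal_b[t - delay] * signal_a[t]       # mirror arm
        out.append(preferred - null)
    return out
```

    Feeding it a pulse that hits receptor A one step before receptor B yields a net positive response; reversing the order flips the sign, which is the direction selectivity the review's DSN models build on.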

    Reactive direction control for a mobile robot: A locust-like control of escape direction emerges when a bilateral pair of model locust visual neurons are integrated

    Locusts possess a bilateral pair of uniquely identifiable visual neurons that respond vigorously to the image of an approaching object. These neurons are called the lobula giant movement detectors (LGMDs). The locust LGMDs have been extensively studied, and this has led to the development of an LGMD model for use as an artificial collision detector in robotic applications. To date, robots have been equipped with only a single, central artificial LGMD sensor, which triggers a non-directional stop or rotation when a potentially colliding object is detected. Clearly, for a robot to behave autonomously, it must react differently to stimuli approaching from different directions. In this study, we implement a bilateral pair of LGMD models in Khepera robots equipped with normal and panoramic cameras. We integrate the responses of these LGMD models using methodologies inspired by research on escape direction control in cockroaches. Using ‘randomised winner-take-all’ or ‘steering wheel’ algorithms for LGMD model integration, the Khepera robots could escape an approaching threat in real time and with a similar distribution of escape directions to real locusts. We also found that, by optimising these algorithms, we could use them to integrate the left and right DCMD responses of real jumping locusts offline and reproduce the actual escape directions the locusts took in a particular trial. Our results significantly advance the development of an artificial collision detection and evasion system based on the locust LGMD by giving it reactive control over robot behaviour. The success of this approach may also indicate some important areas to pursue in future biological research.
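    The two integration schemes named above can be sketched schematically. This is an illustrative reading of the general ideas, not the paper's algorithms: the noise model, gain, and output conventions are all hypothetical.

```python
import random

def winner_take_all(left, right, noise=0.2, rng=random.Random(0)):
    """Randomised WTA: Gaussian noise is added to each side's LGMD output,
    and the robot escapes away from the side that wins the comparison."""
    l = left + rng.gauss(0, noise)
    r = right + rng.gauss(0, noise)
    return "escape_right" if l > r else "escape_left"

def steering_wheel(left, right, gain=90.0):
    """Graded scheme: turn angle proportional to the normalised bilateral
    difference, directed away from the stronger (more threatened) side.
    Negative angle = turn right, i.e. away from a left-side threat."""
    drive = (left - right) / max(left + right, 1e-9)
    return -gain * drive
```

    The noisy WTA reproduces a distribution of discrete escape directions, while the steering-wheel scheme gives a continuous turn angle; which matches locust behaviour better is exactly the kind of question the offline DCMD comparison addresses.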

    Adaptation of sensor morphology: an integrative view of perception from biologically inspired robotics perspective

    Sensor morphology, the form of a sensing mechanism that shapes responses to physical stimuli from the surroundings into signals usable as sensory information, is one of the key common aspects of sensing processes. This paper presents a structured review of research on bio-inspired sensor morphology implemented in robotic systems and discusses the fundamental design principles. Based on the literature review, we propose two key arguments: first, owing to its synthetic nature, the biologically inspired robotics approach is a unique and powerful methodology for understanding the role of sensor morphology and how it can evolve and adapt to its task and environment. Second, taking an integrative view of perception, by looking into multidisciplinary and overarching mechanisms of sensor morphology adaptation across biology and engineering, enables us to extract relevant design principles that are important for extending our understanding of the unfinished concepts in sensing and perception. This study was supported by the European Commission with the RoboSoft CA (A Coordination Action for Soft Robotics, contract #619319). SGN was supported by School of Engineering seed funding (2016), Malaysia Campus, Monash University.

    Go with the flow: visually mediated flight control in bumblebees

    Despite their small brains and tiny eyes, flying insects are capable of detecting and avoiding collisions with moving obstacles, and with remarkable precision they navigate through environments of different complexity. For this thesis, I have investigated how bumblebees use the pattern of apparent image motion that is generated in their eyes as they move through the world (known as optic flow), in order to control flight. I analysed the speed and position of bumblebee (Bombus terrestris) flight trajectories as they negotiated arenas of different dimensions and visual complexity. I also investigated the impact of optic flow on bumblebee learning flights, a special kind of flight designed to memorise the location of the nest or a newly discovered food source. The general aim of my research has been to understand how flying insects use vision to actively control their flight. The viewing angle at which optic flow is measured has important consequences for flight in densely cluttered environments, where timely control of position and speed are necessary for effective collision avoidance. I therefore investigated when, and how, bumblebees respond to sudden changes in the magnitude of optic flow. My results reveal that the visual region over which bumblebees measure optic flow is determined by the location in the frontal visual field where they experience the maximum magnitude of translational optic flow. This strategy ensures that bumblebees regulate their position and speed according to the nearest obstacles, allowing them to maximise flight efficiency and to minimise the risk of collision. My results further demonstrate that, when flying in narrow spaces, bumblebees use optic flow information from nearby surfaces in the lateral visual field to control flight, while in more open spaces they rely primarily on optic flow cues from the ventral field of view. 
This result strengthens the finding that bumblebees flexibly measure optic flow for flight control across their visual field, depending on where the maximum magnitude of translational optic flow occurs. It also adds another dimension by suggesting that bumblebees respond to optic flow cues in the ventral visual field if the magnitude is higher there than in the lateral visual field. Thus, the ability to flexibly use the surrounding optic flow field is of great importance for the control of cruising flight. For this thesis I also investigated the impact of ventral and panoramic optic flow on the control of learning flights in bumblebees. The results show that the presence of ventral optic flow is important for enabling bumblebees to perform well-controlled learning flights. Whether panoramic optic flow cues are present does not strongly affect the overall structure of the learning flight, although these cues might still be involved in fine-scale flight control. Finally, I found that, when the availability of ventral optic flow is limited to certain heights, bumblebees appear to adjust their flight parameters to maintain the perception of ventral optic flow cues. In summary, the results compiled in this thesis contribute to a better understanding of how insects use visual information to control their flight. Among other findings, my results emphasize the importance of being able to flexibly measure optic flow in different parts of the visual field, which enhances bees’ ability to avoid collisions.
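    The "follow the maximum translational flow" strategy described in this abstract can be caricatured in a few lines. This is a hypothetical sketch, not the thesis's analysis: the region names, setpoint, and gain are invented, and real flow measurements would come from image motion, not a dictionary.

```python
# Hypothetical sketch: pick the viewing region with the largest measured
# translational optic flow (i.e. the nearest surface) and regulate forward
# speed so that this maximum stays at a setpoint.

def control_speed(flow_by_region, v, omega_set=3.0, k=0.2):
    """flow_by_region: region name -> measured flow magnitude (rad/s).
    Returns the dominant region and the updated forward speed."""
    region = max(flow_by_region, key=flow_by_region.get)
    omega_max = flow_by_region[region]
    v_new = v - k * (omega_max - omega_set)  # slow down near close obstacles
    return region, max(v_new, 0.0)
```

    In a narrow tunnel the lateral regions dominate and the controller slows the bee down; over open ground the ventral region takes over, mirroring the flexibility described above.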