
    Sensor Fusion Based Model for Collision Free Mobile Robot Navigation

    Get PDF
    Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each robot is equipped with various types of sensors, such as GPS, camera, infrared and ultrasonic sensors, which are used to observe the surrounding environment. However, these sensors sometimes fail or give inaccurate readings. Sensor fusion therefore helps to overcome this problem and enhance overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range-finder camera are used for the collision avoidance approach, while three ground sensors are used for the line- or path-following approach. The fuzzy system is composed of nine inputs (the eight distance sensors and the camera), two outputs (the left and right velocities of the mobile robot's wheels), and 24 fuzzy rules for the robot's movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which combines collision avoidance based on the fuzzy logic fusion model with line following, has been implemented and tested through simulation and real-time experiments. Various scenarios are presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes. https://doi.org/10.3390/s1601002
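    As a rough illustration of the kind of fuzzy fusion described above, the sketch below fuses eight distance readings and a camera range into left and right wheel velocities. The membership functions, the handful of rules and the velocity levels are illustrative placeholders, not the paper's actual 24-rule base.

        # Minimal sketch of a Mamdani-style fuzzy fusion step for a differential-drive
        # robot: eight distance readings plus a camera range in, two wheel velocities out.
        # Membership functions, rules and velocity levels are illustrative only.

        def near(d, max_range=1.0):
            """Degree to which a distance reading is 'near' (1 at 0 m, 0 at max_range)."""
            return max(0.0, min(1.0, 1.0 - d / max_range))

        def far(d, max_range=1.0):
            return 1.0 - near(d, max_range)

        def fuzzy_velocities(distances, camera_range):
            """Fuse eight distance readings and a camera range into wheel velocities."""
            left_side = min(distances[0:4])              # closest obstacle on the left
            right_side = min(distances[4:8])             # closest obstacle on the right
            front = min(min(distances), camera_range)    # most constraining range overall

            # Each rule: (activation, left-wheel level, right-wheel level).
            rules = [
                (far(front),       1.0,  1.0),   # path clear     -> go straight, fast
                (near(left_side),  1.0,  0.2),   # obstacle left  -> steer right
                (near(right_side), 0.2,  1.0),   # obstacle right -> steer left
                (near(front),     -0.3,  0.3),   # blocked ahead  -> turn in place
            ]
            total = sum(w for w, _, _ in rules) or 1e-9
            v_left = sum(w * l for w, l, _ in rules) / total    # weighted-average defuzzification
            v_right = sum(w * r for w, _, r in rules) / total
            return v_left, v_right

        if __name__ == "__main__":
            readings = [0.9, 0.8, 0.7, 0.9, 0.3, 0.2, 0.4, 0.5]   # metres, hypothetical
            print(fuzzy_velocities(readings, camera_range=0.8))   # turns away from the right-side obstacle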

    Design and analysis of Intelligent Navigational controller for Mobile Robot

    Get PDF
    For several years, demand for autonomous mobile robots, across their various applications, has been steadily increasing; smoother and faster mobile robot navigation with multiple functions has become a necessity. This research covers the navigation system as well as kinematic model analysis for an autonomous mobile robot in known environments. To execute basic robotic behaviours within the environment (e.g. obstacle avoidance, wall or edge following and target seeking), the robot uses perception, sensor integration and fusion. With the help of these sensors, the robot creates a collision-free path and updates an environmental map over time. Mobile robot navigation in an unfamiliar environment is studied here using online sensor fusion and integration. Various AI algorithms are used to describe the overall procedure of mobile robot navigation and its path-planning problem. A suitable controller that creates a collision-free path is designed through the combined study of the kinematic analysis of motion and artificial intelligence techniques. In the fuzzy logic approach, a set of linguistic fuzzy rules is generated for navigation of the mobile robot, and an expert controller has been developed from these rules for navigation in various environmental conditions. Further, type-2 fuzzy logic is employed to refine the developed control algorithm and address the limitations of type-1 fuzzy logic. In addition, a recurrent neural network (RNN) methodology has been analysed for robot navigation, which assists the model during the learning stage. The robustness of the controller has been verified on the Webots simulation platform. Simulation results and controller performance on the Webots platform show that the mobile robot is capable of avoiding obstacles and reaching the termination point in an efficient manner.
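    The kinematic model analysis mentioned above typically rests on differential-drive (unicycle) forward kinematics. The sketch below integrates the robot pose from the two wheel angular speeds; the wheel radius, axle length and time step are hypothetical values, not taken from the paper.

        import math

        def step_pose(x, y, theta, omega_l, omega_r, r=0.03, L=0.10, dt=0.05):
            """Integrate the robot pose one time step from the two wheel angular speeds."""
            v = r * (omega_r + omega_l) / 2.0      # linear velocity of the chassis
            w = r * (omega_r - omega_l) / L        # angular velocity (yaw rate)
            x += v * math.cos(theta) * dt
            y += v * math.sin(theta) * dt
            theta += w * dt
            return x, y, theta

        if __name__ == "__main__":
            pose = (0.0, 0.0, 0.0)                 # start at the origin, facing +x
            for _ in range(100):                   # right wheel slightly slower -> gentle right turn
                pose = step_pose(*pose, omega_l=5.0, omega_r=4.5)
            print(pose)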

    Neural Sensor Fusion for Spatial Visualization on a Mobile Robot

    Full text link
    An ARTMAP neural network is used to integrate visual information and ultrasonic sensory information on a B14 mobile robot. Training samples for the neural network are acquired without human intervention. Sensory snapshots are retrospectively associated with the distance to the wall, provided by on-board odometry as the robot travels in a straight line. The goal is to produce a more accurate measure of distance than is provided by the raw sensors. The neural network effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The maps produced point to specific problems of raw sensory information processing and demonstrate the benefits of using a neural network system for sensor fusion. Office of Naval Research and Naval Research Laboratory (00014-96-1-0772, 00014-95-1-0409, 00014-95-0657)
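    The occupancy grid visualizations mentioned above can be illustrated with a standard log-odds update along a single sensor ray, fed by a fused (e.g. network-corrected) distance estimate. The grid size, resolution and log-odds increments below are illustrative assumptions, not the values used in the paper.

        import numpy as np

        def update_ray(grid, x0, y0, angle, distance, resolution=0.1,
                       l_free=-0.4, l_occ=0.85):
            """Mark cells along one ray as free and its end cell as occupied (log-odds)."""
            steps = int(distance / resolution)
            for i in range(steps + 1):
                cx = int(round(x0 + i * np.cos(angle)))
                cy = int(round(y0 + i * np.sin(angle)))
                if not (0 <= cx < grid.shape[0] and 0 <= cy < grid.shape[1]):
                    break
                grid[cx, cy] += l_occ if i == steps else l_free
            return grid

        if __name__ == "__main__":
            grid = np.zeros((50, 50))                  # 5 m x 5 m at 0.1 m cells, robot at the centre
            grid = update_ray(grid, 25, 25, angle=0.0, distance=2.0)
            prob = 1.0 / (1.0 + np.exp(-grid))         # log-odds back to occupancy probabilities
            print(prob[25:48, 25].round(2))            # free cells ~0.4, the hit cell ~0.7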

    Deep Network Uncertainty Maps for Indoor Navigation

    Full text link
    Most mobile robots for indoor use rely on 2D laser scanners for localization, mapping and navigation. These sensors, however, cannot detect transparent surfaces or measure the full occupancy of complex objects such as tables. Deep Neural Networks have recently been proposed to overcome this limitation by learning to estimate object occupancy. These estimates are nevertheless subject to uncertainty, making the evaluation of their confidence an important issue for these measures to be useful for autonomous navigation and mapping. In this work we approach the problem from two sides. First we discuss uncertainty estimation in deep models, proposing a solution based on a fully convolutional neural network. The proposed architecture is not restricted by the assumption that the uncertainty follows a Gaussian model, as in the case of many popular solutions for deep model uncertainty estimation, such as Monte-Carlo Dropout. We present results showing that uncertainty over obstacle distances is actually better modeled with a Laplace distribution. Then, we propose a novel approach to build maps based on Deep Neural Network uncertainty models. In particular, we present an algorithm to build a map that includes information over obstacle distance estimates while taking into account the level of uncertainty in each estimate. We show how the constructed map can be used to increase global navigation safety by planning trajectories which avoid areas of high uncertainty, enabling higher autonomy for mobile robots in indoor settings. Comment: Accepted for publication in "2019 IEEE-RAS International Conference on Humanoid Robots (Humanoids)".
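    A Laplace uncertainty model of the kind argued for above is typically trained with the Laplace negative log-likelihood. The sketch below evaluates that loss for a head that predicts a distance estimate and a log-scale per pixel; the two-channel head convention and all numbers are assumptions, not the paper's exact architecture.

        import numpy as np

        def laplace_nll(y_true, mu, log_b):
            """Mean negative log-likelihood of y_true under Laplace(mu, b), with b = exp(log_b)."""
            b = np.exp(log_b)
            return np.mean(np.log(2.0 * b) + np.abs(y_true - mu) / b)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            y = rng.uniform(0.5, 5.0, size=(64, 64))        # hypothetical true obstacle distances (m)
            mu = y + rng.laplace(0.0, 0.2, size=y.shape)    # predictions with Laplace-distributed error
            # A scale matching the true error spread scores better than an overconfident one.
            print(laplace_nll(y, mu, log_b=np.log(0.2)), laplace_nll(y, mu, log_b=np.log(0.01)))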

    Obstacle Avoidance and Proscriptive Bayesian Programming

    Get PDF
    Unexpected events and unmodelled properties of the robot's environment are among the challenges addressed by the situated robotics research field. Collision avoidance is a basic safety requirement, and this paper proposes a probabilistic approach called Bayesian Programming, which aims to deal with the uncertainty, imprecision and incompleteness of the information handled when solving the obstacle avoidance problem. Several examples illustrate the process of embodying the programmer's preliminary knowledge into a Bayesian program, and experimental results from implementing these examples on an electric vehicle are described and discussed. A video illustration of the developed experiments can be found at http://www.inrialpes.fr/sharp/pub/laplac
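    A loose sketch of the proscriptive idea behind this kind of Bayesian obstacle avoidance is given below: each range sensor contributes a likelihood over candidate steering commands that is low for commands heading toward its detected obstacle, and the fused posterior is their normalised product. The sensor geometry and likelihood shape are illustrative assumptions, not the paper's Bayesian program.

        import numpy as np

        def command_posterior(sensor_angles, sensor_distances, commands):
            """Posterior over candidate steering angles, given independent range sensors."""
            posterior = np.ones_like(commands)                        # uniform prior
            for angle, dist in zip(sensor_angles, sensor_distances):
                danger = np.exp(-dist)                                # closer obstacle -> higher danger
                alignment = np.exp(-((commands - angle) ** 2) / 0.2)  # commands pointing at it
                likelihood = 1.0 - danger * alignment                 # "proscriptive": forbid, do not prescribe
                posterior *= np.clip(likelihood, 1e-6, None)
            return posterior / posterior.sum()

        if __name__ == "__main__":
            commands = np.linspace(-np.pi / 2, np.pi / 2, 31)         # candidate steering angles
            bearings = np.radians([-60.0, -30.0, 0.0, 30.0, 60.0])    # hypothetical sensor bearings
            ranges = np.array([2.0, 1.5, 0.3, 1.8, 2.5])              # obstacle dead ahead
            post = command_posterior(bearings, ranges, commands)
            print(np.degrees(commands[np.argmax(post)]))              # preferred steering angle, away from the obstacle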