
    A layered fuzzy logic controller for nonholonomic car-like robot

    A system for real-time navigation of a nonholonomic car-like robot in a dynamic environment is described. It consists of two layers: a Sugeno-type fuzzy motion planner and a modified proportional-navigation-based fuzzy controller. The system's philosophy is inspired by human routing when moving between obstacles, using visual information, including right and left views, to identify the next step toward the goal. A Sugeno-type fuzzy motion planner with four inputs and one output is introduced to give a clear direction to the robot controller. The second stage is a modified fuzzy controller built on the proportional navigation guidance law, able to optimize the robot's behavior in real time, i.e. to avoid stationary and moving obstacles in its local environment while obeying kinematic constraints. The system intelligently combines two behaviors to cope with obstacle avoidance as well as approaching a target along a proportional navigation path. The system was simulated and tested in different environments with various obstacle distributions. The simulations show that the system gives good results for various simple environments.
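
    The second stage builds on the proportional navigation guidance law, under which the commanded turn rate is proportional to the rotation rate of the line of sight to the target. A minimal sketch of that idea follows; the function, its names, and the gain N = 3 are illustrative assumptions, not the paper's controller:

        import math

        def pn_turn_rate(robot_xy, goal_xy, prev_los, dt, N=3.0):
            # Proportional navigation: commanded turn rate is N times
            # the line-of-sight (LOS) rotation rate toward the goal.
            los = math.atan2(goal_xy[1] - robot_xy[1],
                             goal_xy[0] - robot_xy[0])
            d_los = (los - prev_los + math.pi) % (2 * math.pi) - math.pi
            return N * d_los / dt, los  # turn-rate command, LOS for next call

    In the paper's system, a fuzzy controller modulates this pursuit behavior so the robot also avoids stationary and moving obstacles.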

    Learning Pose Estimation for UAV Autonomous Navigation and Landing Using Visual-Inertial Sensor Data

    In this work, we propose a robust network-in-the-loop control system for autonomous navigation and landing of an Unmanned Aerial Vehicle (UAV). To estimate the UAV's absolute pose, we develop a deep neural network (DNN) architecture for visual-inertial odometry, which provides a robust alternative to traditional methods. We first evaluate the accuracy of the estimation by comparing the predictions of our model to traditional visual-inertial approaches on the publicly available EuRoC MAV dataset. The results indicate a clear improvement in the accuracy of the pose estimation, up to 25% over the baseline. Finally, we integrate the data-driven estimator into the closed-loop flight control system of AirSim, a simulator available as a plugin for Unreal Engine, and we provide simulation results for autonomous navigation and landing.
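
    As a rough illustration of a visual-inertial pose-estimation network of this kind, here is a minimal PyTorch sketch; the layer sizes and the CNN-plus-LSTM layout are assumptions, not the authors' architecture:

        import torch
        import torch.nn as nn

        class VIOPoseNet(nn.Module):
            # A small CNN encodes the camera frame, an LSTM encodes the
            # IMU window (accel + gyro), and an MLP regresses a 6-DoF
            # pose as translation plus quaternion.
            def __init__(self):
                super().__init__()
                self.cnn = nn.Sequential(
                    nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
                    nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten())  # -> (B, 32)
                self.imu = nn.LSTM(input_size=6, hidden_size=32,
                                   batch_first=True)
                self.head = nn.Sequential(
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 7))  # 3 translation + 4 quaternion

            def forward(self, image, imu_seq):
                v = self.cnn(image)            # (B, 32) visual code
                _, (h, _) = self.imu(imu_seq)  # (1, B, 32) inertial code
                return self.head(torch.cat([v, h[-1]], dim=1))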

    Looking Good, Feeling Good – Tac Map: a navigation system for the blind

    This paper describes the research and development of a navigation system for the blind that provides a tactile and visual language understandable by both sighted and blind users. It reviews key work and issues in the development of graphical symbols, in particular the pioneering work of Neurath's ISOTYPES, as well as more specific communication systems for blind people. The paper focuses on the development of 'TacMap', a navigation system for the blind. User engagement has been fundamental to the research, and the paper discusses the methodology, the research findings, and the product's potential future opportunities and impact.

    Near range path navigation using LGMD visual neural networks

    In this paper, we propose a method for near-range path navigation for a mobile robot using a pair of biologically inspired visual neural networks, the lobula giant movement detector (LGMD). In the proposed binocular-style visual system, each LGMD processes images covering part of the wide field of view and extracts relevant visual cues as its output. The outputs from the two LGMDs are compared and translated into executable motor commands to control the wheels of the robot in real time. A stronger signal from the LGMD on one side pushes the robot away from that side step by step; therefore, the robot can navigate a visual environment naturally with the proposed vision system. Our experiments showed that this bio-inspired system worked well in different scenarios.
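
    The steering rule described above, where the stronger of the two LGMD excitations pushes the robot away from that side, can be sketched as a differential-drive mapping; the names and gains here are illustrative, not the paper's exact controller:

        def lgmd_steering(left, right, v0=0.2, k=0.5):
            # Map paired LGMD excitations (0..1) to wheel speeds: a
            # stronger collision signal on one side speeds up that
            # side's wheel, turning the robot away from the threat.
            diff = k * (left - right)    # positive -> threat on the left
            return v0 + diff, v0 - diff  # (left wheel, right wheel)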

    Simulation Platform for Vision Aided Inertial Navigation

    Integrated Inertial Navigation System (INS) and Global Positioning System (GPS) solutions have become the backbone of many modern positioning and navigation systems. The Achilles' heel of such systems is their susceptibility to GPS outages. Hence, there has been sustained interest in alternate navigation techniques to augment a GPS/INS system. With advances in camera technology, visual odometry is a suitable technique. As the cost and effort required to conduct physical trials of a visual odometry system are extensive, this research provides a simulation platform capable of simulating different grades of GPS/INS systems under various realistic visual odometry scenarios. The simulation platform also allows standardized data to be tested across different navigation filters. The utility of the platform is demonstrated by a trade study on factors affecting the performance of a GPS/INS system augmented with visual odometry.
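
    A toy version of the kind of trade study such a platform enables, comparing position drift during a GPS outage with and without visual-odometry augmentation, might look like the sketch below; all names and noise levels are illustrative assumptions, not the platform's models:

        import numpy as np

        def simulate_outage(truth, use_vo, gps_sigma=1.0, vo_sigma=0.05,
                            ins_sigma=0.5, outage=range(40, 60), seed=0):
            # truth: (N, 2) array of true planar positions.
            rng = np.random.default_rng(seed)
            est = [truth[0].copy()]
            for k in range(1, len(truth)):
                step = truth[k] - truth[k - 1]  # true displacement
                if k in outage:
                    # GPS denied: dead-reckon; VO increments drift less
                    # than the unaided INS.
                    sigma = vo_sigma if use_vo else ins_sigma
                    est.append(est[-1] + step + rng.normal(0, sigma, 2))
                else:
                    # GPS available: absolute fix bounds the error.
                    est.append(truth[k] + rng.normal(0, gps_sigma, 2))
            return np.array(est)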

    A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

    Fully autonomous miniaturized robots (e.g., drones) with artificial intelligence (AI)-based visual navigation capabilities are extremely challenging drivers of Internet-of-Things edge intelligence. Visual navigation based on AI approaches, such as deep neural networks (DNNs), is becoming pervasive for standard-size drones, but is considered out of reach for nano-drones with a size of a few cm². In this work, we present the first (to the best of our knowledge) demonstration of a navigation engine for autonomous nano-drones capable of closed-loop, end-to-end DNN-based visual navigation. To achieve this goal we developed a complete methodology for parallel execution of complex DNNs directly on board resource-constrained, milliwatt-scale nodes. Our system is based on GAP8, a novel parallel ultra-low-power computing platform, and a 27 g commercial, open-source CrazyFlie 2.0 nano-quadrotor. As part of our general methodology we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in [1] to be fully executed on board within a strict 6 fps real-time constraint with no compromise in terms of flight results, while all processing is done with only 64 mW on average. Our navigation engine is flexible and can be used to span a wide performance range: at its peak performance corner it achieves 18 fps while still consuming on average just 3.5% of the power envelope of the deployed nano-aircraft.
    Comment: 15 pages, 13 figures, 5 tables, 2 listings; accepted for publication in the IEEE Internet of Things Journal (IEEE IOTJ).
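
    The closed-loop, frame-rate-constrained operation described above can be pictured as a fixed-rate perception-to-control loop. The sketch below is schematic only; the actual pipeline runs on the GAP8, and capture, dnn, and actuate are hypothetical placeholders:

        import time

        def control_loop(capture, dnn, actuate, target_fps=6):
            # Run perception and control at a fixed frame rate: grab a
            # frame, run the DNN forward pass, update flight setpoints,
            # then sleep out the remainder of the frame budget.
            period = 1.0 / target_fps
            while True:
                t0 = time.monotonic()
                steering, collision = dnn(capture())
                actuate(steering, collision)
                time.sleep(max(0.0, period - (time.monotonic() - t0)))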