
    Neural Sensor Fusion for Spatial Visualization on a Mobile Robot

    An ARTMAP neural network is used to integrate visual information and ultrasonic sensory information on a B14 mobile robot. Training samples for the neural network are acquired without human intervention: sensory snapshots are retrospectively associated with the distance to the wall, provided by on-board odometry as the robot travels in a straight line. The goal is to produce a more accurate measure of distance than is provided by the raw sensors. The neural network effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The maps produced point to specific problems of raw sensory information processing and demonstrate the benefits of using a neural network system for sensor fusion. Office of Naval Research and Naval Research Laboratory (00014-96-1-0772, 00014-95-1-0409, 00014-95-0657)
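
    As a rough illustration of the training setup described above (sensory snapshots paired with the odometry-provided wall distance), the sketch below fits a fused distance estimate from one visual and one ultrasonic reading per sample. It uses a plain linear least-squares fit rather than the paper's ARTMAP network, and all readings are made-up example values.

    import numpy as np

    # Hypothetical training samples: [visual_estimate_m, ultrasonic_estimate_m]
    snapshots = np.array([
        [1.10, 0.95],
        [1.48, 1.52],
        [2.05, 1.90],
        [2.60, 2.55],
    ])
    # Ground-truth wall distances supplied by on-board odometry (assumed values).
    odometry_distance = np.array([1.00, 1.50, 2.00, 2.50])

    # Fit weights (plus a bias term) that map both modalities to a fused distance.
    X = np.hstack([snapshots, np.ones((len(snapshots), 1))])
    weights, *_ = np.linalg.lstsq(X, odometry_distance, rcond=None)

    def fused_distance(visual_m, ultrasonic_m):
        """Combine both modalities into a single distance estimate (metres)."""
        return float(np.dot([visual_m, ultrasonic_m, 1.0], weights))

    print(fused_distance(1.8, 1.7))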

    Stabilized RPA Flight in Building Proximity Operations

    This thesis seeks a solution to the requirement for a highly reliable and capable Unmanned Air Vehicle (UAV) to support a wide array of missions and applications that require close-proximity flight to structures. The scope of the project includes the drafting of a concept of operations (CONOPs) describing how the mission requirements might be met using the sensor, operators, and air vehicle described herein. The wall-following portion of that CONOPs is demonstrated by cart-testing a custom algorithm and evaluating its ability to react to its environment. Finally, a flight test was performed to characterize the ability of an RTK-GPS system to stably hold a UAV in a single position and to minimize vehicle yaw, as a potential means of minimizing environmental sensing requirements in GPS-permissive environments. The RTK-GPS results were position-hold standard deviations of 8.0 cm x 10.1 cm at a 5 m flight altitude and 17 cm x 12.7 cm at an 8 m flight altitude; yaw variation had a standard deviation of 1.7° at 5 m and 3.7° at 8 m. The LIDAR wall-following tests proved the feasibility of using a decision-tree style coding approach to proximity flight near a structure, but changes should still be considered before the approach is used operationally.
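
    For illustration only, the sketch below shows one way a decision-tree style wall-following rule could react to a single LIDAR range, in the spirit of the approach the abstract describes; the standoff distance, deadband, and velocity limits are assumed values, not parameters from the thesis.

    def wall_follow_command(range_to_wall_m, desired_standoff_m=2.0, deadband_m=0.2):
        """Return (lateral_velocity_mps, note) from a simple decision tree."""
        if range_to_wall_m <= 0.0:             # invalid or missing LIDAR return
            return 0.0, "hold: no valid LIDAR return"
        error = range_to_wall_m - desired_standoff_m
        if abs(error) < deadband_m:            # close enough: hold lateral position
            return 0.0, "hold standoff"
        if error > 0.0:                        # too far from the wall: step toward it
            return -min(0.3, 0.5 * error), "move toward wall"
        return min(0.3, -0.5 * error), "move away from wall"  # too close: back off

    print(wall_follow_command(2.6))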

    Acoustic Echo Estimation using the model-based approach with Application to Spatial Map Construction in Robotics


    Real-time Simultaneous Localization And Mapping Of Mobile Robots

    Thesis (M.Sc.) -- İstanbul Technical University, Institute of Science and Technology, 2008. The aim of this study is the localization and mapping of unknown indoor environments using a mobile robot equipped with various sensors. The robot interacts with its surroundings through infrared and ultrasonic sensors. Ultrasonic sensors are cheap and effective, but they also suffer from problems arising from their structure; these problems were reduced to a minimum during the study. Infrared sensors provide accurate measurements at close range and are therefore used for collision-avoidance safety purposes. The environment map is generated using ultrasonic range finders and a digital compass. In addition, motors with encoders are used to track the robot's position. The localization of the robot and the accuracy of the mapping depend largely on the sensors and actuators used in the design; the sensors and actuators were selected according to their size, accuracy, and interface to the microprocessor. Data measured by the sensors are received and processed by the microprocessor and then sent to a remote computer via wireless RF communication for the more complex computation, data storage, localization, and map generation.
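
    A minimal dead-reckoning sketch consistent with the setup described above (wheel encoders for distance travelled, a digital compass for heading) is given below; the tick count and wheel circumference are assumptions for illustration, not values from the thesis.

    import math

    TICKS_PER_REV = 360           # assumed encoder resolution
    WHEEL_CIRCUMFERENCE_M = 0.22  # assumed wheel circumference

    def update_pose(x, y, encoder_ticks, compass_heading_deg):
        """Advance (x, y) by the encoder-measured distance along the compass heading."""
        distance = (encoder_ticks / TICKS_PER_REV) * WHEEL_CIRCUMFERENCE_M
        heading = math.radians(compass_heading_deg)
        return x + distance * math.cos(heading), y + distance * math.sin(heading)

    x, y = 0.0, 0.0
    for ticks, heading in [(180, 0.0), (180, 0.0), (180, 90.0)]:
        x, y = update_pose(x, y, ticks, heading)
    print(x, y)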

    Mobility increases localizability: A survey on wireless indoor localization using inertial sensors


    Sensor-based Collision Avoidance System for the Walking Machine ALDURO

    This work presents a sensor system developed for the robot ALDURO (Anthropomorphically Legged and Wheeled Duisburg Robot) so that it can detect and avoid obstacles when moving over unstructured terrain. The robot is a large-scale, hydraulically driven, four-legged walking machine developed at the University of Duisburg-Essen, with 16 degrees of freedom at each leg, and is steered by an operator sitting in a cab on the robot body. The Cartesian operator instructions are processed by a control computer, which converts them into appropriate autonomous leg movements; this makes it necessary for the robot to automatically recognize obstacles (rocks, trunks, holes, etc.) in its path and to locate and avoid them. A system based on ultrasonic sensors was developed to carry out this task, but such sensors have an intrinsic problem: their poor angular precision. To overcome this, a fuzzy model of the ultrasonic sensor, based on the characteristics of the real device, was developed to capture the measurement uncertainties. A subsequent fuzzy inference step builds a map of the robot's surroundings from the measured data, which serves as input to the navigation system. The complete sensor system was implemented on a test stand where a full-size leg of the robot is fully functional. The sensors are connected on an I2C bus that uses a microcontroller as the interface to the main controller (a personal computer), relieving the main controller of some of the data processing, which is carried out by the microcontroller. The sensor system was tested together with the fuzzy data inference, and different sensor arrangements and inference settings were tried in order to achieve a satisfactory result.
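
    In the spirit of the fuzzy sensor model described above (though not the ALDURO implementation itself), the sketch below assigns a fuzzy degree of occupancy to a map cell based on how far it lies from the sonar beam axis and from the measured echo range; the beam half-angle and range tolerance are assumed values.

    def obstacle_membership(cell_range_m, cell_bearing_deg, measured_range_m,
                            beam_half_angle_deg=15.0, range_tolerance_m=0.1):
        """Return a fuzzy degree in [0, 1] that a cell at (range, bearing) is occupied."""
        # Triangular membership in angle: 1 on the beam axis, 0 at the cone edge.
        angular = max(0.0, 1.0 - abs(cell_bearing_deg) / beam_half_angle_deg)
        # Triangular membership in range, centred on the measured echo distance.
        radial = max(0.0, 1.0 - abs(cell_range_m - measured_range_m) / range_tolerance_m)
        return min(angular, radial)  # fuzzy AND via the minimum t-norm

    print(obstacle_membership(1.02, 5.0, 1.0))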

    Heterogeneous Teams of Modular Robots for Mapping and Exploration

    The definitive article is published in Autonomous Robots. It is available at http://www.springerlink.com (DOI: 10.1023/A:1008933826411). © Springer-Verlag. In this article, we present the design of a team of heterogeneous, centimeter-scale robots that collaborate to map and explore unknown environments. The robots, called Millibots, are configured from modular components that include sonar and IR sensors, camera, communication, computation, and mobility modules. Robots with different configurations use their special capabilities collaboratively to accomplish a given task. For mapping and exploration with multiple robots, it is critical to know the relative positions of each robot with respect to the others. We have developed a novel localization system that uses sonar-based distance measurements to determine the positions of all the robots in the group. With their positions known, we use an occupancy grid Bayesian mapping algorithm to combine the sensor data from multiple robots with different sensing modalities. Finally, we present the results of several mapping experiments conducted by a user-guided team of five robots operating in a room containing multiple obstacles.
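
    As a hedged sketch of sonar-based relative localization, the code below estimates a robot's 2-D position from distance measurements to three teammates at known positions by linearizing the trilateration equations and solving them with least squares; the positions and ranges are example values, and this generic method is not necessarily the estimator used in the article.

    import numpy as np

    def trilaterate(anchors, ranges):
        """Estimate a 2-D position from distances to known anchor positions."""
        anchors = np.asarray(anchors, dtype=float)
        ranges = np.asarray(ranges, dtype=float)
        x0, y0 = anchors[0]
        r0 = ranges[0]
        # Subtract the first range equation from the others to cancel the quadratic terms.
        A = 2.0 * (anchors[1:] - anchors[0])
        b = (r0**2 - ranges[1:]**2
             + np.sum(anchors[1:]**2, axis=1) - (x0**2 + y0**2))
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position

    # Three teammate positions (metres) and measured sonar ranges to each of them.
    print(trilaterate([(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)], [1.4142, 1.4142, 1.4142]))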