
    Body swarm interface (BOSI): controlling robotic swarms using human bio-signals

    Traditionally, robots are controlled using devices such as joysticks, keyboards, mice and other similar human-computer interface (HCI) devices. Although this approach is effective and practical in some cases, it is restricted to healthy individuals without disabilities, and it requires the user to master the device before use. It also becomes complicated and non-intuitive when multiple robots must be controlled simultaneously with these traditional devices, as in the case of Human-Swarm Interfaces (HSI). This work presents a novel concept of using human bio-signals to control swarms of robots. The concept offers two major advantages: first, it gives amputees and people with certain disabilities the ability to control robotic swarms, which has previously not been possible; second, it gives the user a more intuitive interface for controlling swarms of robots through gestures, thoughts, and eye movement. We measure different bio-signals from the human body, including electroencephalography (EEG), electromyography (EMG) and electrooculography (EOG), using off-the-shelf products. After minimal signal processing, we decode the intended control action using machine learning techniques such as Hidden Markov Models (HMM) and K-Nearest Neighbors (K-NN). We employ formation controllers based on distance and displacement to control the shape and motion of the robotic swarm. Thought and gesture classifications are compared against ground truth, and the resulting pipelines are evaluated in both simulations and hardware experiments with swarms of ground robots and aerial vehicles.
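    As an illustration of the decoding step, the sketch below classifies windowed EMG features with a small K-nearest-neighbour routine; the channel count, RMS features, and gesture labels are hypothetical placeholders, not details taken from the paper.

```python
import numpy as np

def rms_features(window):
    """Root-mean-square of each EMG channel over a time window.
    window: (samples, channels) array."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote of its k nearest
    training samples (Euclidean distance)."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# Hypothetical example: 2-channel EMG windows for two gestures.
rng = np.random.default_rng(0)
fist = rng.normal(0.8, 0.1, (20, 200, 2))        # 20 windows, 200 samples, 2 channels
open_hand = rng.normal(0.2, 0.1, (20, 200, 2))
X = np.array([rms_features(w) for w in np.concatenate([fist, open_hand])])
y = np.array(["fist"] * 20 + ["open"] * 20)

test = rms_features(rng.normal(0.8, 0.1, (200, 2)))
print(knn_predict(X, y, test))                   # expected: 'fist'
```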

    A systematic review of perception system and simulators for autonomous vehicles research

    This paper presents a systematic review of the perception systems and simulators for autonomous vehicles (AV). The work is divided into three parts. In the first part, perception systems are categorized as environment perception systems and positioning estimation systems. The paper presents the physical fundamentals, operating principles, and electromagnetic spectrum used by the most common sensors in perception systems (ultrasonic, RADAR, LiDAR, cameras, IMU, GNSS, RTK, etc.). Furthermore, their strengths and weaknesses are shown, and the quantification of their features using spider charts allows a proper selection of sensors based on 11 features. In the second part, the main elements to be taken into account when simulating the perception system of an AV are presented. For this purpose, the paper describes simulators for model-based development, the main game engines that can be used for simulation, simulators from the robotics field, and lastly simulators used specifically for AVs. Finally, the current state of the regulations being applied in different countries on the implementation of autonomous vehicles is presented. This work was partially supported by DGT (ref. SPIP2017-02286) and GenoVision (ref. BFU2017-88300-C2-2-R) Spanish Government projects, and the "Research Programme for Groups of Scientific Excellence in the Region of Murcia" of the Seneca Foundation (Agency for Science and Technology in the Region of Murcia – 19895/GERM/15).
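    As a loose illustration of the spider-chart comparison described above, the sketch below plots hypothetical scores for two sensor types over a few placeholder features; the feature names and values are assumptions for illustration only, not figures from the review.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical 0-10 scores; the review rates sensors on 11 features,
# only a few representative placeholders are shown here.
features = ["range", "angular resolution", "weather robustness",
            "cost", "colour information"]
scores = {
    "LiDAR":  [8, 9, 5, 3, 1],
    "camera": [5, 7, 4, 9, 10],
}

angles = np.linspace(0, 2 * np.pi, len(features), endpoint=False)
angles = np.concatenate([angles, angles[:1]])    # close the polygon

ax = plt.subplot(polar=True)
for name, vals in scores.items():
    vals = vals + vals[:1]
    ax.plot(angles, vals, label=name)
    ax.fill(angles, vals, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(features)
ax.legend(loc="upper right")
plt.show()
```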

    Modeling and Control for Vision Based Rear Wheel Drive Robot and Solving Indoor SLAM Problem Using LIDAR

    To achieve the ambitious long-term goal of a fleet of cooperating Flexible Autonomous Machines operating in an uncertain Environment (FAME), this thesis addresses several critical modeling, design, and control objectives for rear-wheel drive ground vehicles. One central objective of the thesis was to show how to build a low-cost, multi-capability robot platform that can be used for conducting FAME research. A TFC-KIT car chassis was augmented to provide a suite of substantive capabilities. The augmented vehicle (FreeSLAM Robot) costs less than $500 but offers the capability of commercially available vehicles costing over $2,000. All demonstrations presented involve the rear-wheel drive FreeSLAM robot. The following summarizes the key hardware demonstrations presented and analyzed: (1) cruise (v, ) control along a line, (2) cruise (v, ) control along a curve, (3) planar (x, y) Cartesian stabilization for a rear-wheel drive vehicle, (4) finishing the track with a camera pan-tilt structure in minimum time, (5) finishing the track without a camera pan-tilt structure in minimum time, (6) vision-based tracking performance at different cruise speeds vx, (7) vision-based tracking performance with different camera fixed look-ahead distances L, (8) vision-based tracking performance with different delays Td from the vision subsystem, (9) a manually remote-controlled robot performing indoor SLAM, and (10) an autonomously line-guided robot performing indoor SLAM. For most cases, hardware data are compared with, and corroborated by, model-based simulation data. In short, the thesis uses a low-cost, self-designed rear-wheel drive robot to demonstrate many capabilities that are critical to reaching the longer-term FAME goal. Masters Thesis, Electrical Engineering.
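    The first demonstration, cruise control along a line, can be sketched with a generic kinematic bicycle model and a proportional steering law; the wheelbase, gains, and speed below are illustrative assumptions, not the controller design used in the thesis.

```python
import numpy as np

# Kinematic bicycle model of a rear-wheel drive vehicle following the
# line y = 0 at a constant cruise speed.  L_wb, k_y, k_psi are
# illustrative values, not taken from the thesis.
L_wb = 0.25          # wheelbase [m]
v = 0.5              # cruise speed [m/s]
k_y, k_psi = 2.0, 3.0
dt = 0.02

x, y, psi = 0.0, 0.4, 0.3        # initial lateral offset and heading error
for _ in range(500):
    # steering angle from lateral offset and heading error, saturated
    delta = np.clip(-k_y * y - k_psi * psi, -0.5, 0.5)
    x += v * np.cos(psi) * dt
    y += v * np.sin(psi) * dt
    psi += v / L_wb * np.tan(delta) * dt

print(f"final lateral error: {y:.4f} m, heading error: {psi:.4f} rad")
```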

    Information Aided Navigation: A Review

    The performance of inertial navigation systems depends largely on a stable flow of external measurements and information to guarantee continuous filter updates and to bound the drift of the inertial solution. Platforms in different operational environments may at some point be prevented from receiving external measurements, exposing their navigation solution to drift. Over the years, a wide variety of works have been proposed to overcome this shortcoming by exploiting knowledge of the system's current conditions and turning it into an applicable source of information for updating the navigation filter. This paper aims to provide an extensive survey of information aided navigation, broadly classified into direct, indirect, and model aiding. Each approach is described through the notable works that implement its concept, use cases, relevant state updates, and the corresponding measurement models. By matching the appropriate constraint to a given scenario, one can improve the accuracy of the navigation solution, compensate for the lost information, and uncover certain internal states that would otherwise remain unobservable. Comment: 8 figures, 3 tables.
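    One common aiding constraint covered by such surveys, the zero-velocity update (ZUPT), can be written as a standard Kalman measurement update; the state layout and noise values in this sketch are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: state x, covariance P,
    measurement z with model z = H x + noise (covariance R)."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Illustrative 6-state example: position (3) and velocity (3).
x = np.array([10.0, 5.0, -2.0, 0.3, -0.1, 0.05])   # drifting velocity estimate
P = np.eye(6)

# Zero-velocity update: when the platform is known to be stationary,
# the pseudo-measurement is "velocity = 0".
H = np.hstack([np.zeros((3, 3)), np.eye(3)])        # selects the velocity states
R = np.eye(3) * 0.01 ** 2
z = np.zeros(3)

x, P = kalman_update(x, P, z, H, R)
print("velocity after ZUPT:", x[3:])                # pulled close to zero
```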

    Proceedings of the International Micro Air Vehicles Conference and Flight Competition 2017 (IMAV 2017)

    The IMAV 2017 conference was held at ISAE-SUPAERO, Toulouse, France, from Sept. 18 to Sept. 21, 2017. More than 250 participants from 30 countries presented their latest research in the field of drones. 38 papers were presented during the conference, covering topics such as aerodynamics, aeroacoustics, propulsion, autopilots, sensors, communication systems, mission planning techniques, artificial intelligence, and human-machine cooperation as applied to drones.

    Mixed Reality and Remote Sensing Application of Unmanned Aerial Vehicle in Fire and Smoke Detection

    This paper proposes the development of a system that combines an inertial measurement unit (IMU), a consumer-grade digital camera, and a fire detection algorithm on a nano Unmanned Aerial Vehicle (UAV) for inspection purposes. Video streams are collected through the monocular camera, and navigation relies on a state-of-the-art indoor/outdoor Simultaneous Localisation and Mapping (SLAM) system. The system is implemented on the Robot Operating System (ROS) and uses computer vision algorithms to provide robust, accurate, and unique inter-frame motion estimation. The collected onboard data are communicated to the ground station, where the SLAM system generates a map of the environment. A robust and efficient re-localisation is performed to recover from tracking failure, motion blur, and lost frames in the received data. The fire detection algorithm is based on colour, movement attributes, the temporal variation of fire intensity, and its accumulation around a point. A cumulative time-derivative matrix is used to analyse frame-by-frame changes and to detect areas with high-frequency luminance flicker (a random characteristic of flames). Colour, surface coarseness, boundary roughness, and skewness features are extracted as the quadrotor flies autonomously within cluttered and congested areas. A Mixed Reality system is adopted to visualise and test the proposed system in a physical environment, and the virtual simulation is conducted in the Unity game engine. The results show that the UAV can successfully detect fire and flame, autonomously fly towards and hover around it, communicate with the ground station, and simultaneously generate a map of the environment. A slight error remains between the real and virtual UAV calibration, due to the ground truth data and the complexity of correlating the real and virtual camera coordinate frames.
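    The temporal part of such a detector can be sketched as a cumulative frame-difference map in which pixels with persistent luminance flicker accumulate a high score; the decay factor, threshold, and synthetic frames below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def flicker_accumulator(frames, decay=0.9):
    """Cumulative time-derivative map: pixels whose luminance keeps
    changing frame to frame accumulate a high score; static pixels decay.
    frames: iterable of 2-D luminance (grayscale) arrays."""
    frames = iter(frames)
    prev = next(frames).astype(float)
    acc = np.zeros_like(prev)
    for f in frames:
        f = f.astype(float)
        acc = decay * acc + np.abs(f - prev)
        prev = f
    return acc

# Synthetic illustration: a 64x64 sequence in which a 10x10 patch flickers.
rng = np.random.default_rng(1)
frames = []
for t in range(50):
    img = np.full((64, 64), 100.0)
    img[20:30, 20:30] += rng.uniform(0, 80, (10, 10))   # flickering "flame"
    frames.append(img)

score = flicker_accumulator(frames)
mask = score > 0.5 * score.max()
print("flagged pixels:", mask.sum())    # roughly the 100-pixel flickering patch
```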

    Unmanned Robotic Systems and Applications

    This book presents recent studies of unmanned robotic systems and their applications. With its five chapters, the book brings together important contributions from renowned international researchers. Unmanned autonomous robots are ideal candidates for applications such as rescue missions, especially in areas that are difficult to access. Swarm robotics (multiple robots working together) is another exciting application of unmanned robotic systems, for example, coordinated search by an interconnected group of moving robots for the purpose of finding a source of hazardous emissions. These robots can behave like individuals working in a group without centralized control.

    Low computational SLAM for an autonomous indoor aerial inspection vehicle

    The past decade has seen an increase in the capability of small-scale Unmanned Aerial Vehicle (UAV) systems, made possible through technological advancements in battery, computing, and sensor miniaturisation. This has opened a new and rapidly growing branch of robotics research and has sparked the imagination of industry, leading to new UAV-based services, from the inspection of power lines to remote police surveillance. Miniaturisation has also made UAVs small enough to be practically flown indoors, for example to inspect elevated areas in hazardous or damaged structures where conventional ground-based robots are unsuitable. Sellafield Ltd, a nuclear reprocessing facility in the U.K., has many buildings that require frequent safety inspections. UAV inspections eliminate the current risk to personnel of radiation exposure and other hazards in tall structures where scaffolding or hoists would otherwise be required. This project focused on the development of a UAV for the novel application of semi-autonomously navigating and inspecting these structures without the need for personnel to enter the building. Development exposed a significant gap in knowledge concerning indoor localisation, specifically Simultaneous Localisation and Mapping (SLAM) for use on-board UAVs. To lower the on-board processing requirements of SLAM, other UAV research groups have employed techniques such as off-board processing, reduced dimensionality, or prior knowledge of the structure; these techniques are not suitable for this application given the unknown nature of the structures and the risk of radio shadows. In this thesis, a novel localisation algorithm is proposed that enables real-time, three-dimensional SLAM running solely on-board a computationally constrained UAV in heavily cluttered and unknown environments. The algorithm, based on the Iterative Closest Point (ICP) method and using approximate nearest-neighbour searches and point-cloud decimation to reduce processing requirements, has been successfully tested in environments similar to those specified by Sellafield Ltd.
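    A generic sketch of the ingredients named above (point-cloud decimation, fast nearest-neighbour association, and an SVD-based rigid alignment step) is shown below; it is a plain 2-D ICP illustration using an exact k-d tree, not the thesis's on-board implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(source, target, iterations=20, decimation=4):
    """Basic ICP: decimate the source cloud, associate points with a
    k-d tree, and solve each rigid alignment step with an SVD (Kabsch)."""
    src = source[::decimation].copy()      # point-cloud decimation
    tree = cKDTree(target)                 # fast nearest-neighbour queries
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iterations):
        _, idx = tree.query(src)           # closest target point for each source point
        matched = target[idx]
        src_c, tgt_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = tgt_c - R @ src_c
        src = src @ R.T + t                # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Illustration: recover a small, known rotation/translation of a synthetic scan.
rng = np.random.default_rng(2)
target = rng.uniform(-5, 5, (400, 2))
theta, t_true = 0.05, np.array([0.1, -0.05])
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
source = (target - t_true) @ R_true        # so that R_true @ source + t_true == target
R_est, t_est = icp_2d(source, target)
print("estimated translation:", np.round(t_est, 3))   # approximately [0.1, -0.05]
```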

    Autonomous flight in unstructured and unknown indoor environments

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (p. 119-126). This thesis presents the design, implementation, and validation of a system that enables a micro air vehicle to autonomously explore and map unstructured and unknown indoor environments. Such a vehicle would be of considerable use in many real-world applications such as search and rescue, civil engineering inspection, and a host of military tasks where it is dangerous or difficult to send people. While mapping and exploration capabilities are common for ground vehicles today, air vehicles seeking to achieve these capabilities face unique challenges. While there has been recent progress toward sensing, control, and navigation suites for GPS-denied flight, there have been few demonstrations of stable, goal-directed flight in real environments. The main focus of this research is the development of real-time state estimation techniques that allow our quadrotor helicopter to fly autonomously in indoor, GPS-denied environments. Accomplishing this feat required the development of a large integrated system that brought together many components into a cohesive package. As such, the primary contribution is the development of the complete working system. I show experimental results that illustrate the MAV's ability to navigate accurately in unknown environments, and demonstrate that our algorithms enable the MAV to operate autonomously in a variety of indoor environments. by Abraham Galton Bachrach. S.M.