
    Terrain Classification from Body-mounted Cameras during Human Locomotion

    Abstract—This paper presents a novel algorithm for terrain-type classification based on monocular video captured from the viewpoint of human locomotion. A texture-based algorithm is developed to classify the path ahead into multiple groups that can be used to support terrain classification. Gait is taken into account in two ways. Firstly, it drives key frame selection: when regions with homogeneous texture characteristics are updated, the frequency variations of the textured surface are analysed and used to adaptively define filter coefficients. Secondly, it is incorporated in the parameter estimation process, where probabilities of path consistency are employed to improve terrain-type estimation. When tested with multiple classes that directly affect mobility (a hard surface, a soft surface and an unwalkable area), our proposed method outperforms existing methods by up to 16% and also provides improved robustness. Index Terms—texture, classification, recursive filter, terrain classification
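    The abstract does not give the paper's exact features or its gait-adaptive recursive filter, so the following is only a minimal sketch of generic texture-based patch classification into the three classes named above; the helper texture_features, the four patch statistics, and the SVM classifier are illustrative assumptions, not the authors' method.

    # Minimal, illustrative sketch of texture-based terrain classification.
    # The paper's gait-adaptive filtering and path-consistency estimation
    # are NOT reproduced here; features and classifier are assumptions.
    import numpy as np
    from sklearn.svm import SVC

    CLASSES = ["hard", "soft", "unwalkable"]  # classes named in the abstract

    def texture_features(patch: np.ndarray) -> np.ndarray:
        """Simple texture descriptors for a grayscale patch scaled to [0, 1]."""
        gy, gx = np.gradient(patch.astype(float))
        return np.array([
            patch.mean(),       # brightness
            patch.std(),        # contrast
            np.abs(gx).mean(),  # horizontal texture energy
            np.abs(gy).mean(),  # vertical texture energy
        ])

    def train(patches, labels):
        """Fit an SVM on labelled terrain patches."""
        X = np.stack([texture_features(p) for p in patches])
        clf = SVC(kernel="rbf", probability=True)
        clf.fit(X, labels)
        return clf

    def classify_path(clf, patches):
        """Return per-patch class probabilities for the path ahead."""
        X = np.stack([texture_features(p) for p in patches])
        return clf.predict_proba(X)  # columns follow clf.classes_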

    Robot Mapping and Navigation in Real-World Environments

    Robots can perform various tasks, such as mapping hazardous sites, taking part in search-and-rescue scenarios, or delivering goods and people. Robots operating in the real world face many challenges on the way to the completion of their mission. Essential capabilities required for the operation of such robots are mapping, localization and navigation. Solving all of these tasks robustly presents a substantial difficulty, as these components are usually interconnected, i.e., a robot that starts without any knowledge about the environment must simultaneously build a map, localize itself in it, analyze the surroundings and plan a path to efficiently explore an unknown environment. In addition to the interconnections between these tasks, they depend heavily on the sensors used by the robot and on the type of environment in which the robot operates. For example, an RGB camera can be used in an outdoor scene for computing visual odometry or for detecting dynamic objects, but becomes less useful in an environment that does not have enough light for cameras to operate. The software that controls the behavior of the robot must seamlessly process all the data coming from different sensors. This often leads to systems that are tailored to a particular robot and a particular set of sensors. In this thesis, we challenge this concept by developing and implementing methods for a typical robot navigation pipeline that work seamlessly with different types of sensors, both in indoor and outdoor environments. With the emergence of new range-sensing RGBD and LiDAR sensors, there is an opportunity to build a single system that operates robustly in indoor and outdoor environments alike and thus extends the application areas of mobile robots. The techniques presented in this thesis aim to be used with both RGBD and LiDAR sensors without adaptations for individual sensor models by using a range image representation, and aim to provide methods for navigation and scene interpretation in both static and dynamic environments.
    For a static world, we present a number of approaches that address the core components of a typical robot navigation pipeline. At the core of building a consistent map of the environment with a mobile robot lies point cloud matching. To this end, we present a method for photometric point cloud matching that treats RGBD and LiDAR sensors in a uniform fashion and is able to accurately register point clouds at the frame rate of the sensor. This method serves as a building block for the rest of the mapping pipeline. In addition to the matching algorithm, we present a method for traversability analysis of the currently observed terrain in order to guide an autonomous robot to the safe parts of the surrounding environment. A source of danger when navigating difficult-to-access sites is the fact that the robot may fail to build a correct map of the environment. This dramatically impacts the ability of an autonomous robot to navigate towards its goal in a robust way; it is therefore important for the robot to be able to detect these situations and to find its way home without relying on any kind of map. To address this challenge, we present a method for analyzing the quality of the map that the robot has built to date, and for safely returning the robot to the starting point in case the map is found to be in an inconsistent state.
    Scenes in dynamic environments are vastly different from those experienced in static ones. In a dynamic setting, objects can be moving, which makes static traversability estimates insufficient. With the approaches developed in this thesis, we aim at identifying distinct objects and tracking them to aid navigation and scene understanding. We target these challenges by providing a method for clustering a scene taken with a LiDAR scanner and a similarity measure for clustered objects that can aid tracking performance. All methods presented in this thesis are capable of supporting real-time robot operation, rely on RGBD or LiDAR sensors, and have been tested on real robots in real-world environments and on real-world datasets. All approaches have been published in peer-reviewed conference papers and journal articles. In addition, most of the presented contributions have been released publicly as open source software
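    The abstract names a range image representation as the common ground between RGBD and LiDAR data. As a rough illustration only (not the thesis's implementation), a spinning LiDAR scan can be projected into a spherical range image along these lines; the function name point_cloud_to_range_image and the sensor parameters (64 x 900 resolution, +/-15 degree vertical field of view) are assumptions for the sketch.

    # Illustrative spherical projection of an (N, 3) point cloud into a
    # range image; parameters are assumed, not taken from the thesis.
    import numpy as np

    def point_cloud_to_range_image(points: np.ndarray,
                                   H: int = 64, W: int = 900,
                                   fov_up_deg: float = 15.0,
                                   fov_down_deg: float = -15.0) -> np.ndarray:
        """Project an (N, 3) point cloud into an H x W range image (meters)."""
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        r = np.linalg.norm(points, axis=1)
        yaw = np.arctan2(y, x)                                   # azimuth in [-pi, pi]
        pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))

        fov_up = np.deg2rad(fov_up_deg)
        fov_down = np.deg2rad(fov_down_deg)
        fov = fov_up - fov_down

        u = ((1.0 - (pitch - fov_down) / fov) * (H - 1)).round().astype(int)
        v = ((0.5 * (yaw / np.pi + 1.0)) * (W - 1)).round().astype(int)
        u = np.clip(u, 0, H - 1)
        v = np.clip(v, 0, W - 1)

        image = np.full((H, W), np.inf)
        np.minimum.at(image, (u, v), r)    # keep the closest return per pixel
        image[np.isinf(image)] = 0.0       # mark pixels with no return
        return image

    Once both RGBD depth maps and LiDAR scans are expressed as range images, the same matching and traversability code can, in principle, operate on either sensor, which is the uniformity the thesis argues for.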

    Proceedings of the NASA Conference on Space Telerobotics, volume 1

    The theme of the Conference was man-machine collaboration in space. Topics addressed include: redundant manipulators; man-machine systems; telerobot architecture; remote sensing and planning; navigation; neural networks; fundamental AI research; and reasoning under uncertainty

    Conference on Intelligent Robotics in Field, Factory, Service, and Space (CIRFFSS 1994), volume 1

    The AIAA/NASA Conference on Intelligent Robotics in Field, Factory, Service, and Space (CIRFFSS '94) was originally proposed because of the strong belief that America's problems of global economic competitiveness and job creation and preservation can partly be solved by the use of intelligent robotics, which are also required for human space exploration missions. Individual sessions addressed nuclear industry, agile manufacturing, security/building monitoring, on-orbit applications, vision and sensing technologies, situated control and low-level control, robotic systems architecture, environmental restoration and waste management, robotic remanufacturing, and healthcare applications

    Bio-Inspired Robotics

    Modern robotic technologies have enabled robots to operate in a variety of unstructured and dynamically changing environments, in addition to traditional structured environments. Robots have thus become an important element in our everyday lives. One key approach to developing such intelligent and autonomous robots is to draw inspiration from biological systems. Biological structures, mechanisms, and underlying principles have the potential to provide new ideas to support the improvement of conventional robotic designs and control. Such biological principles usually originate from animal or even plant models and are applied to robots that can sense, think, walk, swim, crawl, jump or even fly. Thus, it is believed that these bio-inspired methods are becoming increasingly important in the face of complex applications. Bio-inspired robotics is leading to the study of innovative structures and computing with sensory–motor coordination and learning to achieve intelligence, flexibility, stability, and adaptation for emergent robotic applications, such as manipulation, learning, and control. This Special Issue invites original papers of innovative ideas and concepts, new discoveries and improvements, and novel applications and business models relevant to the selected topics of "Bio-Inspired Robotics". Bio-inspired robotics is a broad topic and an expanding field. This Special Issue collates 30 papers that address some of the important challenges and opportunities in this broad and expanding field

    Advances in Intelligent Robotics and Collaborative Automation

    This book provides an overview of a series of advanced research lines in robotics, as well as of design and development methodologies for intelligent robots and their intelligent components. It represents a selection of extended versions of the best papers presented at the Seventh IEEE International Workshop on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS 2013) that were related to these topics. Its contents range from state-of-the-art computational-intelligence-based techniques for automatic robot control to novel distributed sensing and data integration methodologies that can be applied to intelligent robotics and automation systems. The objective of the text was to provide an overview of some of the problems in the field of robotic systems and intelligent automation, and of the approaches and techniques that relevant research groups within this area are employing to try to solve them. The contributions of the different authors have been grouped into four main sections:
    • Robots
    • Control and Intelligence
    • Sensing
    • Collaborative automation
    The chapters have been structured to provide an easy-to-follow introduction to the topics that are addressed, including the most relevant references, so that anyone interested in this field can get started in the area

    A Mechanism for Spatial Orientation Based on Sensory Adaptation in Caenorhabditis Elegans

    During chemotaxis, animals compute spatial information about odor gradients to make navigational choices for finding or avoiding an odor source. The challenge to the neural circuitry is to interpret and respond to odor concentrations that change over time as animals traverse a gradient. In this thesis, I ask how a nervous system regulates spatial navigation by studying the chemotaxis response of Caenorhabditis elegans to diacetyl. A behavioral analysis demonstrated that AWA sensory neurons drive chemotaxis over several orders of magnitude in odor concentration, providing an entry point for dissecting the mechanistic basis of chemotaxis at the level of neural activity. Precise microfluidic stimulation enabled me to dissociate space from time in the olfactory input to characterize how odor sensing relates to behavior. I systematically measured neuronal responses to odor in the diacetyl chemotaxis circuit, aided by a newly developed imaging system with flexible stimulus delivery and elevated throughput. I found reliable sensory responses to the behaviorally relevant range of odor concentrations. I then followed odor-evoked activity to downstream interneurons that integrate sensory input. Adaptation of neuronal responses to odor yielded a highly sensitive response to small increases in odor concentration at the interneuron level, providing a mechanism for efficient gradient sensing during klinokinesis. Adaptation dynamics at the sensory level were stimulus-dependent and cell-autonomously altered in several classes of mutant animals. Behavioral responses to different concentrations of diacetyl resulted from overlapping contributions from multiple sensory neurons. AWA was specifically required for orientation behavior in response to small increases in odor concentration that are encountered in shallow gradients, demonstrating functional specialization amongst sensory neurons for stimulus characteristics. This work sheds light on an algorithm underlying acute behavioral computation and its biological implementation. The experimental results are presented in two parts: Chapter 2 describes the development of a microscope for high-throughput imaging of neuronal activity in Caenorhabditis elegans. I present a characterization of chemosensory responses to odor and their correlation with behavior. This work has been published (Larsch et al., 2013). Chapter 3 describes the functional architecture of the AWA chemosensory circuit and the role of adaptation in maintaining sensitivity over a wide range of stimulus intensities. This work is currently being prepared for publication
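    The computational principle the abstract attributes to adaptation, sensitivity to small concentration increases across a wide range of absolute concentrations, can be sketched with a toy model; this is purely illustrative and not the circuit dynamics measured in the thesis, and the function adaptive_response and the time constant tau_adapt are assumptions.

    # Toy model: sensory adaptation as a response to deviations from a slowly
    # adapting baseline. Illustrative only; not the thesis's measured dynamics.
    import numpy as np

    def adaptive_response(concentration: np.ndarray,
                          dt: float = 0.1,
                          tau_adapt: float = 20.0) -> np.ndarray:
        """Return response = stimulus minus a slowly adapting baseline."""
        baseline = concentration[0]
        response = np.zeros_like(concentration, dtype=float)
        for i, c in enumerate(concentration):
            response[i] = c - baseline                    # deviation from baseline
            baseline += dt / tau_adapt * (c - baseline)   # baseline relaxes toward c
        return response

    # A small step on top of a large background still produces a clear transient.
    t = np.arange(0, 60, 0.1)
    odor = 1.0 + 0.05 * (t > 30)          # 5% step at t = 30 s
    print(adaptive_response(odor).max())  # transient close to the 0.05 step size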