
    Design and construction of a configurable full-field range imaging system for mobile robotic applications

    Mobile robotic devices rely critically on exteroceptive sensors to determine the range to objects in the robot’s operating environment. These sensors give the robot the ability both to navigate safely around obstacles and to map its environment, thereby facilitating path planning and navigation. There is a requirement for a full-field range imaging system that can determine the range to any obstacle in a camera lens’ field of view accurately and in real time. This paper details the development of a portable full-field ranging system whose bench-top version has demonstrated sub-millimetre precision. However, that precision required non-real-time acquisition rates and expensive hardware. By iteratively replacing components, a portable, modular and inexpensive version of this full-field ranger has been constructed, capable of real-time operation with a user-defined trade-off in precision.
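
    Full-field rangers of this kind typically recover range from the phase shift that the round trip imposes on amplitude-modulated illumination. As a hedged illustration (the abstract does not spell out its algorithm, and heterodyne variants differ), below is a minimal sketch of the common homodyne four-phase-step computation; the frame names a0..a3 and the parameter f_mod are assumptions for the example.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def range_from_phase_steps(a0, a1, a2, a3, f_mod):
    """Per-pixel range from four frames captured at modulation phase
    offsets of 0, 90, 180 and 270 degrees (homodyne AMCW ranging).
    f_mod is the amplitude-modulation frequency in Hz.
    """
    phase = np.arctan2(a3 - a1, a0 - a2)    # wrapped phase of the returned signal
    phase = np.mod(phase, 2 * np.pi)        # map to [0, 2*pi)
    return C * phase / (4 * np.pi * f_mod)  # range within one ambiguity interval
```

    The unambiguous range is c/(2·f_mod), so raising the modulation frequency improves precision at the cost of range ambiguity, one face of the speed/precision trade-off mentioned above.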

    Technology of swallowable capsule for medical applications

    Medical technology has undergone major breakthroughs in recent years, especially in examination tools for diagnostic purposes. This paper reviews swallowable capsule technology for examining the gastrointestinal system for various diseases. The wireless camera pill provides a more advanced method for diagnosing gastrointestinal diseases than many traditional examinations, such as gastroscopy with an endoscope. After years of innovation, commercial swallowable pills have been produced and applied in clinical practice. These smart pills can cover the examination of the entire gastrointestinal system and not only provide physicians with useful data unavailable from traditional methods, but also eliminate the need for the painful endoscopy procedure. In this paper, the key state-of-the-art technologies in existing Wireless Capsule Endoscopy (WCE) systems are fully reported and recent research progress related to these technologies is reviewed. The paper ends with a discussion of current technical bottlenecks and future research directions in this area.

    Bioinspired engineering of exploration systems for NASA and DoD

    A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example honeybees and dragonflies) cope remarkably well with their world, despite possessing a brain containing less than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed-focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality provides potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight, low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel. Features are mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of the unique blending of insect strategies of navigation with mammalian visual search, pattern recognition, and image understanding into hybrid biomorphic flyers for future planetary and terrestrial applications. We describe a few future mission scenarios for Mars exploration, uniquely enabled by these newly developed biomorphic flyers.
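
    One of the best-known insect strategies alluded to here is corridor centering: honeybees balance the optic flow seen by their two eyes, turning away from the side where the world appears to move faster. A minimal sketch of that idea follows (not the BEES flight code; the gain and inputs are illustrative assumptions).

```python
import numpy as np

def centering_command(flow_left, flow_right, gain=0.5):
    """Bee-inspired corridor centering: steer away from the side with
    the larger optic-flow magnitude, since nearer surfaces appear to
    move faster. Returns a yaw-rate command (positive steers right).
    """
    imbalance = np.mean(flow_left) - np.mean(flow_right)
    return gain * imbalance  # more flow on the left -> nearer wall on the left -> turn right
```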

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as those demanding low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
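
    The working principle summarized above has a standard idealized model: a pixel fires an event when its log-intensity changes by more than a contrast threshold since its last event. A minimal sketch that simulates this from conventional frames (the threshold value is an illustrative assumption; real sensors are asynchronous and noisy):

```python
import numpy as np

def events_from_frames(frames, timestamps, contrast=0.2, eps=1e-6):
    """Idealized event-camera model: emit (t, x, y, polarity) whenever a
    pixel's log-intensity has changed by more than `contrast` since the
    last event at that pixel. Polarity is +1 or -1 (the sign of the change).
    """
    log_ref = np.log(np.asarray(frames[0], dtype=float) + eps)
    for img, t in zip(frames[1:], timestamps[1:]):
        diff = np.log(np.asarray(img, dtype=float) + eps) - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= contrast)
        for y, x in zip(ys, xs):
            pol = 1 if diff[y, x] > 0 else -1
            yield (t, int(x), int(y), pol)
            log_ref[y, x] += pol * contrast  # reference steps by one threshold per event
```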

    Integrated Circuitry to Detect Slippage Inspired by Human Skin and Artificial Retinas

    This paper presents a bioinspired integrated tactile coprocessor that is able to generate a warning in the case of slippage via the data provided by a tactile sensor. Some implementations use different layers of piezoresistive and piezoelectric materials built upon the raw sensor to obtain both static (pressure) and dynamic (slippage) information. In this paper, a simple raw sensor is used and circuitry is implemented that is able to extract the dynamic information from a single piezoresistive layer. The circuitry was inspired by structures found in human skin and the retina, as both are biological systems made up of a dense network of receptors. It is largely based on an artificial retina, which is able to detect motion using relatively simple spatio-temporal dynamics. The circuitry was adapted to respond in the bandwidth of the microvibrations produced by early slippage, resembling human skin. Experimental measurements from a chip implemented in a 0.35-μm four-metal two-poly standard CMOS process are presented to show both the performance of the building blocks included in each processing node and the operation of the whole system as a detector of early slippage. Ministerio de Economía y Competitividad TEC2006-12376-C02-01; Gobierno de España TEC2006-1572.
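
    The chip’s core idea, detecting the microvibrations of early slippage within a specific bandwidth, can be sketched in software as a band-pass filter followed by an energy threshold. This is only an analogue of the chip’s circuitry; the band edges, window length and threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def slip_warning(signal, fs, band=(50.0, 400.0), window=0.02, threshold=1e-4):
    """Flag early slippage: band-pass the tactile signal in an assumed
    microvibration band, then threshold its short-window energy.
    Returns a boolean array, True where slip is suspected.
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    vib = sosfilt(sos, signal)  # keep only microvibration content
    n = max(1, int(window * fs))
    energy = np.convolve(vib**2, np.ones(n) / n, mode="same")  # moving-average power
    return energy > threshold
```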

    Implementation of a Localization-Oriented HRI for Walking Robots in the RoboCup Environment

    This paper presents the design and implementation of a human–robot interface capable of evaluating robot localization performance and maintaining full control of robot behaviors in the RoboCup domain. The system consists of legged robots, behavior modules, an overhead visual tracking system, and a graphical user interface. A human–robot communication framework is designed for executing cooperative and competitive processing tasks between users and robots, using an object-oriented and modularized software architecture for operability and functionality. Experimental results are presented to show the performance of the proposed system based on simulated and real-time information.
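
    To make the “behavior modules plus operator control” structure concrete, here is a hedged sketch of how such an object-oriented, modularized framework might be organized; the class and method names are assumptions for illustration, not the paper’s actual interfaces.

```python
from abc import ABC, abstractmethod

class BehaviorModule(ABC):
    """One robot behavior (e.g. localize, chase-ball) in a modular HRI."""

    @abstractmethod
    def step(self, world_state: dict) -> dict:
        """Consume the latest world state, return actuator commands."""

class HRIController:
    """Routes operator commands from the GUI to named behavior modules."""

    def __init__(self):
        self.modules: dict[str, BehaviorModule] = {}
        self.active: BehaviorModule | None = None

    def register(self, name: str, module: BehaviorModule) -> None:
        self.modules[name] = module

    def on_gui_command(self, name: str) -> None:
        self.active = self.modules[name]  # operator selects the running behavior

    def tick(self, world_state: dict) -> dict:
        # Called every control cycle with fused sensor/tracking data.
        return self.active.step(world_state) if self.active else {}
```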

    Embedded Vision Systems: A Review of the Literature

    Over the past two decades, the use of low-power Field Programmable Gate Arrays (FPGAs) to accelerate vision systems, mainly on embedded devices, has become widespread. The reconfigurable and parallel nature of the FPGA opens up new opportunities to speed up computationally intensive vision and neural algorithms on embedded and portable devices. This paper presents a comprehensive review of embedded vision algorithms and applications over the past decade. The review discusses vision-based systems and approaches, and how they have been implemented on embedded devices. Topics covered include image acquisition, preprocessing, object detection and tracking, and recognition, as well as high-level classification. This is followed by an outline of the advantages and disadvantages of the various embedded implementations. Finally, an overview of the challenges in the field and future research trends is presented. This review is expected to serve as a tutorial and reference source for embedded computer vision systems.
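
    The stages listed above (acquisition, preprocessing, detection and tracking, classification) compose naturally, which is what makes them amenable to streaming FPGA implementations. A toy software sketch of that staged structure (the stage bodies and thresholds are illustrative assumptions, not drawn from any surveyed system):

```python
import numpy as np

def acquire(frame):
    """Acquisition/preprocessing: reduce an RGB frame to grayscale."""
    return np.asarray(frame, dtype=float).mean(axis=2)

def detect(gray, thresh=128.0):
    """Detection: crude foreground mask via global thresholding."""
    return gray > thresh

def classify(mask, min_fill=0.05):
    """High-level classification: label the frame by foreground coverage."""
    return "object" if mask.mean() > min_fill else "background"

def pipeline(frame):
    # On an FPGA each stage would become a streaming hardware block,
    # with pixels flowing through all stages in parallel.
    return classify(detect(acquire(frame)))
```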