
    Obstacle Avoidance Based on Stereo Vision Navigation System for Omni-directional Robot

    This paper addresses the problem of obstacle avoidance in mobile robot navigation systems. The navigation system is critical because the robot must travel from its initial position to its destination without colliding, avoiding obstacles along the way. Several previous studies have focused on predetermined, stationary obstacles, which makes their results difficult to apply in real environments, where obstacles may be stationary or moving as the surroundings change. The objective of this study is to address the robot's navigation behaviors for avoiding obstacles. To handle this complexity, a Neuro-Fuzzy control system is designed so that the robot can avoid obstacles while moving toward its destination. This paper uses ANFIS for obstacle avoidance control with an offline learning model: input and output data are first mapped, and the network is then trained until the error is very small. To make the robot's motion more flexible and smoother while avoiding obstacles, and to identify objects in real time, a three-wheel omnidirectional robot equipped with a stereo vision sensor is used. The contribution is to advance the state of the art in obstacle avoidance for robot navigation systems by exploiting ANFIS with target-and-obstacle detection based on stereo vision sensors. The proposed control method was tested in 15 experiments with different obstacle setups, chosen to test the ability to avoid moving obstacles approaching from the front, the right, or the left of the robot. The robot moved to the left or right of the obstacles depending on the given Vy speed. Across these tests, the robot avoided obstacles at distances of roughly 150–173 cm with an average Vy of 274 mm/s. While avoiding obstacles, the robot keeps computing the direction to the target until the target angle reaches 0.
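    To make the omnidirectional drive concrete, the following minimal sketch (not taken from the paper) converts a body-frame velocity command, such as the Vy avoidance speed mentioned above, into wheel speeds for a three-wheel omnidirectional base; the wheel mounting angles and radius are assumed values for illustration.

```python
import numpy as np

# Minimal sketch, not the paper's implementation: inverse kinematics for a
# three-wheel omnidirectional base. Wheel mounting angles and radius are
# assumptions made for illustration only.
WHEEL_ANGLES = np.radians([90.0, 210.0, 330.0])  # wheel positions around the chassis
R = 0.15                                         # assumed mounting radius in metres

def body_to_wheel_speeds(vx, vy, omega):
    """Map a body-frame command (vx, vy in m/s, omega in rad/s)
    to the three wheel surface speeds in m/s."""
    return np.array([
        -np.sin(a) * vx + np.cos(a) * vy + R * omega
        for a in WHEEL_ANGLES
    ])

# Example: sidestep at Vy = 0.274 m/s (the average avoidance speed reported
# above) while holding the current heading (omega = 0).
print(body_to_wheel_speeds(vx=0.0, vy=0.274, omega=0.0))
```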

    Microrobots for wafer-scale microfactory: design, fabrication, integration and control.

    Future assembly technologies will involve higher automation levels in order to satisfy increased micro-scale or nano-scale precision requirements. Traditionally, assembly using a top-down robotic approach has been well studied and applied in the micro-electronics and MEMS industries, but less so in nanotechnology. With the blossoming of nanotechnology since the 1990s, newly designed products with new materials, coatings and nanoparticles are gradually entering everyday life, and the industry has grown into a billion-dollar volume worldwide. Nanotechnology products are traditionally assembled using bottom-up methods, such as self-assembly, rather than top-down robotic assembly. This is due to the need to handle large volumes of components and the high cost associated with top-down manipulation at the required precision. However, bottom-up manufacturing methods have certain limitations: components need pre-defined shapes and surface coatings, and only a very small number of components can be assembled. For example, in the self-assembly of nano-cubes with an origami design, cost-efficient post-assembly manipulation of cubes in large quantities remains challenging. In this thesis, we envision a new paradigm for nano-scale assembly, realized with the help of a wafer-scale microfactory containing large numbers of MEMS microrobots. These robots will work together to enhance the throughput of the factory, while costing less than conventional nanopositioners. To fulfill the microfactory vision, numerous challenges related to design, power, control and nanoscale task completion by these microrobots must be overcome. In this work, we study three types of microrobots for the microfactory: the world's first laser-driven micrometer-size locomotor, called ChevBot; a stationary millimeter-size robotic arm, called the Solid Articulated Four Axes Microrobot (sAFAM); and a light-powered centimeter-size crawler microrobot called SolarPede. The ChevBot can perform autonomous navigation and positioning on a dry surface under the guidance of a laser beam. The sAFAM has been designed to perform nanopositioning in four degrees of freedom and nanoscale tasks such as indentation and manipulation. The SolarPede serves as a mobile workspace or transporter in the microfactory environment.

    Contemporary Robotics

    This book is a collection of 18 chapters written by internationally recognized experts and well-known professionals in the field. The chapters contribute to diverse facets of contemporary robotics and autonomous systems. The volume is organized in four thematic parts according to the main subjects, covering recent advances in contemporary robotics. The first part is devoted to theoretical issues, including the development of algorithms for automatic trajectory generation using a redundancy resolution scheme, intelligent algorithms for robotic grasping, a modelling approach for reactive mode handling in flexible manufacturing, and the design of an advanced controller for robot manipulators. The second part of the book deals with different aspects of robot calibration and sensing, including geometric and threshold calibration of a multiple robotic line-vision system, robot-based inline 2D/3D quality monitoring using imaging and laser triangulation, and a study on prospective polymer composite materials for flexible tactile sensors. The third part addresses issues of mobile robots and multi-agent systems, including SLAM for mobile robots based on fusion of odometry and visual data, configuration of a localization system by a team of mobile robots, development of a generic real-time motion controller for differential mobile robots, control of fuel cells in mobile robots, modelling of omnidirectional wheeled robots, building of a hunter-hybrid tracking environment, and the design of cooperative control in a distributed, population-based multi-agent approach. The fourth part presents recent approaches and results in humanoid and bio-inspired robotics, dealing with adaptive control of anthropomorphic biped gait, dynamics-based simulation of humanoid robot walking, a controller for the perceptual-motor control dynamics of humans, and a biomimetic approach to controlling a mechatronic structure using smart materials.

    Vision Sensors and Edge Detection

    The Vision Sensors and Edge Detection book reflects a selection of recent developments in the area of vision sensors and edge detection. There are two sections in this book. The first section presents vision sensors with applications to panoramic vision sensors, wireless vision sensors, and automated vision sensor inspection; the second shows image processing techniques such as image measurements, image transformations, filtering, and parallel computing.

    Design, Development and Characterization of a Thermal Sensor Brick System for Modular Robotics

    This thesis presents work on a thermal imaging sensor brick (TISB) system for modular robotics, covering the design, development and characterization of the TISB system. The TISB system is based on the design philosophy of sensor bricks for modular robotics. In under-vehicle surveillance for threat detection, which is a target application of this work, we have demonstrated the advantages of the TISB system over purely vision-based systems, highlighting it as an illumination-invariant system for detecting hidden threat objects in the undercarriage of a car. We have compared the TISB system to the vision sensor brick system and to the mirror-on-a-stick, and have also illustrated the operational capability of the system on the SafeBot under-vehicle robot, acquiring and transmitting data wirelessly. The early designs of the TISB system, the evolution of those designs, and the uniformity achieved while maintaining modularity in building the different sensor bricks (the visual, the thermal and the range sensor bricks) are presented as part of this work. Each of these sensor brick systems, designed and implemented at the Imaging Robotics and Intelligent Systems (IRIS) laboratory, consists of four major blocks: a Sensing and Image Acquisition Block, a Pre-Processing and Fusion Block, a Communication Block, and a Power Block. The Sensing and Image Acquisition Block captures images or acquires data. The Pre-Processing and Fusion Block works on the acquired images or data. The Communication Block transfers data between the sensor brick and the remote host computer. The Power Block maintains the power supply to the entire brick. The modular sensor bricks are self-sufficient, plug-and-play systems. The SafeBot under-vehicle robot, designed and implemented at the IRIS laboratory, has two tracked platforms, one on each side, with a payload bay area in the middle. Each tracked platform is a mobility brick based on the same design philosophy as the modular sensor bricks; the robot can carry one brick or multiple bricks at a time. The contributions of this thesis are: (1) designing and developing the hardware implementation of the TISB system, (2) designing and developing the software for the TISB system, and (3) characterizing the TISB system, with this characterization being the major contribution. The analysis of the thermal sensor brick system provides users and future designers with sufficient information on the parameters to consider when making choices for future modifications, the kinds of applications the TISB could handle, and the load that the different blocks of the TISB system could manage. Under-vehicle surveillance for threat detection, perimeter/area surveillance, scouting, and improvised explosive device (IED) detection using a car-mounted system are some of the applications identified for this system.
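    As a rough illustration of the four-block decomposition described above, the following sketch expresses one possible software view of a sensor brick; the class and method names are assumptions made for illustration, not the thesis' actual interfaces.

```python
from dataclasses import dataclass

# Illustrative sketch only: one way to express the four-block sensor brick
# decomposition. These names are assumptions, not the thesis' software design.

class AcquisitionBlock:
    def capture(self):
        """Grab one frame from the thermal camera (stub)."""
        raise NotImplementedError

class PreprocessingBlock:
    def process(self, frame):
        """Filter / fuse the raw frame before transmission (stub)."""
        raise NotImplementedError

class CommunicationBlock:
    def send(self, frame):
        """Transmit processed data to the remote host computer (stub)."""
        raise NotImplementedError

class PowerBlock:
    def status(self):
        """Report remaining battery capacity (stub)."""
        raise NotImplementedError

@dataclass
class SensorBrick:
    """A self-contained, plug-and-play brick composed of the four blocks."""
    acquisition: AcquisitionBlock
    preprocessing: PreprocessingBlock
    communication: CommunicationBlock
    power: PowerBlock

    def run_once(self):
        # acquire -> pre-process -> transmit, one cycle of the brick pipeline
        frame = self.acquisition.capture()
        self.communication.send(self.preprocessing.process(frame))
```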

    Development of new intelligent autonomous robotic assistant for hospitals

    Continuous technological development in modern societies has increased the quality of life and the average life-span of people. This imposes an extra burden on the current healthcare infrastructure, which in turn creates an opportunity to develop new, autonomous, assistive robots to help alleviate the extra workload. The research question explored the extent to which a prototypical robotic platform can be created and implemented in a hospital environment with the aim of assisting hospital staff with daily tasks, such as guiding patients and visitors, following patients to ensure safety, and making deliveries to and from rooms and workstations. In terms of major contributions, this thesis outlines five domains in the development of an actual robotic assistant prototype. Firstly, a comprehensive schematic design is presented in which mechanical, electrical, motor control and kinematics solutions have been examined in detail. Next, a new method is proposed for assessing the intrinsic properties of different flooring types using machine learning to classify mechanical vibrations. Thirdly, the technical challenge of enabling the robot to simultaneously map and localise itself in a dynamic environment is addressed, whereby leg detection is introduced to ensure that, whilst mapping, the robot is able to distinguish between people and the background. The fourth contribution integrates geometric collision prediction into stabilised dynamic navigation methods, improving the robot's ability to update its path plan in real time in a dynamic environment. Lastly, the problem of detecting gaze at long distances is addressed by means of a new eye-tracking hardware solution which combines infra-red eye tracking and depth sensing. The research serves both to provide a template for the development of comprehensive mobile assistive-robot solutions and to address some of the inherent challenges currently present in introducing autonomous assistive robots into hospital environments.
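    As a minimal sketch of the vibration-based flooring classification idea (not the thesis' actual pipeline), one could extract simple statistics from short accelerometer windows and fit an off-the-shelf classifier; the feature set and classifier choice below are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Sketch only: classify flooring type from vertical-acceleration windows.
# The features and model are illustrative assumptions, not the thesis' method.

def vibration_features(window):
    """window: 1-D array of acceleration samples from one short recording."""
    return [
        np.mean(window),
        np.std(window),
        np.max(np.abs(window)),
        np.mean(np.abs(np.diff(window))),  # crude proxy for high-frequency content
    ]

def train_floor_classifier(windows, labels):
    """windows: list of 1-D sample arrays; labels: floor types, e.g. 'tile', 'carpet'."""
    X = np.array([vibration_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf
```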

    Sound source localization through shape reconfiguration in a snake robot

    This paper describes a snake robot system that uses sound source localization. We show how a sound source can be localized in 3D, and how the classic forward-backward ambiguity in sound source localization can be resolved with a minimum number of audio sensors by exploiting the snake robot's multiple degrees of freedom. We describe the hardware and software architecture of the robot and present the results of several sound-tracking experiments conducted with our snake robot. We also present biologically inspired sound-tracking behavior in different postures of the snake robot, demonstrated as "Digital Snake Charming".
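    As an illustrative sketch (not the authors' implementation) of how reconfiguring the sensor geometry can resolve the forward-backward ambiguity: a single microphone pair constrains the source bearing only up to a mirror image, and repeating the measurement after a known rotation of the robot's head keeps just the consistent candidate. The function names, spacing and tolerance below are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature

def bearing_candidates(tdoa, mic_spacing):
    """Two candidate bearings (radians, relative to the microphone axis)
    consistent with a measured time difference of arrival."""
    cos_theta = np.clip(SPEED_OF_SOUND * tdoa / mic_spacing, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    return theta, -theta  # mirror ambiguity about the microphone axis

def resolve_ambiguity(cands_pose1, cands_pose2, rotation, tol=0.1):
    """Keep the pose-1 candidate that also agrees with a second measurement
    taken after rotating the microphone pair by a known angle (radians)."""
    for a in cands_pose1:
        for b in cands_pose2:
            # wrap the angular difference into [-pi, pi) before comparing
            diff = ((a - rotation) - b + np.pi) % (2 * np.pi) - np.pi
            if abs(diff) < tol:
                return a
    return None

# Example: source at +60 deg, microphones 0.2 m apart, head rotated by 30 deg
d = 0.2
true_bearing = np.radians(60.0)
c1 = bearing_candidates(d * np.cos(true_bearing) / SPEED_OF_SOUND, d)
c2 = bearing_candidates(d * np.cos(true_bearing - np.radians(30.0)) / SPEED_OF_SOUND, d)
print(np.degrees(resolve_ambiguity(c1, c2, np.radians(30.0))))  # ~60.0
```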