1,029 research outputs found

    Implementation of a Localization-Oriented HRI for Walking Robots in the RoboCup Environment

    Get PDF
    This paper presents the design and implementation of a human–robot interface capable of evaluating robot localization performance and maintaining full control of robot behaviors in the RoboCup domain. The system consists of legged robots, behavior modules, an overhead visual tracking system, and a graphical user interface. A human–robot communication framework is designed for executing cooperative and competitive processing tasks between users and robots, built on an object-oriented, modularized software architecture with attention to operability and functionality. Some experimental results are presented to show the performance of the proposed system based on simulated and real-time information.

    Deep Visual Foresight for Planning Robot Motion

    Full text link
    A key challenge in scaling up robot learning to many skills and environments is removing the need for human supervision, so that robots can collect their own data and improve their own performance without being limited by the cost of requesting human feedback. Model-based reinforcement learning holds the promise of enabling an agent to learn to predict the effects of its actions, which could provide flexible predictive models for a wide range of tasks and environments without detailed human supervision. We develop a method for combining deep action-conditioned video prediction models with model-predictive control that uses entirely unlabeled training data. Our approach does not require a calibrated camera, an instrumented training set-up, or precise sensing and actuation. Our results show that our method enables a real robot to perform nonprehensile manipulation -- pushing objects -- and can handle novel objects not seen during training. Comment: ICRA 2017. Supplementary video: https://sites.google.com/site/robotforesight
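    The planning loop described above can be illustrated with a minimal random-shooting sketch, assuming a learned action-conditioned video predictor and a pixel-based cost function are available as callables; predict_video and cost_fn are hypothetical placeholders, not the paper's implementation.

        import numpy as np

        def visual_mpc_step(predict_video, cost_fn, current_frame,
                            horizon=5, n_samples=200, action_dim=2):
            """One model-predictive control step with an action-conditioned video predictor.

            predict_video(frame, actions) -> predicted frames for an action sequence.
            cost_fn(frames) -> scalar cost, e.g. distance of a tracked pixel from its goal.
            """
            # Random shooting: sample candidate action sequences.
            candidates = np.random.uniform(-1.0, 1.0, size=(n_samples, horizon, action_dim))
            # Score each candidate by rolling it through the learned predictor.
            costs = [cost_fn(predict_video(current_frame, a)) for a in candidates]
            # Execute only the first action of the best sequence, then replan next step.
            return candidates[int(np.argmin(costs))][0]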

    Robust and Efficient Robot Vision Through Sampling

    Get PDF

    Object detection for KRSBI robot soccer using PeleeNet on omnidirectional camera

    Get PDF
    Kontes Robot Sepak Bola Indonesia (KRSBI) is an annual event in which contestants compete with their robot designs and engineering in the field of robot soccer. Each contestant tries to win the match by scoring a goal against the opponent. In order to score, the robot needs to find the ball, locate the goal, and then kick the ball toward the goal. We employed an omnidirectional vision camera as a visual sensor for the robot to perceive object information. We calibrated the streaming images from the camera to remove the mirror distortion. Furthermore, we deployed PeleeNet as our deep learning model for object detection and fine-tuned it on a dataset generated from our image collection. Our experimental results showed that PeleeNet has potential as a mobile deep learning object detection architecture for KRSBI, offering an excellent combination of memory efficiency, speed, and accuracy.
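    A rough sketch of the processing chain described above (calibration to remove distortion, then detection), assuming the calibration has already produced a camera matrix and distortion coefficients and that the fine-tuned detector is wrapped behind a generic callable; the names below are illustrative, not from the paper.

        import cv2

        def detect_on_omni_frame(frame, camera_matrix, dist_coeffs, detector):
            """Undistort one streaming frame, then run an object detector on it.

            detector(image) -> list of (label, confidence, bbox); any fine-tuned
            detector such as PeleeNet can sit behind this callable.
            """
            # Remove lens/mirror distortion using the pre-computed calibration.
            undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
            # Run the detector on the corrected image.
            return detector(undistorted)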

    Desenvolvimento de um sistema de visão para robôs humanoides (Development of a vision system for humanoid robots)

    Get PDF
    Master's degree in Electronics and Telecommunications Engineering

    Embedded distributed vision system for humanoid soccer robot

    Get PDF
    Computer vision is one of the most challenging applications in sensor systems, since the signal is complex from a spatial and logical point of view. Because of these characteristics, vision applications require high computing resources, which makes them especially difficult to use in embedded systems such as mobile robots with a reduced amount of memory and computing power. In this work a distributed architecture for humanoid visual control is presented, using specific nodes for vision processing that cooperate with the main CPU to coordinate the movements of the exploring behaviours. This architecture provides additional computing resources in a reduced area, without letting the vision processing algorithms disturb the tasks related to low-level control (mainly kinematics). Information is exchanged between the two nodes, allowing their control loops to be linked. This work was supported by the Spanish MICINN project SIDIRELI DPI2008-06737-C02-01/02 and FEDER funds.
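    A minimal sketch of the kind of information exchange such an architecture implies, assuming a fixed-size binary message from the vision node to the main CPU; the actual bus and message layout are not specified in the abstract and are purely illustrative.

        import struct

        # Hypothetical wire format for one measurement sent by the vision node:
        # object id, bearing (rad), distance (m), confidence.
        VISION_MSG = struct.Struct("<Bfff")

        def encode_measurement(object_id, bearing, distance, confidence):
            return VISION_MSG.pack(object_id, bearing, distance, confidence)

        def decode_measurement(payload):
            object_id, bearing, distance, confidence = VISION_MSG.unpack(payload)
            # The main CPU can feed these values into its low-level control loops.
            return {"id": object_id, "bearing": bearing,
                    "distance": distance, "confidence": confidence}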

    Autonomous Configuration of Parameters in Robotic Digital Cameras

    Full text link

    Embedded system for motion control of an omnidirectional mobile robot

    Get PDF
    In this paper, an embedded system for motion control of omnidirectional mobile robots is presented. An omnidirectional mobile robot is a type of holonomic robot: it can move simultaneously and independently in translation and rotation. The RoboCup small-size league, a robotic soccer competition, is chosen as the research platform in this paper. The first part of this research is to design and implement an embedded system that can communicate with a remote server over a wireless link and execute received commands. Second, a fuzzy-tuned proportional-integral (PI) path planner and a related low-level controller are proposed to obtain optimal input for driving a linear discrete dynamic model of the omnidirectional mobile robot. To meet the planning requirements and avoid slippage, velocity and acceleration filters are also employed. In particular, low-level optimal controllers, such as a linear quadratic regulator (LQR) for multiple-input multiple-output acceleration and deceleration of velocity, are investigated, where the LQR controller runs on the robot with feedback from motor encoders or sensors. Simultaneously, a fuzzy adaptive PI is used as a high-level controller for position monitoring, where an appropriate vision system is used as a source of position feedback. A key contribution of this research is the improvement of the combined fuzzy-PI LQR controller over a traditional PI controller. Moreover, the efficiency of the proposed approach and of the PI controller are also discussed. Simulation and experimental evaluations are conducted with and without external disturbance. The onboard regulator controller delivers an optimal result that decreases the variance between the target trajectory and the actual output. The modeling and experimental results confirm the claim that utilizing the new approach in trajectory-planning controllers results in more precise motion of four-wheeled omnidirectional mobile robots. 2018 IEEE.
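    A sketch of the low-level LQR piece described above, assuming a simple per-axis double-integrator model in place of the paper's actual linear discrete robot model; the matrices, sample time, and weights below are illustrative, not taken from the paper.

        import numpy as np
        from scipy.linalg import solve_discrete_are

        def dlqr(A, B, Q, R):
            """Discrete-time LQR gain K for x[k+1] = A x[k] + B u[k], with u = -K x."""
            P = solve_discrete_are(A, B, Q, R)
            return np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

        # Assumed placeholder model: per-axis double integrator (position, velocity),
        # sampled at dt.
        dt = 0.02
        A = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([[0.5 * dt**2], [dt]])
        Q = np.diag([10.0, 1.0])   # penalize position error more than velocity error
        R = np.array([[0.1]])      # penalize control effort

        K = dlqr(A, B, Q, R)       # state-feedback gain for the onboard velocity loop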

    Using genetic algorithms for real-time object detection

    Get PDF
    P. 1-12. This article presents a new approach to mobile robot vision systems based on genetic algorithms. The major contribution of the proposal is the real-time adaptation of genetic algorithms, which are generally used offline. In order to achieve this goal, the execution time must be as short as possible. The scope of the system is the robotic football competition RoboCup, within the Standard Platform category. The system developed detects and estimates distance and orientation to key elements on a football field, such as the ball and the goals. Different experiments have been carried out within an official RoboCup environment.
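    A minimal sketch of the kind of per-frame genetic search the article describes, assuming a small population and few generations to stay within a real-time budget; the fitness function (e.g. how well an (x, y, radius) ball hypothesis matches the current image) is a hypothetical placeholder.

        import numpy as np

        def run_ga(fitness, bounds, pop_size=30, generations=15,
                   mutation_sigma=5.0, elite=5):
            """Small genetic search intended to run once per camera frame.

            fitness(candidate) -> higher is better, e.g. agreement between a
            (x, y, radius) ball hypothesis and ball-coloured pixels in the image.
            bounds: sequence of (low, high) pairs, one per parameter.
            """
            lo, hi = np.array(bounds, dtype=float).T
            pop = np.random.uniform(lo, hi, size=(pop_size, len(bounds)))
            for _ in range(generations):
                scores = np.array([fitness(ind) for ind in pop])
                parents = pop[np.argsort(scores)[-elite:]]      # keep the best individuals
                children = parents[np.random.randint(elite, size=pop_size - elite)]
                children = children + np.random.normal(0.0, mutation_sigma, children.shape)
                pop = np.clip(np.vstack([parents, children]), lo, hi)
            scores = np.array([fitness(ind) for ind in pop])
            return pop[int(np.argmax(scores))]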

    A reliability-based particle filter for humanoid robot self-localization in Robocup Standard Platform League

    Get PDF
    This paper deals with the problem of humanoid robot localization and proposes a new method for position estimation that has been developed for the RoboCup Standard Platform League environment. Firstly, a complete vision system has been implemented on the Nao robot platform that enables the detection of relevant field markers. The detection of field markers provides estimates of distances for the current robot position. To reduce errors in these distance measurements, extrinsic and intrinsic camera calibration procedures have been developed and described. To validate the localization algorithm, experiments covering many of the typical situations that arise during RoboCup games have been carried out, ranging from degradation in position estimation to total loss of position (due to falls, the 'kidnapped robot' problem, or penalization). The self-localization method developed is based on the classical particle filter algorithm. The main contribution of this work is a new particle selection strategy. Our approach reduces the CPU computing time required for each iteration and so eases the limited resource availability that is common in robot platforms such as the Nao. The experimental results show the quality of the new algorithm in terms of localization and CPU time consumption. This work has been supported by the Spanish Science and Innovation Ministry (MICINN) under the CICYT project COBAMI: DPI2011-28507-C02-01/02. The responsibility for the content remains with the authors. Munera Sánchez, E.; Muñoz Alcobendas, M.; Blanes Noguera, F.; Benet Gilabert, G.; Simó Ten, J. E. (2013). A reliability-based particle filter for humanoid robot self-localization in RoboCup Standard Platform League. Sensors, 13(11), 14954-14983. https://doi.org/10.3390/s131114954
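    For reference, a minimal sketch of one iteration of the classical particle filter that the method builds on, assuming an odometry input and a landmark observation likelihood; the paper's reliability-based particle selection strategy is not reproduced here.

        import numpy as np

        def particle_filter_step(particles, weights, odometry, measurement_likelihood,
                                 motion_noise=(0.02, 0.02, 0.01)):
            """One iteration of a classical particle filter for (x, y, theta) self-localization.

            particles: (N, 3) array of poses; weights: (N,) array summing to 1.
            odometry: (dx, dy, dtheta) motion since the previous step.
            measurement_likelihood(pose) -> likelihood of the current field-marker observations.
            Heading wrap-around is ignored for brevity.
            """
            n = len(particles)
            # Prediction: apply odometry with additive motion noise.
            particles = particles + odometry + np.random.normal(0.0, motion_noise, size=(n, 3))
            # Correction: re-weight each particle by the observation likelihood.
            weights = weights * np.array([measurement_likelihood(p) for p in particles])
            weights = weights / np.sum(weights)
            # Resampling: draw a new particle set proportionally to the weights.
            idx = np.random.choice(n, size=n, p=weights)
            return particles[idx], np.full(n, 1.0 / n)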
    • …