    Human Motion Trajectory Prediction: A Survey

    Full text link
    With growing numbers of intelligent autonomous systems in human environments, the ability of such systems to perceive, understand and anticipate human behavior becomes increasingly important. Specifically, predicting future positions of dynamic agents and planning considering such predictions are key tasks for self-driving vehicles, service robots and advanced surveillance systems. This paper provides a survey of human motion trajectory prediction. We review, analyze and structure a large selection of work from different communities and propose a taxonomy that categorizes existing methods based on the motion modeling approach and level of contextual information used. We provide an overview of the existing datasets and performance metrics. We discuss limitations of the state of the art and outline directions for further research. Comment: Submitted to the International Journal of Robotics Research (IJRR), 37 pages.
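
    The survey's taxonomy spans physics-based and learning-based predictors; the simplest physics-based member of that family is constant-velocity extrapolation. The sketch below is a generic Python illustration of such a baseline, not code from the survey; the function name, the 0.4 s sampling period, and the 12-step horizon are assumptions chosen for the example.

```python
import numpy as np

def constant_velocity_predict(track, dt=0.4, horizon=12):
    """Minimal constant-velocity baseline for trajectory prediction.

    track:   (T, 2) array of observed x/y positions, oldest first
    dt:      sampling period in seconds (assumed here; 0.4 s is common in
             pedestrian benchmarks)
    horizon: number of future steps to extrapolate
    Returns a (horizon, 2) array of predicted positions.
    """
    track = np.asarray(track, dtype=float)
    # Estimate velocity from the last two observations.
    velocity = (track[-1] - track[-2]) / dt
    steps = np.arange(1, horizon + 1)[:, None]      # (horizon, 1)
    return track[-1] + steps * velocity * dt        # broadcasted extrapolation

# Example: an agent walking along x at 1 m/s
observed = np.array([[0.0, 0.0], [0.4, 0.0], [0.8, 0.0]])
print(constant_velocity_predict(observed)[:3])      # [[1.2 0.] [1.6 0.] [2. 0.]]
```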

    IMPLEMENTATION OF A LOCALIZATION-ORIENTED HRI FOR WALKING ROBOTS IN THE ROBOCUP ENVIRONMENT

    Get PDF
    This paper presents the design and implementation of a human–robot interface capable of evaluating robot localization performance and maintaining full control of robot behaviors in the RoboCup domain. The system consists of legged robots, behavior modules, an overhead visual tracking system, and a graphical user interface. A human–robot communication framework is designed for executing cooperative and competitive processing tasks between users and robots, built on an object-oriented, modularized software architecture for operability and functionality. Some experimental results are presented to show the performance of the proposed system based on simulated and real-time information.
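
    The core evaluation task described here, comparing a robot's pose estimate against ground truth from the overhead tracker, can be illustrated with a small helper like the one below. This is a generic sketch, not code from the paper; the function name and pose representation are assumptions.

```python
import math

def localization_error(estimated, ground_truth):
    """Compare a robot's pose estimate against overhead-tracker ground truth.

    Both poses are (x, y, theta) tuples, positions in metres and heading in
    radians. Returns (position_error_m, heading_error_rad).
    """
    ex, ey, et = estimated
    gx, gy, gt = ground_truth
    position_error = math.hypot(gx - ex, gy - ey)
    # Wrap the heading difference into (-pi, pi] before taking its magnitude.
    heading_error = abs(math.atan2(math.sin(gt - et), math.cos(gt - et)))
    return position_error, heading_error

# Example: estimate is 10 cm off and rotated by about 5 degrees
print(localization_error((1.0, 2.0, 0.00), (1.1, 2.0, 0.09)))
```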

    Special issue on smart interactions in cyber-physical systems: Humans, agents, robots, machines, and sensors

    Get PDF
    In recent years, there has been increasing interaction between humans and non-human systems as we move beyond the industrial age and the information age into the fourth-generation society. The distinction between human and non-human capabilities has become more difficult to discern. Given this, cyber-physical systems (CPSs) are rapidly being integrated with human functionality, and humans have become increasingly dependent on CPSs to perform their daily routines. A future in which humans and non-human CPSs consistently interact and allow each other to navigate a set of non-trivial goals is an interesting and rich area of research, discovery, and practical work. The evidence of convergence has rapidly gained clarity, demonstrating that we can use complex combinations of sensors, artificial intelligence, and data to augment human life and knowledge. To expand the knowledge in this area, we should explain how to model, design, validate, implement, and experiment with these complex systems of interaction, communication, and networking, which will be developed and explored in this special issue. This special issue will include ideas of the future that are relevant for understanding, discerning, and developing the relationship between humans and non-human CPSs, as well as the practical nature of systems that facilitate the integration between humans, agents, robots, machines, and sensors (HARMS).
    Affiliations: Kim, Donghan (Kyung Hee University); Rodriguez, Sebastian Alberto (Universidad Tecnológica Nacional, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Conicet - Tucumán, Argentina); Matson, Eric T. (Purdue University, United States); Kim, Gerard Jounghyun (Korea University).

    Real-time Perceptive Motion Control using Control Barrier Functions with Analytical Smoothing for Six-Wheeled-Telescopic-Legged Robot Tachyon 3

    Full text link
    To achieve safe legged locomotion, it is important to generate motion in real time while considering various constraints of the robot and the environment. In this study, we propose a lightweight real-time perceptive motion control system for the newly developed six-wheeled-telescopic-legged robot, Tachyon 3. In the proposed method, analytically smoothed constraints, including the Smooth Separating Axis Theorem (Smooth SAT) as a novel higher-order differentiable collision detection method for 3D shapes, are applied to a Control Barrier Function (CBF). The proposed system integrating the CBF achieves online motion generation in a short control cycle of 1 ms while satisfying joint limits, environmental collision avoidance, and safe convex foothold constraints. The efficiency of Smooth SAT is demonstrated by a collision detection time of 1 µs or less and a CBF constraint computation time for Tachyon 3 of several µs. Furthermore, the effectiveness of the proposed system is verified through stair-climbing motion with online recognition, both in simulation and on a real machine. Comment: 8 pages, 8 figures. This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
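
    The CBF-based constraint enforcement described in the abstract can be illustrated, in a much simplified setting, by the standard safety filter for single-integrator dynamics: given a barrier h(x) >= 0, the commanded velocity is minimally modified so that dh/dt >= -alpha*h holds. The sketch below is a generic illustration under those assumptions, not the Tachyon 3 controller; it uses a closed-form projection for a single constraint instead of the QP and Smooth SAT collision constraints described in the paper, and all names and values are illustrative.

```python
import numpy as np

def cbf_filter(u_nom, h, grad_h, alpha=5.0):
    """Minimally modify a nominal velocity command so that one control
    barrier function constraint  dh/dt >= -alpha * h  is satisfied.

    Assumes single-integrator dynamics (x_dot = u), so dh/dt = grad_h . u.
    For a single affine constraint, the usual CBF-QP reduces to the
    closed-form half-space projection used below.

    u_nom:  nominal command, shape (n,)
    h:      barrier value at the current state (h >= 0 means safe)
    grad_h: gradient of h at the current state, shape (n,)
    """
    u_nom = np.asarray(u_nom, dtype=float)
    grad_h = np.asarray(grad_h, dtype=float)
    slack = grad_h @ u_nom + alpha * h        # constraint residual
    if slack >= 0.0:
        return u_nom                          # nominal command is already safe
    # Project onto the half-space {u : grad_h . u >= -alpha * h}.
    return u_nom - (slack / (grad_h @ grad_h)) * grad_h

# Example: circular obstacle of radius 1 at the origin, h(x) = |x|^2 - 1
x = np.array([1.2, 0.0])
h = x @ x - 1.0
grad_h = 2.0 * x
print(cbf_filter(u_nom=np.array([-1.0, 0.0]), h=h, grad_h=grad_h))
```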