1,157 research outputs found

    A framework for safe human-humanoid coexistence

    This work is focused on the development of a safety framework for human-humanoid coexistence, with emphasis on humanoid locomotion. After a brief introduction to the fundamental concepts of humanoid locomotion, the two most common approaches to gait generation are presented and extended with a stability condition that guarantees the boundedness of the generated trajectories. The safety framework is then presented, with the introduction of different safety behaviors meant to enhance the overall level of safety during any robot operation: proactive behaviors enhance or adapt the current robot operation to reduce the risk of danger, while override behaviors stop the current robot activity in order to take action against a particularly dangerous situation. A state machine is defined to control the transitions between the behaviors. The behaviors strictly related to locomotion are subsequently detailed, and an implementation is proposed and validated. A possible implementation of the remaining behaviors is outlined through a review of related works found in the literature.
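
    The arbitration between proactive and override behaviors can be pictured as a small finite-state machine. The sketch below is a minimal, hypothetical Python illustration (the state names, distance thresholds, and triggering signal are assumptions, not the thesis's actual design), showing how a proactive state adapts the current operation while an override state preempts it until explicitly cleared.

```python
from enum import Enum, auto

class SafetyState(Enum):
    NORMAL = auto()     # nominal operation, no human in the vicinity
    PROACTIVE = auto()  # adapt the current operation (e.g. slow the gait)
    OVERRIDE = auto()   # abort the current activity and react to the danger

class SafetyStateMachine:
    """Hypothetical arbiter between proactive and override safety behaviors."""

    def __init__(self, caution_dist=2.0, danger_dist=0.5):
        self.state = SafetyState.NORMAL
        self.caution_dist = caution_dist  # [m] start adapting the operation
        self.danger_dist = danger_dist    # [m] trigger the override behavior

    def update(self, human_distance):
        """Transition based on the distance to the closest detected human."""
        if human_distance < self.danger_dist:
            self.state = SafetyState.OVERRIDE
        elif self.state != SafetyState.OVERRIDE:
            # an override is latched and must be cleared via resume()
            self.state = (SafetyState.PROACTIVE
                          if human_distance < self.caution_dist
                          else SafetyState.NORMAL)
        return self.state

    def resume(self):
        """Clear the override once the dangerous situation is resolved."""
        self.state = SafetyState.NORMAL
```

    In this sketch the override state is latched, so the robot only returns to normal operation after an explicit acknowledgment.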

    Collaborative Control: A Robot-Centric Model for Vehicle Teleoperation


    Socially-Aware Navigation Planner Using Models of Human-Human Interaction

    A real-time socially-aware navigation planner helps a mobile robot navigate alongside humans in a socially acceptable manner. The planner is a modification of the nav_core package of the Robot Operating System (ROS), based upon earlier work and further modified to use only egocentric sensors. It can be used to provide navigation that is both safe and socially appropriate. Primitive features, including the interpersonal distance between the robot and an interaction partner and features of the environment (such as hallways detected in real time), are used to reason about the current state of an interaction. Gaussian Mixture Models (GMMs) are trained over these features from human-human demonstrations of various interaction scenarios. The model is used both to discriminate between different human actions related to navigation behavior and to assign a social-appropriateness score to candidate trajectories during trajectory selection. This thesis presents a model-based framework for navigation planning and a simulation-based evaluation of the resulting navigation behavior.
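
    As a rough illustration of the GMM scoring idea only (the feature set, number of mixture components, and scoring rule below are assumptions, not the planner's actual implementation), a scikit-learn Gaussian mixture fitted to demonstration features can provide a social-appropriateness score as the mean log-likelihood of a candidate trajectory's per-step features:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Each row: features observed in human-human demonstrations, e.g.
# [interpersonal distance, relative heading, hallway width] (illustrative only)
demo_features = np.random.rand(500, 3)  # placeholder demonstration data

gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
gmm.fit(demo_features)

def social_score(trajectory_features):
    """Mean log-likelihood of a trajectory's per-step features under the
    model learned from human-human demonstrations (higher = more human-like)."""
    return gmm.score_samples(trajectory_features).mean()

# choose the most socially appropriate of several candidate trajectories
candidates = [np.random.rand(20, 3) for _ in range(10)]  # placeholder rollouts
best_trajectory = max(candidates, key=social_score)
```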

    Sticky Hands


    A novel framework to improve motion planning of robotic systems through semantic knowledge-based reasoning

    The need to improve motion planning techniques for manipulator robots, together with new effective strategies for manipulating different objects to perform more complex tasks, is crucial for various real-world applications where robots cooperate with humans. This paper proposes a novel framework that aims to improve the motion planning of a robotic agent (a manipulator robot) through semantic knowledge-based reasoning. The Semantic Web Rule Language (SWRL) was used to infer new knowledge based on the known environment and the robotic system. Ontological knowledge, e.g., semantic maps, was generated through a deep neural network trained to detect and classify objects in the environment where the robotic agent operates. Manipulation constraints were deduced, and the environment corresponding to the agent's manipulation workspace was created so that the planner could interpret it and generate a collision-free path. Different SPARQL queries were used to reason with the ontology. The proposed framework was implemented and validated in a real experimental setup, using the ROSPlan planning framework to perform the planning tasks. It proved to be a promising strategy for improving the motion planning of robotic systems, showing the benefits of artificial intelligence for knowledge representation and reasoning in robotics.
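
    As a sketch of the kind of ontology query involved (the vocabulary, file name, and property names below are hypothetical, not the paper's actual ontology), an rdflib SPARQL query could retrieve the detected objects and their poses so that the planner can populate its manipulation workspace:

```python
from rdflib import Graph

# Hypothetical semantic map produced by the object-detection pipeline.
g = Graph()
g.parse("semantic_map.owl", format="xml")

# Illustrative vocabulary only: these classes and properties are not from the paper.
query = """
PREFIX ws: <http://example.org/workspace#>
SELECT ?obj ?x ?y ?z WHERE {
    ?obj a ws:DetectedObject ;
         ws:hasPositionX ?x ;
         ws:hasPositionY ?y ;
         ws:hasPositionZ ?z .
}
"""

for obj, x, y, z in g.query(query):
    # each result becomes an obstacle or target the motion planner must handle
    print(f"{obj}: ({float(x):.2f}, {float(y):.2f}, {float(z):.2f})")
```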

    Human aware robot navigation

    Human-aware robot navigation refers to the navigation of a robot in an environment shared with humans in such a way that the humans feel comfortable and natural with the presence of the robot, and the robot's navigation complies with the social norms of the environment. The robot can interact with humans in the environment, for example by avoiding them, approaching them, or following them. In this thesis we focus specifically on the approach behavior of the robot, while keeping the other use cases in mind. Studying and analyzing how humans move around other humans gives an idea of the kind of navigation behaviors we expect robots to exhibit. Most previous research does not focus on understanding such behavioral aspects of approaching people, and straightforward mathematical modeling of complex human behaviors is very difficult. In this thesis we therefore propose an Inverse Reinforcement Learning (IRL) framework based on Guided Cost Learning (GCL) to learn these behaviors from demonstration. After analyzing the CongreG8 dataset, we found that the incoming human tends to form an O-space (circle) with the rest of the group, and that the approach velocity slows down as the approaching human gets closer to the group. We utilized these findings in our framework, which can learn the optimal reward and policy from example demonstrations and imitate similar human motion.
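
    A heavily simplified sketch of the guided-cost-learning idea (the network, features, optimizer, and the simplified partition-function estimate below are assumptions, not the thesis's implementation): a cost network is trained so that expert approach demonstrations receive low cost relative to trajectories sampled from the current policy, and the policy is then improved against the learned cost.

```python
import torch
import torch.nn as nn

class CostNet(nn.Module):
    """Maps per-step features (e.g. distance to the group, approach speed)
    to a scalar cost; human-like approach behavior should receive lower cost."""
    def __init__(self, n_features=4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                 nn.Linear(32, 1))

    def forward(self, states):
        return self.net(states).squeeze(-1)

cost_net = CostNet()
opt = torch.optim.Adam(cost_net.parameters(), lr=1e-3)

def ioc_step(expert_states, sampled_states):
    """One inverse-optimal-control update: demonstrations pull the cost down,
    sampled states estimate the partition function (uniform weights here;
    full GCL uses importance weights from the sampling policy)."""
    demo_cost = cost_net(expert_states).mean()
    n = sampled_states.shape[0]
    log_z = torch.logsumexp(-cost_net(sampled_states), dim=0) - torch.log(
        torch.tensor(float(n)))
    loss = demo_cost + log_z
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# placeholder batches of per-step features from demos and policy rollouts
expert = torch.randn(64, 4)
sampled = torch.randn(64, 4)
for _ in range(200):
    ioc_step(expert, sampled)
```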

    Safe navigation and human-robot interaction in assistant robotic applications

    The abstract is provided in the attachment.

    The development of a human-robot interface for industrial collaborative system

    Industrial robots have been identified as one of the most effective solutions for optimising output and quality in many industries. However, a number of manufacturing applications involve complex tasks and inconstant components, which prohibits fully automated solutions for the foreseeable future. Breakthroughs in robotic technologies and changes in safety legislation have supported the creation of robots that coexist with and assist humans in industrial applications. It has been broadly recognised that human-robot collaborative systems would be a realistic solution for an advanced production system with a wide range of applications and high economic impact. This type of system can utilise the best of both worlds: the robot performs simple tasks that require high repeatability, while the human performs tasks that require judgement and the dexterity of the human hands. Robots in such a system operate as "intelligent assistants". In a collaborative working environment, the robot and the human share the same working area and interact with each other, and this level of interface requires effective ways of communicating and collaborating to avoid unwanted conflicts. This project aims to create a user interface for an industrial collaborative robot system through the integration of current robotic technologies. The robotic system is designed for seamless collaboration with a human in close proximity; it can communicate with the human via the exchange of gestures as well as visual signals which operators can observe and comprehend at a glance. The main objective of this PhD is to develop a Human-Robot Interface (HRI) for communication with an industrial collaborative robot during collaboration in close proximity. The system is developed in conjunction with a small-scale collaborative robot system integrated from off-the-shelf components, and it should be capable of receiving input from the human user via an intuitive method as well as indicating its status to the user effectively. The HRI is developed using a combination of hardware integration and software development, with the software and control framework designed to be applicable to other industrial robots in the future. The developed gesture command system is demonstrated on a heavy-duty industrial robot.
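
    Purely as an illustration of how a gesture command interface of this kind might dispatch recognized gestures to robot actions (the gesture labels, command names, and the robot stub below are hypothetical and not taken from the thesis):

```python
# Hypothetical gesture-to-command table; labels and actions are illustrative only.
GESTURE_COMMANDS = {
    "open_palm": "pause_motion",    # operator asks the robot to hold
    "thumbs_up": "resume_task",     # operator confirms it is safe to continue
    "wave":      "return_to_home",  # end the current collaborative step
}

class RobotStub:
    """Minimal stand-in for the collaborative robot and its status display."""
    def show_status(self, msg): print(f"[status display] {msg}")
    def pause_motion(self): print("robot paused")
    def resume_task(self): print("robot resumed")
    def return_to_home(self): print("robot returning to home position")

def dispatch(gesture_label, robot):
    """Translate a recognized gesture into a robot command, acknowledging it
    on the visual status display before acting."""
    command = GESTURE_COMMANDS.get(gesture_label)
    if command is None:
        robot.show_status("unknown gesture, ignoring")
        return
    robot.show_status(command)   # visual feedback the operator sees at a glance
    getattr(robot, command)()    # invoke the corresponding robot action

dispatch("open_palm", RobotStub())
```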