
    Human aware robot navigation

    Get PDF
    Abstract. Human-aware robot navigation refers to the navigation of a robot in an environment shared with humans in such a way that the humans feel comfortable and at ease with the robot's presence. In addition, the robot's navigation should comply with the social norms of the environment. The robot can interact with humans in the environment, for example by avoiding them, approaching them, or following them. In this thesis, we focus specifically on the approach behavior of the robot, while keeping the other use cases in mind. Studying and analyzing how humans move around other humans indicates the kind of navigation behaviors we expect robots to exhibit. Most previous research does not focus on understanding such behavioral aspects of approaching people, and a straightforward mathematical model of complex human behaviors is very difficult to formulate. In this thesis, we therefore propose an Inverse Reinforcement Learning (IRL) framework based on Guided Cost Learning (GCL) to learn these behaviors from demonstration. After analyzing the CongreG8 dataset, we found that the incoming human tends to form an O-space (circle) with the rest of the group, and that the approaching velocity slows down as the approaching human gets closer to the group. We utilize these findings in our framework, which learns the optimal reward and policy from example demonstrations and imitates similar human motion.
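    As a rough illustration of the learning-from-demonstration idea, the sketch below shows the guided cost learning objective: a neural cost network is trained so that demonstration trajectories receive low cost while sampled trajectories give an importance-weighted estimate of the partition function. This is an illustrative sketch, not the thesis implementation; the trajectory features and shapes are assumptions.

```python
# Minimal sketch of the Guided Cost Learning (GCL) cost update (Finn et al., 2016).
# Hypothetical trajectory features (e.g. distance to the group's O-space centre,
# current approach speed) stand in for the thesis's actual feature design.
import torch
import torch.nn as nn

class CostNet(nn.Module):
    def __init__(self, feat_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, traj_feats: torch.Tensor) -> torch.Tensor:
        # traj_feats: (batch, timesteps, feat_dim); per-step costs summed over time
        return self.net(traj_feats).sum(dim=(1, 2))

def gcl_cost_loss(cost_net, demo_feats, sample_feats, sample_log_q):
    """Importance-weighted IOC loss: demonstrations should be cheap, samples estimate log Z."""
    demo_cost = cost_net(demo_feats).mean()
    sample_cost = cost_net(sample_feats)
    n = sample_feats.shape[0]
    log_z = torch.logsumexp(-sample_cost - sample_log_q, dim=0) - torch.log(torch.tensor(float(n)))
    return demo_cost + log_z

# Usage with placeholder data: 16 demonstrations, 32 policy samples, 50 steps, 4 features
cost_net = CostNet(feat_dim=4)
demos = torch.randn(16, 50, 4)
samples = torch.randn(32, 50, 4)
log_q = torch.zeros(32)            # log-probabilities of samples under the sampling policy
loss = gcl_cost_loss(cost_net, demos, samples, log_q)
loss.backward()
```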

    Investigating Navigation Strategies in the Morris Water Maze through Deep Reinforcement Learning

    Full text link
    Navigation is a complex skill with a long history of research in animals and humans. In this work, we simulate the Morris Water Maze in 2D to train deep reinforcement learning agents. We perform automatic classification of navigation strategies, analyze the distribution of strategies used by artificial agents, and compare them with experimental data to show learning dynamics similar to those seen in humans and rodents. We develop environment-specific auxiliary tasks and examine factors affecting their usefulness. We suggest that the most beneficial tasks are potentially more biologically feasible for real agents to use. Lastly, we explore the development of internal representations in the activations of artificial agent neural networks. These representations resemble place cells and head-direction cells found in mouse brains, and their presence correlates with the navigation strategies that artificial agents employ. Comment: 26 pages, 15 figures.
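    As a sketch of the simulated task (my own assumptions, not the paper's environment code), a 2D Morris Water Maze for reinforcement learning can be reduced to a circular pool, a hidden platform, and a fixed swim speed:

```python
# Minimal 2D Morris Water Maze sketch: circular pool, hidden platform, fixed swim speed.
import numpy as np

class WaterMaze2D:
    def __init__(self, radius=1.0, platform=(0.5, 0.0), platform_r=0.1, max_steps=200):
        self.radius = radius
        self.platform = np.array(platform, dtype=float)
        self.platform_r = platform_r
        self.max_steps = max_steps

    def reset(self):
        # start near the pool wall at a random angle, as in the behavioural protocol
        angle = np.random.uniform(0.0, 2.0 * np.pi)
        self.pos = 0.95 * self.radius * np.array([np.cos(angle), np.sin(angle)])
        self.t = 0
        return self.pos.copy()

    def step(self, action):
        # action: desired 2D heading; the agent swims at a constant speed
        heading = np.asarray(action, dtype=float)
        heading /= np.linalg.norm(heading) + 1e-8
        self.pos = self.pos + 0.05 * heading
        dist = np.linalg.norm(self.pos)
        if dist > self.radius:                      # stay inside the pool
            self.pos *= self.radius / dist
        self.t += 1
        reached = np.linalg.norm(self.pos - self.platform) < self.platform_r
        reward = 1.0 if reached else -0.01          # small time penalty favours direct paths
        done = reached or self.t >= self.max_steps
        return self.pos.copy(), reward, done
```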

    Socially Compliant Navigation Dataset (SCAND): A Large-Scale Dataset of Demonstrations for Social Navigation

    Full text link
    Social navigation is the capability of an autonomous agent, such as a robot, to navigate in a 'socially compliant' manner in the presence of other intelligent agents such as humans. With the emergence of autonomously navigating mobile robots in human-populated environments (e.g., domestic service robots in homes and restaurants and food delivery robots on public sidewalks), incorporating socially compliant navigation behaviors on these robots becomes critical to ensuring safe and comfortable human-robot coexistence. To address this challenge, imitation learning is a promising framework, since it is easier for humans to demonstrate the task of social navigation than to formulate reward functions that accurately capture the complex multi-objective setting of social navigation. The application of imitation learning and inverse reinforcement learning to social navigation for mobile robots, however, is currently hindered by a lack of large-scale datasets that capture socially compliant robot navigation demonstrations in the wild. To fill this gap, we introduce the Socially CompliAnt Navigation Dataset (SCAND), a large-scale, first-person-view dataset of socially compliant navigation demonstrations. Our dataset contains 8.7 hours, 138 trajectories, and 25 miles of socially compliant, human-teleoperated driving demonstrations comprising multi-modal data streams, including 3D lidar, joystick commands, odometry, and visual and inertial information, collected on two morphologically different mobile robots (a Boston Dynamics Spot and a Clearpath Jackal) by four different human demonstrators in both indoor and outdoor environments. We additionally perform preliminary analysis and validation through real-world robot experiments and show that navigation policies learned by imitation learning on SCAND generate socially compliant behavior.
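    As a hedged sketch of how such demonstrations can be used (not the SCAND baseline code; observation and action dimensions are illustrative assumptions), a behavior-cloning loop regresses the recorded joystick velocity commands from lidar and odometry features:

```python
# Minimal behaviour-cloning sketch: regress teleoperated velocity commands
# from lidar + odometry features. Tensor shapes are illustrative assumptions.
import torch
import torch.nn as nn

policy = nn.Sequential(
    nn.Linear(360 + 3, 256), nn.ReLU(),   # 360 lidar ranges + (x, y, yaw) odometry
    nn.Linear(256, 2),                    # (linear, angular) velocity command
)
optimiser = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

obs = torch.randn(64, 363)    # a batch of demonstration observations
cmd = torch.randn(64, 2)      # the corresponding joystick commands
optimiser.zero_grad()
loss = loss_fn(policy(obs), cmd)
loss.backward()
optimiser.step()
```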

    Incorporating perception uncertainty in human-aware navigation: A comparative study

    Get PDF
    In this work, we present a novel approach to human-aware navigation that probabilistically models the uncertainty of perception for a social robotic system and investigates its effect on overall social navigation performance. The model of the social costmap around a person has been extended to consider this uncertainty factor, which has been widely neglected despite playing an important role in situations with noisy perception. A social path planner based on the fast marching method has been augmented to account for the uncertainty in the positions of people. The effectiveness of the proposed approach has been tested in extensive experiments carried out with real robots and in simulation. Real-world experiments have been conducted under noisy perception, in the presence of single and multiple, static and dynamic humans. As the complexity of the environment increases, the proposed approach yields trajectories that keep a more appropriate social distance to people than both the basic navigation approach and a human-aware navigation approach that assumes perfect perception. Accounting for perception uncertainty is also shown to produce smoother trajectories with lower jerk that are more natural from the point of view of humans.
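    One simple way to picture the idea (a sketch under my own assumptions, not the paper's exact formulation) is a Gaussian social cost around each detected person whose footprint is inflated by the perception covariance, so that uncertain detections produce larger keep-out regions:

```python
# Sketch of a person-centred social cost whose footprint grows with perception uncertainty.
import numpy as np

def social_cost(grid_xy, person_mean, person_cov, base_sigma=0.45, amplitude=1.0):
    """grid_xy: (..., 2) map coordinates; returns a cost in [0, amplitude]."""
    # effective covariance = intrinsic personal-space spread + perception covariance
    cov = base_sigma ** 2 * np.eye(2) + person_cov
    inv = np.linalg.inv(cov)
    diff = grid_xy - person_mean
    mahal = np.einsum('...i,ij,...j->...', diff, inv, diff)
    return amplitude * np.exp(-0.5 * mahal)

# Example: a noisy detection (0.2 m^2 positional variance) widens the high-cost region
xs, ys = np.meshgrid(np.linspace(-3, 3, 61), np.linspace(-3, 3, 61))
grid = np.stack([xs, ys], axis=-1)
cost = social_cost(grid, person_mean=np.array([0.0, 0.0]), person_cov=0.2 * np.eye(2))
```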

    Socially aware robot navigation system in human-populated and interactive environments based on an adaptive spatial density function and space affordances

    Get PDF
    Traditionally, robots have been known to society mostly through the wide use of manipulators, which are generally placed in controlled environments such as factories. However, with the advances in the area of mobile robotics, robots are increasingly deployed in social contexts, i.e., in the presence of people. The adoption of socially acceptable behaviours demands a trade-off between social comfort and other metrics of efficiency. For navigation tasks, for example, humans must be differentiated from other ordinary objects in the scene. In this work, we propose a novel human-aware navigation strategy built upon an adaptive spatial density function that efficiently clusters groups of people according to their spatial arrangement. Space affordances are also used to define potential activity spaces, considering the objects in the scene. The proposed function defines regions where navigation is either discouraged or forbidden. To implement socially acceptable navigation, the navigation architecture combines probabilistic roadmap and rapidly-exploring random tree path planners with an adaptation of the elastic band algorithm. Trials carried out in real and simulated environments demonstrate that the use of the clustering algorithm and social rules in the navigation architecture does not hinder navigation performance.
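    As an illustrative stand-in for the adaptive spatial density function (my assumption, not the authors' exact clustering), a density-based clustering of detected people yields groups and a keep-out radius per group:

```python
# DBSCAN as an illustrative stand-in for density-based grouping of people.
import numpy as np
from sklearn.cluster import DBSCAN

people = np.array([[1.0, 2.0], [1.4, 2.1], [1.2, 2.5],   # a conversational group
                   [5.0, 0.5]])                           # a lone pedestrian
labels = DBSCAN(eps=1.2, min_samples=1).fit_predict(people)

for g in np.unique(labels):
    members = people[labels == g]
    centre = members.mean(axis=0)
    radius = np.max(np.linalg.norm(members - centre, axis=1)) + 0.6  # personal-space margin
    print(f"group {g}: centre {centre}, keep-out radius {radius:.2f} m")
```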

    Trust-aware Safe Control for Autonomous Navigation: Estimation of System-to-human Trust for Trust-adaptive Control Barrier Functions

    Full text link
    A trust-aware safe control system for autonomous navigation in the presence of humans, specifically pedestrians, is presented. The system combines model predictive control (MPC) with control barrier functions (CBFs) and trust estimation to ensure safe and reliable navigation in complex environments. Pedestrian trust values are computed from features extracted from camera images, such as mutual eye contact and smartphone usage. These trust values are integrated into the MPC controller's CBF constraints, allowing the autonomous vehicle to make informed decisions that take pedestrian behaviour into account. Simulations conducted in the CARLA driving simulator demonstrate the feasibility and effectiveness of the proposed system, showing more conservative behaviour around inattentive pedestrians and less conservative behaviour around attentive ones. The results highlight the practicality of the system in real-world applications, providing a promising approach to enhance the safety and reliability of autonomous navigation systems, especially self-driving vehicles.
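    A minimal sketch of the trust-adaptive CBF idea (hypothetical parameters, not the paper's implementation): lower pedestrian trust enlarges the required safety distance, and the chosen control input must keep the CBF condition dh/dt + alpha*h >= 0 satisfied.

```python
# Sketch of a trust-adaptive control barrier function for a single pedestrian.
import numpy as np

def safety_distance(trust, d_min=1.0, d_max=3.0):
    """trust in [0, 1]; low trust (inattentive pedestrian) -> large clearance."""
    return d_max - trust * (d_max - d_min)

def cbf_condition(rel_pos, rel_vel, trust, alpha=1.0):
    """h(x) = ||p_rel||^2 - d_safe^2; the controller must keep h_dot + alpha*h >= 0."""
    d_safe = safety_distance(trust)
    h = float(rel_pos @ rel_pos) - d_safe ** 2
    h_dot = 2.0 * float(rel_pos @ rel_vel)
    return h, h_dot + alpha * h

p_rel = np.array([2.5, 0.0])      # pedestrian position relative to the vehicle [m]
v_rel = np.array([-1.0, 0.0])     # relative (closing) velocity [m/s]
h, condition = cbf_condition(p_rel, v_rel, trust=0.2)  # low trust -> larger margin to enforce
```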

    Social Navigation in a Cognitive Architecture Using Dynamic Proxemic Zones

    Get PDF
    Robots have begun to populate the everyday environments of human beings. These social robots must perform their tasks without disturbing the people with whom they share their environment. This paper proposes a navigation algorithm for robots that is acceptable to people. Robots detect the personal areas of humans in order to carry out their tasks, generating navigation routes that have less impact on human activities. The main novelty of this work is that the robot perceives the moods of people to adjust the size of proxemic areas. This work contributes to making the presence of robots in human-populated environments more acceptable. We have integrated this approach into a cognitive architecture designed to perform tasks in human-populated environments. The paper provides quantitative experimental results in two scenarios: a controlled scenario, including social navigation metrics in comparison with a traditional navigation method, and a non-controlled scenario, in robotic competitions where different aspects of social robotics are measured. Funding: Gobierno de España (grant TIN2016-76515-R for the COMBAHO project, supported with FEDER funds) and Comunidad de Madrid (RoboCity2030-DIH-CM, S2018/NMT-4331).
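    The core mechanism can be sketched in a few lines (hypothetical mood scale and radii, not the paper's parameters): a detected negative mood enlarges the proxemic radius that the planner treats as a personal area.

```python
# Mood-dependent proxemic radius: a hypothetical mood score in [-1, 1]
# (-1 distressed, +1 relaxed) scales the personal area the planner must respect.
def proxemic_radius(mood, base_radius=1.2, max_extra=0.8):
    return base_radius + max_extra * (1.0 - mood) / 2.0

print(proxemic_radius(mood=1.0))    # relaxed person    -> 1.2 m
print(proxemic_radius(mood=-1.0))   # distressed person -> 2.0 m
```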

    EEG Correlates of Spatial Navigation in Patients with Right Hippocampal Lesion: A Mobile Brain/Body Imaging (MoBI) Study

    Get PDF
    Spatial navigation is a fundamental cognitive function that consists of different cognitive processes, such as learning and decision making, as well as physical locomotion. In the literature, there is a tendency to focus on the cognitive elements of human spatial navigation, while the presence of the body and embodied agents is neglected. Since, according to embodied cognition theory, sensory and motor systems are integrated into brain mechanisms, incorporating physical movement into navigation research is crucial to investigate the brain dynamics underlying human spatial navigation. Using the Mobile Brain/Body Imaging (MoBI) approach, this study aims to understand electroencephalographic (EEG) activity during spatial navigation in actively moving humans. In the present study, 27 participants (9 patients with a right hippocampal lesion and 18 healthy matched controls) performed a spatial navigation task in a human virtual analogue of the Morris Water Maze. Subjects were tested in both desktop and MoBI setups. In both setups, frontal-midline (FM) theta (4-8 Hz) oscillations were examined with high-density EEG. In the MoBI setup, EEG activity was recorded synchronously with motion capture, and the virtual environment was presented through a head-mounted display. EEG data were analyzed using the event-related desynchronization/synchronization (ERD/ERS) method, and the association between FM theta activity and spatial navigation performance was analyzed. Further, we tested the effect of the study setup on each participant group. By comparing desktop and MoBI setups, the study aims to reveal how the dynamics of a brain with a hippocampal lesion change under action during spatial navigation compared to a healthy brain.
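    For reference, the ERD/ERS measure compares band power in a task epoch against a pre-stimulus baseline, ERD% = (baseline - task) / baseline * 100. A minimal sketch of this computation for the FM theta band (illustrative sampling rate and epoch lengths) is:

```python
# Sketch of ERD% for the frontal-midline theta band (4-8 Hz).
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(signal, fs, lo=4.0, hi=8.0):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype='band')
    return np.mean(filtfilt(b, a, signal) ** 2)

fs = 500                                   # sampling rate [Hz], illustrative
eeg = np.random.randn(10 * fs)             # placeholder trace from a frontal-midline channel
baseline = band_power(eeg[:2 * fs], fs)    # 2 s pre-navigation baseline
task = band_power(eeg[2 * fs:], fs)        # navigation epoch
erd_percent = (baseline - task) / baseline * 100.0   # positive = desynchronisation (ERD)
```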