2 research outputs found

    SIGVerse: A cloud-based VR platform for research on social and embodied human-robot interaction

    Common sense and social interaction skills grounded in daily-life environments are essential for autonomous robots that support human activities. One practical approach to acquiring such social interaction skills and commonsense semantic knowledge of human activity is to apply recent machine learning techniques. Although these techniques have succeeded in automating manipulation and driving tasks, they are difficult to use in applications that require human-robot interaction experience: humans must repeatedly demonstrate embodied and social interaction behaviors to robots or learning systems over a long period. To address this problem, we propose a cloud-based immersive virtual reality (VR) platform that enables virtual human-robot interaction for collecting social and embodied knowledge of human activities across a variety of situations. To make the system flexible and reusable, we develop a real-time bridging mechanism between ROS and Unity, one of the standard platforms for developing VR applications. We apply the proposed system to the RoboCup@Home robot competition to confirm its feasibility in a realistic human-robot interaction scenario. Through demonstration experiments at the competition, we show the usefulness and potential of the system for developing and evaluating social intelligence through human-robot interaction. The proposed VR platform enables robot systems to collect social experiences with many users in a short time. The platform also provides a dataset of social behaviors, which would be a key resource for intelligent service robots acquiring social interaction skills through machine learning. Comment: 16 pages. Under review in Frontiers in Robotics and AI
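    The abstract does not specify the wire format of SIGVerse's ROS-Unity bridge. As a minimal sketch of the general idea, the snippet below uses the standard rosbridge WebSocket protocol via the roslibpy library on the ROS side; the topic name and message handling are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: consuming avatar poses streamed from a Unity VR scene
# into ROS over the rosbridge WebSocket protocol (roslibpy). SIGVerse's actual
# bridge is its own ROS<->Unity transport; the topic name here is made up.
import time
import roslibpy

client = roslibpy.Ros(host='localhost', port=9090)  # rosbridge_server endpoint
client.run()  # connect and wait until the connection is established

# Topic the Unity side would publish VR hand poses to (hypothetical name).
pose_topic = roslibpy.Topic(client, '/vr/avatar/hand_pose',
                            'geometry_msgs/PoseStamped')

def on_pose(message):
    # A robot-side controller would consume the operator's hand pose here.
    pos = message['pose']['position']
    print('hand at ({x:.2f}, {y:.2f}, {z:.2f})'.format(**pos))

pose_topic.subscribe(on_pose)

try:
    while client.is_connected:
        time.sleep(0.1)  # keep the process alive while messages arrive
finally:
    pose_topic.unsubscribe()
    client.terminate()
```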

    Analysis of the use of behavioral data from virtual reality for calibration of agent-based evacuation models

    Agent-based evacuation modeling is an effective tool for predicting evacuation aspects of buildings such as evacuation times, congestion, and maximum safe building capacity. Collecting real behavioral data for calibrating agent-based evacuation models (ABEMs) is time-consuming, costly, and impossible for buildings still in the design phase, where predictions about evacuation behavior are needed most. In recent years, evacuation experiments conducted in virtual reality (VR) have frequently been proposed in the literature as an effective tool for collecting data on human behavior. However, empirical studies assessing the validity of VR-based data for such purposes remain rare and are particularly lacking in the agent-based evacuation modeling domain. This study explores the opportunities that VR behavioral data may offer for refining the outputs of ABEMs. To this end, the study employed multiple input settings of ABEMs, including settings based on data gathered from a VR evacuation experiment that mapped the evacuation behaviors of individuals within the building. Calibration and evaluation of the models were based on empirical data gathered from an original evacuation exercise conducted in a real building (N=35) and its virtual twin (N=38). The study found that single-agent models using data collected in the VR environment, after the proposed corrections, have the potential to better predict real-world evacuation behavior while offering the variance in data outputs necessary for practical applications.
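    The abstract does not describe the calibration procedure itself. The sketch below illustrates the general idea under stated assumptions: a toy stand-in for an ABEM with a single tunable parameter (mean walking speed), fitted so that simulated evacuation times match observed times. The simulator, data values, and parameter range are all illustrative, not the authors' actual model.

```python
# Hypothetical sketch of ABEM calibration: tune one parameter (mean walking
# speed) so simulated total evacuation times match times observed in a VR
# experiment. The model is a toy stand-in; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
observed_times = np.array([41.0, 45.5, 39.2, 47.8, 43.1])  # seconds, mock "VR" data

def simulate_evacuation(mean_speed, n_agents=38, route_length=55.0):
    """Toy ABEM: each agent walks a fixed-length route at a sampled speed."""
    speeds = rng.normal(mean_speed, 0.2, n_agents).clip(0.3, None)
    # The building is clear once the slowest agent reaches the exit.
    return (route_length / speeds).max()

def loss(mean_speed, n_runs=50):
    # Squared gap between mean simulated and mean observed evacuation time.
    sims = [simulate_evacuation(mean_speed) for _ in range(n_runs)]
    return (np.mean(sims) - observed_times.mean()) ** 2

# Coarse grid search over candidate speeds (m/s); a real study would use a
# proper optimizer and calibrate several behavioral parameters jointly.
candidates = np.linspace(0.8, 2.0, 25)
best = min(candidates, key=loss)
print(f'calibrated mean walking speed: {best:.2f} m/s')
```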