
    Integrating Human Inputs with Autonomous Behaviors on an Intelligent Wheelchair Platform

    Researchers have developed and assessed a computer-controlled wheelchair called the Smart Chair. A shared control framework offers different levels of autonomy, allowing the human operator complete control of the chair at each level while ensuring the user's safety. The semiautonomous system incorporates deliberative motion plans or controllers, reactive behaviors, and human user inputs. At every instant in time, control inputs from these three sources are integrated continuously to provide a safe trajectory to the destination. Experiments with 50 participants demonstrate quantitatively and qualitatively the benefits of human-robot augmentation in three modes of operation: manual, autonomous, and semiautonomous. This article is part of a special issue on Interacting with Autonomy.
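The continuous integration of three control sources described above can be sketched as a weighted blend with a safety override. This is a minimal illustrative sketch, not the Smart Chair's actual algorithm; the weights, the `(linear, angular)` command representation, and the stop-distance rule are all assumptions.

```python
# Hypothetical sketch of blending three control sources into one velocity
# command, as in shared-control wheelchair systems. Weights and the safety
# override rule are illustrative assumptions.

def blend_commands(planner_cmd, reactive_cmd, human_cmd,
                   w_planner=0.3, w_reactive=0.3, w_human=0.4):
    """Weighted blend of (linear, angular) velocity commands."""
    v = (w_planner * planner_cmd[0] + w_reactive * reactive_cmd[0]
         + w_human * human_cmd[0])
    w = (w_planner * planner_cmd[1] + w_reactive * reactive_cmd[1]
         + w_human * human_cmd[1])
    return (v, w)

def safe_command(blended, obstacle_distance_m, stop_distance_m=0.5):
    """Zero the forward velocity when an obstacle is too close."""
    v, w = blended
    if obstacle_distance_m < stop_distance_m and v > 0:
        v = 0.0  # safety override: suppress forward motion
    return (v, w)
```

Raising `w_human` toward 1.0 would correspond to the manual end of the autonomy spectrum; lowering it toward 0.0, to the autonomous end.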

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotics systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interactions and collaborations; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in the field of AR and robotics research, offering insights into the recent state of the art and prospects for improvement.

    Human robot interactions in care applications


    Ultrasonic-Based Environmental Perception for Mobile 5G-Oriented XR Applications

    One of the sectors expected to benefit significantly from 5G network deployment is eXtended Reality (XR). Besides the very high bandwidth, reliability, and Quality of Service (QoS) to be delivered to end users, XR also requires accurate environmental perception for safety reasons: this is fundamental when a user, wearing XR equipment, is immersed in a "virtual" world but moves in a "real" environment. To provide this perception (especially when using low-cost XR equipment, such as cardboard viewers worn by the end user), it is possible to exploit the capabilities offered by Internet of Things (IoT) nodes with sensing/actuating functions. In this paper, we rely on ultrasonic sensor-based IoT systems to perceive the surrounding environment and to provide "side information" to XR systems; we then perform a preliminary experimental characterization campaign with different ultrasonic IoT system configurations worn by the end user. The combination of the information flows associated with the XR and IoT components is enabled by 5G technology. An illustrative experimental scenario, relative to a "Tourism 4.0" IoT-aided VR application deployed by Vodafone in Milan, Italy, is presented.
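The ultrasonic "side information" above reduces to a simple time-of-flight calculation: the sensor emits a pulse, measures the echo's round-trip time, and converts it to distance via the speed of sound. A minimal sketch, assuming dry air at roughly 20 °C and an illustrative warning threshold (neither figure is from the paper):

```python
# Hypothetical sketch of ultrasonic ranging for obstacle warnings.
# 343 m/s assumes dry air at ~20 °C; the threshold is an illustrative choice.

SPEED_OF_SOUND_M_S = 343.0

def echo_to_distance(round_trip_time_s):
    """Distance to the nearest obstacle; halve the round trip."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

def obstacle_warning(round_trip_time_s, threshold_m=1.0):
    """True when the obstacle is closer than the warning threshold."""
    return echo_to_distance(round_trip_time_s) < threshold_m
```

In an IoT-aided XR deployment, such a warning flag would be streamed alongside the XR content so the headset can alert a user who is approaching a real obstacle.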

    Robot guidance using machine vision techniques in industrial environments: A comparative review

    In the factory of the future, most operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, and to complement the information provided by other sensors to improve their positioning accuracy. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight, and laser triangulation, among others, are widely used for inspection and quality control processes in industry and now for robot guidance. The choice of vision system depends strongly on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time, and environmental influences. Researchers and developers can take it as background information for their future work.
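To make one of the surveyed techniques concrete, stereo vision recovers depth from the disparity between matched pixels in a rectified camera pair via Z = f · B / d. A minimal sketch; the focal length and baseline below are illustrative values, not parameters from the review:

```python
# Hypothetical sketch of stereo depth recovery for a rectified camera pair.
# focal_length_px and baseline_m are illustrative assumptions.

def stereo_depth(disparity_px, focal_length_px=800.0, baseline_m=0.1):
    """Depth Z = f * B / d; larger disparity means a closer point."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px
```

The inverse relationship between disparity and depth is one reason stereo accuracy degrades with range, a trade-off the review's accuracy/range comparison captures.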