    Using virtual reality and 3D industrial numerical models for immersive interactive checklists

    At the different stages of the PLM, companies develop numerous checklist-based procedures involving prototype inspection and testing. In addition, techniques from CAD, 3D imaging, animation and virtual reality now form a mature set of tools for industrial applications. The work presented in this article develops a unique framework for immersive checklist-based project reviews that applies to all steps of the PLM. It combines immersive navigation in the checklist, virtual experiments when needed, and multimedia updates of the checklist. It provides a generic tool, independent of the checklist considered, relies on the modular integration of various VR tools and concepts, and uses an original gesture recognition approach. Feasibility experiments are presented, validating the benefits of the approach.
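
    The "generic tool, independent of the checklist considered" idea is easiest to picture as a data model in which any checklist item can carry multimedia notes and an optional virtual experiment. The sketch below is purely illustrative and uses hypothetical names (ChecklistItem, MediaNote); it is not the framework described in the paper.

        # Illustrative sketch only: a checklist-agnostic data model. The names
        # (ChecklistItem, MediaNote, the stub experiment) are hypothetical and
        # are not taken from the paper.
        from dataclasses import dataclass, field
        from typing import Callable, List, Optional

        @dataclass
        class MediaNote:
            kind: str   # e.g. "photo", "audio", "3d-annotation"
            uri: str    # where the captured medium is stored

        @dataclass
        class ChecklistItem:
            label: str
            status: str = "open"                             # "open", "passed" or "failed"
            notes: List[MediaNote] = field(default_factory=list)
            experiment: Optional[Callable[[], bool]] = None  # optional virtual experiment

            def review(self) -> None:
                # Run the attached virtual experiment, if any, and record the outcome.
                if self.experiment is not None:
                    self.status = "passed" if self.experiment() else "failed"

        checklist = [ChecklistItem("Access panel clearance"),
                     ChecklistItem("Handle reachability", experiment=lambda: True)]
        checklist[1].review()   # status becomes "passed"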

    Design of a test environment for planning and interaction with virtual production processes

    The rising complexity of systems, combined with multi-disciplinary development and manufacturing processes, necessitates new approaches to the early validation of intermediate digital process and system prototypes. To develop and test these approaches, the modular digital cube test center was built. Different Visualization Modules such as a Powerwall, a CAVE or a Head Mounted Display allow immersive interaction with the prototypes. Combined with Haptic Interaction Modules, ranging from a one-axis assembly device to a hexapod simulator and a full-freedom kinematic portal, and with simulation modules for vehicle design, multi-kinematics, manufacturing and process simulation, this enables early validation of virtual prototypes in multiple use cases.

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotics systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interactions and collaborations; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in the fields of AR and robotics research, offering insights into the recent state of the art and prospects for improvement.

    Virtual reality for assembly methods prototyping: a review

    Assembly planning and evaluation is an important component of the product design process in which details about how parts of a new product will be put together are formalized. A well-designed assembly process should take into account various factors such as optimum assembly time and sequence, tooling and fixture requirements, ergonomics, operator safety, and accessibility, among others. Existing computer-based tools to support virtual assembly either concentrate solely on representation of the geometry of parts and fixtures and evaluation of clearances and tolerances, or use simulated human mannequins to approximate human interaction in the assembly process. Virtual reality technology has the potential to support integration of natural human motions into the computer-aided assembly planning environment (Ritchie et al. in Proc I MECH E Part B J Eng 213(5):461–474, 1999). This would allow evaluation of an assembler’s ability to manipulate and assemble parts and result in reduced time and cost for product design. This paper provides a review of the research in virtual assembly and categorizes the different approaches. Finally, critical requirements and directions for future research are presented.

    Virtual and Mixed Reality in Telerobotics: A Survey

    An Overview of Self-Adaptive Technologies Within Virtual Reality Training

    This overview presents the current state of the art of self-adaptive technologies within virtual reality (VR) training. Virtual reality training and assessment is increasingly used in five key areas: medical, industrial & commercial training, serious games, rehabilitation, and remote training such as Massive Open Online Courses (MOOCs). Adaptation can be applied to five core technologies of VR, including haptic devices, stereo graphics, adaptive content, assessment and autonomous agents. Automation of VR training can contribute to the automation of actual procedures, including remote and robot-assisted surgery, which reduces injury and improves the accuracy of the procedure. Automated haptic interaction can enable tele-presence and tactile interaction with virtual artefacts from either remote or simulated environments. Automation, machine learning and data-driven features play an important role in providing trainee-specific, individually adaptive training content. Data from trainee assessment can form an input to autonomous systems for customised training and automated difficulty levels that match individual requirements. Self-adaptive technology has previously been developed within individual technologies of VR training. One conclusion of this research is that, although it does not yet exist, an enhanced portable framework is needed: it would be beneficial to combine the automation of these core technologies into a reusable automation framework for VR training.
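
    To make the data-driven adaptation concrete, the sketch below shows one simple way trainee assessment scores could drive an automated difficulty level. The update rule, window size and thresholds are illustrative assumptions, not the mechanism proposed in the cited work.

        # Minimal sketch: recent assessment scores drive an automated difficulty level.
        # The window size, thresholds and step size are illustrative assumptions.
        from collections import deque

        class AdaptiveDifficulty:
            def __init__(self, level: float = 0.5, window: int = 5):
                self.level = level                  # 0.0 (easiest) .. 1.0 (hardest)
                self.scores = deque(maxlen=window)  # recent assessment scores in [0, 1]

            def update(self, score: float) -> float:
                self.scores.append(score)
                mean = sum(self.scores) / len(self.scores)
                # Raise difficulty for strong performance, lower it for weak performance.
                if mean > 0.8:
                    self.level = min(1.0, self.level + 0.1)
                elif mean < 0.5:
                    self.level = max(0.0, self.level - 0.1)
                return self.level

        trainer = AdaptiveDifficulty()
        for score in (0.9, 0.85, 0.95):
            trainer.update(score)   # difficulty drifts upward for a strong trainee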

    The IFMIF-DONES remote handling control system: Experimental setup for OPC UA integration

    The devices used to carry out Remote Handling (RH) manipulation tasks in radiation environments address requirements that are significantly different from those of common robotic and industrial systems, due to the lack of repetitive operations and the incompletely specified control actions. This imposes the need for control with human-in-the-loop operations. These RH systems are used in facilities such as PRIDE, CERN, ESS, ITER or IFMIF-DONES, the reference used for this work. For the RH system it is crucial to provide high availability, robustness against radiation, haptic devices for teleoperation and dexterous operation, and smooth coordination and integration with the centralized control room. To achieve this, it is necessary to find the best approach towards a standard control framework capable of providing a standard set of functionalities, tools, interfaces, communications, and data formats to the different types of mechatronic devices that are usually considered for Remote Handling tasks. This preliminary homogenization phase is not considered in most facilities, which leads to a costly integration process during the commissioning phase of the facility. In this paper, an approach to the IFMIF-DONES RH control framework with strong standards support, based on protocols such as OPC UA, is described and validated through an experimental setup. This test bench includes a set of physical devices (PLC, conveyor belt and computers) and a set of OPC UA compatible software tools, configured and operable from any node of the University of Granada network. This proof-of-concept mockup provides the flexibility to modify the dimension and complexity of the setup by connecting new virtual or physical devices to a unique backbone. In addition, it will be used to test different aspects such as control schemes, failure injection, network modeling, predictive maintenance studies, operator training on simulated or real scenarios, and the usability and ergonomics of the user interfaces before deployment. In this contribution, the results are described and illustrated using a conveyor belt set-up, a small but representative reference used to validate the RH control concepts proposed here. Funding: European Union via the Euratom Research and Training Programme 101052200 - EUROfusion.
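
    As a rough illustration of the kind of OPC UA client integration described above, the sketch below commands a conveyor speed setpoint using the open-source asyncua Python library. The endpoint URL and node id are hypothetical placeholders; the actual IFMIF-DONES node layout and tooling are not specified here.

        # Rough sketch of an OPC UA client commanding a conveyor speed setpoint,
        # using the open-source asyncua library. The endpoint URL and node id are
        # hypothetical placeholders, not the actual IFMIF-DONES configuration.
        import asyncio
        from asyncua import Client

        ENDPOINT = "opc.tcp://localhost:4840"         # assumed test-bench PLC endpoint
        SPEED_NODE = "ns=2;s=Conveyor.SpeedSetpoint"  # assumed node id on that server

        async def set_conveyor_speed(speed: float) -> float:
            async with Client(url=ENDPOINT) as client:
                node = client.get_node(SPEED_NODE)
                await node.write_value(speed)         # command the conveyor belt
                return await node.read_value()        # read back the applied setpoint

        if __name__ == "__main__":
            print(asyncio.run(set_conveyor_speed(0.2)))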

    A white paper: NASA virtual environment research, applications, and technology

    Research support for Virtual Environment technology development has been a part of NASA's human factors research program since 1985. Under the auspices of the Office of Aeronautics and Space Technology (OAST), initial funding was provided to the Aerospace Human Factors Research Division, Ames Research Center, which resulted in the origination of this technology. Since 1985, other Centers have begun using and developing this technology. At each research and space flight center, NASA missions have been major drivers of the technology. This White Paper was the joint effort of all the Centers which have been involved in the development of this technology and its applications to their unique missions. Appendix A lists those who worked to prepare the document, directed by Dr. Cynthia H. Null, Ames Research Center, and Dr. James P. Jenkins, NASA Headquarters. This White Paper describes the technology and its applications in NASA Centers (Chapters 1, 2 and 3), the potential roles it can take in NASA (Chapters 4 and 5), and a roadmap of the next 5 years (FY 1994-1998). The audience for this White Paper consists of managers, engineers, scientists and the general public with an interest in Virtual Environment technology. Those who read the paper will determine whether this roadmap, or others, are to be followed.