
    Using virtual reality and 3D industrial numerical models for immersive interactive checklists

    At the different stages of product lifecycle management (PLM), companies develop numerous checklist-based procedures involving prototype inspection and testing. In parallel, techniques from CAD, 3D imaging, animation and virtual reality now form a mature set of tools for industrial applications. The work presented in this article develops a single framework for immersive checklist-based project reviews that applies to all steps of the PLM. It combines immersive navigation in the checklist, virtual experiments when needed, and multimedia updates of the checklist. It provides a generic tool, independent of the considered checklist, relies on the modular integration of various VR tools and concepts, and uses an original gesture recognition scheme. Feasibility experiments are presented, validating the benefits of the approach.
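
    The abstract leaves the implementation open, but the core idea of a checklist that is generic, navigable by gesture, and updated with multimedia evidence can be sketched as a small data model. The following Python sketch is illustrative only; the class names, gesture labels, and status values are assumptions rather than the authors' design.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChecklistItem:
    """One inspection or testing step, independent of the specific checklist."""
    label: str
    status: str = "pending"                                # "pending" | "passed" | "failed"
    attachments: List[str] = field(default_factory=list)   # e.g. screenshots, audio notes, video clips

@dataclass
class ImmersiveChecklist:
    items: List[ChecklistItem]
    cursor: int = 0                                         # item currently shown in the VR scene

    def navigate(self, gesture: str) -> None:
        """Hypothetical mapping from recognized gestures to checklist navigation."""
        if gesture == "swipe_right":
            self.cursor = min(self.cursor + 1, len(self.items) - 1)
        elif gesture == "swipe_left":
            self.cursor = max(self.cursor - 1, 0)

    def record(self, status: str, attachment: Optional[str] = None) -> None:
        """Update the current item after a virtual experiment or review discussion."""
        item = self.items[self.cursor]
        item.status = status
        if attachment:
            item.attachments.append(attachment)
```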

    An Augmented Interaction Strategy For Designing Human-Machine Interfaces For Hydraulic Excavators

    Lack of adequate information feedback and work visibility, and fatigue due to repetition, have been identified as the major usability gaps in the human-machine interface (HMI) design of modern hydraulic excavators; these gaps subject operators to undue mental and physical workload, resulting in poor performance. To address them, this work proposed an innovative interaction strategy, termed “augmented interaction”, for enhancing the usability of the hydraulic excavator. Augmented interaction involves the embodiment of a heads-up display and coordinated control schemes into an efficient, effective and safe HMI. Augmented interaction was demonstrated using a framework consisting of three phases: Design, Implementation/Visualization, and Evaluation (D.IV.E). Guided by this framework, two alternative HMI design concepts (Design A: featuring a heads-up display and coordinated control; Design B: featuring a heads-up display and joystick controls), in addition to the existing HMI design (Design C: featuring a monitor display and joystick controls), were prototyped. A mixed reality seating buck simulator, named the Hydraulic Excavator Augmented Reality Simulator (H.E.A.R.S), was used to implement the designs and simulate a work environment along with a rock excavation task scenario. A usability evaluation was conducted with twenty participants to characterize the impact of the new HMI types using quantitative (task completion time, TCT; and operating error, OER) and qualitative (subjective workload and user preference) metrics. The results indicated that participants had a shorter TCT with Design A. For OER, there was a lower error probability due to collisions (PER1) with Design A, and a lower error probability due to misses (PER2) with Design B. The subjective measures showed a lower overall workload and a high preference for Design B. It was concluded that augmented interaction provides a viable solution for enhancing the usability of the HMI of a hydraulic excavator.
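
    The quantitative metrics named above (TCT, PER1, PER2) admit a straightforward computation. The sketch below assumes each trial log records completion time, collision count, miss count, and number of attempts; the field names and structure are invented for illustration, not taken from the study.

```python
from statistics import mean
from typing import Dict, List

def summarize_trials(trials: List[Dict]) -> Dict[str, float]:
    """Aggregate usability metrics over all trials of one design condition.

    Each trial is assumed to look like:
    {"tct_s": 182.0, "collisions": 1, "misses": 0, "attempts": 12}
    """
    total_attempts = sum(t["attempts"] for t in trials)
    return {
        "mean_tct_s": mean(t["tct_s"] for t in trials),                             # task completion time
        "per1_collisions": sum(t["collisions"] for t in trials) / total_attempts,   # error probability, collisions
        "per2_misses": sum(t["misses"] for t in trials) / total_attempts,           # error probability, misses
    }

# Example: summarize two (fabricated) trials for one design condition.
design_a = summarize_trials([
    {"tct_s": 175.0, "collisions": 1, "misses": 2, "attempts": 12},
    {"tct_s": 190.0, "collisions": 0, "misses": 1, "attempts": 12},
])
```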

    Thought-controlled games with brain-computer interfaces

    Nowadays, EEG-based BCI systems are starting to gain ground in games-for-health research. Reduced costs and the promise of an innovative and exciting new interaction paradigm have attracted developers and researchers to use them in video games for serious applications. However, with researchers focusing mostly on the signal-processing part, the interaction aspect of BCIs has been neglected. This research disparity has created a gap between classification performance and online control quality in BCI-based systems, resulting in suboptimal interactions that lead to user fatigue and loss of motivation over time. Motor-imagery (MI) based BCI interaction paradigms can provide an alternative way to overcome motor-related disabilities and are being deployed in healthcare settings to promote the functional and structural plasticity of the brain. To be advantageous, a BCI system in a neurorehabilitation environment should not only achieve high classification performance but should also elicit a high level of engagement and sense of control in the user. It should also maximize the level of control users have over their actions without subjecting them to long training periods on each specific BCI system. This thesis has two main contributions: the Adaptive Performance Engine, a system we developed that can provide up to a 20% improvement in user-specific performance, and NeuRow, an immersive virtual reality environment for motor neurorehabilitation that consists of a closed neurofeedback interaction loop based on MI and multimodal feedback, delivered through a state-of-the-art head-mounted display.
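
    The thesis itself is not reproduced here, so the following is only a rough sketch of the kind of closed loop a motor-imagery BCI uses: band-power features are turned into a left/right decision whose confidence drives the feedback, gated by a threshold that an adaptive layer could tune per user. The channel indices, band limits, and threshold are assumptions; this is not the Adaptive Performance Engine or the NeuRow pipeline.

```python
import numpy as np

def mu_band_power(eeg_window: np.ndarray, fs: int = 250) -> np.ndarray:
    """Per-channel power in the 8-12 Hz mu band for one EEG window (channels x samples)."""
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(eeg_window.shape[1], d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[:, band].mean(axis=1)

def classify_left_right(eeg_window: np.ndarray, c3: int, c4: int) -> tuple:
    """Crude left/right motor-imagery decision from the lateralization of mu power."""
    power = mu_band_power(eeg_window)
    # Event-related desynchronization: imagining right-hand movement lowers mu power over C3.
    score = (power[c4] - power[c3]) / (power[c4] + power[c3] + 1e-12)
    label = "right" if score > 0 else "left"
    return label, abs(score)

def feedback_gain(confidence: float, threshold: float = 0.1) -> float:
    """Map decision confidence to feedback strength (e.g. rowing speed), gated by a
    threshold that an adaptive layer could tune per user."""
    return 0.0 if confidence < threshold else min(confidence / 0.5, 1.0)
```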

    Laser Graphics in Augmented Reality Applications for Real- World Robot Deployment

    Lasers are powerful light sources. With their thin shafts of bright, coloured light, laser beams can provide a dazzling display matching that of outdoor fireworks. With computer assistance, animated laser graphics can generate eye-catching images against a dark sky. Due to technology constraints, laser images are outlines without any interior fill or detail. On a more functional note, lasers assist in the alignment of components during installation.
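
    Since laser images are drawn as vector outlines, a projector typically receives a stream of points for the beam to trace. The sketch below resamples a polygon outline into such a point stream; the point density and example shape are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

def outline_to_points(vertices: np.ndarray, points_per_edge: int = 20) -> np.ndarray:
    """Resample a closed polygon outline into evenly spaced (x, y) points for the beam to trace."""
    closed = np.vstack([vertices, vertices[:1]])               # close the loop
    segments = []
    for start, end in zip(closed[:-1], closed[1:]):
        t = np.linspace(0.0, 1.0, points_per_edge, endpoint=False)
        segments.append(start + t[:, None] * (end - start))    # interpolate along the edge
    return np.vstack(segments)

# Example: a triangle outline as the beam path for one animation frame.
triangle = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])
frame = outline_to_points(triangle)                             # shape (60, 2), no interior fill
```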

    Hands-Free Control Interfaces for an Extra Vehicular Jetpack

    The National Aeronautics and Space Administration's (NASA) strategic vision includes, as part of its long-term goals, the exploration of deep space and Near Earth Asteroids (NEA). To support these endeavors, funds have been invested in research to develop advanced exploration capabilities. To enable the human mobility necessary to effectively explore NEAs and deep space, a new extravehicular activity (EVA) Jetpack is under development at the Johnson Space Center. The new design leverages knowledge and experience gained from the current astronaut rescue device, the Simplified Aid for EVA Rescue (SAFER). Whereas the primary goal of a rescue device is to return the crew to a safe haven, in-space exploration and navigation require an expanded set of capabilities. To accommodate the range of tasks astronauts may be expected to perform while using the Jetpack, a hands-free method of control was desired. This paper describes the development and innovations involved in creating two hands-free control interfaces and an experimental test platform for a suited astronaut flying the Jetpack during an EVA.
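
    The abstract does not say which hands-free modalities were chosen, so the sketch below is purely one possible mapping: head orientation converted to rotation-rate commands with a dead zone to suppress unintentional motion. All names, thresholds, and ranges are assumptions, not NASA's design.

```python
from dataclasses import dataclass

@dataclass
class RotationCommand:
    yaw_rate: float     # deg/s, positive = yaw right
    pitch_rate: float   # deg/s, positive = pitch up

def head_to_rotation(yaw_deg: float, pitch_deg: float,
                     dead_zone_deg: float = 5.0, max_rate: float = 10.0) -> RotationCommand:
    """Map head orientation (relative to the torso) to rotation-rate commands.

    The dead zone keeps small, unintentional head motions from commanding the jetpack.
    """
    def shape(angle: float) -> float:
        if abs(angle) < dead_zone_deg:
            return 0.0
        usable = 45.0 - dead_zone_deg                           # assume +/-45 deg of useful head travel
        magnitude = min(1.0, (abs(angle) - dead_zone_deg) / usable) * max_rate
        return magnitude if angle > 0 else -magnitude

    return RotationCommand(yaw_rate=shape(yaw_deg), pitch_rate=shape(pitch_deg))

# Example: looking slightly right and well above the horizon.
cmd = head_to_rotation(yaw_deg=8.0, pitch_deg=30.0)
```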

    Systematic literature review of hand gestures used in human computer interaction interfaces

    Gestures, widely accepted as humans' natural mode of interaction with their surroundings, have been considered for use in human-computer interfaces since the early 1980s. They have been explored and implemented, with a range of success and maturity levels, in a variety of fields, facilitated by a multitude of technologies. The underpinning gesture theory, however, focuses on gestures performed simultaneously with speech, and the majority of gesture-based interfaces are supported by other modes of interaction. This article reports the results of a systematic review undertaken to identify characteristics of touchless/in-air hand gestures used in interaction interfaces. A total of 148 articles reporting on gesture-based interaction interfaces were reviewed, identified by searching engineering and science databases (Engineering Village, ProQuest, Science Direct, Scopus and Web of Science). The goal of the review was to map the field of gesture-based interfaces, investigate the patterns in gesture use, and identify common combinations of gestures for different combinations of applications and technologies. From the review, the community appears disparate, with little evidence of building upon prior work, and a fundamental framework of gesture-based interaction is not evident. However, the findings can help inform future developments and provide valuable information about the benefits and drawbacks of different approaches. It was further found that the nature and appropriateness of the gestures used was not a primary factor in gesture elicitation when designing gesture-based systems, and that ease of technology implementation often took precedence.
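
    The reviewed interfaces differ widely, but a common pattern behind touchless, in-air gesture input is matching normalized hand features against a small set of templates. The sketch below illustrates that pattern only; the gesture vocabulary, feature choice, and rejection threshold are illustrative assumptions, not drawn from the reviewed systems.

```python
import numpy as np

# Hypothetical templates: each gesture is a vector of wrist-to-fingertip distances,
# normalized by hand size, for [thumb, index, middle, ring, little].
GESTURE_TEMPLATES = {
    "open_palm": np.array([1.0, 1.0, 1.0, 1.0, 1.0]),
    "fist":      np.array([0.4, 0.3, 0.3, 0.3, 0.3]),
    "point":     np.array([0.4, 1.0, 0.3, 0.3, 0.3]),
}

def classify_hand_pose(fingertip_dists: np.ndarray, reject_threshold: float = 0.35) -> str:
    """Nearest-template classification of a touchless, in-air hand pose."""
    best_label, best_dist = "unknown", float("inf")
    for label, template in GESTURE_TEMPLATES.items():
        dist = float(np.linalg.norm(fingertip_dists - template))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= reject_threshold else "unknown"

# Example: a pose with only the index finger extended is matched to "point".
print(classify_hand_pose(np.array([0.42, 0.95, 0.31, 0.30, 0.28])))
```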

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotics systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interactions and collaborations; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in the field of AR and robotics research, offering insights into the recent state of the art and prospects for improvement.
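
    A recurring building block in the surveyed AR systems, medical and industrial alike, is registering virtual content (a planned path, a target pose) to the live camera view. The sketch below shows only that projection step, assuming the robot-to-camera transform and the camera intrinsics are already known; it is not tied to any specific system from the survey.

```python
import numpy as np

def project_path(points_robot: np.ndarray, T_cam_robot: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Project 3D path waypoints from the robot frame into camera pixels (pinhole model).

    points_robot: (N, 3) waypoints expressed in the robot base frame.
    T_cam_robot:  (4, 4) homogeneous transform from robot frame to camera frame,
                  e.g. obtained from marker-based or SLAM-based registration.
    K:            (3, 3) camera intrinsic matrix.
    Returns (N, 2) pixel coordinates at which to draw the overlay.
    """
    homogeneous = np.hstack([points_robot, np.ones((points_robot.shape[0], 1))])
    in_cam = (T_cam_robot @ homogeneous.T)[:3]     # 3 x N points in the camera frame
    pixels = K @ in_cam                            # perspective projection
    return (pixels[:2] / pixels[2]).T              # divide by depth
```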

    Smart operators: How augmented and virtual technologies are affecting the worker's performance in manufacturing contexts

    Purpose: The correct interaction between the workforce and augmented, virtual, and mixed reality technologies represents a crucial aspect of the success of the smart factory. This interaction is affected by the variability and reliability of human behavior, which can strongly influence quality, safety, and productivity standards. For this reason, this paper aims to provide a clear and complete analysis of the impacts of these technologies on the performance of operators. Design/methodology/approach: A Systematic Literature Review (SLR) was conducted to identify peer-reviewed papers that focused on the implementation of augmented and virtual technologies in manufacturing systems and their effects on human performance. Findings: In total, 61 papers were selected and thoroughly analyzed. The findings of this study reveal that Augmented, Virtual and Mixed Reality can be applied to several applications in manufacturing systems with different types of devices, each involving various advantages and disadvantages. The aspects of worker performance most influenced by these technologies are, above all, time to complete a task, error rate, and mental and physical workload. Originality/value: Over the years, researchers have investigated Augmented, Virtual and Mixed Reality technologies in manufacturing systems, with most studies focusing on technological issues. The role of the operator, whose tasks may be influenced positively or negatively by the use of new devices, has hardly ever been analyzed, and a deep analysis of how these technologies affect human performance is missing. This study represents a preliminary analysis to fill this gap. The results obtained from the SLR allowed us to develop a conceptual framework that investigates the current state-of-the-art knowledge about the topic and highlights gaps in the current research.
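
    Subjective workload in such studies is commonly measured with instruments like NASA-TLX. As a purely illustrative aside, the raw (unweighted) TLX score is simply the mean of the six subscale ratings; the example values below are made up and do not come from the reviewed papers.

```python
from statistics import mean
from typing import Dict

TLX_DIMENSIONS = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings: Dict[str, float]) -> float:
    """Raw (unweighted) NASA-TLX overall workload: the mean of the six 0-100 subscale ratings."""
    return mean(ratings[d] for d in TLX_DIMENSIONS)

# Example with made-up ratings for an AR-guided assembly task.
ar_guided = raw_tlx({"mental": 35, "physical": 20, "temporal": 30,
                     "performance": 25, "effort": 30, "frustration": 15})
```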

    Virtual Reality and Its Application in Education

    Virtual reality is a set of technologies that enables two-way communication, from computer to user and vice versa. In one direction, technologies are used to synthesize visual, auditory, tactile, and sometimes other sensory experiences in order to provide the illusion that practically non-existent things can be seen, heard, touched, or otherwise felt. In the other direction, technologies are used to adequately record human movements, sounds, or other potential input data that computers can process and use. This book contains six chapters that cover topics including definitions and principles of VR, devices, educational design principles for effective use of VR, technology education, and use of VR in technical and natural sciences.

    VANET Applications: Hot Use Cases

    Current challenges for car manufacturers are to make roads safe, to achieve free-flowing traffic with little congestion, and to reduce pollution through effective fuel use. To reach these goals, many improvements are made in-car, but more and more approaches rely on connected cars with communication capabilities between cars, with infrastructure, or with IoT devices. Monitoring and coordinating vehicles then makes it possible to compute intelligent ways of transportation. Connected cars have introduced a new way of thinking about cars - not only as a means for a driver to go from A to B, but as smart cars - an extension of the user, like the smartphone today. In this report, we introduce concepts and specific vocabulary in order to classify current innovations and ideas on the emerging topic of the smart car. We present a graphical categorization showing this evolution as a function of societal change. Different perspectives are adopted: a vehicle-centric view, a vehicle-network view, and a user-centric view; these are described by simple and complex use cases and illustrated by a list of emerging and current projects from the academic and industrial worlds. We identified an empty space in innovation between the user and their car: paradoxically, even though the two are in constant interaction, they are separated by different application uses. The future challenge is to interlace the social concerns of the user with intelligent and efficient driving.
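
    The report focuses on categorizing use cases rather than implementations, but most cooperative scenarios it describes rest on vehicles periodically broadcasting their state to neighbors. The sketch below shows such a beacon in a loosely CAM-like form; the field set and JSON encoding are illustrative assumptions, not a standard-compliant format.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class BeaconMessage:
    """Periodic vehicle awareness beacon, loosely inspired by ETSI CAM-style messages."""
    vehicle_id: str
    timestamp: float
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float

def make_beacon(vehicle_id: str, lat: float, lon: float, speed: float, heading: float) -> bytes:
    """Serialize one beacon for broadcast on the vehicular network (transport layer not shown)."""
    msg = BeaconMessage(vehicle_id, time.time(), lat, lon, speed, heading)
    return json.dumps(asdict(msg)).encode("utf-8")

# Example: a car announcing its state to neighboring vehicles and roadside units.
payload = make_beacon("car-42", 48.117, -1.678, 13.9, 92.0)
```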