
    An Ergonomics Investigation of the Application of Virtual Reality on Training for a Precision Task

    Virtual reality is rapidly expanding its capabilities and accessibility to consumers. The application of virtual reality to training for precision tasks has been limited to specialized equipment such as haptic gloves or haptic styluses, and has not been studied for the handheld controllers of consumer-grade systems such as the HTC Vive. A straight-line precision steadiness task was adopted in virtual reality to emulate basic linear movements in industrial operations and disability rehabilitation. This study collected the total time and the error time for the straight-line task in both virtual reality and a physical control experiment for 48 participants. The task was performed at four gap widths, 4 mm, 5 mm, 6 mm, and 7 mm, to examine the effects of virtual reality at different levels of precision. Average error ratios were then calculated and analyzed for associations with various factors. The results indicated that the Environment x Gap Width interaction significantly affected average error ratios (p < 0.001). This human factors study also collected participants’ questionnaire ratings of user experience dimensions, such as difficulty, comfort, strain, reliability, and effectiveness, for both the physical and virtual environments. The ratings for difficulty, reliability, and effectiveness differed significantly, with virtual reality consistently rated worse than the physical environment. An analysis of questionnaire responses indicated a significant association between overall environment preference (physical or virtual) and performance data (p = 0.027). In general, virtual reality yielded higher error among participants, and as task difficulty increased, performance in virtual reality degraded significantly. Virtual reality has great potential for a variety of precision applications, but the technology in consumer-grade hardware must improve substantially to enable them. Virtual reality is also difficult to implement without prior experience or specialized programming knowledge, which currently makes the technology inaccessible to many people. Future work is needed to investigate a larger variety of precision tasks and movements to expand the body of knowledge on virtual reality applications for training purposes.
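
    As a hedged illustration of the analysis described above, the sketch below computes per-trial error ratios (error time divided by total time) and tests an Environment x Gap Width factorial model. The file name, column names, and use of pandas/statsmodels are assumptions for illustration, not details taken from the study.

```python
# Hypothetical sketch: average error ratios and an Environment x Gap Width
# factorial test; the data layout and column names are assumed, not from the paper.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("steadiness_trials.csv")          # assumed trial-level data
df["error_ratio"] = df["error_time"] / df["total_time"]

# Average error ratio per participant, environment, and gap width
avg = (df.groupby(["participant", "environment", "gap_width_mm"], as_index=False)
         ["error_ratio"].mean())

# Two-way model with interaction, mirroring the Environment x Gap Width factor
model = ols("error_ratio ~ C(environment) * C(gap_width_mm)", data=avg).fit()
print(sm.stats.anova_lm(model, typ=2))
```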

    Bimanual Motor Strategies and Handedness Role During Human-Exoskeleton Haptic Interaction

    Bimanual object manipulation involves multiple visuo-haptic sensory feedback signals arising from interaction with the environment, which are managed by the central nervous system and translated into motor commands. The kinematic strategies that occur during bimanually coupled tasks are still a matter of scientific debate despite modern advances in haptics and robotics. Current technologies have the potential to provide realistic scenarios involving the entire upper limb during multi-joint movements but are not yet exploited to their full potential. The present study explores how the hands dynamically interact when manipulating a shared object, using two impedance-controlled exoskeletons programmed to simulate bimanually coupled manipulation of virtual objects. We enrolled twenty-six participants (two groups: right-handed and left-handed) who were asked to use both hands to grasp simulated objects across the robot workspace and place them in specific locations. The virtual objects were rendered with different dynamic properties and textures that influenced the manipulation strategies used to complete the tasks. Results revealed that the roles of the hands are related to movement direction, haptic features, and handedness. Outcomes suggested that haptic feedback affects bimanual strategies depending on the movement direction; however, left-handers showed better control of the force applied between the two hands, probably due to environmental pressure toward right-handed manipulation.
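
    The exoskeletons above are described as impedance-controlled; as a minimal sketch (not the study's actual controller), the one-dimensional law below renders a virtual spring-damper coupling between the hand and a desired pose. The gains and function name are illustrative assumptions.

```python
# Minimal sketch of a 1-D impedance law of the kind used to render virtual
# object dynamics on a haptic exoskeleton; gains and names are illustrative
# assumptions, not the controllers used in the study.
def impedance_force(x, v, x_des, v_des, k=300.0, b=20.0):
    """Restoring force F = k*(x_des - x) + b*(v_des - v), in newtons."""
    return k * (x_des - x) + b * (v_des - v)

# Example: hand 2 cm outside the virtual object's surface, at rest
print(impedance_force(x=0.02, v=0.0, x_des=0.0, v_des=0.0))  # -6.0 N (pushes back)
```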

    Innovative Learning Environments in STEM Higher Education

    As explored in this open access book, higher education in STEM fields is influenced by many factors, including education research, government and school policies, financial considerations, technology limitations, and acceptance of innovations by faculty and students. In 2018, Drs. Ryoo and Winkelmann explored the opportunities, challenges, and future research initiatives of innovative learning environments (ILEs) in higher education STEM disciplines in their pioneering project: eXploring the Future of Innovative Learning Environments (X-FILEs). Workshop participants evaluated four main ILE categories: personalized and adaptive learning, multimodal learning formats, cross/extended reality (XR), and artificial intelligence (AI) and machine learning (ML). This open access book gathers the perspectives expressed during the X-FILEs workshop and its follow-up activities. It is designed to help inform education policymakers, researchers, developers, and practitioners about the adoption and implementation of ILEs in higher education.

    Socially Cognizant Robotics for a Technology Enhanced Society

    Emerging applications of robotics, and concerns about their impact, require the research community to put human-centric objectives front and center. To meet this challenge, we advocate an interdisciplinary approach, socially cognizant robotics, which synthesizes technical and social science methods. We argue that this approach follows from the need to empower stakeholder participation (from synchronous human feedback to asynchronous societal assessment) in shaping AI-driven robot behavior at all levels, and that it leads to a range of novel research perspectives and problems, both for improving robots' interactions with individuals and for improving their impact on society. Drawing on these arguments, we develop best practices for socially cognizant robot design that balance traditional technology-based metrics (e.g., efficiency, precision, and accuracy) with critically important, albeit hard-to-measure, human- and society-based metrics.

    Facilitating Self-monitored Physical Rehabilitation with Virtual Reality and Haptic feedback

    Physical rehabilitation is essential to recovery from joint replacement operations. As a representative case, total knee arthroplasty (TKA) requires patients to perform intensive physical exercises to regain the knee's range of motion and muscle strength. However, current joint replacement rehabilitation methods rely heavily on therapists for supervision, and existing computer-assisted systems give little consideration to self-monitoring, making at-home physical rehabilitation difficult. In this paper, we investigated design recommendations for enabling self-monitored rehabilitation through clinical observations and focus group interviews with doctors and therapists. With this knowledge, we further explored Virtual Reality (VR)-based visual presentation and supplemental haptic motion guidance in our implementation, VReHab, a self-monitored, multimodal physical rehabilitation system with VR, vibrotactile, and pneumatic feedback in a TKA rehabilitation context. We found that a third-person view of the user's motion, reconstructed in real time on a virtual avatar overlaid with the target pose, effectively provides motion awareness and guidance, while haptic feedback helps enhance users' motion accuracy and stability. Finally, we implemented VReHab to facilitate self-monitored post-operative exercises and validated its effectiveness through a clinical study with 10 patients.
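
    As a hedged sketch of one quantity such a self-monitored TKA system could track, the snippet below estimates knee angle from three tracked landmarks. The function and landmark names are assumptions for illustration, not part of VReHab.

```python
# Hypothetical sketch: knee angle from tracked hip, knee, and ankle positions;
# the helper and landmark names are illustrative, not VReHab's API.
import numpy as np

def knee_angle_deg(hip, knee, ankle):
    """Angle (degrees) between thigh and shank vectors at the knee (180 = straight leg)."""
    thigh = np.asarray(hip, dtype=float) - np.asarray(knee, dtype=float)
    shank = np.asarray(ankle, dtype=float) - np.asarray(knee, dtype=float)
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Example: a partially bent knee (3-D coordinates in metres)
print(knee_angle_deg(hip=[0, 0.5, 0], knee=[0, 0, 0], ankle=[0.2, -0.45, 0]))
```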

    1st AAU Workshop on Human-Centered Robotics
