
    Exploration of Reaction Pathways and Chemical Transformation Networks

    For the investigation of chemical reaction networks, the identification of all relevant intermediates and elementary reactions is mandatory. Many algorithmic approaches exist that perform such explorations efficiently and in an automated fashion. These approaches differ in their application range, the level of completeness of the exploration, and the amount of heuristics and human intervention required. Here, we describe and compare the different approaches based on these criteria. Future directions leveraging the strengths of chemical heuristics, human interaction, and physical rigor are discussed. (Comment: 48 pages, 4 figures)
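The automated approaches the abstract compares share a common core: treat species as nodes and elementary reactions as edges, and expand outward from seed compounds. A minimal sketch of that exploration loop, assuming a user-supplied `expand` callable (hypothetical; in practice it would wrap an electronic-structure or rule-based reaction generator):

```python
from collections import deque

def explore_network(seed_species, expand, max_depth=3):
    """Breadth-first exploration of a chemical reaction network.

    seed_species: iterable of starting species identifiers.
    expand: callable mapping a species to a list of
        (reaction, product) pairs -- an assumed interface standing
        in for whatever reaction generator a real tool uses.
    max_depth: heuristic cutoff limiting how far from the seeds
        the exploration proceeds.
    Returns the sets of discovered species and elementary reactions.
    """
    species = set(seed_species)
    reactions = set()
    frontier = deque((s, 0) for s in seed_species)
    while frontier:
        current, depth = frontier.popleft()
        if depth >= max_depth:
            continue  # heuristic truncation of the network
        for reaction, product in expand(current):
            reactions.add(reaction)
            if product not in species:
                species.add(product)
                frontier.append((product, depth + 1))
    return species, reactions
```

The `max_depth` cutoff is one concrete example of the heuristics the abstract mentions: it trades completeness of the exploration for tractability.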

    Novel Actuation Methods for High Force Haptics


    Pseudo-haptics survey: Human-computer interaction in extended reality & teleoperation

    Pseudo-haptic techniques are becoming increasingly popular in human-computer interaction. They replicate haptic sensations by leveraging primarily visual feedback rather than mechanical actuators. These techniques bridge the gap between the real and virtual worlds by exploiting the brain’s ability to integrate visual and haptic information. Among their advantages, pseudo-haptic techniques are cost-effective, portable, and flexible. They eliminate the need to attach haptic devices directly to the body, devices which can be heavy and large and require substantial power and maintenance. Recent research has focused on applying these techniques to extended reality and mid-air interactions. To better understand the potential of pseudo-haptic techniques, the authors developed a novel taxonomy encompassing tactile feedback, kinesthetic feedback, and combined categories in multimodal approaches, ground not covered by previous surveys. This survey highlights multimodal strategies and potential avenues for future studies, particularly regarding integrating these techniques into extended reality and collaborative virtual environments.

    Direct Manipulation Of Virtual Objects

    Interacting with a Virtual Environment (VE) generally requires the user to correctly perceive the relative position and orientation of virtual objects. For applications requiring interaction in personal space, the user may also need to accurately judge the position of the virtual object relative to that of a real object, for example, a virtual button and the user's real hand. This is difficult since VEs generally only provide a subset of the cues experienced in the real world. Complicating matters further, VEs presented by currently available visual displays may be inaccurate or distorted due to technological limitations. Fundamental physiological and psychological aspects of vision as they pertain to the task of object manipulation were thoroughly reviewed. Other sensory modalities--proprioception, haptics, and audition--and their cross-interactions with each other and with vision are briefly discussed. Visual display technologies, the primary component of any VE, were canvassed and compared. Current applications and research were gathered and categorized by different VE types and object interaction techniques. While object interaction research abounds in the literature, pockets of research gaps remain. Direct, dexterous, manual interaction with virtual objects in Mixed Reality (MR), where the real, seen hand accurately and effectively interacts with virtual objects, has not yet been fully quantified. An experimental test bed was designed to provide the highest accuracy attainable for salient visual cues in personal space. Optical alignment and user calibration were carefully performed. The test bed accommodated the full continuum of VE types and sensory modalities for comprehensive comparison studies. Experimental designs included two sets, each measuring depth perception and object interaction. The first set addressed the extreme end points of the Reality-Virtuality (R-V) continuum--Immersive Virtual Environment (IVE) and Reality Environment (RE).
This validated, linked, and extended several previous research findings, using one common test bed and participant pool. The results provided a proven method and solid reference points for further research. The second set of experiments leveraged the first to explore the full R-V spectrum and included additional, relevant sensory modalities. It consisted of two full-factorial experiments providing rich data and key insights into the effect of each type of environment and each modality on the accuracy and timeliness of virtual object interaction. The empirical results clearly showed that mean depth perception error in personal space was less than four millimeters whether the stimuli presented were real, virtual, or mixed. Likewise, mean error for the simple task of pushing a button was less than four millimeters whether the button was real or virtual. Mean task completion time was less than one second. Key to the high accuracy and quick task performance observed was the correct presentation of the visual cues, including occlusion, stereoscopy, accommodation, and convergence. With performance already near optimal when accurate visual cues were presented, adding proprioception, audio, and haptic cues did not significantly improve performance. Recommendations for future research include enhancement of the visual display and further experiments with more complex tasks and additional control variables.

    THE EFFECT OF HAPTIC INTERACTION AND LEARNER CONTROL ON STUDENT PERFORMANCE IN AN ONLINE DISTANCE EDUCATION COURSE

    Today’s learners are taking advantage of a whole new world of multimedia and hypermedia experiences to gain understanding and construct knowledge, while teachers and instructional designers produce these experiences at a rapid pace. Many angles of interactivity with digital content continue to be researched, as is the case with this study. The purpose of this study is to determine whether there is a significant difference in the performance of distance education students who exercise learner control interactivity through a traditional input device versus students who exercise it through haptic (touch) input methods. This study asks three main questions about the relationship and potential impact of touch input on the interactivity sequence a learner chooses while participating in an online distance education course. Effects were measured using criteria from logged assessments within one module of a distance education course. This study concludes that learner control sequence choices had significant effects on learner outcomes; input method, however, did not. The sequence that learners chose had positive effects on scores, the number of attempts it took to pass assessments, and the overall range of scores per assessment attempt. Touch input learners performed as well as traditional input learners, and summative-first sequence learners outperformed all other learners. These findings support the beliefs that new input methods are not detrimental and that learner-controlled options in digital online courses are valuable for learners, under certain conditions.

    3D interaction with scientific data: an experimental and perceptual approach


    Perceptual Issues Improve Haptic Systems Performance


    An Ergonomics Investigation of the Application of Virtual Reality on Training for a Precision Task

    Virtual reality is rapidly expanding its capabilities and accessibility to consumers. The application of virtual reality to training for precision tasks has been limited to specialized equipment such as a haptic glove or a haptic stylus, and has not been studied for the handheld controllers of consumer-grade systems such as the HTC Vive. A straight-line precision steadiness task was adopted in virtual reality to emulate basic linear movements in industrial operations and disability rehabilitation. This study collected the total time and the error time for the straight-line task in both virtual reality and a physical control experiment for 48 participants. The task was performed at four gap widths (4 mm, 5 mm, 6 mm, and 7 mm) to examine the effects of virtual reality at different levels of precision. Average error ratios were then calculated and analyzed for strong associations with various factors. The results indicated that the Environment x Gap Width interaction significantly affected average error ratios (p < 0.001). This human factors study also collected participants’ questionnaire ratings of user experience dimensions, such as difficulty, comfort, strain, reliability, and effectiveness, for both physical and virtual environments. The ratings for difficulty, reliability, and effectiveness differed significantly, with virtual reality consistently rated worse than the physical environment. An analysis of questionnaire responses indicates a significant association of overall environment preference (physical or virtual) with performance data (p = 0.027). In general, virtual reality yielded higher error among participants, and as the difficulty of the task increased, performance in virtual reality degraded significantly.
Virtual reality has great potential for a variety of precision applications, but the technology in consumer-grade hardware must improve significantly to enable these applications. Virtual reality is difficult to implement without prior experience or specialized knowledge of programming, which makes the technology currently inaccessible to many people. Future work is needed to investigate a larger variety of precision tasks and movements to expand the body of knowledge of virtual reality applications for training purposes.
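The abstract's analysis hinges on average error ratios per Environment x Gap Width cell. The paper's exact formula is not given; assuming the natural definition (error time divided by total time, averaged within each cell), the computation can be sketched as:

```python
from collections import defaultdict

def average_error_ratios(trials):
    """Group trials by (environment, gap width) and average the
    per-trial error ratio.

    trials: iterable of (environment, gap_mm, total_time_s, error_time_s)
        tuples -- a hypothetical record layout for illustration.
    The error ratio is assumed here to be error_time / total_time;
    the study may define it differently.
    """
    groups = defaultdict(list)
    for env, gap_mm, total_s, error_s in trials:
        groups[(env, gap_mm)].append(error_s / total_s)
    # One mean error ratio per Environment x Gap Width cell.
    return {cell: sum(r) / len(r) for cell, r in groups.items()}
```

The resulting per-cell means are what a two-way factorial analysis (Environment x Gap Width) would then test for interaction effects.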