2,908 research outputs found

    Applying forces to elastic network models of large biomolecules using a haptic feedback device

    Elastic network models of biomolecules have proved to be relatively good at predicting global conformational changes, particularly in large systems. Software that facilitates rapid and intuitive exploration of conformational change in elastic network models of large biomolecules in response to externally applied forces would therefore be of considerable use, particularly if the forces mimic those that arise in the interaction with a functional ligand. We have developed software that enables a user to apply forces to individual atoms of an elastic network model of a biomolecule through a haptic feedback device or a mouse. With a haptic feedback device the user feels the response to the applied force whilst seeing the biomolecule deform on the screen. Prior to the interactive session, normal mode analysis is performed, or pre-calculated normal mode eigenvalues and eigenvectors are loaded. For large molecules this allows the memory and number of calculations to be reduced by employing the idea of the important subspace: a relatively small space spanned by the first M lowest-frequency normal mode eigenvectors, within which a large proportion of the total fluctuation occurs. Using this approach it was possible to study GroEL on a standard PC: even though only 2.3% of the eigenvectors could be used, they accounted for 50% of the total fluctuation. User testing has shown that the haptic version allows for much more rapid and intuitive exploration of the molecule than the mouse version.
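
    The "important subspace" idea in this abstract can be sketched in a few lines: build the Hessian of a generic anisotropic network model, then compute the displacement response to an applied force using only the M lowest-frequency non-rigid modes. This is a minimal illustration under standard ANM assumptions, not the authors' actual software; the cutoff, spring constant, and function names are illustrative.

    ```python
    import numpy as np

    def anm_hessian(coords, cutoff=8.0, gamma=1.0):
        """Anisotropic network model Hessian: identical springs between
        all atom pairs closer than the cutoff (generic ANM, illustrative)."""
        n = len(coords)
        H = np.zeros((3 * n, 3 * n))
        for i in range(n):
            for j in range(i + 1, n):
                d = coords[j] - coords[i]
                r2 = d @ d
                if r2 > cutoff ** 2:
                    continue
                block = -gamma * np.outer(d, d) / r2  # off-diagonal 3x3 block
                H[3*i:3*i+3, 3*j:3*j+3] = block
                H[3*j:3*j+3, 3*i:3*i+3] = block
                H[3*i:3*i+3, 3*i:3*i+3] -= block
                H[3*j:3*j+3, 3*j:3*j+3] -= block
        return H

    def subspace_response(H, force, n_modes=20):
        """Displacement response to an applied force vector, restricted to
        the n_modes lowest-frequency modes (skipping the six zero-frequency
        rigid-body modes): x = sum_k (v_k . f / lambda_k) v_k."""
        vals, vecs = np.linalg.eigh(H)  # eigenvalues in ascending order
        disp = np.zeros(H.shape[0])
        for k in range(6, 6 + n_modes):
            disp += (vecs[:, k] @ force / vals[k]) * vecs[:, k]
        return disp.reshape(-1, 3)  # one 3-vector per atom
    ```

    Because only the M retained eigenpairs are needed, the full 3N x 3N diagonalisation can be precomputed (or truncated with a sparse solver) before the interactive session, which is what makes the haptic loop cheap.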

    Enabling audio-haptics

    This thesis deals with possible solutions to facilitate orientation, navigation and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics.

    Interactive exploration of historic information via gesture recognition

    Developers of interactive exhibits often struggle to find appropriate input devices that enable intuitive control, permitting visitors to engage effectively with the content. Recently, motion-sensing input devices like the Microsoft Kinect or Panasonic D-Imager have become available, enabling gesture-based control of computer systems. These devices present an attractive input option for exhibits since users can interact with their hands and are not required to physically touch any part of the system. In this thesis we investigate techniques to enable the raw data coming from these types of devices to be used to control an interactive exhibit. Object recognition and tracking techniques are used to analyse the user's hand, from which movement and clicks are processed. To show the effectiveness of the techniques, the gesture system is used to control an interactive system designed to inform the public about iconic buildings in the centre of Norwich, UK. We evaluate two methods of making selections in the test environment. At the time of experimentation these technologies were relatively new to the image-processing field. As a result of the research presented in this thesis, the techniques and methods used have been detailed and published [3] at the VSMM (Virtual Systems and Multimedia 2012) conference with the intention of further advancing the area.

    Telelocomotion—remotely operated legged robots

    © 2020 by the authors. Licensee MDPI, Basel, Switzerland. Teleoperated systems enable human control of robotic proxies and are particularly amenable to inaccessible environments unsuitable for autonomy. Examples include emergency response, underwater manipulation, and robot-assisted minimally invasive surgery. However, teleoperation architectures have been predominantly employed in manipulation tasks, and are thus only useful when the robot is within reach of the task. This work introduces the idea of extending teleoperation to enable online human remote control of legged robots, or telelocomotion, to traverse challenging terrain. Traversing unpredictable terrain remains a challenge for autonomous legged locomotion, as demonstrated by robots commonly falling in high-profile robotics contests. Telelocomotion can reduce the risk of mission failure by leveraging the high-level understanding of human operators to command the gaits of legged robots in real time. In this work, a haptic telelocomotion interface was developed. Two within-user studies validate the proof-of-concept interface: (i) the first compared basic interfaces with the haptic interface for control of a simulated hexapedal robot at various levels of traversal complexity; (ii) the second presents a physical implementation and investigated the efficacy of the proposed haptic virtual fixtures. Results are promising for the use of haptic feedback in telelocomotion for complex traversal tasks.

    Comparative study of haptic and visual feedback for kinesthetic training tasks

    Haptics is the science of simulating and applying the sense of human touch. Application of touch sensations is done with haptic interface devices. The past few years have seen the development of several haptic interface devices with a wide variety of technologies used in their design. This thesis introduces haptic technologies and includes a survey of haptic interface devices and technologies. An improvement in simulating and applying touch sensation when using the Quanser Haptic Wand with proSense is suggested in this work, using a novel five degree-of-freedom algorithm. This approach uses two additional torques to enhance the three degrees of freedom of force feedback currently available with these products. Modern surgical trainers for laparoscopic surgery are incorporating haptic feedback in addition to visual feedback for training. This work presents a quantitative comparison of haptic versus visual training. One of the key results of the study is that haptic feedback is better than visual feedback for kinesthetic navigation tasks.

    Haptic Guidance for Extended Range Telepresence

    A novel navigation assistance for extended range telepresence is presented. The haptic information from the target environment is augmented with guidance commands to assist the user in reaching desired goals in the arbitrarily large target environment from the spatially restricted user environment. Furthermore, a semi-mobile haptic interface was developed, whose lightweight design and configuration atop the user provide for completely safe operation and high force-display quality.

    3D interaction with scientific data: an experimental and perceptual approach


    Designing for the dichotomy of immersion in location based games

    The interaction design of mixed reality location based games typically focuses upon the digital content of the mobile screen, as this is characteristically the primary navigational tool players use to traverse the game space. This emphasis on the digital over the physical means the opportunity for player immersion in mixed reality games is often limited to the single (digital) dimension. This research seeks to redress this imbalance, which is caused, in part, by the requirement for the player's attention to be systematically switched between the two worlds, defined in this research as the "Dichotomy of Immersion". Using different design strategies we propose minimising the reliance of the player upon the mobile screen by encouraging greater observation of their physical surroundings. Using a "research through design" approach for the mixed reality game PAC-LAN: Zombie Apocalypse, we illustrate design strategies for increasing immersion in location based games, which we believe will aid designers in enabling players to more readily engage with the physical context of the game and thus facilitate richer game experiences.