14 research outputs found

    A survey of haptics in serious gaming

    Serious gaming often requires a high level of realism for training and learning purposes. Haptic technology has proven useful in many applications as an additional perception modality complementary to audio and vision. It provides a novel user experience, enhancing the immersion of virtual reality with a physical control layer. This survey focuses on haptic technology and its applications in serious gaming. Several categories of related applications are listed and discussed in detail, primarily where haptics acts as a cognitive aid or as a main component of serious game design. We categorize haptic devices into tactile, force-feedback and hybrid ones to suit different haptic interfaces, followed by a description of common haptic gadgets in gaming. Haptic modeling methods, in particular available SDKs and libraries for commercial or academic use, are summarized. We also analyze existing research difficulties and technology bottlenecks in haptics and foresee future research directions.
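Force-feedback rendering of the kind this survey covers is commonly penalty-based: each servo tick, the device tip's penetration into a virtual surface is converted into a restoring spring-damper force. A minimal sketch of that idea for a flat surface at y = 0; the gains k and b are illustrative, not taken from any particular SDK:

```python
import numpy as np

def render_contact_force(tip_pos, tip_vel, k=500.0, b=1.5):
    """Spring-damper penalty force for a haptic tip against the plane y = 0.

    Returns a zero vector while the tip is above the surface; otherwise
    a Hooke spring pushes the tip out along the surface normal and a
    damping term removes energy from the normal velocity.
    """
    tip_pos = np.asarray(tip_pos, dtype=float)
    tip_vel = np.asarray(tip_vel, dtype=float)
    penetration = -tip_pos[1]          # depth below the plane
    if penetration <= 0.0:
        return np.zeros(3)
    normal = np.array([0.0, 1.0, 0.0])
    return k * penetration * normal - b * np.dot(tip_vel, normal) * normal
```

In a real device loop this function would run at roughly 1 kHz, which is why the surveyed SDKs keep the force computation as cheap as possible.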

    Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?

    The goal of this review is to illustrate the emerging use of multimodal virtual reality that can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games, with a brief discussion of why cognitive processes involved in learning and training are enhanced in immersive virtual environments. We first outline studies that have used eye tracking and haptic feedback independently in serious games, and then review innovative applications that have combined eye tracking and haptic devices to provide applicable multimodal frameworks for learning-based games. Finally, some general conclusions are identified and clarified in order to advance current understanding of multimodal serious game production and to explore possible areas for new applications.

    Combining 3-D geovisualization with force feedback driven user interaction

    We describe a prototype software system for investigating novel human-computer interaction techniques for 3-D geospatial data. This system, M4-Geo (Multi-Modal Mesh Manipulation of Geospatial data), aims to provide a more intuitive interface for directly manipulating 3-D surface data, such as digital terrain models (DTM). The M4-Geo system takes place within a 3-D environment and uses a Phantom haptic force feedback device to enhance 3-D computer graphics with touch-based interactions. The Phantom uses a 3-D force feedback stylus, which acts as a virtual “fingertip” that allows the user to feel the shape (morphology) of the terrain’s surface in great detail. In addition, it acts as a touch-sensitive tool for different GIS tasks, such as digitizing (draping) lines and polygons directly onto a 3-D surface and directly deforming surfaces (by pushing or pulling the stylus in or out). The user may adjust the properties of the surface deformation (e.g., soft or hard) locally by painting it with a special “material color.” The overlap of visual and force representation of 3-D data aids hand-eye coordination for these tasks and helps the user perceive the 3-D spatial data in a more holistic, multi-sensory way. The use of such a 3-D force feedback device for direct interaction may thus provide more intuitive and efficient alternatives to the mouse- and keyboard-driven interactions common today, in particular in areas related to digital landscape design, surface hydrology and geotechnical engineering.
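The deformation behaviour described above can be sketched as a heightmap update: a Gaussian stylus footprint displaces the terrain, scaled by a painted per-cell softness map standing in for M4-Geo's "material color". The function and its parameters are hypothetical illustrations, not M4-Geo's actual code:

```python
import numpy as np

def deform_heightmap(heights, softness, cx, cy, push, radius=3.0):
    """Press a virtual stylus into a DTM heightmap.

    A Gaussian footprint centred on cell (cx, cy) displaces the surface
    by up to `push`, scaled by a per-cell `softness` map (0 = rigid,
    1 = fully soft) playing the role of the painted material property.
    """
    ny, nx = heights.shape
    y, x = np.mgrid[0:ny, 0:nx]
    footprint = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * radius ** 2))
    return heights - push * softness * footprint
```

Painting the softness map to zero in a region would make that region feel rigid under the stylus, matching the locally adjustable deformation the abstract describes.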

    Expressive cutting, deforming, and painting of three-dimensional digital shapes through asymmetric bimanual haptic manipulation

    Practitioners of the geosciences, design, and engineering disciplines communicate complex ideas about shape by manipulating three-dimensional digital objects to match their conceptual model. However, the two-dimensional control interfaces common in software applications create a disconnect from three-dimensional manipulation. This research examines cutting, deforming, and painting manipulations for expressive three-dimensional interaction. It presents a cutting algorithm specialized for planning cuts on a triangle mesh, the extension of a deformation algorithm to inhomogeneous meshes, and the definition of inhomogeneous meshes by painting into a deformation property map. This thesis explores two-handed interactions with haptic force feedback in which each hand fulfills an asymmetric bimanual role. These digital shape manipulations demonstrate a step toward the creation of expressive three-dimensional interactions.
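A planar cut on a triangle mesh, of the kind the cutting algorithm above plans, typically begins by finding which faces straddle the cutting plane; those are the faces a remeshing step would then split. A minimal sketch of that classification step, assuming a simple indexed mesh, and not the thesis's actual algorithm:

```python
import numpy as np

def triangles_crossing_plane(vertices, faces, plane_point, plane_normal, eps=1e-9):
    """Return indices of triangles that straddle the cutting plane.

    A face straddles the plane when its vertices have signed distances
    of both signs; faces entirely on one side are left untouched.
    """
    v = np.asarray(vertices, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    d = (v - np.asarray(plane_point, dtype=float)) @ n  # signed distance per vertex
    sd = d[np.asarray(faces)]                           # (n_faces, 3) distances
    return np.where((sd.max(axis=1) > eps) & (sd.min(axis=1) < -eps))[0]
```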

    Large Deformation Object Modeling Using Finite Element Method And Proper Orthogonal Decomposition For Haptic Robots

    Thesis (M.Sc.) -- İstanbul Technical University, Institute of Science and Technology, 2008. In this study, haptic systems are introduced through an investigation of haptic interfaces and haptic rendering. To this end, a large-deformation, real-time model of a nonlinear beam is developed using the finite element method and integrated with the PHANTOM® Premium 6 DOF haptic robot. The OpenGL library is used to visualize the model, and the haptic robot is driven through the HDAPI functions of the OpenHaptics library. To meet the high computational demands of haptic systems, the Proper Orthogonal Decomposition method is used to obtain a low-order model. Investigation of both models reveals that the lower-order model behaves in essentially the same manner as the original model, with greatly reduced computational effort.
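The POD step this thesis relies on can be sketched generically: collect deformation snapshots, take an SVD, keep the modes that capture most of the snapshot energy, and Galerkin-project the system matrices onto them. The snapshot data and energy threshold below are illustrative, not the thesis's beam model:

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Proper Orthogonal Decomposition of a snapshot matrix.

    `snapshots` is (n_dof, n_samples); each column is one recorded
    deformation state.  Returns an orthonormal basis whose modes
    capture the requested fraction of the snapshot energy.
    """
    u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1
    return u[:, :r]

def reduce_system(K, basis):
    """Galerkin projection of a system matrix onto the POD basis."""
    return basis.T @ K @ basis
```

The reduced matrix has size r x r instead of n_dof x n_dof, which is what makes the model cheap enough to evaluate inside a high-rate haptic loop.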

    Enhancing the use of Haptic Devices in Education and Entertainment

    This research was part of the two-year Horizon 2020 European project "weDRAW". The premise of the project was that "specific sensory systems have specific roles to learn specific concepts". This work explores the use of the haptic modality, stimulated by means of force-feedback devices, to convey abstract concepts inside virtual reality. After a review of the current use of haptic devices in education and of available haptic software and game engines, we focus on the implementation of a haptic plugin for game engines (HPGE, based on the state-of-the-art rendering library CHAI3D) and its evaluation in human perception and multisensory integration experiments.
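A standard model in the multisensory-integration literature that evaluations like the one above commonly test against is maximum-likelihood cue combination, where each sensory estimate is weighted by its inverse variance. A minimal sketch of that model, not HPGE code:

```python
def integrate_cues(estimates, variances):
    """Maximum-likelihood integration of independent sensory cues.

    Each cue is weighted by its inverse variance (its reliability);
    the fused estimate has lower variance than any single cue.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    return fused, 1.0 / total
```

For two equally reliable cues the fused estimate is simply their mean, with half the variance of either cue alone.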

    Research on real-time physics-based deformation for haptic-enabled medical simulation

    This study developed an effective visuo-haptic surgical engine that handles a variety of surgical manipulations in real time. Soft tissue models are based on biomechanical experiments and continuum mechanics for greater accuracy. Such models will increase the realism of future training systems and of VR/AR/MR implementations for the operating room.
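The per-node update at the heart of such a real-time deformation solver can be caricatured by one semi-implicit Euler step of a damped spring; the constants below are placeholders, not values from the study:

```python
def step_mass_spring(x, v, rest, k=50.0, c=2.0, m=0.05, dt=1e-3, f_ext=0.0):
    """One semi-implicit Euler step of a damped spring node.

    A toy stand-in for the per-node update in a physics-based
    soft-tissue solver: elastic pull toward `rest`, viscous damping,
    plus any external (e.g. tool) force.
    """
    f = -k * (x - rest) - c * v + f_ext
    a = f / m
    v_new = v + dt * a
    x_new = x + dt * v_new   # semi-implicit: uses the updated velocity
    return x_new, v_new
```

Using the updated velocity in the position update (semi-implicit rather than explicit Euler) is the usual choice in interactive simulation because it stays stable at larger time steps.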

    A Soft touch: wearable dielectric elastomer actuated multi-finger soft tactile displays

    PhD thesis. The haptic modality in human-computer interfaces is significantly underutilised compared to vision and sound. A potential reason for this is the difficulty of turning computer-generated signals into realistic sensations of touch. Moreover, wearable solutions that can be mounted onto multiple fingertips while still allowing free, dexterous movement of the user’s hand bring an even higher level of complexity. To be wearable, such devices should not only be compact, lightweight and energy-efficient, but also able to render compelling tactile sensations. Current solutions are unable to meet these criteria, typically because of the actuation mechanisms employed. Aimed at addressing these needs, this work presents research into non-vibratory multi-finger wearable tactile displays, through an improved configuration of a dielectric elastomer actuator. The described displays render forces through a soft, bubble-like interface worn on the fingertip. Owing to the improved design, forces of up to 1 N can be generated in a form factor of 20 x 12 x 23 mm at a weight of only 6 g, a significant increase in force output and wearability over existing tactile rendering systems. Furthermore, it is shown how these compact wearable devices can be used in conjunction with low-cost commercial optical hand-tracking sensors to deliver simple yet accurate tactile interactions within virtual environments using affordable instrumentation. The whole system makes it possible for users to interact with virtually generated soft-body objects with programmable tactile properties. Through a 15-participant study, the system has been validated for three distinct types of touch interaction, including palpation and pinching of virtual deformable objects. Through this investigation, it is believed that this approach could have a significant impact within virtual and augmented reality interaction for purposes of medical simulation, professional training and improved tactile feedback in telerobotic control systems. Engineering and Physical Sciences Research Council (EPSRC) Doctoral Training Centre EP/G03723X/
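One plausible control mapping for such a fingertip display, sketched here with assumed values, converts the tracked fingertip's penetration into a virtual object into a commanded force, clipped at the 1 N maximum the abstract reports for the device. The stiffness constant is hypothetical:

```python
def tactile_command(penetration_mm, stiffness=0.25, f_max=1.0):
    """Map tracked fingertip penetration (mm) to a commanded force (N).

    Hypothetical linear mapping for a wearable fingertip display:
    no force while the finger is outside the object, proportional
    force inside, saturating at the device's 1 N maximum.
    """
    f = max(0.0, penetration_mm) * stiffness
    return min(f, f_max)
```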