    Collaborative billiARds: Towards the ultimate gaming experience

    Abstract. In this paper, we identify the features that enhance the gaming experience in Augmented Reality (AR) environments: a tangible user interface, force feedback, audio-visual cues, collaboration, and mobility. We base our findings on lessons learnt from existing AR games and apply them to billiARds, an AR system that, in addition to visual and aural cues, provides force feedback. billiARds supports interaction through a vision-based tangible AR interface. Two users can easily operate the proposed system while playing a collaborative billiARds game around a table, collaborating through both virtual and real objects. A user study confirmed that the resulting system delivers an enhanced gaming experience by supporting the five features highlighted in this paper.

    Haptics: state of the art survey

    This paper presents a survey of haptics and of the fields in which it is used extensively, such as display systems, communication, the various types of haptic devices, and the interconnection of haptic displays, where a virtual environment should feel equivalent to the corresponding physical system. Research interest in the haptic modality has escalated across multiple fields in recent years; however, there have been comparatively few studies mapping out its subfields and interfaces, or the influence of haptic user interfaces on the fields mentioned. This paper aims to set out the theory behind haptics and its subfields, such as haptic interfaces, and their applications.

    Implementation of an Intelligent Force Feedback Multimedia Game

    This paper presents the design and programming of an intelligent multimedia computer game enhanced with force feedback. Augmenting game images and sounds with appropriate force feedback improves the quality of the game, making it more interesting and more interactive. We used Immersion Corporation's force-feedback joystick, the I-FORCE Studio computation engine, and the Microsoft DirectX Software Development Kit (SDK) to build the game, which runs on the Windows NT operating system. The game world contains circles of different sizes and masses; when circles hit each other, collisions take place that are both shown to and felt by the user. Each collision increases the overall score, and the larger the circles involved, the higher the score increase. The initial score is zero, and a lower score at the end of the game represents better performance. The game was used to examine users' behavior in different environments through their scores and comments; analysis of the experimental results supports a comparative study of different multimedia combinations.
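
    The scoring rule described above is simple enough to sketch in Python. The fragment below is a hypothetical illustration, not the paper's code: the class and function names and the proportionality to summed radii are assumptions, and the actual force-feedback call into the I-FORCE/DirectX stack is omitted.

    import math
    from dataclasses import dataclass

    @dataclass
    class Circle:
        x: float
        y: float
        vx: float
        vy: float
        radius: float   # drives the score increase
        mass: float     # would drive the collision response (not modelled here)

    def colliding(a: Circle, b: Circle) -> bool:
        # Two circles touch when their centres are closer than the sum of radii.
        return math.hypot(a.x - b.x, a.y - b.y) <= a.radius + b.radius

    def step(circles: list, dt: float, score: float) -> float:
        # Advance positions, then score every collision in this frame.
        for c in circles:
            c.x += c.vx * dt
            c.y += c.vy * dt
        for i, a in enumerate(circles):
            for b in circles[i + 1:]:
                if colliding(a, b):
                    # Larger circles yield a higher increase; summing the radii
                    # is one plausible reading of the rule in the abstract.
                    score += a.radius + b.radius
                    # A real build would also render a force-feedback jolt here.
        return score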

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy-logic-based method for tracking user satisfaction without devices that monitor users' physiological condition. User satisfaction is key to any product's acceptance, and computer applications and video games provide a unique opportunity to tailor the environment to each user's needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in UnrealTournament 2004. We describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature, which suggests that physiological measurements are needed: we show that a software-only method can estimate user emotion.
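
    To make the fuzzy-logic approach concrete, here is a minimal sketch in the spirit of FLAME's emotional component. It is not the authors' implementation: the input variable (event desirability in [-1, 1]), the membership breakpoints, and the emotion labels are all assumptions for illustration.

    def tri(x, a, b, c):
        # Triangular membership: feet at a and c, peak of 1 at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def left_shoulder(x, a, b):
        # Full membership below a, falling linearly to zero at b.
        return 1.0 if x <= a else (0.0 if x >= b else (b - x) / (b - a))

    def right_shoulder(x, a, b):
        # Zero below a, rising linearly to full membership at b.
        return 0.0 if x <= a else (1.0 if x >= b else (x - a) / (b - a))

    def estimate_emotion(desirability):
        # Fuzzify how desirable the latest game event was for the player.
        undesirable = left_shoulder(desirability, -0.5, 0.0)
        neutral = tri(desirability, -0.5, 0.0, 0.5)
        desirable = right_shoulder(desirability, 0.0, 0.5)
        # Minimal rule base: undesirable events raise frustration,
        # desirable events raise joy.
        return {"frustration": undesirable, "neutral": neutral, "joy": desirable}

    print(estimate_emotion(0.25))  # {'frustration': 0.0, 'neutral': 0.5, 'joy': 0.5}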

    Framework of controlling 3d virtual human emotional walking using BCI

    A Brain-Computer Interface (BCI) is a device that reads and acquires brain activity. The human body is controlled by brain signals, through which the brain also expresses emotions and thoughts as mood; these signals are the key component of an electroencephalogram (EEG). Through signal processing, features representing human mood (behavior) can be extracted, with emotion as the major feature. This paper proposes a new framework for recognizing a person's inner emotions from EEG signals using a BCI device as the controller. The framework proceeds in five steps: read the brain signal, classify it to obtain the emotion, map the emotion, synchronize the animation of the 3D virtual human, and test and evaluate the work. To the best of our knowledge, there is no existing framework for controlling a 3D virtual human in this way. Implementing the framework would enhance games by controlling 3D virtual humans' emotional walking, making it more realistic; commercial games and Augmented Reality systems are possible beneficiaries of this technique.
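
    The five-step framework can be pictured as a thin pipeline. The sketch below is a hypothetical skeleton, not the paper's code: the class and method names are invented, the signal-acquisition and animation steps are left as stubs, and the emotion-to-gait mapping values are placeholders.

    class EmotionalWalkingPipeline:
        def read_signal(self):
            # Step 1: acquire raw EEG samples from the BCI headset (stub).
            raise NotImplementedError

        def classify(self, eeg) -> str:
            # Step 2: classify the signal into an emotion label, e.g. with a
            # classifier trained on band-power features (stub).
            raise NotImplementedError

        def map_emotion(self, emotion: str) -> dict:
            # Step 3: map the emotion to walking-style parameters.
            styles = {
                "happy": {"speed": 1.3, "posture": "upright", "arm_swing": 1.2},
                "sad":   {"speed": 0.7, "posture": "slumped", "arm_swing": 0.6},
            }
            return styles.get(emotion, {"speed": 1.0, "posture": "neutral", "arm_swing": 1.0})

        def animate(self, params: dict):
            # Step 4: synchronize the 3D virtual human's walk cycle with the
            # mapped parameters (stub; engine-specific).
            raise NotImplementedError

        def run_once(self):
            # Steps 1-4 chained; step 5 (test and evaluate) happens offline.
            self.animate(self.map_emotion(self.classify(self.read_signal())))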

    Enhancing the E-Commerce Experience through Haptic Feedback Interaction

    The sense of touch is important in our everyday lives, and its absence makes it difficult to explore and manipulate everyday objects. Existing online shopping practice lacks the opportunity for the physical evaluation that people often use and value when making product choices. However, with recent advances in haptic research and technology, it is possible to simulate various physical properties such as heaviness, softness, deformation, and temperature. The research described here investigates the use of haptic feedback interaction to enhance e-commerce product evaluation, particularly haptic weight and texture evaluation. While other properties are equally important, weight and texture are fundamental to the shopping experience for many online products and can be simulated using cost-effective devices. Two initial psychophysical experiments were conducted using free-motion haptic exploration, in order to more closely resemble conventional shopping: one measured weight force thresholds and the other texture force thresholds. These measurements give a better understanding of haptic device limitations for online shopping, in terms of the range of stimuli available to represent physical products. The outcomes of the initial psychophysical experiments were then used to produce the absolute stimuli for a comparative study evaluating the user experience of haptic product evaluation. Although free haptic exploration was used in both psychophysical experiments, the results were broadly consistent with previous work on haptic discrimination: the threshold for weight force discrimination, represented as downward forces, was 10 percent, and the threshold for texture force discrimination, represented as friction forces, was 14.1 percent when varying the dynamic coefficient of friction at any level of static coefficient of friction. The comparative study indicated that haptic product evaluation does not change user performance significantly; although the time taken to complete the task increased, the number of button-click actions tended to decrease. The results showed that haptic product evaluation can significantly increase confidence in the shopping decision. Nevertheless, the availability of haptic product evaluation does not necessarily lead to different product choices; rather, it complements other selection criteria such as price and appearance. These findings are a first step towards exploring haptic-based environments in e-commerce: they not only lay the foundation for designing online haptic shopping but also provide empirical support for research in this direction.
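
    The two reported thresholds behave like Weber fractions, so they can be used directly to space out absolute stimuli that users should be able to tell apart. The sketch below assumes Weber's law holds across the range, and the base force values are arbitrary examples; it is an illustration, not the thesis's code.

    def discriminable_levels(base: float, weber_fraction: float, n: int) -> list:
        # Successive stimulus levels, each one just-noticeable difference
        # (a factor of 1 + Weber fraction) above the previous level.
        levels, x = [], base
        for _ in range(n):
            levels.append(round(x, 3))
            x *= 1.0 + weber_fraction
        return levels

    # Weight: 10% threshold on downward force; texture: 14.1% on friction force.
    print(discriminable_levels(1.0, 0.10, 5))   # [1.0, 1.1, 1.21, 1.331, 1.464]
    print(discriminable_levels(0.5, 0.141, 5))  # friction force levels spaced by 14.1%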

    Quality-controlled audio-visual depth in stereoscopic 3D media

    BACKGROUND: The literature proposes several algorithms that produce “quality-controlled” stereoscopic depth in 3D films by limiting stereoscopic depth to a defined depth budget. Like stereoscopic displays, spatial sound systems provide the listener with enhanced (auditory) depth cues, and are now commercially available in multiple forms. AIM: We investigate the implications of introducing auditory depth cues into quality-controlled 3D media, asking: “Is it important to quality-control audio-visual depth by considering audio-visual interactions when integrating stereoscopic display and spatial sound systems?” MOTIVATION: There are several reports in the literature of such “audio-visual interactions”, in which visual and auditory perception influence each other. We seek to answer our research question by investigating whether these audio-visual interactions could extend the depth budget used in quality-controlled 3D media. METHOD/CONCLUSIONS: The related literature is reviewed before presenting four novel experiments that build upon each other's conclusions. In the first experiment, we show that content created with a stereoscopic depth budget produces measurable positive changes in audiences' attitudes towards 3D films; these changes are repeatable across different locations, displays, and content. In the second experiment, we calibrate an audio-visual display system and use it to measure the minimum audible depth difference. Our data are used to formulate recommendations for content designers and systems engineers, including the design of an auditory depth perception screening test. We then show that an audio-visual stimulus with a nearer auditory depth is perceived as nearer. We measure the impact of this effect upon a relative depth judgement, and investigate how the impact varies with audio-visual depth separation. Finally, we measure the size of the cross-modal bias in depth, from which we conclude that sound does have the potential to extend the depth budget by a small but perceivable amount.
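
    For readers unfamiliar with the “depth budget” idea, the sketch below shows the standard stereo geometry it rests on: an on-screen disparity d maps to a perceived depth Z = eV/(e - d) for interocular distance e and viewing distance V, so quality control amounts to keeping d inside chosen limits. The numeric values and the disparity-limit representation of the budget are assumptions for illustration, not values from the thesis.

    def perceived_depth(d: float, e: float = 0.065, V: float = 2.0) -> float:
        # d: on-screen disparity in metres (positive = behind the screen plane,
        # valid for d < e); e: interocular distance; V: viewing distance.
        return e * V / (e - d)

    def within_budget(d: float, d_min: float, d_max: float) -> bool:
        # A depth budget expressed as allowed disparity limits; the measured
        # cross-modal bias suggests sound could slightly relax these limits.
        return d_min <= d <= d_max

    print(perceived_depth(0.0))    # 2.0 m: zero disparity sits on the screen plane
    print(perceived_depth(0.01))   # ~2.36 m: positive disparity recedes behind it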