
    Predictive text entry in immersive environments

    One of the classic problems with immersive environments is data entry: with a head-mounted display (HMD) the user can no longer see the keyboard. Although many applications do not require data entry, for some it is essential: communicating in collaborative environments, entering a filename to which work can be saved, or accessing system controls. We describe an effective text entry technique for immersive environments that combines data gloves and a graphically represented keyboard with a predictive spelling paradigm; we explore the key issues in using such a technique and report the results of preliminary usability testing.
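    The abstract does not specify how the predictive spelling paradigm ranks candidates; a minimal sketch, assuming a frequency-ranked prefix completion over a small illustrative vocabulary, could look like this:

```python
# Hypothetical word list with usage frequencies; the paper's actual
# dictionary and ranking scheme are not given in the abstract.
VOCAB = {"save": 120, "saving": 45, "sample": 30, "screen": 80, "scene": 60}

def complete(prefix, k=3):
    """Return up to k vocabulary words starting with `prefix`,
    most frequent first, as candidates for the virtual keyboard."""
    matches = [w for w in VOCAB if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -VOCAB[w])[:k]

print(complete("sa"))  # → ['save', 'saving', 'sample']
```

    In an HMD setting, the returned candidates would be rendered next to the graphical keyboard so the user can pick one with a data-glove gesture instead of typing every letter.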

    Serious interface design for dental health: Wiimote-based tangible interaction for school children

    This paper describes a camera-based approach to creating a tangible interface for serious games. We introduce a game for dental health targeted at school children that uses the Nintendo WiiMote as an infrared camera. Paired with a gesture-recognition system, this combination allows us to use real-world items as input devices. The game thereby addresses different aspects of dental hygiene while also improving children's motor skills. In our focus group test, we found that tangible interfaces offer great potential for educational purposes and can engage children in a playful learning process by addressing their childlike curiosity and fostering implicit learning.

    Exploring the Front Touch Interface for Virtual Reality Headsets

    In this paper, we propose a new interface for virtual reality headsets: a touchpad on the front of the headset. To demonstrate the feasibility of the front touch interface, we built a prototype device, explored the expanded VR UI design space, and performed various user studies. We started with preliminary tests to see how intuitively and accurately people can interact with the front touchpad. We then experimented with various user interfaces, such as binary selection, a typical menu layout, and a keyboard. Two-Finger and Drag-n-Tap selection techniques were also explored to find the most appropriate one. As a low-cost, lightweight, and low-power technology, a touch sensor can make an ideal interface for mobile headsets. The front touch area can also be large enough to allow a wide range of interaction types, such as multi-finger interactions. With this novel front touch interface, we pave the way for new virtual reality interaction methods.
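    The abstract names binary selection and a menu layout but does not describe how touch positions map to choices; a minimal sketch, assuming the touchpad reports coordinates normalized to [0, 1] (both the axes and the thresholds here are assumptions, not from the paper), could be:

```python
# Assumed convention: the front touchpad reports (x, y) in [0, 1],
# with x increasing rightward and y increasing downward.
def binary_select(x):
    """Map a tap to a yes/no choice: left half -> 'no', right half -> 'yes'."""
    return "yes" if x >= 0.5 else "no"

def menu_select(x, y, rows=2, cols=3):
    """Map a normalized tap position to a cell in a rows x cols menu grid."""
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return row, col

print(binary_select(0.8))     # → yes
print(menu_select(0.9, 0.4))  # → (0, 2)
```

    A real implementation would add debouncing and gesture discrimination (tap vs. drag) on top of this mapping, which is what techniques like Drag-n-Tap address.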

    Effective Gesture Based Framework for Capturing User Input

    Computing today is not confined to laptops and desktops; mobile gadgets such as phones and tablets rely on it as well. However, one input device that has not changed in the last 50 years is the QWERTY keyboard. Thanks to sensor technology and artificial intelligence, users of virtual keyboards can type on any surface as if it were a keyboard. In this research, we use image processing to create a virtual computer keyboard application built on a novel framework that detects hand gestures with high accuracy while remaining sustainable and financially viable. A camera captures keyboard images and finger movements, which together act as a virtual keyboard. In addition, this study describes a virtual mouse that accepts finger coordinates as input. The system directly reduces peripheral cost, reduces the electronic waste generated by external devices, and provides accessibility to people who cannot use a traditional keyboard and mouse.
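    The final step of such a pipeline, once the fingertip has been detected in the camera image, is mapping its pixel coordinate to a key. The gesture detector itself is not reproduced here, and the layout origin and key size below are assumptions for illustration:

```python
# Minimal sketch: map a detected fingertip pixel to a QWERTY key.
# Assumes the keyboard image starts at pixel (0, 0) with fixed-size keys.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_W, KEY_H = 40, 40  # assumed key size in image pixels

def fingertip_to_key(px, py):
    """Return the key under a fingertip at pixel (px, py), or None."""
    row = py // KEY_H
    if 0 <= row < len(ROWS):
        col = px // KEY_W
        if 0 <= col < len(ROWS[row]):
            return ROWS[row][col]
    return None

print(fingertip_to_key(85, 10))  # third key of the top row → 'e'
```

    In practice the key rectangles would come from detecting the printed or projected keyboard in the camera frame rather than from fixed constants, and a key press would be registered only when the fingertip dwells or dips.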

    What does touch tell us about emotions in touchscreen-based gameplay?

    This is the post-print version of the article. Copyright © 2012 ACM; posted here by permission of ACM for personal use, not for redistribution. Nowadays, more and more people play games on touch-screen mobile phones. This raises an interesting question: does touch behaviour reflect the player's emotional state? If so, it would be a valuable evaluation indicator for game designers and could also support real-time personalization of the game experience. Psychology studies of acted touch behaviour show the existence of discriminative affective profiles. In this paper, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analysed. Based on touch behaviour, machine learning algorithms were used to build systems that automatically discriminate between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal, and two levels of valence. The results were promising, reaching between 69% and 77% correct discrimination between the four emotional states. Higher accuracy (~89%) was obtained for discriminating between two levels of arousal and between two levels of valence.
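    The abstract does not list the exact stroke features or classifiers used; a minimal sketch of the general approach, assuming an invented three-feature representation (stroke length in pixels, duration in seconds, mean speed) and a nearest-centroid classifier in place of the paper's machine learning algorithms, could be:

```python
import math

# Invented training strokes per emotional state, as (length, duration, speed);
# the paper's real feature set, data, and classifiers are not reproduced here.
TRAIN = {
    "Excited":    [(320.0, 0.20, 1600.0), (300.0, 0.25, 1200.0)],
    "Relaxed":    [(120.0, 0.60,  200.0), (100.0, 0.50,  200.0)],
    "Frustrated": [(400.0, 0.15, 2666.0), (420.0, 0.20, 2100.0)],
    "Bored":      [( 60.0, 0.80,   75.0), ( 50.0, 0.70,   71.0)],
}

def centroid(samples):
    """Average each of the three features over a state's training strokes."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

CENTROIDS = {label: centroid(s) for label, s in TRAIN.items()}

def classify(stroke):
    """Assign the emotional state whose feature centroid is nearest."""
    return min(CENTROIDS, key=lambda lab: math.dist(stroke, CENTROIDS[lab]))

print(classify((110.0, 0.55, 210.0)))  # → Relaxed
```

    A production system would normalize the features and use a stronger classifier trained on many players, but the pipeline shape (extract stroke features, train on labelled gameplay, predict per stroke) is the same.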