20,627 research outputs found

    Navigation and interaction in a real-scale digital mock-up using natural language and user gesture

    This paper presents a new real-scale 3D system and summarizes first results on multi-modal navigation and interaction interfaces. The work is part of the CALLISTO-SARI collaborative project, which aims to construct an immersive room and to develop a set of software tools and navigation/interaction interfaces. Two sets of interfaces are introduced here: 1) interaction devices, and 2) natural language (speech processing) and user gesture. An evaluation of the system using subjective observation (Simulator Sickness Questionnaire, SSQ) and objective measurements (Center of Gravity, COG) shows that natural language and gesture-based interfaces induced less cybersickness than device-based interfaces, suggesting that gesture-based interfaces are the more effective option. Funding: FUI CALLISTO-SARI
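    As context for the SSQ measure mentioned above, the sketch below scores the questionnaire in the conventional way (Kennedy et al., 1993). The abstract reports no scoring details or data, so the subscale weights below are the standard published ones and the example ratings are purely hypothetical.

    ```python
    # Minimal sketch of standard SSQ scoring (Kennedy et al., 1993).
    # The weights are the conventional published ones; the paper does not
    # give its raw scores, so the example inputs below are made up.

    def ssq_scores(nausea_raw: int, oculomotor_raw: int, disorientation_raw: int) -> dict:
        """Each raw value is the sum of the 0-3 symptom ratings loading on
        that subscale (a symptom may count toward more than one subscale)."""
        return {
            "nausea": nausea_raw * 9.54,
            "oculomotor": oculomotor_raw * 7.58,
            "disorientation": disorientation_raw * 13.92,
            "total": (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74,
        }

    # Hypothetical comparison of the two interface conditions:
    print(ssq_scores(6, 5, 4))   # device-based condition (made-up ratings)
    print(ssq_scores(3, 2, 1))   # gesture-based condition (made-up ratings)
    ```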

    Exploring the Front Touch Interface for Virtual Reality Headsets

    In this paper, we propose a new interface for virtual reality headsets: a touchpad mounted on the front of the headset. To demonstrate the feasibility of the front touch interface, we built a prototype device, explored the expanded VR UI design space, and performed several user studies. We started with preliminary tests to see how intuitively and accurately people can interact with the front touchpad. We then experimented with various user interfaces, such as binary selection, a typical menu layout, and a keyboard. Two-Finger and Drag-n-Tap were also explored to find the appropriate selection technique. As a low-cost, lightweight, and low-power technology, a touch sensor makes an ideal interface for mobile headsets. The front touch area can also be large enough to allow a wide range of interaction types, such as multi-finger interactions. With this novel front touch interface, we pave the way to new virtual reality interaction methods.
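    The abstract names Drag-n-Tap without spelling out its mechanics; a plausible reading is "drag to move a cursor over a target, then tap to commit". The sketch below illustrates that reading only; the target layout and event handlers are hypothetical stand-ins, not the paper's implementation.

    ```python
    # Sketch of a "Drag-n-Tap" style selection loop, assuming the reading
    # "drag moves a cursor, tap commits". Targets and events are
    # hypothetical; the paper does not describe its actual mechanics.
    from dataclasses import dataclass

    @dataclass
    class Target:
        name: str
        x: float
        y: float
        w: float
        h: float  # normalized [0, 1] touchpad coordinates

        def contains(self, px: float, py: float) -> bool:
            return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    class DragNTap:
        def __init__(self, targets):
            self.targets = targets
            self.cursor = (0.5, 0.5)      # cursor starts at the pad center

        def on_drag(self, x: float, y: float) -> None:
            self.cursor = (x, y)          # dragging only moves the cursor

        def on_tap(self):
            # tapping commits whatever target the cursor is over, if any
            cx, cy = self.cursor
            for t in self.targets:
                if t.contains(cx, cy):
                    return t.name
            return None

    ui = DragNTap([Target("OK", 0.1, 0.4, 0.3, 0.2),
                   Target("Cancel", 0.6, 0.4, 0.3, 0.2)])
    ui.on_drag(0.2, 0.5)
    print(ui.on_tap())                    # -> "OK"
    ```

    Separating the drag (pointing) from the tap (commit) avoids accidental selections while the finger slides across the pad, which is presumably why such two-phase techniques are attractive on an eyes-free front touchpad.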

    Gaze and Gestures in Telepresence: multimodality, embodiment, and roles of collaboration

    This paper proposes a controlled experiment to further investigate the usefulness of gaze awareness and gesture recognition in the support of collaborative work at a distance. We propose to redesign experiments conducted several years ago with more recent technology that would: a) enable better study of the integration of communication modalities, b) allow users to move freely while collaborating at a distance, and c) avoid asymmetries of communication between collaborators.
    Comment: Position paper, International Workshop New Frontiers in Telepresence 2010, part of CSCW 2010, Savannah, GA, USA, 7 February 2010. http://research.microsoft.com/en-us/events/nft2010

    SymbolDesign: A User-centered Method to Design Pen-based Interfaces and Extend the Functionality of Pointer Input Devices

    A method called "SymbolDesign" is proposed that can be used to design user-centered interfaces for pen-based input devices. It can also extend the functionality of pointer input devices such as the traditional computer mouse or the Camera Mouse, a camera-based computer interface. Users can create their own interfaces by choosing single-stroke movement patterns that are convenient to draw with the selected input device and by mapping them to a desired set of commands. A pattern could be the trace of a moving finger detected with the Camera Mouse or a symbol drawn with an optical pen. The core of the SymbolDesign system is a dynamically created classifier, in the current implementation an artificial neural network whose architecture automatically adjusts to the complexity of the classification task. In experiments, subjects used the SymbolDesign method to design and test their own interfaces, for example, to browse the web. The experiments demonstrated good recognition accuracy and responsiveness of the user interfaces. The method provides an easily designed and easily used computer input mechanism for people without physical limitations and, with some modifications, has the potential to become a computer access tool for people with severe paralysis.
    Funding: National Science Foundation (IIS-0093367, IIS-0308213, IIS-0329009, EIA-0202067)
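    To make the pipeline concrete, the sketch below shows the kind of preprocessing a stroke classifier like this needs: resampling a variable-length single-stroke trace to a fixed-length, position- and scale-invariant feature vector. The resampling and normalization steps are standard stroke-recognition practice, assumed here rather than taken from the paper; the adaptive neural network itself is omitted.

    ```python
    # Sketch: turn a raw single-stroke trace into a fixed-length feature
    # vector a small classifier can consume. Standard practice, assumed
    # rather than taken from the SymbolDesign paper.
    import math

    def resample(stroke, n=16):
        """Resample a list of (x, y) points to n points equally spaced
        along the path length."""
        dists = [math.dist(a, b) for a, b in zip(stroke, stroke[1:])]
        step = sum(dists) / (n - 1)
        out, acc = [stroke[0]], 0.0
        for (ax, ay), (bx, by), d in zip(stroke, stroke[1:], dists):
            while d > 0 and acc + d >= step:
                t = (step - acc) / d
                ax, ay = ax + t * (bx - ax), ay + t * (by - ay)
                out.append((ax, ay))
                d -= step - acc
                acc = 0.0
            acc += d
        while len(out) < n:                # guard against rounding shortfall
            out.append(stroke[-1])
        return out[:n]

    def features(stroke, n=16):
        """Translate to the centroid and scale to unit size so the vector
        is invariant to where and how large the symbol was drawn."""
        pts = resample(stroke, n)
        cx = sum(x for x, _ in pts) / n
        cy = sum(y for _, y in pts) / n
        s = max(max(abs(x - cx), abs(y - cy)) for x, y in pts) or 1.0
        return [v for x, y in pts for v in ((x - cx) / s, (y - cy) / s)]

    # A roughly horizontal line traced with a mouse or the Camera Mouse:
    print(features([(0, 0), (50, 2), (100, 0)])[:6])
    ```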

    Touch Screen Avatar English Learning System For University Students Learning Simplicity

    This paper discusses a touch screen avatar for an English language learning application. The system combines an avatar acting as an Animated Pedagogical Agent (APA) with a touch screen application built on up-to-date gesture-based computing, an approach with the potential to change how we learn by reducing the number of Information and Communication Technology (ICT) devices used during teaching and learning. The key is interaction between university students and the touch screen avatar system, together with learning resources available anytime, anywhere (24/7), so students can study according to their own time preferences and in their own comfort, outside the traditional classroom. Students are given a learning tool that lets them learn interactively, following current trends and personalized to their own interests. In addition, their performance is monitored from a distance and evaluated without disrupting their learning or making them feel controlled. Students are thus expected to have a lower affective filter level, which may enhance the way they learn unconsciously. Keywords: Gesture-Based Computing, Avatar, Portable Learning Tool, Interactivity, Language Learning

    Prototype gesture recognition interface for vehicular head-up display system


    Biopsym: a learning environment for transrectal ultrasound-guided prostate biopsies

    This paper describes a learning environment for image-guided prostate biopsies in cancer diagnosis; it is based on an ultrasound probe simulator that virtually explores real datasets obtained from patients. The aim is to make the training of young physicians easier and faster with a tool that combines lectures, biopsy simulations, and recommended exercises for mastering this medical gesture. In particular, it helps trainees acquire the three-dimensional representation of the prostate needed to practice biopsy sequences. The simulator uses haptic feedback to compute the position of the virtual probe within three-dimensional (3D) recorded ultrasound data. This paper presents the current version of this learning environment.
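    As a concrete picture of what such a simulator has to do, the sketch below cuts an oblique 2D slice out of a recorded 3D ultrasound volume given a probe pose, which is one plausible reading of how the haptic device's position drives the displayed image. The volume shape, pose format, and slice size are illustrative assumptions, not details from Biopsym.

    ```python
    # Sketch: sample the 2D image a virtual probe would see from a recorded
    # 3D ultrasound volume. A plausible reading of the abstract, not the
    # actual Biopsym implementation; all sizes are illustrative.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def probe_slice(volume, origin, u_axis, v_axis, size=128, spacing=0.5):
        """Sample a size x size image from `volume` (a z, y, x voxel grid).
        `origin` is the slice center in voxel coordinates; `u_axis` and
        `v_axis` are unit vectors spanning the imaging plane, which the
        probe pose would supply."""
        u = np.linspace(-size / 2, size / 2, size) * spacing
        uu, vv = np.meshgrid(u, u)
        # voxel coordinates of every pixel in the slice plane
        coords = (origin[:, None, None]
                  + u_axis[:, None, None] * uu
                  + v_axis[:, None, None] * vv)
        # trilinear interpolation into the recorded volume
        return map_coordinates(volume, coords, order=1, mode="nearest")

    vol = np.random.rand(64, 64, 64)          # stand-in for patient data
    center = np.array([32.0, 32.0, 32.0])
    u_ax = np.array([0.0, 1.0, 0.0])          # plane spanned by y ...
    v_ax = np.array([0.0, 0.0, 1.0])          # ... and x: an axial slice
    print(probe_slice(vol, center, u_ax, v_ax).shape)   # (128, 128)
    ```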