466 research outputs found

    Designing, testing and adapting navigation techniques for the immersive web

    One of the most essential interactions in Virtual Reality (VR) is the user’s ability to move around and explore the virtual environment. The design of the navigation technique plays a crucial role in the user experience since it determines key usability aspects. VR devices allow for an immersive exploration of 3D worlds, but navigation in VR is challenging for many users due to potential usability issues related to specific VR controllers, user skills, and motion sickness. Although hundreds of interaction techniques have been proposed for this task, VR navigation still poses a high entry barrier for many users. In this paper we argue that adapting the navigation technique to its context of use can lead to substantial improvements in navigation usability and accessibility. The context of use includes the type of scene, the available physical space, and the profile of the user. We present a test platform to facilitate the design and fine-tuning of interaction techniques for 3D navigation. We focus on mainstream VR devices (headsets and controllers) and support the most common navigation metaphors (walking, flying, teleportation). The key idea is to let developers specify, at runtime, the exact mapping between user actions and locomotion changes for any of the supported metaphors. Such mappings are described by a collection of parameters (e.g. maximum speed) whose values can be adjusted interactively through a GUI, or be provided by user-defined code which can be edited at runtime. Feedback obtained from developers suggests that this approach can be used to quickly adapt the navigation techniques to various people, including persons with no previous 3D navigation skills, elderly people, and people with disabilities, as well as to the type, size and semantics of the virtual environment. This work has been funded by MCIN/AEI/10.13039/501100011033/FEDER ‘‘A way to make Europe’’. Pedret model partially funded by EU Horizon 2020, JPICH Conservation, Protection and Use initiative (JPICH-0127) and the Spanish Agencia Estatal de Investigación, grant PCI2020-111979 Enhancement of Heritage Experiences: the Middle Ages; Digital Layered Models of Architecture and Mural Paintings over Time (EHEM).
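
    The paper's actual API is not given in the abstract, but a minimal sketch of the kind of runtime-tunable action-to-locomotion mapping it describes might look as follows. All names and constants here are hypothetical, loosely modelled on the "maximum speed" parameter mentioned above:

```python
from dataclasses import dataclass

@dataclass
class FlyingParams:
    """Runtime-tunable parameters for a hypothetical 'flying' metaphor."""
    max_speed: float = 3.0   # metres/second at full stick deflection (assumed)
    dead_zone: float = 0.1   # deflections below this are ignored
    smoothing: float = 0.2   # exponential smoothing to soften speed jumps

def stick_to_speed(deflection: float, params: FlyingParams, prev_speed: float) -> float:
    """Map a thumbstick deflection in [0, 1] to a forward speed."""
    if deflection < params.dead_zone:
        target = 0.0
    else:
        # Ramp from 0 at the dead-zone edge up to max_speed at full deflection.
        span = 1.0 - params.dead_zone
        target = params.max_speed * (deflection - params.dead_zone) / span
    # Smooth speed changes, which helps reduce motion sickness.
    return prev_speed + params.smoothing * (target - prev_speed)

# A GUI slider (or user-edited code) could retune the mapping at runtime,
# e.g. halving the top speed for a user prone to motion sickness:
params = FlyingParams(max_speed=1.5)
print(stick_to_speed(0.8, params, prev_speed=0.0))
```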

    Augmenting User Interfaces with Haptic Feedback

    Computer assistive technologies have developed considerably over the past decades. Advances in computer software and hardware have provided motion-impaired operators with much greater access to computer interfaces. For people with motion impairments, the main difficulty in the communication process is the input of data into the system. For example, the use of a mouse or a keyboard demands a high level of dexterity and accuracy. Traditional input devices are designed for able-bodied users and often do not meet the needs of someone with disabilities. As the key feature of most graphical user interfaces (GUIs) is to point-and-click with a cursor, this can make a computer inaccessible for many people. Human-computer interaction (HCI) is an important area of research that aims to improve communication between humans and machines. Previous studies have identified haptics as a useful method for improving computer access. However, traditional haptic techniques suffer from a number of shortcomings that have hindered their inclusion in real-world software. The focus of this thesis is to develop haptic rendering algorithms that will permit motion-impaired operators to use haptic assistance with existing graphical user interfaces. The main goal is to improve interaction by reducing error rates and improving targeting times. A number of novel haptic assistive techniques are presented that utilise the three degrees-of-freedom (3DOF) capabilities of modern haptic devices to produce assistance designed specifically for motion-impaired computer users. To evaluate the effectiveness of the new techniques, a series of point-and-click experiments were undertaken in parallel with cursor analysis to compare the levels of performance. The task required the operator to produce a predefined sentence on the densely populated Windows on-screen keyboard (OSK). The results of the study show that higher performance levels can be achieved using techniques that are less constricting than traditional assistance.
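
    The thesis's novel 3DOF algorithms are not reproduced in the abstract, but a gravity well, one of the traditional haptic assistance techniques this line of work builds on, can be sketched in a few lines. The 2D screen-space formulation and all constants below are illustrative, not taken from the thesis:

```python
import math

def gravity_well_force(cursor, target, radius=40.0, max_force=1.5):
    """Attractive force pulling a haptic cursor toward a target centre.

    cursor, target: (x, y) screen positions in pixels.
    radius: distance in pixels within which the well is active.
    max_force: peak force (arbitrary device units) near the centre.
    """
    dx, dy = target[0] - cursor[0], target[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > radius:
        return (0.0, 0.0)  # outside the well (or dead centre): no assistance
    # Force grows linearly as the cursor approaches the target centre,
    # directed along the unit vector from cursor to target.
    magnitude = max_force * (1.0 - dist / radius)
    return (magnitude * dx / dist, magnitude * dy / dist)

# Each haptic frame, the force would be fed to the device so the cursor
# is gently drawn onto the intended on-screen keyboard key.
print(gravity_well_force(cursor=(100, 100), target=(110, 100)))
```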

    Gaze modulated disambiguation technique for gesture control in 3D virtual objects selection

    © 2017 IEEE. Multimodal input provides more natural ways to interact with a virtual 3D environment. An emerging technique that integrates gaze-modulated pointing with mid-air gesture control enables fast target acquisition and rich control expressions. The performance of this technique relies on eye-tracking accuracy, which is not yet comparable with that of traditional pointing techniques (e.g., the mouse). This causes problems when fine-grained interactions are required, such as selecting in a dense virtual scene where proximity and occlusion are prone to occur. This paper proposes a coarse-to-fine solution to compensate for the degradation introduced by eye-tracking inaccuracy, using a gaze cone to detect ambiguity and then a gaze probe for decluttering. It was tested in a comparative experiment involving 12 participants and 3240 runs. The results show that the proposed technique enhanced selection accuracy and user experience, although its efficiency can still be improved. This study contributes a robust multimodal interface design supported by both eye tracking and mid-air gesture control.
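
    The paper's exact geometry is not given in the abstract, but the ambiguity-detection step of a gaze cone can be sketched as a simple angular test: cast a cone around the gaze ray and count how many object centres fall inside it. The function name, the scene, and the 3-degree half-angle are all assumptions:

```python
import numpy as np

def objects_in_gaze_cone(origin, gaze_dir, objects, half_angle_deg=3.0):
    """Return names of objects whose centres lie inside a cone around the gaze ray."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    cos_limit = np.cos(np.radians(half_angle_deg))
    hits = []
    for name, centre in objects.items():
        to_obj = np.asarray(centre, dtype=float) - origin
        dist = np.linalg.norm(to_obj)
        # Inside the cone if the angle to the gaze ray is below the half-angle.
        if dist > 0 and np.dot(to_obj / dist, gaze_dir) >= cos_limit:
            hits.append(name)
    return hits

scene = {"cube": (0.10, 0.00, 2.0), "sphere": (0.12, 0.02, 2.0), "cone": (1.50, 0.00, 2.0)}
hits = objects_in_gaze_cone(np.zeros(3), np.array([0.05, 0.0, 1.0]), scene)
if len(hits) > 1:
    # More than one candidate: hand over to the finer 'gaze probe' stage.
    print("ambiguous, declutter:", hits)  # e.g. ['cube', 'sphere']
```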

    A hotkey interaction technique that promotes hotkeys

    Hotkeys provide fast interactions that support expert performance. Compared to traditional pointer-based selection of commands, hotkeys have the advantage of reducing task completion time. However, research shows that users tend to favor menu selections. This is partially caused by how hotkeys are displayed in most linear and toolbar menus. This thesis provides a review of key findings from the literature that aim to promote hotkeys. Based on these findings, the thesis develops design criteria for hotkey displays that promote hotkey use. It also proposes a new interaction technique which displays hotkeys on the keyboard itself. Finally, a cognitive model is constructed to describe a user’s decision-making process when choosing between hotkeys and pointer-based selections under this new hotkey display technique.
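
    The thesis's actual cognitive model is not reproduced in the abstract, but the trade-off it captures can be illustrated with a toy expected-cost comparison: attempting a hotkey pays off only when recall is likely enough, otherwise the guaranteed menu route wins. All time constants below are invented for illustration:

```python
def expected_hotkey_time(p_recall, t_recall=0.5, t_keypress=0.4, t_menu_fallback=2.0):
    """Expected cost of trying the hotkey: pay the recall time, then either
    press the key or fall back to a menu selection after a recall failure."""
    return t_recall + p_recall * t_keypress + (1.0 - p_recall) * t_menu_fallback

def choose_method(p_recall, t_menu=1.8):
    """Pick whichever route has the lower expected completion time."""
    return "hotkey" if expected_hotkey_time(p_recall) < t_menu else "menu"

# As recall probability rises with practice, the model tips toward hotkeys.
for p in (0.2, 0.5, 0.9):
    print(f"recall probability {p}: {choose_method(p)}")
```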

    Cross-device gaze-supported point-to-point content transfer

    Within a pervasive computing environment, we see content on shared displays that we wish to acquire and use in a specific way, i.e., with an application on a personal device, transferred from point to point. The eyes as input can indicate an intention to interact with a service, providing implicit pointing as a result. In this paper we investigate the use of gaze and manual input for the positioning of gaze-acquired content on personal devices. We evaluate two main techniques: (1) Gaze Positioning, in which content is transferred using gaze, with manual input only confirming actions; and (2) Manual Positioning, in which content is selected with gaze but final positioning is performed by manual input, involving a switch of modalities from gaze to manual input. A first user study compared these techniques applied to direct and indirect manual input configurations: a tablet with touch input and a laptop with mouse input. A second study evaluated the techniques in an application scenario involving distractor targets. Our overall results showed general acceptance and understanding of all conditions, although there were clear individual user preferences dependent on familiarity with, and preference toward, gaze, touch, or mouse input.
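
    As a rough illustration of the first technique (Gaze Positioning), the interaction can be modelled as a small state machine in which gaze performs both acquisition and placement while manual input only confirms. The class and method names below are hypothetical, not from the paper:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    CONTENT_ACQUIRED = auto()

class GazeTransfer:
    """Gaze acquires content from the shared display and supplies the drop
    position on the personal device; a manual tap only confirms each step."""
    def __init__(self):
        self.state = State.IDLE
        self.content = None

    def on_gaze_dwell(self, display_item):
        # Looking at an item on the shared display acquires it.
        if self.state is State.IDLE:
            self.content = display_item
            self.state = State.CONTENT_ACQUIRED

    def on_confirm_tap(self, gaze_point_on_device):
        # The confirming tap places the content where the user is looking.
        if self.state is State.CONTENT_ACQUIRED:
            print(f"placed {self.content!r} at {gaze_point_on_device}")
            self.state, self.content = State.IDLE, None

transfer = GazeTransfer()
transfer.on_gaze_dwell("photo")
transfer.on_confirm_tap((320, 240))
```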

    An investigation into gaze-based interaction techniques for people with motor impairments

    The use of eye movements to interact with computers offers opportunities for people with impaired motor ability to overcome the difficulties they often face using hand-held input devices. Computer games have become a major form of entertainment, and also provide opportunities for social interaction in multi-player environments. Games are also being used increasingly in education to motivate and engage young people. It is important that young people with motor impairments are able to benefit from, and enjoy, them. This thesis describes a program of research conducted over a 20-year period, starting in the early 1990s, that has investigated interaction techniques based on gaze position intended for use by people with motor impairments. The work investigates how to make standard software applications accessible by gaze, so that no particular modification to the application is needed. The work divides into three phases. In the first phase, ways of using gaze to interact with the graphical user interfaces of office applications were investigated, designed around the limitations of gaze interaction. Of these, overcoming the inherent inaccuracies of pointing by gaze at on-screen targets was particularly important. In the second phase, the focus shifted from office applications towards immersive games and on-line virtual worlds. Different means of using gaze position and patterns of eye movements, or gaze gestures, to issue commands were studied. Most of the testing and evaluation studies in this phase, like the first, used participants without motor impairments. The third phase of the work then studied the applicability of the research findings thus far to groups of people with motor impairments and, in particular, the means of adapting the interaction techniques to individual abilities. In summary, the research has shown that collections of specialised gaze-based interaction techniques can be built as an effective means of completing tasks in specific types of games, and how these techniques can be adapted to the differing abilities of individuals with motor impairments.
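
    Two common building blocks in this design space, snapping the gaze point to the nearest plausible target and selecting by dwell, can be sketched as follows. These are generic techniques with illustrative constants, not necessarily the thesis's own methods:

```python
import math

def snap_to_target(gaze_xy, targets, snap_radius=60.0):
    """Snap a noisy gaze point to the nearest on-screen target within a radius."""
    best, best_dist = None, snap_radius
    for name, (tx, ty) in targets.items():
        dist = math.hypot(gaze_xy[0] - tx, gaze_xy[1] - ty)
        if dist < best_dist:
            best, best_dist = name, dist
    return best  # None if nothing is close enough

class DwellSelector:
    """Fire a selection once gaze has rested on one target for dwell_ms."""
    def __init__(self, dwell_ms=800):
        self.dwell_ms = dwell_ms
        self.current = None
        self.since_ms = 0.0

    def update(self, target, now_ms):
        if target != self.current:
            self.current, self.since_ms = target, now_ms  # gaze moved: restart timer
            return None
        if target is not None and now_ms - self.since_ms >= self.dwell_ms:
            self.since_ms = now_ms  # re-arm to avoid repeated firing
            return target           # selection event
        return None

keys = {"A": (100, 200), "B": (180, 200)}
selector = DwellSelector()
for t in (0, 400, 850):  # simulated gaze samples resting near key "A"
    print(selector.update(snap_to_target((105, 195), keys), now_ms=t))
```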

    Intelligent microscope III
