    Exploring the Front Touch Interface for Virtual Reality Headsets

    In this paper, we propose a new interface for virtual reality headsets: a touchpad on the front of the headset. To demonstrate the feasibility of the front touch interface, we built a prototype device, explored the expansion of the VR UI design space, and performed various user studies. We started with preliminary tests to see how intuitively and accurately people can interact with the front touchpad. We then experimented with various user interfaces such as a binary selection, a typical menu layout, and a keyboard. Two-Finger and Drag-n-Tap were also explored to find the appropriate selection technique. As a low-cost, lightweight, and low-power technology, a touch sensor can make an ideal interface for mobile headsets. The front touch area can also be large enough to allow a wide range of interaction types, such as multi-finger interactions. With this novel front touch interface, we pave the way to new virtual reality interaction methods.
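
    As an illustration of how such a selection technique might behave, here is a minimal sketch (not code from the paper) of a Drag-n-Tap style selection on a front touchpad, assuming normalized (x, y) touch coordinates and a hypothetical grid of menu cells; the class names, grid layout, and the drag-then-tap reading of the technique are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class MenuGrid:
        rows: int
        cols: int

        def cell_at(self, x: float, y: float) -> int:
            """Map a normalized touch position (0..1) to a menu cell index."""
            col = min(int(x * self.cols), self.cols - 1)
            row = min(int(y * self.rows), self.rows - 1)
            return row * self.cols + col

    class DragNTap:
        """One plausible reading of Drag-n-Tap: drag to highlight, tap to confirm."""

        def __init__(self, grid: MenuGrid):
            self.grid = grid
            self.highlighted = None

        def on_drag(self, x: float, y: float) -> None:
            self.highlighted = self.grid.cell_at(x, y)   # preview only, nothing committed

        def on_tap(self):
            return self.highlighted                      # commit the highlighted cell

    # Usage: drag across the front touchpad to highlight, then tap to select.
    selector = DragNTap(MenuGrid(rows=2, cols=3))
    selector.on_drag(0.9, 0.1)     # finger near the top-right of the touchpad
    print(selector.on_tap())       # -> 2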

    Pointing Devices for Wearable Computers

    We present a survey of pointing devices for wearable computers, which are body-mounted devices that users can access at any time. Since traditional pointing devices (i.e., mouse, touchpad, and trackpoint) were designed to be used on a steady and flat surface, they are inappropriate for wearable computers. Just as the advent of laptops resulted in the development of the touchpad and trackpoint, the emergence of wearable computers is leading to the development of pointing devices designed for them. However, unlike laptops, wearable computers are operated from different body positions, under different environmental conditions, and for different uses, so researchers have developed a variety of innovative pointing devices characterized by their sensing mechanism, control mechanism, and form factor. We survey a representative set of pointing devices for wearable computers using an “adaptation of traditional devices” versus “new devices” dichotomy and study devices according to their control and sensing mechanisms and form factor. The objective of this paper is to showcase a variety of pointing devices developed for wearable computers and bring structure to the design space for wearable pointing devices. We conclude that a de facto pointing device for wearable computers, unlike laptops, is not likely to emerge.
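
    A minimal sketch of the survey's three classification dimensions expressed as a record type, so that devices can be grouped and compared along them; the enum members and the example device are illustrative assumptions, not entries from the survey.

    from dataclasses import dataclass
    from enum import Enum

    class Sensing(Enum):
        OPTICAL = "optical"
        INERTIAL = "inertial"
        CAPACITIVE = "capacitive"

    class Control(Enum):
        POSITION = "position"    # absolute mapping to cursor position
        RATE = "rate"            # velocity control, trackpoint-like

    class FormFactor(Enum):
        WRIST_WORN = "wrist-worn"
        FINGER_WORN = "finger-worn"
        HANDHELD = "handheld"

    @dataclass
    class PointingDevice:
        name: str
        sensing: Sensing
        control: Control
        form_factor: FormFactor

    # Hypothetical device, used only to show how the dimensions combine:
    device = PointingDevice("ring-mounted touchpad", Sensing.CAPACITIVE,
                            Control.POSITION, FormFactor.FINGER_WORN)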

    Literature Survey on Interaction Techniques for Large Displays

    When designing for large screen displays, designers are forced to deal with cursor tracking issues, interaction over distances, and space management issues. Because the screen can cover a large portion of the user's visual angle, it may be hard for users to begin and complete search tasks even for basic items such as cursors or icons. In addition, maneuvering over long distances and acquiring small targets understandably takes more time than the same interactions on normally sized screens. To deal with these issues, large display researchers have developed increasingly unconventional devices, methods, and widgets for interaction, and systems for space and task management. For tracking cursors, there are techniques that deal with the size and shape of the cursor, as well as the “density” of the cursor. Other techniques help direct the attention of the user to the cursor. For target acquisition on large screens, many researchers saw fit to augment existing 2D GUI metaphors, exploiting Fitts’ law to accomplish this. Some techniques sought to enlarge targets while others sought to enlarge the cursor itself. Still other techniques developed ways of closing the distances on large screen displays. However, many researchers feel that existing 2D metaphors do not and will not work for large screens, and that the community should move to more unconventional devices and metaphors. These unconventional means include the use of eye-tracking, laser pointing, hand-tracking, two-handed touchscreen techniques, and other high-DOF devices. In the end, many of these techniques do provide effective means for interaction on large displays. However, we need to quantify the benefits of these methods and understand them better. The better we understand the advantages and disadvantages of these techniques, the easier it will be to employ them in working large screen systems. We also need to put in place an interaction standard for these large screen systems. This could mean simply supporting desktop events such as pointing and clicking. It may also mean that we need to identify the needs of each domain in which large screens are used and tailor the interaction techniques to that domain.
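
    Since the abstract leans on Fitts’ law, a minimal sketch of its Shannon formulation shows why enlarging either the target or the effective (area) cursor reduces predicted movement time; the coefficients a and b are hypothetical regression values, not figures from any cited study.

    import math

    def fitts_mt(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
        """Predicted movement time MT = a + b * log2(D / W + 1), in seconds."""
        return a + b * math.log2(distance / width + 1)

    # On a large display the distance D is big and the target W is visually small:
    print(fitts_mt(distance=2000, width=32))     # far, small target -> about 1.0 s
    # Enlarging the effective target (or using an area cursor) shrinks D/W:
    print(fitts_mt(distance=2000, width=128))    # same distance, bigger target -> about 0.7 s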

    In-home and remote use of robotic body surrogates by people with profound motor deficits

    By controlling robots comparable to the human body, people with profound motor deficits could potentially perform a variety of physical tasks for themselves, improving their quality of life. The extent to which this is achievable has been unclear due to the lack of suitable interfaces by which to control robotic body surrogates and a dearth of studies involving substantial numbers of people with profound motor deficits. We developed a novel, web-based augmented reality interface that enables people with profound motor deficits to remotely control a PR2 mobile manipulator from Willow Garage, a human-scale, wheeled robot with two arms. We then conducted two studies to investigate the use of robotic body surrogates. In the first study, 15 novice users with profound motor deficits from across the United States controlled a PR2 in Atlanta, GA to perform a modified Action Research Arm Test (ARAT) and a simulated self-care task. Participants achieved clinically meaningful improvements on the ARAT, and 12 of 15 participants (80%) successfully completed the simulated self-care task. Participants agreed that the robotic system was easy to use, was useful, and would provide a meaningful improvement in their lives. In the second study, one expert user with profound motor deficits had free use of a PR2 in his home for seven days. He performed a variety of self-care and household tasks, and also used the robot in novel ways. Taken together, our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates, and that they can gain benefit with only low-level robot autonomy and without invasive interfaces. However, methods to reduce the rate of errors and increase operational speed merit further investigation.
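
    As a rough illustration of the kind of low-level command such a system ultimately issues, here is a minimal ROS sketch (not the authors' web-based AR interface) that publishes a base velocity to a PR2-like robot; the topic name, node name, and rates are assumptions.

    import rospy
    from geometry_msgs.msg import Twist

    def drive_forward(speed: float = 0.1, duration_s: float = 2.0) -> None:
        """Publish a constant forward base velocity for a short time, then stop."""
        pub = rospy.Publisher('/base_controller/command', Twist, queue_size=1)  # assumed topic
        rospy.init_node('remote_teleop_sketch')
        rate = rospy.Rate(10)                    # 10 Hz command stream
        cmd = Twist()
        cmd.linear.x = speed                     # m/s, straight ahead
        end = rospy.Time.now() + rospy.Duration(duration_s)
        while rospy.Time.now() < end and not rospy.is_shutdown():
            pub.publish(cmd)
            rate.sleep()
        pub.publish(Twist())                     # zero velocity = stop

    if __name__ == '__main__':
        drive_forward()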

    OpenBIM-Tango integrated virtual showroom for offsite manufactured production of self-build housing

    As a result of the progressive use of BIM in the AEC sector, the amount of diverse project information is increasing rapidly, necessitating interoperability of tools, compatibility of data, effective collaboration, and sophisticated data management. Media-rich VR and AR environments have been proven to help users better understand design solutions; however, they have not advanced far in supporting interoperability and collaboration. Relying on the capabilities of openBIM and the IFC schema, this study posits that this shortcoming of VR and AR environments could be addressed by a BIM server concept allowing concurrent multi-user, low-latency communication between applications. Successful implementation of this concept can ultimately mitigate the need for advanced technical skills for participation in design processes and facilitate the generation of more useful design solutions through early involvement of stakeholders and end-users in decision making. This paper exemplifies a method for integrating BIM data into immersive VR and AR environments, in order to streamline the design process and provide a pared-down, agnostic openBIM system with low latency and synchronised concurrent user access that gives the “right information to the right people at the right time”. These concepts have been further demonstrated through the development of a prototype openBIM-Tango integrated virtual showroom for offsite manufactured production of self-build housing. The prototype directly includes BIM models and data from the IFC format and interactively presents them to users in both immersive VR and AR environments, including Google Tango-enabled devices. This paper contributes by offering innovative and practical solutions for the integration of openBIM and VR/AR interfaces, which can address interoperability issues in the AEC industry.
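
    A minimal sketch (not the paper's prototype) of pulling element data out of an IFC file with the open-source ifcopenshell library, as a stand-in for the BIM-to-VR/AR data flow described above; the file name and the choice of IfcWall are illustrative.

    import ifcopenshell

    model = ifcopenshell.open("self_build_house.ifc")   # hypothetical IFC export

    # Collect a lightweight payload that a VR/AR client could consume.
    walls = []
    for wall in model.by_type("IfcWall"):
        walls.append({
            "global_id": wall.GlobalId,   # stable IFC identifier
            "name": wall.Name,            # human-readable label, may be None
        })

    print(f"{len(walls)} walls found in the model")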

    Designing Wearable Personal Assistants for Surgeons: An Egocentric Approach


    Novel Interaction Techniques for Mobile Augmented Reality applications. A Systematic Literature Review

    This study reviews the research on interaction techniques and methods that could be applied in mobile augmented reality scenarios. The review is focused on the most recent advances and considers especially the use of head-mounted displays. In the review process, we have followed a systematic approach, which makes the review transparent, repeatable, and less prone to human error than if it were conducted in a more traditional manner. The main research subjects covered in the review are head orientation and gaze-tracking, gestures and body part-tracking, and multimodality, as far as these subjects are related to human-computer interaction. Besides these, a number of other areas of interest are also discussed.
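
    To make one of the technique families named above concrete, here is a minimal sketch of head-orientation pointing with dwell-based selection; the dwell threshold and the notion of a "target under the head ray" are illustrative assumptions, not taken from any reviewed paper.

    import time

    class DwellSelector:
        """Select whatever target the head/gaze ray has rested on for dwell_s seconds."""

        def __init__(self, dwell_s: float = 1.0):
            self.dwell_s = dwell_s
            self.current_target = None
            self.entered_at = 0.0

        def update(self, target_id, now=None):
            """Call once per frame with the target under the head ray (or None)."""
            now = time.monotonic() if now is None else now
            if target_id != self.current_target:
                self.current_target = target_id      # pointer moved to a new target
                self.entered_at = now
                return None
            if target_id is not None and now - self.entered_at >= self.dwell_s:
                self.entered_at = now                # re-arm so it does not fire every frame
                return target_id                     # dwell elapsed: selection fires
            return None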