
    Freehand-Steering Locomotion Techniques for Immersive Virtual Environments: A Comparative Evaluation

    Virtual reality has achieved significant popularity in recent years, and allowing users to move freely within an immersive virtual world has become a critical design factor. The user's interactions are generally designed to increase perceived realism, but locomotion techniques, and how they affect the user's task performance, remain an open issue that is much discussed in the literature. In this article, we evaluate the efficiency and effectiveness of, and user preferences relating to, freehand locomotion techniques designed for an immersive virtual environment, performed through hand gestures tracked by a sensor placed in an egocentric position and experienced through a head-mounted display. Three freehand locomotion techniques were implemented and compared with each other, and with a baseline controller-based technique, through qualitative and quantitative measures. An extensive user study conducted with 60 subjects shows that the proposed methods perform comparably to the controller, and further reveals the users' preference for decoupling locomotion into sub-tasks, even if this means renouncing precision and adapting the interaction to the possibilities of the tracking sensor.
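
    The preferred "decoupled" style splits locomotion into separate sub-tasks, for example one gesture for steering and another for starting and stopping movement. A minimal Python sketch of that idea follows; the specific gestures, speed value, and tracker signals are assumptions made for this example, not the techniques evaluated in the article.

    import math

    # Illustrative sketch (not the article's actual techniques): locomotion is
    # decoupled into two sub-tasks -- one hand gesture sets the travel direction,
    # a separate gesture toggles movement on and off. Gesture names, the speed
    # value, and the tracker signals are assumptions made for this example.

    def locomotion_velocity(palm_yaw_deg, move_gesture_active, speed=1.5):
        """Return a ground-plane velocity (vx, vz) in m/s.

        palm_yaw_deg        -- heading indicated by the tracked hand, in degrees
        move_gesture_active -- True while the user holds the assumed 'move' gesture
        """
        if not move_gesture_active:
            return (0.0, 0.0)                     # speed sub-task: stop
        yaw = math.radians(palm_yaw_deg)          # steering sub-task: direction only
        return (speed * math.sin(yaw), speed * math.cos(yaw))

    print(locomotion_velocity(30.0, True))    # move at 1.5 m/s, 30 degrees right of forward
    print(locomotion_velocity(30.0, False))   # gesture released: stand still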

    Master of Science

    Traditionally, hand rests are used to reduce muscle fatigue and to improve precision in small-workspace dexterous tasks. Dynamic hand rests have been shown to be beneficial for large-workspace planar tasks. However, providing high-bandwidth support in the vertical direction proves to be more challenging than in the horizontal plane: one must decouple the gravitational support of the arm from the intended vertical motion of the user. A vertically moving device, called the Vertical Active Handrest (VAHR), is presented in this thesis. This device dynamically supports the weight of the user's arm over a large workspace to add stability for precision dexterous tasks while providing gravitational support to the arm to reduce fatigue. The goal in developing the VAHR is to integrate its capabilities with the current Active Handrest, which provides dynamic support in the horizontal plane, thus creating a three degree-of-freedom active support device. The VAHR takes control inputs from a force sensor embedded in its armrest and from the tracked position of a tool. Studies were conducted with a variety of controllers and user input strategies to evaluate the VAHR's effectiveness at assisting participants in a single-axis tracking task. An initial pilot test with the VAHR shows no statistical improvement in tracking performance using force-input control modes over conditions in which the arm is unsupported or is supported by a static rest surface. The main experiment presented in this thesis focuses on either pure stylus position input or a combination of position and force inputs. Tracking accuracy significantly improves compared to the unsupported condition while using stylus position input control. Poor performance under pure force control is attributed to the required activation of large muscle groups in the arm to provide force input to the VAHR's instrumented armrest; these large muscle groups are poorly suited for the agile tracking task used for experimentation. It is theorized that performance is better when using the stylus position control modes because inputs come from smaller, more dexterous muscle groups in the hand, allowing the position of the arm to be controlled by muscles that are already adept at precision control.
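
    The abstract names two input channels, force at the instrumented armrest and the tracked position of a grasped stylus, that drive the handrest's vertical motion. Below is a minimal Python sketch of how such inputs might be blended into a vertical velocity command; the gains, saturation limit, and signal names are assumptions for illustration and are not taken from the thesis.

    # Illustrative sketch (not the thesis's controller): a vertical handrest that
    # follows the tracked stylus height with a proportional velocity command,
    # optionally blended with the force measured at the instrumented armrest.
    # Gains, the saturation limit, and signal names are assumptions.

    def vahr_velocity_command(stylus_z, handrest_z, armrest_force,
                              k_pos=2.0, k_force=0.01, v_max=0.05):
        """Return a vertical velocity command for the handrest, in m/s.

        stylus_z      -- tracked height of the grasped tool (m)
        handrest_z    -- current height of the handrest surface (m)
        armrest_force -- vertical force at the instrumented armrest (N)
        """
        v_pos = k_pos * (stylus_z - handrest_z)   # position-input mode: follow the stylus
        v_force = k_force * armrest_force         # force-input contribution (combined mode)
        v = v_pos + v_force
        return max(-v_max, min(v_max, v))         # saturate the command for safety

    # Example: stylus 10 mm above the rest surface, light downward push on the armrest
    print(vahr_velocity_command(0.110, 0.100, -1.0))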

    Doctor of Philosophy

    Humans generally have difficulty performing precision tasks with their unsupported hands. To compensate for this difficulty, people often seek to support or rest their hand and arm on a fixed surface. However, when the precision task needs to be performed over a workspace larger than what can be reached from a fixed position, a fixed support is no longer useful. This dissertation describes the development of the Active Handrest, a device that expands its user's dexterous workspace by providing ergonomic support and precise repositioning motions over a large workspace. The prototype Active Handrest is a planar, computer-controlled support for the user's hand and arm. The device can be controlled through force input from the user, position input from a grasped tool, or a combination of inputs. The control algorithm of the Active Handrest converts the input(s) into device motions through admittance control, where the device's desired velocity is calculated proportionally to the input force or its equivalent. A robotic 2-axis admittance device was constructed as the initial Planar Active Handrest, or PAHR, prototype. Experiments were conducted to optimize the device's control input strategies. Large-workspace shape-tracing experiments were used to compare the PAHR to unsupported, fixed-support, and passive movable-support conditions. The Active Handrest was found to reduce task error and provide better speed-accuracy performance. Next, virtual fixture strategies were explored for the device. From the options considered, a virtual spring fixture strategy was chosen based on its effectiveness. An experiment was conducted to compare the PAHR with its virtual fixture strategy to traditional virtual fixture techniques for a grasped stylus. Virtual fixtures implemented on the Active Handrest were found to be as effective as fixtures implemented on a grasped tool. Finally, a higher degree-of-freedom Enhanced Planar Active Handrest, or E-PAHR, was constructed to provide support for large-workspace precision tasks while more closely following the planar motions of the human arm. Experiments were conducted to investigate appropriate control strategies and device utility. The E-PAHR was found to provide a skill level equal to that of the PAHR with reduced user force input and lower perceived exertion.
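
    The admittance-control idea stated above, commanding a velocity proportional to the user's input force, and the virtual spring fixture found most effective can be illustrated with a short Python sketch. The gains, time step, and path representation below are assumptions made for this example, not values from the dissertation.

    import numpy as np

    # Illustrative sketch of the admittance law and a virtual spring fixture:
    # the handrest's commanded velocity is proportional to the user's input force,
    # and a spring force pulls the device toward a reference path. The gains,
    # time step, and path representation are assumptions, not dissertation values.

    def admittance_step(position, user_force, path_point,
                        admittance_gain=0.002, spring_k=50.0, dt=0.001):
        """Advance the planar handrest position by one control step.

        position   -- current (x, y) of the handrest, in metres
        user_force -- (fx, fy) input force from the user, in newtons
        path_point -- nearest point on the virtual-fixture path, in metres
        """
        position = np.asarray(position, dtype=float)
        user_force = np.asarray(user_force, dtype=float)
        path_point = np.asarray(path_point, dtype=float)

        spring_force = spring_k * (path_point - position)          # virtual spring fixture
        velocity = admittance_gain * (user_force + spring_force)   # admittance: velocity proportional to force
        return position + velocity * dt

    # One step: the user pushes 2 N in +x while the fixture pulls back toward the path
    print(admittance_step((0.0, 0.001), (2.0, 0.0), (0.0, 0.0)))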

    Systematic literature review of hand gestures used in human computer interaction interfaces

    Gestures, widely accepted as humans' natural mode of interaction with their surroundings, have been considered for use in human-computer interfaces since the early 1980s. They have been explored and implemented, with a range of success and maturity levels, in a variety of fields, facilitated by a multitude of technologies. Underpinning gesture theory, however, focuses on gestures performed simultaneously with speech, and the majority of gesture-based interfaces are supported by other modes of interaction. This article reports the results of a systematic review undertaken to identify characteristics of touchless/in-air hand gestures used in interaction interfaces. 148 articles reporting on gesture-based interaction interfaces were reviewed, identified through searching engineering and science databases (Engineering Village, ProQuest, ScienceDirect, Scopus and Web of Science). The goal of the review was to map the field of gesture-based interfaces, investigate the patterns in gesture use, and identify common combinations of gestures for different combinations of applications and technologies. The review suggests that the community is disparate, with little evidence of building upon prior work, and that a fundamental framework of gesture-based interaction is not evident. However, the findings can help inform future developments and provide valuable information about the benefits and drawbacks of different approaches. It was further found that the nature and appropriateness of the gestures used was not a primary factor in gesture elicitation when designing gesture-based systems, and that ease of technology implementation often took precedence.

    Sketching in 3D : towards a fluid space for mind and body

    Thesis (S.M. in Architecture Studies)--Massachusetts Institute of Technology, Dept. of Architecture, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (p. 80-82). This thesis explores a new type of computer-aided sketching tool for 3-dimensional designs. Sketching, as a process, has been used as an effective way to explore and develop ideas in the design process. However, when designers deal with volumetric designs in 3-dimensional space, current sketching means, including traditional free-hand sketching and contemporary computer-aided design (CAD) modeling, have limitations such as dimensional inconsistency and non-intuitive interactions. By observing the roles of sketching in the design process and reviewing the history of design tools, this thesis investigates and proposes new digital methods of 3-dimensional sketching that take advantage of the motion-detecting and computer-vision technology that is widely available today. In this thesis, two prototype tools were developed and compared. The first prototype uses a motion-detecting sensor, a projection screen, and gesture-tracking software; the movement of the user's hands becomes the intuitive interface to shape 3-dimensional objects in the virtual space. The second prototype, developed in collaboration with Nagakura, uses a hand-held tablet computer with a marker-based augmented reality technique. The hand-held device displays the virtual object from desired angles and works as a virtual tool, like a chisel, plane, drill, or glue gun, to shape virtual objects in 3-dimensional space. Testing these two prototypes and comparing the resulting objects and user responses revealed the strengths and weaknesses of these different 3-dimensional sketching environments. The proposed systems provide a possible foundation for novel computer-aided sketching applications that take advantage of both the physical and virtual worlds. By Woongki Sung. S.M. in Architecture Studies.
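
    The first prototype's core mechanism, shaping virtual geometry from tracked hand movement, can be sketched briefly in Python: tracked hand positions are sampled over time into a 3-D stroke that a later modelling step can turn into a surface or volume. The sampling logic, spacing threshold, and data structures below are assumptions for illustration; the thesis does not specify them.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    # Illustrative sketch: accumulate tracked hand positions into a 3-D stroke
    # (polyline), skipping samples that barely move to reduce tracker jitter.
    # The threshold and data structures are assumptions, not the thesis's design.

    Point3 = Tuple[float, float, float]

    @dataclass
    class Stroke3D:
        points: List[Point3] = field(default_factory=list)
        min_spacing: float = 0.005   # metres; drop samples closer than this to the last kept point

        def add_sample(self, hand_position: Point3) -> None:
            """Append a tracked hand position if it has moved far enough since the last point."""
            if self.points:
                last = self.points[-1]
                dist = sum((a - b) ** 2 for a, b in zip(hand_position, last)) ** 0.5
                if dist < self.min_spacing:
                    return
            self.points.append(hand_position)

    # Simulated tracker samples: a short diagonal hand movement through space
    stroke = Stroke3D()
    for i in range(100):
        stroke.add_sample((i * 0.001, i * 0.001, 0.0))
    print(len(stroke.points))   # far fewer points than raw samples after the spacing filter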

    Experience Prototyping for Automotive Applications

    In recent years, we started to define our life through experiences we make instead of objects we buy. To attend a concert of our favorite musician may be more important for us than owning an expensive stereo system. Similarly, we define interactive systems not only by the quality of the display or its usability, but rather by the experiences we can make when using the device. A cell phone is primarily built for making calls and receiving text messages, but on an emotional level it might provide a way to be close to our loved ones, even though they are far away sometimes. When designing interactive technology, we do not only have to answer the question how people use our systems, but also why they use them. Thus, we need to concentrate on experiences, feelings and emotions arising during interaction. Experience Design is an approach focusing on the story that a product communicates before implementing the system. In an interdisciplinary team of psychologists, industrial designers, product developers and specialists in human-computer interaction, we applied an Experience Design process to the automotive domain. A major challenge for car manufacturers is the preservation of these experiences throughout the development process. When implementing interactive systems, engineers rely on technical requirements and a set of constraints (e.g., safety) that oftentimes contradict aspects of the designed experience. To resolve this conflict, Experience Prototyping is an important tool for translating experience stories into an actual interactive product. With this thesis I investigate the Experience Design process, focusing on Experience Prototyping. Within the automotive context, I report on three case studies implementing three kinds of interactive systems, forming and following our approach. I implemented (1) an electric vehicle information system called Heartbeat, communicating the state of the electric drive and the batteries to the driver in an unobtrusive and reassuring way. I integrated Heartbeat into the dashboard of a car mock-up with respect to safety and space requirements while at the same time holding on to the story in order to achieve a consistent experience. With (2) the Periscope I implemented a mobile navigation device enhancing the social and relatedness experiences of the passengers in the car. I built and evaluated several experience prototypes in different stages of the design process and showed that they transported the designed experience throughout the implementation of the system. Focusing on (3) the experience of freehand gestures, GestShare explored this interaction style for in-car and car-to-car social experiences. We designed and implemented gestural prototypes for small but effective social interactions between drivers and evaluated the system in a lab and an in-situ study. The contributions of this thesis are (1) a definition of Experience Prototyping in the automotive domain resulting from a literature review and my own work, showing the importance and feasibility of Experience Prototyping for Experience Design. I (2) contribute three case studies and describe the details of several prototypes as milestones on the way from an experience story to an interactive system. I (3) derive best practices for Experience Prototyping concerning characteristics such as fidelity, resolution and interactivity, as well as evaluation in the lab and in situ in different stages of the process.

    We increasingly define our lives through the things we experience and less through the products we buy. Attending a concert by our favorite musician can matter more to us than owning an expensive stereo system. We likewise no longer judge interactive systems solely by the quality of the display or their usability, but also by the experiences that their use makes possible. The smartphone was developed mainly for making calls and writing messages, but on an emotional level it also offers us a way to be very close to people who matter to us, even when they are sometimes far away. When developing interactive systems, we therefore have to ask not only how but also why they are used. Experiences, feelings and emotions that arise during interaction play an important role here. Experience Design is a discipline that concentrates on the stories a product tells before it is actually implemented. In an interdisciplinary team of psychologists, industrial designers, product developers and human-machine interaction specialists, an Experience Design process was applied in the automotive context. Preserving experiences across the entire development process is a major challenge for car manufacturers. When implementing interactive systems, engineers depend on technical, safety-related and ergonomic requirements that often contradict the designed experience. Providing experience prototypes makes it possible to translate stories into interactive products and thus counteracts this conflict. In this dissertation I examine the Experience Design process with regard to the role of experience prototypes. I report on three case studies in the automotive domain covering the design and implementation of different interactive systems. (1) An information system for electric vehicles, the Heartbeat, makes the state of the electric drive and the charge level of the batteries visually and haptically perceptible to the driver. After several prototypes had been implemented, Heartbeat was integrated into the dashboard of a vehicle mock-up under various technical and safety-related requirements without losing the designed experience. (2) The Periscope is a mobile navigation device that enables social experiences for the passengers and strengthens the feeling of relatedness. By implementing several experience prototypes and evaluating them in different phases of the development process, the designed experiences could be preserved consistently. (3) The GestShare project investigated the potential of freehand gesture interaction in the car, focusing on a relatedness experience for the driver and on social interactions with drivers of other vehicles. Several prototypes were implemented and also evaluated in a real traffic situation. The most important contributions of this dissertation are (1) an in-depth examination and application of experience prototypes in the car and of their relevance for Experience Design, based on a literature review and my own experience within the project; (2) three case studies and a detailed description of several prototypes in different phases of the process; and (3) recommendations on how to create experience prototypes with regard to characteristics such as closeness to the final product, the number of implemented details and interactivity, as well as on evaluation in the laboratory and in actual traffic situations in different phases of the development process.

    An Object-oriented drawing package in smalltalk/v

    Graphics creation applications tend to fall into two categories: bit-mapped paint packages and object-oriented drawing packages. Although each interface has its own unique advantages, few vendors have attempted to integrate the two into a single package, and those who have tried have achieved poor integration, both from the user's perspective and in the underlying mathematical model. In this thesis, I have addressed the issue of integrating bit-mapped and object-oriented interfaces by creating an object-oriented graphics package which provides the user with a consistent interface for creating and manipulating both graphical objects and bit-mapped graphics. The consistency of the interface was facilitated by the consistency of the design, the underlying geometric model, and the implementation, all of which are themselves object-oriented. The package is written in Smalltalk/V for the Macintosh. While the solution for this integration was not derived overnight, the use of object-oriented design principles sped the development of a complex graphical user interface while providing fresh insight into the problem of representing bit-mapped objects. Because Smalltalk enforces the notion that every element in the system is an object, the Smalltalk developer is forced to begin designing his solution purely in terms of objects. This mind-set allowed me to view the point as no other graphics package has presented it: as a unique graphical entity (just as it is in formal geometry) available to the user as a graphical tool. As a result, users of my package are able to enjoy the benefits of both bit-mapped and object-oriented editors without ever abandoning an environment in which every graphical element is an object, in terms of both the interface and the underlying mathematical model.
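
    The architecture described, where every graphical element, including a single point, shares one object interface whether it is geometric or bit-mapped, can be illustrated with a short sketch. The original package is written in Smalltalk/V; the Python class and method names below (including the canvas calls) are illustrative assumptions, not the package's actual API.

    from abc import ABC, abstractmethod

    # Illustrative sketch of the unified object model described above: geometric
    # shapes, bit-mapped regions, and even single points share one GraphicalObject
    # interface, so the editor manipulates them uniformly. Class names and the
    # canvas methods (plot_pixel, stroke_rect, blit) are assumptions.

    class GraphicalObject(ABC):
        @abstractmethod
        def translate(self, dx: float, dy: float) -> None: ...

        @abstractmethod
        def draw(self, canvas) -> None: ...

    class Point(GraphicalObject):
        """A point as a first-class graphical entity, just as in formal geometry."""
        def __init__(self, x: float, y: float):
            self.x, self.y = x, y
        def translate(self, dx, dy):
            self.x += dx
            self.y += dy
        def draw(self, canvas):
            canvas.plot_pixel(round(self.x), round(self.y))

    class Rectangle(GraphicalObject):
        def __init__(self, origin: Point, width: float, height: float):
            self.origin, self.width, self.height = origin, width, height
        def translate(self, dx, dy):
            self.origin.translate(dx, dy)
        def draw(self, canvas):
            canvas.stroke_rect(self.origin.x, self.origin.y, self.width, self.height)

    class BitmapRegion(GraphicalObject):
        """A bit-mapped (paint-style) element wrapped in the same object interface."""
        def __init__(self, origin: Point, pixels):
            self.origin, self.pixels = origin, pixels
        def translate(self, dx, dy):
            self.origin.translate(dx, dy)
        def draw(self, canvas):
            canvas.blit(self.pixels, self.origin.x, self.origin.y)

    # The editor keeps a single list of objects, bit-mapped or geometric alike.
    scene = [Point(1, 2), Rectangle(Point(0, 0), 10, 5)]
    for obj in scene:
        obj.translate(3, 3)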

    Which One is Me?: Identifying Oneself on Public Displays

    While user representations are extensively used on public displays, it remains unclear how well users can recognize their own representation among those of surrounding users. We study the most widely used representations: abstract objects, skeletons, silhouettes and mirrors. In a prestudy (N=12), we identify five strategies that users follow to recognize themselves on public displays. In a second study (N=19), we quantify users' recognition time and accuracy with respect to each representation type. Our findings suggest that there is a significant effect of (1) the representation type, (2) the strategies performed by users, and (3) the combination of both on recognition time and accuracy. We discuss the suitability of each representation for different settings and provide specific recommendations as to how user representations should be applied in multi-user scenarios. These recommendations guide practitioners and researchers in selecting the representation that best fits the deployment's requirements and the user strategies that are feasible in that environment.
