
    Exploring Hand-Based Haptic Interfaces for Mobile Interaction Design

    Visual attention is crucial in mobile environments, not only for staying aware of dynamic situations, but also for safety reasons. However, current mobile interaction design forces the user to focus on the visual interface of the handheld device, thus limiting the user's ability to process visual information from their environment. In response to these issues, a common solution is to encode information with on-device vibrotactile feedback. However, such vibration is transitory and often difficult to perceive while mobile. Another approach is to make visual interfaces even more dominant with smart glasses, which enable head-up interaction on their see-through displays. Yet their input methods raise many concerns regarding social acceptability, preventing them from being widely adopted. There is a need to derive feasible interaction techniques for mobile use while maintaining the user's situational awareness, and this thesis argues that solutions could be derived through the exploration of hand-based haptic interfaces. The objective of this research is to provide multimodal interaction for users to better interact with information while maintaining proper attention to the environment in mobile scenarios. Three research areas were identified. The first is developing expressive haptic stimuli, in which the research investigates how static haptic stimuli could be derived. The second is designing mobile spatial interaction with the user's surroundings as content, which manifests situations in which visual attention to the environment is most needed. The last is interacting with the always-on visual interface on smart glasses, the seemingly ideal solution for mobile applications. The three areas extend along the axis of the demand for visual attention on the interface, from non-visual to always-on visual interfaces.
Interactive prototypes were constructed and deployed in studies for each research area, including two shape-changing mechanisms feasible for augmenting mobile devices and a spatial-sensing haptic glove featuring mid-air hand-gestural interaction with haptic support. The findings across the three research areas highlight the immediate benefits of incorporating hand-based haptic interfaces into applications. First, shape-changing interfaces can provide static and continuous haptic stimuli for mobile communication. Secondly, enabling direct interaction with real-world landmarks through a haptic glove, while leaving visual attention on the surroundings, can result in a more immersive experience. Lastly, users of smart glasses can benefit from the unobtrusive hand-gestural interaction enabled by the isolated tracking technique of a haptic glove. Overall, this work calls for mobile interaction design to consider haptic stimuli beyond on-device vibration, and mobile hardware solutions beyond the handheld form factor. It also invites designers to consider, from an interaction design perspective, how to confront the competition for cognitive resources among multiple tasks

    Eyes-Off Physically Grounded Mobile Interaction

    This thesis explores the possibilities, challenges and future scope for eyes-off, physically grounded mobile interaction. We argue that for interactions with digital content in physical spaces, our focus should not be constantly and solely on the device we are using, but fused with an experience of the places themselves, and the people who inhabit them. Through the design, development and evaluation of a series of novel prototypes we show the benefits of a more eyes-off mobile interaction style. Consequently, we are able to outline several important design recommendations for future devices in this area. The four key contributing chapters of this thesis each investigate separate elements within this design space. We begin by evaluating the need for screen-primary feedback during content discovery, showing how a more exploratory experience can be supported via a less-visual interaction style. We then demonstrate how tactile feedback can improve the experience and the accuracy of the approach. In our novel tactile hierarchy design we add a further layer of haptic interaction, and show how people can be supported in finding and filtering content types, eyes-off. We then turn to explore interactions that shape the ways people interact with a physical space. Our novel group and solo navigation prototypes use haptic feedback for a new approach to pedestrian navigation. We demonstrate how variations in this feedback can support exploration, giving users autonomy in their navigation behaviour, but with an underlying reassurance that they will reach the goal. Our final contributing chapter turns to consider how these advanced interactions might be provided for people who do not have the expensive mobile devices that are usually required. We extend an existing telephone-based information service to support remote back-of-device inputs on low-end mobiles. We conclude by establishing the current boundaries of these techniques, and suggesting where their usage could lead in the future

    An aesthetics of touch: investigating the language of design relating to form

    How well can designers communicate qualities of touch? This paper presents evidence that they have some capability to do so, much of which appears to have been learned, but that at present they make limited use of such language. Interviews with graduate designer-makers suggest that they are aware of and value the importance of touch and materiality in their work, but lack a vocabulary for these qualities comparable to their detailed explanations of other aspects, such as intent or selection of materials. We believe that more attention should be paid to the verbal dialogue that happens in the design process, particularly as other researchers show that making-based learning also has a strong verbal element. However, verbal language alone does not appear to be adequate for a comprehensive language of touch. Graduate designer-makers' descriptive practices combined non-verbal manipulation within verbal accounts. We thus argue that haptic vocabularies do not simply describe material qualities, but rather are situated competences that physically demonstrate the presence of haptic qualities. Such competences are more important than verbal vocabularies in isolation. Design support for developing and extending haptic competences must take this wide range of considerations into account to comprehensively improve designers' capabilities

    The Design of a Haptic Device for Training and Evaluating Surgeon and Novice Laparoscopic Movement Skills

    As proper levels of force application are necessary to ensure patient safety, and training hours with an expert on live subjects are difficult to obtain, enhanced computer-based training is needed to teach the next generation of surgeons. Considering the role of touch in surgery, there is a need for a device capable of discerning the haptic ability of surgical trainees. This need is amplified by minimally invasive surgical techniques, where a surgeon's sense of tissue properties comes not directly through their own hands but indirectly through the tools. A haptic device capable of producing a realistic range of forces and motions that can be used to test the ability of users to replicate salient forces in specific maneuvers is proposed. This device also provides the opportunity to use inexpensive haptic trainers to educate surgeons about proper force application. A novel haptic device was designed and built to provide a simplified analogy of the forces and torques felt during free tool motion and constrained pushing and sweeping with laparoscopic instruments. The device is realized as a single-degree-of-freedom robotic system controlled using real-time computer hardware and software. The details of the device design and the results of testing the design against the specifications are presented. A significant achievement in the design is the use of a two-camera vision system to sense the user placement of the input device. The capability of the device as a first-order screening tool to distinguish between novices and expert surgeons is described
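    The contrast the abstract draws between free tool motion and constrained pushing is commonly rendered in haptic devices with a virtual-wall (spring-damper) model. The sketch below is a generic illustration of that standard technique, not the controller described in the thesis; the function name, wall position, and gains are all hypothetical.

```python
def render_force(position, velocity, wall_pos=0.0, k=500.0, b=2.0):
    """Return the motor force (N) for a virtual wall at wall_pos (m).

    In free tool motion (no penetration of the wall) the output is zero;
    once the tool penetrates the wall, a spring-damper pushback models
    the feel of constrained pushing. Gains k (N/m) and b (N*s/m) are
    illustrative values, not measured tissue properties.
    """
    penetration = position - wall_pos
    if penetration <= 0.0:
        return 0.0  # free motion: the user feels no resistance
    # Constrained pushing: spring term resists penetration,
    # damper term resists velocity to keep the wall stable.
    return -(k * penetration + b * velocity)
```

    In a real single-degree-of-freedom trainer, a loop of this shape would run at a high, fixed rate (commonly around 1 kHz) on the real-time hardware, reading position and velocity from the sensor and commanding the motor each cycle.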

    HandSight: A Touch-Based Wearable System to Increase Information Accessibility for People with Visual Impairments

    Many activities of daily living such as getting dressed, preparing food, wayfinding, or shopping rely heavily on visual information, and the inability to access that information can negatively impact the quality of life for people with vision impairments. While numerous researchers have explored solutions for assisting with visual tasks that can be performed at a distance, such as identifying landmarks for navigation or recognizing people and objects, few have attempted to provide access to nearby visual information through touch. Touch is a highly attuned means of acquiring tactile and spatial information, especially for people with vision impairments. By supporting touch-based access to information, we may help users to better understand how a surface appears (e.g., document layout, clothing patterns), thereby improving the quality of life. To address this gap in research, this dissertation explores methods to augment a visually impaired user’s sense of touch with interactive, real-time computer vision to access information about the physical world. These explorations span three application areas: reading and exploring printed documents, controlling mobile devices, and identifying colors and visual textures. At the core of each application is a system called HandSight that uses wearable cameras and other sensors to detect touch events and identify surface content beneath the user’s finger. To create HandSight, we designed and implemented the physical hardware, developed signal processing and computer vision algorithms, and designed real-time feedback that enables users to interpret visual or digital content. We involve visually impaired users throughout the design and development process, conducting several user studies to assess usability and robustness and to improve our prototype designs. 
The contributions of this dissertation include: (i) developing and iteratively refining HandSight, a novel wearable system to assist visually impaired users in their daily lives; (ii) evaluating HandSight across a diverse set of tasks, and identifying tradeoffs of a finger-worn approach in terms of physical design, algorithmic complexity and robustness, and usability; and (iii) identifying broader design implications for future wearable systems and for the fields of accessibility, computer vision, augmented and virtual reality, and human-computer interaction

    Measuring user experience for virtual reality

    In recent years, Virtual Reality (VR) and 3D User Interfaces (3DUI) have seen a drastic increase in popularity, especially in terms of consumer-ready hardware and software. These technologies have the potential to create new experiences that combine the advantages of reality and virtuality. While the technology for both input and output devices is market-ready, only a few solutions for everyday VR (online shopping, games, or movies) exist, and empirical knowledge about performance and user preferences is lacking. All this makes the development and design of human-centered user interfaces for VR a great challenge. This thesis investigates the evaluation and design of interactive VR experiences. We introduce the Virtual Reality User Experience (VRUX) model based on VR-specific external factors and evaluation metrics such as task performance and user preference. Based on our novel UX evaluation approach, we contribute by exploring the following directions: shopping in virtual environments, as well as text entry and menu control in the context of everyday VR. Along with this, we summarize our findings in design spaces and guidelines for choosing optimal interfaces and controls in VR

    Human factors in instructional augmented reality for intravehicular spaceflight activities, and how gravity influences the setup of interfaces operated by direct object selection

    In human spaceflight, advanced user interfaces are becoming a promising means to facilitate human-machine interaction, enhancing and safeguarding the sequences of intravehicular space operations. Efforts to ease such operations have shown strong interest in novel human-computer interaction technologies such as Augmented Reality (AR). The work presented in this thesis is directed towards a user-driven design for AR-assisted space operations, iteratively solving issues arising from the problem space, including the effect of altered gravity on handling such interfaces