153 research outputs found

    A Review of Interaction Techniques for Immersive Environments

    The recent proliferation of immersive technology has led to the rapid adoption of consumer-ready hardware for Augmented Reality (AR) and Virtual Reality (VR). While this growth has produced a variety of platforms that can offer a richer interactive experience, the advances in technology bring more variability in display types, interaction sensors and use cases. This provides a spectrum of device-specific interaction possibilities, each offering a tailor-made solution for delivering immersive experiences to users, but often with an inherent lack of standardisation across devices and applications. To address this, this paper presents a systematic review and evaluation of explicit, task-based interaction methods in immersive environments. A corpus of papers published between 2013 and 2020 is reviewed to thoroughly explore state-of-the-art user studies that investigate input methods and their implementation for immersive interaction tasks (pointing, selection, translation, rotation, scale, viewport, menu-based and abstract). Focus is given to how input methods have been applied across the spectrum of immersive technology (AR, VR, XR). This is achieved by categorising findings based on display type, input method, study type, use case and task. Results illustrate key trends surrounding the benefits and limitations of each interaction technique and highlight the gaps in current research. The review provides a foundation for understanding current and future directions for interaction studies in immersive environments and, at this pivotal point in XR technology adoption, offers routes forward for achieving more valuable, intuitive and natural interactive experiences.
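    The categorisation scheme the review applies (display type, input method, study type, use case and task) maps naturally onto a small record structure. The sketch below is a hypothetical illustration in Python, not taken from the paper; all names and fields are assumptions based only on the categories listed in the abstract.

```python
# Hypothetical representation of the review's categorisation scheme.
# Field names and example values are illustrative only.
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Task(Enum):
    POINTING = "pointing"
    SELECTION = "selection"
    TRANSLATION = "translation"
    ROTATION = "rotation"
    SCALE = "scale"
    VIEWPORT = "viewport"
    MENU_BASED = "menu-based"
    ABSTRACT = "abstract"

@dataclass
class StudyRecord:
    display_type: str          # e.g. "HMD", "handheld"
    input_method: str          # e.g. "freehand gesture", "controller"
    study_type: str            # e.g. "comparative user study"
    use_case: str              # e.g. "3D modelling"
    tasks: tuple[Task, ...]

def tally_by_input_method(records: list[StudyRecord]) -> Counter:
    # Counting records per input method yields the kind of
    # trend summary such a review reports.
    return Counter(r.input_method for r in records)
```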

    Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies

    How can highly effective and intuitive gesture sets be determined for interactive systems, tailored to end users' preferences? A substantial body of knowledge is available on this topic, among which gesture elicitation studies stand out distinctively. In these studies, end users are invited to propose gestures for specific referents, the functions to control in an interactive system. The vast majority of gesture elicitation studies conclude with a consensus gesture set identified through a process of consensus or agreement analysis. However, the information about specific gesture sets determined for specific applications is scattered across a wide landscape of disconnected scientific publications, which makes it challenging for researchers and practitioners to effectively harness this body of knowledge. To address this challenge, we conducted a systematic literature review and examined a corpus of N=267 studies encompassing a total of 187,265 gestures elicited from 6,659 participants for 4,106 referents. To understand similarities in users' gesture preferences within this extensive dataset, we analyzed a sample of 2,304 gestures extracted from the studies identified in our literature review. Our approach consisted of (i) identifying the context of use represented by end users, devices, platforms, and gesture sensing technology, (ii) categorizing the referents, (iii) classifying the gestures elicited for those referents, and (iv) cataloging the gestures based on their representation and implementation modalities. Drawing from the findings of this review, we propose guidelines for conducting future end-user gesture elicitation studies.
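    The consensus or agreement analysis mentioned above is usually quantified with an agreement-rate formula; a widely used one is the agreement rate AR(r) of Vatavu and Wobbrock (CHI 2015). The sketch below is a minimal illustration of that computation, not code from the reviewed studies.

```python
# Agreement rate AR(r) for one referent r (Vatavu & Wobbrock, 2015):
#   AR(r) = sum_i |P_i|*(|P_i|-1) / (|P|*(|P|-1)),
# where P is the multiset of all proposals for r and the P_i are the
# groups of identical proposals within P.
from collections import Counter

def agreement_rate(proposals: list[str]) -> float:
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Example: 10 participants; 6 propose "swipe-left", 3 "shake", 1 "tap".
# AR = (6*5 + 3*2 + 1*0) / (10*9) = 36/90 = 0.4
print(agreement_rate(["swipe-left"] * 6 + ["shake"] * 3 + ["tap"]))
```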

    Touch-Move-Release: Studies of Surface and Motion Gestures for Mobile Augmented Reality

    Recent advancements in both hardware and software for mobile devices have allowed developers to create better mobile Augmented Reality (AR) experiences, which has led to an increase in the number of mobile AR applications and users engaging in these experiences. However, despite a broad range of mobile AR applications available to date, the majority of the applications we surveyed still primarily use surface gestures, i.e., gesturing on the touch screen surface of the device, as the default interaction method and do not utilise the affordance of three-dimensional user interaction that AR interfaces support. In this research, we have investigated and compared two methods of gesture interaction for mobile AR applications: surface gestures, which are commonly used in mainstream applications, and motion gestures, which take advantage of the spatial information of the mobile device. Our goal is to determine if motion gestures are comparable or even superior to surface gestures for mobile AR applications. To achieve this, we conducted two user studies: an elicitation study and a validation study. The first study recruited twenty-one participants and elicited two sets of gestures, surface and motion, for twelve everyday mobile AR tasks, yielding a total of five hundred and four gestures. The two sets of gestures were classified and compared in terms of goodness, ease of use, and engagement. As expected, the participants' elicited surface gestures were familiar and easy to use, while motion gestures were found to be more engaging. Using design patterns derived from the elicited motion gestures, we propose a novel interaction technique called "TMR" (Touch-Move-Release). We developed a mobile AR game similar to Pokémon GO to validate this new technique and implemented a selected gesture chosen from the two gesture sets. A validation study was conducted with ten participants, and we found that the motion gesture enhanced engagement and provided a better game experience. In contrast, the surface gesture provided higher precision, resulting in higher accuracy, and was easier to use. Finally, we discuss the implications of our findings and give design recommendations for using the elicited gestures.
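    The abstract does not give implementation details for TMR, but the touch-move-release cycle suggests a simple state machine: device motion is tracked only between touch-down and release, and the accumulated displacement drives the action. The following Python sketch is a hypothetical illustration under those assumptions; the event names and pose representation are invented for the example.

```python
# Hypothetical Touch-Move-Release (TMR) recogniser: track device
# motion while the screen is touched; fire the action on release.
from dataclasses import dataclass, field

@dataclass
class TMRRecogniser:
    holding: bool = False
    start_pose: tuple = (0.0, 0.0, 0.0)   # device position at touch-down
    path: list = field(default_factory=list)

    def on_touch_down(self, device_pose):
        self.holding = True
        self.start_pose = device_pose
        self.path = [device_pose]

    def on_device_moved(self, device_pose):
        if self.holding:                   # motion counts only while touching
            self.path.append(device_pose)

    def on_touch_up(self):
        if not self.holding:
            return None
        self.holding = False
        # Displacement between touch and release drives the action,
        # e.g. throw direction/strength in a Pokémon GO-like game.
        dx, dy, dz = (e - s for s, e in zip(self.start_pose, self.path[-1]))
        return ("throw", (dx, dy, dz))
```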

    Designing for Effective Freehand Gestural Interaction


    A comparison of surface and motion user-defined gestures for mobile augmented reality.

    Augmented Reality (AR) technology permits interaction between the virtual and physical worlds. Recent advancements in mobile devices allow for a better mobile AR experience, in turn improving the user adoption rate and increasing the number of mobile AR applications across a wide range of disciplines. Nevertheless, the majority of mobile AR applications that we have surveyed adopted surface gestures as the default interaction method for the AR experience and have not utilised the three-dimensional (3D) spatial interaction supported by AR interfaces. This research investigates two types of gestures for interacting with mobile AR applications: surface gestures, which have been deployed by mainstream applications, and motion gestures, which take advantage of the 3D movement of the handheld device. Our goal is to find out whether there exists a gesture-based interaction suitable for handheld devices that can utilise the 3D interaction of mobile AR applications. We conducted two user studies, an elicitation study and a validation study. In the elicitation study, we elicited two sets of gestures, surface and motion, for mobile AR applications. We recruited twenty-one participants to perform twelve common mobile AR tasks, which yielded a total of five hundred and four gestures. We classified and illustrated the two sets of gestures and compared them in terms of goodness, ease of use, and engagement. The elicitation process yielded two separate sets of user-defined gestures: legacy surface gestures, which were familiar and easy to use for the participants, and motion gestures, which were found to be more engaging. From the design patterns of the motion gestures, we proposed a novel interaction technique for mobile AR called TMR (Touch-Move-Release). To validate our elicited gestures in an actual application, we conducted a second study, for which we developed a mobile AR game similar to Pokémon GO and implemented the selected gestures from the elicitation study. The study was conducted with ten participants, and we found that the motion gesture could provide more engagement and a better game experience. Nevertheless, surface gestures were more accurate and easier to use. We discuss the implications of our findings and give design recommendations on the usage of the elicited gestures. This research can be further explored in the future and used as a "prequel" to the design of better gesture-based interaction techniques for different tasks in various mobile AR applications.
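    A common final step in elicitation studies like this one, after grouping identical proposals, is picking the most frequent gesture per referent to form the user-defined set. Below is a minimal sketch of that step with invented data; real studies also handle ties and similarity grouping.

```python
# Derive a user-defined gesture set: for each referent (task),
# keep the most frequently proposed gesture label.
from collections import Counter

def consensus_set(elicited: dict[str, list[str]]) -> dict[str, str]:
    return {referent: Counter(props).most_common(1)[0][0]
            for referent, props in elicited.items()}

print(consensus_set({
    "select object": ["tap", "tap", "point-and-hold"],
    "move object": ["drag", "tilt-device", "drag"],
}))
# -> {'select object': 'tap', 'move object': 'drag'}
```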

    Barehand Mode Switching in Touch and Mid-Air Interfaces

    Raskin defines a mode as a distinct setting within an interface where the same user input produces results different from those it would produce in other settings. Most interfaces have multiple modes in which input is mapped to different actions, and mode switching is simply the transition from one mode to another. In touch interfaces, the current mode can change how a single touch is interpreted: for example, it could draw a line, pan the canvas, select a shape, or enter a command. In Virtual Reality (VR), a hand gesture-based 3D modelling application may have different modes for object creation, selection, and transformation; depending on the mode, the movement of the hand is interpreted differently. One of the crucial factors determining the effectiveness of an interface is user productivity, and the mode-switching time of different input techniques, whether in a touch interface or a mid-air interface, affects it. Moreover, when touch and mid-air interfaces such as VR are combined, making informed decisions about mode assignment becomes even more complicated. This thesis provides an empirical investigation to characterize the mode-switching phenomenon in barehand touch-based and mid-air interfaces. It explores the potential of using these input spaces together for a productivity application in VR, and it concludes with a step towards defining and evaluating the multi-faceted mode concept, its characteristics and its utility when designing user interfaces more generally.
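    Raskin's definition is easy to make concrete: the same input event dispatches to different actions depending on the interface's current mode. The Python sketch below is an illustrative example only (the modes and actions are invented), showing why the switch itself carries the measurable time cost the thesis investigates.

```python
# One input (a touch drag), different interpretations per mode.
from enum import Enum, auto

class Mode(Enum):
    DRAW = auto()
    PAN = auto()
    SELECT = auto()

def handle_drag(mode: Mode, start, end):
    if mode is Mode.DRAW:
        return f"draw line {start}->{end}"
    if mode is Mode.PAN:
        return f"pan canvas by {tuple(e - s for s, e in zip(start, end))}"
    return f"select region {start}..{end}"

# Identical drag, different meanings; changing Mode between calls is
# the mode switch whose time cost the thesis measures.
print(handle_drag(Mode.DRAW, (0, 0), (5, 5)))
print(handle_drag(Mode.PAN, (0, 0), (5, 5)))
```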

    Sélection et Contrôle à Distance d'Objets Physiques Augmentés (Remote Selection and Control of Augmented Physical Objects)

    Our doctoral research concerns interaction in smart environments, and more specifically the remote selection and control of augmented physical objects. Our objectives are both conceptual, through the construction of a design space, and practical, through the design, development and evaluation of interaction techniques. Through an experimental comparison of two new techniques for selecting physical objects, P2Roll and P2Slide, our results highlight where the user's attention should be directed for efficient and pleasant selection of augmented objects. The remaining work towards completing this research mainly concerns object control and includes (1) evaluating guidance techniques for the gestural control of augmented objects by novice users, and (2) in-situ evaluation of the designed techniques.

    Realidade aumentada para produção assistida em ambiente industrial (Augmented Reality for Assisted Production in an Industrial Environment)

    Smart factories are becoming more and more common, and Augmented Reality (AR) is a pillar of the transition to Industry 4.0 and smart manufacturing. AR can improve many industrial processes, such as training, maintenance, assembly, quality control and remote collaboration. AR has the potential to revolutionize the way information is accessed, used and exchanged, extending users' perception and improving their performance. This work proposes a Pervasive AR tool, created in collaboration with industrial partners, to support the training of operators on industrial shop floors while they perform production operations. A Human-Centered Design (HCD) methodology was used to identify operators' difficulties and challenges and to define requirements. After initial meetings with stakeholders, an AR prototype was designed and developed to allow the configuration and visualization of AR content on the shop floor. Several meetings and user studies were conducted to evaluate the developed tools and improve their usability and features. Comparisons were conducted between the proposed Head-Mounted Display (HMD) solution, the method currently used on the shop floor, and an alternative mobile-based AR solution. The results of the user studies suggest that the proposed AR system can significantly improve the performance of novice operators (up to 70% when compared with the method currently used on the shop floor).

    Exploring user-defined gestures for alternate interaction space for smartphones and smartwatches

    In smartphones and smartwatches, the input space is limited due to their small form factor. Although many studies have highlighted the possibility of expanding the interaction space for these devices, limited work has been conducted on exploring end-user preferences for gestures in the proposed interaction spaces. In this dissertation, I present the results of two elicitation studies that explore end-user preferences for creating gestures in the proposed alternate interaction spaces for smartphones and smartwatches. Using the data collected from the two elicitation studies, I present gestures preferred by end-users for common tasks that can be performed using smartphones and smartwatches. I also present the end-user mental models for interaction in the proposed interaction spaces for these devices, and highlight common user motivations and preferences for the suggested gestures. Based on the findings, I present design implications for incorporating the proposed alternate interaction spaces into smartphones and smartwatches.