
    To Draw or Not to Draw: Recognizing Stroke-Hover Intent in Gesture-Free Bare-Hand Mid-Air Drawing Tasks

    Over the past several decades, technological advancements have introduced new modes of communication with computers, marking a shift away from traditional mouse and keyboard interfaces. While touch-based interactions are abundant today, recent developments in computer vision, body-tracking stereo cameras, and augmented and virtual reality now enable communicating with computers through spatial input in physical 3D space. These techniques are being integrated into design-critical tasks such as sketching and modeling through sophisticated methodologies and specialized instrumented devices. One of the prime challenges in design research is to make this spatial interaction with the computer as intuitive as possible for users. Drawing curves in mid-air with the fingers is a fundamental task with applications to 3D sketching, geometric modeling, handwriting recognition, and authentication; sketching in general is a crucial mode of idea communication between designers. Mid-air curve input is typically accomplished through instrumented controllers, specific hand postures, or pre-defined hand gestures in the presence of depth- and motion-sensing cameras, and the user employs one of these modalities to signal the intention to start or stop sketching. However, beyond lacking robustness, such gestures, postures, and instrumented controllers impose an additional cognitive load on the user during design tasks. To address these problems, the presented research discusses the design, development, and evaluation of data-driven models for intent recognition in non-instrumented, gesture-free, bare-hand mid-air drawing tasks. The research is motivated by a behavioral study demonstrating the need for such an approach, given the lack of robustness and intuitiveness of hand postures and instrumented devices. The main objective is to study how users move during mid-air sketching, develop qualitative insights regarding such movements, and consequently implement a computational approach that determines when the user intends to draw in mid-air without an explicit mechanism (such as an instrumented controller or a specified hand posture). Given the user's recorded hand trajectory, the idea is to classify each recorded point as either hover or stroke; the resulting model labels every point on the user's spatial trajectory. Drawing inspiration from the way users sketch in mid-air, this research first establishes the need for an alternate approach that processes bare-hand mid-air curves in a continuous fashion. It then presents a novel drawing-intent recognition workflow, applied to every recorded drawing point, using three different approaches. The first records mid-air drawing data and develops a classification model based on geometric properties extracted from the recorded data, with the goal of identifying drawing intent from critical geometric and temporal features. The second explores how the model's prediction quality varies as the dimensionality of the mid-air curve input is increased. The third seeks to understand drawing intention from mid-air curves using dimensionality-reduction neural networks such as autoencoders. Finally, the broader implications of this research are discussed, along with potential development areas in the design and research of mid-air interactions.
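
    As a concrete illustration of the first approach, the sketch below classifies each trajectory point as hover or stroke from per-point geometric and temporal features. It is a minimal sketch, not the authors' pipeline: the feature set (speed and curvature), the synthetic data, and the random-forest classifier are all illustrative assumptions.

# Minimal sketch: per-point stroke/hover classification on a mid-air
# hand trajectory. Features and classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(points, timestamps):
    """Per-point geometric/temporal features of a 3D trajectory."""
    velocities = np.gradient(points, timestamps, axis=0)      # (N, 3)
    speed = np.linalg.norm(velocities, axis=1)                # (N,)
    accelerations = np.gradient(velocities, timestamps, axis=0)
    # Curvature of a space curve: |v x a| / |v|^3
    curvature = (np.linalg.norm(np.cross(velocities, accelerations), axis=1)
                 / np.maximum(speed ** 3, 1e-9))
    return np.column_stack([speed, curvature])

# Hypothetical labeled session: 1 = stroke (intent to draw), 0 = hover.
rng = np.random.default_rng(0)
points = rng.normal(size=(200, 3)).cumsum(axis=0) * 0.01      # hand path
timestamps = np.linspace(0.0, 2.0, 200)
labels = (np.sin(np.linspace(0, 6 * np.pi, 200)) > 0).astype(int)

X = extract_features(points, timestamps)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
intent = clf.predict(X)   # hover/stroke decision for every recorded point

    In this sketch, the abstract's second and third approaches would amount to replacing the two hand-crafted features with a higher-dimensional input window and with an autoencoder's latent representation, respectively.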

    Embodied Interactions for Spatial Design Ideation: Symbolic, Geometric, and Tangible Approaches

    Computer interfaces are evolving from mere aids for number crunching into active partners in creative processes such as art and design. This is, to a great extent, the result of the mass availability of new interaction technology such as depth sensing, sensor integration in mobile devices, and increasing computational power. We are now witnessing the emergence of a maker culture that can elevate art and design beyond the purview of enterprises and of professionals such as trained engineers and artists. Materializing this transformation is not trivial: everyone has ideas, but only a select few can bring them to reality. The challenge is the recognition, and the subsequent interpretation, of human actions as design intent.

    Understanding 3D mid-air hand gestures with interactive surfaces and displays: a systematic literature review

    3D gesture-based systems are becoming ubiquitous, and many mid-air hand gestures exist for interacting with digital surfaces and displays. There is no well-defined gesture set for 3D mid-air hand gestures, which makes it difficult to develop applications with consistent gestures. To understand what gestures exist, we conducted the first comprehensive systematic literature review on mid-air hand gestures, following established research methods. The review identified 65 papers in which mid-air hand gestures supported tasks for selection, navigation, and manipulation. We also classified the gestures according to a gesture classification scheme and identified how they have been empirically evaluated. The results provide a richer understanding of which mid-air hand gestures have been designed, implemented, and evaluated in the literature, which can help developers design better user experiences for digital interactive surfaces and displays.

    Virtual kompang: mapping in-air hand gestures for music interaction using gestural musical controller

    The introduction of new gesture interfaces has been expanding the possibilities for creating new Digital Musical Instruments (DMIs). However, the interfaces created so far have mainly focused on modern Western musical instruments such as the piano, drums, and guitar. This paper presents a virtual musical instrument, Virtual Kompang, based on a traditional Malay percussion instrument. The interface design and its implementation are presented, along with the results of a guessability study conducted to elicit end-user hand movements to map onto commands. The study demonstrated the existence of common hand gestures among users when mapping to the selected commands, and a consensus set of gestures is presented as the outcome of this study.
    Keywords: Digital Musical Instrument, Virtual Environment, Gestural Control, Leap Motion, Virtual Instrument
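
    Guessability studies of this kind commonly quantify consensus with an agreement score computed over the gestures users propose for each command, in the formulation popularized by Wobbrock et al. The sketch below shows that computation on hypothetical proposal data; the command and gesture names are invented for illustration and are not the paper's dataset.

# Agreement score for a gesture-elicitation (guessability) study:
# for each command r with proposal set P_r, A_r = sum over groups of
# identical proposals P_i of (|P_i| / |P_r|)^2 (Wobbrock et al.).
from collections import Counter

def agreement(proposals):
    """proposals: gesture labels that users proposed for one command."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Hypothetical elicitation data: command -> proposed gestures.
elicited = {
    "strike-center": ["palm-tap", "palm-tap", "palm-tap", "fist-tap"],
    "strike-edge":   ["finger-tap", "palm-tap", "finger-tap", "finger-tap"],
}
for command, gestures in elicited.items():
    print(command, round(agreement(gestures), 3))
# Gestures whose agreement clears a chosen threshold form the consensus set.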

    Virtual Reality Interfaces for Product Design: Finding User Interface solutions for design creation within Virtual Reality

    The focus of Virtual Reality has shifted from research to widespread adoption in entertainment and in practical areas such as automotive design and architectural visualization. With that, we have to consider the best way to give in-experience control to the user and to design interaction within the interface. Recent studies explore ergonomic considerations and zones of content for VR interfaces, but Virtual Reality interaction design still has a long way to go: today it is done mainly as a projection of 2D screens, with planar interfaces placed in 3D space, largely ignoring the immersive potential of the medium (Alger 2015; Google Developers 2017). Designers who work with 3D objects may find it difficult to make design decisions and validate their concepts based on context and empathy. To help with this, they often prototype, which can take a great deal of time and effort. Virtual Reality can be a tool that improves this process and gives the designer an unconstrained, flexible canvas. By reimagining interactions for Virtual Reality, this thesis aims to create interface tools that help designers explore shape and manipulate their designs.

    Augmented reality experiments with industrial robot in industry 4.0 environment

    The role of the human in the Industrie 4.0 vision is still considered irreplaceable. Therefore, the user interfaces of cyber-physical systems involved in production automation need to be well designed, taking into consideration the requirements of industrial applications. With advances in augmented- and virtual-reality data visualization and novel interaction techniques such as mid-air gestures, these approaches appear suitable for integration into the industrial environment. This paper describes the implementation of an augmented reality application for smart glasses with mid-air gestures, and a smartphone with touch interaction, to compare and evaluate the usage of such interfaces in a production cell comprising an industrial robot. This research has been partially supported by the Technology Agency of the Czech Republic under research program TE01020415 (V3C – Visual Computing Competence Center), and partially funded by the Operational Programme for Competitiveness and Internationalisation – COMPETE 2020 and by FCT – Portuguese Foundation for Science and Technology.

    GIFT: Gesture-Based Interaction by Fingers Tracking, an Interaction Technique for Virtual Environment

    Three-dimensional (3D) interaction is the most plausible form of human interaction inside a Virtual Environment (VE). The rise of Virtual Reality (VR) applications in various domains demands a feasible 3D interface. To ensure immersion in a virtual space, this paper presents an interaction technique in which manipulation is performed through perceptive gestures of the two dominant fingers, the thumb and index finger. Two paper fingertip thimbles are used to trace the states and positions of the fingers with an ordinary camera. Based on the finger positions, the basic interaction tasks of selection, scaling, rotation, translation, and navigation are performed through intuitive finger gestures. Because no gestural database is kept, the feature-free detection of the fingers allows faster interaction. Moreover, the system is user-independent and depends on neither the size nor the color of the user's hand. The technique is implemented for evaluation in a case-study project, Interactions by the Gestures of Fingers (IGF). The IGF application traces finger gestures using OpenCV libraries at the back end; at the front end, the objects of the VE are rendered accordingly using the Open Graphics Library (OpenGL). The system was assessed under moderate lighting conditions by a group of 15 users, and the usability of the technique was further investigated in games. The outcomes of the evaluations revealed that the approach is suitable for VR applications in terms of both cost and accuracy.
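
    The thimble-tracking front end lends itself to a short OpenCV sketch. The abstract does not specify the detection method, so treating the paper thimbles as the two largest bright blobs via grayscale thresholding is an assumption here, as is the pinch-distance-to-scale mapping.

# Illustrative sketch of GIFT-style fingertip-thimble tracking with OpenCV.
# Brightness thresholding is an assumption: the abstract only states that
# paper thimbles are traced by an ordinary camera.
import cv2
import numpy as np

def find_thimbles(frame):
    """Return centroids of the two largest bright blobs (thumb, index)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    tips = []
    for c in sorted(contours, key=cv2.contourArea, reverse=True)[:2]:
        m = cv2.moments(c)
        if m["m00"] > 0:
            tips.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return tips

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    tips = find_thimbles(frame)
    if len(tips) == 2:
        # Map thumb-index separation to a scale factor for a VE object;
        # the divisor is an illustrative constant, not from the paper.
        scale = np.hypot(tips[0][0] - tips[1][0],
                         tips[0][1] - tips[1][1]) / 100.0
        for tip in tips:
            cv2.circle(frame, tip, 8, (0, 255, 0), 2)
    cv2.imshow("GIFT sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()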