
    Exploring At-Your-Side Gestural Interaction for Ubiquitous Environments

    Free-space gestural systems are faced with two major issues: a lack of subtlety due to explicit mid-air arm movements, and the highly effortful nature of such interactions. With the ever-growing ubiquity of interactive devices, displays, and appliances with non-standard interfaces, lower-effort and more socially acceptable interaction paradigms are essential. To address these issues, we explore at-one's-side gestural input. Within this space, we present the results of two studies that investigate the use of side-gesture input for interaction. First, we investigate end-user preference through a gesture elicitation study, present a gesture set, and validate the need for dynamic, diverse, and variable-length gestures. We then explore the feasibility of designing such a gesture recognition system, dubbed WatchTrace, which supports alphanumeric gestures of up to length three with an average accuracy of up to 82%, providing a rich, dynamic, and feasible gestural vocabulary.
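
    The abstract does not spell out WatchTrace's recognition pipeline, so purely as an illustration of how variable-length, stroke-based alphanumeric input might be matched against single-character templates, here is a minimal nearest-neighbour sketch in Python; the resampling size, normalisation, and template dictionary are assumptions, not the authors' implementation.

        import numpy as np

        def resample(points, n=32):
            """Resample a 2-D stroke (k x 2 array) to n equidistant points."""
            pts = np.asarray(points, dtype=float)
            d = np.cumsum(np.r_[0.0, np.linalg.norm(np.diff(pts, axis=0), axis=1)])
            t = np.linspace(0.0, d[-1], n)
            return np.c_[np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])]

        def normalise(stroke):
            """Translate a resampled stroke to its centroid and scale to unit variance."""
            s = resample(stroke)
            return (s - s.mean(axis=0)) / (s.std() + 1e-9)

        def classify_stroke(stroke, templates):
            """Nearest-neighbour match of one stroke against per-character templates."""
            s = normalise(stroke)
            best_label, best_dist = None, float("inf")
            for label, tmpl in templates.items():
                dist = np.linalg.norm(s - normalise(tmpl), axis=1).mean()
                if dist < best_dist:
                    best_label, best_dist = label, dist
            return best_label

        def classify_gesture(strokes, templates):
            """A gesture is one to three strokes; classify each stroke independently."""
            return "".join(classify_stroke(s, templates) for s in strokes[:3])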

    A three person poncho and a set of maracas: designing Ola De La Vida, a co-located social play computer game

    Events that bring people together to play video games as a social experience are growing in popularity across the western world. Amongst these events are ‘play parties,’ temporary social play environments which create unique shared play experiences for attendees unlike anything they could experience elsewhere. This paper explores co-located play experience design and proposes that social play games can lead to the formation of temporary play communities. These communities may last for a single gameplay session, for a whole event, or beyond the event. The paper analyses games designed or enhanced by social play contexts and evaluates a social play game, Ola de la Vida. The research findings suggest that social play games can foster community through the design of game play within the game itself, through curation which enhances their social potential, and through design for ‘semi-spectatorship’, which blurs the boundaries between player and spectator, thus widening the game’s magic circle.

    Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies

    How can highly effective and intuitive gesture sets, tailored to end users’ preferences, be determined for interactive systems? A substantial body of knowledge is available on this topic, among which gesture elicitation studies stand out distinctively. In these studies, end users are invited to propose gestures for specific referents, which are the functions to control for an interactive system. The vast majority of gesture elicitation studies conclude with a consensus gesture set identified following a process of consensus or agreement analysis. However, the information about specific gesture sets determined for specific applications is scattered across a wide landscape of disconnected scientific publications, which poses challenges to researchers and practitioners seeking to effectively harness this body of knowledge. To address this challenge, we conducted a systematic literature review and examined a corpus of N=267 studies encompassing a total of 187,265 gestures elicited from 6,659 participants for 4,106 referents. To understand similarities in users’ gesture preferences within this extensive dataset, we analyzed a sample of 2,304 gestures extracted from the studies identified in our literature review. Our approach consisted of (i) identifying the context of use represented by end users, devices, platforms, and gesture sensing technology, (ii) categorizing the referents, (iii) classifying the gestures elicited for those referents, and (iv) cataloging the gestures based on their representation and implementation modalities. Drawing from the findings of this review, we propose guidelines for conducting future end-user gesture elicitation studies.
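
    For readers unfamiliar with the agreement analysis mentioned above, the short Python sketch below computes an agreement rate for a single referent using the pairwise formulation popularised by Vatavu and Wobbrock; the proposal labels and counts are hypothetical, not data from this review.

        from collections import Counter

        def agreement_rate(proposals):
            """Agreement rate for one referent: the share of proposal pairs that are
            identical, out of all pairs of proposals elicited for that referent."""
            n = len(proposals)
            if n < 2:
                return 0.0
            groups = Counter(proposals)                      # identical proposals grouped
            same_pairs = sum(k * (k - 1) for k in groups.values())
            return same_pairs / (n * (n - 1))

        # Hypothetical proposals from 10 participants for the referent "next page"
        proposals = ["swipe left"] * 6 + ["swipe right"] * 2 + ["tap"] * 2
        print(round(agreement_rate(proposals), 2))           # 0.38 for this sample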

    Supporting public participation through interactive public displays

    A thesis submitted in partial fulfillment of the requirements for the degree of Doctor in Information Management, specialization in Geographic Information Systems. Citizen participation, a key priority of open cities, gives citizens the chance to influence public decision-making. Effectively engaging a broader range of citizens at higher participation levels has long been an issue due to various situational and technical constraints. Traditional public participation formats (e.g. public hearings) are usually blamed for their low accessibility to the general public. The development of Information and Communication Technology brings new methods to engage a broader spectrum of citizens at deeper participation levels during urban planning processes. Interactive public displays, as a public communication medium, hold some key advantages in comparison to other media. Compared to personal devices, public displays turn public spaces into sociable places, where social communication and interaction can be enriched without intentionally or unintentionally excluding some groups’ opinions. Public displays can increase the visibility of public events while remaining more flexible and up-to-date in the information they show. They can also foster collective awareness and support group behavioral change. Moreover, due to their public nature, public displays provide broad accessibility to different groups of citizens. Public displays therefore have great potential to facilitate public participation in urban planning processes. In the light of previous work on public displays, the research goal is to investigate a relatively new form of citizen participation known as Public Display Participation, which refers to the use of public displays for citizen participation in the context of urban planning. The main research question of the thesis is how public displays can be used to facilitate citizen consultation in an urban planning process. First, a systematic literature review is conducted to understand the current achievements and gaps in research on public displays for public participation. Second, an elicitation study is conducted to design end-user-centered interactions with public displays for citizens’ consulting activities. Finally, a usability study is run to evaluate the usability and user experience of public displays for citizen consultation. The main contributions of this thesis can be summarized as: (1) the identification of key challenges and opportunities for future research in using public displays for public participation in urban contexts; (2) two sets of user-defined gestures, phone gestures and hand gestures, for performing eleven consulting activities concerned with examining urban planning designs and giving feedback on design alternatives; and (3) a new approach for using public displays for voting and commenting in urban planning, together with a multi-level evaluation of a prototypical system implementing the proposed approach. Designers and researchers can use the contributions of this thesis to create interactive public displays that support higher levels of public participation, i.e. citizen collaboration and empowerment.
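
    As a purely illustrative sketch of the kind of data model a voting-and-commenting display might sit on, consider the Python fragment below; the class names, design alternatives, and comment text are invented for illustration and are not the thesis prototype.

        from dataclasses import dataclass, field

        @dataclass
        class DesignAlternative:
            """One urban-planning design alternative shown on the public display."""
            name: str
            votes: int = 0
            comments: list = field(default_factory=list)

        class ConsultationBoard:
            """Collects the votes and comments cast at the display for each alternative."""
            def __init__(self, alternatives):
                self.alternatives = {name: DesignAlternative(name) for name in alternatives}

            def vote(self, name):
                self.alternatives[name].votes += 1

            def comment(self, name, text):
                self.alternatives[name].comments.append(text)

            def summary(self):
                """Alternatives ranked by vote count, e.g. for a results screen."""
                return sorted(self.alternatives.values(), key=lambda a: -a.votes)

        board = ConsultationBoard(["Plan A", "Plan B"])      # hypothetical alternatives
        board.vote("Plan A")
        board.comment("Plan A", "More green space, please.")
        print([(a.name, a.votes, len(a.comments)) for a in board.summary()])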

    Discoverable Free Space Gesture Sets for Walk-Up-and-Use Interactions

    Advances in technology are fueling a movement toward ubiquity for beyond-the-desktop systems. Novel interaction modalities, such as free-space or full-body gestures, are becoming more common, as demonstrated by the rise of systems such as the Microsoft Kinect. However, much of the interaction design research for such systems is still focused on desktop and touch interactions. Current thinking on free-space gestures is limited in capability and imagination, and most gesture studies have not attempted to identify gestures appropriate for public walk-up-and-use applications. A walk-up-and-use display must be discoverable, such that first-time users can use the system without any training, as well as flexible and not fatiguing, especially in the case of longer-term interactions. One mechanism for defining gesture sets for walk-up-and-use interactions is a participatory design method called gesture elicitation. This method has been used to identify several user-generated gesture sets and has shown that user-generated sets are preferred by users over those defined by system designers. However, for these studies to be successfully implemented in walk-up-and-use applications, there is a need to understand which components of these gestures are semantically meaningful (i.e. do users distinguish between using their left and right hand, or are those semantically the same thing?). Thus, defining a standardized gesture vocabulary for coding, characterizing, and evaluating gestures is critical. This dissertation presents three gesture elicitation studies for walk-up-and-use displays that employ a novel gesture elicitation methodology, alongside a novel coding scheme for gesture elicitation data that focuses on the features most important to users’ mental models. Generalizable design principles, based on the three studies, are then derived and presented (e.g. changes in speed are meaningful for scroll actions in walk-up-and-use displays but not for paging or selection). The major contributions of this work are: (1) an elicitation methodology that aids users in overcoming biases from existing interaction modalities; (2) a better understanding of the gestural features that matter, i.e. those that capture the intent of the gestures; and (3) generalizable design principles for walk-up-and-use public displays.
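
    To make concrete what a coding scheme over semantically meaningful gesture features might look like, here is a brief Python sketch; the feature set and equivalence rule are illustrative only (the speed rule echoes the design principle quoted above, while collapsing handedness is an assumption), not the dissertation's actual vocabulary.

        from dataclasses import dataclass
        from enum import Enum

        class Handedness(Enum):
            LEFT = "left"
            RIGHT = "right"

        class Speed(Enum):
            SLOW = "slow"
            FAST = "fast"

        @dataclass(frozen=True)
        class GestureCode:
            """One coded elicited gesture, reduced to features assumed to matter to users."""
            motion: str              # e.g. "swipe-up", "circle"
            handedness: Handedness
            speed: Speed

        def semantically_equal(a, b, referent):
            """Equivalence under this illustrative coding: speed is meaningful only for
            scroll referents (per the principle above); left/right hand is collapsed,
            which is an assumption rather than a finding quoted in the abstract."""
            if a.motion != b.motion:
                return False
            if referent == "scroll" and a.speed != b.speed:
                return False
            return True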

    Context-aware gestural interaction in the smart environments of the ubiquitous computing era

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy. Technology is becoming pervasive and current interfaces are not adequate for interaction with the smart environments of the ubiquitous computing era. Recently, researchers have started to address this issue by introducing the concept of the natural user interface, which is mainly based on gestural interactions. Many issues are still open in this emerging domain and, in particular, there is a lack of common guidelines for the coherent implementation of gestural interfaces. This research investigates gestural interactions between humans and smart environments. It proposes a novel framework for the high-level organization of context information. The framework is conceived to support a novel approach using functional gestures to reduce gesture ambiguity and the number of gestures in taxonomies, and to improve usability. In order to validate this framework, a proof of concept has been developed: a prototype implementing a novel method for the view-invariant recognition of deictic and dynamic gestures. Tests have been conducted to assess the gesture recognition accuracy and the usability of the interfaces developed following the proposed framework. The results show that the method provides optimal gesture recognition from very different viewpoints, whilst the usability tests have yielded high scores. Further investigation of the context information has been performed, tackling the problem of user status, understood here as human activity; a technique based on an innovative application of electromyography is proposed. The tests show that the proposed technique achieves good activity recognition accuracy. The context is also treated as system status. In ubiquitous computing, the system can adopt different paradigms: wearable, environmental, and pervasive. A novel paradigm, called the synergistic paradigm, is presented, combining the advantages of the wearable and environmental paradigms. Moreover, it augments the interaction possibilities of the user and ensures better gesture recognition accuracy than the other paradigms.
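
    As a small illustration of the idea of a functional gesture, i.e. one gesture resolved to different commands depending on context rather than one dedicated gesture per function, consider the Python sketch below; the context fields and command table are assumptions for illustration, not the thesis framework.

        from dataclasses import dataclass

        @dataclass
        class Context:
            """High-level context: what the user points at and what they are doing."""
            target_device: str       # e.g. resolved from a deictic (pointing) gesture
            user_activity: str       # e.g. inferred from EMG-based activity recognition

        # One functional gesture ("activate") mapped per context, shrinking the
        # gesture taxonomy compared to one dedicated gesture per appliance/function.
        COMMANDS = {
            ("lamp", "resting"): "dim lights",
            ("lamp", "reading"): "brighten lights",
            ("tv", "resting"): "resume playback",
            ("tv", "exercising"): "pause playback",
        }

        def resolve(gesture, ctx):
            if gesture != "activate":
                return "unrecognised gesture"
            return COMMANDS.get((ctx.target_device, ctx.user_activity), "no mapping")

        print(resolve("activate", Context("lamp", "reading")))   # brighten lights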

    BendableSound: An Elastic Multisensory Surface Using Touch-Based Interactions to Assist Children with Severe Autism During Music Therapy

    Neurological Music Therapy uses live music to improve the sensorimotor regulation of children with severe autism. However, these children often lack musical training and their impairments limit their interactions with musical instruments. In this paper, we present our co-design work that led to the BendableSound prototype: an elastic multisensory surface encouraging users to practice coordination movements when touching a fabric to play sounds. We present the results of a formative study conducted with 18 teachers showing BendableSound was perceived as “usable” and “attractive”. Then, we present a deployment study with 24 children with severe autism showing BendableSound is “easy to use” and may potentially have therapeutic benefits regarding attention and motor development. We propose a set of design insights that could guide the design of natural user interfaces, particularly elastic multisensory surfaces. We close with a discussion and directions for future work.
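
    Purely as a sketch of how touches on an elastic surface could be mapped to sounds (the coordinate normalisation, pentatonic scale, and velocity range below are invented for illustration; the paper does not describe its audio mapping at this level):

        PENTATONIC = [60, 62, 64, 67, 69]      # MIDI notes, C major pentatonic

        def touch_to_note(x_norm, depth_norm):
            """Map a touch to (MIDI note, velocity): horizontal position picks the
            pitch, surface deformation (push depth) picks the loudness.
            Both inputs are assumed normalised to [0, 1] by the sensing layer."""
            index = min(int(x_norm * len(PENTATONIC)), len(PENTATONIC) - 1)
            velocity = int(40 + 87 * max(0.0, min(depth_norm, 1.0)))   # 40..127
            return PENTATONIC[index], velocity

        print(touch_to_note(0.5, 0.8))         # (64, 109) for this sample touch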