Exploring user-defined gestures for lingual and palatal interaction

Abstract

Individuals with motor disabilities can benefit from an alternative means of interacting with the world: using their tongue. The tongue is capable of precise movement within the mouth, allowing individuals to designate targets on the palate. This form of interaction, known as lingual interaction, enables users to perform basic functions by using their tongues to indicate positions. The purpose of this work is to identify the lingual and palatal gestures proposed by end-users. To achieve this goal, we first reviewed the relevant literature, including clinical studies on the motor capacity of the tongue, devices that detect tongue movement, and existing lingual interfaces (e.g., for driving a wheelchair). We then conducted a Gesture Elicitation Study (GES) with twenty-four (N = 24) participants, who proposed lingual and palatal gestures for nineteen (19) Internet of Things (IoT) referents, yielding a corpus of 456 gestures. These gestures were clustered into similarity classes (80 unique gestures) and analyzed by dimension, nature, complexity, thinking time, and goodness-of-fit. Using the Agreement Rate methodology, we present a set of sixteen (16) gestures for a lingual and palatal interface, which serves as a basis for further comparison with gestures suggested by people with disabilities.