Mid-air Gesture Recognition by Ultra-Wide Band Radar Echoes

Abstract

Microwave radar sensors for human-computer interaction offer several advantages over wearable and image-based sensors, such as privacy preservation, high reliability regardless of ambient and lighting conditions, and a larger field of view. However, the raw signals produced by such radars are high-dimensional and complex to process and interpret for gesture recognition. For these reasons, machine learning techniques have mainly been used for gesture recognition, but they require a significant number of gesture templates for training and calibration that are specific to each radar. To address these challenges in the context of mid-air gesture interaction, we introduce a data processing pipeline for hand gesture recognition that adopts a model-based approach combining full-wave electromagnetic modeling and inversion. Thanks to this model, gesture recognition is reduced to handling two dimensions: the hand-radar distance and the relative dielectric permittivity, which depends on the hand only (e.g., its size, surface, electric properties, and orientation). We are developing a software environment that accommodates the main stages of our pipeline up to final gesture recognition. We have already tested it on a dataset of 16 gesture classes with 5 templates per class, recorded with the Walabot, a lightweight, off-the-shelf array radar. We are now studying how well user-defined radar gestures resulting from gesture elicitation studies can be recognized by our gesture recognition engine.

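As a rough illustration of the reduction described above (a sketch, not the paper's implementation), the following Python snippet inverts each echo to a (distance, permittivity) pair by grid search against an assumed forward model, then classifies the resulting 2-D trajectory by nearest-template matching. All names here (invert_echo, forward_model, classify, the template dictionary) are hypothetical, and the grid search and Euclidean matcher merely stand in for the full-wave electromagnetic solver and the recognition engine the abstract refers to.

    import numpy as np

    def invert_echo(echo, distances, permittivities, forward_model):
        # Grid-search inversion (hypothetical sketch): pick the (distance,
        # permittivity) pair whose simulated echo best matches the measured
        # one. `forward_model` stands in for the full-wave electromagnetic
        # solver; any callable (d, eps_r) -> simulated echo array works here.
        best_pair, best_err = None, np.inf
        for d in distances:
            for eps_r in permittivities:
                err = np.linalg.norm(echo - forward_model(d, eps_r))
                if err < best_err:
                    best_pair, best_err = (d, eps_r), err
        return best_pair  # 2-D feature: (hand-radar distance, permittivity)

    def resample(traj, n=32):
        # Resample a variable-length 2-D trajectory to n points so that
        # gestures of different durations can be compared directly.
        traj = np.asarray(traj, dtype=float)
        idx = np.linspace(0.0, len(traj) - 1, n)
        return np.stack([np.interp(idx, np.arange(len(traj)), traj[:, k])
                         for k in range(traj.shape[1])], axis=1)

    def classify(trajectory, templates):
        # Nearest-template matching over (distance, permittivity)
        # trajectories with plain Euclidean distance; the actual engine may
        # use a different matcher (e.g., DTW).
        query = resample(trajectory)
        return min(templates, key=lambda label: min(
            np.linalg.norm(query - resample(tpl))
            for tpl in templates[label]))

With five templates per class, as in the dataset mentioned above, classify(gesture, {"swipe": [t1, ...], "push": [t2, ...]}) would return the label whose template set lies closest to the query trajectory.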