
    Analysis of User-Defined Radar-Based Hand Gestures Sensed Through Multiple Materials

    Radar sensing can penetrate non-conducting materials, such as glass, wood, and plastic, which makes it suitable for recognizing gestures in environments with poor visibility, limited accessibility, or privacy sensitivity. While the performance of radar-based gesture recognition in such environments has been extensively researched, the preferences that users express for these gestures are less well known. To analyze such gestures according to both their user preference and their system recognition performance, we conducted three gesture elicitation studies, each with n1 = 30 participants, to identify user-defined, radar-based gestures sensed through three distinct materials: the glass of a shop window, the wood of an office door, and polyvinyl chloride in an emergency. On this basis, we created a new dataset of nine selected gesture classes, each performed twice by n2 = 20 participants and captured by radar through the three materials, i.e., glass, wood, and polyvinyl chloride. To compare recognition rates uniformly across these sensing conditions, we defined a tailored procedure with one-shot radar calibration to train and evaluate a gesture recognizer. 'Wood' achieved the best recognition rate (96.44%), followed by 'Polyvinyl chloride' and 'Glass'. Finally, we performed a preference-performance analysis of the gestures by combining the agreement rate from the elicitation studies with the recognition rate from the evaluation.
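
    The preference-performance analysis pairs, for each gesture class, a preference measure (the agreement rate from the elicitation studies) with a performance measure (the recognition rate from the evaluation). As a minimal sketch, assuming the Vatavu-Wobbrock agreement-rate formula AR(r) = sum_i |P_i|(|P_i| - 1) / (|P|(|P| - 1)) and made-up proposal counts (the abstract does not report per-referent proposals), the Python snippet below shows how one such preference-performance pair could be computed for a single referent.

    from collections import Counter

    def agreement_rate(proposals):
        # Vatavu-Wobbrock agreement rate for one referent:
        # AR(r) = sum_i |P_i|(|P_i| - 1) / (|P| (|P| - 1)),
        # where P is the multiset of proposals and each P_i groups identical proposals.
        n = len(proposals)
        if n < 2:
            return 0.0
        groups = Counter(proposals)
        return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

    # Hypothetical proposals from 30 participants for a single referent.
    proposals = ["swipe"] * 18 + ["circle"] * 8 + ["push"] * 4
    ar = agreement_rate(proposals)

    # Pair preference (agreement rate) with performance (recognition rate);
    # 96.44% is the 'Wood' figure quoted in the abstract, used here only for illustration.
    recognition_rate = 0.9644
    print(f"AR = {ar:.3f}, recognition rate = {recognition_rate:.2%}")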