Tangible fidgeting interfaces for mental wellbeing recognition using deep learning applied to physiological sensor data

Abstract

The momentary assessment of an individual's affective state is critical to the monitoring of mental wellbeing and the ability to instantly apply interventions. This thesis introduces the concept of tangible fidgeting interfaces for affective recognition, from design and development through to evaluation. Tangible interfaces build upon the affordances of familiar physical objects, as the ability to touch and fidget may help to tap into individuals' psychological need to feel occupied and engaged. Embedding digital technologies within such interfaces capitalises on motor and perceptual capabilities and allows for the direct manipulation of data, offering people the potential for new modes of interaction when experiencing mental wellbeing challenges. Tangible interfaces present an ideal opportunity to digitally enable physical fidgeting interactions alongside physiological sensor monitoring, unobtrusively and comfortably measuring non-visible changes in affective state. This opportunity motivated an investigation, using participatory design techniques that engage people in designing solutions relevant to themselves, into the factors that produce more effective intelligent solutions. Adopting an artificial intelligence approach based on physiological signals makes it possible to quantify affect with high accuracy. However, labelling is an indispensable data pre-processing stage required before classification, and it can be extremely challenging with multi-modal sensor data. New techniques are introduced for labelling at the point of collection, coupled with a pilot study and a systematic performance comparison of five custom-built labelling interfaces. When classifying labelled physiological sensor data, individual differences between people limit the generalisability of models. To address this challenge, a transfer learning approach has been developed that personalises affective models using few labelled samples. This approach to personalising models and improving cross-domain performance runs on-device, automating the traditionally manual process and saving time and labour. Furthermore, monitoring trajectories over long periods of time is subject to critical limitations relating to the size of the training dataset; this shortcoming may hinder the development of reliable and accurate machine learning models. A second framework has been developed to overcome the limitation of small training datasets using an image-encoding transfer learning approach. This research offers the first attempt at developing tangible interfaces that use artificial intelligence towards building a real-world continuous affect recognition system, while also offering real-time feedback to serve as interventions. This exploration of affective interfaces has many potential applications to help improve quality of life for the wider population.
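
The abstract does not give implementation details for the few-shot personalisation step, so the following is a minimal sketch only, assuming a PyTorch-style setup: a feature extractor trained on source subjects is frozen, and only a small classification head is fine-tuned on a handful of labelled samples from the target user. The model name, layer sizes, and the personalise helper are all illustrative assumptions, not the thesis's actual method.

```python
import torch
import torch.nn as nn

class AffectNet(nn.Module):
    """Toy affect classifier over windowed physiological features (hypothetical)."""
    def __init__(self, n_features=8, n_classes=3):
        super().__init__()
        self.extractor = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),
        )
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):
        return self.head(self.extractor(x))

def personalise(model, x_few, y_few, epochs=20, lr=1e-3):
    """Fine-tune only the head on a few labelled target-user samples."""
    for p in model.extractor.parameters():
        p.requires_grad = False              # keep source-domain features fixed
    opt = torch.optim.Adam(model.head.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_few), y_few)
        loss.backward()
        opt.step()
    return model

# Usage with synthetic stand-in data: five labelled windows from the new user.
model = AffectNet()                          # assume weights pretrained on source subjects
x_few = torch.randn(5, 8)                    # e.g. per-window physiological summary features
y_few = torch.randint(0, 3, (5,))            # affect labels gathered at collection time
personalise(model, x_few, y_few)
```

Freezing the extractor is what keeps the on-device update cheap: only the small head's parameters are optimised, which is why a few labelled samples can suffice.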
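For the image-encoding transfer learning framework, the abstract names only an "image-encoding" approach; one common encoding for turning 1-D sensor windows into 2-D images is the Gramian Angular Summation Field (GASF), sketched below as an assumption for illustration. Encoding signals as images lets small datasets reuse large pretrained image CNNs.

```python
import numpy as np

def gasf(signal: np.ndarray) -> np.ndarray:
    """Encode a 1-D signal as a Gramian Angular Summation Field image."""
    lo, hi = signal.min(), signal.max()
    x = 2 * (signal - lo) / (hi - lo) - 1    # rescale to [-1, 1]
    x = np.clip(x, -1, 1)                    # guard against floating-point drift
    phi = np.arccos(x)                       # angle in polar coordinates
    # GASF[i, j] = cos(phi_i + phi_j): a symmetric image of pairwise angular sums
    return np.cos(phi[:, None] + phi[None, :])

# Usage: a 64-sample sensor window becomes a 64x64 image that a pretrained
# CNN backbone (e.g. one trained on ImageNet) could be fine-tuned on.
window = np.sin(np.linspace(0, 4 * np.pi, 64))   # stand-in physiological window
image = gasf(window)
print(image.shape)                                # (64, 64)
```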
