
    An Evaluation of Mouse and Keyboard Interaction Indicators towards Non-intrusive and Low Cost Affective Modeling in an Educational Context

    In this paper we propose a series of indicators derived from users' interactions with mouse and keyboard. The goal is to evaluate their use in identifying affective states and behavior changes in an e-learning platform by means of non-intrusive and low-cost methods. The approach we followed studies users' interactions regardless of the task being performed and its presentation, aiming at a solution applicable in any domain. In particular, mouse movements and clicks, as well as keystrokes, were recorded during a math problem-solving activity in which participants not only scored the valence (i.e., pleasure versus displeasure) and arousal (i.e., high versus low activation) of their affective states after each problem using the Self-Assessment Manikin scale, but also typed a description of their own feelings. Using that affective labeling, we evaluated the information provided by the different indicators computed from the original interaction logs. In total, we computed 42 keyboard indicators and 96 mouse indicators.
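Indicators of this kind can be computed directly from timestamped event logs. A minimal sketch follows; the log format, field names, and the particular indicators (travel distance, mean speed, click count, inter-key latency) are illustrative assumptions, not the paper's actual 138-feature set.

```python
import math

# Hypothetical event log: (timestamp_s, event_type, payload).
events = [
    (0.00, "move", (100, 100)),
    (0.10, "move", (130, 140)),
    (0.25, "move", (180, 150)),
    (0.30, "click", None),
    (0.90, "key", "a"),
    (1.05, "key", "b"),
    (1.40, "key", "c"),
]

def mouse_indicators(events):
    """Distance travelled, mean speed, and click count from move/click events."""
    moves = [(t, p) for t, e, p in events if e == "move"]
    dist = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, (x1, y1)), (_, (x2, y2)) in zip(moves, moves[1:])
    )
    duration = moves[-1][0] - moves[0][0] if len(moves) > 1 else 0.0
    clicks = sum(1 for _, e, _ in events if e == "click")
    return {"distance": dist,
            "speed": dist / duration if duration else 0.0,
            "clicks": clicks}

def keyboard_indicators(events):
    """Keystroke count and mean inter-key latency."""
    times = [t for t, e, _ in events if e == "key"]
    gaps = [b - a for a, b in zip(times, times[1:])]
    return {"keystrokes": len(times),
            "mean_latency": sum(gaps) / len(gaps) if gaps else 0.0}
```

Because such features depend only on the raw event stream, they can be extracted without knowing what task the user is performing, which is what makes the approach task- and domain-independent.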

    Predictive biometrics: A review and analysis of predicting personal characteristics from biometric data

    Interest in the exploitation of soft biometrics information has continued to develop over the last decade or so. In comparison with traditional biometrics, which focuses principally on person identification, the idea of soft biometrics processing is to study the utilisation of more general information regarding a system user, which is not necessarily unique. There are increasing indications that this type of data will have great value in providing complementary information for user authentication. However, the authors have also seen a growing interest in broadening the predictive capabilities of biometric data, encompassing both easily definable characteristics such as subject age and, most recently, 'higher level' characteristics such as emotional or mental states. This study presents a selective review of the predictive capabilities, in the widest sense, of biometric data processing, providing an analysis of the key issues still to be adequately addressed if this concept of predictive biometrics is to be fully exploited in the future.

    Exploring Natural Language Processing Methods for Interactive Behaviour Modelling

    Analysing and modelling interactive behaviour is an important topic in human-computer interaction (HCI) and a key requirement for the development of intelligent interactive systems. Interactive behaviour has a sequential (actions happen one after another) and hierarchical (a sequence of actions forms an activity driven by interaction goals) structure, which may be similar to the structure of natural language. Designed around such a structure, natural language processing (NLP) methods have achieved groundbreaking success in various downstream tasks. However, few works have linked interactive behaviour with natural language. In this paper, we explore the similarity between interactive behaviour and natural language by applying an NLP method, byte pair encoding (BPE), to encode mouse and keyboard behaviour. We then analyse the vocabulary, i.e., the set of action sequences, learnt by BPE, as well as use the vocabulary to encode the input behaviour for interactive task recognition. An existing dataset collected in constrained lab settings and our novel out-of-the-lab dataset were used for evaluation. Results show that this natural language-inspired approach not only learns action sequences that reflect specific interaction goals, but also achieves higher F1 scores on task recognition than other methods. Our work reveals the similarity between interactive behaviour and natural language, and presents the potential of applying the new pack of methods that leverage insights from NLP to model interactive behaviour in HCI.
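The core idea can be sketched with a toy BPE learner over action tokens: the most frequent adjacent pair of actions is repeatedly merged into a compound action, so frequent multi-action patterns become single vocabulary entries. The token names and corpus below are invented for illustration and are not the paper's data.

```python
from collections import Counter

# Toy corpus of interaction "sentences": sequences of atomic
# mouse/keyboard actions.
corpus = [
    ["move", "move", "click", "key", "key"],
    ["move", "move", "click", "move", "move", "click"],
    ["key", "key", "move", "move", "click"],
]

def bpe_learn(corpus, num_merges):
    """Greedy BPE: repeatedly merge the most frequent adjacent token
    pair into a single compound action; return the learnt vocabulary
    and the re-encoded corpus."""
    corpus = [list(seq) for seq in corpus]
    vocab = []
    for _ in range(num_merges):
        pairs = Counter()
        for seq in corpus:
            pairs.update(zip(seq, seq[1:]))
        if not pairs:
            break
        (a, b), _ = pairs.most_common(1)[0]
        merged = a + "+" + b
        vocab.append(merged)
        for seq in corpus:
            i = 0
            while i < len(seq) - 1:
                if seq[i] == a and seq[i + 1] == b:
                    seq[i:i + 2] = [merged]
                i += 1
    return vocab, corpus

vocab, encoded = bpe_learn(corpus, 2)
```

After two merges the vocabulary contains "move+move" and "move+move+click": a recurring move-move-click pattern has been promoted to a single token, analogous to a subword unit in NLP. The re-encoded sequences can then feed a downstream task classifier.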

    Behavioral biometrics and ambient intelligence: New opportunities for context-aware applications

    Ambient Intelligence has always been associated with the promise of exciting new applications, aware of the users' needs and state, and proactive towards their goals. However, the acquisition of the necessary information for supporting such high-level learning and decision-making processes is not always straightforward. In this chapter we describe a multi-faceted smart environment for the acquisition of relevant contextual information about its users. This information, acquired transparently through the technological devices in the environment, supports the building of high-level knowledge about the users, including a quantification of aspects such as performance, attention, mental fatigue and stress. The environment described is particularly suited for milieus such as workplaces and classrooms, in which this kind of information may be very important for the effective management of human resources, with advantages for organizations and individuals alike. (Grant UID/CEC/00319/2013.)

    A Review of Emotion Recognition Methods from Keystroke, Mouse, and Touchscreen Dynamics

    Emotion can be defined as a subject’s organismic response to an external or internal stimulus event. The responses could be reflected in pattern changes of the subject’s facial expression, gesture, gait, eye-movement, physiological signals, speech and voice, keystroke, and mouse dynamics, etc. This suggests that on the one hand emotions can be measured/recognized from the responses, and on the other hand they can be facilitated/regulated by external stimulus events, situation changes or internal motivation changes. It is well known that emotion has a close relationship with both physical and mental health, usually affecting an individual’s and a team’s work performance; thus emotion recognition is an important prerequisite for emotion regulation towards better emotional states and work performance. The primary problem in emotion recognition is how to recognize a subject’s emotional states easily and accurately. Currently, there is a body of good research on emotion recognition from facial expression, gesture, gait, eye-tracking, and other physiological signals such as speech and voice, but these approaches are all intrusive and obtrusive to some extent. In contrast, keystroke, mouse and touchscreen (KMT) dynamics data can be collected non-intrusively and unobtrusively as secondary data responding to primary physical actions; thus, this paper aims to review the state-of-the-art research on emotion recognition from KMT dynamics and to identify key research challenges, opportunities and a future research roadmap for referencing. In addition, this paper answers the following six research questions (RQs): (1) what are the commonly used emotion elicitation methods and databases for emotion recognition? (2) which emotions could be recognized from KMT dynamics? (3) what key features are most appropriate for recognizing different specific emotions? (4) which classification methods are most effective for specific emotions? (5) what are the application trends of emotion recognition from KMT dynamics? (6) which application contexts are of greatest concern?

    The Multimodal Tutor: Adaptive Feedback from Multimodal Experiences

    This doctoral thesis describes the journey of ideation, prototyping and empirical testing of the Multimodal Tutor, a system designed to provide digital feedback that supports psychomotor skills acquisition using learning and multimodal data capturing. The feedback is given in real time with machine-driven assessment of the learner's task execution. The predictions are tailored by supervised machine learning models trained with human-annotated samples. The main contributions of this thesis are: a literature survey on multimodal data for learning, a conceptual model (the Multimodal Learning Analytics Model), a technological framework (the Multimodal Pipeline), a data annotation tool (the Visual Inspection Tool) and a case study in Cardiopulmonary Resuscitation training (CPR Tutor). The CPR Tutor generates real-time, adaptive feedback using kinematic and myographic data and neural networks.

    Knowledge extraction from pointer movements and its application to detect uncertainty

    This work was supported by the Doctoral Program NOVA I4H (Fundação para a Ciência e a Tecnologia) [grant PD/BDE/114561/2016]. Pointer-tracking methods can capture a real-time trace at high spatio-temporal resolution of users' pointer interactions with a graphical user interface. This trace is potentially valuable for research on human-computer interaction (HCI) and for investigating perceptual, cognitive and affective processes during HCI. However, little research has reported spatio-temporal pointer features for the purpose of tracking pointer movements in on-line surveys. In two studies, we identified a set of pointer features and movement patterns and showed that these can be easily distinguished. In a third study, we explored the feasibility of using patterns of interactive pointer movements, or micro-behaviours, to detect response uncertainty. Using logistic regression and k-fold cross-validation in model training and testing, the uncertainty model achieved an estimated performance accuracy of 81%. These findings suggest that micro-behaviours provide a promising approach toward developing a better understanding of the relationship between the dynamics of pointer movements and underlying perceptual, cognitive and affective psychological mechanisms. Keywords: human-computer interaction; pointer-tracking; mouse movement dynamics; decision uncertainty; on-line survey; spatio-temporal features; machine learning.
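The evaluation setup described, logistic regression scored with k-fold cross-validation, can be sketched as follows. The synthetic dataset, the two features (hover time and direction flips), and their class-conditional distributions are placeholder assumptions, not the study's real data or feature set.

```python
import math
import random

random.seed(0)

def make_sample(uncertain):
    # Assumed pattern: uncertain responses show longer hover times and
    # more direction changes. These distributions are invented.
    hover = random.gauss(2.0 if uncertain else 0.8, 0.3)
    flips = random.gauss(6.0 if uncertain else 2.0, 1.0)
    return [hover, flips], uncertain

data = [make_sample(i % 2 == 0) for i in range(100)]

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def train_logreg(samples, lr=0.1, epochs=200):
    """Plain SGD on the logistic loss; returns weights and bias."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            g = p - y  # gradient of the log-loss w.r.t. the logit
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

def kfold_accuracy(data, k=5):
    """Mean held-out accuracy over k train/test splits."""
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]
        train = [s for j, f in enumerate(folds) if j != i for s in f]
        w, b = train_logreg(train)
        correct = sum((w[0] * x[0] + w[1] * x[1] + b > 0) == y
                      for x, y in test)
        accs.append(correct / len(test))
    return sum(accs) / k
```

On well-separated synthetic classes like these the cross-validated accuracy is high; the paper's reported 81% reflects the much noisier real pointer data.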