    Puzzle-solving activity as an indicator of epistemic confusion

    When students perform complex cognitive activities, such as solving a problem, epistemic emotions can occur and influence the completion of the task. Confusion is one of these emotions, and it can produce either negative or positive outcomes depending on the situation. For this reason, attending to confusion can be an important factor for educators evaluating students' progression in cognitive activities. However, in digital learning environments, observing students' confusion, as well as other epistemic emotions, can be problematic because of the remoteness of students. The study reported in this article explored new methodologies to assess emotions in a problem-solving task. The experimental task consisted of solving logic puzzles presented on a computer, before and after watching an instructional video depicting a method to solve the puzzle. In parallel with collecting self-reported confusion ratings, human-computer interaction was captured to serve as a non-intrusive measure of emotions. The results revealed that the level of self-reported confusion was negatively correlated with performance on solving the puzzles. In addition, the experience of confusion tended to differ between the pre- and post-video sequences. Before watching the instructional video, the number of clicks on the puzzle was positively correlated with the level of confusion, whereas the correlation was negative after the video. Moreover, the main emotions reported before the video (e.g., confusion, frustration, curiosity) tended to differ from the emotions reported after the video (e.g., engagement, delight, boredom). These results provide insights into the ambivalent impact of confusion in problem-solving tasks, illustrating the dual effect (i.e., positive or negative) of this emotion on activity and performance, as reported in the literature. Applications of this methodology to real-world settings are discussed.
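    The abstract reports correlations between click counts and self-reported confusion that flip sign between the pre- and post-video sequences. The sketch below is only a minimal illustration of that kind of analysis; the paper does not specify which correlation coefficient was used, and the participant data, variable names, and the choice of Spearman's rho here are assumptions.

```python
# Hypothetical sketch: correlating per-participant click counts with
# self-reported confusion ratings, separately for the pre- and post-video
# puzzle sequences. All values and names are illustrative, not study data.
import numpy as np
from scipy.stats import spearmanr

def correlate_clicks_with_confusion(clicks, confusion):
    """Return Spearman's rho and p-value for click counts vs. confusion ratings."""
    rho, p_value = spearmanr(clicks, confusion)
    return rho, p_value

# Illustrative data: one value per participant.
clicks_pre = np.array([12, 30, 25, 40, 18])    # clicks before the video
confusion_pre = np.array([2, 4, 3, 5, 2])      # self-reported confusion (1-5)
clicks_post = np.array([22, 10, 15, 8, 20])    # clicks after the video
confusion_post = np.array([4, 1, 2, 1, 3])

print("pre-video: ", correlate_clicks_with_confusion(clicks_pre, confusion_pre))
print("post-video:", correlate_clicks_with_confusion(clicks_post, confusion_post))
```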

    Immersive Telepresence: A framework for training and rehearsal in a postdigital age

    THE ROLE OF EMOTION IN VISUALIZATION

    The popular notion that emotion and reason are incompatible is no longer defensible. Recent research in psychology and cognitive science has established emotion as a key element in numerous aspects of perception and cognition, including attention, memory, decision-making, risk perception, and creativity. This dissertation centers around the observation that emotion influences many aspects of perception and cognition that are crucial for effective visualization. First, I demonstrate that emotion influences accuracy in fundamental visualization tasks by combining a classic graphical perception experiment (from Cleveland and McGill) with emotion induction procedures from psychology (chapter 3). Next, I expand on the experiments in the first chapter to explore additional techniques for studying emotion and visualization, resulting in an experiment that shows that performance differences between primed individuals persist even as task difficulty increases (chapter 4). In a separate experiment, I show how certain emotional states (i.e. frustration and engagement) can be inferred from visualization interaction logs using machine learning (chapter 5). I then discuss a model for individual cognitive differences in visualization, which situates emotion into existing individual differences research in visualization (chapter 6). Finally, I propose a preliminary model for emotion in visualization (chapter 7).
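    Chapter 5 of this dissertation infers emotional states such as frustration and engagement from visualization interaction logs using machine learning. The following sketch only illustrates that general pipeline under assumptions of its own: the log-derived features, the random-forest classifier, and the toy data are illustrative, not the dissertation's actual feature set or model.

```python
# Hypothetical pipeline: predict an emotional-state label (frustration vs.
# engagement) from features derived from visualization interaction logs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative log-derived features per session:
# [clicks per minute, mean hover time (s), zoom events, mean pause (s)]
X = np.array([
    [35, 0.4, 12, 1.1],
    [ 8, 2.3,  2, 6.5],
    [40, 0.3, 15, 0.9],
    [10, 1.9,  3, 5.8],
    [33, 0.5, 11, 1.4],
    [ 7, 2.1,  1, 7.0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = frustration, 0 = engagement (illustrative labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=3)
print("cross-validated accuracy:", scores.mean())
```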

    Affective Computing for Emotion Detection using Vision and Wearable Sensors

    The research explores the opportunities, challenges, and limitations of, and presents advancements in, computing that relates to, arises from, or deliberately influences emotions (Picard, 1997). The field is referred to as Affective Computing (AC) and is expected to play a major role in the engineering and development of computationally and cognitively intelligent systems, processors and applications in the future. Today the field of AC is bolstered by the emergence of multiple sources of affective data and is fuelled by developments under various Internet of Things (IoT) projects and the fusion potential of multiple sensory affective data streams. The core focus of this thesis is an investigation into whether the sensitivity and specificity (predictive performance) of AC, based on the fusion of multi-sensor data streams, is fit for purpose: can such AC-powered technologies and techniques truly deliver increasingly accurate emotion predictions of subjects in the real world? The thesis begins by presenting a number of research justifications and AC research questions that are used to formulate the original thesis hypothesis and thesis objectives. As part of the research conducted, detailed state-of-the-art investigations explored many aspects of AC from both a scientific and technological perspective. The complexity of AC as a multi-sensor, multi-modality, data fusion problem unfolded during the state-of-the-art research, and this ultimately led to the creation of a conceptual AC architecture that acts as a practical and theoretical foundation for the engineering of future AC platforms and solutions. The conceptual AC architecture developed as a result of this research was applied to the engineering of a series of software artifacts that were combined to create a prototypical AC multi-sensor platform known as the Emotion Fusion Server (EFS), used in the AC experimentation phases of the research. The thesis research used the EFS platform to conduct a detailed series of AC experiments to investigate whether the fusion of multiple sensory sources of affective data from sensory devices can significantly increase the accuracy of emotion prediction by computationally intelligent means. The research involved conducting numerous controlled experiments along with statistical analysis of the performance of sensors for the purposes of AC, the findings of which serve to assess the feasibility of AC in various domains and point to future directions for the AC field. The data investigations conducted in relation to the thesis hypothesis used applied statistical methods and techniques, and the results, analytics and evaluations are presented throughout the two thesis research volumes. The thesis concludes by providing a detailed set of formal findings, conclusions and decisions in relation to the overarching research hypothesis on the sensitivity and specificity of the fusion of vision and wearable sensor modalities, and offers foresight and guidance on the many problems, challenges and projections for the AC field into the future.
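    The abstract describes the Emotion Fusion Server (EFS) as a platform for fusing affective data from vision and wearable sensor streams, but does not detail how the fusion is performed. The sketch below shows one common strategy, feature-level fusion by concatenation, purely as an assumption-laden illustration; the feature names, classifier, and synthetic data are not taken from the thesis.

```python
# Generic feature-level fusion sketch: concatenate per-sample feature vectors
# from a vision stream and a wearable stream before classification.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples = 40

vision_features = rng.normal(size=(n_samples, 6))    # e.g., facial expression descriptors
wearable_features = rng.normal(size=(n_samples, 4))  # e.g., heart rate, skin conductance stats
labels = rng.integers(0, 2, size=n_samples)          # illustrative binary emotion labels

# Feature-level fusion: one combined feature vector per sample.
fused = np.hstack([vision_features, wearable_features])

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(fused, labels)
print("training accuracy:", model.score(fused, labels))
```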

    CorrFeat: Correlation-based feature extraction algorithm using skin conductance and pupil diameter for emotion recognition

    To recognize emotions using less obtrusive wearable sensors, we present a novel emotion recognition method that uses only pupil diameter (PD) and skin conductance (SC). Psychological studies show that these two signals are related to the attention level of humans exposed to visual stimuli. Based on this, we propose a feature extraction algorithm that extracts correlation-based features for participants watching the same video clip. To boost performance given limited data, we implement a learning system without a deep architecture to classify arousal and valence. Our method outperforms not only state-of-the-art approaches, but also widely used traditional and deep learning methods.
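    CorrFeat derives correlation-based features from pupil diameter and skin conductance and classifies arousal and valence with a shallow learner. The sketch below is not the CorrFeat algorithm itself; it only hints at the flavor of correlation-derived features under assumptions of its own (windowed PD-SC Pearson correlations, a logistic-regression classifier, and synthetic signals).

```python
# Hypothetical sketch: windowed PD-SC correlations as features for a shallow
# arousal classifier. Window size, labels, and signals are illustrative only.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression

def windowed_correlations(pd_signal, sc_signal, window=50):
    """Pearson correlation between PD and SC within successive non-overlapping windows."""
    feats = []
    for start in range(0, len(pd_signal) - window + 1, window):
        r, _ = pearsonr(pd_signal[start:start + window],
                        sc_signal[start:start + window])
        feats.append(r)
    return np.array(feats)

rng = np.random.default_rng(1)
n_trials, n_samples = 20, 200
X = np.array([
    windowed_correlations(rng.normal(size=n_samples), rng.normal(size=n_samples))
    for _ in range(n_trials)
])
y = rng.integers(0, 2, size=n_trials)  # illustrative high/low arousal labels

clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
```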