2,232 research outputs found

    When Does Disengagement Correlate with Performance in Spoken Dialog Computer Tutoring?

    In this paper we investigate how student disengagement relates to two performance metrics in a spoken dialog computer tutoring corpus, both when disengagement is measured through manual annotation by a trained human judge and when it is measured through automatic annotation by the system based on a machine learning model. First, we investigate whether manually labeled overall disengagement and six different disengagement types are predictive of learning and user satisfaction in the corpus. Our results show that although students’ percentage of overall disengaged turns negatively correlates both with the amount they learn and with their user satisfaction, the individual types of disengagement correlate differently: some negatively correlate with learning and user satisfaction, while others don’t correlate with either metric at all. Moreover, these relationships change somewhat depending on student prerequisite knowledge level. Furthermore, using multiple disengagement types to predict learning improves predictive power. Overall, these manual label-based results suggest that although adapting to disengagement should improve both student learning and user satisfaction in computer tutoring, maximizing performance requires the system to detect and respond differently based on disengagement type. Next, we present an approach to automatically detecting and responding to user disengagement types based on their differing correlations with correctness. Investigation of our machine learning model of user disengagement shows that its automatic labels negatively correlate with both performance metrics in the same way as the manual labels. The similarity of the correlations across the manual and automatic labels suggests that the automatic labels are a reasonable substitute for the manual labels. Moreover, the significant negative correlations themselves suggest that redesigning ITSPOKE to automatically detect and respond to disengagement has the potential to remediate disengagement and thereby improve performance, even in the presence of noise introduced by the automatic detection process.
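    The first analysis above boils down to a per-student correlation between the fraction of disengaged dialog turns and a performance metric such as learning gain. A minimal sketch of that kind of computation (synthetic data and illustrative variable names, not the authors' corpus or code):

    ```python
    # Sketch of a per-student correlation analysis like the one described above.
    # Data are synthetic and variable names are illustrative assumptions.
    import numpy as np

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences."""
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        return float(np.corrcoef(x, y)[0, 1])

    # Fraction of each student's dialog turns labeled "disengaged"
    pct_disengaged = [0.05, 0.10, 0.20, 0.35, 0.50]
    # Normalized learning gain for the same students (toy values)
    learning_gain = [0.80, 0.70, 0.55, 0.40, 0.20]

    r = pearson_r(pct_disengaged, learning_gain)
    print(round(r, 3))  # strongly negative on this toy data
    ```

    In the paper's setting the same computation would be repeated per disengagement type, which is what allows the types to correlate differently with each metric.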

    Qualitative, quantitative, and data mining methods for analyzing log data to characterize students' learning strategies and behaviors [discussant]

    This symposium addresses how different classes of research methods, all based upon the use of log data from educational software, can facilitate the analysis of students’ learning strategies and behaviors. To this end, four multi-method programs of research are discussed, including the use of qualitative, quantitative-statistical, quantitative-modeling, and educational data mining methods. The symposium presents evidence regarding the applicability of each type of method to research questions of different grain sizes, and provides several examples of how these methods can be used in concert to facilitate our understanding of learning processes, learning strategies, and behaviors related to motivation, meta-cognition, and engagement.

    Log file analysis for disengagement detection in e-Learning environments


    Knowledge or gaming? Cognitive modelling based on multiple-attempt response

    © 2017 International World Wide Web Conference Committee (IW3C2), published under Creative Commons CC BY 4.0 License. Recent decades have witnessed the rapid growth of intelligent tutoring systems (ITS), in which personalized adaptive techniques are successfully employed to improve the learning of each individual student. However, the problem of using cognitive analysis to distill the knowledge and gaming factors from students’ learning histories is still underexplored. To this end, we propose a Knowledge Plus Gaming Response Model (KPGRM) based on multiple-attempt responses. Specifically, we first measure the explicit gaming factor in each multiple-attempt response. Next, we utilize collaborative filtering methods to infer the implicit gaming factor of one-attempt responses. Then we model student learning cognitively by considering both gaming and knowledge factors simultaneously, based on a signal detection model. Extensive experiments on two real-world datasets show that KPGRM models student learning more effectively and yields a more reasonable analysis.

    Efficiency of Automated Detectors of Learner Engagement and Affect Compared with Traditional Observation Methods

    This report investigates the costs of developing automated detectors of student affect and engagement and applying them at scale to the log files of students using educational software. We compare these costs, and the accuracy of the computer-based observations, with those of more traditional observation methods for detecting student engagement and affect. We discuss the potential for automated detectors to contribute to the development of adaptive and responsive educational software.

    Carelessness and Affect in an Intelligent Tutoring System for Mathematics

    We investigate the relationship between students’ affect and their frequency of careless errors while using an Intelligent Tutoring System for middle school mathematics. A student is said to have committed a careless error when the student’s answer is wrong despite knowing the skill required to provide the correct answer. We operationalize the probability that an error is careless through the use of an automated detector, developed using educational data mining, which infers the probability that an error involves carelessness rather than not knowing the relevant skill. This detector is then applied to log data produced by high school students in the Philippines using a Cognitive Tutor for scatterplots. We study the relationship between carelessness and affect, triangulating between the detector of carelessness and field observations of affect. Surprisingly, we find that carelessness is common among students who frequently experience engaged concentration. This finding implies that a highly engaged student may paradoxically become overconfident or impulsive, leading to more careless errors. In contrast, students displaying confusion or boredom make fewer careless errors. Further analysis over time suggests that confused and bored students have lower learning overall. Thus, their mistakes appear to stem from a genuine lack of knowledge rather than carelessness.
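    The carelessness operationalization above can be illustrated with a toy rule: a wrong answer is more plausibly a careless slip the higher the estimated probability that the student knows the required skill. A minimal sketch (hypothetical function and numbers, not the paper's data-mined detector):

    ```python
    # Toy illustration of treating a wrong answer as "careless" in proportion
    # to the estimated probability that the student knows the skill.
    # The function and the mastery values are illustrative assumptions,
    # not parameters from the paper's detector.

    def p_careless(p_knows_skill: float, answer_correct: bool) -> float:
        """Probability a response reflects a careless error.

        Only wrong answers can be careless, and the more likely the student
        already knows the skill, the more likely the error was a slip
        rather than a knowledge gap.
        """
        if answer_correct:
            return 0.0
        return p_knows_skill

    # A student with 90% estimated mastery answers incorrectly:
    print(p_careless(0.9, False))   # 0.9 -> likely a careless slip
    # A student with 15% estimated mastery answers incorrectly:
    print(p_careless(0.15, False))  # 0.15 -> likely a genuine knowledge gap
    ```

    The paper's actual detector estimates this probability from logged behavior with a model trained via educational data mining; the sketch only captures the underlying intuition.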