129 research outputs found

    Student risk identification learning model using machine learning approach

    Several challenges are associated with online learning systems, the most important of which is a lack of student motivation across course materials and activities. Further, it is important to identify students who are at risk of failing to complete a course on time. Existing models apply machine learning approaches to this problem; however, they are not efficient because they are trained on legacy data, fail to address imbalanced-data issues in both training and testing of the classifier, and do not classify new courses well. To overcome these research challenges, this work presents a novel design that trains the learning model to identify risk using current courses. Further, we present an XGBoost classification algorithm that can classify risk for new courses. Experiments are conducted to evaluate the performance of the proposed model. The results show that the proposed model attains significant performance gains over state-of-the-art models in terms of ROC, F-measure, precision, and recall.
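    As an illustrative sketch only (not the authors' implementation), the snippet below shows how an XGBoost classifier for at-risk student identification might be trained with a simple class-imbalance correction and evaluated with the metrics named in the abstract (ROC AUC, F-measure, precision, recall). The features, labels, and parameter values are hypothetical placeholders.

```python
# Hedged sketch: XGBoost risk classifier with class-imbalance weighting.
# Data and features here are synthetic stand-ins for real course activity logs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, f1_score, precision_score, recall_score
from xgboost import XGBClassifier

rng = np.random.default_rng(42)

# Hypothetical activity/assessment features for 1,000 students;
# label 1 = at risk of not completing the course on time (~20%, i.e. imbalanced).
X = rng.normal(size=(1000, 8))
y = (rng.random(1000) < 0.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# scale_pos_weight up-weights the minority (at-risk) class to counter imbalance.
imbalance_ratio = (y_train == 0).sum() / max((y_train == 1).sum(), 1)

model = XGBClassifier(
    n_estimators=200,
    max_depth=4,
    learning_rate=0.1,
    scale_pos_weight=imbalance_ratio,
    eval_metric="auc",
)
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]
pred = (proba >= 0.5).astype(int)

print("ROC AUC:  ", roc_auc_score(y_test, proba))
print("F-measure:", f1_score(y_test, pred))
print("Precision:", precision_score(y_test, pred))
print("Recall:   ", recall_score(y_test, pred))
```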

    Embracing trustworthiness and authenticity in the validation of learning analytics systems

    Learning analytics sits in the middle space between learning theory and data analytics. The inherent diversity of learning analytics manifests itself in an epistemology that strikes a balance between positivism and interpretivism, and knowledge that is sourced from theory and practice. In this paper, we argue that validation approaches for learning analytics systems should be cognisant of these diverse foundations. Through a systematic review of learning analytics validation research, we find that there is currently an over-reliance on positivistic validity criteria. Researchers tend to ignore interpretivistic criteria such as trustworthiness and authenticity. In the 38 papers we analysed, researchers covered positivistic validity criteria 221 times, whereas interpretivistic criteria were mentioned 37 times. We motivate that learning analytics can only move forward with holistic validation strategies that incorporate “thick descriptions” of educational experiences. We conclude by outlining a planned validation study using argument-based validation, which we believe will yield meaningful insights by considering a diverse spectrum of validity criteria.
    Funding: Horizon 2020 (H2020) 883588; Algorithms and the Foundations of Software Technology.

    Narrowing the Feedback Gap: Examining Student Engagement with Personalized and Actionable Feedback Messages

    Funding: The authors declared no financial support for the research, authorship, and/or publication of this article. Peer reviewed. Publisher PDF.