129 research outputs found
Student risk identification learning model using machine learning approach
Several challenges are associated with online learning systems, the most important of which is a lack of student motivation across course materials and activities. It is also important to identify students who are at risk of failing to complete a course on time. Existing models apply machine learning approaches to this problem, but they are inefficient: they are trained on legacy data, they fail to address imbalanced data when training and testing the classifier, and they do not generalise well to new courses. To overcome these challenges, this work presents a novel design that trains the risk-identification model on current courses. We further present an XGBoost classification algorithm that can classify risk for new courses. Experiments were conducted to evaluate the performance of the proposed model. The results show that it attains significant improvements over state-of-the-art models in terms of ROC, F-measure, precision, and recall.
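The abstract names XGBoost and the class-imbalance problem but includes no code. The sketch below is illustrative only, using scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost and a synthetic dataset in place of real course data; the imbalance is handled by re-weighting training samples, one common approach (XGBoost itself offers `scale_pos_weight` for the same purpose).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.utils.class_weight import compute_sample_weight
from sklearn.metrics import roc_auc_score, f1_score

# Synthetic, imbalanced stand-in for course-activity features:
# roughly 10% of students carry the "at risk" label (1).
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Re-weight training samples so the minority (at-risk) class
# is not drowned out by the majority class.
w = compute_sample_weight("balanced", y_tr)
clf = GradientBoostingClassifier(random_state=0)
clf.fit(X_tr, y_tr, sample_weight=w)

# Report the metrics the abstract mentions (ROC and F-measure).
proba = clf.predict_proba(X_te)[:, 1]
print(f"ROC-AUC: {roc_auc_score(y_te, proba):.3f}")
print(f"F1:      {f1_score(y_te, clf.predict(X_te)):.3f}")
```

Per-sample weighting is one of several imbalance remedies; the paper may instead use resampling, and its actual features and model configuration are not described in the abstract.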
Embracing trustworthiness and authenticity in the validation of learning analytics systems
Learning analytics sits in the middle space between learning theory and data analytics. The inherent diversity of learning analytics manifests itself in an epistemology that strikes a balance between positivism and interpretivism, and knowledge that is sourced from theory and practice. In this paper, we argue that validation approaches for learning analytics systems should be cognisant of these diverse foundations. Through a systematic review of learning analytics validation research, we find that there is currently an over-reliance on positivistic validity criteria. Researchers tend to ignore interpretivistic criteria such as trustworthiness and authenticity. In the 38 papers we analysed, researchers covered positivistic validity criteria 221 times, whereas interpretivistic criteria were mentioned 37 times. We motivate that learning analytics can only move forward with holistic validation strategies that incorporate “thick descriptions” of educational experiences. We conclude by outlining a planned validation study using argument-based validation, which we believe will yield meaningful insights by considering a diverse spectrum of validity criteria.
Funding: Horizon 2020 (H2020) grant 883588, Algorithms and the Foundations of Software Technology.
Towards Mapping Competencies through Learning Analytics: Real-time Competency Assessment for Career Direction through Interactive Simulation
Assessment and Evaluation in Higher Education, 201, pp. 1-13
Moving Forward with Learning Analytics: Expert Views
Learning analytics involves the measurement, collection, analysis, and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and early adopters around the world are already developing and deploying these new tools. This paper reports on a study that investigated how the field is likely to develop by 2025, in order to make recommendations for action to those concerned with the implementation of learning analytics. The study used a Policy Delphi approach, presenting a range of future scenarios to international experts in the field and asking for responses related to the desirability and feasibility of these scenarios, as well as actions that would be required. Responses were received from 103 people from 21 countries. Responses were coded thematically, inter-rater reliability was checked using Cohen’s kappa coefficient, and data were recoded if kappa was below 0.6. The seven major themes that were identified within the data were power, pedagogy, validity, regulation, complexity, ethics, and affect. The paper considers in detail each of these themes and its implications for the implementation of learning analytics.
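The coding-reliability check described above (compute Cohen's kappa per theme, recode if it falls below 0.6) can be sketched as follows. The theme labels and ratings here are hypothetical illustrations, not the study's actual data; only the seven theme names come from the abstract.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical theme codes assigned by two raters to the same
# twelve expert responses (themes drawn from the paper's list).
rater_a = ["power", "ethics", "validity", "power", "affect", "ethics",
           "pedagogy", "power", "validity", "ethics", "affect", "pedagogy"]
rater_b = ["power", "ethics", "validity", "power", "ethics", "ethics",
           "pedagogy", "power", "validity", "ethics", "affect", "affect"]

# Cohen's kappa corrects raw agreement for chance agreement.
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"kappa = {kappa:.2f}")

# The study's decision rule: recode when agreement is too low.
if kappa < 0.6:
    print("Agreement below threshold: recode this theme.")
```

With these made-up ratings, kappa lands above the 0.6 threshold, so no recoding is triggered; real coding data would of course vary.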
Narrowing the Feedback Gap: Examining Student Engagement with Personalized and Actionable Feedback Messages
Funding: The authors declared no financial support for the research, authorship, and/or publication of this article.
Modelling student online behaviour in a virtual learning environment
In recent years, distance education has enjoyed a major boom. Much work at The Open University (OU) has focused on improving retention rates in these modules by providing timely support to students who are at risk of failing the module. In this paper we explore two methods for analysing student activity in an online virtual learning environment (VLE): General Unary Hypotheses Automaton (GUHA) and Markov chain-based analysis. We explain how this analysis can be relevant for module tutors and other student support staff. We show that both methods are a valid approach to modelling student activities. An advantage of the Markov chain-based approach is its graphical output and the possibility of modelling time dependencies between student activities.
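The Markov chain-based analysis mentioned above amounts to estimating transition probabilities between activity types from observed click sequences. The sketch below is a minimal illustration under assumed data: the activity labels and sequences are invented, and the paper's actual model (including its treatment of time dependencies) may differ.

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities from
    sequences of VLE activity labels (one list per student session)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sessions:
        # Count each observed consecutive pair of activities.
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalise each row so outgoing probabilities sum to 1.
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

# Hypothetical click sequences over three activity types.
sessions = [
    ["content", "content", "quiz"],
    ["forum", "content", "quiz"],
    ["content", "forum", "content"],
]
P = transition_matrix(sessions)
print(P["content"])  # outgoing probabilities from "content"
```

A matrix like `P` is what lends the approach its graphical appeal: it maps directly onto a state-transition diagram that tutors can read at a glance.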
Drahomira Herrmannova, Lucie Vachova, Jakub Kuzilek, Zdenek Zdrahal, Annika Wolf