85 research outputs found

    Exploring the Effectiveness of AI Algorithms in Predicting and Enhancing Student Engagement in an E-Learning

    The shift from traditional to digital learning platforms has highlighted the need for more personalized and engaging student experiences. In response, researchers are investigating AI algorithms' ability to predict and improve e-learning student engagement. Machine Learning (ML) methods like Decision Trees, Support Vector Machines, and Deep Learning models can predict student engagement using variables like interaction patterns, learning behavior, and academic performance. These AI algorithms have identified at-risk students, enabling early interventions and personalized learning. By providing adaptive content, personalized feedback, and immersive learning environments, some AI methods have increased student engagement. Despite these advances, data privacy, unstructured data, and the need for transparent and interpretable models remain challenges. The review concludes that AI has great potential to improve e-learning outcomes, but these challenges must be addressed for ethical and effective applications. Future research should develop more robust and interpretable AI models, multidimensional engagement metrics, and more comprehensive studies on AI's ethical implications in education.
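    As a rough illustration of the modelling setup this review surveys, the sketch below trains a Decision Tree and an SVM on synthetic engagement data. Everything here is an assumption for demonstration: the feature names (logins per week, session minutes, forum posts, grade average), the generated data, and the engaged/disengaged label are not taken from any of the reviewed studies.

```python
# Illustrative sketch only: the review publishes no code or data, so the
# feature names, the synthetic dataset, and the engaged/disengaged label below
# are assumptions made to show how Decision Tree and SVM engagement predictors
# are typically set up.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.poisson(5, n),        # logins per week (hypothetical)
    rng.normal(30, 10, n),    # average session minutes (hypothetical)
    rng.poisson(2, n),        # forum posts (hypothetical)
    rng.normal(70, 15, n),    # grade average (hypothetical)
])
# Hypothetical label: 1 = engaged, 0 = disengaged.
y = (X[:, 0] + 0.1 * X[:, 3] + rng.normal(0, 2, n) > 11).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
for model in (DecisionTreeClassifier(max_depth=4, random_state=0),
              SVC(kernel="rbf", gamma="scale")):
    model.fit(X_train, y_train)
    print(type(model).__name__)
    print(classification_report(y_test, model.predict(X_test)))
```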

    MOOC next week dropout prediction: weekly assessing time and learning patterns

    Although Massive Open Online Course (MOOC) systems have become more prevalent in recent years, associated student attrition rates are still a major drawback. In the past decade, many researchers have sought to explore the reasons behind learner attrition or lack of interest. A growing body of literature recognises the importance of the early prediction of student attrition from MOOCs, since it can lead to timely interventions. Most of these studies are concerned with identifying the best features for predicting dropout over an entire course. This study instead focuses on innovations in predicting student dropout by examining learners' next-week learning activities and behaviours. The study is based on multiple MOOC platforms, covering 251,662 students from 7 courses with 29 runs spanning 2013 to 2018. It aims to build a generalised early predictive model for the weekly prediction of student completion using machine learning algorithms. In addition, this study is the first to use a 'learner's jumping behaviour' as a feature to obtain high dropout prediction accuracy.
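    A minimal sketch of how such week-by-week dropout prediction can be framed, under stated assumptions: the clickstream sample, the weekly features (events, minutes, jump count), and the label definition below are illustrative stand-ins rather than the paper's actual feature set, and "jumps" is modelled simply as non-sequential moves between course items within a week.

```python
# Minimal sketch under stated assumptions: the paper's exact features and data
# are not reproduced. The "jumping behaviour" feature is modelled here simply
# as non-sequential moves between course items within a week, and the label is
# "no activity in the following week".
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical clickstream log: one row per learner interaction.
log = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "week":       [1, 1, 2, 1, 2, 1, 1, 2, 2],
    "item_index": [1, 2, 5, 1, 2, 1, 4, 2, 3],
    "minutes":    [10, 12, 3, 8, 9, 15, 4, 6, 7],
})
# Flag non-sequential moves between course items as "jumps".
log["jump"] = log.groupby(["student_id", "week"])["item_index"].diff().abs().gt(1)

# Aggregate the log into one row per student and week.
weekly = (log.groupby(["student_id", "week"])
             .agg(events=("item_index", "size"),
                  minutes=("minutes", "sum"),
                  jumps=("jump", "sum"))
             .reset_index())

# Label each student-week: dropout if the student has no activity next week.
active_next = weekly[["student_id", "week"]].assign(week=lambda d: d["week"] - 1,
                                                    active_next=1)
weekly = weekly.merge(active_next, on=["student_id", "week"], how="left")
weekly["dropout_next_week"] = weekly["active_next"].isna().astype(int)

X = weekly[["events", "minutes", "jumps"]]
y = weekly["dropout_next_week"]
model = GradientBoostingClassifier(random_state=0).fit(X, y)
print(model.predict_proba(X)[:, 1])  # per student-week risk of dropping out next week
```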

    Assessment-driven Learning through Serious Games: Guidance and Effective Outcomes

    Evaluation in serious games is an important aspect; it aims to assess whether pedagogical objectives are transmitted well, the performance of students in relation to the objectives defined in the pedagogical scenario, the content of the course, and the predefined criteria. However, the effectiveness of learning is under-studied due to the complexity involved in gamifying the assessment concept, particularly when it comes to intangible measures related to the progression of learning outcomes, which is among the most important aspects of evaluation in serious games. This paper reviews the literature on assessment, given its importance in the learning process, together with a detailed assessment plan applied to a serious game. It then presents a framework designed to facilitate the design of assessment integrated into serious games. Finally, the paper concludes with a significant example of how the proposed framework proved successful, together with the corresponding results.

    Data Analytics in Higher Education: An Integrated View

    Data analytics in higher education provides unique opportunities to examine, understand, and model pedagogical processes. Consequently, the methodologies and processes underpinning data analytics in higher education have led to distinct yet highly correlated terms such as Learning Analytics (LA), Academic Analytics (AA), and Educational Data Mining (EDM), where the outcome of one may become the input of another. The purpose of this paper is to offer IS educators and researchers an overview of the current status of research and theoretical perspectives on educational data analytics. The paper proposes a set of unified definitions and an integrated framework for data analytics in higher education. By considering the framework, researchers may discover new contexts as well as areas of inquiry. As a Gestalt-like exercise, the framework (whole) and the articulation of data analytics (parts) may be useful for educational stakeholders in decision-making at the level of individual students, classes of students, the curriculum, schools, and educational systems.

    Student Engagement in Aviation MOOCs: Identifying Subgroups and Their Differences

    The purpose of this study was to expand the current understanding of learner engagement in aviation-related Massive Open Online Courses (MOOCs) through cluster analysis. MOOCs, regarded for their low- or no-cost educational content, often attract thousands of students who are free to engage with the provided content to the extent of their choosing. As online training for pilots, flight attendants, mechanics, and small unmanned aerial system operators continues to expand, understanding how learners engage in optional aviation-focused, online course material may help inform course design and instruction in the aviation industry. In this study, Moore's theory of transactional distance, which posits that psychological or communicative distance can impede learning and success, was used as a descriptive framework for analysis. Archived learning analytics datasets from two 2018 iterations of the same small unmanned aerial systems MOOC were cluster-analyzed (N = 1,032 and N = 4,037). The enrolled students included individuals worldwide; some were affiliated with the host institution, but most were not. The datasets were cluster-analyzed separately to categorize participants into common subpopulations based on discussion post pages viewed and posts written, video pages viewed, and quiz grades. Subgroup differences were examined in days of activity and record of completion. Pre- and post-course survey data provided additional variables for analysis of subgroup differences in demographics (age, geographic location, education level, employment in the aviation industry) and learning goals. Analysis of engagement variables revealed three significantly different subgroups for each MOOC. Engagement patterns were similar between MOOCs for the most and least engaged groups, but differences were noted in the middle groups; MOOC 1's middle group had a broader interest in optional content (both in discussions and videos), whereas MOOC 2's middle group had a narrower interest in optional discussions. Mandatory items (Mandatory Discussion or Quizzes) were the best predictors in classifying subgroups for both MOOCs. Significant associations were found between subgroups and education levels, days of activity, and total quiz scores. This study addressed two known problems: a lack of information on student engagement in aviation-related MOOCs and, more broadly, a growing imperative to examine learners who utilize MOOCs but do not complete them. This study served as an important first step for course developers and instructors who aim to meet the diverse needs of the aviation-education community.
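    The clustering step described above can be sketched as follows. The four engagement variables mirror those named in the study (discussion post pages viewed, posts written, video pages viewed, quiz grades), but the data is synthetic and k = 3 merely echoes the three subgroups the study reports; this is not the authors' code.

```python
# Sketch of the clustering step only; the data is synthetic and the variable
# set simply mirrors the four engagement measures named in the abstract.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.poisson(3, n),        # discussion post pages viewed
    rng.poisson(1, n),        # discussion posts written
    rng.poisson(8, n),        # video pages viewed
    rng.uniform(0, 100, n),   # total quiz grade
])

# Standardise, then partition learners into three subgroups (k = 3 echoes the
# three subgroups reported in the study; on real data k would be chosen
# empirically, e.g. with silhouette scores).
X_scaled = StandardScaler().fit_transform(X)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X_scaled)

# Characterise each subgroup by its size and mean engagement profile.
for c in range(3):
    members = X[kmeans.labels_ == c]
    print(f"cluster {c}: n={len(members)}, means={members.mean(axis=0).round(2)}")
```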

    A Multi-Layered Taxonomy of Learning Analytics Applications

    Digital technologies have become embedded in education systems, and stakeholders have recognised a pervasive need to reform existing learning and teaching practices. Among emerging educational digital technologies, learning analytics creates disruptive potential, as it enables educational decision support, real-time feedback and prediction. To date, the field of learning analytics has evolved rapidly but remains immature, particularly in terms of ontological insights. Little guidance is available for educational designers and researchers when it comes to studies that apply learning analytics as a method. Hence, this study offers a well-structured, multi-layered taxonomy of learning analytics applications to support a deeper understanding of learning analytics.

    Formative assessment strategies for students' conceptions—The potential of learning analytics

    Formative assessment is considered to be helpful in supporting students' learning and in designing teaching. Following Aufschnaiter's and Alonzo's framework, teachers' formative assessment can be subdivided into three practices: eliciting evidence, interpreting evidence and responding. Since students' conceptions are judged to be important for meaningful learning across disciplines, teachers are required to assess their students' conceptions. The focus of this article lies on the discussion of learning analytics for supporting the assessment of students' conceptions in class. The existing and potential contributions of learning analytics are discussed in relation to the named formative assessment framework in order to enhance teachers' options to consider individual students' conceptions. We refer to findings from biology and computer science education on existing assessment tools and identify limitations and potentials with respect to the assessment of students' conceptions.
    Practitioner notes
    What is already known about this topic: Students' conceptions are considered to be important for learning processes, but interpreting evidence for learning with respect to students' conceptions is challenging for teachers. Assessment tools have been developed in different educational domains for teaching practice. Techniques from artificial intelligence and machine learning have been applied for automated assessment of specific aspects of learning.
    What the paper adds: Findings on existing assessment tools from two educational domains are summarised, and limitations with respect to the assessment of students' conceptions are identified. Relevant data that needs to be analysed for insights into students' conceptions is identified from an educational perspective. Potential contributions of learning analytics to support the challenging task of eliciting students' conceptions are discussed.
    Implications for practice and/or policy: Learning analytics can enhance the eliciting of students' conceptions. Based on the analysis of existing works, further exploration and development of analysis techniques for unstructured text and multimodal data are desirable to support the eliciting of students' conceptions.
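    As one hedged illustration of the kind of text-analysis technique the article points to (not an implementation from the article itself), the sketch below classifies short written answers into two invented conception categories using a TF-IDF representation and logistic regression.

```python
# Hedged illustration: the article discusses the potential of analysing
# unstructured text for students' conceptions, not a concrete implementation.
# The answers and the two conception categories below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

answers = [
    "plants get their food from the soil",
    "plants take up nutrients through the roots as food",
    "plants build sugar from carbon dioxide and water using light",
    "photosynthesis turns light energy into chemical energy in sugar",
]
labels = [0, 0, 1, 1]  # 0 = everyday conception, 1 = scientific conception (hypothetical)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(answers, labels)

# Classify new free-text answers into the hypothetical conception categories.
print(model.predict(["the soil gives the plant its food"]))
print(model.predict(["the plant makes sugar using sunlight"]))
```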

    Explainable AI (XAI): Improving At-Risk Student Prediction with Theory-Guided Data Science, K-means Classification, and Genetic Programming

    This research explores the use of eXplainable Artificial Intelligence (XAI) in Educational Data Mining (EDM) to improve the performance and explainability of artificial intelligence (AI) and machine learning (ML) models predicting at-risk students. Explainable predictions provide students and educators with more insight into at-risk indicators and causes, which facilitates instructional intervention guidance. Historically, low student retention has been prevalent across the globe even as nations have implemented a wide range of interventions (e.g., policies, funding, and academic strategies), with only minimal improvements in recent years. In the US, recent attrition rates indicate two out of five first-time freshman students will not graduate from the same four-year institution within six years. In response, emerging AI research leveraging recent advancements in Deep Learning has demonstrated high predictive accuracy for identifying at-risk students, which is useful for planning instructional interventions. However, research suggests a general trade-off between the performance and explainability of predictive models. Those that perform best, such as deep neural networks (DNN), are highly complex and considered black boxes (i.e., systems that are difficult to explain, interpret, and understand). The lack of model transparency and explainability results in shallow predictions with limited feedback, prohibiting useful intervention guidance. Furthermore, concerns about trust and ethical use are raised for decision-making applications that involve humans, such as health, safety, and education. To address low student retention and the lack of interpretable models, this research explores the use of XAI in EDM to improve instruction and learning. More specifically, XAI has the potential to enhance the performance and explainability of AI/ML models predicting at-risk students. The scope of this study includes a hybrid research design comprising: (1) a systematic literature review of XAI and EDM applications in education; (2) the development of a theory-guided feature selection (TGFS) conceptual learning model; and (3) an EDM study exploring the efficacy of a TGFS XAI model. The EDM study implemented K-Means classification for explorative (unsupervised) and predictive (supervised) analysis, in addition to assessing the predictive performance and explainability of Genetic Programming (GP), a type of XAI model, against common AI/ML models. Online student activity and performance data were collected from a learning management system (LMS) at a four-year higher education institution. Student data was anonymized and protected to ensure data privacy and security. Data was aggregated at weekly intervals to compute and assess predictive performance (sensitivity, recall, and F1 score) over time. Mean differences and effect sizes are reported at the .05 significance level. Reliability and validity are improved by implementing research best practices.
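    The performance-versus-explainability contrast discussed above can be sketched as follows. This is not the study's pipeline: the weekly LMS features and the at-risk label are synthetic, and a shallow decision tree stands in for an interpretable model in place of the Genetic Programming and K-Means methods the study actually assesses.

```python
# Sketch under stated assumptions: the study's LMS data is not available, so
# synthetic weekly activity features stand in for it, and a shallow decision
# tree stands in for an interpretable model (the study itself assesses Genetic
# Programming and K-Means, which are not reproduced here). The goal is only to
# show a performance-versus-explainability comparison.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import recall_score, f1_score

rng = np.random.default_rng(1)
n = 2000
logins = rng.poisson(4, n)            # LMS logins in a week (hypothetical)
minutes = rng.normal(120, 40, n)      # minutes active in the LMS (hypothetical)
submissions = rng.poisson(2, n)       # assignments submitted (hypothetical)
X = np.column_stack([logins, minutes, submissions])
# Hypothetical at-risk label driven by low activity.
y = ((logins < 3) & (submissions < 2)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

tree = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X_tr, y_tr)
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16),
                                  max_iter=500, random_state=1)).fit(X_tr, y_tr)

for name, model in [("decision tree", tree), ("neural network", mlp)]:
    pred = model.predict(X_te)
    print(name, "recall:", round(recall_score(y_te, pred), 3),
          "F1:", round(f1_score(y_te, pred), 3))

# The tree can be printed as explicit if-then rules; the neural network cannot.
print(export_text(tree, feature_names=["logins", "minutes", "submissions"]))
```

    The printed tree reads directly as if-then rules over the activity features, which is the kind of transparent feedback the abstract argues black-box models cannot provide.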

    Utilizing Online Activity Data to Improve Face-to-Face Collaborative Learning in Technology-Enhanced Learning Environments

    Thesis (Doctoral), Seoul National University Graduate School: Graduate School of Convergence Science and Technology, Department of Convergence Science (Digital Information Convergence), February 2019. Rhee, Wonjong. We live in a flood of information and face increasingly complex problems that are difficult for a single individual to solve. Collaboration with others is necessary to solve these problems. In educational practice, this has led to more attention on collaborative learning. Collaborative learning is a problem-solving process in which students learn and work together with peers to accomplish shared tasks. Through this group-based learning, students can develop collaborative problem-solving skills and improve core competencies such as communication skills. However, there are many issues for collaborative learning to succeed, especially in a face-to-face learning environment. For example, group formation, the first step in designing successful collaborative learning, requires a lot of time and effort. In addition, it is difficult for a small number of instructors to manage a large number of student groups when trying to monitor and support their learning process. These issues can hinder the effectiveness of face-to-face collaborative learning. The purpose of this dissertation is to enhance the effectiveness of face-to-face collaborative learning with online activity data. First, online activity data is explored to find whether it can capture relevant student characteristics for group formation. If meaningful characteristics can be captured from the data, the entire group formation process can be performed more efficiently because the task can be automated. Second, learning analytics dashboards are implemented to provide adaptive support during a class. The dashboard system monitors each group's collaboration status by utilizing online activity data collected during class in real time, and provides adaptive feedback according to that status. Lastly, a predictive model is built to detect at-risk groups by utilizing the online activity data. The model is trained on various features that represent important learning behaviors of a collaboration group. The results reveal that online activity data can be utilized to address some of the issues in face-to-face collaborative learning. Student characteristics captured from the online activity data determined important group characteristics that significantly influenced group achievement. This indicates that student groups can be formed efficiently by utilizing the online activity data. In addition, the adaptive support provided by the learning analytics dashboards significantly improved the group process as well as achievement. Because the data allowed the dashboard system to monitor current learning status, appropriate feedback could be provided accordingly. This led to an improvement of both the learning process and outcomes. Finally, the predictive model could detect at-risk groups with high accuracy during class. The random forest algorithm revealed important learning behaviors of a collaboration group that instructors should pay more attention to. The findings indicate that online activity data can be utilized to address practical issues of face-to-face collaborative learning and to improve group-based learning where the data is available. Based on these results, this dissertation makes contributions to learning analytics research and to face-to-face collaborative learning in technology-enhanced learning environments.
First, it can provide a concrete case study and a guide for future research that takes a learning analytics approach and utilizes student activity data. Second, it adds a research endeavor to address challenges in face-to-face collaborative learning, which can lead to a substantial enhancement of learning in educational practice. Third, it suggests interdisciplinary problem-solving approaches that can be applied to real classroom contexts where online activity data is increasingly available with advanced technologies.
Table of contents: Abstract; Chapter 1. Introduction (Motivation; Research questions; Organization); Chapter 2. Background (Learning analytics; Collaborative learning; Technology-enhanced learning environment); Chapter 3. Heterogeneous group formation with online activity data (Student characteristics for heterogeneous group formation; Method; Results; Discussion; Summary); Chapter 4. Real-time dashboard for adaptive feedback in face-to-face CSCL (Theoretical background; Dashboard characteristics; Evaluation of the dashboard; Discussion; Summary); Chapter 5. Real-time detection of at-risk groups in face-to-face CSCL (Important learning behaviors of group in collaborative argumentation; Method; Model performance and influential features; Discussion; Summary); Chapter 6. Conclusion; Bibliography.
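    A minimal sketch of the at-risk group detection step described in the dissertation, assuming hypothetical group-level features (chat messages, participation balance, shared-document edits); the real features, data, and model settings are not reproduced here, only the random-forest classification and feature-importance inspection it describes.

```python
# Minimal sketch: synthetic group-level activity features (the dissertation's
# real features and data are not public) used to show the random-forest step
# for at-risk group detection plus feature-importance inspection.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_groups = 300
messages = rng.poisson(40, n_groups)       # chat messages during class (hypothetical)
balance = rng.uniform(0, 1, n_groups)      # evenness of member participation (hypothetical)
edits = rng.poisson(15, n_groups)          # shared-document edits (hypothetical)
X = np.column_stack([messages, balance, edits])
# Hypothetical label: groups with little and uneven activity are "at risk".
y = ((messages < 35) & (balance < 0.5)).astype(int)

forest = RandomForestClassifier(n_estimators=200, random_state=7)
print("cross-validated accuracy:", cross_val_score(forest, X, y, cv=5).mean().round(3))

# Fit on all groups and inspect which behaviours drive the prediction.
forest.fit(X, y)
for name, importance in zip(["messages", "balance", "edits"], forest.feature_importances_):
    print(f"{name}: {importance:.3f}")
```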