Online module login data as a proxy measure of student engagement: the case of myUnisa, MoyaMA, Flipgrid, and Gephi at an ODeL institution in South Africa
Abstract
The current study employed online module login data harvested from three tools, myUnisa, MoyaMA and Flipgrid, to determine how such data served as a proxy measure of student engagement. The first tool is a legacy learning management system (LMS) used for online learning at the University of South Africa (UNISA), while the other two are a mobile messaging application and an educational video discussion platform, respectively. The study set out to investigate how the module login data of undergraduate students (n = 3475 & n = 2954) and a cohort of Mathew Goniwe students (n = 27) enrolled for a second-level module, ENG2601, as extracted from myUnisa, MoyaMA and Flipgrid, served as a proxy measure of student engagement. Collectively, these students were registered for this second-level module at UNISA at the time the study was conducted. The online login data comprised myUnisa module login file access frequencies, the frequencies of instant messages (IMs) posted on MoyaMA by both the facilitator and the Mathew Goniwe students, and the video clips posted on, and video clip view frequencies captured by, Flipgrid for this module. One finding of this study is that student engagement as measured by login file access frequencies was disproportionately skewed toward one module file relative to the other module files. The other finding is that the overall module file access metrics of the Mathew Goniwe group were disproportionately concentrated in a sub-cohort of highly active users (HAU).
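As a minimal illustration of how file-access frequencies of the kind described above could be turned into proxy engagement metrics, the following Python sketch aggregates access events per file and per student and flags a sub-cohort of highly active users. The column names ("student_id", "file_id") and the quantile cut-off are assumptions for illustration, not details taken from the study.

# Illustrative sketch only: aggregating login/file-access events into
# proxy engagement metrics. Column names and the HAU cut-off are assumed.
import pandas as pd

def engagement_summary(events: pd.DataFrame, hau_quantile: float = 0.75):
    """events: one row per logged file access (student_id, file_id)."""
    # Share of all accesses attracted by each module file
    # (reveals skew toward a single file, as reported in the abstract).
    file_share = events["file_id"].value_counts(normalize=True)

    # Total accesses per student, plus a simple "highly active user" flag
    # for students above the chosen quantile of activity.
    per_student = events.groupby("student_id").size().rename("accesses")
    hau = per_student[per_student >= per_student.quantile(hau_quantile)]

    return file_share, per_student, hau

# Example usage with toy data:
# events = pd.DataFrame({"student_id": [1, 1, 2, 3], "file_id": ["A", "A", "A", "B"]})
# file_share, per_student, hau = engagement_summary(events)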
Using Prior Knowledge and Student Engagement to Understand Student Performance in an Undergraduate Learning-to-Learn Course
This study examined the roles of prior knowledge and student engagement in student performance. Log data were used to explore the distribution of final grades (i.e., weak, good, excellent final grades) occurring in an elective undergraduate course. Previous research has established that behavioral and agentic engagement factors contribute to academic achievement (Reeve, 2013). Hierarchical logistic regression using both prior knowledge and log data from the course revealed: (a) the weak-grades group demonstrated less behavioral engagement than the good-grades group, (b) the good-grades group demonstrated less agentic engagement than the excellent-grades group, and (c) models composed of both prior knowledge and engagement measures were more accurate than models composed of only engagement measures. Findings demonstrate that students performing at different grade levels may experience different challenges in their course engagement. This study informs our own instructional strategies and interventions to increase student success in the course and provides recommendations for other instructors to support student success.
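The model comparison described above (engagement measures alone versus engagement plus prior knowledge) could be sketched roughly as below. This is not the study's analysis: the features, synthetic data, and use of cross-validated accuracy are placeholders chosen only to show the shape of the comparison.

# Minimal sketch: grade-group classifier from engagement features alone
# versus engagement plus prior knowledge. All data and feature names are
# placeholders, not the study's variables.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
engagement = rng.normal(size=(n, 2))        # e.g. behavioral, agentic log measures
prior_knowledge = rng.normal(size=(n, 1))   # e.g. a pre-test score
grade_group = rng.integers(0, 3, size=n)    # weak / good / excellent

X_eng = engagement
X_full = np.hstack([prior_knowledge, engagement])

acc_eng = cross_val_score(LogisticRegression(max_iter=1000), X_eng, grade_group, cv=5).mean()
acc_full = cross_val_score(LogisticRegression(max_iter=1000), X_full, grade_group, cv=5).mean()
print(f"engagement only: {acc_eng:.2f}  engagement + prior knowledge: {acc_full:.2f}")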
The current state of using learning analytics to measure and support K-12 student engagement: A scoping review
Student engagement has been identified as a critical construct for understanding and predicting educational success. However, research has shown that it can be hard to align data-driven insights of engagement with observed and self-reported levels of engagement. Given the emergence and increasing application of learning analytics (LA) within K-12 education, further research is needed to understand how engagement is being conceptualized and measured within LA research. This scoping review identifies and synthesizes literature published between 2011 and 2022, focused on LA and student engagement in K-12 contexts, and indexed in five international databases. Twenty-seven articles and conference papers from 13 different countries were included for review. We found that most of the research was undertaken in the middle school years within STEM subjects. The results show that there is wide discrepancy in researchers' understanding and operationalization of engagement and little evidence to suggest that LA improves learning outcomes and support. However, the potential to do so remains strong. Guidance is provided for future LA engagement research to better align with these goals.
The relationship between instructor course participation, student participation, and student performance in online courses
Online learning has become ubiquitous in higher education and has catalyzed many changes in teaching and learning, particularly in academic technology. However, foundational frameworks for supporting learning in a virtual environment argue that learners need as much, if not more, instructional engagement and support as in the traditional classroom. Moore’s (1989) three types of interaction and Garrison & Akyol’s (2013) community of inquiry theoretical framework underscore the importance of social engagement on the part of instructors and students in the online classroom, further asserting that learner-to-instructor interactions are essential to supporting student satisfaction and learning. Nevertheless, there are few studies, particularly quantitative studies, that examine the relationship between instructor participation in online courses and student participation and achievement. This study analyzed the relationship between select forms of instructor participation, including course announcements and discussion board posts, and student participation and achievement, represented by student course accesses, clicks within a course, time in a course, discussion board posts, and final course grade. The researcher utilized data available in the learning management system (LMS) log files from over 500 online master’s degree courses delivered at a private nonprofit university in the Northwest United States. The results of the multiple regression and multivariate analysis of variance (MANOVA) analyses on the data from the logs showed significant relationships between instructor participation and student participation as well as student participation and achievement within an online course. No significant relationship was identified between instructor participation and student achievement. Potential explanations for this discrepancy and opportunities for future research are also discussed.
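One of the regression analyses described above might look roughly like the Python sketch below, which regresses a course-level student-participation metric on instructor-participation measures aggregated from LMS logs. The variable names and toy numbers are illustrative assumptions, not the study's data.

# Rough sketch of a multiple regression of student participation on
# instructor participation, aggregated per course from LMS log files.
# Variable names and values are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

courses = pd.DataFrame({
    "instructor_announcements": [4, 9, 2, 7, 5],
    "instructor_posts":         [10, 25, 3, 18, 12],
    "student_course_accesses":  [310, 540, 190, 470, 380],
})

model = smf.ols(
    "student_course_accesses ~ instructor_announcements + instructor_posts",
    data=courses,
).fit()
print(model.summary())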
Practitioner Track Proceedings of the 6th International Learning Analytics & Knowledge Conference (LAK16)
Practitioners spearhead a significant portion of learning analytics, relying on implementation and experimentation rather than on traditional academic research. Both approaches help to improve the state of the art. The LAK conference has created a practitioner track for submissions, which first ran in 2015 as an alternative to the researcher track.
The primary goal of the practitioner track is to share thoughts and findings that stem from learning analytics project implementations. While both large and small implementations are considered, all practitioner track submissions are required to relate to initiatives that are designed for large-scale and/or long-term use (as opposed to research-focused initiatives). Other guidelines include:
• Implementation track record: The project should have been used by an institution or have been deployed on a learning site. There are no hard guidelines about user numbers or about how long the project has been running.
• Learning/education related: Submissions have to describe work that addresses learning/academic analytics, either at an educational institution or in an area (such as corporate training, health care or informal learning) where the goal is to improve the learning environment or learning outcomes.
• Institutional involvement: Neither submissions nor presentations have to include a named person from an academic institution. However, all submissions have to include information collected from people who have used the tool or initiative in a learning environment (such as faculty, students, administrators and trainees).
• No sales pitches: While submissions from commercial suppliers are welcome, reviewers do not accept overt (or covert) sales pitches. Reviewers look for evidence that a presentation will take into account challenges faced, problems that have arisen, and/or user feedback that needs to be addressed.
Submissions are limited to 1,200 words, including an abstract, a summary of deployment with end users, and a full description. Most papers in the proceedings are therefore short, and often informal, although some authors chose to extend their papers once they had been accepted.
Papers accepted in 2016 fell into two categories.
• Practitioner Presentations Presentation sessions are designed to focus on deployment of a single learning analytics tool or initiative.
• Technology Showcase The Technology Showcase event enables practitioners to demonstrate new and emerging learning analytics technologies that they are piloting or deploying.
Both types of paper are included in these proceedings
Educational Theories and Learning Analytics: From Data to Knowledge
Under embargo until 17.01.21 (accepted version).
Leveraging Student Engagement through MS Teams at an Open and Distance E-learning Institution
The current paper reports on a study that was conducted at the University of South Africa (UNISA) in 2021. The study involved three cohorts of undergraduate students (n = 20, n = 12 and n = 18), where each cohort participated in one of the virtual sessions offered on MS Teams as part of their modules’ virtual classes. Employing a case study research design, the study used the interactions students had on MS Teams through messages in each session to determine how such messages served as indicators of student engagement. Four student engagement dimensions, namely emotional, behavioral, cognitive and academic engagement, were the focus of this study. Two of the findings of this study are: (a) only a few students dominated the messages posted during the three live virtual sessions; and (b) the cognitive and emotional engagement dimensions were the two predominant dimensions of student engagement. The paper ends with the implications and recommendations.
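The finding that a few students dominated each session's messages could be quantified with a simple concentration measure like the sketch below, which computes the share of each session's messages contributed by its most active posters. The data layout (one row per message with "session" and "student" columns) and the top-3 cut-off are assumptions made for illustration.

# Illustrative sketch: share of each session's chat messages contributed
# by its top-n posters. Column names and top_n are assumed.
import pandas as pd

def top_poster_share(messages: pd.DataFrame, top_n: int = 3) -> pd.Series:
    """Return, per session, the fraction of messages posted by the top_n students."""
    def share(students: pd.Series) -> float:
        counts = students.value_counts()
        return counts.head(top_n).sum() / counts.sum()
    return messages.groupby("session")["student"].apply(share)

# messages = pd.DataFrame({"session": [...], "student": [...], "text": [...]})
# print(top_poster_share(messages))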
Data mining student activity patterns in an interactive activity-based STEM learning environment
Jupyter Notebook is gaining in popularity for STEM instruction and activity-based learning. This platform for sharing interactive documents via a web interface allows instructors to combine a variety of media together with interactive and editable code, providing rich opportunities for an active learning pedagogy. Other online learning environments, such as Canvas and Moodle, provide or integrate learning analytics for the use of administrators, educators, and students to improve learning outcomes; however, these platforms lack the rich learning environment of Jupyter Notebook. Also, with increasing interest in online learning, research communities have arisen for Learning Analytics and Educational Data Mining. Unfortunately, these research communities have not yet begun to address the Jupyter Notebook learning environment. The University of Missouri College of Engineering offers a Program of Study in Data Science (PSDS) under contract with the National Geospatial Intelligence Agency (NGA). This program is delivered online, making heavy use of Jupyter notebooks served by JupyterHub for active engagement with course content. The PSDS infrastructure uses the Graylog log management program to collect Jupyter logs, which are stored in an integrated Elasticsearch document store for a period of months. The PSDS program provides an excellent case study for a proof-of-concept in applying learning analytics to the Jupyter learning environment. This thesis consists of two major parts: (1) mining the Graylog system to extract useful log messages, transforming those messages into student-activity features, and loading the data into a PostgreSQL database for long-term storage; and (2) developing a variety of visualizations of student activity for administrators, instructors and students. The pedagogical structure of PSDS courses allows unique insights into student engagement with the course material. Finally, recommendations are made for the development of a more comprehensive logging system and additional analyses that could be performed.
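A heavily simplified version of the extract-transform-load flow described above is sketched below, assuming the Jupyter log records have already been exported from Graylog/Elasticsearch as JSON lines. The field names ("user", "timestamp", "message"), the target table, and the connection string are assumptions for illustration, not details of the PSDS pipeline.

# Simplified ETL sketch: parse exported Jupyter log records into
# student-activity rows and load them into PostgreSQL.
# Field names, table name, and DSN are assumed for illustration.
import json
import psycopg2

def transform(raw_line: str) -> tuple:
    """Turn one exported log record into a (user, timestamp, event) row."""
    record = json.loads(raw_line)
    event = "notebook_save" if "save" in record.get("message", "") else "other"
    return record["user"], record["timestamp"], event

def load(rows, dsn="dbname=psds_analytics"):
    """Insert transformed activity rows into a PostgreSQL table."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO student_activity (username, ts, event) VALUES (%s, %s, %s)",
            rows,
        )

# with open("jupyter_logs.jsonl") as fh:
#     load([transform(line) for line in fh])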
Quality in MOOCs: Surveying the Terrain
The purpose of this review is to identify quality measures and to highlight some of the tensions surrounding notions of quality, as well as the need for new ways of thinking about and approaching quality in MOOCs. It draws on the literature on both MOOCs and quality in education more generally in order to provide a framework for thinking about quality and the different variables and questions that must be considered when conceptualising quality in MOOCs. The review adopts a relativist approach, positioning quality as a measure for a specific purpose. The review draws upon Biggs’s (1993) 3P model to explore notions and dimensions of quality in relation to MOOCs — presage, process and product variables — which correspond to an input–environment–output model. The review brings together literature examining how quality should be interpreted and assessed in MOOCs at a more general and theoretical level, as well as empirical research studies that explore how these ideas about quality can be operationalised, including the measures and instruments that can be employed. What emerges from the literature are the complexities involved in interpreting and measuring quality in MOOCs and the importance of both context and perspective to discussions of quality