Stability and sensitivity of Learning Analytics based prediction models
Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMSs) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick's theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2,049 students were enrolled in a module based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments and other system-generated data in modelling student performance, and their potential to generate informative feedback. Using a dynamic, longitudinal perspective, computer-assisted formative assessments appear to be the best predictor for detecting underperforming students and academic performance, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity-related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
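The kind of comparison this abstract reports, with formative-assessment results outpredicting raw LMS activity, can be sketched on synthetic data. All variable names, numbers, and the data-generating process below are invented for illustration; they are not the study's actual variables or findings:

```python
# Illustrative sketch of a dispositional learning analytics comparison:
# which data source correlates most strongly with underperformance?
# Synthetic data only -- not the study's real dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 200
quiz = rng.uniform(0, 100, n)             # computer-assisted formative assessment scores
clicks = rng.poisson(50, n).astype(float) # basic LMS track data (pure noise here)
disposition = rng.normal(0, 1, n)         # learning-disposition survey scale

# Simulate an outcome driven mainly by formative assessment performance.
underperform = (quiz + 5 * disposition + rng.normal(0, 10, n) < 50).astype(float)

def corr(x, y):
    """Absolute Pearson correlation between a predictor and the outcome."""
    return abs(float(np.corrcoef(x, y)[0, 1]))

strengths = {
    "formative assessments": corr(quiz, underperform),
    "LMS clicks": corr(clicks, underperform),
    "dispositions": corr(disposition, underperform),
}
best = max(strengths, key=strengths.get)
print(best)  # under this simulation, formative assessments predict best
```

This only mirrors the abstract's qualitative conclusion under assumptions baked into the simulation; a real model would use the study's longitudinal trace data.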
A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK
There is increased recognition that learning design drives both the student learning experience and quality enhancements of teaching and learning. The Open University UK (OU) has been one of the few institutions that have explicitly and systematically captured designs for learning at a large scale. By applying advanced analytical techniques to large and fine-grained datasets, the OU has been unpacking the complexity of instructional practices, as well as providing conceptual and empirical evidence of how learning design influences student behaviour, satisfaction, and performance. This study discusses the implementation of learning design at the OU over the last ten years, and critically reviews empirical evidence from eight recent large-scale studies that have linked learning design with learning analytics. Four future research themes are identified to support future adoption of learning design approaches.
Quality in MOOCs: Surveying the Terrain
The purpose of this review is to identify quality measures and to highlight some of the tensions surrounding notions of quality, as well as the need for new ways of thinking about and approaching quality in MOOCs. It draws on the literature on both MOOCs and quality in education more generally in order to provide a framework for thinking about quality and the different variables and questions that must be considered when conceptualising quality in MOOCs. The review adopts a relativist approach, positioning quality as a measure for a specific purpose. The review draws upon Biggs's (1993) 3P model to explore notions and dimensions of quality in relation to MOOCs (presage, process and product variables), which correspond to an input-environment-output model. The review brings together literature examining how quality should be interpreted and assessed in MOOCs at a more general and theoretical level, as well as empirical research studies that explore how these ideas about quality can be operationalised, including the measures and instruments that can be employed. What emerges from the literature are the complexities involved in interpreting and measuring quality in MOOCs, and the importance of both context and perspective to discussions of quality.
Personalizing the design of computer-based instruction to enhance learning
This paper reports two studies designed to investigate the effect on learning outcomes of matching individuals' preferred cognitive styles to computer-based instructional (CBI) material. Study 1 considered the styles individually as Verbalizer, Imager, Wholist and Analytic. Study 2 considered the bi-dimensional nature of cognitive styles in order to assess the full ramifications of cognitive styles on learning: Analytic/Imager, Analytic/Verbalizer, Wholist/Imager and Wholist/Verbalizer. The mix of images and text, the nature of the text material, the use of advance organizers, and the proximity of information to facilitate meaningful connections between various pieces of information were some of the considerations in the design of the CBI material. In a quasi-experimental format, students' cognitive styles were analysed by Cognitive Style Analysis (CSA) software. On the basis of the CSA result, the system defaulted students to either matched or mismatched CBI material by alternating between the two formats. The instructional material had a learning and a test phase. Learning outcome was tested on recall, labelling, explanation and problem-solving tasks. Comparison of the matched and mismatched instruction did not indicate a significant difference between the groups, but the consistently better performance by the matched group suggests potential for further investigations where the limitations cited in this paper are eliminated. The results did indicate a significant difference between the four cognitive styles, with the Wholist/Verbalizer group performing better than all other cognitive styles. Analysing the difference between cognitive styles on individual test tasks indicated significant differences on recall, labelling and explanation, suggesting that certain test tasks may suit certain cognitive styles.
Understanding Evidence-Based Interventions for Cross-Cultural Group Work: A Learning Analytics Perspective
As the numbers of international students worldwide continue to rise, one common challenge is how best to socially integrate diverse groups of students. Indeed, research demonstrates that many students form social and learning relationships with those from the same cultural background, despite the benefits of cross-cultural communication. This lack of social cohesion negatively affects students, particularly when it comes to their perceptions of collaborative group work. However, few studies have analysed measurable student behaviours in group work, such as with learning analytics, to determine how culture and existing social networks influence measurable differences in contributions. Similarly, little is known about which evidence-based interventions lead to more equal participation between diverse students. In this research, learning analytics is combined with social network analysis to determine the role of social connections in group-work participation, and to highlight replicable interventions that can help promote social cohesion in diverse classrooms.
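A minimal sketch of the social network analysis side of such a study is a homophily measure: the share of social ties formed within the same cultural group, reflecting the abstract's observation that students tend to connect with same-culture peers. The students, groups, and edge list below are entirely invented for illustration:

```python
# Hypothetical classroom social network: which cultural group each student
# belongs to, and the reported social ties between students.
culture = {"A": "X", "B": "X", "C": "X", "D": "Y", "E": "Y", "F": "Y"}
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("D", "E"), ("E", "F"), ("C", "D")]

# Homophily index: fraction of ties connecting students of the same culture.
same = sum(culture[u] == culture[v] for u, v in edges)
homophily = same / len(edges)
print(f"homophily = {homophily:.2f}")  # 5 of 6 ties are within-group
```

A value near 1 would indicate strongly culture-segregated ties; an intervention aiming at social cohesion would try to push this index down over time.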
Linking students' timing of engagement to learning design and academic performance
In recent years, the connection between Learning Design (LD) and Learning Analytics (LA) has been emphasized by many scholars, as it could enhance our interpretation of LA findings and translate them into meaningful interventions. Together with numerous conceptual studies, a gradual accumulation of empirical evidence has indicated a strong connection between how instructors design for learning and student behaviour. Nonetheless, students' timing of engagement, and its relation to LD and academic performance, has received limited attention. Therefore, this study investigates to what extent students' timing of engagement aligned with the instructor's learning design, and how engagement varied across different levels of performance. The analysis was conducted over 28 weeks using trace data on 387 students, and replicated over two semesters in 2015 and 2016. Our findings revealed a mismatch between how instructors designed for learning and how students studied in reality. In most weeks, students spent less time studying the assigned materials on the VLE than the number of hours recommended by instructors. The timing of engagement also varied, from studying in advance to catching-up patterns. High-performing students spent more time studying in advance, while low-performing students spent a higher proportion of their time on catching-up activities. This study reinforces the importance of pedagogical context in transforming analytics into actionable insights.
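The week-by-week comparison this abstract describes, hours students actually logged on the VLE versus hours the learning design recommended, can be sketched as follows. The weekly numbers are invented for illustration, not the study's trace data:

```python
# Hypothetical weekly comparison of designed vs. actual study time.
recommended = [4, 4, 6, 4, 5]            # hours per week in the learning design
actual = [3.0, 2.5, 4.0, 4.5, 3.0]       # mean hours students logged on the VLE

# Gap per week: negative means students studied less than recommended.
gaps = [a - r for a, r in zip(actual, recommended)]
weeks_under = sum(g < 0 for g in gaps)
print(f"under-engaged in {weeks_under} of {len(recommended)} weeks")
```

With these made-up numbers the sketch reproduces the abstract's pattern of students falling short of the designed workload in most weeks; the study itself drew this from 28 weeks of trace data on 387 students.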