
    A multi-modal study into students’ timing and learning regulation: time is ticking

    Purpose: This empirical study demonstrates how combining trace data from technology-enhanced learning environments with self-report survey data can contribute to the investigation of self-regulated learning processes.
    Design/methodology/approach: Using a showcase of 1,027 students’ learning in a blended introductory quantitative course, the authors analysed learning regulation, and especially the timing of learning, through trace data. They then connected these learning patterns with self-reports grounded in multiple contemporary social-cognitive theories.
    Findings: Several behavioural facets of maladaptive learning orientations, such as lack of regulation, self-sabotage and disengagement, negatively impacted both the amount of practising and timely practising. On the adaptive side of learning dispositions, the picture was less clear: some adaptive dispositions, such as the willingness to invest effort in learning and self-perceived planning skills, positively impacted learning regulation and the timing of learning, whereas others, such as valuing school or academic buoyancy, lacked the expected positive effects.
    Research limitations/implications: Due to the blended design, there is a strong asymmetry between what can be observed about learning in the two modes.
    Practical implications: This study demonstrates that in a blended setup one needs to distinguish the overall effect on learning from the partial effect on learning in the digital mode: the most adaptive students may depend least on the digital learning mode for their learning.
    Originality/value: The paper presents an application of embodied motivation in the context of blended learning.
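
    A minimal sketch of what combining the two data sources might look like in practice. The file names (etutorial_trace.csv, disposition_survey.csv) and column names are assumptions for the example, not taken from the study itself:

```python
import pandas as pd

# Hypothetical trace data: one row per practice session in the e-tutorial.
trace = pd.read_csv("etutorial_trace.csv", parse_dates=["session_start", "week_deadline"])
# Hypothetical self-reports: one row per student with disposition scale scores.
survey = pd.read_csv("disposition_survey.csv")

# Timing indicator: share of practice sessions completed before the weekly deadline.
trace["before_deadline"] = trace["session_start"] <= trace["week_deadline"]
timing = (
    trace.groupby("student_id")
    .agg(total_sessions=("session_start", "size"),
         timely_share=("before_deadline", "mean"))
    .reset_index()
)

# Combine both data sources and inspect simple correlations with disposition scores.
combined = timing.merge(survey, on="student_id", how="inner")
print(combined[["total_sessions", "timely_share",
                "lack_of_regulation", "effort", "planning"]].corr())
```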

    Linking students' timing of engagement to learning design and academic performance

    In recent years, the connection between Learning Design (LD) and Learning Analytics (LA) has been emphasized by many scholars, as it could enhance our interpretation of LA findings and translate them into meaningful interventions. Together with numerous conceptual studies, a gradual accumulation of empirical evidence has indicated a strong connection between how instructors design for learning and student behaviour. Nonetheless, students' timing of engagement and its relation to LD and academic performance have received limited attention. This study therefore investigates to what extent students' timing of engagement aligned with the instructor's learning design, and how engagement varied across different levels of performance. The analysis was conducted over 28 weeks using trace data on 387 students and was replicated over two semesters, in 2015 and 2016. Our findings revealed a mismatch between how instructors designed for learning and how students studied in reality. In most weeks, students spent less time studying the assigned materials on the VLE than the number of hours recommended by the instructors. The timing of engagement also varied, from studying in advance to catching-up patterns. High-performing students spent more time studying in advance, while low-performing students spent a higher proportion of their time on catching-up activities. This study reinforces the importance of pedagogical context in transforming analytics into actionable insights.
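
    As an illustration of the kind of comparison described above, the sketch below contrasts weekly VLE study time with instructor-recommended hours. The file names and columns (vle_trace.csv, learning_design.csv, duration_min, recommended_hours) are assumptions for the example, not artefacts of the study:

```python
import pandas as pd

# Hypothetical inputs:
#   vle_trace.csv:       student_id, week, duration_min  (one row per logged activity)
#   learning_design.csv: week, recommended_hours         (instructor's planned workload)
vle = pd.read_csv("vle_trace.csv")
design = pd.read_csv("learning_design.csv")

weekly = (
    vle.groupby(["student_id", "week"])["duration_min"].sum()
    .div(60)                                  # minutes -> hours
    .rename("studied_hours")
    .reset_index()
    .merge(design, on="week")
)
weekly["gap_hours"] = weekly["studied_hours"] - weekly["recommended_hours"]

# Mean weekly gap across the cohort: negative values indicate that students
# studied less than the learning design recommended for that week.
print(weekly.groupby("week")["gap_hours"].mean())
```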

    Regular Online Assessment, Motivation and Learning

    In 2002, regular online assessment was introduced as one of the pillars of an improved economics course for business students. These online tests were introduced in the context of the problem-based teaching format used at Universiteit Maastricht, where students work in small groups guided by tasks. In this student-centred approach it is important that students come well prepared to their group meetings. For students this is a type of Prisoner's Dilemma, because students can free-ride on the preparation of other students. It also has characteristics of an Assurance Game: if a large part of the group is not well prepared, the students who did prepare will not get much out of the group discussion and will therefore be less motivated to prepare themselves either. The risk that such an Assurance Game arises is higher when the majority of students are not intrinsically motivated at the start of the course, and interest in the subject matter of the course will certainly not increase when students do not study enough. Regular online assessment may help to solve these dilemmas by forcing students at least to prepare the textbook reading before the group meetings. In this paper we discuss the role of online testing in the context of problem-based learning and show that after the introduction of online assessment and other innovations students worked harder, felt that they learned more and reported being more interested in the subject matter of the course (i.e. economics). The increase in work effort and motivation as a consequence of online testing is clearly not limited to the context of a problem-based learning environment.
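
    To make the Assurance Game intuition concrete, here is an illustrative sketch with stylised payoffs that are not taken from the paper; it checks which preparation profiles are stable:

```python
# Stylised payoffs: both preparing is best for everyone, preparing alone pays off
# poorly, and mutual free-riding is a second, inferior but stable outcome.
payoffs = {  # (row action, column action) -> (row payoff, column payoff)
    ("prepare", "prepare"):     (3, 3),
    ("prepare", "free-ride"):   (1, 2),
    ("free-ride", "prepare"):   (2, 1),
    ("free-ride", "free-ride"): (2, 2),
}
actions = ["prepare", "free-ride"]

def best_response(opponent_action, player):
    """Action maximising this player's payoff against a fixed opponent action."""
    if player == 0:  # row player
        return max(actions, key=lambda a: payoffs[(a, opponent_action)][0])
    return max(actions, key=lambda a: payoffs[(opponent_action, a)][1])

# A profile is a Nash equilibrium when each action is a best response to the other:
# with these payoffs both (prepare, prepare) and (free-ride, free-ride) qualify.
for a in actions:
    for b in actions:
        if best_response(b, 0) == a and best_response(a, 1) == b:
            print("Equilibrium:", (a, b), "payoffs:", payoffs[(a, b)])
```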

    Stability and sensitivity of Learning Analytics based prediction models

    Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMSs) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick's theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2,049 students followed a course based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, the outcomes of continuous formative assessments and other system-generated data in modelling student performance, and their potential to generate informative feedback. From a dynamic, longitudinal perspective, computer-assisted formative assessments seem to be the best predictor both of academic performance and for detecting underperforming students, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity-related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
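
    As a rough illustration of such a prediction model, the sketch below fits a cross-validated logistic regression on disposition, formative assessment and LMS use-intensity features. The feature names and the cohort_data.csv file are hypothetical, and scikit-learn is an assumed tool, not necessarily what the authors used:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("cohort_data.csv")
features = [
    "effort_disposition", "planning_disposition",   # survey-based learning dispositions
    "quiz1_score", "quiz2_score", "quiz3_score",    # continuous formative assessments
    "lms_logins", "etutorial_minutes",              # system-generated use-intensity data
]
X = data[features]
y = data["passed_exam"]                             # 1 = passed the final exam, 0 = failed

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
# Cross-validated AUC gives a rough sense of how well students at risk of
# underperforming can be detected from these predictors.
print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```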