
    Fostering Self-Regulated Learning Online: Development and Evaluation of Interventions for E-Learning Scenarios

    Self-regulated learning (SRL), defined as the systematic orientation of thoughts, feelings, and actions towards the attainment of learning goals, is crucial for academic success in school, at university, and at work. Particularly in the context of e-learning, regulation of one's learning process is essential because of the personal responsibility learners typically have in such scenarios. Deficits in SRL competency can lead to procrastination, dissatisfaction, and deteriorated performance. A variety of SRL interventions has been developed by researchers in order to support self-regulatory processes during learning or to increase SRL competency. Empirical studies have shown that such trainings can improve learning behavior and academic performance. However, with increasing numbers of participants, the personnel expenses of this approach rise linearly, making it unattractive for large groups of learners (e.g. all beginning students of one university). Therefore, in the present work a web-based training (WBT) was developed in order to increase the SRL competency of university students. The target groups in the three empirical studies were students in STEM fields participating in a mathematics preparation course. The WBT consisted of three lessons of approximately 90 minutes each, comprising videos, presentations, exercises, games, and group discussions. In the empirical studies, the effects of the WBT were compared to other SRL interventions such as a learning diary, peer feedback groups, and a digital learning assistant.

    In study 1, 211 prospective students took part in a mathematics preparation course that was administered completely online, covered school-level mathematical knowledge, and lasted four weeks. Participants were randomly assigned to one of four experimental conditions: access to the WBT (group T), filling in a learning diary (group D), access to both interventions (group TD), or none of them (control group C). A pre-post evaluation design found significant increases in declarative knowledge about SRL, in self-reported SRL behavior, and in self-efficacy for participants of the WBT. A small detrimental effect of the WBT on mathematics performance was observed. The learning diary was found to have no significant effect on the employed measures. Time series analyses of the diary data confirmed a positive trend in SRL behavior and found large intervention effects of the first two lessons in particular.

    In study 2, the WBT was augmented by a peer feedback intervention. Participants were assigned to groups of five and were given peer feedback exercises after each lesson of the WBT. In these exercises, participants gave mutual feedback on time schedules, learning strategies, and goal setting. Additionally, participants in this experimental condition (group TDP) filled in a learning diary. Results were compared to a group with access to the regular WBT (without the peer feedback intervention) and the learning diary (group TD), a group with access to only the learning diary (group D), and a control group without any intervention (group C). Significant positive effects of the WBT were found on declarative SRL knowledge, SRL behavior, and self-efficacy. In the condition with additional peer feedback, these positive effects were even larger. Furthermore, a significant positive effect on mathematics performance was observed. However, no significant effects were found for the learning diary.
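    As a rough illustration of the pre-post, between-group comparisons reported for studies 1 and 2, the following Python sketch computes gain scores and an effect size on simulated data (hypothetical numbers and variable names; the dissertation's actual statistical models are not reproduced here):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Simulated SRL-knowledge scores before and after the course for two conditions.
        pre_T, post_T = rng.normal(50, 10, 60), rng.normal(56, 10, 60)   # WBT group (T)
        pre_C, post_C = rng.normal(50, 10, 60), rng.normal(51, 10, 60)   # control group (C)

        gain_T = post_T - pre_T          # individual pre-post gain scores
        gain_C = post_C - pre_C

        # Independent-samples t-test on the gain scores plus Cohen's d as an effect size.
        t, p = stats.ttest_ind(gain_T, gain_C)
        pooled_sd = np.sqrt((gain_T.var(ddof=1) + gain_C.var(ddof=1)) / 2)
        d = (gain_T.mean() - gain_C.mean()) / pooled_sd
        print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {d:.2f}")

    With a pooled-SD Cohen's d, values around 0.2, 0.5, and 0.8 are conventionally read as small, medium, and large effects.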
    Study 3 aimed at enhancing the learning diary into a digital learning assistant by adding dynamic, interactive elements. In an interdisciplinary project, methods from learning analytics (LA) were used to provide learners with visual and textual feedback on their learning behavior as documented in the learning diary. The resulting software, PeerLA, allows users to define learning goals, to schedule a time plan, and to judge the success of learning goals afterwards. On the basis of these judgments, machine learning algorithms calculate suggestions for how much time to invest in future goals. Additionally, user data (e.g. time investment or learning progress) is visualized and compared to data from other users as a social frame of reference. Results of a first pilot study showed satisfactory acceptance of the software by its users.

    Summing up, the WBT presented in this work can be regarded as an effective and efficient intervention for fostering self-regulated learning in university students. Its positive effects can be increased further through an additional peer feedback intervention; in this case, positive effects are also observed on an objective academic performance measure. While a mere learning diary did not show positive effects in two empirical studies, a promising concept was developed in an interdisciplinary approach: a digital learning assistant.
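    The abstract does not specify how PeerLA derives its time suggestions; a minimal sketch of one plausible heuristic (a hypothetical suggest_minutes helper that scales a new time plan by how much time past goals actually required per unit of self-judged success; not the actual PeerLA algorithm) could look like this:

        from dataclasses import dataclass

        @dataclass
        class GoalRecord:
            planned_minutes: float   # time budgeted for the goal in the learning diary
            spent_minutes: float     # time actually invested
            success: float           # self-judged goal attainment between 0.0 and 1.0

        def suggest_minutes(history: list[GoalRecord], planned: float) -> float:
            """Scale a new time plan by how much time comparable past goals really needed."""
            if not history:
                return planned
            # Time actually needed per unit of attained success, relative to the original plan.
            ratios = [g.spent_minutes / max(g.success, 0.1) / g.planned_minutes for g in history]
            return planned * sum(ratios) / len(ratios)

        history = [GoalRecord(90, 120, 0.8), GoalRecord(60, 50, 1.0)]
        print(round(suggest_minutes(history, 90)))   # suggested minutes for the next goal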

    Delving into instructor-led feedback interventions informed by learning analytics in massive open online courses

    Background: Providing feedback in massive open online courses (MOOCs) is challenging due to the massiveness and heterogeneity of the learner population. Learning analytics (LA) solutions aim at scaling up feedback interventions and supporting instructors in this endeavour.
    Objectives: This paper focuses on instructor-led feedback mediated by LA tools in MOOCs. Our goal is to answer how and to what extent data-driven feedback is provided to learners, and what its impact is.
    Methods: We conducted a systematic literature review on state-of-the-art LA-informed, instructor-led feedback in MOOCs. From a pool of 227 publications, we selected 38 articles that address the topic of LA-informed feedback in MOOCs mediated by instructors. We applied etic content analysis to the collected data.
    Results and Conclusions: The results revealed a lack of empirical studies exploring LA to deliver feedback and limited attention to pedagogy to inform feedback practices. Our findings suggest the need for systematization and evaluation of feedback. Additionally, there is a need for conceptual tools to guide instructors in the design of LA-based feedback.
    Takeaways: We point out the need for systematization and evaluation of feedback. We envision that this research can support the design of LA-based feedback, thus contributing to bridging the gap between pedagogy and data-driven practice in MOOCs.
    Funding: Consejo de Investigación de Estonia (PSG286); Ministerio de Ciencia e Innovación - Fondo Europeo de Desarrollo Regional and Agencia Nacional de Investigación (grants PID2020-112584RB-C32 and TIN2017-85179-C3-2-R); Junta de Castilla y León - Fondo Social Europeo and Consejo Regional de Educación (grant E-47-2018-0108488).

    License to evaluate: Preparing learning analytics dashboards for educational practice

    Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built with the purpose of empowering teachers and learners to make informed decisions about the learning process. Related work has investigated learning dashboards, yet none have explored the theoretical foundation that should inform the design and evaluation of such interventions. In this systematic literature review, we analyse the extent to which theories and models from learning sciences have been integrated into the development of learning dashboards aimed at learners. Our analysis revealed that very few dashboard evaluations take into account the educational concepts that were used as a theoretical foundation for their design. Furthermore, we report findings suggesting that comparison with peers, a common reference frame for contextualising information on learning analytics dashboards, was not perceived positively by all learners. We summarise the insights gathered through our literature review in a set of recommendations for the design and evaluation of learning analytics dashboards for learners.
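    For context, the peer-comparison reference frame discussed above usually amounts to placing a learner's value against the distribution of peer values; a minimal sketch (hypothetical function name and data, not taken from any of the reviewed dashboards) is shown below:

        import numpy as np

        def peer_percentile(own_value: float, peer_values: list[float]) -> float:
            """Share of peers (in percent) whose value is at or below the learner's own value."""
            peers = np.asarray(peer_values)
            return 100.0 * float(np.mean(peers <= own_value))

        quiz_scores = [42, 55, 61, 70, 73, 80, 88]   # scores of the other learners in the cohort
        print(f"Score 65 is at or above {peer_percentile(65, quiz_scores):.0f}% of peers")

    Whether and how such a comparison is surfaced is itself a design decision; as noted above, not all learners perceive peer comparison positively.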

    Designing Learning Analytics Dashboards for Digital Learning Environments: Investigating Learner Preferences, Usage, and Self-Efficacy

    This dissertation, a product of the European Union's CHARMING project, investigates the intersection of technology and learning, focusing on the design of learning analytics for lifelong learning. It emphasizes the importance of effective learning design and the innovative use of technology in digital learning environments. Chapter 1 presents the problem statement, highlighting the knowledge gap related to learning analytics design and the overarching research question: How does learning analytics dashboard (LAD) design influence learner preferences, interaction, and self-efficacy in training and education? Chapter 2 investigates workplace learner preferences for LADs designed for different phases of the self-regulated learning (SRL) cycle. The study reveals a preference for progress reference frames before and after task performance, while social reference frames are least preferred. Chapter 3 examines the impact of LADs with progress and social reference frames on occupational self-efficacy in virtual reality simulation-based training environments. The findings suggest that both reference frames could elicit equal change in self-efficacy, with social reference frames potentially inducing more significant change. Chapter 4 analyzes log-file data to understand chemical plant employees' engagement with LADs. The results indicate that progress reference frames might foster mastery goal orientation behaviors, while social reference frames seem to promote performance goal orientation behaviors. Chapter 5 investigates the impact of LAD reference frame type and direction of comparison on academic self-efficacy among university students. The findings highlight the influence of both comparison type and direction on changes in academic self-efficacy. Chapter 6 discusses the main research findings, theoretical and practical implications, limitations, and future research opportunities. The dissertation contributes to the understanding of LAD design and its influence on learning-related variables, providing valuable insights for educational stakeholders and researchers. This dissertation advances the understanding of learning analytics dashboard design and its impact on learner preferences, interaction, and self-efficacy in various educational contexts. The findings provide a foundation for future research and the development of more effective digital learning environments.
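    As an illustration of the kind of log-file analysis mentioned for Chapter 4, dashboard events could be aggregated per reference frame roughly as follows (hypothetical event schema and field names; the dissertation's actual logging format is not given in the abstract):

        from collections import Counter

        # Each event records which dashboard widget a user viewed.
        events = [
            {"user": "u1", "widget": "progress_frame", "action": "view"},
            {"user": "u1", "widget": "social_frame",   "action": "view"},
            {"user": "u2", "widget": "progress_frame", "action": "view"},
            {"user": "u2", "widget": "progress_frame", "action": "view"},
        ]

        views_per_frame = Counter(e["widget"] for e in events if e["action"] == "view")
        print(views_per_frame)   # Counter({'progress_frame': 3, 'social_frame': 1})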