Unravelling the dynamics of learning design within and between disciplines in higher education using learning analytics
Designing effective learning experiences in a virtual learning environment (VLE) can be supported by learning analytics (LA) through explicit feedback on how learning design (LD) influences students' engagement, satisfaction and performance. Marrying LA with LD not only puts existing pedagogical theories in instructional design to the test with actual learning data, but also provides the learning context that helps educators translate established LA findings into direct interventions. My dissertation aims at unpacking the complexity of LD and its impact on students' engagement, satisfaction and performance on the VLE using LA. The context of this study is 400+ online and blended learning modules at the Open University (OU) UK. This research combines multiple sources of data: the OU Learning Design Initiative (OULDI), system log data, self-reported surveys, and performance data. Given the scope of this study, a wide range of visualization techniques, social network analysis, multi-level modelling, and machine learning will be used.
Mixing and Matching Learning Design and Learning Analytics
In the last five years, learning analytics has proved its potential in predicting academic performance based on trace data of learning activities. However, the role of pedagogical context in learning analytics has not been fully understood. To date, it has been difficult to quantify learning in a way that can be measured and compared. By coding the design of e-learning courses, this study demonstrates how learning design is being implemented on a large scale at the Open University UK, and how learning analytics could support as well as benefit from learning design. Building on our previous work, our analysis was conducted longitudinally on 23 undergraduate distance learning modules and their 40,083 students. The innovative aspect of this study is the availability of fine-grained learning design data at the individual task level, which allows us to consider the connections between learning activities and the media used to produce them. Using a combination of visualizations and social network analysis, our findings revealed diversity in how learning activities were designed within and between disciplines, as well as between individual learning activities. By reflecting on their learning design in an explicit manner, educators are empowered to compare and contrast their designs using their own institutional data.
A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK
There is an increased recognition that learning design drives both the student learning experience and quality enhancements of teaching and learning. The Open University UK (OU) has been one of the few institutions that have explicitly and systematically captured designs for learning at a large scale. By applying advanced analytical techniques to large and fine-grained datasets, the OU has been unpacking the complexity of instructional practices, as well as providing conceptual and empirical evidence of how learning design influences student behaviour, satisfaction, and performance. This study discusses the implementation of learning design at the OU over the last ten years, and critically reviews empirical evidence from eight recent large-scale studies that have linked learning design with learning analytics. Four future research themes are identified to support future adoption of learning design approaches.
Linking students' timing of engagement to learning design and academic performance
In recent years, the connection between Learning Design (LD) and Learning Analytics (LA) has been emphasized by many scholars, as it could enhance our interpretation of LA findings and translate them into meaningful interventions. Together with numerous conceptual studies, a gradual accumulation of empirical evidence has indicated a strong connection between how instructors design for learning and student behaviour. Nonetheless, students' timing of engagement and its relation to LD and academic performance have received limited attention. Therefore, this study investigates to what extent students' timing of engagement aligned with instructor learning design, and how engagement varied across different levels of performance. The analysis was conducted over 28 weeks using trace data on 387 students, and replicated over two semesters in 2015 and 2016. Our findings revealed a mismatch between how instructors designed for learning and how students studied in reality. In most weeks, students spent less time studying the assigned materials on the VLE than the number of hours recommended by instructors. The timing of engagement also varied, from studying in advance to catching-up patterns. High-performing students spent more time studying in advance, while low-performing students spent a higher proportion of their time on catching-up activities. This study reinforced the importance of pedagogical context in transforming analytics into actionable insights.
Learning analytics support to teachers' design and orchestrating tasks
Background:
Data-driven educational technology solutions have the potential to support teachers in different tasks, such as the designing and orchestration of collaborative learning activities. When designing, such solutions can improve teacher understanding of how learning designs impact student learning and behaviour; and guide them to refine and redesign future learning designs. When orchestrating educational scenarios, data-driven solutions can support teacher awareness of learner participation and progress and enhance real time classroom management.
Objectives:
The use of learning analytics (LA) can be considered a suitable approach to tackle both problems. However, it is unclear whether the same LA indicators are able to satisfactorily support both the design and orchestration of activities. This study investigates the use of the same LA indicators to support multiple teacher tasks (design, redesign and orchestration), a gap in the existing literature that requires further exploration.
Methods:
In this study, we first review previous work on the use of different LA indicators to support both tasks. We then analyse the nature of the two tasks, focusing on a case study that uses the same collaborative learning tool with LA to support both.
Implications:
The study findings led to the derivation of design considerations on LA support for teachers' design and orchestration tasks.
The dinosaur that lost its head: A contribution to a framework using Learning Analytics in Learning Design
This paper presents an approach to the meaningful use of learning analytics as a tool for teachers to improve the robustness of their learning designs. The approach is based on examining how participants act within a Massive Open Online Course (MOOC) format through learning analytics. We show that a teacher/designer can gain knowledge about his or her intended, implemented and attained learning design; about how MOOC participants act in response to these; and about how students are able to develop 'study efficiency' when participating in a MOOC. The learning analytics approach makes it possible to follow certain MOOC students and their study behaviour (e.g. the participants who pass the MOOC by earning enough achievement badges) and to examine the role of the moderator in MOOCs, showing that scaffolding plays a central role in studying and learning processes in an educational format such as a MOOC.
Key words: MOOCs, Massive Open Online Courses, data-saturated, learning analytics, learning design, educational design research, LMS
From Theory to Action: Developing and Evaluating Learning Analytics for Learning Design
The effectiveness of using learning analytics for learning design primarily depends upon two concepts: grounding and alignment. This is the primary conjecture for the study described in this paper. In our design-based research study, we design, test, and evaluate teacher-facing learning analytics for an online inquiry science unit on global climate change. We design our learning analytics in accordance with a socioconstructivism-based pedagogical framework, called Knowledge Integration, and the principles of learning analytics Implementation Design. Our methodology for the design process draws upon the principles of the Orchestrating for Learning Analytics framework to engage stakeholders (i.e. teachers, researchers, and developers). The resulting learning analytics were aligned to unit activities that engaged students in key aspects of the knowledge integration process. They provided teachers with actionable insight into their students' understanding at critical junctures in the learning process. We demonstrate the efficacy of the learning analytics in supporting the optimization of the unit's learning design. We conclude by synthesizing the principles that guided our design process into a framework for developing and evaluating learning analytics for learning design. Funding: Ministerio de Ciencia, Innovación y Universidades (project TIN2017-85179-C3-2-R); Junta de Castilla y León (project VA257P18); European Commission (project grant 588438-EPP-1-2017-1-EL-EPPKA2-KA).
Improvement Research Carried Out Through Networked Communities: Accelerating Learning about Practices that Support More Productive Student Mindsets
The research on academic mindsets shows significant promise for addressing important problems facing educators. However, the history of educational reform is replete with good ideas for improvement that fail to realize the promises that accompany their introduction. As a field, we are quick to implement new ideas but slow to learn how to execute well on them. If we continue to implement reform as we always have, we will continue to get what we have always gotten. Accelerating the field's capacity to learn in and through practice is one key to transforming the good ideas discussed at the White House meeting into tools, interventions, and professional development initiatives that achieve effectiveness reliably at scale. Toward this end, this paper discusses the function of networked communities engaged in improvement research and illustrates the application of these ideas in promoting greater student success in community colleges. Specifically, this white paper:
* Introduces improvement research and networked communities as ideas that we believe can enhance educators' capacities to advance positive change.
* Explains why improvement research requires a different kind of measure -- what we call practical measurement -- distinct from those commonly used by schools for accountability or by researchers for theory development.
* Illustrates through a case study how systematic improvement work to promote student mindsets can be carried out. The case is based on the Carnegie Foundation's effort to address the poor success rates for students in developmental math at community colleges. Specifically, this case details:
- How a practical theory and set of practical measures were created to assess the causes of "productive persistence" -- the set of "non-cognitive factors" thought to powerfully affect community college student success. In doing this work, a broad set of potential factors was distilled into a digestible framework that was useful to practitioners working with researchers, and a large set of potential measures was reduced to a practical (3-minute) set of assessments.
- How these measures were used by researchers and practitioners for practical purposes -- specifically, to assess changes, predict which students were at risk of course failure, and set priorities for improvement work.
- How we organized researchers to work with practitioners to accelerate field-based experimentation on everyday practices that promote academic mindsets (what we call alpha labs), and how we organized practitioners to work with researchers to test, revise, refine, and iteratively improve their everyday practices (using plan-do-study-act cycles).
While significant progress has already occurred, robust, practical, reliable efforts to improve students' mindsets remain at an early formative stage. We hope the ideas presented here are an instructive starting point for new efforts that might attempt to address other problems facing educators, most notably issues of inequality and underperformance in K-12 settings.
"A double-edged sword. This is powerful but it could be used destructively": Perspectives of early career education researchers on learning analytics
Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a 'scarce breed' (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of participatory learning analytics workshop discussions with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we have provided a roadmap for how the learning analytics field might evolve, and practical implications for supporting early career researchers' development.
A Design Methodology for Learning Analytics Information Systems: Informing Learning Analytics Development with Learning Design
The paper motivates, presents and demonstrates a methodology for developing and evaluating learning analytics information systems (LAIS) to support teachers as learning designers. In recent years, there has been increasing emphasis on the benefits of learning analytics to support learning and teaching. Learning analytics can inform and guide teachers in the iterative design process of improving pedagogical practices. This conceptual study proposed a design approach for learning analytics information systems that considers the alignment between learning analytics and learning design activities. The conceptualization incorporated features from learning analytics, learning design, and design science frameworks. The proposed development approach allows for rapid development and implementation of learning analytics for teachers as designers. The study attempted to close the loop between learning analytics and learning design. In essence, this paper informs both teachers and education technologists about the interrelationship between learning design and learning analytics.