8,044 research outputs found

    Mixing and Matching Learning Design and Learning Analytics

    In the last five years, learning analytics has proved its potential for predicting academic performance from trace data of learning activities. However, the role of pedagogical context in learning analytics has not been fully understood. To date, it has been difficult to quantify learning in a way that can be measured and compared. By coding the design of e-learning courses, this study demonstrates how learning design is being implemented on a large scale at the Open University UK, and how learning analytics could support as well as benefit from learning design. Building on our previous work, our analysis was conducted longitudinally on 23 undergraduate distance learning modules and their 40,083 students. The innovative aspect of this study is the availability of fine-grained learning design data at the individual task level, which allows us to consider the connections between learning activities and the media used to produce them. Using a combination of visualizations and social network analysis, our findings revealed diversity in how learning activities were designed within and between disciplines, as well as within individual learning activities. By reflecting on their learning design in an explicit manner, educators are empowered to compare and contrast their designs using their own institutional data.
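    The social network analysis step mentioned above can be made concrete with a minimal sketch: degree centrality over a co-occurrence network of learning activity types. The activity types and co-occurrence pairs below are hypothetical examples for illustration, not the Open University dataset the abstract describes.

```python
# Hypothetical sketch: degree centrality over a network in which two
# learning activity types are linked when they co-occur in the same
# design unit. Example data is invented, not the study's dataset.
from collections import defaultdict

co_occurrences = [
    ("assimilative", "assessment"),
    ("assimilative", "communication"),
    ("finding_info", "productive"),
    ("productive", "assessment"),
]

neighbours = defaultdict(set)
for a, b in co_occurrences:
    neighbours[a].add(b)
    neighbours[b].add(a)

n = len(neighbours)
# Degree centrality: share of other activity types each one connects to.
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in neighbours.items()}
for activity, score in sorted(centrality.items(), key=lambda x: -x[1]):
    print(f"{activity}: {score:.2f}")
```

    In a real analysis the same computation would run per module, so that designs can be compared within and between disciplines.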

    Learning analytics support to teachers' design and orchestrating tasks

    Background: Data-driven educational technology solutions have the potential to support teachers in different tasks, such as the design and orchestration of collaborative learning activities. When designing, such solutions can improve teachers' understanding of how learning designs impact student learning and behaviour, and guide them to refine and redesign future learning designs. When orchestrating educational scenarios, data-driven solutions can support teacher awareness of learner participation and progress and enhance real-time classroom management. Objectives: The use of learning analytics (LA) can be considered a suitable approach to tackle both problems. However, it is unclear whether the same LA indicators can satisfactorily support both the design and orchestration of activities. This study aims to investigate the use of the same LA indicators to support multiple teacher tasks (design, redesign, and orchestration), a gap in the existing literature that requires further exploration. Methods: In this study, we first review previous work on the use of different LA indicators to support both tasks. We then analyse the nature of the two tasks, focusing on a case study that uses the same collaborative learning tool with LA to support both. Implications: The study findings yield design considerations for LA support of teachers' design and orchestration tasks.
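    To make the "same indicator, two tasks" question concrete, here is a minimal sketch of one indicator, per-student contribution share, read two ways: live during orchestration (flagging quiet students) and in aggregate for redesign (overall participation balance). The event log, the 20% threshold, and the indicator itself are assumptions invented for illustration, not taken from the study.

```python
# Hypothetical sketch: one LA indicator (contribution share) reused for
# orchestration (live flagging) and redesign (aggregate balance).
# Event data and the 20% threshold are invented for illustration.
from collections import Counter

# Each event: (student_id, action) from a collaborative learning tool.
events = [
    ("s1", "post"), ("s2", "post"), ("s1", "edit"),
    ("s3", "post"), ("s1", "post"), ("s2", "edit"),
]

def contribution_share(events):
    """Fraction of all logged actions made by each student."""
    counts = Counter(student for student, _ in events)
    total = sum(counts.values())
    return {s: n / total for s, n in counts.items()}

shares = contribution_share(events)
# Orchestration view: flag students below an assumed 20% threshold.
quiet = [s for s, share in shares.items() if share < 0.20]
# Redesign view: spread between most and least active students.
balance = max(shares.values()) - min(shares.values())
print(shares, quiet, round(balance, 2))
```

    The design consideration the sketch illustrates: the indicator is computed once, but the threshold view serves the orchestration task while the spread serves the redesign task.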

    The dinosaur that lost its head: A contribution to a framework using Learning Analytics in Learning Design

    This paper presents an approach to the meaningful use of learning analytics as a tool for teachers to improve the robustness of their learning designs. The approach is based on examining how participants act within a Massive Open Online Course (MOOC) format through learning analytics. We show that a teacher/designer can gain knowledge about his or her intended, implemented and attained learning design; about how MOOC participants act in response to these; and about how students are able to develop 'study efficiency' when participating in a MOOC. The learning analytics approach makes it possible to follow certain MOOC students and their study behaviour (e.g. the participants who pass the MOOC by earning enough achievement badges) and to examine the role of the moderator in MOOCs, showing that scaffolding plays a central role in studying and learning processes in an educational format such as a MOOC.
    Keywords: MOOCs, Massive Open Online Courses, data-saturated, learning analytics, learning design, educational design research, LMS

    From Theory to Action: Developing and Evaluating Learning Analytics for Learning Design

    The effectiveness of using learning analytics for learning design primarily depends upon two concepts: grounding and alignment. This is the primary conjecture for the study described in this paper. In our design-based research study, we design, test, and evaluate teacher-facing learning analytics for an online inquiry science unit on global climate change. We design our learning analytics in accordance with a socioconstructivism-based pedagogical framework, called Knowledge Integration, and the principles of Learning Analytics Implementation Design. Our methodology for the design process draws upon the principles of the Orchestrating for Learning Analytics framework to engage stakeholders (i.e. teachers, researchers, and developers). The resulting learning analytics were aligned to unit activities that engaged students in key aspects of the knowledge integration process. They provided teachers with actionable insight into their students' understanding at critical junctures in the learning process. We demonstrate the efficacy of the learning analytics in supporting the optimization of the unit's learning design. We conclude by synthesizing the principles that guided our design process into a framework for developing and evaluating learning analytics for learning design.
    Funding: Ministerio de Ciencia, Innovación y Universidades (project TIN2017-85179-C3-2-R); Junta de Castilla y León (project VA257P18); European Commission (project grant 588438-EPP-1-2017-1-EL-EPPKA2-KA).

    Improvement Research Carried Out Through Networked Communities: Accelerating Learning about Practices that Support More Productive Student Mindsets

    The research on academic mindsets shows significant promise for addressing important problems facing educators. However, the history of educational reform is replete with good ideas for improvement that fail to realize the promises that accompany their introduction. As a field, we are quick to implement new ideas but slow to learn how to execute well on them. If we continue to implement reform as we always have, we will continue to get what we have always gotten. Accelerating the field's capacity to learn in and through practice to improve is one key to transforming the good ideas discussed at the White House meeting into tools, interventions, and professional development initiatives that achieve effectiveness reliably at scale. Toward this end, this paper discusses the function of networked communities engaged in improvement research and illustrates the application of these ideas in promoting greater student success in community colleges. Specifically, this white paper:
    * Introduces improvement research and networked communities as ideas that we believe can enhance educators' capacities to advance positive change.
    * Explains why improvement research requires a different kind of measures -- what we call practical measurement -- that are distinct from those commonly used by schools for accountability or by researchers for theory development.
    * Illustrates through a case study how systematic improvement work to promote student mindsets can be carried out. The case is based on the Carnegie Foundation's effort to address the poor success rates for students in developmental math at community colleges. Specifically, this case details:
    - How a practical theory and set of practical measures were created to assess the causes of "productive persistence" -- the set of "non-cognitive factors" thought to powerfully affect community college student success. In doing this work, a broad set of potential factors was distilled into a digestible framework that was useful to practitioners working with researchers, and a large set of potential measures was reduced to a practical (3-minute) set of assessments.
    - How these measures were used by researchers and practitioners for practical purposes -- specifically, to assess changes, predict which students were at risk for course failure, and set priorities for improvement work.
    - How we organized researchers to work with practitioners to accelerate field-based experimentation on everyday practices that promote academic mindsets (what we call alpha labs), and how we organized practitioners to work with researchers to test, revise, refine, and iteratively improve their everyday practices (using plan-do-study-act cycles).
    While significant progress has already occurred, robust, practical, reliable efforts to improve students' mindsets remain at an early formative stage. We hope the ideas presented here are an instructive starting point for new efforts that might attempt to address other problems facing educators, most notably issues of inequality and underperformance in K-12 settings.
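    The idea of a practical (3-minute) measure used to flag at-risk students can be sketched in a few lines. The survey items, scoring, and cut-off below are illustrative assumptions, not the Carnegie instrument, which the abstract does not reproduce.

```python
# Hypothetical sketch of a practical measure: a 3-item mindset survey
# scored into a single "productive persistence" index, with a cut-off
# used to flag students for follow-up. Items, weights, and threshold
# are invented for illustration, not the Carnegie instrument.

ITEMS = ("belonging", "growth_mindset", "value")  # 1-5 Likert responses

def persistence_index(responses):
    """Sum of the three item scores, scaled to the 0-1 range."""
    return sum(responses[i] for i in ITEMS) / (5 * len(ITEMS))

def flag_at_risk(students, threshold=0.5):
    """Return ids of students whose index falls below the cut-off."""
    return [sid for sid, r in students.items()
            if persistence_index(r) < threshold]

students = {
    "a": {"belonging": 4, "growth_mindset": 5, "value": 4},
    "b": {"belonging": 2, "growth_mindset": 2, "value": 3},
}
print(flag_at_risk(students))  # student "b" falls below the cut-off
```

    The point of a practical measure, as the white paper argues, is exactly this brevity: a score quick enough to collect routinely, yet informative enough to set priorities for improvement work.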

    ‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics

    Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a 'scarce breed' (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges, and anxieties towards learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve, along with practical implications for supporting early career researchers' development.

    A Design Methodology for Learning Analytics Information Systems: Informing Learning Analytics Development with Learning Design

    The paper motivates, presents, and demonstrates a methodology for developing and evaluating learning analytics information systems (LAIS) to support teachers as learning designers. In recent years, there has been increasing emphasis on the benefits of learning analytics for supporting learning and teaching. Learning analytics can inform and guide teachers in the iterative design process of improving pedagogical practices. This conceptual study proposes a design approach for learning analytics information systems that considers the alignment between learning analytics and learning design activities. The conceptualization incorporates features from learning analytics, learning design, and design science frameworks. The proposed development approach allows for rapid development and implementation of learning analytics for teachers as designers. The study attempts to close the loop between learning analytics and learning design. In essence, this paper informs both teachers and education technologists about the interrelationship between learning design and learning analytics.
