251 research outputs found

    ‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics

    Learning analytics has increasingly been described as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a ‘scarce breed’ (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve and practical implications for supporting early career researchers’ development.

    It’s About Time: 4th International Workshop on Temporal Analyses of Learning Data

    Interest in analyses that probe the temporal aspects of learning continues to grow. The study of common and consequential sequences of events (such as learners accessing resources, interacting with other learners and engaging in self-regulatory activities) and how these are associated with learning outcomes, as well as the ways in which knowledge and skills grow or evolve over time, are both core areas of interest. Learning analytics datasets are replete with fine-grained temporal data: click streams; chat logs; document edit histories (e.g. wikis, etherpads); motion tracking (e.g. eye-tracking, Microsoft Kinect), and so on. However, the emerging area of temporal analysis presents both technical and theoretical challenges in appropriating suitable techniques and interpreting results in the context of learning. The learning analytics community offers a productive focal ground for exploring and furthering efforts to address these challenges, as it is already positioned in the “‘middle space’ where learning and analytic concerns meet” (Suthers & Verbert, 2013, p. 1). This workshop, the fourth in a series on temporal analysis of learning, provides a focal point for analytics researchers to consider issues surrounding, and approaches to, temporality in learning analytics.

    Towards a Convergent Development of Learning Analytics

    In the last 7 years, since the first LAK conference, Learning Analytics has grown rapidly as a field from a small group of interested scholars and practitioners to one of the most scientifically successful and institutionally accepted areas of Learning and Educational Technologies. Learning Analytics is often referred to as a "Middle Space" where experts from diverse fields (the Learning Sciences, Computer Science, Human-Computer Interaction, Psychology and Behavioural Sciences, to name just a few) share their perspectives on how to better understand and optimize learning processes and environments using this new instrument called Data Science.

    Embracing Trustworthiness and Authenticity in the Validation of Learning Analytics Systems

    Learning analytics sits in the middle space between learning theory and data analytics. The inherent diversity of learning analytics manifests itself in an epistemology that strikes a balance between positivism and interpretivism, and knowledge that is sourced from theory and practice. In this paper, we argue that validation approaches for learning analytics systems should be cognisant of these diverse foundations. Through a systematic review of learning analytics validation research, we find that there is currently an over-reliance on positivistic validity criteria. Researchers tend to ignore interpretivistic criteria such as trustworthiness and authenticity. In the 38 papers we analysed, researchers covered positivistic validity criteria 221 times, whereas interpretivistic criteria were mentioned 37 times. We argue that learning analytics can only move forward with holistic validation strategies that incorporate “thick descriptions” of educational experiences. We conclude by outlining a planned validation study using argument-based validation, which we believe will yield meaningful insights by considering a diverse spectrum of validity criteria.

    Refining the Learning Analytics Capability Model: A Single Case Study

    Learning analytics can help higher educational institutions improve learning. Its adoption, however, is a complex undertaking. The Learning Analytics Capability Model describes the 34 organizational capabilities that must be developed to support the successful adoption of learning analytics. This paper describes the first iteration of evaluating and refining the current, theoretical model. During a case study, we conducted four semi-structured interviews and collected (internal) documentation at a Dutch university that is mature in its use of student data to improve learning. Based on the empirical data, we merged seven capabilities, renamed three capabilities, and improved the definitions of all others. Six capabilities absent from extant learning analytics models are present at the case organization, implying that they are important to learning analytics adoption. As a result, the new, refined Learning Analytics Capability Model comprises 31 capabilities. Finally, some challenges were identified, showing that even mature organizations still have issues to overcome.

    The dinosaur that lost its head: A contribution to a framework using Learning Analytics in Learning Design

    This paper presents an approach to the meaningful use of learning analytics as a tool for teachers to improve the robustness of their learning designs. The approach is based on examining, through learning analytics, how participants act within a Massive Open Online Course (MOOC) format. We show that a teacher/designer can gain knowledge about his or her intended, implemented and attained learning design; about how MOOC participants act in response to these; and about how students are able to develop ‘study efficiency’ when participating in a MOOC. The learning analytics approach makes it possible to follow certain MOOC students and their study behaviour (e.g. the participants who pass the MOOC by earning enough achievement badges) and to examine the role of the moderator in MOOCs, showing that scaffolding plays a central role in studying and learning processes in an educational format such as a MOOC. Keywords: MOOCs, Massive Open Online Courses, data-saturated, learning analytics, learning design, educational design research, LMS.

    The Construction and Validation of an Instructor Learning Analytics Implementation Model to Support At-Risk Students

    With the widespread use of learning analytics tools, there is a need to explore how these technologies can be used to enhance teaching and learning. Little research has been conducted on what human processes are necessary to facilitate meaningful adoption of learning analytics. The research problem is that there is a lack of evidence-based guidance on how instructors can effectively implement learning analytics to support academically at-risk students with the purpose of improving learning outcomes. The goal was to develop and validate a model to guide instructors in the implementation of learning analytics tools to support academically at-risk students with the purpose of improving learning outcomes. Using design and development research methods, an implementation model was constructed and validated internally. Themes emerged in two categories, adoption and caution. Six themes fell under adoption: LA as evidence, reaching out, frequency, early identification/intervention, self-reflection, and aligning LA with pedagogical intent. Three themes fell under caution: skepticism, fear of overdependence, and questions of usefulness. The model should enhance instructors’ use of learning analytics by enabling them to better take advantage of available technologies to support teaching and learning in online and blended learning environments. Researchers can further validate the model by studying its usability (i.e., usefulness, effectiveness, efficiency, and learnability), as well as how instructors’ use of this model to implement learning analytics in their courses affects retention, persistence, and performance.