“A double-edged sword. This is powerful but it could be used destructively”: Perspectives of early career education researchers on learning analytics
Learning analytics has increasingly been described as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new technical and analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a “scarce breed” (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel about learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve, along with practical implications for supporting early career researchers’ development.
It’s About Time: 4th International Workshop on Temporal Analyses of Learning Data
Interest in analyses that probe the temporal aspects of learning continues to grow. The study of common and consequential sequences of events (such as learners accessing resources, interacting with other learners and engaging in self-regulatory activities) and how these are associated with learning outcomes, as well as the ways in which knowledge and skills grow or evolve over time, are both core areas of interest. Learning analytics datasets are replete with fine-grained temporal data: click streams; chat logs; document edit histories (e.g. wikis, etherpads); motion tracking (e.g. eye-tracking, Microsoft Kinect), and so on. However, the emerging area of temporal analysis presents both technical and theoretical challenges in appropriating suitable techniques and interpreting results in the context of learning. The learning analytics community offers a productive focal ground for exploring and furthering efforts to address these challenges, as it is already positioned in the “‘middle space’ where learning and analytic concerns meet” (Suthers & Verbert, 2013, p. 1). This workshop, the fourth in a series on temporal analysis of learning, provides a focal point for analytics researchers to consider issues around, and approaches to, temporality in learning analytics.
Analytics for learning and becoming in practice
Learning Analytics sits at the intersection of the learning sciences and computational data capture and analysis. Analytics should be grounded in the existing literature, with a view to data “geology” or “archeology” over “mining”. This workshop explores how analytics may extend the common notion of activity trace data from learning processes to encompass learning practices, with a working distinction for discussion as (1) process: a series of related actions engaged in as part of learning activities; and (2) practice: a repertoire of processes organised around particular foci recognised within a social group. The workshop intersperses attendee presentations and demonstrations with relevant theme-based discussions.
Towards a Convergent Development of Learning Analytics
In the last 7 years, since the first LAK conference, Learning Analytics has grown rapidly as a field, from a small group of interested scholars and practitioners to one of the most scientifically successful and institutionally accepted areas of Learning and Educational Technologies. Learning Analytics is often referred to as a “middle space” where experts from diverse fields (the Learning Sciences, Computer Science, Human-Computer Interaction, Psychology and Behavioural Sciences, to name a few) share their perspectives on how to better understand and optimize learning processes and environments using this new instrument called Data Science.
Embracing Trustworthiness and Authenticity in the Validation of Learning Analytics Systems
Learning analytics sits in the middle space between learning theory and data analytics. The inherent diversity of learning analytics manifests itself in an epistemology that strikes a balance between positivism and interpretivism, and knowledge that is sourced from theory and practice. In this paper, we argue that validation approaches for learning analytics systems should be cognisant of these diverse foundations. Through a systematic review of learning analytics validation research, we find that there is currently an over-reliance on positivistic validity criteria. Researchers tend to ignore interpretivistic criteria such as trustworthiness and authenticity. In the 38 papers we analysed, researchers covered positivistic validity criteria 221 times, whereas interpretivistic criteria were mentioned 37 times. We argue that learning analytics can only move forward with holistic validation strategies that incorporate “thick descriptions” of educational experiences. We conclude by outlining a planned validation study using argument-based validation, which we believe will yield meaningful insights by considering a diverse spectrum of validity criteria.
Refining the Learning Analytics Capability Model: A Single Case Study
Learning analytics can help higher educational institutions improve learning. Its adoption, however, is a complex undertaking. The Learning Analytics Capability Model describes the 34 organizational capabilities that must be developed to support the successful adoption of learning analytics. This paper describes the first iteration to evaluate and refine the current, theoretical model. During a case study, we conducted four semi-structured interviews and collected (internal) documentation at a Dutch university that is mature in the use of student data to improve learning. Based on the empirical data, we merged seven capabilities, renamed three capabilities, and improved the definitions of all others. Six capabilities absent in extant learning analytics models are present at the case organization, implying that they are important to learning analytics adoption. As a result, the new, refined Learning Analytics Capability Model comprises 31 capabilities. Finally, some challenges were identified, showing that even mature organizations still have issues to overcome.
The dinosaur that lost its head: A contribution to a framework using Learning Analytics in Learning Design
This paper presents an approach to the meaningful use of learning analytics as a tool for teachers to improve the robustness of their learning designs. The approach is based on examining how participants act within a Massive Open Online Course (MOOC) format through learning analytics. We show that a teacher/designer can gain knowledge about his or her intended, implemented and attained learning design; about how MOOC participants act in response to these; and about how students are able to develop “study efficiency” when participating in a MOOC. The learning analytics approach makes it possible to follow certain MOOC students and their study behaviour (e.g. the participants who pass the MOOC by earning enough achievement badges) and to examine the role of the moderator in MOOCs, showing that scaffolding plays a central role in studying and learning processes in an educational format such as a MOOC.
Keywords: MOOCs, Massive Open Online Courses, data-saturated, learning analytics, learning design, educational design research, LMS
Affordances of Learning Analytics for Mediating Learning
Learning analytics acceptance and adoption is a socio-technological endeavour. Understanding how learning analytics impact practice is an important part of demonstrating their value. In the study presented in this thesis, “Mediated Learning” provides a framework through which to describe how learning analytics can impact psychological, social and material aspects of learning, from the perspective of educators and learners. It also offers a structure through which to make recommendations for improving the mediatory effects of learning analytics. A qualitative research design, based on “Grounded Theory”, was implemented, and 10 educators from 3 European universities were recruited through convenience and purposive sampling for exploratory interviews. A subsequent case study of the Open University provided critical perspectives from both educators (n=18) and learners (n=22) about the institutional, departmental, domain-related and epistemological factors that broadly influence perceptions of learning analytics. The study applied “Affordance Theory” to identify what participants were most easily able to recognise as beneficial to their own practice. Participant contributions were open-coded to uncover emerging themes and then organised into thematic categories and subcategories. Respondent validation, as well as triangulation of data between the exploratory interviews and focus groups, supports the validity of the study. Findings suggested that domain-related epistemological assumptions and previous experience influence how and why an individual could make use of learning analytics insights. Gaining stakeholder acceptance involves targeting the right training and opportunities at the appropriate disciplines. Findings also indicate that learning analytics has the strongest mediatory effect for learners when the technology is capable of exposing them to other learners' strategies, or when it assists them personally and continually in goal orientation adoption.
The implications of the study are important for higher education institutions looking to implement large-scale learning analytics initiatives, in particular those with a diverse student body.
The Construction and Validation of an Instructor Learning Analytics Implementation Model to Support At-Risk Students
With the widespread use of learning analytics tools, there is a need to explore how these technologies can be used to enhance teaching and learning. Little research has been conducted on what human processes are necessary to facilitate meaningful adoption of learning analytics. The research problem is that there is a lack of evidence-based guidance on how instructors can effectively implement learning analytics to support academically at-risk students with the purpose of improving learning outcomes. The goal was to develop and validate a model to guide instructors in the implementation of learning analytics tools to support academically at-risk students with the purpose of improving learning outcomes. Using design and development research methods, an implementation model was constructed and validated internally. Themes emerged in two categories, adoption and caution. Six themes fell under adoption: LA as evidence, reaching out, frequency, early identification/intervention, self-reflection, and aligning LA with pedagogical intent. Three themes fell under caution: skepticism, fear of overdependence, and questions of usefulness. The model should enhance instructors’ use of learning analytics by enabling them to better take advantage of available technologies to support teaching and learning in online and blended learning environments. Researchers can further validate the model by studying its usability (i.e., usefulness, effectiveness, efficiency, and learnability), as well as how instructors’ use of this model to implement learning analytics in their courses affects retention, persistence, and performance.
- …