‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics
Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a ‘scarce breed’ (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of participatory learning analytics workshop discussions with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve and practical implications for supporting early career researchers’ development.
Learning design in diverse institutional and cultural contexts: suggestions from a participatory workshop with higher education professionals in Africa
Learning design approaches, such as those adopted by the Open University, provide a set of tools and resources for purposefully designing modules with a focus on student experiences. However, many of the current learning design strategies have been situated within specific institutions in Europe and North America. This means that there are several issues worth considering around whether and how established learning design approaches make sense in diverse institutional and cultural contexts. To critically assess the relevance and appropriateness of learning design strategies in new contexts, this article describes an in-depth participatory workshop with 34 education professionals from five African countries. Altogether, 10 suggestions for learning design practices were derived from the consensus of workshop participants, which provide a foundation for the development of learning design practices moving forward.
Study design and protocol for a mixed methods evaluation of an intervention to reduce and break up sitting time in primary school classrooms in the UK: the CLASS PAL (Physically Active Learning) Programme
Introduction: Children engage in a high volume of sitting in school, particularly in the classroom. A number of strategies, such as physically active lessons (termed movement integration (MI)), have been developed to integrate physical activity into this learning environment; however, no single approach is likely to meet the needs of all pupils and teachers. This protocol outlines an implementation study of a primary school-based MI intervention: CLASS PAL (Physically Active Learning) programme. This study aims to (A) determine the degree of implementation of CLASS PAL, (B) identify processes by which teachers and schools implement CLASS PAL and (C) investigate individual (pupil and teacher) level and school-level characteristics associated with implementation of CLASS PAL.
Methods and analysis: The intervention will provide teachers with a professional development workshop and a bespoke teaching resources website. The study will use a single group before-and-after design, strengthened by multiple interim measurements. Six state-funded primary schools will be recruited within Leicestershire, UK. Evaluation data will be collected prior to implementation and at four discrete time points during implementation: at measurement 0 (October 2016), school, teacher and pupil characteristics will be collected. At measurements 0 and 3 (June-July 2017), accelerometry, cognitive functioning, self-reported sitting and classroom engagement data will be collected. At measurements 1 (December 2016-March 2017) and 3, teacher interviews (also at measurement 4; September-October 2017) and pupil focus groups will be conducted, and at measurements 1 and 2 (April-May 2017), classroom observations will be carried out. Implementation will be captured through website analytics and ongoing teacher-completed logs.
Ethics and dissemination: Ethical approval was obtained through the Loughborough University Human Participants Ethics Sub-Committee (Reference number: R16-P115). Findings will be disseminated via practitioner and/or research journals and to relevant regional and national stakeholders through print and online media and dissemination event(s).
Analysing video and audio data: existing approaches and new innovations
Across many subject disciplines, video and audio data are recorded in order to document processes, procedures or interactions. These video and audio data are consequently analysed using a number of techniques to make sense of what was happening at the time of the recording, sometimes in relation to initial hypotheses or sometimes in terms of a 'post hoc' analysis where a more grounded approach is used. This paper contains an overview of tools and techniques for examining video data and looks at potential new methods borrowed from the field of learning analytics, related to discourse analysis. Discourse analysis, where conversations and the spoken word are explored and dissected in detail, can provide us with information about the learning context and the ways in which learners interact with people and other resources in their environment.
The Evidence Hub: harnessing the collective intelligence of communities to build evidence-based knowledge
Conventional document and discussion websites provide users with no help in assessing the quality or quantity of evidence behind any given idea. Besides, the very meaning of what evidence is may not be unequivocally defined within a community, and may require deep understanding, common ground and debate. An Evidence Hub is a tool to pool the community's collective intelligence on what counts as evidence for an idea. It provides an infrastructure for debating and building evidence-based knowledge and practice. An Evidence Hub is best thought of as a filter onto other websites — a map that distills the most important issues, ideas and evidence from the noise by making clear why ideas and web resources may be worth further investigation. This paper describes the Evidence Hub concept and rationale, the breadth of user engagement and the evolution of specific features, derived from our work with different community groups in the healthcare and educational sectors.
Innovating for Learning: Designing for the Future of Education
Teaching has moved online as the world has moved online, and learning is losing its sense of physical location with the availability of many different options, from mobile to MOOC (Massive Open Online Course). The impact of online learning is not confined to distance learning; when a student attends a campus university, they are now as likely to meet with their fellow learners virtually as face to face. The education sector has yet to fully adapt to what this means, and indeed there are strong signs of a built-in resilience from providers, employers and students themselves, which may mean an apparent evolution is more likely than a revolution. At the same time, there are some quiet changes underway that mean we should be preparing to innovate for the revolution to come. Some of those changes are considered in work undertaken at The Open University that has been disseminated in a series of Innovating Pedagogy reports. These reports allow the academic authors to be more speculative than is usual practice and to engage in considering the future, while remaining based on a view of what is happening in the sector. In particular, they adopt a position focused on pedagogy that balances the technology-based futurology which can dominate yet fail to resonate with those actually involved in the teaching process. The annual Innovating Pedagogy reports cover 10 topics each, with some deliberate overlap from year to year and development of themes that show innovations moving into teaching practice. This is illustrated by two cases: the impact of MOOCs and the application of learning design and analytics. The development of MOOCs demonstrates the value of reviewing pedagogy that aligns with technology, while the use of learning design and learning analytics demonstrates how improvements in the way we describe our learning processes and the way we understand learner behaviour are helping to determine how choices in pedagogy impact on student satisfaction, progression and success.