Reflective writing analytics for actionable feedback
© 2017 ACM. Reflective writing can provide a powerful way for students to integrate professional experience and academic learning. However, writing reflectively requires high-quality actionable feedback, which is time-consuming to provide at scale. This paper reports progress on the design, implementation, and validation of a Reflective Writing Analytics platform to provide actionable feedback within a tertiary authentic assessment context. The contributions are: (1) a new conceptual framework for reflective writing; (2) a computational approach to modelling reflective writing, deriving analytics, and providing feedback; (3) the pedagogical and user experience rationale for platform design decisions; and (4) a pilot in a student learning context, with preliminary data on educator and student acceptance, and the extent to which we can evidence that the software provided actionable feedback for reflective writing.
A Framework for Assessing Reflective Writing Produced Within the Context of Computer Science Education
Reflective writing is known to be an effective activity for increasing students' learning. However, there is limited literature on reflective writing assessment criteria in the context of computer science (CS) education. In this paper, we aim to explore meaningful reflective writing assessment characteristics that have been used by CS educators to assess reflective text. This paper has two contributions: (a) we developed a Reflective Writing Framework (RWF) covering the main criteria used to assess reflective text in CS education, based on the findings of a semi-structured questionnaire; (b) the RWF was tested empirically in a pilot manual-annotation exercise, whose results were used to refine the framework. This analysis achieved an inter-rater reliability of 0.78. The overall goal of this research is to develop a Learning Analytics (LA) tool which can automatically detect the categories of the RWF present in a text, to assess student authors' reflective writing in relation to CS.
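The abstract reports an inter-rater reliability of 0.78 without naming the statistic used. As a hedged illustration only (the paper's actual method is not specified here), a common choice for two annotators labelling the same segments is Cohen's kappa, which can be sketched as follows; the annotation labels and data are invented for the example:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labelling the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labelled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement, from each annotator's label distribution.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical annotations of ten reflective-writing segments.
a = ["reflective", "descriptive", "reflective", "reflective", "descriptive",
     "reflective", "descriptive", "descriptive", "reflective", "reflective"]
b = ["reflective", "descriptive", "reflective", "descriptive", "descriptive",
     "reflective", "descriptive", "descriptive", "reflective", "reflective"]
print(round(cohens_kappa(a, b), 2))  # → 0.8
```

Kappa corrects raw percentage agreement for the agreement expected by chance, which is why it is often preferred over simple agreement when reporting annotation reliability.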
Scholarly insight Spring 2018: a Data wrangler perspective
In the movie classic Back to the Future, a young Michael J. Fox is able to explore the past in a time machine developed by the slightly bizarre but exquisite Dr Brown. Unexpectedly, small interventions along Fox's adventures changed the course of history a little. In this fourth Scholarly Insight Report we have explored two innovative approaches to learning from OU data of the past, which hopefully in the future will make a large difference in how we support our students and design and implement our teaching and learning practices. In Chapter 1, we provide an in-depth analysis of 50 thousand comments expressed by students through the Student Experience on a Module (SEAM) questionnaire. By analysing over 2.5 million words using big data approaches, our Scholarly Insights indicate that not all student voices are heard. Furthermore, our big data analysis indicates useful potential insights to explore how student voices change over time, and for which particular modules emergent themes might arise.
In Chapter 2 we provide our second innovative approach: a proof-of-concept of qualification pathway analysis using graph approaches. By exploring existing data for one qualification (i.e., Psychology), we show that students make a range of pathway choices during their qualification, some of which are more successful than others. As highlighted in our previous Scholarly Insight Reports, getting data from a qualification perspective within the OU is a difficult and challenging process, and the proof-of-concept provided in Chapter 2 might provide a way forward to better understand and support the complex choices our students make.
In Chapter 3, we provide a slightly more practically-oriented and perhaps down-to-earth approach focussing on the lessons learned with Analytics4Action. Over the last four years nearly a hundred modules have made more active use of data and insights during module presentation to support their students. In Chapter 3 several good practices are described by the LTI/TEL learning design team, as well as three innovative case studies which we hope will inspire you to try something new as well.
Working organically in various Faculty sub-group meetings and LTI Units, and in a Google Doc with various key stakeholders in the Faculties, we hope that our Scholarly Insights can help to inform our staff, but also spark some ideas on how to further improve our module designs and qualification pathways. Of course we are keen to hear what other topics require Scholarly Insight. We hope that you see some potential in the two innovative approaches, and perhaps you might want to try some new ideas in your module. While a time machine has not really been invented yet, with the increasingly rich and fine-grained data about our students and our learning practices we are getting closer to understanding what really drives our students.
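The graph-based pathway analysis described in Chapter 2 of this report can be pictured, at its simplest, as counting module-to-module transitions and the outcomes of students who took them. The following is a minimal sketch under that assumption; the module codes, student records, and success measure are all invented for illustration and do not come from the report:

```python
from collections import defaultdict

# Hypothetical student records: ordered module codes taken, plus whether
# the qualification was completed. Codes are invented for illustration.
students = [
    (["DE100", "DE200", "DE300"], True),
    (["DE100", "DD210", "DE300"], True),
    (["DE100", "DD210", "DE300"], False),
    (["DE100", "DE200", "DD310"], False),
]

# Weighted edge list: each edge is a consecutive module choice, annotated
# with how often students taking that step completed the qualification.
edges = defaultdict(lambda: [0, 0])  # (src, dst) -> [times taken, completions]
for path, completed in students:
    for src, dst in zip(path, path[1:]):
        edges[(src, dst)][0] += 1
        edges[(src, dst)][1] += int(completed)

for (src, dst), (taken, done) in sorted(edges.items()):
    print(f"{src} -> {dst}: taken {taken}x, completion rate {done / taken:.0%}")
```

A real analysis would of course work over full registration data and richer outcome measures, but even this toy graph shows how some pathway choices can be surfaced as more successful than others.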
Critical perspectives on writing analytics
Writing Analytics focuses on the measurement and analysis of written texts for the purpose of understanding writing processes and products, in their educational contexts, and improving the teaching and learning of writing. This workshop adopts a critical, holistic perspective in which the definition of "the system" and "success" is not restricted to IR metrics such as precision and recall, but recognizes the many wider issues that aid or obstruct analytics adoption in educational settings, such as theoretical and pedagogical grounding, usability, user experience, stakeholder design engagement, practitioner development, organizational infrastructure, policy and ethics.
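For readers unfamiliar with the IR metrics the workshop mentions, precision and recall can be stated in a few lines. This sketch is illustrative only; the sentence ids and the "reflective sentence detector" scenario are hypothetical:

```python
def precision_recall(predicted, relevant):
    """Standard IR metrics over sets of retrieved and relevant item ids."""
    predicted, relevant = set(predicted), set(relevant)
    tp = len(predicted & relevant)  # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical: sentence ids a detector flagged as reflective, versus the
# ids a human marker considered reflective.
p, r = precision_recall(predicted={1, 2, 3, 5}, relevant={2, 3, 4, 5, 6})
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.75 recall=0.60
```

The workshop's point is precisely that scores like these, however clean, do not by themselves capture usability, pedagogical grounding, or adoption in real educational settings.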
Embracing imperfection in learning analytics
© 2018 Copyright held by the owner/author(s). Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational "imperfection" can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at "learning how to learn" require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
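A standard illustration of why raw accuracy can mislead (my own example, not taken from the paper) is class imbalance: with few positive cases, a degenerate classifier can score highly while being useless for exactly the students a learner-facing tool should serve. The numbers below are invented:

```python
# Hypothetical imbalanced cohort: 95 non-struggling students, 5 struggling.
true = [0] * 95 + [1] * 5

# A degenerate "classifier" that always predicts the majority class.
pred = [0] * 100

accuracy = sum(t == p for t, p in zip(true, pred)) / len(true)
# Recall on the minority class -- the students who actually need support.
struggling = [i for i, t in enumerate(true) if t == 1]
recall = sum(pred[i] == 1 for i in struggling) / len(struggling)

print(f"accuracy={accuracy:.0%}, recall on struggling students={recall:.0%}")
# High accuracy, yet the classifier never identifies a student who needs help.
```

This is one concrete sense in which optimising classification accuracy alone is "unlikely to hold" as a worthy goal in educational contexts, and why the paper calls for more holistic validation.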
Unravelling the dynamics of learning design within and between disciplines in higher education using learning analytics
Designing effective learning experiences in a virtual learning environment (VLE) can be supported by learning analytics (LA) through explicit feedback on how learning design (LD) influences students' engagement, satisfaction and performance. Marrying LA with LD not only puts existing pedagogical theories in instructional design to the test with actual learning data, but also provides the context of learning which helps educators translate established LA findings into direct interventions. My dissertation aims at unpacking the complexity of LD and its impact on students' engagement, satisfaction and performance on the VLE using LA. The context of this study is 400+ online and blended learning modules at the Open University (OU) UK. This research combines multiple sources of data from the OU Learning Design Initiative (OULDI), system log data, self-reported surveys, and performance data. Given the scope of this study, a wide range of visualization techniques, social network analysis, multi-level modelling, and machine learning will be used.
Quantified Self Analytics Tools for Self-regulated Learning with myPAL
One of the major challenges in higher education is developing self-regulation skills for lifelong learning. We address this challenge within the myPAL project, in a medical education context, utilising the vast amount of student assessment and feedback data collected throughout the programme. The underlying principle of myPAL is Quantified Self -- the use of personal data to enable students to become lifelong learners. myPAL facilitates this with learning analytics combined with interactive nudges. This paper reviews the state of the art in Quantified Self analytics tools to identify what approaches can be adopted in myPAL and what gaps require further research. The paper contributes to awareness and reflection in technology-enhanced learning by: (i) identifying requirements for intelligent personal adaptive learning systems that foster self-regulation (using myPAL as an example); (ii) analysing the state of the art in text analytics and visualisation related to Quantified Self for self-regulated learning; and (iii) identifying open issues and suggesting possible ways to address them.