    Reflective writing analytics for actionable feedback

    © 2017 ACM. Reflective writing can provide a powerful way for students to integrate professional experience and academic learning. However, writing reflectively requires high-quality actionable feedback, which is time-consuming to provide at scale. This paper reports progress on the design, implementation, and validation of a Reflective Writing Analytics platform to provide actionable feedback within a tertiary authentic assessment context. The contributions are: (1) a new conceptual framework for reflective writing; (2) a computational approach to modelling reflective writing, deriving analytics, and providing feedback; (3) the pedagogical and user experience rationale for platform design decisions; and (4) a pilot in a student learning context, with preliminary data on educator and student acceptance, and the extent to which we can evidence that the software provided actionable feedback for reflective writing.

    Critical perspectives on writing analytics

    Writing Analytics focuses on the measurement and analysis of written texts for the purpose of understanding writing processes and products in their educational contexts, and improving the teaching and learning of writing. This workshop adopts a critical, holistic perspective in which the definition of "the system" and "success" is not restricted to IR metrics such as precision and recall, but recognizes the many wider issues that aid or obstruct analytics adoption in educational settings, such as theoretical and pedagogical grounding, usability, user experience, stakeholder design engagement, practitioner development, organizational infrastructure, policy, and ethics.
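The IR metrics the workshop names, precision and recall, can be illustrated with a minimal sketch; the data and the binary labelling task (1 = sentence flagged as reflective) are hypothetical examples, not taken from the workshop itself:

```python
def precision_recall(y_true, y_pred):
    """Compute precision and recall for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of flagged items, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of true positives, how many were found
    return precision, recall

# Hypothetical gold labels vs. model output for six sentences:
y_true = [1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
print(precision_recall(y_true, y_pred))
```

The workshop's point is that even perfect scores on such metrics say nothing about usability, pedagogical grounding, or adoption, which is why it treats them as only one part of "success".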

    Embracing imperfection in learning analytics

    © 2018 Copyright held by the owner/author(s). Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational “imperfection” can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at “learning how to learn” require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
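One standard reason raw classification accuracy can be a poor proxy for usefulness, separate from the paper's own learner-facing argument, is class imbalance; this minimal sketch uses invented data (labels and the "needs feedback" framing are hypothetical, not from the paper):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the gold labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical batch: 1 = "needs feedback", rare in this cohort.
y_true = [0] * 95 + [1] * 5
always_zero = [0] * 100  # degenerate model that never flags anyone

# High accuracy, yet the model finds none of the students who need help.
print(accuracy(y_true, always_zero))  # 0.95
```

Examples like this motivate the paper's call for reporting enough information about algorithmic tools that realistic, holistic performance comparisons become possible.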

    Quantified Self Analytics Tools for Self-regulated Learning with myPAL

    One of the major challenges in higher education is developing self-regulation skills for lifelong learning. We address this challenge within the myPAL project, in a medical education context, utilising the vast amount of student assessment and feedback data collected throughout the programme. The underlying principle of myPAL is Quantified Self -- the use of personal data to enable students to become lifelong learners. myPAL facilitates this with learning analytics combined with interactive nudges. This paper reviews the state of the art in Quantified Self analytics tools to identify which approaches can be adopted in myPAL and which gaps require further research. The paper contributes to awareness and reflection in technology-enhanced learning by: (i) identifying requirements for intelligent personal adaptive learning systems that foster self-regulation (using myPAL as an example); (ii) analysing the state of the art in text analytics and visualisation related to Quantified Self for self-regulated learning; and (iii) identifying open issues and suggesting possible ways to address them.