
    Towards Value-Sensitive Learning Analytics Design

    To support ethical considerations and system integrity in learning analytics, this paper introduces two cases of applying the Value Sensitive Design methodology to learning analytics design. The first study applied two methods of Value Sensitive Design, namely stakeholder analysis and value analysis, in a conceptual investigation of an existing learning analytics tool. This investigation uncovered a number of values and value tensions, leading to design trade-offs to be considered in future tool refinements. The second study applied Value Sensitive Design holistically to the design of a recommendation system for the Wikipedia WikiProjects. To proactively consider stakeholder values, we derived a multi-stage design process that included literature analysis, empirical investigations, prototype development, community engagement, iterative testing and refinement, and continuous evaluation. By reporting on these two cases, this paper responds to the need for practical means of supporting ethical considerations and human values in learning analytics systems. The two cases demonstrate that Value Sensitive Design could be a viable approach for balancing a wide range of human values, which tend to encompass and surpass ethical issues, in learning analytics design. (Comment: The 9th International Learning Analytics & Knowledge Conference, LAK19.)

    Aligning the Goals of Learning Analytics with its Research Scholarship: An Open Peer Commentary Approach

    To promote cross-community dialogue on matters of significance within the field of learning analytics (LA), we as editors-in-chief of the Journal of Learning Analytics (JLA) have introduced a section for papers that are open to peer commentary. An invitation to submit proposals for commentaries on the paper was released, and 12 of these proposals were accepted. The 26 authors of the accepted commentaries are based in Europe, North America, and Australia. They range in experience from PhD students and early-career researchers to some of the longest-standing, most senior members of the learning analytics community. This paper brings those commentaries together, and we recommend reading it as a companion piece to the original paper by Motz et al. (2023), which also appears in this issue.

    A LAK of Direction: Misalignment Between the Goals of Learning Analytics and its Research Scholarship

    Learning analytics defines itself by its focus on data from learners and learning environments, with the corresponding goals of understanding and optimizing student learning. Ideally, then, learning analytics research should be characterized by studies that make use of data from learners engaged in education systems, that measure student learning, and that make efforts to intervene in and improve these learning environments.

    The Question-driven Dashboard: How Can We Design Analytics Interfaces Aligned to Teachers’ Inquiry?

    One of the ultimate goals of several learning analytics (LA) initiatives is to close the loop and support students’ and teachers’ reflective practices. Although there has been a proliferation of end-user interfaces (often in the form of dashboards), various limitations have been identified in the literature, such as key stakeholders not being involved in their design, little or no account taken of sense-making needs, and unclear effects on teaching and learning. There has been a recent call for human-centred design practices that create LA interfaces in close collaboration with educational stakeholders, taking into account the learning design and stakeholders’ authentic needs and pedagogical intentions. This paper addresses that call by proposing a question-driven LA design approach that ensures end-user LA interfaces explicitly address teachers’ questions. We illustrate the approach in the context of synchronous online activities orchestrated by pairs of teachers using audio-visual and text-based tools (namely Zoom and Google Docs). The study led to the design and deployment of an open-source monitoring tool to be used in real time by teachers while students work collaboratively in breakout rooms and across learning spaces.

    Embracing Trustworthiness and Authenticity in the Validation of Learning Analytics Systems

    Learning analytics sits in the middle space between learning theory and data analytics. The inherent diversity of learning analytics manifests itself in an epistemology that strikes a balance between positivism and interpretivism, and in knowledge that is sourced from both theory and practice. In this paper, we argue that validation approaches for learning analytics systems should be cognisant of these diverse foundations. Through a systematic review of learning analytics validation research, we find that there is currently an over-reliance on positivistic validity criteria: in the 38 papers we analysed, researchers invoked positivistic validity criteria 221 times, whereas interpretivistic criteria such as trustworthiness and authenticity were mentioned only 37 times. We argue that learning analytics can only move forward with holistic validation strategies that incorporate “thick descriptions” of educational experiences. We conclude by outlining a planned validation study using argument-based validation, which we believe will yield meaningful insights by considering a diverse spectrum of validity criteria.

    Analytics of student interactions: towards theory-driven, actionable insights

    The field of learning analytics arose as a response to the vast quantities of data that are increasingly generated about students, their engagement with learning resources, and their learning and future career outcomes. While the field began as a collage, adopting methods and theories from a variety of disciplines, it has now become a major area of research and has had a substantial impact on practice, policy, and decision-making. Although the field supports the collection and analysis of a wide array of data, existing work has predominantly focused on the digital traces generated through interactions with technology, learning content, and other students. Yet for any analyses to support students and teachers, the measures derived from these data must (1) offer practical and actionable insight into learning processes and outcomes, and (2) be theoretically grounded. As the field has matured, a number of challenges related to these criteria have become apparent. For instance, concerns have been raised that the literature prioritises predictive modeling over ensuring that these models are capable of informing constructive actions. Furthermore, the methodological validity of much of this work has been challenged, as a swathe of recent research has found that many of these models fail to replicate in novel contexts.

    The work presented in this thesis addresses both of these concerns. In doing so, our research is pervaded by three key concerns: firstly, ensuring that any measures developed are both structurally valid and generalise across contexts; secondly, providing actionable insight with regard to student engagement; and finally, providing representations of student interactions that are predictive of student outcomes, namely grades and students’ persistence in their studies.

    This research programme is heavily indebted to the work of Vincent Tinto, who conceptually distinguishes between the interactions students have with the academic and social domains present within their educational institution. This model has been subjected to extensive empirical validation using a range of methods and data: while some studies have relied upon survey responses, others have used social network metrics, demographic variables, and students’ time spent in class together to evaluate Tinto’s claims. The model provides a foundation for the thesis, and the work presented may be categorised into two distinct veins aligning with the academic and social aspects of integration that Tinto proposes. These two domains, Tinto argues, continually modify a student’s goals and commitments, resulting in persistence or eventual disengagement and dropout.

    In the former, academic domain, we present a series of novel methodologies developed for modeling student engagement with academic resources. We assessed how an individual student’s behaviour may be modeled using hidden Markov models (HMMs) to provide representations that enable actionable insight. However, in the face of considerable individual differences and cross-course variation, the validity of such methods may be called into question. Accordingly, ensuring that any measurements of student engagement are both structurally valid and generalise across course contexts and disciplines became a central concern. To address this, we developed our model of student engagement using sticky-HMMs, emphasised the more interpretable insight such an approach provides compared to competing models, demonstrated its cross-course generality, and assessed its structural validity through the successful prediction of student dropout.

    In the social domain, a critical concern was to ensure that any analyses conducted were valid. Accordingly, we assessed how the diversity of social tie definitions may undermine the validity of subsequent modeling practices. We then modeled students’ social integration using graph embedding techniques, and found that student embeddings are predictive not only of their final grades but also of their persistence at their educational institution. In keeping with Tinto’s model, our research has treated academic and social interactions separately, but both avenues of investigation have led to the question of student disengagement and dropout, and how this may be represented and remedied through the provision of actionable insight.
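
    As a rough illustration of the sticky-HMM idea mentioned above (a sketch, not the thesis implementation), the following Python snippet fits a categorical hidden Markov model to toy per-student activity sequences with the hmmlearn library, approximating “stickiness” by placing extra Dirichlet prior mass on the self-transitions of the transition matrix. The state count, event codes, and data are all hypothetical.

    import numpy as np
    from hmmlearn import hmm  # pip install hmmlearn (CategoricalHMM in recent releases)

    n_states = 3     # hypothetical latent engagement modes
    n_symbols = 4    # hypothetical event types: 0=view, 1=post, 2=submit, 3=idle
    kappa = 5.0      # extra prior mass on self-transitions ("stickiness")

    # Toy weekly event sequences for two students, concatenated as hmmlearn expects.
    student_a = np.array([0, 0, 1, 2, 2, 3, 3, 3])
    student_b = np.array([2, 2, 2, 0, 1, 1])
    X = np.concatenate([student_a, student_b]).reshape(-1, 1)
    lengths = [len(student_a), len(student_b)]

    # A plain categorical HMM; the sticky self-transition bias is emulated by a
    # Dirichlet prior favouring the diagonal of the transition matrix.
    model = hmm.CategoricalHMM(
        n_components=n_states,
        n_features=n_symbols,
        transmat_prior=1.0 + kappa * np.eye(n_states),
        n_iter=100,
        random_state=0,
    )
    model.fit(X, lengths)

    # Decode one student's most likely engagement-state trajectory.
    print(model.predict(student_a.reshape(-1, 1)))

    A genuine sticky HMM (or sticky HDP-HMM) builds the self-transition bias into the model itself; the prior-mass trick above is only a crude, illustrative stand-in.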

    Understanding privacy and data protection issues in learning analytics using a systematic review

    The field of learning analytics has advanced from its infancy into a more practical domain, where tangible solutions are being implemented. Nevertheless, the field has encountered numerous privacy and data protection issues that have garnered significant and growing attention. In this systematic review, four databases were searched for work on privacy and data protection issues in learning analytics. A final corpus of 47 papers published in top educational technology journals was selected after an eligibility check. The corpus was analysed to answer three research questions: (1) What are the privacy and data protection issues in learning analytics? (2) What are the similarities and differences between the views of stakeholders from different backgrounds on privacy and data protection issues in learning analytics? (3) How have previous approaches attempted to address privacy and data protection issues? The results show that there are eight distinct, intertwined privacy and data protection issues that cut across the learning analytics cycle. There are both cross-regional similarities and three sets of differences in stakeholder perceptions of privacy and data protection in learning analytics. With regard to previous attempts to address these issues, there is a notable dearth of applied evidence, which impedes the assessment of their effectiveness. Our findings suggest that privacy and data protection issues should not be relaxed at any point in the implementation of learning analytics, as these issues persist throughout the learning analytics development cycle. One key implication of this review is that solutions to privacy and data protection issues in learning analytics should be more evidence-based, thereby increasing the trustworthiness and usefulness of learning analytics.