
    The Complexities of Developing a Personal Code of Ethics for Learning Analytics Practitioners: Implications for Institutions and the Field

    In this paper we explore the potential role, value and utility of a personal code of ethics (COE) for learning analytics practitioners, and in particular we consider whether such a COE might usefully mediate individual actions and choices in relation to a more abstract institutional COE. While several institutional COEs now exist, little attention has been paid to detailing the ethical responsibilities of individual practitioners. To investigate the problems associated with developing and implementing a personal COE, we drafted an LA Practitioner COE based on other professional codes and invited feedback from a range of learning analytics stakeholders and practitioners: ethicists, students, researchers and technology executives. Three main themes emerged from their reflections: (1) the need to balance real-world demands with abstract principles, (2) the limits of individual accountability within the learning analytics space, and (3) the continuing value of debate around an aspirational code of ethics within the field of learning analytics.

    Mixing and Matching Learning Design and Learning Analytics

    In the last five years, learning analytics has proved its potential for predicting academic performance from trace data of learning activities. However, the role of pedagogical context in learning analytics has not been fully understood, and to date it has been difficult to quantify learning in a way that can be measured and compared. By coding the design of e-learning courses, this study demonstrates how learning design is being implemented on a large scale at the Open University UK, and how learning analytics can both support and benefit from learning design. Building on our previous work, we conducted a longitudinal analysis of 23 undergraduate distance learning modules and their 40,083 students. The innovative aspect of this study is the availability of fine-grained learning design data at the level of individual tasks, which allows us to consider the connections between learning activities and the media used to produce them. Using a combination of visualizations and social network analysis, our findings reveal diversity in how learning activities were designed, both within and between disciplines and across individual learning activities. By reflecting on learning design in this explicit manner, educators are empowered to compare and contrast their designs using their own institutional data.
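    The abstract above does not detail its analysis pipeline; purely as a hypothetical illustration, the sketch below shows how coded learning design data (activity types recorded per module) could be turned into a co-occurrence network and inspected with basic social network metrics. The module names, activity-type labels and counts are invented for the example and are not taken from the study.

# Hypothetical sketch (not the study's actual pipeline): build a co-occurrence
# network of coded learning-activity types and see which types sit most
# centrally across the designs.
from itertools import combinations
import networkx as nx

# Invented coding: each module mapped to the activity types it contains.
module_designs = {
    "module_A": ["assimilative", "productive", "assessment"],
    "module_B": ["assimilative", "communicative", "assessment"],
    "module_C": ["productive", "experiential", "assessment"],
}

G = nx.Graph()
for activities in module_designs.values():
    # Connect every pair of activity types that co-occur in the same module;
    # edge weights count how many modules share that pairing.
    for a, b in combinations(sorted(set(activities)), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Degree centrality gives a rough sense of which activity types most often
# appear alongside others across the coded designs.
for node, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")

    In a network of this kind, heavily weighted edges point at activity-type pairings that recur across modules, which is one simple way to compare designs within and between disciplines.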

    Beyond Failure: The 2nd LAK Failathon Poster

    This poster will give a wider LAK audience a chance to engage with the 2nd LAK Failathon workshop. Both will build on the successful Failathon event in 2016 and extend beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Providing an alternative forum in which practitioners and researchers can learn from each other's failures can therefore be very productive. The first LAK Failathon, held in 2016, offered just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment and to learn from each other's mistakes. It was very successful, and there was strong support for running it as an annual event. The 2nd LAK Failathon workshop will build on that success, with the twin objectives of providing an environment in which individuals can learn from each other's failures and of co-developing plans for how we as a field can better build and deploy our evidence base. This poster is an opportunity for wider feedback on the plans developed in the workshop, with interactive use of sticky notes to add new ideas and coloured dots to indicate prioritisation. Broadening the participant base in this important work should improve both the quality of the plans and the community's commitment to delivering them.

    Workshop on methodology in learning analytics (MLA)

    Learning analytics is an interdisciplinary and inclusive field, which makes the establishment of methodological norms both challenging and important. This community-building workshop aims to convene methodology-focused researchers to discuss new and established approaches, comment on the state of current practice, author pedagogical manuscripts, and co-develop guidelines to help move the field forward with quality and rigor.

    Current and future multimodal learning analytics data challenges

    Multimodal Learning Analytics (MMLA) captures, integrates and analyzes learning traces from different sources in order to obtain a more holistic understanding of the learning process, wherever it happens. MMLA leverages the increasingly widespread availability of diverse sensors, high-frequency data collection technologies, and sophisticated machine learning and artificial intelligence techniques. The aim of this workshop is twofold: first, to expose participants to, and help develop, multimodal datasets that show how MMLA can bring new insights and opportunities for investigating complex learning processes and environments; second, to collaboratively identify a set of grand challenges for further MMLA research, building on the foundations of previous workshops on the topic.
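    As a rough, hypothetical illustration of the kind of integration step MMLA involves, the sketch below time-aligns two invented trace streams (clickstream events and eye-tracking samples) so that each learning event carries the nearest sensor reading. The column names, timestamps and 500 ms tolerance are assumptions made for the example, not part of the workshop description.

# Hypothetical sketch: time-aligning two invented multimodal streams
# (clickstream events and eye-tracking samples) before joint analysis.
import pandas as pd

clicks = pd.DataFrame({
    "timestamp": pd.to_datetime(["2017-03-01 10:00:00.2",
                                 "2017-03-01 10:00:02.7",
                                 "2017-03-01 10:00:05.1"]),
    "event": ["open_task", "submit_answer", "open_hint"],
})

gaze = pd.DataFrame({
    "timestamp": pd.to_datetime(["2017-03-01 10:00:00.0",
                                 "2017-03-01 10:00:02.5",
                                 "2017-03-01 10:00:05.0"]),
    "fixation_ms": [310, 95, 220],
})

# merge_asof pairs each click with the nearest gaze sample within 500 ms,
# giving one fused record per learning event for downstream modelling.
fused = pd.merge_asof(
    clicks.sort_values("timestamp"),
    gaze.sort_values("timestamp"),
    on="timestamp",
    tolerance=pd.Timedelta("500ms"),
    direction="nearest",
)
print(fused)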

    A LAK of Direction: Misalignment Between the Goals of Learning Analytics and its Research Scholarship

    Learning analytics defines itself by its focus on data from learners and learning environments, with the corresponding goals of understanding and optimizing student learning. Ideally, then, learning analytics research should be characterized by studies that make use of data from learners engaged in education systems, measure student learning, and make efforts to intervene in and improve these learning environments.