18,530 research outputs found
“A double-edged sword. This is powerful but it could be used destructively”: Perspectives of early career education researchers on learning analytics
Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a “scarce breed” (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we have provided a roadmap for how the learning analytics field might evolve and practical implications for supporting early career researchers’ development.
Analytics and complexity: learning and leading for the future
There is growing interest in the application of learning analytics to manage, inform and improve learning and teaching within higher education. In particular, learning analytics is seen as enabling data-driven decision making as universities seek to respond to a range of significant challenges that are reshaping the higher education landscape. Experience over four years with a project exploring the use of learning analytics to improve learning and teaching at a particular university has, however, revealed a much more complex reality that potentially limits the value of some analytics-based strategies. This paper uses this experience with over 80,000 students across three learning management systems, combined with literature from complex adaptive systems and learning analytics, to identify the source and nature of these limitations along with a suggested path forward.
Student Privacy in Learning Analytics: An Information Ethics Perspective
In recent years, educational institutions have started using the tools of commercial data analytics in higher education. By gathering information about students as they navigate campus information systems, learning analytics “uses analytic techniques to help target instructional, curricular, and support resources” to examine student learning behaviors and change students’ learning environments. As a result, the information educators and educational institutions have at their disposal is no longer demarcated by course content and assessments, and old boundaries between information used for assessment and information about how students live and work are blurring. Our goal in this paper is to provide a systematic discussion of the ways in which privacy and learning analytics conflict and to provide a framework for understanding those conflicts.
We argue that there are five crucial issues about student privacy that we must address in order to ensure that whatever the laudable goals and gains of learning analytics, they are commensurate with respecting students’ privacy and associated rights, including (but not limited to) autonomy interests. First, we argue that we must distinguish among different entities with respect to whom students have, or lack, privacy. Second, we argue that we need clear criteria for what information may justifiably be collected in the name of learning analytics. Third, we need to address whether purported consequences of learning analytics (e.g., better learning outcomes) are justified and what the distributions of those consequences are. Fourth, we argue that regardless of how robust the benefits of learning analytics turn out to be, students have important autonomy interests in how information about them is collected. Finally, we argue that it is an open question whether the goods that justify higher education are advanced by learning analytics, or whether collection of information actually runs counter to those goods.
The Complexities of Developing a Personal Code of Ethics for Learning Analytics Practitioners: Implications for Institutions and the Field
In this paper we explore the potential role, value and utility of a personal code of ethics (COE) for learning analytics practitioners, and in particular we consider whether such a COE might usefully mediate individual actions and choices in relation to a more abstract institutional COE. While several institutional COEs now exist, little attention has been paid to detailing the ethical responsibilities of individual practitioners. To investigate the problems associated with developing and implementing a personal COE, we drafted an LA Practitioner COE based on other professional codes, and invited feedback from a range of learning analytics stakeholders and practitioners: ethicists, students, researchers and technology executives. Three main themes emerged from their reflections: 1. A need to balance real world demands with abstract principles, 2. The limits to individual accountability within the learning analytics space, and 3. The continuing value of debate around an aspirational code of ethics within the field of learning analytics.
Student perspectives on the use of their data: between intrusion, surveillance and care
The Open University (OU) is a large, open distance learning institution with more than 200,000 students. In common with many other higher education institutions (HEIs), the University is looking more closely at its use of learning analytics. Learning analytics has been defined as the collection and analysis of data generated during the learning process in order to improve the quality of learning and teaching (Siemens, Dawson, & Lynch, 2013). In the context of the Open University, learning analytics is the use of raw and analysed student data to, inter alia, proactively identify interventions which aim to support students in completing their study goals. Such interventions may be designed to support students as individuals as well as at a cohort level.
The use of a learning analytics approach to inform and provide direction to student support within the Open University is relatively new and, as such, existing policies relating and referring to potential uses of student data have required fresh scrutiny to ensure their continued relevance and completeness (Prinsloo & Slade, 2013). In response, The Open University made the decision to address a range of ethical issues relating to the University’s approach to learning analytics via the implementation of new policy. In order to formulate a clear policy which reflected the University’s mission and key principles, it was considered essential to consult with a wide range of stakeholders, including students.
Situating multimodal learning analytics
The digital age has introduced a host of new challenges and opportunities for the learning sciences community. These challenges and opportunities are particularly abundant in multimodal learning analytics (MMLA), a research methodology that aims to extend work from Educational Data Mining (EDM) and Learning Analytics (LA) to multimodal learning environments by treating multimodal data. Recognizing the short-term opportunities and long-term challenges will help develop proof cases and identify grand challenges that will help propel the field forward. To support the field's growth, we use this paper to describe several ways that MMLA can potentially advance learning sciences research and touch upon key challenges that researchers who utilize MMLA have encountered over the past few years.
<i>“We’re Seeking Relevance”</i>: Qualitative Perspectives on the Impact of Learning Analytics on Teaching and Learning
Whilst a significant body of learning analytics research tends to focus on impact from the perspective of usability or improved learning outcomes, this paper proposes an approach based on Affordance Theory to describe awareness and intention as a bridge between usability and impact. Ten educators at three European institutions participated in detailed interviews on the affordances they perceive in using learning analytics to support practice in education. Evidence illuminates connections between an educator’s epistemic beliefs about learning and the purpose of education, their perception of threats or resources in delivering a successful learning experience, and the types of data they would consider as evidence in recognising or regulating learning. This evidence can support the learning analytics community in considering the proximity to the student, the role of the educator, and their personal belief structure in developing robust analytics tools that educators may be more likely to use.
Open University Learning Analytics dataset
Learning Analytics focuses on the collection and analysis of learners’ data to improve their learning experience by providing informed guidance and to optimise learning materials. To support the research in this area we have developed a dataset, containing data from courses presented at the Open University (OU). What makes the dataset unique is the fact that it contains demographic data together with aggregated clickstream data of students’ interactions in the Virtual Learning Environment (VLE). This enables the analysis of student behaviour, represented by their actions. The dataset contains the information about 22 courses, 32,593 students, their assessment results, and logs of their interactions with the VLE represented by daily summaries of student clicks (10,655,280 entries). The dataset is freely available at https://analyse.kmi.open.ac.uk/open_dataset under a CC-BY 4.0 license.
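The abstract describes linking demographic records to daily VLE click summaries per student. As a minimal sketch of that kind of analysis, the pandas snippet below joins the two table shapes described (student demographics and per-day click counts); the column names (`id_student`, `sum_click`, `final_result`) follow the published dataset documentation, but the toy rows are purely illustrative, not real OULAD data.

```python
import pandas as pd

# In practice these tables would be loaded from the downloaded dataset, e.g.:
#   students = pd.read_csv("studentInfo.csv")
#   clicks = pd.read_csv("studentVle.csv")
# Here we use two illustrative rows per table instead (hypothetical values).
students = pd.DataFrame({
    "id_student": [11391, 28400],
    "final_result": ["Pass", "Withdrawn"],  # outcome recorded per student
})
clicks = pd.DataFrame({
    "id_student": [11391, 11391, 28400],
    "sum_click": [4, 2, 7],  # daily click summaries in the VLE
})

# Aggregate the daily click summaries into total engagement per student
total = clicks.groupby("id_student")["sum_click"].sum().rename("total_clicks")

# Join demographics/outcomes with engagement for a simple behavioural view
merged = students.merge(total, on="id_student", how="left")
print(merged)
```

This is the basic join the dataset is designed to support: behaviour (clickstream aggregates) analysed alongside demographic and outcome variables.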