Analytics and complexity: learning and leading for the future
There is growing interest in the application of learning analytics to manage, inform and improve learning and teaching within higher education. In particular, learning analytics is seen as enabling data-driven decision making as universities seek to respond to a range of significant challenges that are reshaping the higher education landscape. Four years' experience with a project exploring the use of learning analytics to improve learning and teaching at one university has, however, revealed a much more complex reality that potentially limits the value of some analytics-based strategies. This paper draws on this experience with over 80,000 students across three learning management systems, combined with literature from complex adaptive systems and learning analytics, to identify the source and nature of these limitations, along with a suggested path forward.
‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics
Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a ‘scarce breed’ (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve, and practical implications for supporting early career researchers’ development.
Student perspectives on the use of their data: between intrusion, surveillance and care
The Open University (OU) is a large, open distance learning institution with more than 200,000 students. In common with many other higher education institutions (HEIs), the University is looking more closely at its use of learning analytics. Learning analytics has been defined as the collection and analysis of data generated during the learning process in order to improve the quality of learning and teaching (Siemens, Dawson, & Lynch, 2013). In the context of the Open University, learning analytics is the use of raw and analysed student data to, inter alia, proactively identify interventions which aim to support students in completing their study goals. Such interventions may be designed to support students as individuals as well as at a cohort level.
The use of a learning analytics approach to inform and provide direction to student support within the Open University is relatively new and, as such, existing policies relating to potential uses of student data have required fresh scrutiny to ensure their continued relevance and completeness (Prinsloo & Slade, 2013). In response, the Open University made the decision to address a range of ethical issues relating to the University’s approach to learning analytics via the implementation of new policy. In order to formulate a clear policy which reflected the University’s mission and key principles, it was considered essential to consult with a wide range of stakeholders, including students.
Learning at Scale: Using an Evidence Hub To Make Sense of What We Know
The large datasets produced by learning at scale, and the need for ways of dealing with high learner/educator ratios, mean that MOOCs and related environments are frequently used for the deployment and development of learning analytics. Despite the current proliferation of analytics, there is as yet relatively little hard evidence of their effectiveness. The Evidence Hub developed by the Learning Analytics Community Exchange (LACE) provides a way of collating and filtering the available evidence in order to support the use of analytics and to target future studies to fill the gaps in our knowledge.
“We’re Seeking Relevance”: Qualitative Perspectives on the Impact of Learning Analytics on Teaching and Learning
Whilst a significant body of learning analytics research tends to focus on impact from the perspective of usability or improved learning outcomes, this paper proposes an approach based on Affordance Theory to describe awareness and intention as a bridge between usability and impact. Ten educators at three European institutions participated in detailed interviews on the affordances they perceive in using learning analytics to support practice in education. Evidence illuminates connections between an educator’s epistemic beliefs about learning and the purpose of education, their perception of threats or resources in delivering a successful learning experience, and the types of data they would consider as evidence in recognising or regulating learning. This evidence can support the learning analytics community in considering the proximity to the student, the role of the educator, and their personal belief structure in developing robust analytics tools that educators may be more likely to use.
What learning analytics based prediction models tell us about feedback preferences of students
Learning analytics (LA) seeks to enhance learning processes through systematic measurements of learning related data and to provide informative feedback to learners and educators (Siemens & Long, 2011). This study examined students’ use of preferred feedback modes by applying a dispositional learning analytics framework, combining learning disposition data with data extracted from digital systems. We analyzed the use of feedback by 1062 students taking an introductory mathematics and statistics course, enhanced with digital tools. Our findings indicated that, compared with hints, fully worked-out solutions demonstrated a stronger effect on academic performance and acted as a better mediator between learning dispositions and academic performance. This study demonstrated how e-learners and their data can be effectively re-deployed to provide meaningful insights to both educators and learners.
What Types of Predictive Analytics are Being Used in Talent Management Organizations?
[Excerpt] Talent management organizations are increasingly deriving insights from data to make better decisions. Their use of data analytics is advancing from descriptive to predictive and prescriptive analytics. Descriptive analytics is the most basic form, providing the hindsight view of what happened and laying the foundation for turning data into information. More advanced uses are predictive (advanced forecasts and the ability to model future results) and prescriptive (“the top-tier of analytics that leverage machine learning techniques … to both interpret data and recommend actions”) analytics (1). Appendix A illustrates these differences. This report summarizes our most relevant findings about how both academic researchers and HR practitioners are successfully using data analytics to inform decision-making in workforce issues, with a focus on executive assessment and selection.
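The descriptive/predictive/prescriptive distinction above can be made concrete with a minimal sketch. All numbers and names below are invented for illustration and are not drawn from the report: descriptive analytics summarizes what happened, predictive analytics extrapolates a trend, and prescriptive analytics maps the forecast to a recommended action.

```python
# Hypothetical monthly attrition counts (invented data, for illustration only).
attrition = [12, 14, 13, 17, 19, 21]  # employees lost per month

# Descriptive: hindsight view of what happened.
average = sum(attrition) / len(attrition)

# Predictive: fit a simple linear trend and forecast the next month.
n = len(attrition)
xs = range(n)
x_mean = sum(xs) / n
y_mean = average
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, attrition))
         / sum((x - x_mean) ** 2 for x in xs))
forecast = y_mean + slope * (n - x_mean)  # trend value at the next time step

# Prescriptive: turn the forecast into a recommended action.
action = "launch retention programme" if forecast > average else "monitor only"

print(f"mean={average:.1f}, forecast={forecast:.1f}, action={action}")
```

Real talent-analytics pipelines would of course use richer models (the report mentions machine learning for the prescriptive tier), but the progression from summary to forecast to recommendation is the same.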
Stability and sensitivity of Learning Analytics based prediction models
Learning analytics seeks to enhance learning processes through systematic measurements of learning related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMS) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2049 students were enrolled in a module based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments and other system-generated data in modelling student performance and their potential to generate informative feedback. Using a dynamic, longitudinal perspective, computer-assisted formative assessments seem to be the best predictor both of academic performance and for detecting underperforming students, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
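The kind of prediction model described in these abstracts can be sketched in miniature. The example below is a hypothetical single-feature logistic regression that flags at-risk students from formative quiz averages; the data, threshold, and function names are all invented for this sketch and do not come from the studies above, which use far richer dispositional and LMS data.

```python
import math

# Invented toy data: average formative quiz score per student and whether
# the student ultimately passed the module (1 = passed).
quiz_avg = [0.2, 0.35, 0.5, 0.55, 0.7, 0.8, 0.9]
passed   = [0,   0,    0,   1,    1,   1,   1]

# Fit a one-feature logistic regression with plain stochastic gradient ascent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(5000):
    for x, y in zip(quiz_avg, passed):
        p = 1 / (1 + math.exp(-(w * x + b)))  # predicted pass probability
        w += lr * (y - p) * x                 # gradient step on the weight
        b += lr * (y - p)                     # gradient step on the bias

def pass_probability(score):
    """Predicted probability of passing for a given formative quiz average."""
    return 1 / (1 + math.exp(-(w * score + b)))

# Early-warning rule: flag students whose predicted pass probability is low.
at_risk = [s for s in quiz_avg if pass_probability(s) < 0.5]
```

In a real dispositional learning analytics setting, the feature vector would combine disposition survey data, formative assessment outcomes, and e-tutorial track data, and the flagged list would feed the kind of timely feedback interventions the abstract describes.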