    Analytics and complexity: learning and leading for the future

    There is growing interest in the application of learning analytics to manage, inform and improve learning and teaching within higher education. In particular, learning analytics is seen as enabling data-driven decision making as universities seek to respond to a range of significant challenges that are reshaping the higher education landscape. Experience over four years with a project exploring the use of learning analytics to improve learning and teaching at a particular university has, however, revealed a much more complex reality that potentially limits the value of some analytics-based strategies. This paper draws on this experience with over 80,000 students across three learning management systems, combined with literature from complex adaptive systems and learning analytics, to identify the source and nature of these limitations along with a suggested path forward.

    ‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics

    Learning analytics has increasingly been described as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require a new range of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a ‘scarce breed’ (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel about learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve, along with practical implications for supporting early career researchers’ development.

    <i>“We’re Seeking Relevance”</i>: Qualitative Perspectives on the Impact of Learning Analytics on Teaching and Learning

    Whilst a significant body of learning analytics research tends to focus on impact from the perspective of usability or improved learning outcomes, this paper proposes an approach based on Affordance Theory that treats awareness and intention as a bridge between usability and impact. Ten educators at three European institutions participated in detailed interviews on the affordances they perceive in using learning analytics to support educational practice. The evidence illuminates connections between an educator’s epistemic beliefs about learning and the purpose of education, their perception of threats or resources in delivering a successful learning experience, and the types of data they would consider as evidence in recognising or regulating learning. This evidence can support the learning analytics community in considering the proximity to the student, the role of the educator, and their personal belief structure when developing robust analytics tools that educators may be more likely to use.

    What learning analytics based prediction models tell us about feedback preferences of students

    Learning analytics (LA) seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators (Siemens & Long, 2011). This study examined students’ use of preferred feedback modes with a dispositional learning analytics framework, combining learning disposition data with data extracted from digital systems. We analyzed the feedback use of 1062 students taking an introductory mathematics and statistics course enhanced with digital tools. Our findings indicated that, compared with hints, fully worked-out solutions had a stronger effect on academic performance and acted as a better mediator between learning dispositions and academic performance. This study demonstrates how e-learners and their data can be effectively re-deployed to provide meaningful insights to both educators and learners.
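
    The mediation claim above can be made concrete with a small sketch. The snippet below is only an illustration under assumed column names (disposition, worked_examples_used, exam_score) and an assumed flat CSV export; it is not the authors' pipeline, just a product-of-coefficients check of how worked-out-solution use could mediate the disposition-performance link.

        # Illustrative mediation check (assumed data layout, not the study's code)
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("course_data.csv")  # hypothetical export of dispositions, tool logs, grades

        # Path a: learning disposition -> use of fully worked-out solutions
        a = smf.ols("worked_examples_used ~ disposition", data=df).fit().params["disposition"]

        # Path b: worked-out solution use -> exam score, controlling for disposition
        b = smf.ols("exam_score ~ worked_examples_used + disposition",
                    data=df).fit().params["worked_examples_used"]

        print("Indirect (mediated) effect a*b:", a * b)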

    What Types of Predictive Analytics are Being Used in Talent Management Organizations?

    [Excerpt] Talent management organizations are increasingly deriving insights from data to make better decisions. Their use of data analytics is advancing from descriptive to predictive and prescriptive analytics. Descriptive analytics is the most basic form, providing the hindsight view of what happened and laying the foundation for turning data into information. More advanced uses are predictive (advanced forecasts and the ability to model future results) and prescriptive (“the top-tier of analytics that leverage machine learning techniques … to both interpret data and recommend actions”) analytics (1). Appendix A illustrates these differences. This report summarizes our most relevant findings about how both academic researchers and HR practitioners are successfully using data analytics to inform decision-making on workforce issues, with a focus on executive assessment and selection.
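
    As a rough illustration of the three tiers the excerpt names, the sketch below applies them to a hypothetical attrition table; the column names (tenure_years, engagement_score, left_company) and the 0.7 risk threshold are invented, and none of this is taken from the report itself.

        # Descriptive vs. predictive vs. prescriptive analytics on assumed HR data
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        hr = pd.read_csv("hr_data.csv")  # hypothetical workforce dataset

        # Descriptive: hindsight view of what happened
        print(hr.groupby("left_company")["engagement_score"].mean())

        # Predictive: model the likelihood of future attrition
        features = hr[["tenure_years", "engagement_score"]]
        model = LogisticRegression().fit(features, hr["left_company"])
        risk = model.predict_proba(features)[:, 1]

        # Prescriptive: turn predictions into a recommended action
        hr.loc[risk > 0.7, "recommended_action"] = "schedule retention conversation"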

    Stability and sensitivity of Learning Analytics based prediction models

    Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMS) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2049 students followed a design based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments and other system-generated data in modelling student performance and their potential to generate informative feedback. Using a dynamic, longitudinal perspective, computer-assisted formative assessments seem to be the best predictor for detecting underperforming students and predicting academic performance, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity-related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
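
    To make the reported comparison concrete, the sketch below contrasts the predictive value of formative-assessment scores against basic LMS click counts for flagging underperforming students. It is an illustration under assumed column names (quiz_score_week3, lms_logins_week3, failed_exam) and an assumed merged CSV export, not the study's actual models.

        # Comparing two simple early-warning predictors (assumed data layout)
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        students = pd.read_csv("module_tracking.csv")  # hypothetical merged export
        y = students["failed_exam"]

        for label, cols in [("formative assessments", ["quiz_score_week3"]),
                            ("basic LMS track data", ["lms_logins_week3"])]:
            auc = cross_val_score(LogisticRegression(), students[cols], y,
                                  cv=5, scoring="roc_auc").mean()
            print(f"{label}: mean AUC = {auc:.2f}")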