
    A Bibliometric Study on Learning Analytics

    Learning analytics tools and techniques are continually developed and published in scholarly discourse. This study examines the intellectual structure of the Learning Analytics domain by collecting and analyzing empirical articles on Learning Analytics for the period 2004-2018. First, bibliometric and citation analyses of 2730 documents from Scopus identified the top authors, key research affiliations, leading publication sources (journals and conferences), and research themes of the learning analytics domain. Second, Domain Analysis (DA) techniques were used to investigate the intellectual structures of learning analytics research, publication, organization, and communication (Hjørland & Bourdieu 2014). The VOSviewer software was used to analyze relationships among publications, authors, and institutions, and the dissemination of Learning Analytics knowledge. The results show that Learning Analytics has captured the attention of the global community. The United States, Spain, and the United Kingdom are among the leading countries contributing to the dissemination of learning analytics knowledge, and the leading publication sources are the ACM International Conference Proceeding Series and Lecture Notes in Computer Science. This study presents the intellectual structures of the learning analytics domain; the resulting LA research taxonomy can be reused by teachers, administrators, and other stakeholders to support teaching and learning environments in higher education institutions.
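    The counting that underlies a bibliometric analysis of this kind can be sketched as follows. This is an illustrative sketch only, not the study's actual pipeline: the record fields and sample data are assumptions, and tools such as VOSviewer compute co-authorship networks from full Scopus exports rather than hand-built records.

    ```python
    from collections import Counter
    from itertools import combinations

    # Hypothetical toy records standing in for a Scopus export;
    # field names and values are illustrative assumptions.
    records = [
        {"authors": ["A", "B"], "country": "US"},
        {"authors": ["A", "C"], "country": "ES"},
        {"authors": ["B", "C"], "country": "US"},
    ]

    # Top authors: how many documents each author appears on.
    author_counts = Counter(a for r in records for a in r["authors"])

    # Leading countries: documents per affiliation country.
    country_counts = Counter(r["country"] for r in records)

    # Co-authorship network edges: every pair of authors sharing a paper,
    # weighted by how often the pair co-occurs (the basis of the map a
    # tool like VOSviewer would visualize).
    edges = Counter()
    for r in records:
        for pair in combinations(sorted(r["authors"]), 2):
            edges[pair] += 1
    ```

    On a real dataset, the same counts drive the ranked author/country tables, and the weighted edge list becomes the node-link map of the field's intellectual structure.
    
    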

    Epistemology, pedagogy, assessment and learning analytics

    There is a well-established literature examining the relationships between epistemology (the nature of knowledge), pedagogy (the nature of learning and teaching), and assessment. Learning Analytics (LA) is a new assessment technology and should engage with this literature, since it has implications for when and why different LA tools might be deployed. This paper discusses these issues in relation to an example construct, epistemic beliefs (beliefs about the nature of knowledge), which analytics grounded in pragmatic, sociocultural theory might be well placed to explore. This example is particularly interesting given the role of epistemic beliefs in the everyday knowledge judgements students make in their information processing. Traditional psychological approaches to measuring epistemic beliefs have parallels with high-stakes testing regimes; this paper outlines an alternative LA for epistemic beliefs that might be readily applied to other areas of interest. Such sociocultural approaches afford opportunities for engaging LA directly in high-quality pedagogy.

    Motivation Classification and Grade Prediction for MOOCs Learners

    While MOOCs offer educational data on a new scale, many educators see great potential in this big data, which includes detailed activity records of every learner. Learner behavior, such as whether a learner will drop out of a course, can be predicted. Providing an effective, economical, and scalable method to detect cheating on tests, such as the use of surrogate exam-takers, is a challenging problem. In this paper, we present a grade-prediction method that uses student activity features to predict whether a learner is likely to earn a certification if he/she takes a test. The method consists of a two-step classification: motivation classification (MC) followed by grade classification (GC). The MC step divides all learners into three groups: certification earning, video watching, and course sampling. The GC step then predicts whether a certification-earning learner will obtain a certification. Our experiment shows that the proposed method can fit the classification model at a fine scale and that it is possible to identify surrogate exam-takers.
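    The two-step MC/GC pipeline described above can be sketched in outline. Everything here is a hedged illustration: the feature names, thresholds, and rule-based classifiers are invented placeholders for the trained models the paper would actually use; only the pipeline shape (MC first, GC only for the certification-earning group) follows the abstract.

    ```python
    def motivation_class(learner):
        """Step 1 (MC): assign a learner to one of three motivation groups.
        The activity features and cutoffs are illustrative assumptions."""
        if learner["problems_attempted"] >= 20 and learner["videos_watched"] >= 10:
            return "certification_earning"
        if learner["videos_watched"] >= 10:
            return "video_watching"
        return "course_sampling"

    def grade_class(learner):
        """Step 2 (GC): predict pass/fail for certification-earning learners.
        A stand-in linear rule replaces the paper's trained classifier."""
        score = 0.6 * learner["avg_problem_grade"] + 0.4 * (learner["forum_posts"] > 0)
        return "certified" if score >= 0.5 else "not_certified"

    def predict(learner):
        """Full pipeline: GC runs only for the certification-earning group."""
        mc = motivation_class(learner)
        if mc != "certification_earning":
            return (mc, None)
        return (mc, grade_class(learner))
    ```

    Gating GC on the MC output mirrors the design choice in the abstract: grade prediction is only meaningful for learners whose activity profile suggests they are pursuing a certificate at all.
    
    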

    Evaluating emotion visualizations using AffectVis, an affect-aware dashboard for students

    Purpose - The purpose of this paper is to evaluate four visualizations that represent affective states of students. Design/methodology/approach - An empirical-experimental study approach was used to assess the usability of affective state visualizations in a learning context. The first study was conducted with students who had knowledge of visualization techniques (n=10). The insights from this pilot study were used to improve the interpretability and ease of use of the visualizations. The second study was conducted with the improved visualizations with students who had no or limited knowledge of visualization techniques (n=105). Findings - The results indicate that usability, measured by perceived usefulness and insight, is overall acceptable. However, the findings also suggest that the interpretability of some visualizations, in terms of their capability to support emotional awareness, still needs to be improved. The level of students' awareness of their emotions during learning activities based on the visualization interpretation varied depending on previous knowledge of information visualization techniques. Awareness was high for the most frequently experienced emotions and for the most frustrating activities, but lower for more complex insights such as interpreting differences with peers. Furthermore, simpler visualizations resulted in better outcomes than more complex techniques. Originality/value - Detection of affective states of students and visualization of these states in computer-based learning environments have been proposed to support student awareness and improve learning. However, evaluating visualizations of these affective states with students to support awareness in real-life settings remains an open issue.

    Data analytics 2016: proceedings of the fifth international conference on data analytics


    Learning analytics: challenges and limitations

    Learning analytics implementations are increasingly being included in learning management systems in higher education. We lay out some concerns with the way learning analytics - both data and algorithms - are often presented within an unproblematized Big Data discourse. We describe some potential problems with the often implicit assumptions about learning and learners - and indeed the tendency not to theorize learning explicitly - that underpin such implementations. Finally, we describe an attempt to devise our own analytics, grounded in a sociomaterial conception of learning. We use the data obtained to suggest that the relationships between learning and the digital traces left by participants in online learning are far from trivial, and that any analytics that relies on these traces as proxies for learning tends towards a behaviourist evaluation of learning processes.