
    Quality Indicators for Learning Analytics

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means of capturing evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on the results of a Group Concept Mapping study conducted with experts from the field of learning analytics. The outcomes of this study are further extended with findings from a focused literature review.

    Learn Smarter, Not Harder – Exploring the Development of Learning Analytics Use Cases to Create Tailor-Made Online Learning Experiences

    Our world is significantly shaped by digitalization, which fosters new opportunities for technology-mediated learning. As a result, massive amounts of knowledge become available online. However, these formats concurrently entail less interaction with and guidance from lecturers. Thus, learners need to be supported by intelligent learning tools that provide suitable knowledge in a tailored way. In this context, the use of learning analytics in its multifaceted forms is essential. Existing literature shows a proliferation of learning analytics use cases without a systematic structure. Based on a structured literature review of 42 papers, we organized the existing literature contributions systematically and derived four use cases: learning dashboards, individualized content, tutoring systems, and adaptable learning processes based on personality. Our use cases will serve as a basis for a targeted scientific discourse and provide valuable orientation for the development of future learning analytics use cases, giving rise to the new form of Learning Experience Platforms.

    Citizen Science: The Ring to Rule Them All?

    There are many uncertainties about the future of e-Learning, but one thing is certain: e-Learning will be more data-driven in the future. The automation of data capturing, analysis and presentation, together with economic constraints that require evidence-based proof of impact, compels this data focus. On the other hand, the importance of community involvement in learning analytics and educational data mining is an accepted fact. Citizen science, at the nexus of community engagement and data science, can bridge the divide between data-driven and community-driven approaches to policy and content development. The rationale for this paper is the investigation of citizen science as an approach to collecting data for learning analytics in the field of e-Learning. Capturing data for policy and content development for learning analytics through citizen science projects is novel in the e-Learning field. Like any other new area, citizen science needs to be mapped in terms of the existing parent fields of data science and education so that differences and potential overlaps can be made explicit. This is important when considering conceptual or functional definitions, research tools and methodologies. A preliminary review of the literature has not provided any conceptual positioning of citizen science in relation to the research topics of learning analytics, data science, big data and visualisation in the e-Learning environment. The intent of this paper is firstly to present an overview of citizen science and the related research topics in the academic and practitioner literature based on a systematic literature review. Secondly, we propose a model that represents the relationship between citizen science and other salient concepts and shows how citizen science projects can be positioned in the e-Learning environment. Finally, we suggest research opportunities involving citizen science projects in the field of e-Learning.

    The Use of Learning Analytics Interactive Dashboards in Serious Games: A Review of the Literature

    Learning analytics in serious games is a subject of increasing demand in the educational field. In this context, there is a need to study how the data visualizations found in the literature are adopted for learning analytics in serious games. This paper presents a Systematic Literature Review (SLR) of how studies on the use of learning analytics interactive dashboards in serious games have evolved, investigating the characteristics of using dashboards for viewing educational data. A bibliometric analysis was carried out in which 75 relevant studies were selected from the Scopus, Web of Science, and IEEE Xplore databases. The data analysis shows that only a small number of studies in the current literature address all the main actors in the learning process: teachers/instructors, students/participants, game developers/designers, and managers/researchers. The vast majority of the investigated studies use data visualization algorithms whose focus is restricted to actors such as teachers/instructors and students/participants.
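    A minimal sketch, in Python, of the kind of per-learner aggregation such a serious-game dashboard might display; the event format, field names and metrics are invented here for illustration and do not come from the reviewed studies.

        from collections import defaultdict
        from statistics import mean

        # Hypothetical raw telemetry events emitted by a serious game.
        events = [
            {"student": "s1", "level": 1, "score": 70, "seconds": 120, "completed": True},
            {"student": "s1", "level": 2, "score": 55, "seconds": 240, "completed": False},
            {"student": "s2", "level": 1, "score": 90, "seconds": 95, "completed": True},
        ]

        def summarise(events):
            """Group events per student and compute simple dashboard metrics."""
            per_student = defaultdict(list)
            for event in events:
                per_student[event["student"]].append(event)
            summary = {}
            for student, evs in per_student.items():
                summary[student] = {
                    "levels_played": len(evs),
                    "mean_score": mean(e["score"] for e in evs),
                    "time_on_task_s": sum(e["seconds"] for e in evs),
                    "completion_rate": sum(e["completed"] for e in evs) / len(evs),
                }
            return summary

        print(summarise(events))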

    A framework for strategic planning of data analytics in the educational sector

    The field of big data and data analysis is not a new one. Big data systems have been investigated with respect to the volume of data and how it is stored, the velocity of data and how it changes, the variety of data to be analysed, and data veracity, referring to integrity and quality. Higher Education Institutions (HEIs) have a significant range of data sources across their operations and increasingly invest in collecting, analysing and reporting on their data in order to improve their efficiency. Data analytics and Business Intelligence (BI) are two terms that have become increasingly popular in the relevant literature over the past few years, with emphasis on their impact in the education sector. There is a significant volume of literature discussing the benefits of data analytics in higher education, and even more papers discussing specific case studies of institutions relying on BI by deploying various data analytics practices. Nevertheless, there is a lack of an integrated framework that supports HEIs in using learning analytics at both strategic and operational levels. This research study was driven by the need to offer a point of reference for universities wishing to make good use of the plethora of data they can access. Increasingly, institutions need to become ‘smart universities’ by supporting their decisions with findings from the analysis of their operations. The Business Intelligence strategies of many universities seem to focus mostly on identifying how to collect data but fail to address the most important issues: how to analyse the data, what to do with the findings, and how to create the means for scalable use of learning analytics at institutional level. The scope of this research is to investigate the different factors that affect the successful deployment of data analytics in educational contexts, focusing on both strategic and operational aspects of academia. The research study attempts to identify those elements necessary for introducing data analytics practices across an institution. The main contribution of the research is a framework that models data collection, analysis and visualisation in higher education. The specific contribution to the field comes in the form of generic guidelines for strategic planning of HEI data analytics projects, combined with specific guidelines for staff involved in the deployment of data analytics to support certain institutional operations. The research is based on a mixed-method approach that combines grounded theory in the form of an extensive literature review, state-of-the-art investigation and case study analysis, as well as a combination of qualitative and quantitative data collection. The study commences with an extensive literature review that identifies the key factors affecting the use of learning analytics. The research then collected more information from an analysis of a wide range of case studies showing how learning analytics are used across HEIs. The primary data collection concluded with a series of focus groups and interviews assessing the role of learning analytics in universities. Next, the research focused on a synthesis of guidelines for using learning analytics at both strategic and operational levels, leading to the production of generic and specific guidelines intended for different university stakeholders. The proposed framework was revised twice to create an integrated point of reference for HEIs that offers support across institutions in a scalable and applicable way and can accommodate the varying needs met at different HEIs. The proposed framework was evaluated by the same participants as the earlier focus groups and interviews, providing a qualitative approach to evaluating the contributions made during this research study. The research resulted in the creation of an integrated framework that offers HEIs a reference for setting up a learning analytics strategy, adapting institutional policies and revising operations across faculties and departments. The proposed C.A.V. framework consists of three phases: Collection, Analysis and Visualisation. The framework determines the key features of data sources and the resulting dashboards, as well as a list of functions for the data collection, analysis and visualisation stages. At strategic level, the C.A.V. framework enables institutions to assess their learning analytics maturity, determine the learning analytics stages they are involved in, identify the different learning analytics themes and use a checklist as a reference point for their learning analytics deployment. Finally, the framework ensures that institutional operations can become more effective by determining how learning analytics provide added value across different operations, while assessing the impact of learning analytics on stakeholders. The framework also supports the adoption of learning analytics processes, the planning of dashboard contents and the identification of factors affecting the implementation of learning analytics.
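    A minimal sketch of how the three C.A.V. phases described above could be chained; the stub data source, the at-risk rule and every function name here are hypothetical illustrations rather than the functions defined by the framework itself.

        def collect():
            """Collect phase: pull records from institutional data sources (stubbed here)."""
            return [
                {"student": "s1", "logins": 12, "submissions": 4, "avg_grade": 68},
                {"student": "s2", "logins": 2, "submissions": 1, "avg_grade": 41},
            ]

        def analyse(records, grade_threshold=50):
            """Analysis phase: derive simple indicators, e.g. flag students who may need support."""
            for record in records:
                record["at_risk"] = record["avg_grade"] < grade_threshold or record["submissions"] < 2
            return records

        def visualise(records):
            """Visualisation phase: render the indicators for a dashboard (plain text here)."""
            for record in records:
                flag = "AT RISK" if record["at_risk"] else "ok"
                print(f'{record["student"]:>4}  grade={record["avg_grade"]:>3}  {flag}')

        visualise(analyse(collect()))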

    Learning analytics for motivating self-regulated learning and fostering the improvement of digital MOOC resources

    Nowadays, the digital learning environment has revolutionized the vision of distance learning course delivery and drastically transformed the online educational system. The emergence of MOOCs (Massive Open Online Courses) has pushed the web technology used in education into a further revolution, ushering in a new generation of learning environments. The digital learning environment is expected to augment real-world conventional education settings. Educational pedagogy is tailored to standard practices that have been shown to increase student success in MOOCs and to provide a revolutionary way of self-regulated learning. However, there are still unresolved questions relating to the understanding of learning analytics data and how this could be implemented in educational contexts to support individual learning. One of the major issues in MOOCs is the consistently high dropout rate, with many courses over time recording completion rates below 20%. This paper explores learning analytics from different perspectives in a MOOC context. Firstly, we review existing literature relating to learning analytics in MOOCs, bringing together findings and analyses from several courses. We explore a meta-analysis of the basic factors that correlate with learning analytics and their significance in improving education. Secondly, using themes emerging from the previous study, we propose a preliminary model consisting of four factors of learning analytics. Finally, we provide a framework of learning analytics based on the following dimensions: descriptive, diagnostic, predictive and prescriptive, suggesting how the factors could be applied in a MOOC context. Our exploratory framework indicates the need for engaging learners and providing an understanding of how to support and help participants at risk of dropping out of the course.
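    A toy illustration of the four dimensions named above (descriptive, diagnostic, predictive, prescriptive) applied to an invented MOOC cohort; the data, thresholds and rules are assumptions made purely for demonstration.

        learners = [
            {"id": "a", "videos_watched": 30, "quizzes_done": 8, "finished": True},
            {"id": "b", "videos_watched": 3, "quizzes_done": 0, "finished": False},
            {"id": "c", "videos_watched": 12, "quizzes_done": 2, "finished": False},
        ]

        # Descriptive: what happened (overall completion rate).
        completion_rate = sum(l["finished"] for l in learners) / len(learners)
        print(f"completion rate: {completion_rate:.0%}")

        # Diagnostic: why (learners who disengaged early, here: very few videos watched).
        disengaged = [l["id"] for l in learners if l["videos_watched"] < 5]

        # Predictive: who is likely to drop out (a simple heuristic threshold).
        at_risk = [l["id"] for l in learners if not l["finished"] and l["quizzes_done"] < 3]

        # Prescriptive: what to do (e.g. nudge at-risk learners with a reminder).
        for learner_id in at_risk:
            print(f"nudge learner {learner_id}: reminder email with a link to the next quiz")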

    A Systematic Literature Review of Empirical Studies on Learning Analytics in Educational Games

    Learning analytics (LA) in educational games is considered an emerging practice due to its potential to enhance the learning process. Growing research on formative assessment has shed light on the ways in which students' meaningful and in-situ learning experiences can be supported through educational games. To understand learners' playful experiences during gameplay, researchers have applied LA, which focuses on understanding students' in-game behaviour trajectories and personal learning needs during play. However, there is a lack of studies exploring how further research on LA in educational games can be conducted. Only a few analyses have discussed how LA has been designed, integrated, and implemented in educational games. Accordingly, this systematic literature review examined how LA in educational games has evolved. The study findings suggest that: (1) there is an increasing need to consider factors such as student modelling, iterative game design and personalisation when designing and implementing LA through educational games; and (2) the use of LA creates several challenges from technical, data management and ethical perspectives. In addition to outlining these findings, this article offers important notes for practitioners and discusses the implications of the study's results.

    A systematic review on business analytics

    Purpose: Business analytics, a buzzword of the recent decade, has been applied by thousands of enterprises to help generate more value and enhance their business performance. However, many aspects of business analytics remain unclear. This study explores different perspectives on the definition of business analytics and its relation to business intelligence. Moreover, we illustrate the applications of business analytics in both business areas and industry sectors and shed light on education in business analytics. Ultimately, to facilitate future research, we summarize several research techniques used in the literature reviewed. Design/methodology/approach: We set well-established selection criteria to select relevant literature from two widely recognized databases: Web of Science and Scopus. Based on the bibliometric information of the selected papers, we conducted a bibliometric analysis. Afterwards, we reviewed the literature and coded relevant sections inductively using MAXQDA. We then compared and synthesized the coded information. Findings: There are four main findings. Firstly, according to the bibliometric analysis, the literature about business analytics is growing exponentially. Secondly, business analytics is a system enabled by machine learning techniques that aims to promote the efficiency and performance of an organization by supporting the decision-making process. Thirdly, the application of business analytics is comprehensive, not only in specific areas of a company but also across different industry sectors. Finally, business analytics is interdisciplinary, and successful training should involve technical, analytical, and business skills. Originality/value: This systematic review, as a synthesis of the current research on business analytics, can serve as a quick guide for new researchers and practitioners in the field, while experienced scholars can also benefit from this work by taking it as a practical reference.

    License to evaluate: Preparing learning analytics dashboards for educational practice

    Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built with the purpose of empowering teachers and learners to make informed decisions about the learning process. Related work has investigated learning dashboards, yet none have explored the theoretical foundation that should inform the design and evaluation of such interventions. In this systematic literature review, we analyse the extent to which theories and models from learning sciences have been integrated into the development of learning dashboards aimed at learners. Our analysis revealed that very few dashboard evaluations take into account the educational concepts that were used as a theoretical foundation for their design. Furthermore, we report findings suggesting that comparison with peers, a common reference frame for contextualising information on learning analytics dashboards, was not perceived positively by all learners. We summarise the insights gathered through our literature review in a set of recommendations for the design and evaluation of learning analytics dashboards for learners.
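    A minimal sketch of the peer-comparison reference frame mentioned above, placing one learner's score against the class distribution; the scores and the percentile helper are hypothetical, and given the finding reported here such a view might be offered as an optional rather than a default frame.

        from statistics import median

        # Hypothetical class scores and the current learner's score.
        class_scores = [45, 52, 60, 63, 70, 71, 78, 85, 90]
        my_score = 63

        def percentile_rank(score, scores):
            """Share of peers scoring at or below the given score."""
            return sum(s <= score for s in scores) / len(scores)

        print(f"class median: {median(class_scores)}")
        print(f"your score: {my_score} (percentile rank {percentile_rank(my_score, class_scores):.0%})")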