
    Ethical and privacy issues in the application of learning analytics

    The large-scale production, collection, aggregation, and processing of information from various learning platforms and online environments have led to ethical and privacy concerns regarding potential harm to individuals and society. In the past, such concerns have affected areas as diverse as computer science, legal studies, and surveillance studies. Within a European consortium that brings together the EU project LACE, the SURF SIG Learning Analytics, the Apereo Foundation, and the EATEL SIG dataTEL, we aim to understand these issues with greater clarity and to find ways of overcoming the research challenges related to the ethical and privacy aspects of learning analytics practice. This interactive workshop aims to raise awareness of major ethics and privacy issues and to develop practical solutions that advance the application of learning analytics technologies.

    The Lifecycle of Sustainable Analytics: From Data Collection to Change Management

    In this age of an ever-increasing list of analytics vendors and endlessly forwarded news articles trumpeting the promises of big data in higher education, it is easy to become distracted by data science and miss another opportunity: supporting increased professionalism among university staff, faculty, and administrators. Indeed, like many technologies before it, analytics provides an opportunity to catalyze institutional effectiveness, but only when we resist the tendency to believe that technology can replace human ingenuity and judgment. This report argues that such threats to professional flourishing can be insulated against if administrators in higher education are willing to imbue analytics initiatives with a focus on increased data literacy, professional autonomy, and human collaboration. Our initial successes with focusing on the human element in analytics are explored, accompanied by evidence supporting this approach.

    A learning analytics pilot in Moodle and its impact on developing organisational capacity in a university

    © ASCILITE 2017 - Conference Proceedings - 34th International Conference of Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education. All rights reserved. Moodle is used as a learning management system around the world. However, integrated learning analytics solutions for Moodle that provide actionable information and allow teachers to use it efficiently to connect with their students are lacking. The enhanced Moodle Engagement Analytics Plugin (MEAP), presented at ASCILITE 2015, enabled teachers to identify and contact students at risk of not completing their units. Here, we discuss a pilot of MEAP in 36 units at Macquarie University, a metropolitan Australian university. We use existing models for developing organisational capacity in learning analytics, and for embedding learning analytics into the practice of teaching and learning, to discuss a range of issues arising from the pilot. We outline the interaction and interdependency of five stages during the pilot: technology infrastructure, analytics tools and applications; policies, processes, practices and workflows; values and skills; culture and behaviour; and leadership. We conclude that one of the most significant stages is developing a culture and behaviour around learning analytics.

    Refining the Learning Analytics Capability Model: A Single Case Study

    Learning analytics can help higher education institutions improve learning. Its adoption, however, is a complex undertaking. The Learning Analytics Capability Model describes the 34 organizational capabilities that must be developed to support the successful adoption of learning analytics. This paper describes the first iteration of evaluating and refining the current, theoretical model. During a case study, we conducted four semi-structured interviews and collected (internal) documentation at a Dutch university that is mature in its use of student data to improve learning. Based on the empirical data, we merged seven capabilities, renamed three, and improved the definitions of all others. Six capabilities absent from extant learning analytics models are present at the case organization, implying that they are important to learning analytics adoption. As a result, the refined Learning Analytics Capability Model comprises 31 capabilities. Finally, some challenges were identified, showing that even mature organizations still have issues to overcome.

    Towards learning analytics adoption: A mixed methods study of data-related practices and policies in Latin American universities

    In Latin American universities, Learning Analytics (LA) has been perceived as a promising opportunity to leverage data to meet the needs of a diverse student cohort. Although universities have been collecting educational data for years, the adoption of LA in this region is still limited due to the lack of expertise and policies for processing and using educational data. In order to get a better picture of how existing data-related practices and policies might affect the incorporation of LA in Latin American institutions, we conducted a mixed methods study in four Latin American universities (two Chilean and two Ecuadorian). In this paper, the qualitative data were based on 37 interviews with managers and 16 focus groups with 51 teaching staff and 45 students; the quantitative data were collected through two surveys answered by 1884 students and 368 teachers, respectively. The findings reveal opportunities to incorporate LA services into existing data practices in the four case studies. However, the lack of reliable information systems and policies to regulate the use of data imposes challenges that need to be overcome for future LA adoption.

    Quality Indicators for Learning Analytics

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence of the impact of learning analytics on educational practices. The criteria of the framework and its quality indicators are based on the results of a Group Concept Mapping study conducted with experts from the field of learning analytics. The outcomes of this study are further extended with findings from a focused literature review.

    The Relationship Between i-Ready Diagnostic and 10th Grade Students' High-Stakes Mathematics Test Scores, by Heath Andrew Thompson

    Twenty percent of the 2013-2014 sophomore class at a Washington high school was failing high-stakes tests, making these students ineligible to graduate. In an attempt to help students identify their academic proficiency with respect to the Common Core Curricular Standards 9 months before the high-stakes exam, the high school recently introduced the adaptive diagnostic software i-Ready. Cognitive learning theories, which posit that learning depends on prior knowledge and is central to measuring performance levels, comprised the framework for this study. The purpose of this quantitative correlational project study was to examine whether 10th grade students' i-Ready math scores (N = 220) could predict subsequent high-stakes mathematics scores on the End of Course Exam while controlling for gender, ethnicity, and socioeconomic status. The i-Ready emerged as a statistically significant predictor of End of Course Exam scores, with β = .64 (p < .001), explaining R² = .43 of the criterion variance. Gender, ethnicity, and socioeconomic status had no significant moderating influence. The project deliverable resulting from this study was a position paper advising the use of i-Ready as a predictor of End of Course Exam performance at the high school under study. The implications for positive social change include allowing educators to use i-Ready as an early warning system for students in danger of failing high-stakes exams. This study may help identify students at risk of not graduating who could benefit from instructional support.
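The kind of analysis reported in this abstract (an outcome regressed on a predictor plus demographic covariates) can be sketched in a few lines. This is a hypothetical reconstruction, not the study's actual analysis: the synthetic data, coefficient values, and 0/1 covariate coding are all assumptions made for illustration.

```python
import numpy as np

# Synthetic stand-in for the study's data (n = 220, as in the abstract).
rng = np.random.default_rng(42)
n = 220
iready = rng.normal(500, 50, n)              # diagnostic scale scores (assumed scale)
gender = rng.integers(0, 2, n).astype(float)  # 0/1 dummy covariates (assumed coding)
low_ses = rng.integers(0, 2, n).astype(float)
# Assumed data-generating process: exam score driven mainly by i-Ready performance.
exam = 20 + 0.12 * iready + 1.5 * gender - 2.0 * low_ses + rng.normal(0, 6, n)

# Ordinary least squares: intercept + predictor + control covariates.
X = np.column_stack([np.ones(n), iready, gender, low_ses])
beta, *_ = np.linalg.lstsq(X, exam, rcond=None)

# R^2 = proportion of outcome variance explained by the fitted model.
pred = X @ beta
r2 = 1 - np.sum((exam - pred) ** 2) / np.sum((exam - exam.mean()) ** 2)
print(f"i-Ready slope: {beta[1]:.3f}, R^2: {r2:.3f}")
```

With real data, moderation by gender, ethnicity, or socioeconomic status would additionally be tested by adding interaction terms (e.g. `iready * gender`) to the design matrix.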

    El Proceso de Implementación de Analíticas de Aprendizaje

    With the surge in popularity of learning analytics over the last decade, numerous research studies have emerged and public opinion has echoed the trend as well. However, the impact the field has had in practice has been quite limited, and there has been little transfer to educational institutions. One possible cause is the high complexity of the field and the absence of clear implementation processes. In this work, we therefore propose a pragmatic implementation process for learning analytics in five stages: 1) learning environments, 2) raw data capture, 3) data tidying and feature engineering, 4) analysis and modelling, and 5) educational application. In addition, we review a series of transverse factors that affect this implementation, such as technology, learning sciences, privacy, institutions, and educational policies. The detailed process can be helpful for researchers, educational data analysts, teachers, and educational institutions looking to start working in this area. Achieving the true potential of learning analytics will require close collaboration and conversation among all the actors involved in its development, which might eventually lead to the desired systematic and productive implementation.
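The middle stages of a five-stage process like the one this abstract proposes (raw data capture, data tidying and feature engineering, analysis) can be illustrated with a minimal sketch. The event schema, action names, and engagement threshold below are assumptions for illustration, not part of the paper.

```python
from collections import defaultdict

# Stage 2: raw clickstream events captured from a learning environment
# (hypothetical schema: one dict per logged student action).
raw_events = [
    {"student": "s1", "action": "view_resource"},
    {"student": "s1", "action": "submit_quiz"},
    {"student": "s2", "action": "view_resource"},
    {"student": "s1", "action": "post_forum"},
]

# Stage 3: tidy the raw log into per-student feature counts.
features = defaultdict(lambda: defaultdict(int))
for ev in raw_events:
    features[ev["student"]][ev["action"]] += 1

# Stage 4: a toy engagement model -- total activity vs. an assumed threshold.
ENGAGEMENT_THRESHOLD = 2
flags = {s: sum(acts.values()) >= ENGAGEMENT_THRESHOLD
         for s, acts in features.items()}
print(flags)  # stage 5 would feed these flags back to teachers
```

In practice, stage 3 is usually the costliest step, and stage 4 would use a real statistical or machine-learning model rather than a fixed threshold.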

    The SHEILA framework: informing institutional strategies and policy processes of learning analytics

    This paper introduces a learning analytics policy and strategy framework developed by a cross-European research project team, SHEILA (Supporting Higher Education to Integrate Learning Analytics), based on interviews with 78 senior managers from 51 European higher education institutions across 16 countries. The framework was developed by adapting the RAPID Outcome Mapping Approach (ROMA), which is designed to develop effective strategies and evidence-based policy in complex environments. This paper presents four case studies to illustrate the development process of the SHEILA framework and how it can be used iteratively to inform strategic planning and policy processes in real-world environments, particularly for large-scale implementation in higher education contexts. To this end, the selected cases were analyzed at two stages, a year apart, to investigate the progression of the adoption approaches followed to solve existing challenges, and to identify new challenges that could be addressed by following the SHEILA framework.