From dirty data to multiple versions of truth: How different choices in data cleaning lead to different learning analytics outcomes
Learning analytics is the analysis of student data with the purpose of improving learning. However, the process of data cleaning remains underexposed in the learning analytics literature. In this paper, we elaborate on the choices made while cleaning student data and their consequences. We illustrate this with a case in which data were gathered during six courses taught via Moodle. In this data set, only 21% of the logged activities were linked to a specific course. We illustrate possible choices in dealing with missing data by applying the cleaning process twelve times, each with different choices, to copies of the raw data. Consequently, the analysis of the data shows varying outcomes. As the purpose of learning analytics is to intervene based on analyses and visualizations, it is of utmost importance to be aware of the choices made during data cleaning. This paper's main goal is to make stakeholders of (learning) analytics activities aware that choices made during data cleaning have consequences for the outcomes. We believe that there should be transparency towards the users of these outcomes and that they should be given a detailed report of the decisions made.
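The effect described in this abstract can be illustrated with a minimal, hypothetical sketch (the data and column names below are made up and not taken from the paper): two common choices for handling log records that lack a course identifier, dropping them versus attributing them to the course the same user visited last, already yield different activity counts per course, and any dashboard built on those counts would tell a different story.

```python
import pandas as pd

# Hypothetical Moodle-style event log. In the paper's case, only 21% of the
# logged activities carried a course id; the rest are unlinked (None here).
raw = pd.DataFrame({
    "user":   ["s1", "s1", "s1", "s2", "s2"],
    "course": ["A",  None, None, "B",  None],
    "action": ["view", "view", "quiz", "view", "quiz"],
})

# Cleaning choice 1: drop every event that is not linked to a course.
drop_missing = raw.dropna(subset=["course"])

# Cleaning choice 2: attribute unlinked events to the course the same user
# visited most recently (forward fill per user).
impute_forward = raw.copy()
impute_forward["course"] = impute_forward.groupby("user")["course"].ffill()

print(drop_missing.groupby("course").size())    # A: 1, B: 1
print(impute_forward.groupby("course").size())  # A: 3, B: 2
```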
Refining the Learning Analytics Capability Model: A Single Case Study
Learning analytics can help higher educational institutions improve learning. Its adoption, however, is a complex undertaking. The Learning Analytics Capability Model describes the 34 organizational capabilities that must be developed to support the successful adoption of learning analytics. This paper describes the first iteration to evaluate and refine the current, theoretical model. During a case study, we conducted four semi-structured interviews and collected (internal) documentation at a Dutch university that is mature in its use of student data to improve learning. Based on the empirical data, we merged seven capabilities, renamed three capabilities, and improved the definitions of all others. Six capabilities absent from extant learning analytics models are present at the case organization, implying that they are important to learning analytics adoption. As a result, the new, refined Learning Analytics Capability Model comprises 31 capabilities. Finally, some challenges were identified, showing that even mature organizations still have issues to overcome.
Learning Analytics in het onderwijs: Een onderwijskundig perspectief
https://www.surf.nl/kennisbank/2016/rapport-learning-analytics-in-het-onderwijs-een-onderwijskundig-perspectief.html

Learning analytics in educational practice
More insight into the educational process, targeted feedback for students and, ultimately, better education: that is the idea behind learning analytics. The potential of learning analytics is considerable, but how does a programme or institution apply it successfully? That stands or falls with the way learning analytics is applied in educational practice.
Designing online education
Learning analytics only really works if we succeed in asking the right questions of the data. That already starts when designing online education. For the report 'Learning analytics in het onderwijs: een onderwijskundig perspectief', we investigated, together with representatives from higher education, how learning analytics can be used effectively in an educational design. In a number of cases, we also show how this can work in educational practice.
Support and inspiration for teachers and educational developers
The report supports and inspires teachers and educational developers in applying learning analytics in online education. For example, they can collect data on how students click through an online environment, which videos they watch, and which other digital footprints they leave, and on what this says about their learning behaviour.
Supporting Learning Analytics Adoption: Evaluating the Learning Analytics Capability Model in a Real-World Setting
Although learning analytics benefits learning, its uptake by higher educational institutions remains low. Adopting learning analytics is a complex undertaking, and higher educational institutions lack insight into how to build the organizational capabilities needed to successfully adopt learning analytics at scale. This paper describes the ex-post evaluation of a capability model for learning analytics via a mixed-method approach. The model intends to help practitioners such as program managers, policymakers, and senior management by providing them with a comprehensive overview of the necessary capabilities and their operationalization. Qualitative data were collected during pluralistic walk-throughs with 26 participants at five educational institutions and a group discussion with seven learning analytics experts. Quantitative data about the model's perceived usefulness and ease of use were collected via a survey (n = 23). The study's outcomes show that the model helps practitioners plan learning analytics adoption at their higher educational institutions. The study also shows the applicability of pluralistic walk-throughs as a method for the ex-post evaluation of Design Science Research artefacts.
Hogeschool Utrecht starts learning analytics experiment
Interview with researcher Justian Knobbout about an experiment with learning analytics
Designing the learning analytics capability model
Educational institutions in higher education encounter various barriers when scaling up to institution-wide learning analytics. This doctoral research focuses on designing a model of the capabilities that institutions need to develop in order to remove these barriers and thus maximise the benefits of learning analytics.
Refined definitions of LACM capabilities: Changes made to the definitions of capabilities of the Learning Analytics Capability Model
Addition to https://www.online-journals.org/index.php/i-jai/article/view/1279
A Capability Model for Learning Analytics Adoption: Identifying Organizational Capabilities from Literature on Learning Analytics, Big Data Analytics, and Business Analytics
Despite the promises of learning analytics and the existence of several learning analytics implementation frameworks, the large-scale adoption of learning analytics within higher educational institutions remains low. Extant frameworks either focus on a specific element of learning analytics implementation, for example, policy or privacy, or lack operationalization of the organizational capabilities necessary for successful deployment. Therefore, this literature review addresses the research question "What capabilities for the successful adoption of learning analytics can be identified in existing literature on big data analytics, business analytics, and learning analytics?" Our research is grounded in resource-based view theory and we extend the scope beyond the field of learning analytics and include capability frameworks for the more mature research fields of big data analytics and business analytics. This paper's contribution is twofold: 1) it provides a literature review on known capabilities for big data analytics, business analytics, and learning analytics and 2) it introduces a capability model to support the implementation and uptake of learning analytics. During our study, we identified and analyzed 15 key studies. By synthesizing the results, we found 34 organizational capabilities important to the adoption of analytical activities within an institution and provide 461 ways to operationalize these capabilities. Five categories of capabilities can be distinguished: Data, Management, People, Technology, and Privacy & Ethics. Capabilities presently absent from existing learning analytics frameworks concern sourcing and integration, market, knowledge, training, automation, and connectivity. Based on the results of the review, we present the Learning Analytics Capability Model: a model that provides senior management and policymakers with concrete operationalizations to build the necessary capabilities for successful learning analytics adoption.
Where Is the Learning in Learning Analytics?: A Systematic Literature Review to Identify Measures of Affected Learning
Learning analytics is the analysis and visualization of student data with the purpose of improving education. Literature reporting on measures of the effects of data-driven pedagogical interventions on learning, and on the environment in which this takes place, allows us to assess in what way learning analytics actually improves learning. We conducted a systematic literature review aimed at identifying such measures of data-driven improvement. A review of 1034 papers yielded 38 key studies, which were thoroughly analyzed on aspects such as objective, affected learning, and their operationalization (measures). Based on prevalent learning theories, we synthesized a classification scheme comprising four categories: learning process, student performance, learning environment, and departmental performance. Most of the analyzed studies relate to either student performance or the learning process. Based on the results, we recommend making deliberate decisions on the (multiple) aspects of learning one tries to improve by applying learning analytics. Our classification scheme with examples of measures may help both academics and practitioners in doing so, as it allows for the structured positioning of learning analytics benefits.
What challenges are holding us back from adopting learning analytics?: Insights from Dutch higher educational institutions
The institutional adoption of learning analytics in the Netherlands is still low. This chapter presents a study on the challenges that Dutch higher educational institutions encounter when adopting learning analytics. The literature describes possible challenges regarding assets, data governance, data literacy, data quality, organizational culture, pedagogical grounding, privacy and ethics, and technical issues. Eight interviews with practitioners from four universities verified that all these challenges are causing problems for Dutch institutions as well. The practitioners provided recommendations on how to overcome these adoption challenges. Higher educational institutions need to demonstrate the value of learning analytics, provide users with training, clearly identify users' needs, and establish a 'one-stop-shop' that acts as a single contact point within the organization. Combined with recommendations already present in the literature, this helps accelerate the successful adoption of learning analytics by Dutch higher educational institutions.