9 research outputs found

    Analysis of student achievement scores via cluster analysis

    No full text
    In education, the overall performance of every student is an important issue when assessing the quality of teaching. However, in the traditional educational system not all students have the same opportunity to develop their academic skills in an efficient way. Different teaching techniques have been proposed to adapt the learning process to the student profile. In this work, we analyze student profiles according to their performance on academic activities, taking into account two different evaluation systems: work-based assessment and knowledge-based assessment. To this aim, data was collected during the fall semester of 2019 from a physics course at Universidad Loyola Andalucía, in Seville, Spain. To study the student profiles, a clustering approach combined with supervised feature selection was applied. Results suggest that two student profiles are clearly distinguished according to their performance in the course under both evaluation approaches; these profiles correspond to students who pass and students who fail the course. The output of the analysis also indicates that some features are redundant and/or irrelevant. Therefore, machine learning techniques may be helpful for the design of effective activities to enhance the student learning process in this physics course.
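    The abstract does not give implementation details, but the combination it names (unsupervised clustering of assessment scores plus a supervised feature-relevance check) can be sketched as below. This is only an illustration under stated assumptions: the feature names, the synthetic data, the two-cluster k-means step, and the mutual-information ranking are all choices made here, not details taken from the paper.

```python
# Hypothetical sketch: cluster student scores, then rank features by how
# informative they are about the course outcome. Data and estimator choices
# are assumptions, not the paper's actual pipeline.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import mutual_info_classif
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic scores: work-based (labs, homework) and knowledge-based (exams)
# assessments for 100 students, drawn from two latent groups (pass / fail).
pass_group = rng.normal(loc=[7.5, 7.0, 8.0, 7.2], scale=0.8, size=(60, 4))
fail_group = rng.normal(loc=[4.0, 3.5, 4.5, 3.8], scale=0.8, size=(40, 4))
X = np.vstack([pass_group, fail_group])
passed = np.array([1] * 60 + [0] * 40)  # course outcome, used only for feature ranking

X_std = StandardScaler().fit_transform(X)

# Unsupervised step: look for two student profiles in the score space.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_std)

# Supervised step: rank features by mutual information with the outcome;
# near-zero scores would flag redundant or irrelevant features.
feature_names = ["lab_reports", "homework", "midterm_exam", "final_exam"]
scores = mutual_info_classif(X_std, passed, random_state=0)
for name, s in sorted(zip(feature_names, scores), key=lambda t: -t[1]):
    print(f"{name:>14s}: {s:.3f}")
```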

    Explainable Artificial Intelligence for Human-Centric Data Analysis in Virtual Learning Environments

    No full text
    The amount of data to analyze in virtual learning environments (VLEs) grows exponentially every day. The daily interaction of students with VLE platforms represents a digital footprint of the students' engagement with the learning materials and activities. This large and valuable source of information needs to be managed and processed to be useful. Educational Data Mining and Learning Analytics are two research branches that have recently emerged to analyze educational data. Artificial Intelligence techniques are commonly used to extract hidden knowledge from data and to construct models that could be used, for example, to predict students' outcomes. However, in the educational field, where the interaction between humans and AI systems is a main concern, there is a need to develop Explainable AI (XAI) systems that are able to communicate data analysis results in a human-understandable way. In this paper, we use an XAI tool, called ExpliClas, with the aim of facilitating data analysis in the context of the decision-making processes carried out by all the stakeholders involved in the educational process. The Open University Learning Analytics Dataset (OULAD) has been used to predict students' outcomes, and both graphical and textual explanations of the predictions have shown the need for and the effectiveness of using XAI in the educational field.
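    The paper itself uses the ExpliClas tool on OULAD; the sketch below is not that pipeline. It is only a generic illustration of the underlying idea (an interpretable outcome classifier whose prediction can be rendered as human-readable rules), with synthetic stand-ins for OULAD-style interaction features. All names and thresholds here are assumptions.

```python
# Hypothetical sketch: an interpretable pass/fail classifier with a textual
# explanation. Generic illustration only; not the ExpliClas tool, and the
# features are synthetic stand-ins for OULAD interaction data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)

# Synthetic VLE features: total clicks, assessments submitted, average score.
n = 300
clicks = rng.integers(0, 2000, size=n)
submitted = rng.integers(0, 10, size=n)
avg_score = rng.uniform(0, 100, size=n)
X = np.column_stack([clicks, submitted, avg_score])
y = ((0.002 * clicks + 0.3 * submitted + 0.05 * avg_score) > 4.5).astype(int)

feature_names = ["total_clicks", "assessments_submitted", "avg_assessment_score"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# A shallow tree can be printed as if/else rules, one simple way to give
# stakeholders a textual account of why a student is predicted to pass or fail.
print(export_text(tree, feature_names=feature_names))
```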

    Linking assessment and learning analytics to support learning processes in higher education

    Full text link
    In higher education, assessments are mostly used for summative purposes such as grading and certifying. However, assessments can also support learning processes by offering formative feedback to learners about their current performance and how to improve it. Even though such feedback might enhance learners' self-regulated learning, it is used infrequently due to resource constraints. In addition, the competences, skills, and knowledge that should be assessed are ever more complex. To derive valid inferences about learners' current performance, ongoing assessments across contexts are desirable. With the advancing use of digital learning environments, learning analytics are also receiving increasing attention in higher education. However, learning analytics are still not sufficiently linked to learning theory and lack empirical evidence. Hence, the purpose of this paper is to propose how theory on assessment and related feedback can be linked to learning analytics with regard to supporting self-regulated learning. To that end, relevant concepts of assessment, assessment design, and feedback, as well as current perspectives on learning analytics, are introduced. Based on this theoretical foundation, a conceptual integrative framework and potential learning analytics features were derived. The framework, its implications, and further research needs are discussed.