5 research outputs found

    A Learning Analytics-informed Activity to Improve Student Performance in a First Year Physiology Course

    Learning Analytics (LA) can be employed to identify course-specific factors that hinder student performance and course outcomes, which can then be addressed through targeted interventions. Supplementing interventions with predictive modelling also permits the identification of students who are at risk of failing the course and encourages their participation. LA findings suggested that a targeted intervention for our course should focus on improving student short answer question (SAQ) performance, which we attempted to achieve by improving students' understanding of the features that distinguish SAQ answer standards and how to meet them, using example answers of varying scores. Every student was invited to the intervention via a course-wide announcement on the course learning management system; at-risk students identified by the predictive models received an additional invitation in the form of a personalised email. Results suggest that the intervention improved student understanding of SAQ performance criteria. The intervention also enhanced end-of-semester SAQ performance by 12% and 11% for at-risk and no-risk students respectively, and the course failure rate was lower by 26% and 9% among at-risk and no-risk intervention participants. Student perception of the intervention was also positive, with an overwhelming majority of participants (96%) finding the activity useful for their learning and exam preparation.
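
    The abstract describes using predictive models to flag at-risk students for a personalised email invitation, but gives no implementation detail. As a minimal illustrative sketch only, the step might look like the following; the logistic-regression model, the column names, and the 0.5 failure-probability threshold are assumptions, not details from the study.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical gradebook from a previous cohort: early assessment marks plus
# the course outcome (1 = failed) used to train the model.
train = pd.DataFrame({
    "quiz_1":   [55, 80, 40, 90, 35, 70],
    "mid_sem":  [50, 75, 38, 85, 30, 65],
    "saq_mark": [45, 70, 30, 80, 25, 60],
    "failed":   [1, 0, 1, 0, 1, 0],
})
features = ["quiz_1", "mid_sem", "saq_mark"]
model = LogisticRegression(max_iter=1000).fit(train[features], train["failed"])

# Current cohort: estimate each student's probability of failing and flag
# those above the (assumed) threshold for the personalised email invitation.
current = pd.DataFrame({
    "student_id": ["s101", "s102", "s103"],
    "quiz_1":   [42, 78, 60],
    "mid_sem":  [40, 72, 58],
    "saq_mark": [35, 68, 55],
})
current["p_fail"] = model.predict_proba(current[features])[:, 1]
print(current.loc[current["p_fail"] >= 0.5, ["student_id", "p_fail"]])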

    Redesigning a First Year Physiology Course using Learning Analytics to Improve Student Performance

    Learning analytics (LA), a fast-emerging concept in higher education, is used to understand and optimize the student learning process and the environment in which it occurs. Knowledge obtained from the LA paradigm is often used to construct statistical models aimed at identifying students who are at risk of failing the unit/course, and to subsequently design interventions targeted at improving the course outcomes of these students. In previous studies, models were constructed using a wide variety of variables, but emerging evidence suggests that models constructed using course-specific variables are more accurate and provide a better understanding of the learning context. In the current study, student performance in the various course assessment tasks was used as the basis for the predictive models and for future intervention design, as assessment tasks are conventionally used to evaluate student learning outcomes and the degree to which the course learning objectives are met. Further, students in our course are primarily first-year university students who are still unfamiliar with the learning and assessment context of higher education; this prevents them from adequately preparing for the tasks and consequently reduces their course performance and outcomes. We first constructed statistical models to identify students at risk of failing the course and to identify the assessment tasks that students in our course find challenging, as a guide for the design of future interventional activities. Every constructed predictive model had an excellent capacity to discriminate between students who passed the course and those who failed. Analysis revealed that not only at-risk students, but the whole cohort, would benefit from interventions improving their conceptual understanding and their ability to construct high-scoring answers to Short Answer Questions.
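
    The abstract reports that every model had an excellent capacity to discriminate between passing and failing students but does not state how this was measured. One common way to quantify discrimination is the cross-validated area under the ROC curve (AUC); the sketch below uses synthetic marks and a logistic regression purely as an illustration, since the model family and the evaluation metric are assumptions here.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic assessment-task marks (rows = students) and pass/fail labels;
# these stand in for the real course-specific variables.
X = rng.uniform(20, 95, size=(200, 4))
y = (X.mean(axis=1) + rng.normal(0, 10, size=200) < 55).astype(int)  # 1 = failed

# Cross-validated AUC: ~0.5 is chance-level discrimination, while values
# close to 1.0 correspond to an excellent capacity to discriminate.
auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                      cv=5, scoring="roc_auc")
print(f"AUC per fold: {np.round(auc, 2)}, mean = {auc.mean():.2f}")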

    A metacognitive activity to enhance student understanding of complexity of a threshold concept in biology

    Background, Aims and Methods
    Threshold concepts are transformative, but also likely to be troublesome, for undergraduate students. Metacognitive activities that expose students to the structural complexity of a threshold concept, organised in terms of the Structure of Observed Learning Outcomes (SOLO) taxonomy, have been shown to improve student learning outcomes in a third-year engineering course (Meyer et al., 2015). The current study aimed to emphasise to a large cohort of first-year biology students the structural complexity of the concept of ‘cell membrane transfer’ and to improve their understanding of the concept. The metacognitive activity was divided into several parts. In the first stage, students were asked to answer an open-ended question on how substances are transferred across the cell membrane. They were then asked to mark their own answer on a scale of 1-10 and to justify their mark by selecting one of five statements of ascending complexity based on the SOLO taxonomy. Subsequently, students were provided with nine model answers to the question, which varied in structural complexity, and were asked to mark each answer out of 10; an instructor then explained their own marking of the nine answers and the justification for those marks. Lastly, students were asked to revisit their own answer, re-mark it out of 10, and provide justification. After the class, the instructor marked each student response, providing a score and justification that could be compared with the students' pre- and post-scores and justifications.
    Results and Discussion
    It was evident that students find it difficult to identify the variation in complexity of a threshold concept. More than 50% of students were unable to match the instructor score for each model answer, with the exception of only the least complex, and hence lowest scored, answer. The self-assigned student post-score of their own answers was marginally greater than the instructor's score (
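
    The reported comparison between student and instructor marks on the nine model answers can be summarised as a per-answer match rate. The sketch below uses fabricated marks and an assumed exact-match criterion, since the abstract does not specify how agreement was scored.

import numpy as np

rng = np.random.default_rng(1)
n_students = 120

# Instructor marks (out of 10) for the nine model answers, ordered from the
# least to the most structurally complex -- illustrative values only.
instructor = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9])

# Fabricated student marks for the same nine answers, with some disagreement.
student = np.clip(instructor + rng.integers(-2, 3, size=(n_students, 9)), 0, 10)

# Proportion of students whose mark exactly matched the instructor's, per answer.
match_rate = (student == instructor).mean(axis=0)
for i, rate in enumerate(match_rate, start=1):
    print(f"model answer {i}: {rate:.0%} matched the instructor")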

    SOLO-based task to improve self-evaluation and capacity to integrate concepts in first-year physiology students

    An accurate self-assessment of their own work can enhance student learning and subsequently improve academic performance. Instructors can facilitate this process by providing "standards" that students can use as feedback when self-evaluating their understanding. Traditional forms of feedback, such as marked assessment tasks, are limited in their ability to serve as standards, as they do not adequately capture the variations that correspond to different levels of understanding. To develop a complex understanding in physiology, students have to integrate concepts pertaining to different subcomponents of body systems. The present study attempted to ascertain whether exposing students to variations in complexity would refine their ability to self-evaluate their understanding and their capacity to integrate concepts. Students were tasked with answering an essay-length, open-ended physiology question to expose their current understanding of the topic. The change in students' self-marking of their answer, before and after being exposed to the variations in conceptual understanding of the topic, was used to determine whether improvements in self-evaluation accuracy occurred. These variations were presented as instructor-generated answers to the open-ended question, framed using the Structure of the Observed Learning Outcome (SOLO) taxonomy. Student scores on the integrative questions of the end-of-semester exam were used as a measure of student ability to integrate concepts. Findings indicated that the intervention led to improvements in student self-evaluation and exam performance, and the positive outcomes were replicated across multiple iterations of the activity.
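
    Self-evaluation accuracy here is the closeness of a student's self-mark to the instructor's mark, measured before and after exposure to the SOLO-framed exemplar answers. Below is a minimal sketch with fabricated marks, using the absolute student-instructor gap and a paired Wilcoxon test; both the metric and the test are assumptions, not the study's stated analysis.

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
n = 150

# Fabricated marks out of 10: the instructor's mark for each student's answer,
# plus the student's self-mark before and after seeing the SOLO exemplars.
instructor = rng.integers(2, 9, size=n)
pre_self = np.clip(instructor + rng.integers(-3, 4, size=n), 0, 10)
post_self = np.clip(instructor + rng.integers(-1, 2, size=n), 0, 10)

# Self-evaluation accuracy as the absolute gap to the instructor's mark;
# a paired test asks whether the gap shrank after the activity.
gap_pre = np.abs(pre_self - instructor)
gap_post = np.abs(post_self - instructor)
stat, p = wilcoxon(gap_pre, gap_post)
print(f"median gap: pre = {np.median(gap_pre)}, post = {np.median(gap_post)}, p = {p:.3g}")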