
    L-carnitine attenuates cardiac impairment but not vascular dysfunction in DOCA-salt hypertensive rats

    L-Carnitine is an important co-factor in fatty acid metabolism by mitochondria. This study determined whether oral administration of L-carnitine prevents remodelling and the development of impaired cardiovascular function in deoxycorticosterone acetate (DOCA)-salt hypertensive rats (n = 6–12; #p < 0.05 versus DOCA-salt). Uninephrectomized rats administered DOCA (25 mg every 4th day s.c.) and 1% NaCl in drinking water for 28 days developed cardiovascular remodelling, shown as systolic hypertension, left ventricular hypertrophy, increased thoracic aortic and left ventricular wall thickness, increased left ventricular inflammatory cell infiltration together with increased interstitial collagen, increased passive diastolic stiffness, and vascular dysfunction with increased plasma malondialdehyde concentrations. Treatment with L-carnitine (1.2% in food; 0.9 mg/g/day in DOCA-salt rats) decreased blood pressure (DOCA-salt 169 ± 2; + L-carnitine 148 ± 6# mmHg), decreased left ventricular wet weights (DOCA-salt 3.02 ± 0.07; + L-carnitine 2.72 ± 0.06# mg/g body-wt), decreased inflammatory cells in the replacement fibrotic areas, reduced left ventricular interstitial collagen content (DOCA-salt 14.4 ± 0.2; + L-carnitine 8.7 ± 0.5# % area), reduced the diastolic stiffness constant (DOCA-salt 26.9 ± 0.5; + L-carnitine 23.8 ± 0.5#, dimensionless) and decreased plasma malondialdehyde concentrations (DOCA-salt 26.9 ± 0.8; + L-carnitine 21.2 ± 0.4# μmol/l), without preventing endothelial dysfunction. L-Carnitine attenuated the cardiac remodelling and improved cardiac function in DOCA-salt hypertension but produced minimal changes in aortic wall thickness and vascular function. This study suggests that the mitochondrial respiratory chain is a significant source of reactive oxygen species in the heart but less so in the vasculature in DOCA-salt rats, underlying the relatively selective cardiac responses to L-carnitine treatment.

    A Learning Analytics-informed Activity to Improve Student Performance in a First Year Physiology Course

    Learning Analytics (LA) can be employed to identify course-specific factors that hinder student course performance, which can subsequently be rectified using targeted interventions. Supplementing interventions with predictive modelling also permits the identification of students who are at risk of failing the course and encourages their participation. LA findings suggested that a targeted intervention for our course should focus on improving student short answer question (SAQ) performance, which we attempted to achieve by improving students' understanding of the features of various SAQ answer standards and how to achieve them, using examples of varying scores. Every student was invited to the intervention via a course-wide announcement through the course learning management system. At-risk students identified using predictive models were given an additional invitation in the form of a personalised email. Results suggest that the intervention improved student understanding of SAQ performance criteria. The intervention also enhanced student end-of-semester SAQ performance by 12% and 11% for at-risk and no-risk students respectively. The course failure rate was also lower by 26% and 9% among at-risk and no-risk intervention participants. Student perception of the intervention was positive, with an overwhelming majority of participants (96%) finding the interventional activity useful for their learning and exam preparation.
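The at-risk flagging step described in the abstract could, in its simplest form, look something like the sketch below. The predictors (early quiz average, weekly LMS logins), weights, and threshold are all illustrative assumptions, not the course's actual predictive model:

```python
# Minimal sketch of flagging at-risk students from early-semester data.
# Feature names, weights, and the 0.5 threshold are illustrative
# assumptions, not the actual model used in the study.

def risk_score(quiz_avg, lms_logins_per_week, max_quiz=100.0, max_logins=10.0):
    """Combine two normalised predictors into a 0-1 risk score
    (higher = more likely to fail the course)."""
    quiz_term = 1.0 - min(quiz_avg / max_quiz, 1.0)
    login_term = 1.0 - min(lms_logins_per_week / max_logins, 1.0)
    return 0.7 * quiz_term + 0.3 * login_term  # weights are assumptions

def flag_at_risk(students, threshold=0.5):
    """Return IDs of students whose risk score exceeds the threshold;
    these students would receive the additional personalised email."""
    return [sid for sid, (quiz, logins) in students.items()
            if risk_score(quiz, logins) > threshold]

students = {"s1": (35.0, 1.0), "s2": (85.0, 6.0), "s3": (55.0, 2.0)}
print(flag_at_risk(students))  # → ['s1', 's3']
```

A production model would typically be fitted (e.g. logistic regression) rather than hand-weighted, but the flag-then-invite workflow is the same.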

    The effect of grading matrix assessment on student performance in a large first year biology class

    In our large first year biology course, 'Cells to Organisms', taken by 400-900 students per semester, we aimed to provide students with clear links between the course delivery framework and assessment. We wanted students' grades to reflect higher-order learning of key concepts of cellular and tissue biology, achievement of related practical skills, understanding of the nature of evidence, and communication of science. In Semester 1, 2008, student grades were determined by the traditional weighted average of marks for the assessment tasks. Since Semester 2, 2008, the course has been graded with a grading matrix specifying standards for practical reports, practical competencies, communication tasks and knowledge for the grades of 7 (best) to 1 (worst). Analysis of results for the subsequent three semesters showed that 84-90% of students obtained a passing grade, and that 75% of those students achieved 60% or greater in the final examination, a marked improvement compared with about 50% of students in Semester 1, 2008. Their knowledge also improved, with a 5% increase in the average mark in the final examination. The grading matrix resulted in improved student engagement with, and performance in, the assessment areas and graduate attributes addressed in the course.

    MATHBENCH Australia: Does it reach the expectations of both biologists and mathematicians?

    It is well documented that life science students require Quantitative Skills (QS): the ability to apply mathematical and statistical thinking and reasoning, especially in the context of science. However, the relatively low numbers of students completing higher-level secondary school mathematics, and the lack of mathematics prerequisites for many Australian university science degrees, have resulted in many students lacking the QS expected by life science academics. This poses a challenge for both life science academics and mathematicians: to raise the level of students' QS. This lack of QS was targeted by the OLT MathBench project. The recently released online MathBench Australia (http://mathbench.org.au) Biology Modules are Australianised versions of the original US MathBench modules. The MathBench Australia Biology Modules can be used comfortably by Australian life science academics in their science subjects to build the QS of their students. The OLT MathBench project had mathematicians and biologists collaborating from start to finish, to ensure that the content is appropriate and correct from both the life science academics' and the mathematicians' viewpoints. The first part of this presentation captures the biologists' view of the content and use of the MathBench Australia modules. The second part addresses the quantitative aspects of the modules and their fit with reported data on the QS needs of life science students.
    Aims: One aim was to investigate whether the QS that life science academics want in their graduates are covered in MathBench Australia. Another was to find out how the interviewees used MathBench Australia in their subjects, whether they perceived it as useful, and how it could be improved.
    Design and methods: Interviews were conducted with the biologists involved in the project. The questions covered information about the students, the QS needed for the subject, the delivery, and the appropriateness of content. The QS in MathBench Australia were compared to the previously reported data (Rylands et al., 2013) on the QS needs and requirements of life science students.
    Results and conclusion: Various MathBench Australia modules were used in life science subjects, and in various different ways. Overall, the biologists were very positive, finding the friendly and conversational tone, together with correct scientific language, to be appropriate. Some improvements were proposed. On the whole, the statistics and mathematics were agreed to have been well covered in MathBench Australia, although the modules cover very basic mathematics and statistics, and do not go as far as calculus.
    References: Rylands, L., Simbag, V., Matthews, K., Coady, C., & Belward, S. (2013). Scientists and mathematicians collaborating to build quantitative skills in undergraduate science. International Journal of Mathematical Education in Science and Technology, 44(6), 834-845.

    Enhancing student short answer question performance using exemplars of varying complexity

    The short answer question (SAQ) is one of the mainstay methods by which student conceptual understanding is evaluated. However, students in some courses are only exposed to these questions and the associated feedback at the end of the university semester, which limits their capacity to generate responses complex enough to address the question. This study aimed to improve student capacity to evaluate, and then generate, complex SAQ responses using exemplars of different performance standards. Students were exposed to the exemplars in an activity implemented in one of the lectures during the semester. The change in students' ability to identify their own SAQ performance level was determined by comparing student self-marking accuracy before and after exposure to the exemplars. Student scores in the SAQs of the end-of-semester (EOS) exam were used to determine whether the exemplars improved their capacity to construct complex responses. Exemplars improved student (n = 114) capacity to identify their own SAQ performance. Furthermore, students who attended the activity generated responses that garnered higher scores for SAQs in the EOS exam and performed better in the course overall when compared to those who did not.

    Analysis of student behavioural patterns in the use of a virtual laboratory: A comparison of cohorts from two different disciplines

    Background: Virtual laboratories are learning tools used to prepare students for downstream "live" laboratory tasks. They are intended to provide students with computer-simulated experimental experiences that support and enrich learning in the corresponding real-life situations. However, prior research on student learning styles in virtual labs, and on differences between cohorts, is limited.
    Aims: To analyse online data retrieved from a virtual pharmacology laboratory module used by science and pharmacy student cohorts, in order to determine how students engage with the module.
    Description of intervention: We collected detailed information regarding student interactions with the virtual lab experience, which was analysed and then compared across the two cohorts.
    Design and methods: The virtual pharmacology laboratory was based on experiments that tested the effects of increasing drug concentrations on muscle tissue contraction to determine drug potency. Students worked in groups of three, with pharmacy students in first semester (53 groups) and science students in second semester (55 groups). Students completed the task within practical class time but without instruction by the academics or tutors present in the session. In addition to recording the time taken to complete the module, the online server also recorded all mouse-click events in real time, such as selection and use of equipment, preparing drug solutions and constructing graphical plots. The two cohorts were compared on the time taken to complete the module (one-way ANOVA), and on the frequencies of errors committed by students during the module (two-tailed Fisher's exact test).
    Results: Science students completed the overall task in a significantly shorter duration than pharmacy students. However, pharmacy students acquired individual key objectives using the correct experimental approach, while science students tended to exploit shortcuts to achieve these objectives. Errors committed by students included incorrect use of laboratory equipment (pipettors, organ baths), inappropriate preparation of materials needed to generate expected outcomes (drug solutions and diluents), and failure to adhere to the standard protocol for obtaining plots and pharmacological data. These errors were generally significantly more frequent in the science cohort than in the pharmacy cohort.
    Conclusions: Science students are willing to take shortcuts to complete virtual laboratory tasks, whereas pharmacy students are more methodical and less likely to take risks in their approach. In coming semesters, we aim to present these data to the science students as an informed teaching-practice guide, in order to enhance our teaching of practical-based material.
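The between-cohort comparison of error frequencies can be illustrated with a self-contained two-sided Fisher's exact test for a 2×2 table. The error counts in the example call are invented for illustration; they are not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the table [[a, b], [c, d]]:
    the sum of probabilities of all tables with the same margins whose
    probability does not exceed that of the observed table."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2
    def p_table(x):
        # Hypergeometric probability of x in the top-left cell.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Illustrative (assumed) counts: students committing a pipetting error
# vs not, in the science (55-group) and pharmacy (53-group) cohorts.
p = fisher_exact_2x2(20, 35, 8, 45)
print(f"two-sided p = {p:.4f}")
```

Each error category in the study would get its own 2×2 table and test of this form; in practice a library routine such as `scipy.stats.fisher_exact` would be used instead of a hand-rolled version.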

    Changes in Biology Self-Efficacy during a First-Year University Course

    Academic self-efficacy encompasses judgments regarding one's ability to perform academic tasks and is correlated with achievement and persistence. This study describes changes in biology self-efficacy during a first-year course. Students (n = 614) were given the Biology Self-Efficacy Scale at the beginning and end of the semester. The instrument consisted of 21 questions ranking confidence in performing biology-related tasks on a scale from 1 (not at all confident) to 5 (totally confident). The results demonstrated that students increased in self-efficacy during the semester. High school biology and chemistry contributed to self-efficacy at the beginning of the semester; however, this relationship was lost by the end of the semester, when experience within the course became a significant contributing factor. A proportion of high- and low-achieving students (24% and 40%, respectively) had inaccurate self-efficacy judgments of their ability to perform well in the course. In addition, female students were significantly less confident than males overall, and high-achieving female students were more likely than males to underestimate their academic ability. These results suggest that the Biology Self-Efficacy Scale may be a valuable resource for tracking changes in self-efficacy in first-year students and for identifying students with poorly calibrated self-efficacy perceptions.
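The scoring and calibration logic described above can be sketched as follows. The confidence cut-off and the pass grade (on the 7-to-1 Australian grade scale) are illustrative assumptions, not the thresholds used in the study:

```python
# Sketch of scoring the 21-item, 5-point Biology Self-Efficacy Scale and
# flagging miscalibrated judgments. Cut-offs are illustrative assumptions.

def self_efficacy_score(responses):
    """Mean of 21 Likert items (1 = not at all confident ... 5 = totally)."""
    assert len(responses) == 21 and all(1 <= r <= 5 for r in responses)
    return sum(responses) / len(responses)

def calibration(score, grade, high_cut=4.0, pass_grade=4):
    """Compare confidence with achievement (grade on a 7-best to 1-worst
    scale) and classify the self-efficacy judgment."""
    confident = score >= high_cut
    achieved = grade >= pass_grade
    if confident and not achieved:
        return "overestimate"
    if achieved and not confident:
        return "underestimate"
    return "calibrated"

print(calibration(self_efficacy_score([2] * 21), grade=6))  # → underestimate
```

In this sketch, a high-achieving student with low confidence is the "underestimate" case the abstract associates with high-achieving female students.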

    Are students reading my feedback? Using a feedback analytics capture system to understand how large cohorts of biomedical science students use feedback

    Feedback is one of the most potent teaching strategies known to produce student learning gains (Hattie, 2009). However, the provision of feedback has been identified as one of the weakest elements of university practice (Graduate Careers Australia, 2012). Although there are many theoretical frameworks for improving feedback provision (Hattie & Timperley, 2007; Nicol & Macfarlane-Dick, 2006; Sadler, 2010), little is known about how students actually use feedback (Jonsson, 2013). Many authors contend that students commonly ignore feedback (Boud & Molloy, 2013), with some empirical evidence that students do not collect or read written feedback (Sinclair & Cleland, 2007), or ignore it when they do not understand what it means (Still & Koerber, 2010). The increasingly widespread adoption of online marking and feedback tools facilitates students' access to their feedback, but until now there has been no systematic characterisation of the patterns of student access to this feedback, nor of how this impacts on their subsequent performance (Ellis, 2013). We have developed, and extensively trialled, a Feedback Analytics Capture System (FACS, previously called UQMarkUP) which synthesises large-scale data on digital feedback provision, how students access feedback, and changes in students' academic performance (Zimbardi et al., 2013). Specifically, FACS captures detailed information about the audio, typed and hand-drawn annotations that markers insert in situ in electronic assessment submissions, and the marks awarded across a variety of systems, including detailed criteria-standards rubrics. FACS also collects detailed information about how students access this feedback, logging the timing and nature of every mouse click a student uses to interact with the feedback-embedded document. In this exploratory study, we investigated the frequency, timing, and patterns in how students access their feedback. Analyses of FACS data from laboratory reports submitted for summative assessment in two biomedical science courses at level 1 (n = 1781 students) and level 2 (n = 389), in Semesters 1 and 2, 2013, revealed that the vast majority of students opened their feedback. In the level 1 course, 93% of students opened Report 1, 92% opened Report 2, 87% opened Report 3 and 85% opened Report 4. In contrast, fewer students in the level 2 course opened their feedback, and fewer opened Report 1 (68%) than Report 2 (82%). A similar declining pattern existed for how long level 1 students had their feedback open (Report 1: 12±8 hours; Report 2: 3.4±1.6 hours; Report 3: 2.1±1.4 hours; Report 4: 43±7 minutes), whereas in the level 2 course the duration pattern was reversed, with greater interaction with Report 1 (5.6±0.6 hours) than Report 2 (1.2±0.3 hours). The number of students accessing feedback surged 1-2 days after feedback release, followed by a persistent tail of students accessing the feedback over the subsequent two months. In this context of undergraduate biomedical science laboratory assessments, students are not only collecting and reading their feedback, but interacting with it extensively. There may also be maturational, course-specific, and interaction effects that shape feedback use, which require further exploration as we expand this feedback analytics approach across a broader range of educational contexts.
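The open-rate figures reported above come from aggregating per-student access events. A minimal sketch of that aggregation is shown below; the event-log layout ("student", "report", "action" keys) is an assumption for illustration, not the actual FACS schema:

```python
# Sketch of computing per-report feedback open rates from a click log.
# The log layout is an illustrative assumption, not the FACS schema.

def open_rate(events, enrolled, report):
    """Fraction of enrolled students with at least one 'open' event
    for the given report."""
    openers = {e["student"] for e in events
               if e["report"] == report and e["action"] == "open"}
    return len(openers) / enrolled

# Toy log: two of four enrolled students opened Report 1.
events = [
    {"student": "a", "report": 1, "action": "open"},
    {"student": "b", "report": 1, "action": "open"},
    {"student": "a", "report": 2, "action": "open"},
]
print(open_rate(events, enrolled=4, report=1))  # → 0.5
```

Using a set of student IDs rather than a raw event count ensures repeated opens by the same student are not double-counted; duration-of-interaction figures would be derived separately from event timestamps.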

    Motivating students and improving engagement in biology units using online QS modules

    MathBench biology modules represent one example of how biology educators can incorporate materials that improve quantitative skills (QS) and reasoning into introductory courses. The MathBench Australia project aims not only to ensure that the science and mathematics content of the MathBench (USA) modules is accurate and appropriate to an Australian context, but also to minimise students' negative attitudes towards quantitative skills and to increase student engagement. Hence, in this ideas exchange we will explore strategies to embed the contextualised MathBench modules in first and second year science units to improve student engagement and students' QS.