87 research outputs found

    Utilisation and evaluation of cooperative case-based teaching for integration of microbiology and pharmacology in veterinary education

    Purpose: Integrating basic sciences with clinical disciplines while fostering clinical reasoning capabilities is difficult. We investigated the use of diagnostic specimens and a cooperative, case-based learning and teaching model to integrate principles of antimicrobial drug pharmacology and microbiology in the fifth year of a veterinary course. Methods: In small groups, students were assigned diagnostic specimens from which they isolated and identified clinically relevant microorganisms and then performed antimicrobial susceptibility tests based on a review of pharmacology, microbiology and pathophysiology. Results were recorded and analysed, followed by a student-led integrative tutorial. Learning outcomes were assessed via individually written reports discussing the disease process, interpretation of diagnostic results, and recommendations and rationales for therapeutic interventions. Results: This approach yielded high-quality student reports that conformed to antimicrobial prescription guidelines, with consistently high summative assessment scores. Mean scores for the final report in this learning activity were 82 ± 12%, 80 ± 12% and 80 ± 11% for the 2015, 2016 and 2017 cohorts, respectively; over the same period, 98 ± 1% of students indicated that these learning activities facilitated the development of confidence, professional knowledge and skills. Discussion: This approach consistently integrated principles of veterinary pharmacology and microbiology into clinical disciplines. These data illustrate the benefit of a systematic application of a cooperative, case-based learning and teaching model in integrating pre-clinical and clinical disciplines in a bachelor of veterinary science course.

    The impacts of degraded vegetation on water flows: a case study in the Mzimvubu catchment.

    Master’s degree. University of KwaZulu-Natal, Pietermaritzburg.
    The Mzimvubu River is the largest undeveloped river course in South Africa, with the Mzimvubu catchment set to undergo high levels of both social and economic development. A study was undertaken for the catchment to determine the impacts of different land use management scenarios on catchment water flows through the use of the ACRU model. The verification stage of the study involved modelling the baseline scenarios of two preselected catchments, viz. T35C and T32A/B/C, in order to perform statistical comparisons of simulated and observed streamflow. Whilst a number of the desired statistics were outside the ±15% confidence range, the differences between observed and simulated variances and standard deviations were well within the range, and the R2 and Nash-Sutcliffe Efficiency Index (Ef) values, though not exceeding 0.7, were deemed acceptable. The verification of the two Mzimvubu catchments was not ideal, and it was hypothesised that this may have been due, in part, to the parameterisation of degraded areas in the ACRU model configuration. Degradation of vegetation can be considered in a number of different ways (from loss of cover through to bush encroachment and poor burning practice), although in ACRU it has only been modelled as a pure loss of vegetative cover. A methodology for determining vegetation parameters was thus developed from Leaf Area Index (LAI) data for 2008-2017 for sites within degraded areas and pristine veld areas within protected sites, and included calculation of the crop coefficient, interception and percentage surface cover parameters that were then used within ACRU as the degraded vegetation parameters. These parameters were then input into the model, with simulations being run for both study catchments using both the Kristensen and FAO dual crop coefficients, as well as a set of simulations using degraded parameters calculated by applying a percentage change (between 10 and 15% difference) to the existing Acocks veld parameters within the model. This percentage change yielded very minor changes to the initial verification simulations; however, the two other sets of runs, using the different crop coefficients, both made significant changes to the verification simulations. The T32A/B/C simulation improved by almost 20% and was only just outside the ±15% range for the Kristensen set of runs. The T35C simulation, on the other hand, worsened, although a challenge existed insofar as only the natural and degraded vegetation Hydrological Response Units (HRUs) had updated parameters; the large amount of commercial forestry, a known streamflow reduction activity (SFRA), within the catchment could have played a role in the under-simulation of all the catchment's model runs. Lastly, land use change scenarios were modelled by changing both vegetative parameters and the area of different HRUs within both the T35C and T32A/B/C catchments. The scenarios modelled considered land degradation in its many forms: the degradation of natural vegetation and subsequent rehabilitation, the increase in bush encroachment, differing severities and timing of burning, changes in areas under irrigated and dryland agriculture, and the conversion of traditional dryland crops to biofuel crops. These different scenarios proved to have different sensitivities to change, although all scenarios showed a lessening in sensitivity as the area under change increased.
Given the problems with both rainfall and streamflow records, further research on remote sensing and satellite imagery could provide another source of both climatic and land use data. Further to this, the methodology used to determine the degraded vegetation parameters using remotely sensed data was shown to be an explicit and repeatable method and can be extended to incorporate the calculation of parameters for other land uses, such as forestry and agricultural practices. This could be done in conjunction with in situ studies to test whether the methodology works for all types of land use.
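For readers unfamiliar with the verification statistics quoted above, the sketch below shows one standard way to compute the Nash-Sutcliffe Efficiency Index (Ef) and R2 between observed and simulated streamflow. It is a minimal illustration: the function names and example flow values are assumptions for demonstration, not data or code from the thesis or the ACRU model.

```python
# Minimal sketch of the two verification statistics named in the abstract.
import numpy as np

def nash_sutcliffe(obs: np.ndarray, sim: np.ndarray) -> float:
    """Ef = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs: np.ndarray, sim: np.ndarray) -> float:
    """Squared Pearson correlation between observed and simulated flows."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Hypothetical monthly streamflow values (m^3/s), for illustration only.
obs = np.array([12.0, 8.5, 5.2, 3.1, 2.4, 6.8])
sim = np.array([10.9, 9.1, 4.7, 3.5, 2.1, 7.4])
print(f"Ef = {nash_sutcliffe(obs, sim):.2f}, R2 = {r_squared(obs, sim):.2f}")
```

A value of 1.0 on either measure indicates a perfect match; the thesis deemed values not exceeding 0.7 acceptable, which is why the ±15% checks on variances and standard deviations were reported alongside them.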

    Cost-effectiveness of left ventricular assist devices (LVADs) for patients with advanced heart failure: analysis of the British NHS Bridge to Transplant (BTT) program

    Background: A previous cost-effectiveness analysis showed that bridge to transplant (BTT) with early design left ventricular assist devices (LVADs) for advanced heart failure was more expensive than medical management while appearing less beneficial. Older LVADs were pulsatile, but current second and third generation LVADs are continuous flow pumps. This study aimed to estimate the comparative cost-effectiveness of BTT with durable implantable continuous flow LVADs compared to medical management in the British NHS. Methods and results: A semi-Markov multi-state economic model was built using NHS cost data and patient data in the British NHS Blood and Transplant Database (BTDB). Quality-adjusted life years (QALYs) and incremental costs per QALY were calculated for patients receiving LVADs compared to those receiving inotrope-supported medical management. LVADs cost £80,569 ($127,887) at 2011 prices and delivered greater benefit than medical management. The estimated probabilistic incremental cost-effectiveness ratio (ICER) was £53,527 ($84,963)/QALY (95% CI: £31,802–£94,853; $50,479–$150,560) over a lifetime horizon. Estimates were sensitive to the choice of comparator population, relative likelihood of receiving a heart transplant, time to transplant, and LVAD costs. Reducing the device cost by 15% decreased the ICER to £50,106 ($79,533)/QALY. Conclusions: Durable implantable continuous flow LVADs deliver greater benefits at higher costs than medical management in Britain. At the current UK threshold of £20,000 to £30,000/QALY, LVADs are not cost-effective, but the ICER now begins to approach that of an intervention for end-of-life care recently recommended by the British NHS. Cost-effectiveness estimates are hampered by the lack of randomized trials.
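The ICER quoted above is, by definition, the incremental cost divided by the incremental QALYs of LVAD support versus medical management; the sketch below shows that arithmetic. All inputs are hypothetical placeholders (the abstract does not report the comparator's costs or QALYs), chosen only so the ratio reproduces the quoted £53,527/QALY point estimate.

```python
# Minimal sketch of an incremental cost-effectiveness ratio (ICER).
# All figures are hypothetical placeholders, not BTDB model outputs.

def icer(cost_new: float, cost_comp: float, qaly_new: float, qaly_comp: float) -> float:
    """ICER = (incremental cost) / (incremental QALYs)."""
    return (cost_new - cost_comp) / (qaly_new - qaly_comp)

lvad_cost, mm_cost = 180_000.0, 72_946.0  # hypothetical lifetime costs (GBP)
lvad_qaly, mm_qaly = 3.0, 1.0             # hypothetical lifetime QALYs

print(f"ICER = £{icer(lvad_cost, mm_cost, lvad_qaly, mm_qaly):,.0f}/QALY")
# -> ICER = £53,527/QALY
```

In the published analysis these quantities come from a probabilistic semi-Markov model over a lifetime horizon, which is why the abstract reports a confidence interval around the ratio rather than a single deterministic value.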

    Non-invasive in vivo assessment of 11β-hydroxysteroid dehydrogenase type 1 activity by 19F-Magnetic Resonance Spectroscopy

    11β-Hydroxysteroid dehydrogenase type 1 (11β-HSD1) amplifies tissue glucocorticoid levels and is a pharmaceutical target in diabetes and cognitive decline. Clinical translation of inhibitors is hampered by the lack of in vivo pharmacodynamic biomarkers. Our goal was to monitor substrates and products of 11β-HSD1 non-invasively in liver via 19F magnetic resonance spectroscopy (19F-MRS). Interconversion of mono-/poly-fluorinated substrate/product pairs was studied in Wistar rats (male, n = 6) and healthy men (n = 3) using 7T and 3T MRI scanners, respectively. Here we show that the in vitro limit of detection, as absolute fluorine content, was 0.625 μmole in blood. Mono-fluorinated steroids, dexamethasone and 11-dehydrodexamethasone, were detected in phantoms but not in vivo in human liver following oral dosing. A non-steroidal polyfluorinated tracer, 2-(phenylsulfonyl)-1-(4-(trifluoromethyl)phenyl)ethanone, and its metabolic product were detected in vivo in rat liver after oral administration of the keto-substrate, reading out reductase activity. Administration of a selective 11β-HSD1 inhibitor in vivo in rats altered the total liver 19F-MRS signal. We conclude that there is insufficient sensitivity to measure mono-fluorinated tracers in vivo in man with current dosage regimens and clinical scanners. However, since reductase activity was observed in rats using poly-fluorinated tracers, this concept could be pursued for translation to man with further development.

    A systematic evidence map of research on Lyme disease in humans


    Serum antibodies against genitourinary infectious agents in prostate cancer and benign prostate hyperplasia patients: a case-control study

    Background: Infection plays a role in the pathogenesis of many human malignancies. Whether prostate cancer (PCa), an important health issue in the aging male population in the Western world, belongs to these conditions has been a matter of research since the 1970s. Persistent serum antibodies are proof of present or past infection. The aim of this study was to compare serum antibodies against genitourinary infectious agents between PCa patients and controls with benign prostate hyperplasia (BPH). We hypothesized that elevated serum antibody levels or higher seroprevalence in PCa patients would suggest an association between genitourinary infection in patient history and elevated PCa risk. Methods: A total of 434 males who had undergone open prostate surgery in a single institution were included in the study: 329 PCa patients and 105 controls with BPH. The subjects' serum samples were analysed by means of enzyme-linked immunosorbent assay, complement fixation test and indirect immunofluorescence for the presence of antibodies against common genitourinary infectious agents: human papillomavirus (HPV) 6, 11, 16, 18, 31 and 33, herpes simplex virus (HSV) 1 and 2, human cytomegalovirus (CMV), Chlamydia trachomatis, Mycoplasma hominis, Ureaplasma urealyticum, Neisseria gonorrhoeae and Treponema pallidum. Antibody seroprevalence and mean serum antibody levels were compared between cases and controls. Tumour grade and stage were correlated with serological findings. Results: PCa patients were more likely to harbour antibodies against Ureaplasma urealyticum (odds ratio (OR) 2.06; 95% confidence interval (CI) 1.08-4.28). Men with BPH were more often seropositive for HPV 18 and Chlamydia trachomatis (OR 0.23; 95% CI 0.09-0.61 and OR 0.45; 95% CI 0.21-0.99, respectively) and had higher mean serum CMV antibody levels than PCa patients (p = 0.0004). Among PCa patients, antibodies against HPV 6 were associated with a higher Gleason score (p = 0.0305). Conclusions: Antibody seropositivity against the analysed pathogens, with the exception of Ureaplasma, does not seem to be a risk factor for PCa pathogenesis. The presence or higher levels of serum antibodies against the genitourinary pathogens studied were not consistently associated with PCa. Serostatus was not a predictor of disease stage in the studied population.
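As background to the odds ratios reported above, the sketch below shows how a case-control odds ratio and its Wald 95% confidence interval are conventionally derived from a 2x2 seropositivity table. The counts are hypothetical, chosen only to land near the reported OR of 2.06 for Ureaplasma urealyticum; they are not the study's data.

```python
# Minimal sketch: odds ratio and Wald 95% CI from a 2x2 table.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a, b = seropositive/seronegative cases; c, d = seropositive/seronegative controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts: 69 of 329 cases and 12 of 105 controls seropositive.
or_, lo, hi = odds_ratio_ci(69, 329 - 69, 12, 105 - 12)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An OR above 1 with a CI excluding 1 (as for Ureaplasma) indicates higher seropositivity among cases; ORs below 1 (as for HPV 18 and Chlamydia trachomatis) indicate higher seropositivity among the BPH controls.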

    The impact of the SARS‐CoV‐2 pandemic and COVID‐19 on lung transplantation in the UK: Lessons learned from the first wave

    BACKGROUND: Lung transplantation is particularly susceptible to the impact of the severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) pandemic, and evaluation of changes to practice is required to inform future decision-making. METHODS: A retrospective review of the UK Transplant Registry (UKTR) and a national survey of UK lung transplant centers were performed. RESULTS: There was geographic variation in the prevalence of COVID-19 infection across the UK. The number of donors fell by 48% during the early pandemic period. Lung utilization fell to 10% (compared with 24% for the same period of 2019). The number of lung transplants performed fell by 77%, from 53 (March to May 2019) to 12. Seven (58%) of these were performed in a single center designated "COVID-light." The number of patients who died on the lung transplant waiting list increased compared with the same period of 2019 (p = .0118). Twenty-six lung transplant recipients with confirmed COVID-19 infection were reported during the study period. CONCLUSION: As the pandemic continues, reviewing practice and implementing the lessons learned during this period, including the use of robust donor testing strategies and the provision of "COVID-light" hospitals, are vital in ensuring the safe continuation of our lung transplant program.

    A four-year, systems-wide intervention promoting interprofessional collaboration

    Background: A four-year action research study was conducted across the Australian Capital Territory health system to strengthen interprofessional collaboration (IPC) through multiple intervention activities. Methods: We developed 272 substantial IPC intervention activities involving 2,407 face-to-face encounters with health system personnel. Staff attitudes toward IPC were surveyed yearly using Heinemann et al.'s Attitudes toward Health Care Teams scale and Parsell and Bligh's Readiness for Interprofessional Learning Scale (RIPLS). At the study's end, staff assessed whether project goals were achieved. Results: Of the improvement projects, 76 exhibited progress and 57 made considerable gains in IPC. Educational workshops and feedback sessions were well received and stimulated interprofessional activities. Over time, staff scores on Heinemann's Quality of Interprofessional Care subscale did not change significantly, and scores on the Doctor Centrality subscale increased, contrary to predictions. Scores on the RIPLS subscales of Teamwork & Collaboration and Professional Identity did not alter. On average across the assessment items, 33% of staff agreed that goals had been achieved, 10% disagreed, and 57% checked 'neutral'. There was most agreement that the study had resulted in increased sharing of knowledge between professions and improved quality of patient care, and least agreement that between-professional rivalries had lessened and that communication and trust between professions had improved. Conclusions: Our longitudinal interventional study of IPC involving multiple activities supporting increased IPC achieved many project-specific goals, but improvements in attitudes over time were not demonstrated and neutral assessments predominated, highlighting the difficulties faced by studies targeting change at the systems level over extended periods.
