
    Results of the British Society of Gastroenterology supporting women in gastroenterology mentoring scheme pilot.

    Introduction: Mentorship has long been recognised as beneficial in the business world and has more recently been endorsed by medical and academic professional bodies. Recruitment of women into gastroenterology and into leadership roles has traditionally been difficult. The Supporting Women in Gastroenterology network developed this pilot scheme for female gastroenterologists within 5 years either side of the Certificate of Completion of Specialist Training (CCST) to examine the role that mentorship could play in addressing this discrepancy. Method: Female gastroenterology trainees and consultant gastroenterologists within 5 years either side of CCST were invited to participate as mentees. Consultant gastroenterologists of both genders were invited to become mentors. Thirty-five mentor-mentee pairs were matched and completed the scheme over 1 year; training was provided. Results: The majority of the mentees found the sessions useful (82%) and enjoyable (77%), valuing the time and space to discuss professional or personal challenges with a gastroenterologist who is not a colleague. In the longitudinal study of job satisfaction, work engagement, burnout, resilience, self-efficacy, self-compassion and work-life balance, the burnout scale showed a small but non-significant improvement over the year (probably an effect of the small sample size), while personal accomplishment improved significantly. The main challenges were geography, available time to meet and pair matching. The majority of mentors surveyed found the scheme effective, satisfying, mutually beneficial (70%) and enjoyable (78%). Conclusion: Mentorship is beneficial despite these challenges and is likely to improve the recruitment and retention of women in gastroenterology and leadership roles, while also benefiting gastroenterologists of both genders.

    The antisaccade task as an index of sustained goal activation in working memory: modulation by nicotine

    The antisaccade task provides a laboratory analogue of situations in which execution of the correct behavioural response requires the suppression of a more prepotent or habitual response. Errors (failures to inhibit a reflexive prosaccade towards a sudden-onset target) are significantly increased in patients with damage to the dorsolateral prefrontal cortex and in patients with schizophrenia. Recent models of antisaccade performance suggest that errors are more likely to occur when the intention to initiate an antisaccade is insufficiently activated within working memory. Nicotine has been shown to enhance specific working memory processes in healthy adults. MATERIALS AND METHODS: We explored the effect of nicotine on antisaccade performance in a large sample (N = 44) of young adult smokers. Minimally abstinent participants attended two test sessions and were asked to smoke one of their own cigarettes between baseline and retest during one session only. RESULTS AND CONCLUSION: Nicotine reduced antisaccade errors and correct antisaccade latencies when delivered before optimum performance levels were achieved, suggesting that nicotine supports the activation of intentions in working memory during task performance. The implications of this research for current theoretical accounts of antisaccade performance, and for interpreting the increased rate of antisaccade errors found in some psychiatric patient groups, are discussed.

    A cross sectional evaluation of an alcohol intervention targeting young university students

    BACKGROUND: Hazardous drinking has been found to be higher among young university students than among their non-university peers. Although young university students are exposed to new and exciting experiences, including greater availability of alcohol and a stronger emphasis on social functions involving alcohol, there are few multi-strategy, comprehensive interventions aimed at reducing alcohol-related harms. METHODS: Random cross-sectional online surveys were administered to 18-24 year old students studying at the main campus of a large metropolitan university in Perth, Western Australia. Prior to the completion of the second survey, an alcohol intervention was implemented on campus. Completed surveys were received from 2465 students at baseline (T1) and 2422 at post-year 1 (T2). Students who consumed alcohol in the past 12 months were categorised as low-risk or hazardous drinkers using the Alcohol Use Disorders Identification Test (AUDIT). Because the two samples were cross-sectional, two-tailed two-proportion z-tests and two-sample t-tests were employed to determine statistical significance between the two time periods for categorical and continuous variables respectively. RESULTS: At T1 and T2, 89.1% and 87.2% of the total sample, respectively, reported drinking alcohol in the past month. Hazardous levels of alcohol consumption reduced slightly between T1 (39.7%) and T2 (38%). In both time periods hazardous drinkers reported significantly higher mean scores for experienced harm, second-hand harm and witnessed harm than low-risk drinkers (p < 0.001). Hazardous drinkers were also significantly more likely to experience academic problems due to their alcohol consumption and to report more positive alcohol expectations than low-risk drinkers at both time periods (p < 0.001).
    CONCLUSIONS: The harms and problems reported by hazardous drinkers are of concern, and efforts should be made to ensure that integrated and targeted strategies reach higher-risk students and focus on specific issues such as driving while intoxicated and alcohol-related unplanned sexual activity. However, there is also a need for universal strategies targeting all students and low-risk drinkers, as they too are exposed to alcohol harms within the drinking and social environment. Changing the culture of the university environment is a long-term aim; effecting change requires a sustained combination of organisational actions, partnerships and educational actions.
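
    The two-tailed two-proportion z-test used to compare the cross-sectional samples can be sketched as follows. The counts are back-calculated from the reported percentages and sample sizes, so this is an illustrative reconstruction, not the published analysis.

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z statistic using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Counts back-calculated from the abstract: 39.7% hazardous drinkers among
# 2465 respondents at T1 vs 38% among 2422 at T2 (illustrative, not exact).
z = two_proportion_z(round(0.397 * 2465), 2465, round(0.38 * 2422), 2422)
# |z| below the 1.96 threshold is consistent with the reported slight,
# non-significant reduction in hazardous drinking.
```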

    Outcome measurement in functional neurological disorder: a systematic review and recommendations.

    OBJECTIVES: We aimed to identify existing outcome measures for functional neurological disorder (FND), to inform the development of recommendations and to guide future research on FND outcomes. METHODS: A systematic review was conducted to identify existing FND-specific outcome measures and the most common measurement domains and measures in previous treatment studies. Searches of Embase, MEDLINE and PsycINFO covered January 1965 to June 2019. The findings were discussed during two international meetings of the FND-Core Outcome Measures group. RESULTS: Five FND-specific measures were identified (three clinician-rated and two patient-rated), but their measurement properties have not been rigorously evaluated. No single measure was identified for use across the range of FND symptoms in adults. Across randomised controlled trials (k=40) and observational treatment studies (k=40), outcome measures most often assessed core FND symptom change. Other commonly measured domains were additional physical and psychological symptoms, life impact (ie, quality of life, disability and general functioning) and health economics/cost-utility (eg, healthcare resource use and quality-adjusted life years). CONCLUSIONS: There are few well-validated FND-specific outcome measures. Thus, at present, we recommend that existing outcome measures, known to be reliable, valid and responsive in FND or closely related populations, are used to capture key outcome domains. Increased consistency in outcome measurement will facilitate comparison of treatment effects across FND symptom types and treatment modalities. Future work needs to more rigorously validate the outcome measures used in this population.

    The role of amputation as an outcome measure in cellular therapy for critical limb ischemia: implications for clinical trial design

    Background: Autologous bone marrow-derived stem cells have been ascribed an important therapeutic role in no-option critical limb ischemia (NO-CLI). One primary endpoint for evaluating NO-CLI therapy is major amputation (AMP), which is usually combined with mortality for AMP-free survival (AFS). Only a double-blinded trial can eliminate physician and patient bias as to the timing and reason for AMP. We examined factors influencing AMP in a prospective, double-blinded pilot RCT (2:1 therapy to control) of 48 patients treated with bone marrow cells obtained at the site of service (BMAC), as well as in a systematic review of the literature. Methods: Cells were injected intramuscularly into the CLI limbs as either BMAC or placebo (peripheral blood). Six-month AMP rates were compared between the two arms. Both the patients and the treating team were blinded to the assignment in follow-up examinations. A search of the literature identified 9 NO-CLI trials, the control arms of which were used to determine 6-month AMP rates and the influence of tissue loss. Results: Fifteen amputations occurred during the 6-month period, 86.7% of them during the first 4 months. One amputation occurred in a Rutherford 4 patient. The difference in amputation rate between patients with rest pain (5.6%) and those with tissue loss (46.7%), irrespective of treatment group, was significant (p = 0.0029). In patients with tissue loss, treatment with BMAC demonstrated a lower amputation rate than placebo (39.1% vs. 71.4%, p = 0.1337). The Kaplan-Meier time to amputation was longer in the BMAC group than in the placebo group (p = 0.067). Projecting these results to a pivotal trial, a bootstrap simulation model showed a significant difference in AFS between BMAC and placebo with a power of 95% for a sample size of 210 patients.
    A meta-analysis of the literature confirmed a difference in amputation rate between patients with tissue loss and those with rest pain. Conclusions: BMAC shows promise in improving AMP-free survival if the trends in this pilot study are validated in a larger pivotal trial. The difference in amputation rate between Rutherford 4 and 5 patients suggests that these patients should be stratified in future RCTs.
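
    The power projection described above can be illustrated with a simulation sketch. This simplifies the published analysis in two ways, both assumptions of mine: it uses the 6-month amputation rate alone (not the AFS composite endpoint), and it applies a plain two-proportion z-test rather than the authors' bootstrap model, so the resulting power estimate will differ from the reported 95%.

```python
import random

def simulated_power(p_treat, p_ctrl, n_treat, n_ctrl, reps=2000, seed=7):
    """Estimate power by simulating trials of the given size and counting
    how often a two-proportion z-test rejects at the 5% level."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(reps):
        # Draw binomial outcomes for each arm.
        x_t = sum(rng.random() < p_treat for _ in range(n_treat))
        x_c = sum(rng.random() < p_ctrl for _ in range(n_ctrl))
        pooled = (x_t + x_c) / (n_treat + n_ctrl)
        se = (pooled * (1 - pooled) * (1 / n_treat + 1 / n_ctrl)) ** 0.5
        if se > 0 and abs(x_t / n_treat - x_c / n_ctrl) / se > 1.96:
            rejections += 1
    return rejections / reps

# Hypothetical pivotal trial: 2:1 allocation of 210 patients (140 vs 70),
# using the pilot's tissue-loss amputation rates (39.1% vs 71.4%).
power = simulated_power(0.391, 0.714, 140, 70)
```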

    Survey of information technology in Intensive Care Units in Ontario, Canada

    Background: The Intensive Care Unit (ICU) is a data-rich environment where information technology (IT) may enhance patient care. We surveyed ICUs in the province of Ontario, Canada, to determine the availability, implementation and variability of information systems. Methods: A self-administered internet-based survey was completed by ICU directors between May and October 2006. We measured the spectrum of ICU clinical data accessible electronically, the availability of decision support tools, the availability of electronic imaging systems for radiology, the use of electronic order entry and medication administration systems, and the availability of hardware and wireless or mobile systems. We used Fisher's exact tests to compare IT availability and Classification and Regression Trees (CART) to estimate the optimal cut-point for the number of computers per ICU bed. Results: We obtained responses from 50 hospitals (68.5% of institutions with level 3 ICUs), of which 21 (42%) were university-affiliated. The majority electronically accessed laboratory data and imaging reports (92%) and used picture archiving and communication systems (PACS) (76%). Other computing functions were less prevalent (medication administration records 46%; physician or nursing notes 26%; medication order entry 22%). No association was noted between IT availability and ICU size or university affiliation. Sites used clinical information systems from 15 different vendors, and 8 different PACS systems were in use. Half of the respondents described the number of computers available as insufficient. Wireless networks and mobile computing systems were used in 23 ICUs (46%). Conclusion: Ontario ICUs demonstrate a high prevalence of basic information technology systems. However, implementation of the more complex and potentially more beneficial applications is low.
    The wide variation in vendors may impair information exchange, interoperability and uniform data collection.
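
    The Fisher's exact test mentioned in the methods can be sketched in pure Python for a 2x2 availability table. The example table is hypothetical (an illustrative split of the 22% order-entry figure across the 21 university-affiliated and 29 other ICUs), not data from the survey.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables (with the same
    margins) that are no more probable than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Hypothetical table: electronic order entry in 6/21 university-affiliated
# ICUs vs 5/29 community ICUs (illustrative only).
p = fisher_exact_two_sided(6, 15, 5, 24)
```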

    Autism as a disorder of neural information processing: directions for research and targets for therapy

    The broad variation in phenotypes and severities within autism spectrum disorders suggests the involvement of multiple predisposing factors, interacting in complex ways with normal developmental courses and gradients. Identification of these factors, and of the common developmental path into which they feed, is hampered by the large degree of convergence from causal factors to altered brain development, and of divergence from abnormal brain development into altered cognition and behaviour. Genetic, neurochemical, neuroimaging and behavioural findings on autism, as well as studies of normal development and of genetic syndromes that share symptoms with autism, offer hypotheses as to the nature of causal factors and their possible effects on the structure and dynamics of neural systems. Such alterations in neural properties may in turn perturb activity-dependent development, giving rise to a complex behavioural syndrome many steps removed from the root causes. Animal models based on genetic, neurochemical, neurophysiological and behavioural manipulations offer the possibility of exploring these developmental processes in detail, as do human studies addressing endophenotypes beyond the diagnosis itself.

    Quantitative Detection of Schistosoma japonicum Cercariae in Water by Real-Time PCR

    In China alone, an estimated 30 million people are at risk of schistosomiasis caused by the parasite Schistosoma japonicum. Disease has re-emerged in several regions that had previously attained transmission control, reinforcing the need for active surveillance. The environmental stage of the parasite is known to exhibit high spatial and temporal variability, and current detection techniques rely on a sentinel mouse method that has serious limitations for obtaining data in both time and space. Here we describe a real-time PCR assay to quantitatively detect S. japonicum cercariae in laboratory samples and in natural water spiked with known numbers of S. japonicum. Multiple primer sets were designed and assessed, and the best performing set, along with a TaqMan probe, was used to quantify S. japonicum. The resulting assay was selective, with no amplification detected for Schistosoma mansoni, Schistosoma haematobium, avian schistosomes or organisms present in non-endemic surface water samples. Repeated samples containing various concentrations of S. japonicum cercariae showed that the real-time PCR method had a strong linear correlation (R² = 0.921) with light microscopy counts, and the detection limit was below the DNA equivalent of half of one cercaria. Various cercarial concentrations spiked into 1 liter of natural water followed by a filtration process produced positive detection in 93% of samples analyzed. The real-time PCR method performed well in quantifying the relative concentrations of the various spiked samples, although the absolute concentration estimates exhibited high variance across replicated samples. Overall, the method has the potential to be applied to environmental water samples to produce a rapid, reliable assay for cercarial detection in endemic areas.
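
    Quantification in real-time PCR conventionally runs a dilution series of known concentrations, fits a linear standard curve of Ct against log10 concentration, and inverts it for unknown samples. The sketch below illustrates that general procedure; the dilution-series values and Ct numbers are hypothetical, not taken from the study.

```python
def fit_standard_curve(log10_conc, ct):
    """Least-squares fit of Ct = slope * log10(conc) + intercept, the usual
    qPCR standard-curve form (a slope near -3.32 implies ~100% efficiency)."""
    n = len(ct)
    mx, my = sum(log10_conc) / n, sum(ct) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, ct))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    return slope, my - slope * mx

def cercaria_equivalents(ct_sample, slope, intercept):
    """Invert the standard curve to estimate cercaria DNA equivalents."""
    return 10 ** ((ct_sample - intercept) / slope)

# Hypothetical dilution series: 1, 10, 100 and 1000 cercaria equivalents,
# with Ct values falling ~3.3 cycles per 10-fold increase in template.
slope, intercept = fit_standard_curve([0, 1, 2, 3], [33.2, 29.9, 26.6, 23.3])
estimate = cercaria_equivalents(28.25, slope, intercept)
```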