
    A comparison of course-related stressors in undergraduate problem-based learning (PBL) versus non-PBL medical programmes

    Background: Medical students report high levels of stress related to their medical training as well as to other personal and financial factors. The aim of this study is to investigate whether there are differences in course-related stressors reported by medical students on undergraduate problem-based learning (PBL) and non-PBL programmes in the UK. Method: A cross-sectional study of second-year medical students in two UK medical schools (one PBL and one non-PBL programme) was conducted. A 16-question self-report questionnaire, derived from the Perceived Medical Student Stress Scale and the Higher Education Stress Inventory, was used to measure course-related stressors. Following univariate analysis of each stressor between groups, multivariate logistic regression was used to determine which stressors were the best predictors of each course type, while controlling for socio-demographic differences between the groups. Results: A total of 280 students responded. Compared to the non-PBL students (N = 197), the PBL students (N = 83) were significantly more likely to agree that: they did not know what the faculty expected of them (Odds Ratio (OR) = 0.38, p = 0.03); there were too many small group sessions facilitated only by students resulting in an unclear curriculum (OR = 0.04, p < 0.0001); and that there was a lack of opportunity to explore academic subjects of interest (OR = 0.40, p = 0.02). They were significantly more likely to disagree that: there was a lack of encouragement from teachers (OR = 3.11, p = 0.02); and that the medical course fostered a sense of anonymity and feelings of isolation amongst students (OR = 3.42, p = 0.008). Conclusion: There are significant differences in the perceived course-related stressors affecting medical students on PBL and non-PBL programmes. 
Course designers and student support services should therefore tailor their work to minimise, or help students cope with, the specific stressors on each course type to ensure optimum learning and wellbeing among our future doctors.
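The univariate step described above, comparing each stressor between the PBL and non-PBL groups, reduces to a 2x2 table per stressor. A minimal sketch of the odds ratio with a Woolf (log-scale) 95% confidence interval; the counts below are hypothetical illustrations, not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: a/b = agree/disagree in one group,
    c/d = agree/disagree in the other."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Odds ratio with a Woolf (log-scale) 95% confidence interval."""
    or_ = odds_ratio(a, b, c, d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return (or_,
            math.exp(math.log(or_) - 1.96 * se),
            math.exp(math.log(or_) + 1.96 * se))

# Hypothetical counts: 30/83 PBL vs 40/197 non-PBL students agreeing with a stressor
print(or_ci95(30, 53, 40, 157))
```

An interval excluding 1 flags the stressor as a candidate for the multivariate step; the published analysis additionally adjusted for socio-demographic differences using logistic regression.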

    Indicators of ‘critical’ outcomes in 941 horses seen ‘out-of-hours’ for colic

    Background: This study aimed to describe the presentation and outcomes of horses with signs of colic (abdominal pain) seen ‘out-of-hours’ in equine practice. Methods: This was a retrospective study of horses seen ‘out-of-hours’ with colic by two equine veterinary practices between 2011 and 2013. Case outcomes were categorised as ‘critical’ or ‘not critical’. A critical outcome was defined as requiring medical or surgical hospital treatment, or resulting in euthanasia or death. A non-critical outcome was defined as resolving with simple medical treatment. A hierarchical generalised linear model was used to identify ‘red flag’ parameters (aspects of signalment, history and presenting clinical signs) associated with critical outcomes. Results: Data were retrieved from 941 cases that presented with colic; 23.9% (n=225/941) were critical. Variables significantly associated with the likelihood of a critical outcome in the final multivariable model were: increased heart rate (p < 0.001), age of the horse (p=0.013) and abnormal mucous membrane colour (p < 0.001). Overall, 18% of cases (n=168/941) were euthanased. Conclusions: This study highlights the mortality associated with colic. The ‘red flag’ parameters identified should be considered an essential component of the primary assessment of horses with colic.

    Retrospective case series to identify the most common conditions seen ‘out-of-hours’ by first-opinion equine veterinary practitioners

    Background: The study aim was to describe conditions seen ‘out-of-hours’ in equine practice. Methods: This was a retrospective case series of first-opinion ‘out-of-hours’ cases seen at two equine practices between 2011 and 2013. Data were retrieved on case presentation, diagnostic testing, treatment administered and outcome, and diseases were categorised using a systems-based coding system. A hierarchical logistic regression, formulated using a generalised linear model, was used to identify clinical variables associated with a binary outcome of ‘critical’ cases (those that required hospitalisation or euthanasia, or that died). Results: Data from 2,602 cases were analysed. The most common reasons for ‘out-of-hours’ visits were colic (35%, n=923/2,620), wounds (20%, n=511/2,620) and lameness (11%, n=288/2,620). The majority of cases required a single treatment (58%, n=1,475/2,550), 26% (n=656/2,550) needed multiple treatments, and 13% (n=339/2,550) were euthanased. Eighteen per cent (n=480/2,602) of cases had a critical outcome. Increased heart rate at primary presentation was associated with a critical outcome in both practices (Practice A, OR 1.07 (95% CI 1.06-1.09); Practice B, OR 1.08 (95% CI 1.07-1.09); p < 0.001). Conclusion: Colic, wounds and lameness were the most common equine ‘out-of-hours’ conditions; 13% of cases were euthanased. Further research is required into out-of-hours euthanasia decision-making.
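The heart-rate odds ratios in these two colic studies are per beat/min, so they compound multiplicatively over larger differences. A small illustration (the OR value is Practice A's from the abstract; the 20 bpm difference is an arbitrary example):

```python
def compounded_or(per_unit_or, delta):
    """Odds ratio implied by a `delta`-unit change in a predictor,
    given the per-unit odds ratio from a logistic model."""
    return per_unit_or ** delta

# Practice A reported OR 1.07 per extra beat/min at primary presentation
print(round(compounded_or(1.07, 20), 2))  # odds multiplier for +20 bpm
```

A horse presenting 20 bpm above another therefore has roughly 3.9 times the odds of a critical outcome, which is why heart rate functions well as a ‘red flag’ parameter.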

    Use of whole genome sequencing to determine genetic basis of suspected mitochondrial disorders: cohort study.

    OBJECTIVE: To determine whether whole genome sequencing can be used to define the molecular basis of suspected mitochondrial disease. DESIGN: Cohort study. SETTING: National Health Service, England, including secondary and tertiary care. PARTICIPANTS: 345 patients with suspected mitochondrial disorders recruited to the 100 000 Genomes Project in England between 2015 and 2018. INTERVENTION: Short read whole genome sequencing was performed. Nuclear variants were prioritised on the basis of gene panels chosen according to phenotypes, ClinVar pathogenic/likely pathogenic variants, and the top 10 prioritised variants from Exomiser. Mitochondrial DNA variants were called using an in-house pipeline and compared with a list of pathogenic variants. Copy number variants and short tandem repeats for 13 neurological disorders were also analysed. American College of Medical Genetics guidelines were followed for classification of variants. MAIN OUTCOME MEASURE: Definite or probable genetic diagnosis. RESULTS: A definite or probable genetic diagnosis was identified in 98/319 (31%) families, with an additional 6 (2%) possible diagnoses. Fourteen of the diagnoses (4% of the 319 families) explained only part of the clinical features. A total of 95 different genes were implicated. Of 104 families given a diagnosis, 39 (38%) had a mitochondrial diagnosis and 65 (63%) had a non-mitochondrial diagnosis. CONCLUSION: Whole genome sequencing is a useful diagnostic test in patients with suspected mitochondrial disorders, yielding a diagnosis in a further 31% after exclusion of common causes. Most diagnoses were non-mitochondrial disorders and included developmental disorders with intellectual disability, epileptic encephalopathies, other metabolic disorders, cardiomyopathies, and leukodystrophies. These would have been missed if a targeted approach had been taken, and some have specific treatments.

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both).
In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
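The unit-to-volume conversions reported in the findings are consistent with a fixed volume per donation. A quick check, assuming roughly 470 mL per UK whole blood unit (the abstract does not state the conversion factor) and rounding to the nearest 5 mL:

```python
ML_PER_UNIT = 470  # assumed volume of one UK whole blood unit (not stated in the abstract)

def extra_volume_ml(extra_units, ml_per_unit=ML_PER_UNIT):
    """Convert extra donations (units) over 2 years into millilitres,
    rounded to the nearest 5 mL as the abstract appears to do."""
    return 5 * round(extra_units * ml_per_unit / 5)

# Reported mean gains per donor versus the standard interval
for label, units in [("men, 8-week", 1.69), ("men, 10-week", 0.79),
                     ("women, 12-week", 0.84), ("women, 14-week", 0.46)]:
    print(label, extra_volume_ml(units), "mL")
```

Under this assumption all four reported volumes (795, 370, 395 and 215 mL) are reproduced exactly.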

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated into all aspects of health care, which should be accessible for all migrant groups and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
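Pooled prevalences quoted alongside I2 heterogeneity statistics point to a standard random-effects model. A self-contained sketch of DerSimonian-Laird pooling for proportions; the per-study prevalences and sample sizes below are hypothetical, not the review's data:

```python
def dersimonian_laird(props, ns):
    """Random-effects pooled prevalence (DerSimonian-Laird method).
    props: per-study prevalence estimates; ns: per-study sample sizes."""
    k = len(props)
    var = [p * (1 - p) / n for p, n in zip(props, ns)]   # within-study variance
    w = [1 / v for v in var]                              # fixed-effect weights
    fixed = sum(wi * p for wi, p in zip(w, props)) / sum(w)
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, props))  # Cochran's Q
    # Between-study variance estimate, truncated at zero
    tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_star = [1 / (v + tau2) for v in var]                # random-effects weights
    return sum(wi * p for wi, p in zip(w_star, props)) / sum(w_star)

# Hypothetical study-level prevalences and sample sizes
print(dersimonian_laird([0.33, 0.07, 0.25, 0.18], [120, 300, 90, 200]))
```

With heterogeneity as high as I2=98%, the between-study variance dominates the within-study variances, so the studies are weighted more nearly equally than they would be under a fixed-effect model.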

    Children must be protected from the tobacco industry's marketing tactics.


    Development and validation of a targeted gene sequencing panel for application to disparate cancers

    Next-generation sequencing has revolutionised genomic studies of cancer, having facilitated the development of precision oncology treatments based on a tumour’s molecular profile. We aimed to develop a targeted gene sequencing panel for application to disparate cancer types, with particular focus on tumours of the head and neck, and to test its utility in liquid biopsy. The final panel, designed through Roche/Nimblegen, combined 451 cancer-associated genes (2.01 Mb target region). 136 patient DNA samples were collected for performance and application testing. Panel sensitivity and precision were measured using well-characterised DNA controls (n = 47), and specificity by Sanger sequencing of the Aryl Hydrocarbon Receptor Interacting Protein (AIP) gene in 89 patients. Assessment of liquid biopsy application employed a pool of synthetic circulating tumour DNA (ctDNA). Library preparation and sequencing were conducted on Illumina-based platforms prior to analysis with our accredited (ISO15189) bioinformatics pipeline. We achieved a mean coverage of 395x, with sensitivity and specificity of >99% and precision of >97%. Liquid biopsy revealed detection down to 1.25% variant allele frequency. Application to head and neck tumours/cancers resulted in detection of mutations aligned to published databases. In conclusion, we have developed an analytically validated panel for application to cancers of disparate types, with utility in liquid biopsy.
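Sensitivity and precision for a panel like this are derived from variant calls compared against well-characterised truth sets. A minimal sketch of the two metrics; the call counts below are hypothetical, not the validation data:

```python
def sensitivity(tp, fn):
    """Fraction of true variants the panel detected (recall)."""
    return tp / (tp + fn)

def precision(tp, fp):
    """Fraction of reported variants that are real (positive predictive value)."""
    return tp / (tp + fp)

# Hypothetical variant-call counts against a truth set
tp, fp, fn = 990, 20, 8
print(round(sensitivity(tp, fn), 3), round(precision(tp, fp), 3))
```

Specificity, by contrast, needs a count of true negatives, which is ill-defined genome-wide; orthogonal confirmation such as the Sanger sequencing of AIP used here is a common workaround.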

    Implementation of corticosteroids in treating COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK: prospective observational cohort study

    BACKGROUND: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. METHODS: We did an audit of clinical implementation of corticosteroids in a prospective, observational, cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19 who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. FINDINGS: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids; rates were higher among patients requiring critical care than among those who received ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58] for 80 years or older), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. INTERPRETATION: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, independent of illness severity, chronic neurological disease, and dementia, were less likely to receive corticosteroids than those who were younger, as were pregnant women.
This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. FUNDING: UK National Institute for Health Research and UK Medical Research Council.