
    An Interactive Game with Virtual Reality Immersion to Improve Cultural Sensitivity in Healthcare

    Purpose: Biased perceptions of individuals who are not part of one’s in-groups tend to be negative and habitual. Because health care professionals are no less susceptible to biases than are others, the adverse impact of biases on marginalized populations in health care warrants continued attention and amelioration. Method: Two characters, a Syrian refugee with limited English proficiency and a black pregnant woman with a history of opioid use disorder, were developed for an online training simulation that includes an interactive life course experience focused on social determinants of health, and a clinical encounter in a community health center utilizing virtual reality immersion. Pre- and post-survey data were obtained from 158 health professionals who completed the simulation. Results: Post-simulation data indicated increased feelings of compassion toward the patient and decreased expectations about how difficult future encounters with the patient would be. With respect to attribution, after the simulation participants were less inclined to view the patient as primarily responsible for their situation, suggesting less impact of the fundamental attribution error. Conclusion: This training simulation aimed to utilize components of evidence-based prejudice habit-breaking interventions, such as learning more about an individual’s life experience to help minimize filling in gaps with stereotyped assumptions. Although training simulations cannot fully replicate or replace the advantages that come with real-world experience, they can heighten awareness and increase the cultural sensitivity of clinicians in the health care professions, improving health equity.

    Validation of clinical acceptability of deep-learning-based automated segmentation of organs-at-risk for head-and-neck radiotherapy treatment planning

    Introduction: Organ-at-risk segmentation for head and neck cancer radiation therapy is a complex and time-consuming process (requiring delineation of up to 42 individual structures) and may delay the start of treatment or even limit access to function-preserving care. The feasibility of using a deep learning (DL) based autosegmentation model to reduce contouring time without compromising contour accuracy was assessed through a blinded randomized trial of radiation oncologists (ROs) using retrospective, de-identified patient data. Methods: Two head and neck expert ROs used dedicated time to create gold standard (GS) contours on computed tomography (CT) images. 445 CTs were used to train a custom 3D U-Net DL model covering 42 organs-at-risk, with an additional 20 CTs held out for the randomized trial. For each held-out patient dataset, one of the eight participant ROs was randomly allocated to review and revise the contours produced by the DL model, while another reviewed contours produced by a medical dosimetry assistant (MDA), both blinded to their origin. The time required for MDAs and ROs to contour was recorded, and the unrevised DL contours, as well as the RO-revised contours from the MDAs and the DL model, were compared to the GS for that patient. Results: Mean time for initial MDA contouring was 2.3 hours (range 1.6-3.8 hours) and RO revision took 1.1 hours (range 0.4-4.4 hours), compared to 0.7 hours (range 0.1-2.0 hours) for RO revision of the DL contours. Total time was reduced by 76% (95% confidence interval [CI] 65%-88%) and RO-revision time by 35% (95% CI -39%-91%). For all geometric and dosimetric metrics computed, agreement with the GS was equivalent or significantly greater (p<0.05) for RO-revised DL contours than for RO-revised MDA contours, including the volumetric Dice similarity coefficient (VDSC), surface DSC, added path length, and the 95% Hausdorff distance. 32 OARs (76%) had a mean VDSC greater than 0.8 for the RO-revised DL contours, compared to 20 (48%) for RO-revised MDA contours and 34 (81%) for the unrevised DL OARs. Conclusion: DL autosegmentation demonstrated significant time savings for organ-at-risk contouring while improving agreement with the institutional GS, indicating comparable accuracy of the DL model. Integration into clinical practice with a prospective evaluation is currently underway.
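The volumetric Dice similarity coefficient used as the headline agreement metric above has a simple set-based definition, 2|A∩B| / (|A| + |B|). A minimal sketch (the function name and voxel-set representation are illustrative, not taken from the study's pipeline):

```python
def volumetric_dice(a, b):
    """Volumetric Dice similarity coefficient between two contours,
    each given as a collection of voxel indices: 2|A∩B| / (|A| + |B|)."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty structures agree perfectly by convention
    return 2 * len(a & b) / (len(a) + len(b))

# Two overlapping toy "structures": 2 shared voxels out of 3 + 3
print(volumetric_dice([(0, 0, 0), (0, 0, 1), (0, 1, 0)],
                      [(0, 0, 1), (0, 1, 0), (1, 0, 0)]))  # ≈ 0.667
```

A VDSC of 1.0 means perfect voxel-wise overlap; the study's 0.8 threshold is a common rule of thumb for clinically acceptable agreement.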

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
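The random-effects pooling behind prevalences like those above can be sketched with the standard DerSimonian-Laird estimator. This is a generic illustration with invented study counts, not the review's actual data or analysis code:

```python
def pooled_prevalence(studies):
    """DerSimonian-Laird random-effects pooled prevalence.
    studies: list of (events, sample_size) tuples, one per study."""
    p = [e / n for e, n in studies]                             # per-study prevalence
    v = [pi * (1 - pi) / n for pi, (_, n) in zip(p, studies)]   # binomial variances
    w = [1 / vi for vi in v]                                    # fixed-effect weights
    fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    # Cochran's Q measures heterogeneity; its excess over (k-1) df
    # yields the between-study variance tau^2
    q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, p))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    w_re = [1 / (vi + tau2) for vi in v]                        # random-effects weights
    return sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)

# Hypothetical studies: (resistant carriers, migrants screened)
print(round(pooled_prevalence([(30, 100), (10, 200), (50, 150)]), 3))
```

Because tau² inflates every study's variance equally, random-effects weights are flatter than fixed-effect weights, so no single large study dominates the pooled estimate — appropriate when I² is as high as the 92-98% reported above.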

    Microbiological testing of adults hospitalised with community-acquired pneumonia: An international study

    This study aimed to describe real-life microbiological testing of adults hospitalised with community-acquired pneumonia (CAP) and to assess concordance with the 2007 Infectious Diseases Society of America (IDSA)/American Thoracic Society (ATS) and 2011 European Respiratory Society (ERS) CAP guidelines. This was a cohort study based on the Global Initiative for Methicillin-resistant Staphylococcus aureus Pneumonia (GLIMP) database, which contains point-prevalence data on adults hospitalised with CAP across 54 countries during 2015. In total, 3702 patients were included. Testing was performed in 3217 patients, and included blood culture (71.1%), sputum culture (61.8%), Legionella urinary antigen test (30.1%), pneumococcal urinary antigen test (30.0%), viral testing (14.9%), acute-phase serology (8.8%), bronchoalveolar lavage culture (8.4%) and pleural fluid culture (3.2%). A pathogen was detected in 1173 (36.5%) patients. Testing attitudes varied significantly according to geography and disease severity. Testing was concordant with IDSA/ATS and ERS guidelines in 16.7% and 23.9% of patients, respectively. IDSA/ATS concordance was higher in Europe than in North America (21.5% versus 9.8%; p<0.01), while ERS concordance was higher in North America than in Europe (33.5% versus 19.5%; p<0.01). Testing practices of adults hospitalised with CAP varied significantly by geography and disease severity. There was a wide discordance between real-life testing practices and IDSA/ATS/ERS guideline recommendations

    Prevalence and etiology of community-acquired pneumonia in immunocompromised patients

    Background. The correct management of immunocompromised patients with pneumonia is debated. We evaluated the prevalence, risk factors, and characteristics of immunocompromised patients coming from the community with pneumonia. Methods. We conducted a secondary analysis of an international, multicenter study enrolling adult patients coming from the community with pneumonia and hospitalized in 222 hospitals in 54 countries worldwide. Risk factors for immunocompromise included AIDS, aplastic anemia, asplenia, hematological cancer, chemotherapy, neutropenia, biological drug use, lung transplantation, chronic steroid use, and solid tumor. Results. At least 1 risk factor for immunocompromise was recorded in 18% of the 3702 patients enrolled. The prevalences of risk factors significantly differed across continents and countries, with chronic steroid use (45%), hematological cancer (25%), and chemotherapy (22%) the most common. Among immunocompromised patients, community-acquired pneumonia (CAP) pathogens were the most frequently identified, and prevalences did not differ from those in immunocompetent patients. Risk factors for immunocompromise were independently associated with neither Pseudomonas aeruginosa nor non–community-acquired bacteria. Specific risk factors were independently associated with fungal infections (odds ratio for AIDS and hematological cancer, 15.10 and 4.65, respectively; both P = .001), mycobacterial infections (AIDS; P = .006), and viral infections other than influenza (hematological cancer, 5.49; P < .001). Conclusions. Our findings could be considered by clinicians in prescribing empiric antibiotic therapy for CAP in immunocompromised patients. Patients with AIDS and hematological cancer admitted with CAP may have higher prevalences of fungi, mycobacteria, and noninfluenza viruses.

    Burden and risk factors for Pseudomonas aeruginosa community-acquired pneumonia: a Multinational Point Prevalence Study of Hospitalised Patients

    Pseudomonas aeruginosa is a challenging bacterium to treat due to its intrinsic resistance to the antibiotics used most frequently in patients with community-acquired pneumonia (CAP). Data about the global burden and risk factors associated with P. aeruginosa-CAP are limited. We assessed the multinational burden and specific risk factors associated with P. aeruginosa-CAP. We enrolled 3193 patients in 54 countries with confirmed diagnosis of CAP who underwent microbiological testing at admission. Prevalence was calculated according to the identification of P. aeruginosa. Logistic regression analysis was used to identify risk factors for antibiotic-susceptible and antibiotic-resistant P. aeruginosa-CAP. The prevalence of P. aeruginosa and antibiotic-resistant P. aeruginosa-CAP was 4.2% and 2.0%, respectively. The rate of P. aeruginosa-CAP in patients with prior infection/colonisation due to P. aeruginosa and at least one of the three independently associated chronic lung diseases (i.e. tracheostomy, bronchiectasis and/or very severe chronic obstructive pulmonary disease) was 67%. In contrast, the rate of P. aeruginosa-CAP was 2% in patients without prior P. aeruginosa infection/colonisation and none of the selected chronic lung diseases. The multinational prevalence of P. aeruginosa-CAP is low. The risk factors identified in this study may guide healthcare professionals in deciding empirical antibiotic coverage for CAP patients.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
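The unadjusted version of an odds ratio like those reported above comes straight from a 2x2 table. A minimal sketch; note the published ORs are adjusted for patient and disease factors, so this crude calculation on the raw checklist counts only approximates them:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    group 1 has a events / b non-events, group 2 has c / d."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Checklist use before emergency laparotomy, middle- vs high-HDI countries
# (753 of 1242 vs 2455 of 2741, as reported above)
or_, lo, hi = odds_ratio_ci(753, 1242 - 753, 2455, 2741 - 2455)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # crude OR ≈ 0.18
```

The crude OR of about 0.18 sits close to the adjusted OR of 0.17 reported above, which suggests confounding by patient and disease factors was modest for this comparison.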

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male, and more comorbid, and to have a higher body mass index, rectal cancer, and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), which was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.

    Global, regional, and national burden of neurological disorders, 1990–2016: a systematic analysis for the Global Burden of Disease Study 2016

    Background: Neurological disorders are increasingly recognised as major causes of death and disability worldwide. The aim of this analysis from the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2016 is to provide the most comprehensive and up-to-date estimates of the global, regional, and national burden from neurological disorders. Methods: We estimated prevalence, incidence, deaths, and disability-adjusted life-years (DALYs; the sum of years of life lost [YLLs] and years lived with disability [YLDs]) by age and sex for 15 neurological disorder categories (tetanus, meningitis, encephalitis, stroke, brain and other CNS cancers, traumatic brain injury, spinal cord injury, Alzheimer's disease and other dementias, Parkinson's disease, multiple sclerosis, motor neuron diseases, idiopathic epilepsy, migraine, tension-type headache, and a residual category for other less common neurological disorders) in 195 countries from 1990 to 2016. DisMod-MR 2.1, a Bayesian meta-regression tool, was the main method of estimation of prevalence and incidence, and the Cause of Death Ensemble model (CODEm) was used for mortality estimation. We quantified the contribution of 84 risks and combinations of risk to the disease estimates for the 15 neurological disorder categories using the GBD comparative risk assessment approach. Findings: Globally, in 2016, neurological disorders were the leading cause of DALYs (276 million [95% UI 247–308]) and second leading cause of deaths (9·0 million [8·8–9·4]). The absolute number of deaths and DALYs from all neurological disorders combined increased (deaths by 39% [34–44] and DALYs by 15% [9–21]) whereas their age-standardised rates decreased (deaths by 28% [26–30] and DALYs by 27% [24–31]) between 1990 and 2016. The only neurological disorders that had a decrease in rates and absolute numbers of deaths and DALYs were tetanus, meningitis, and encephalitis. 
The four largest contributors of neurological DALYs were stroke (42·2% [38·6–46·1]), migraine (16·3% [11·7–20·8]), Alzheimer's and other dementias (10·4% [9·0–12·1]), and meningitis (7·9% [6·6–10·4]). For the combined neurological disorders, age-standardised DALY rates were significantly higher in males than in females (male-to-female ratio 1·12 [1·05–1·20]), but migraine, multiple sclerosis, and tension-type headache were more common and caused more burden in females, with male-to-female ratios of less than 0·7. The 84 risks quantified in GBD explain less than 10% of neurological disorder DALY burdens, except stroke, for which 88·8% (86·5–90·9) of DALYs are attributable to risk factors, and to a lesser extent Alzheimer's disease and other dementias (22·3% [11·8–35·1] of DALYs are risk attributable) and idiopathic epilepsy (14·1% [10·8–17·5] of DALYs are risk attributable). Interpretation: Globally, the burden of neurological disorders, as measured by the absolute number of DALYs, continues to increase. As populations are growing and ageing, and the prevalence of major disabling neurological disorders steeply increases with age, governments will face increasing demand for treatment, rehabilitation, and support services for neurological disorders. The scarcity of established modifiable risks for most of the neurological burden demonstrates that new knowledge is required to develop effective prevention and treatment strategies. Funding: Bill & Melinda Gates Foundation
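The DALY metric defined in the Methods above (DALYs = YLLs + YLDs) reduces to simple arithmetic once its inputs are estimated. A toy illustration with invented numbers, not GBD figures:

```python
def dalys(deaths, remaining_life_exp, prevalent_cases, disability_weight):
    """DALYs = YLLs + YLDs.
    YLLs: deaths x standard remaining life expectancy at age of death.
    YLDs: prevalent cases x disability weight (GBD uses prevalence-based YLDs)."""
    ylls = deaths * remaining_life_exp
    ylds = prevalent_cases * disability_weight
    return ylls + ylds

# Hypothetical disorder: 1000 deaths at a mean remaining life expectancy
# of 20 years, plus 50000 prevalent cases with disability weight 0.1
print(dalys(1000, 20, 50_000, 0.1))  # 25000.0
```

This additivity is why a rarely fatal but highly prevalent disorder such as migraine can rank second among neurological DALY contributors above: its burden is carried almost entirely by the YLD term.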

    Comprehensive Rare Variant Analysis via Whole-Genome Sequencing to Determine the Molecular Pathology of Inherited Retinal Disease

    Inherited retinal disease is a common cause of visual impairment and represents a highly heterogeneous group of conditions. Here, we present findings from a cohort of 722 individuals with inherited retinal disease, who have had whole-genome sequencing (n = 605), whole-exome sequencing (n = 72), or both (n = 45) performed, as part of the NIHR-BioResource Rare Diseases research study. We identified pathogenic variants (single-nucleotide variants, indels, or structural variants) for 404/722 (56%) individuals. Whole-genome sequencing gives unprecedented power to detect three categories of pathogenic variants in particular: structural variants, variants in GC-rich regions, which have significantly improved coverage compared to whole-exome sequencing, and variants in non-coding regulatory regions. In addition to previously reported pathogenic regulatory variants, we have identified a previously unreported pathogenic intronic variant in CHM in two males with choroideremia. We have also identified 19 genes not previously known to be associated with inherited retinal disease, which harbor biallelic predicted protein-truncating variants in unsolved cases. Whole-genome sequencing is an increasingly important comprehensive method with which to investigate the genetic causes of inherited retinal disease. This work was supported by The National Institute for Health Research England (NIHR) for the NIHR BioResource – Rare Diseases project (grant number RG65966). The Moorfields Eye Hospital cohort of patients and clinical and imaging data were ascertained and collected with the support of grants from the National Institute for Health Research Biomedical Research Centre at Moorfields Eye Hospital, National Health Service Foundation Trust, and UCL Institute of Ophthalmology, Moorfields Eye Hospital Special Trustees, Moorfields Eye Charity, the Foundation Fighting Blindness (USA), and Retinitis Pigmentosa Fighting Blindness. M.M. is a recipient of an FFB Career Development Award. E.M. is supported by the UCLH/UCL NIHR Biomedical Research Centre. F.L.R. and D.G. are supported by the Cambridge NIHR Biomedical Research Centre.