
    The Unmet Need for Interpreting Provision in UK Primary Care

    Background: With increasing globalisation, the challenges of providing accessible and safe healthcare to all are great. Studies show that substantial numbers of people are not fluent in English to a level where they can make best use of health services. We examined how health professionals manage language barriers in a consultation. Methods and Findings: This was a cross-sectional study in 41 UK general practices. Health professionals completed a proforma for a randomly allocated consultation session. Seventy-seven (63%) practitioners responded, from 41 (59%) practices. Of 1008 consultations, 555 involved patients who did not have English as a first language; 710 took place in English and 222 in other languages, with the practitioner communicating in the patient's own language or an alternative shared language. Seven consultations were conducted in a mixture of English and the patient's own language. Patients spoke 37 first languages other than English, whereas health practitioners declared at least a basic level of proficiency in 22 languages other than English. The practitioner's reported proficiency in the language used was at a basic level in 24 consultations, and in 21 they reported no proficiency at all. In 57 consultations a relative or friend interpreted, and in 6 a bilingual member of staff or community worker was used. A professional interpreter was booked in only 6 cases. The main limitations were that only one random session was selected and that assessment of patient and professional fluency in English was subjective. Conclusions: Professional interpreters appear to be under-used in relation to the need for them, with bilingual staff, family and friends being used commonly. In many cases where the patient spoke little or no English, the practitioner consulted in the patient's language, but this approach was also used where reported practitioner proficiency was low. Further research in different settings is needed to substantiate these findings.

    Whisker Movements Reveal Spatial Attention: A Unified Computational Model of Active Sensing Control in the Rat

    Spatial attention is most often investigated in the visual modality through measurement of eye movements, with primates, including humans, a widely studied model. Its study in laboratory rodents, such as mice and rats, requires different techniques, owing to the lack of a visual fovea and the particular ethological relevance of orienting movements of the snout and whiskers in these animals. In recent years, several reliable relationships have been observed between environmental and behavioural variables and movements of the whiskers, but the function of these responses, as well as how they integrate, remains unclear. Here, we propose a unifying abstract model of whisker movement control whose key variable is the region of space that is the animal's current focus of attention, and demonstrate, using computer-simulated behavioural experiments, that the model is consistent with a broad range of experimental observations. A core hypothesis is that the rat explicitly decodes the location in space of whisker contacts and that this representation is used to regulate whisker drive signals. This proposition stands in contrast to earlier proposals that the modulation of whisker movement during exploration is mediated primarily by reflex loops. We go on to argue that the superior colliculus is a candidate neural substrate for the siting of a head-centred map guiding whisker movement, in analogy to current models of visual attention. The proposed model has the potential to offer a more complete understanding of whisker control, as well as to highlight the potential of the rodent and its whiskers as a tool for the study of mammalian attention.

    Cancer Treatment and Bone Health

    Considerable advances in oncology over recent decades have led to improved survival, while raising concerns about the long-term consequences of anticancer treatments. In patients with breast or prostate malignancies, bone health is a major issue due to the high risk of bone metastases and the frequent prolonged use of hormone therapies that alter physiological bone turnover, leading to increased fracture risk. Thus, the onset of cancer treatment-induced bone loss (CTIBL) should be considered by clinicians, and recent guidelines should be routinely applied to these patients. In particular, baseline and periodic follow-up evaluations of bone health parameters enable the identification of patients at high risk of osteoporosis and fractures, which can be prevented by the use of bone-targeting agents (BTAs), calcium and vitamin D supplementation, and modifications of lifestyle. This review will focus upon the pathophysiology of breast and prostate cancer treatment-induced bone loss and the most recent evidence about effective preventive and therapeutic strategies.

    Risk of tuberculosis in patients with diabetes: population based cohort study using the UK Clinical Practice Research Datalink.

    BACKGROUND: Previous cohort studies have demonstrated diabetes to be a risk factor for tuberculosis (TB) disease. Public Health England has identified improved TB control as a priority area and has proposed a primary care-based screening program for latent TB. We investigated the association between diabetes and risk of tuberculosis in a UK general practice cohort in order to identify potential high-risk groups appropriate for latent TB screening. METHODS: Using data from the UK Clinical Practice Research Datalink we constructed a cohort of patients with incident diabetes. We included 222,731 patients with diabetes diagnosed from 1990 to 2013 and 1,218,616 controls without diabetes at the index date, matched for age, sex and general practice. The effect of diabetes was explored using a Poisson analysis adjusted for age, ethnicity, body mass index, socioeconomic status, alcohol intake and smoking. We explored the effects of age, diabetes duration and severity. The effects of diabetes on the risk of incident TB were explored across strata of chronic disease care defined by cholesterol and blood pressure measurement and influenza vaccination rates. RESULTS: During just under 7 million person-years of follow-up, 969 cases of TB were identified. The incidence of TB was higher amongst patients with diabetes than in the unexposed group: 16.2 and 13.5 cases per 100,000 person-years, respectively. After adjustment for potential confounders the association between diabetes and TB remained (adjusted RR 1.30, 95% CI 1.01 to 1.67, P = 0.04). There was no evidence that age, time since diagnosis or severity of diabetes affected the association between diabetes and TB. Diabetes patients with the lowest and highest rates of chronic disease management had a higher risk of TB (P < 0.001 for all comparisons). CONCLUSIONS: Diabetes, as an independent risk factor, is associated with only a modest overall increase in the risk of TB in our UK general practice cohort and is unlikely to be sufficient cause to screen for latent TB. Across different consulting patterns, diabetes patients accessing the least chronic disease care are at highest risk of TB. This article presents independent research supported by a National Institute for Health Research (NIHR) In Practice Fellowship to LP (grant number NIHR/IPF/11/05). DAJM received Wellcome Trust funding (grant number 092691/Z/10/Z). LS is supported by a Wellcome Trust Senior Research Fellowship in Clinical Science.
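The crude figures quoted in the abstract can be cross-checked directly; a minimal sketch (the rates are taken from the text, and the helper function is purely illustrative):

```python
# Crude rate ratio from the incidence rates reported in the abstract:
# 16.2 vs 13.5 TB cases per 100,000 person-years. The adjusted RR of
# 1.30 additionally accounts for age, ethnicity, BMI and other
# confounders, so the crude figure is expected to differ slightly.

def rate_ratio(rate_exposed: float, rate_unexposed: float) -> float:
    """Crude rate ratio between an exposed and an unexposed group."""
    return rate_exposed / rate_unexposed

crude_rr = rate_ratio(16.2, 13.5)  # diabetes cohort vs matched controls
print(round(crude_rr, 2))  # -> 1.2
```

The gap between the crude ratio (about 1.2) and the adjusted RR (1.30) reflects the confounder adjustment described in the methods.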

    A three-dimensional view of structural changes caused by deactivation of fluid catalytic cracking catalysts

    Since its commercial introduction three-quarters of a century ago, fluid catalytic cracking has been one of the most important conversion processes in the petroleum industry. In this process, porous composites composed of zeolite and clay crack the heavy fractions in crude oil into transportation fuel and petrochemical feedstocks. Yet, over time the catalytic activity of these composite particles decreases. Here, we report on ptychographic tomography, diffraction, and fluorescence tomography, as well as electron microscopy measurements, which elucidate the structural changes that lead to catalyst deactivation. In combination, these measurements reveal zeolite amorphization and distinct structural changes on the particle exterior as the driving forces behind catalyst deactivation. Amorphization of zeolites, in particular close to the particle exterior, results in a reduction of catalytic capacity. A concretion of the outermost particle layer into a dense amorphous silica–alumina shell further reduces the mass transport to the active sites within the composite.

    Charged-particle distributions at low transverse momentum in √s = 13 TeV pp interactions measured with the ATLAS detector at the LHC

    Measurements of distributions of charged particles produced in proton–proton collisions with a centre-of-mass energy of 13 TeV are presented. The data were recorded by the ATLAS detector at the LHC and correspond to an integrated luminosity of 151 μb⁻¹. The particles are required to have a transverse momentum greater than 100 MeV and an absolute pseudorapidity less than 2.5. The charged-particle multiplicity, its dependence on transverse momentum and pseudorapidity, and the dependence of the mean transverse momentum on multiplicity are measured in events containing at least two charged particles satisfying the above kinematic criteria. The results are corrected for detector effects and compared to the predictions from several Monte Carlo event generators.
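The pseudorapidity cut quoted above follows the standard collider definition in terms of the polar angle θ measured from the beam axis; a small sketch of that definition (illustrative only, not ATLAS analysis code):

```python
import math

def pseudorapidity(theta: float) -> float:
    """Standard definition: eta = -ln(tan(theta/2)),
    with theta the polar angle (radians) from the beam axis."""
    return -math.log(math.tan(theta / 2.0))

# |eta| < 2.5 corresponds to polar angles of roughly 9.4 to 170.6 degrees,
# i.e. tracks not too close to the beam line.
theta_at_eta_2p5 = 2.0 * math.atan(math.exp(-2.5))
print(round(math.degrees(theta_at_eta_2p5), 1))  # -> 9.4
```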

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to develop, prospectively, a pragmatic prognostic model to stratify patients according to risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model, and internal validation was carried out by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
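The headline rates follow directly from the raw counts in the abstract; a quick arithmetic check (counts taken from the text, nothing else assumed):

```python
# Crude event rates from the reported counts: 646 AKI cases and
# 84 deaths within 30 days, among 4544 patients.
aki_cases, deaths, cohort = 646, 84, 4544

aki_rate = 100.0 * aki_cases / cohort      # per cent with postoperative AKI
mortality_rate = 100.0 * deaths / cohort   # 30-day mortality, per cent

print(round(aki_rate, 1), round(mortality_rate, 1))  # -> 14.2 1.8
# "one in seven": cohort / aki_cases is roughly 7.0
```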

    Overcoming language barriers with foreign-language speaking patients: a survey to investigate intra-hospital variation in attitudes and practices

    Background Use of available interpreter services by hospital clinical staff is often suboptimal, despite evidence that trained interpreters contribute to quality of care and patient safety. Examination of intra-hospital variation in attitudes and practices regarding interpreter use can contribute to identifying factors that facilitate good practice. The purpose of this study was to describe attitudes, practices and preferences regarding communication with limited French proficiency (LFP) patients, examine how these vary across professions and departments within the hospital, and identify factors associated with good practice. Methods A self-administered questionnaire was mailed to random samples of 700 doctors, 700 nurses and 93 social workers at the Geneva University Hospitals, Switzerland. Results Seventy percent of respondents encountered LFP patients at least once a month, although this varied by department. 66% of respondents said they preferred working with ad hoc interpreters (the patient's family and bilingual staff), mainly because these were easier to access. During the 6 months preceding the study, ad hoc interpreters were used at least once by 71% of respondents, and professional interpreters were used at least once by 51%. Overall, only 9% of respondents had received any training in how and why to work with a trained interpreter. Only 23.2% of respondents said the clinical service in which they currently worked encouraged them to use professional interpreters. Respondents working in services where use of professional interpreters was encouraged were more likely to be of the opinion that the hospital should systematically provide a professional interpreter to LFP patients (40.3%) than those working in a department that discouraged use of professional interpreters (15.5%); they also used professional interpreters more often during the previous 6 months. Conclusion Attitudes and practices regarding communication with LFP patients vary across professions and hospital departments. Fostering an institution-wide culture conducive to ensuring adequate communication with LFP patients will require both the development of a hospital-wide policy and service-level activities aimed at reinforcing this policy and putting it into practice.

    Interferon Regulatory Factor-1 (IRF-1) Shapes Both Innate and CD8+ T Cell Immune Responses against West Nile Virus Infection

    Interferon regulatory factor (IRF)-1 is an immunomodulatory transcription factor that functions downstream of pathogen recognition receptor signaling and has been implicated as a regulator of type I interferon (IFN)-αβ expression and the immune response to virus infections. However, this role for IRF-1 remains controversial because altered type I IFN responses have not been systematically observed in IRF-1-/- mice. To evaluate the relationship between IRF-1 and immune regulation, we assessed West Nile virus (WNV) infectivity and the host response in IRF-1-/- cells and mice. IRF-1-/- mice were highly vulnerable to WNV infection, with enhanced viral replication in peripheral tissues and rapid dissemination into the central nervous system. Ex vivo analysis revealed a cell-type-specific antiviral role: IRF-1-/- macrophages supported enhanced WNV replication, but infection was unaltered in IRF-1-/- fibroblasts. IRF-1 also had an independent and paradoxical effect on CD8+ T cell expansion. Although markedly fewer CD8+ T cells were observed in naïve animals, as described previously, IRF-1-/- mice nonetheless rapidly expanded their pool of WNV-specific cytolytic CD8+ T cells. Adoptive transfer and in vitro proliferation experiments established both cell-intrinsic and cell-extrinsic effects of IRF-1 on the expansion of CD8+ T cells. Thus, IRF-1 restricts WNV infection by modulating the expression of innate antiviral effector molecules while shaping the antigen-specific CD8+ T cell response.