
    The feasibility and acceptability of an early intervention in primary care to prevent chronic fatigue syndrome (CFS) in adults: randomised controlled trial

    Background Chronic fatigue syndrome (CFS, also known as myalgic encephalomyelitis (ME)) is defined as fatigue that is disabling, is accompanied by additional symptoms and persists for ≥ 4 months. Treatment of CFS/ME aims to help patients manage their symptoms and make lifestyle adjustments. We do not know whether intervening early in primary care (< 4 months after onset of fatigue) can prevent the development of CFS/ME. Methods This was a feasibility randomised controlled trial with adults (age ≥ 18 years) comparing usual care with usual care plus an early intervention (EI; a combination of psycho-education and cognitive behavioural therapy, CBT). This study took place in fourteen primary care practices in Bristol, England, and aimed to identify issues around recruitment and retention for a full-scale trial. It was not powered to support statistical analysis of differences in outcomes. Integrated qualitative methodology was used to explore the feasibility and acceptability of recruitment and randomisation to the intervention. Results Forty-four patients were recruited (1 August 2012 to 28 November 2013), falling short of our predicted recruitment rate of 100 patients in 8 months. Qualitative data from GPs showed recruitment was not feasible because it was difficult to identify potential participants within 4 months of symptom onset. Some referring GPs felt screening investigations recommended by NICE were unnecessary, and they had difficulty finding patients who met the eligibility criteria. Qualitative data from some participant interviews suggested that the intervention was not acceptable in its current format. Although the majority of participants found parts of the intervention acceptable, many reported one or more problems with acceptability. Participants who discontinued the intervention or found it problematic did not relate to the therapeutic model, disliked telephone consultations or found self-reflection challenging. Conclusions A randomised controlled trial to test an early intervention for fatigue in adults in primary care is not feasible using this intervention and recruitment strategy.

    Mountain hare transcriptome and diagnostic markers as resources to monitor hybridization with European hares

    We report the first mountain hare (Lepus timidus) transcriptome, produced by de novo assembly of RNA-sequencing reads. Data were obtained from eight specimens sampled in two localities, the Alps and Ireland. The mountain hare tends to be replaced by the invading European hare (Lepus europaeus) in their numerous contact zones, where the species hybridize, which affects their gene pool to a yet unquantified degree. We characterize and annotate the mountain hare transcriptome, detect polymorphism in the two analysed populations and use previously published data on the European hare (three specimens, representing the European lineage of the species) to identify 4,672 putative diagnostic sites between the species. A subset of 85 random independent SNPs was successfully validated using PCR and Sanger sequencing. These valuable genomic resources can be used to design tools to assess population status and monitor hybridization between the species.
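    To illustrate what "putative diagnostic sites" means in practice, the minimal sketch below flags aligned sites that are fixed for different alleles in the two species. The per-site genotype layout and function name are assumptions made for illustration, not the paper's actual pipeline, which would also filter on read depth and mapping quality.

        # Minimal sketch: flag putative species-diagnostic SNPs, i.e. aligned sites
        # where all sampled mountain hares carry one allele and all European hares
        # carry a different one. Data layout is invented for illustration.

        def diagnostic_sites(timidus_sites, europaeus_sites):
            """Return indices of sites fixed for different alleles in the two species.

            Each argument is a list of per-site allele lists, one allele per individual,
            with "N" marking missing data.
            """
            hits = []
            for i, (tim, eur) in enumerate(zip(timidus_sites, europaeus_sites)):
                tim_alleles = {a for a in tim if a != "N"}
                eur_alleles = {a for a in eur if a != "N"}
                # Fixed difference: each species monomorphic, and for different alleles.
                if len(tim_alleles) == 1 and len(eur_alleles) == 1 and tim_alleles != eur_alleles:
                    hits.append(i)
            return hits

        # Toy data: site 0 is diagnostic (A vs G); site 1 is polymorphic within L. timidus.
        timidus = [["A", "A", "A"], ["C", "T", "C"]]
        europaeus = [["G", "G", "G"], ["C", "C", "C"]]
        print(diagnostic_sites(timidus, europaeus))  # -> [0]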

    Does recreational drug use influence survival and morbidity associated with laryngeal cancer?

    Background: The use of opioids is considered a risk factor for laryngeal cancer. A retrospective study was performed to explore the relationship between recreational drug exposure and laryngeal cancer. Methods: Patients diagnosed between 1 January 2013 and 31 December 2017 were identified from the Head and Neck Multidisciplinary Team database using ICD-10 C32 coding. We divided the study population into two cohorts (RD and non-RD) and compared the demographics, morbidity and outcomes of these two populations. In addition, we performed case-matched analysis to control for potential confounding factors including gender, alcohol use and cigarette smoking. Findings: 329 patients in Glasgow, Scotland, were included, with a mean age of 64.96 ± 10.94 years and a follow-up of 24 ± 13.91 months. Of these, 39 reported recreational drug use (RD). RD was associated with younger age at diagnosis with laryngeal cancer (53.0 vs. 66.6 years, p<0.001). A greater proportion of tumours occurred in the supraglottic subsite (p=0.041). Furthermore, these patients were more likely to undergo tracheostomy (RR=2.50, 95% CI: 1.41-4.44, p=0.008) and laryngectomy (RR=2.25, 95% CI: 1.57-3.21, p<0.001). Recreational drug users were also more likely to require enteral feeding support (RR=1.44, 95% CI: 1.13-1.84, p=0.02) during oncological treatment. No survival differences were noted at 1, 2 or 3 years (p log-rank = 0.83). Case-matched analysis correcting for smoking, alcohol and gender confirmed that recreational drug users were younger at diagnosis, with a predilection for the supraglottic subsite. Conclusion: Recreational drug use is associated with an increased burden of disease and morbidity in laryngeal cancer. We suggest that clinicians view recreational drug exposure as a red flag in those with suspected laryngeal cancer regardless of patient age.
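    For readers unfamiliar with the risk ratios quoted above, the short sketch below computes a relative risk and its 95% confidence interval using the usual log-normal approximation. The counts are invented for illustration and are not the study's data, and the abstract does not state which method the authors actually used.

        import math

        def relative_risk(exposed_events, exposed_total, unexposed_events, unexposed_total):
            """Relative risk and 95% CI via the standard log-normal approximation."""
            risk_exposed = exposed_events / exposed_total
            risk_unexposed = unexposed_events / unexposed_total
            rr = risk_exposed / risk_unexposed
            # Standard error of log(RR) for two independent binomial samples.
            se_log_rr = math.sqrt(
                1 / exposed_events - 1 / exposed_total
                + 1 / unexposed_events - 1 / unexposed_total
            )
            lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
            upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
            return rr, lower, upper

        # Invented counts: 20 of 39 drug users vs 66 of 290 non-users with the outcome.
        print(relative_risk(20, 39, 66, 290))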

    How do frontline staff use patient experience data for service improvement? Findings from an ethnographic case study evaluation

    Funding Information: The authors would like to thank the following: the ward teams and senior management teams at the six participating case study sites; Neil Churchill, Angela Coulter, Ray Fitzpatrick, Crispin Jenkinson, Trish Greenhalgh and Sian Rees, who were co-investigators on the study and contributed to the original design and conduct of the study; Esther Ainley and Steve Sizmur from Picker Institute Europe, who contributed to data collection and analysis; and Prof. John Gabbay and Prof. Andrée le May, University of Southampton, for facilitating the learning community meetings. The members of the lay advisory panel: Barbara Bass, Tina Longhurst, Georgina McMasters, Carol Munt, Gillian Richards, Tracey Richards, Gordon Sturmey, Karen Swaffield, Ann Tomlime and Paul Whitehouse. The external members of the Study Steering Committee: Joanna Foster, Tony Berendt, Caroline Shuldham, Joanna Goodrich, Leigh Kendall, Bernard Gudgin and Manoj Mistry. At the time of conducting the research, LL and SP were employed by the University of Oxford. Preliminary findings from the study have been presented publicly at the following conferences: European Association for Communication in Healthcare 2016; The International Society for Quality in Healthcare 2017; Health Services Research UK 2017; Medical Sociology 2018. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care. Publisher Copyright: © The Author(s) 2020. Peer reviewed. Publisher PDF.

    Reservoir theory for studying the geochemical evolution of soils

    Linking mineral weathering rates measured in the laboratory to those measured at the landscape scale is problematic. In laboratory studies, collections of minerals are exposed to the same weathering environment over a fixed amount of time. In natural soils, minerals enter, are mixed within, and leave the soil via erosion and dissolution/leaching over the course of soil formation. The key to correctly comparing mineral weathering studies from laboratory experiments and field soils is to define time consistently. To do so, we have used reservoir theory. The residence time of a mineral, as defined by reservoir theory, is the length of time between the moment the mineral enters the soil (via soil production) and the moment it leaves (via erosion and dissolution/leaching). The age of a mineral in a soil describes how long the mineral has been present in the soil. The turnover time describes the time needed to deplete a species of minerals in the soil by sediment efflux from the soil. These measures of time are found to be sensitive not only to sediment flux, which controls the mineral fluxes in and out of a soil, but also to internal soil mixing, which controls the probability that a mineral survives erosion. When these measures of time are combined with published data suggesting that a mineral's dissolution reaction rate decreases during the course of weathering, we find that internal soil mixing, by partially controlling the age distribution of minerals within a soil, might significantly alter the soil's mass loss rate via chemical weathering. Citation: Mudd, S. M., and K. Yoo (2010), Reservoir theory for studying the geochemical evolution of soils, J. Geophys. Res., 115, F03030, doi:10.1029/2009JF001591.
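    For readers unfamiliar with the reservoir-theory vocabulary used above, the relations below give the standard steady-state definitions; the notation (M for mineral mass in the soil, F_out for the efflux by erosion and dissolution/leaching) is illustrative and not taken from the paper.

        \[
          \tau = \frac{M}{F_{\mathrm{out}}} \quad \text{(turnover time: time to deplete the stock at the current efflux)}
        \]
        \[
          \bar{T} = \int_0^{\infty} t\,\psi(t)\,dt \quad \text{(mean residence time, with } \psi(t) \text{ the distribution of entry-to-exit times)}
        \]
        \[
          \bar{A} = \int_0^{\infty} a\,\phi(a)\,da \quad \text{(mean age, with } \phi(a) \text{ the age distribution of minerals currently in the soil)}
        \]

    For a perfectly mixed reservoir at steady state both distributions are exponential with mean tau, so mean residence time, mean age and turnover time coincide; incomplete soil mixing, as discussed in the abstract, skews the age distribution and so separates these quantities.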

    Understanding how front-line staff use patient experience data for service improvement: an exploratory case study evaluation

    Background and aim: The NHS collects a large amount of data on patient experience, but there are concerns that it does not use this information to improve care. This study explored whether or not, and how, front-line staff use patient experience data for service improvement.

    Methods: Phase 1 – secondary analysis of existing national survey data, and a new survey of NHS trust patient experience leads. Phase 2 – case studies in six medical wards using ethnographic observations and interviews. A baseline and a follow-up patient experience survey were conducted on each ward, supplemented by in-depth interviews. Following an initial learning community to discuss approaches to learning from and improving patient experience, teams developed and implemented their own interventions. Emerging findings from the ethnographic research were shared formatively. Phase 3 – dissemination, including an online guide for NHS staff.

    Key findings: Phase 1 – an analysis of staff and inpatient survey results for all 153 acute trusts in England was undertaken, and 57 completed surveys were obtained from patient experience leads. The most commonly cited barrier to using patient experience data was a lack of staff time to examine the data (75%), followed by cost (35%), lack of staff interest/support (21%) and too much data (21%). Trusts were grouped in a matrix of high, medium and low performance across several indices to inform case study selection. Phase 2 – in every site, staff undertook quality improvement projects using a range of data sources. The number and scale of these varied, as did the extent to which they drew directly on patient experience data, and the extent of involvement of patients. Before-and-after surveys of patient experience showed little statistically significant change.

    Making sense of patient experience ‘data’: Staff were engaged in a process of sense-making from a range of formal and informal sources of intelligence. Survey data remain the most commonly recognised and used form of data. ‘Soft’ intelligence, such as patient stories, informal comments and the daily ward experiences of staff, patients and family, also fed into staff’s improvement plans, but they and the wider organisation may not recognise these as ‘data’, and staff may lack confidence in using them for improvement. Staff could not always point to a specific source of patient experience ‘data’ that led to a particular project, and sometimes reported acting on what they felt they already knew needed changing.

    Staff experience as a route to improving patient experience: Some sites focused on staff motivation and experience on the assumption that this would improve patient experience through indirect cultural and attitudinal change, and by making staff feel empowered and supported. Staff participants identified several potential interlinked mechanisms: (1) motivated staff provide better care, (2) staff who feel taken seriously are more likely to be motivated, (3) involvement in quality improvement is itself motivating and (4) improving patient experience can directly improve staff experience.

    ‘Team-based capital’ in NHS settings: We propose ‘team-based capital’ in NHS settings as a key mechanism between the contexts in our case studies and observed outcomes. ‘Capital’ is the extent to which staff command varied practical, organisational and social resources that enable them to set agendas, drive process and implement change. These include not just material or economic resources, but also status, time, space, relational networks and influence. Teams involving a range of clinical and non-clinical staff from multiple disciplines and levels of seniority could assemble a greater range of capital; progress was generally greater when the team included individuals from the patient experience office.

    Phase 3 – an online guide for NHS staff was produced in collaboration with The Point of Care Foundation.

    Limitations: This was an ethnographic study of how and why NHS front-line staff do or do not use patient experience data for quality improvement. It was not designed to demonstrate whether particular types of patient experience data or quality improvement approaches are more effective than others.

    Future research: Developing and testing interventions focused specifically on staff but with patient experience as the outcome, with a health economics component; studies focusing on the effect of team composition and diversity on the impact and scope of patient-centred quality improvement; and research into using unstructured feedback and soft intelligence.

    Divergent tree seedling communities indicate different trajectories of change among rainforest remnants

    Aim: To examine plant community composition within rain forest remnants, and whether communities in different fragments follow similar trajectories of change in composition. We investigate whether plant communities in rain forest fragments either diverge from, or become more similar to, plant communities in other fragments, in order to understand the biodiversity value of forest fragments. Location: Rain forest fragments embedded within agricultural landscapes in Sabah, Malaysian Borneo. Methods: We examined 14 forest fragments (39–120,000 ha) and five sites in continuous forest, and compared pre-isolation (trees >5 cm dbh) and post-isolation (seedlings <1 cm dbh) plant community composition. We used the Chao-Sørensen dissimilarity metric to compute beta diversity between all pairwise combinations of sites, and then used non-metric multidimensional scaling to reduce the 18 pairwise values per site to a single site value, which we used to test whether fragment area and/or isolation are associated with changes in plant communities. We compare analyses for trees and seedlings, and assess whether community changes arise from recruitment failure. Results: Seedlings in fragments have diverged most from other communities, and divergence was greatest between seedling communities in small fragments, which have diverged more not only from tree communities in the same fragment, but also from seedling communities at other sites. This finding is partly associated with recruitment failure: the number of genera represented by both trees and seedlings is positively associated with site area. Main conclusions: Seedling communities are diverging in forest remnants, associated primarily with reductions in fragment area, whilst tree communities have not diverged, possibly due to extinction debts. Divergence is likely to continue as seedling cohorts mature, resulting in communities in fragments following different trajectories of change. Individual plant communities in each fragment may become impoverished, but they can support different communities of plants and hence contribute to landscape-scale diversity.
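    To make the ordination step concrete, the sketch below computes pairwise dissimilarities from a small invented presence/absence matrix and collapses them to a single axis with non-metric multidimensional scaling via scikit-learn. Note that it uses the classic incidence-based Sørensen index rather than the abundance-based Chao-Sørensen estimator reported in the paper, and the data and site labels are made up for illustration.

        import numpy as np
        from sklearn.manifold import MDS

        # Invented genus presence/absence matrix: rows are sites, columns are genera.
        X = np.array([
            [1, 1, 1, 0, 1],   # continuous forest
            [1, 1, 0, 0, 1],   # large fragment
            [0, 1, 0, 1, 0],   # small fragment
        ])

        # Classic Sorensen dissimilarity between every pair of sites
        # (the paper uses the abundance-based Chao-Sorensen estimator instead).
        n_sites = X.shape[0]
        D = np.zeros((n_sites, n_sites))
        for i in range(n_sites):
            for j in range(n_sites):
                shared = np.sum((X[i] == 1) & (X[j] == 1))
                D[i, j] = 1.0 - 2.0 * shared / (X[i].sum() + X[j].sum())

        # Non-metric MDS on the precomputed dissimilarities, reduced to one axis,
        # giving a single ordination score per site analogous to the per-site value
        # regressed against fragment area and isolation in the paper.
        nmds = MDS(n_components=1, metric=False, dissimilarity="precomputed", random_state=0)
        site_scores = nmds.fit_transform(D).ravel()
        print(site_scores)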

    The Innate Immune Receptor NLRX1 Functions as a Tumor Suppressor by Reducing Colon Tumorigenesis and Key Tumor-Promoting Signals

    NOD-like receptor (NLR) proteins are intracellular innate immune sensors/receptors that regulate immunity. This work shows that NLRX1 serves as a tumor suppressor in colitis-associated cancer (CAC) and sporadic colon cancer by keeping key tumor-promoting pathways in check. Nlrx1(-/-) mice were highly susceptible to CAC, showing increases in key cancer-promoting pathways including nuclear factor κB (NF-κB), mitogen-activated protein kinase (MAPK), signal transducer and activator of transcription 3 (STAT3), and interleukin 6 (IL-6). The tumor-suppressive function of NLRX1 originated primarily from the non-hematopoietic compartment. This prompted an analysis of NLRX1 function in the Apc(min/+) genetic model of sporadic gastrointestinal cancer. NLRX1 attenuated Apc(min/+) colon tumorigenesis, cellular proliferation, NF-κB, MAPK and STAT3 activation, and IL-6 levels. Application of anti-interleukin 6 receptor (IL6R) antibody therapy reduced tumor burden, increased survival, and reduced STAT3 activation in Nlrx1(-/-)Apc(min/+) mice. As an important clinical correlate, human colon cancer samples expressed lower levels of NLRX1 than healthy controls in multiple patient cohorts. These data implicate anti-IL6R as a potential personalized therapy for colon cancers with reduced NLRX1.

    MAVS-dependent host species range and pathogenicity of human hepatitis A virus

    Although hepatotropic viruses are important causes of human disease, the intrahepatic immune response to hepatitis viruses is poorly understood due to a lack of tractable small animal models. Here we describe a murine model of hepatitis A virus (HAV) infection that recapitulates critical features of type A hepatitis in humans. We demonstrate that the capacity of HAV to evade MAVS-mediated type I interferon responses defines its host species range. HAV-induced liver injury was associated with interferon-independent intrinsic hepatocellular apoptosis and hepatic inflammation that, unexpectedly, results from MAVS and IRF3/7 signaling. This murine model thus reveals a previously undefined link between innate immune responses to virus infection and acute liver injury, providing a new paradigm for viral pathogenesis in the liver.