213 research outputs found

    Causes of death in people with liver cirrhosis in England compared with the general population: a population-based cohort study.

    OBJECTIVES: There is a need for unbiased estimates of cause-specific mortality by etiology in patients with liver cirrhosis. The aim of this study was to use nationwide linked electronic routine healthcare data from primary and secondary care, alongside national death registry data, to report such estimates. METHODS: From the linked Clinical Practice Research Datalink (CPRD) and English Hospital Episode Statistics, with linkage to the Office for National Statistics death register, we identified adults with an incident diagnosis of liver cirrhosis between 1998 and 2009. Age-matched controls were selected from the CPRD general population. We calculated the cumulative incidence (adjusting for competing risks) and the excess risk of death by 5 years from diagnosis for different causes of death, stratified by etiology and stage of disease. RESULTS: Five thousand one hundred and eighteen patients with cirrhosis were matched to 152,903 controls. Among compensated patients, the 5-year excess risk of liver-related death was higher than that of any other cause of death for all etiologies except unspecified etiology. For example, patients of alcohol etiology had a 30.8% excess risk of liver-related death (95% confidence interval (CI): 27.9%, 33.1%) compared with a 9.9% excess risk of non-liver-related death. However, patients of unspecified etiology had a higher excess risk of non-liver-related than of liver-related death (10.7% vs. 6.7%), driven by a high excess risk of death from non-liver neoplasms (7.7%, 95% CI: 5.9%, 9.5%). All decompensated patients had a higher excess risk of liver-related mortality than of any other cause. CONCLUSIONS: To reduce mortality among people with liver cirrhosis, care pathways need to be tailored to the etiology and stage of the disease.
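
    The key quantity here is the cause-specific cumulative incidence under competing risks, with excess risk taken as the difference between patients and matched controls. A minimal sketch of that calculation, assuming the Python lifelines package and hypothetical column names (not the authors' code):

        # Sketch: 5-year cause-specific cumulative incidence under competing risks,
        # and excess risk versus matched controls. Column names are hypothetical.
        from lifelines import AalenJohansenFitter

        def cuminc_at(df, years=5, cause=1):
            """Aalen-Johansen cumulative incidence for one cause of death.
            'event' coding: 0 = censored, 1 = cause of interest, 2 = competing cause."""
            ajf = AalenJohansenFitter()
            ajf.fit(df["time_years"], df["event"], event_of_interest=cause)
            cif = ajf.cumulative_density_          # indexed by time since diagnosis
            return cif[cif.index <= years].iloc[-1, 0]

        # excess risk = cumulative incidence in cirrhosis patients minus controls
        # cases, controls = ...  (hypothetical data frames)
        # excess = cuminc_at(cases) - cuminc_at(controls)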

    Socioeconomic variation in the incidence of childhood coeliac disease in the UK.

    BACKGROUND: Serological studies indicate that evidence of coeliac disease (CD) exists in about 1% of all children, but estimates of current diagnostic patterns among children, and of how they vary by socioeconomic group, are lacking. METHODS: We identified all children aged 0-18 years between 1993 and 2012 who were registered with general practices across the UK contributing to a large population-based general practice database. The incidence of CD was evaluated in each quintile of the Townsend index of deprivation and stratified by age, sex, country and calendar year. RESULTS: Among 2,063,421 children, we identified 1247 CD diagnoses, corresponding to an overall CD incidence of 11.9 per 100,000 person-years, which was similar across the UK countries and higher in girls than in boys. We found a gradient of CD diagnosis across socioeconomic groups, with the rate of diagnosis being 80% higher in children from the least-deprived areas than in those from the most-deprived areas (incidence rate ratio 1.80, 95% CI 1.45 to 2.22). This pattern held for both boys and girls and across all ages, and similar associations between CD and socioeconomic status were found in all four UK countries. While CD incidence up to age 2 remained stable over the study period, diagnoses at older ages almost tripled over the 20 years studied. CONCLUSIONS: Children living in less socioeconomically deprived areas of the UK are more likely to be diagnosed with CD. Increased implementation of diagnostic guidelines could result in better case identification in more-deprived areas.
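
    As a worked illustration of the incidence rate ratio quoted above, the point estimate and a Wald-type 95% CI can be derived from case counts and person-years; the counts below are hypothetical and chosen only to give a ratio near 1.8:

        # Sketch: incidence rate ratio (IRR) and 95% CI from counts and person-years.
        # The numbers are made up for illustration only.
        import math

        def irr_ci(cases1, py1, cases0, py0, z=1.96):
            irr = (cases1 / py1) / (cases0 / py0)
            se_log = math.sqrt(1 / cases1 + 1 / cases0)   # SE of log(IRR), Poisson assumption
            lo = math.exp(math.log(irr) - z * se_log)
            hi = math.exp(math.log(irr) + z * se_log)
            return irr, lo, hi

        # e.g. least-deprived vs most-deprived quintile (hypothetical counts)
        print(irr_ci(cases1=300, py1=2_000_000, cases0=170, py0=2_040_000))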

    Barriers to initiation of antiretroviral treatment in rural and urban areas of Zambia: a cross-sectional study of cost, stigma, and perceptions about ART

    BACKGROUND: While the number of HIV-positive patients on antiretroviral therapy (ART) in resource-limited settings has increased dramatically, some patients eligible for treatment do not initiate ART even when it is available to them. Understanding why patients opt out of care, or are unable to opt in, is important to achieving the goal of universal access. METHODS: We conducted a cross-sectional survey among 400 patients on ART (those who were able to access care) and 400 patients receiving home-based care (HBC) who had not initiated ART (either because they were unable to, or chose not to, access care) in two rural and two urban sites in Zambia, to identify barriers to and facilitators of ART uptake. RESULTS: HBC patients were about 50% more likely than those on ART to report that it would be very difficult to get to the ART clinic (RR: 1.48; 95% CI: 1.21-1.82). Stigma was common in all areas: 54% of HBC patients, but only 15% of ART patients, were afraid to go to the clinic (RR: 3.61; 95% CI: 3.12-4.18). Cost barriers differed by location. Urban HBC patients were nearly three times as likely as those on ART to report needing to pay to travel to the clinic (RR: 2.84; 95% CI: 2.02-3.98) and almost 10 times as likely to believe they would need to pay a fee at the clinic (RR: 9.50; 95% CI: 2.24-40.3). In rural areas, HBC subjects were more likely than those on ART to report needing to pay non-transport costs to attend the clinic (RR: 4.52; 95% CI: 1.91-10.7). HBC patients were twice as likely as ART patients to report concern that they would not have enough food to take ART (27% vs. 13%; RR: 2.03; 95% CI: 1.71-2.41), regardless of location and gender. CONCLUSIONS: Patients in home-based care for HIV/AIDS who never initiated ART perceived greater financial and logistical barriers to seeking HIV care and had more negative perceptions about the benefits of treatment. Future efforts to expand access to antiretroviral care should consider ways to reduce these barriers so that more of those medically eligible for antiretrovirals initiate care.
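
    The relative risks above compare proportions between the two survey groups. A minimal, unadjusted sketch of that calculation (the published estimates come from the study's own models, so the confidence intervals will not match exactly):

        # Sketch: risk ratio (RR) and Wald 95% CI from two proportions, as in the
        # comparison of HBC vs ART patients. Counts are hypothetical and unadjusted.
        import math

        def rr_ci(x1, n1, x0, n0, z=1.96):
            p1, p0 = x1 / n1, x0 / n0
            rr = p1 / p0
            se_log = math.sqrt((1 - p1) / x1 + (1 - p0) / x0)  # SE of log(RR)
            return rr, math.exp(math.log(rr) - z * se_log), math.exp(math.log(rr) + z * se_log)

        # e.g. "afraid to go to the clinic": 54% of 400 HBC vs 15% of 400 ART patients
        print(rr_ci(216, 400, 60, 400))   # point estimate ~3.6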

    The use of a Bayesian hierarchy to develop and validate a co-morbidity score to predict mortality for linked primary and secondary care data from the NHS in England

    Background: We assessed whether linkage between routine primary and secondary care records provides an opportunity to develop an improved population-based co-morbidity score that combines information on co-morbidities from both healthcare settings. Methods: We extracted all people older than 20 years at the start of 2005 within the linkage between the Hospital Episode Statistics, the Clinical Practice Research Datalink and the Office for National Statistics death register in England. A random 50% sample was used to identify relevant diagnostic codes using a Bayesian hierarchy to share information between similar Read and ICD-10 code groupings. Internal validation of the score was performed in the remaining 50%, and discrimination was assessed using Harrell’s C statistic. Comparisons were made over time, age and consultation rate with the Charlson and Elixhauser indexes. Results: 657,264 people were followed up from 1 January 2005. The Bayesian hierarchy yielded 98 groupings of codes, of which 37 had an adjusted weighting greater than zero in the Cox proportional hazards model; 11 of these groupings had a different weighting depending on whether they were coded in hospital or in primary care. The C statistic decreased from 0.88 (95% confidence interval 0.88–0.88) in the first year of follow-up to 0.85 (0.85–0.85) over all 5 years. When we stratified the linked score by consultation rate, the association with mortality remained consistent, but there was a significant interaction with age, with improved discrimination and fit in those under 50 years old (C=0.85, 0.83–0.87) compared with the Charlson (C=0.79, 0.77–0.82) or Elixhauser index (C=0.81, 0.79–0.83). Conclusions: Linked population-based primary and secondary care data allowed the development of a co-morbidity score with improved discrimination, particularly in younger age groups, and a greater effect when adjusting for co-morbidity than existing scores.
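
    A minimal sketch of the validation step described above: fitting a Cox proportional hazards model to a score in a development sample and computing Harrell's C statistic in a held-out sample, assuming the Python lifelines package and hypothetical column names (not the authors' code):

        # Sketch: discrimination of a co-morbidity score via a Cox model and Harrell's C.
        # `train` / `test` data frames with columns 'followup_years', 'died', 'score'
        # are hypothetical stand-ins for the 50% development / validation samples.
        import numpy as np
        from lifelines import CoxPHFitter
        from lifelines.utils import concordance_index

        def harrells_c(train, test):
            cph = CoxPHFitter()
            cph.fit(train[["followup_years", "died", "score"]],
                    duration_col="followup_years", event_col="died")
            # higher predicted hazard should mean shorter survival, hence the minus sign
            risk = np.asarray(cph.predict_partial_hazard(test)).ravel()
            return concordance_index(test["followup_years"], -risk, test["died"])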

    Production of Virus-Derived Ping-Pong-Dependent piRNA-like Small RNAs in the Mosquito Soma

    The natural maintenance cycles of many mosquito-borne pathogens require the establishment of persistent, non-lethal infections in the invertebrate host. The mechanism by which this occurs is not well understood, but we have previously shown that an antiviral response directed by small interfering RNAs (siRNAs) is important in modulating the pathogenesis of alphavirus infections in the mosquito. Here we report that infection of mosquitoes with an alphavirus also triggers the production of another class of virus-derived small RNAs that exhibit many similarities to ping-pong-dependent Piwi-interacting RNAs (piRNAs). However, unlike the ping-pong-dependent piRNAs described previously from repetitive elements or piRNA clusters, our work suggests that these small RNAs are produced in the soma. We also present evidence suggesting that virus-derived piRNA-like small RNAs are capable of modulating the pathogenesis of alphavirus infections in dicer-2 null mutant mosquito cell lines defective in viral siRNA production. Overall, our results suggest that a non-canonical piRNA pathway is present in the soma of vector mosquitoes and may act redundantly to the siRNA pathway to target alphavirus replication.

    Arbovirus-Derived piRNAs Exhibit a Ping-Pong Signature in Mosquito Cells

    The siRNA pathway is an essential antiviral mechanism in insects; whether other RNA interference pathways are involved in antiviral defense remains unclear. Here we report that cells derived from the two main arbovirus vectors, Aedes albopictus and Aedes aegypti, produce viral small RNAs that exhibit the hallmarks of ping-pong-derived Piwi-interacting RNAs (piRNAs) after infection with positive- or negative-sense RNA viruses. Furthermore, these cells produce endogenous piRNAs that map to transposable elements. Our results show that these mosquito cells can initiate de novo piRNA production and recapitulate the ping-pong-dependent piRNA pathway upon viral infection. The mechanism of viral piRNA production is discussed.
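
    The ping-pong signature referred to in this and the previous abstract is usually detected as an enrichment of 10-nucleotide overlaps between the 5' ends of sense and antisense small-RNA reads. A minimal sketch of that calculation, with hypothetical read-position inputs:

        # Sketch: ping-pong signature as the distribution of 5'-5' overlaps between
        # sense and antisense small-RNA reads; a peak at 10 nt is the classic signature.
        from collections import Counter

        def overlap_histogram(sense_5p, antisense_5p, max_overlap=30):
            """sense_5p / antisense_5p: genomic coordinates of read 5' ends
            (antisense 5' ends given as plus-strand coordinates). Hypothetical inputs."""
            antisense = Counter(antisense_5p)
            hist = Counter()
            for s in sense_5p:
                for k in range(1, max_overlap + 1):
                    # an antisense read whose 5' end lies k-1 nt downstream of the
                    # sense 5' end overlaps it by exactly k nucleotides
                    hist[k] += antisense[s + k - 1]
            return hist

        # hist = overlap_histogram(sense_positions, antisense_positions)
        # a sharp excess at hist[10] suggests ping-pong amplification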

    Validation of the Cognitive Assessment of Later Life Status (CALLS) instrument: a computerized telephonic measure

    Background: Brief screening tests have been developed to measure cognitive performance and dementia, yet they measure limited cognitive domains and often lack construct validity. Neuropsychological assessments, while comprehensive, are too costly and time-consuming for epidemiological studies. This study’s aim was to develop a psychometrically valid, telephone-administered test of cognitive function in aging. Methods: Using a sequential hierarchical strategy, each stage of test development did not proceed until specified criteria were met. The 30-minute Cognitive Assessment of Later Life Status (CALLS) measure and a 2.5-hour in-person neuropsychological assessment were administered to a randomly selected sample of 211 participants aged 65 years and older that included equivalent distributions of men and women from ethnically diverse populations. Results: The overall Cronbach's coefficient alpha for the CALLS test was 0.81. A principal component analysis of the CALLS tests yielded five components. The CALLS total score was significantly correlated with four neuropsychological assessment components. Older age and having a high school education or less were significantly correlated with lower CALLS total scores. Females scored better overall than males, and there were no score differences based on race. Conclusion: The CALLS test is a valid measure that provides a unique opportunity to reliably and efficiently study cognitive function in large populations.
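
    For reference, the Cronbach's coefficient alpha reported above is a simple function of the item-score matrix. A minimal sketch, assuming a respondents-by-items NumPy array of hypothetical data:

        # Sketch: Cronbach's alpha for an items matrix (rows = respondents, cols = items).
        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]                                  # number of items
            item_vars = items.var(axis=0, ddof=1).sum()         # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)           # variance of total score
            return (k / (k - 1)) * (1 - item_vars / total_var)

        # e.g. cronbach_alpha(np.random.rand(211, 20))  # hypothetical 211 x 20 matrix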

    Monitoring frequency influences the analysis of resting behaviour in a forest carnivore

    Resting sites are key structures for many mammalian species, affecting reproduction, survival, population density, and even species persistence in human-modified landscapes. As a consequence, an increasing number of studies have estimated patterns of resting site use by mammals, as well as the processes underlying these patterns, yet the impact of sampling design on such estimates remains poorly understood. Here we address this issue empirically, based on data from 21 common genets radiotracked over 28 months in Mediterranean forest landscapes. Daily radiotracking data were thinned to simulate every-other-day and weekly monitoring frequencies, and then used to evaluate the impact of sampling regime on estimates of resting site use. Results showed that lower monitoring frequencies led to major underestimates of the average number of resting sites per animal and of site reuse rates and sharing frequency, though no effect was detected on the percentage use of resting site types. Monitoring frequency also had a major impact on estimates of environmental effects on resting site selection, with decreasing monitoring frequencies resulting in higher model uncertainty and reduced power to identify significant explanatory variables. Our results suggest that variation in monitoring frequency may have had a strong impact on the intra- and interspecific differences in resting site use patterns reported in previous studies. Given the errors and uncertainties associated with low monitoring frequencies, we recommend daily or at least every-other-day monitoring whenever possible in studies estimating resting site use patterns in mammals.
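
    A minimal sketch of the thinning procedure described above: subsampling daily fixes to every-other-day or weekly schedules and recounting the distinct resting sites detected per animal, with a hypothetical data frame of fixes (not the authors' code):

        # Sketch: thin daily resting-site fixes to lower monitoring frequencies and
        # compare the number of distinct resting sites detected per animal.
        # `fixes` with columns 'animal', 'date', 'site_id' is a hypothetical data frame.
        import pandas as pd

        def sites_per_animal(fixes, every_n_days=1):
            thinned = (fixes.sort_values("date")
                            .groupby("animal", group_keys=False)
                            .apply(lambda g: g.iloc[::every_n_days]))
            return thinned.groupby("animal")["site_id"].nunique()

        # daily = sites_per_animal(fixes, 1); weekly = sites_per_animal(fixes, 7)
        # weekly counts are expected to underestimate the daily ones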

    A randomised controlled trial of a brief online mindfulness-based intervention on paranoia in a non-clinical sample

    Paranoia is common and distressing in the general population and can impact health, emotional well-being and social functioning, such that effective interventions are needed. Brief online mindfulness-based interventions (MBIs) have been shown to reduce symptoms of anxiety and depression in non-clinical samples; however, no research to date has investigated whether they can reduce paranoia. The current study explored whether a brief online MBI increased levels of mindfulness and reduced levels of paranoia in a non-clinical population. The mediating effect of mindfulness on any changes in paranoia was also investigated. One hundred and ten participants were randomly allocated either to a two-week online MBI, including 10 minutes of daily guided mindfulness practice, or to a waitlist control condition. Measures of mindfulness and paranoia were administered at baseline, post-intervention and one-week follow-up. Participants in the MBI group displayed significantly greater reductions in paranoia than the waitlist control group. Mediation analysis demonstrated that change in mindfulness skills (specifically the observe, describe and nonreact facets of the Five Facet Mindfulness Questionnaire, FFMQ) mediated the relationship between intervention type and change in levels of paranoia. This study provides evidence that a brief online MBI can significantly reduce levels of paranoia in a non-clinical population, and that increases in mindfulness skills from such an intervention can mediate reductions in non-clinical paranoia. The limitations of the study are discussed.
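
    A minimal sketch of a product-of-coefficients mediation analysis of the kind described, regressing change in paranoia on group via change in mindfulness with statsmodels; the variable names are hypothetical and this is not the authors' exact procedure:

        # Sketch: simple mediation (a*b product of coefficients) with statsmodels.
        # `df` with columns 'group' (0 = waitlist, 1 = MBI), 'd_mindful' (change in
        # mindfulness) and 'd_paranoia' (change in paranoia) is hypothetical.
        import statsmodels.formula.api as smf

        def mediation_ab(df):
            a = smf.ols("d_mindful ~ group", data=df).fit().params["group"]       # path a
            model_b = smf.ols("d_paranoia ~ group + d_mindful", data=df).fit()
            b = model_b.params["d_mindful"]                                       # path b
            c_prime = model_b.params["group"]                                     # direct effect
            return {"indirect": a * b, "direct": c_prime}

        # a bootstrap over rows of df would normally be used to get a CI for a*b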

    SBP-domain transcription factors as possible effectors of cryptochrome-mediated blue light signalling in the moss Physcomitrella patens

    Cryptochromes are blue-light-absorbing photoreceptors found in many organisms and involved in numerous developmental processes. At least two highly similar cryptochromes are known to affect branching during gametophytic development in the moss Physcomitrella patens. We uncovered a relationship between these cryptochromes and the expression of particular members of the SBP-box genes, a plant-specific transcription factor family. Transcript levels of the respective moss SBP-box genes, all belonging to the LG1 subfamily, were found to depend, albeit not exclusively, on blue light. Moreover, disruptant lines generated for two moss representatives of this SBP-box gene subfamily both showed enhanced caulonema side branch formation, a phenotype opposite to that of the ppcry1a/1b double disruptant line. In this report we show that PpCRY1a and PpCRY1b act negatively on the transcript levels of several related moss SBP-box genes and that at least PpSBP1 and PpSBP4 act as negative regulators of side branch formation.