
    Socioeconomic status is associated with symptom severity and sickness absence in people with infectious intestinal disease in the UK

    BACKGROUND: The burden of infectious intestinal disease (IID) in the UK is substantial. Negative consequences including sickness absence are common, but little is known about the social patterning of these outcomes, or the extent to which they relate to disease severity. METHODS: We performed a cross-sectional analysis using IID cases identified from a large population-based survey, to explore the association between socioeconomic status (SES) and symptom severity and sickness absence, and to assess the role of symptom severity in the relationship between SES and absence. Regression modelling was used to investigate these associations, whilst controlling for potential confounders such as age, sex and ethnicity. RESULTS: Among 1164 cases, those of lower SES versus high had twice the odds of experiencing severe symptoms (OR 2.2, 95% CI 1.66–2.87). Lower SES was also associated with higher odds of sickness absence (OR 1.8, 95% CI 1.26–2.69); however, this association was attenuated after adjusting for symptom severity (OR 1.4, 95% CI 0.92–2.07). CONCLUSIONS: In a large sample of IID cases, those of low SES versus high were more likely to report severe symptoms and sickness absence, with greater severity largely explaining the higher absence. Public health interventions are needed to address the unequal consequences of IID identified.
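
    In outline, the analysis described above is an adjusted logistic regression, with symptom severity then added as a mediator of the SES–absence association. A minimal sketch of that kind of workflow in Python (statsmodels) follows; the file name and column names (ses, severe, absence, age, sex, ethnicity) are assumptions for illustration, not the authors' data or code.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        cases = pd.read_csv("iid_cases.csv")  # hypothetical case-level dataset

        # Odds of severe symptoms for lower vs high SES, adjusted for age, sex, ethnicity
        severity_fit = smf.logit(
            "severe ~ C(ses, Treatment(reference='high')) + age + C(sex) + C(ethnicity)",
            data=cases,
        ).fit()

        # Odds of sickness absence by SES, before and after adding severity (mediation check)
        absence_fit = smf.logit(
            "absence ~ C(ses, Treatment(reference='high')) + age + C(sex) + C(ethnicity)",
            data=cases,
        ).fit()
        absence_adj_fit = smf.logit(
            "absence ~ severe + C(ses, Treatment(reference='high')) + age + C(sex) + C(ethnicity)",
            data=cases,
        ).fit()

        # Odds ratios with 95% confidence intervals
        print(np.exp(severity_fit.params), np.exp(severity_fit.conf_int()), sep="\n")
        print(np.exp(absence_adj_fit.params), np.exp(absence_adj_fit.conf_int()), sep="\n")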

    Socioeconomic status and infectious intestinal disease in the community: a longitudinal study (IID2 study).

    Infectious intestinal diseases (IID) are common, affecting around 25% of people in the UK each year, at an estimated annual cost to the economy, individuals and the NHS of £1.5 billion. While there is evidence of higher IID hospital admissions in more disadvantaged groups, the association between socioeconomic status (SES) and risk of IID remains unclear. This study aims to investigate the relationship between SES and IID in a large community cohort. A longitudinal analysis of a prospective community cohort in the UK following 6836 participants of all ages was undertaken. Hazard ratios for IID by SES were estimated using Cox proportional hazards models, adjusting for follow-up time and potential confounding factors. In the fully adjusted analysis, the hazard ratio of IID was significantly lower among routine/manual occupations compared with managerial/professional occupations (HR 0.74, 95% CI 0.61–0.90). In this large community cohort, lower SES was associated with lower IID risk. This may be partially explained by the low response rate, which varied by SES. However, it may also be related to differences in exposure or in recognition of IID symptoms by SES. Higher hospital admissions associated with lower SES observed in some studies could relate to more severe consequences, rather than increased infection risk.
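
    The general shape of a Cox proportional hazards analysis like the one summarised above can be sketched as follows in Python with lifelines; the file name and column names (followup_weeks, iid_event, occupational_group) are assumptions, not the study's actual variables.

        import pandas as pd
        from lifelines import CoxPHFitter

        cohort = pd.read_csv("iid2_cohort.csv")  # hypothetical participant-level file
        columns = ["followup_weeks", "iid_event", "age", "sex", "occupational_group"]
        design = pd.get_dummies(cohort[columns], columns=["sex", "occupational_group"],
                                drop_first=True, dtype=float)

        cph = CoxPHFitter()
        cph.fit(design, duration_col="followup_weeks", event_col="iid_event")
        cph.print_summary()  # exp(coef) column gives hazard ratios with 95% CIs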

    Psychosocial stress and strategies for managing adversity: measuring population resilience in New South Wales, Australia

    BACKGROUND: Populations around the world are facing an increasing number of adversities such as the global financial crisis, terrorism, conflict, and climate change. The aim of this paper was to investigate self-reported strategies and sources of support used to get through "tough times" in an Australian context and to identify patterns of response in the general population and differences in potentially vulnerable subgroups. METHODS: Data were collected through a cross-sectional survey of the New South Wales population in Australia. The final sample consisted of 3,995 New South Wales residents aged 16 years and above who responded to the question: "What are the things that get you through tough times?" RESULTS: Respondents provided brief comments that were coded into 14 main subject-area categories. The most frequently reported responses were family and self (52%); friends and neighbors (21%); use of positive emotional and philosophical strategies (17%), such as sense of humor, determination, and the belief that things would get better; and religious beliefs (11%). The responses of four population subgroups were compared, based on gender, household income, level of psychological distress, and whether a language other than English was spoken at home. Women reported greater use of friends and neighbors and religious or spiritual beliefs for support, whereas men reported greater use of drinking/smoking and financial supports. Those with lower incomes reported greater reliance on positive emotional and philosophical strategies and on religious or spiritual beliefs. Those with high levels of psychological distress reported greater use of leisure interests and hobbies, drinking/smoking, and less use of positive lifestyle strategies, such as adequate sleep, relaxation, or work/life balance. Those who spoke a language other than English at home were less likely to report relying on self or others (family/friends) or positive emotional and philosophical strategies to get through tough times. CONCLUSIONS: Understanding strategies and sources of support used by the population to get through adversity is the first step toward identifying the best approaches to build and support strengths and reduce vulnerabilities. It is also possible to reflect on how large-scale threats such as pandemics, disasters, conflict, bereavement, and loss could impact individual and population resilience.

    Trends in inequalities in Children Looked After in England between 2004 and 2019: a local area ecological analysis.

    OBJECTIVE: To assess trends in inequalities in Children Looked After (CLA) in England between 2004 and 2019, after controlling for unemployment, a marker of recession and risk factor for child maltreatment. DESIGN: Longitudinal local area ecological analysis. SETTING: 150 English upper-tier local authorities. PARTICIPANTS: Children under the age of 18 years. PRIMARY OUTCOME MEASURE: The annual age-standardised rate of children starting to be looked after (CLA rate) across English local authorities, grouped into quintiles based on their level of income deprivation. Slope indices of inequality were estimated using longitudinal segmented mixed-effects models, controlling for unemployment. RESULTS: Since 2008, there has been a precipitous rise in CLA rates and a marked widening of inequalities. Unemployment was associated with rising CLA rates: for each percentage point increase in unemployment rate, an estimated additional 9 children per 100 000 per year (95% CI 6 to 11) became looked after the following year. However, inequalities increased independently of the effect of unemployment. Between 2007 and 2019, after controlling for unemployment, the gap between the most and least deprived areas increased by 15 children per 100 000 per year (95% CI 4 to 26) relative to the 2004–2006 trend. CONCLUSIONS: The dramatic increase in the rate of children starting to be looked after has been greater in poorer areas and in areas more deeply affected by recession. But trends in unemployment do not explain the decade-long rise in inequalities, suggesting that other socioeconomic factors, including rising child poverty and reduced spending on children's services, may be fuelling inequalities. Policies to safely reduce the CLA rate should urgently address the social determinants of child health and well-being.
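
    A slope index of inequality from a longitudinal mixed-effects model, as described above, can be roughly sketched in Python with statsmodels as below. All variable and file names are assumptions, and the segmentation of the time trend into pre- and post-2008 periods used in the published analysis is omitted for brevity.

        import pandas as pd
        import statsmodels.formula.api as smf

        panel = pd.read_csv("cla_la_panel.csv")  # hypothetical local-authority-by-year panel

        # 'deprivation_rank' is the midpoint fractional rank (0-1) of each authority's
        # income-deprivation quintile; its coefficient is the slope index of inequality.
        # 'unemployment_lag1' is the previous year's unemployment rate.
        sii_model = smf.mixedlm(
            "cla_rate ~ deprivation_rank * year_centred + unemployment_lag1",
            data=panel,
            groups=panel["local_authority"],
        ).fit()
        print(sii_model.summary())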

    Influence of socio-economic status on Shiga toxin-producing Escherichia coli (STEC) infection incidence, risk factors and clinical features

    Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and for foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES and adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting that other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
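
    The pattern of analysis above, estimating an adjusted odds ratio by SES group for each clinical feature or exposure in turn, might be sketched in Python as follows; the file name, feature list and column names are assumptions rather than the study's actual variables.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        stec = pd.read_csv("stec_cases.csv")  # hypothetical national STEC line list
        features = ["ae_attendance", "hospitalised", "salad_fruit_veg", "uk_travel", "soil_contact"]

        for feature in features:
            fit = smf.logit(f"{feature} ~ C(deprivation_group) + C(age_group) + C(sex)",
                            data=stec).fit(disp=False)
            odds_ratios = np.exp(fit.params)
            ci = np.exp(fit.conf_int())
            print(feature)
            print(pd.concat([odds_ratios.rename("OR"), ci], axis=1).round(2))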

    Antibody landscapes after influenza virus infection or vaccination.

    We introduce the antibody landscape, a method for the quantitative analysis of antibody-mediated immunity to antigenically variable pathogens, achieved by accounting for antigenic variation among pathogen strains. We generated antibody landscapes to study immune profiles covering 43 years of influenza A/H3N2 virus evolution for 69 individuals monitored for infection over 6 years and for 225 individuals pre- and postvaccination. Upon infection and vaccination, titers increased broadly, including previously encountered viruses far beyond the extent of cross-reactivity observed after a primary infection. We explored implications for vaccination and found that the use of an antigenically advanced virus had the dual benefit of inducing antibodies against both advanced and previous antigenic clusters. These results indicate that preemptive vaccine updates may improve influenza vaccine efficacy in previously exposed individuals.
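
    As a rough illustration only (not the published method, which places strains on an antigenic map rather than a timeline), an antibody landscape can be pictured as one individual's titres plotted against antigenically ordered test strains, before and after vaccination. The sketch below assumes a hypothetical long-format titre table with subject_id, timepoint, strain_year and titre columns.

        import numpy as np
        import pandas as pd
        import matplotlib.pyplot as plt

        titres = pd.read_csv("hi_titres.csv")  # hypothetical long-format HI titre table
        one_person = titres[titres["subject_id"] == titres["subject_id"].iloc[0]]

        for timepoint in ["pre", "post"]:
            subset = one_person[one_person["timepoint"] == timepoint].sort_values("strain_year")
            plt.plot(subset["strain_year"], np.log2(subset["titre"]), marker="o", label=timepoint)

        plt.xlabel("Test strain isolation year (proxy for antigenic distance)")
        plt.ylabel("log2 HI titre")
        plt.legend()
        plt.show()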

    Search for sterile neutrino mixing in the MINOS long-baseline experiment

    A search for depletion of the combined flux of active neutrino species over a 735 km baseline is reported using neutral-current interaction data recorded by the MINOS detectors in the NuMI neutrino beam. Such a depletion is not expected according to conventional interpretations of neutrino oscillation data involving the three known neutrino flavors. A depletion would be a signature of oscillations or decay to postulated noninteracting sterile neutrinos, scenarios not ruled out by existing data. From an exposure of 3.18×10²⁰ protons on target, in which neutrinos of energies between ~500 MeV and 120 GeV are produced predominantly as νμ, the visible energy spectrum of candidate neutral-current reactions in the MINOS far detector is reconstructed. Comparison of this spectrum to that inferred from a similarly selected near-detector sample shows that, of the portion of the νμ flux observed to disappear in charged-current interaction data, the fraction that could be converting to a sterile state is less than 52% at 90% confidence level (C.L.). The hypothesis that active neutrinos mix with a single sterile neutrino via oscillations is tested by fitting the data to various models. In the particular four-neutrino models considered, the mixing angles θ24 and θ34 are constrained to be less than 11° and 56° at 90% C.L., respectively. The possibility that active neutrinos may decay to sterile neutrinos is also investigated. Pure neutrino decay without oscillations is ruled out at 5.4 standard deviations. For the scenario in which active neutrinos decay into sterile states concurrently with neutrino oscillations, a lower limit on the neutrino decay lifetime is established: τ3/m3 > 2.1×10⁻¹² s/eV at 90% C.L.
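
    For orientation, a depletion of this kind is usually interpreted through the standard two-flavour oscillation probability, shown below in LaTeX as a generic illustration (not the full four-neutrino fit used in the paper), where the mixing angle θ and mass-squared splitting Δm² set the depth and energy dependence of the deficit over baseline L:

        \[
          P(\nu_\mu \to \nu_\mu) = 1 - \sin^2(2\theta)\,
          \sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right)
        \]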

    Distracting the Mind Improves Performance: An ERP Study

    When a second target (T2) is presented in close succession after a first target (T1), people often fail to identify T2, a phenomenon known as the attentional blink (AB). However, the AB can be reduced substantially when participants are distracted during the task, for instance by a concurrent task, without a cost to T1 performance. The goal of the current study was to investigate the electrophysiological correlates of this paradoxical effect. Participants successively performed three tasks while EEG was recorded. The first task (standard AB) consisted of identifying two target letters in a sequential stream of distractor digits. The second task (grey dots task) was similar to the first task with the addition of an irrelevant grey dot moving in the periphery, concurrent with the central stimulus stream. The third task (red dot task) was similar to the second task, except that detection of an occasional brief color change in the moving grey dot was required. AB magnitude in the latter task was significantly smaller, whereas behavioral performance in the standard and grey dots tasks did not differ. Using mixed effects models, electrophysiological activity was compared during trials in the grey dots and red dot tasks that differed in task instruction but not in perceptual input. In the red dot task, both target-related parietal brain activity associated with working memory updating (P3) and distractor-related occipital activity were significantly reduced. The results support the idea that the AB might (at least partly) arise from an overinvestment of attentional resources or an overexertion of attentional control, which is reduced when a distracting secondary task is carried out. The present findings bring us a step closer to understanding why and how an AB occurs, and how these temporal restrictions in selective attention can be overcome.
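
    A schematic sketch of the trial-level mixed-effects comparison described above, P3 amplitude modelled as a function of task with a random intercept per participant, could look as follows in Python with statsmodels; the data layout and column names are assumptions, not the authors' pipeline.

        import pandas as pd
        import statsmodels.formula.api as smf

        trials = pd.read_csv("erp_trials.csv")  # hypothetical single-trial amplitude table

        # P3 amplitude by task (grey dots vs red dot), random intercept per participant
        p3_model = smf.mixedlm(
            "p3_amplitude ~ C(task, Treatment(reference='grey_dots'))",
            data=trials,
            groups=trials["participant"],
        ).fit()
        print(p3_model.summary())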