
    Does the impact of osteoarthritis vary by age, gender and social deprivation? A community study using the International Classification of Functioning, Disability and Health

    The aim of the study was to explore whether the impact of osteoarthritis varies with respect to age, gender and social deprivation. Impact was defined as impairment, activity limitations and participation restriction (International Classification of Functioning, Disability and Health (ICF)). Investigating how the ICF model functions for subgroups is important both practically and theoretically. The sample comprised 763 community-dwelling people diagnosed with osteoarthritis. Uncontaminated measures of the ICF constructs were developed using discriminant content validity from a pool of 134 items, including the WOMAC and SF-36. Multigroup Structural Equation Modelling was used to explore whether the same pathways exist for subgroups of gender, age and social deprivation. Results: Different significant paths were found for gender and social deprivation: impairment did not predict participation restriction for women and those most deprived, whereas these paths were significant for men and those less deprived. No difference in the paths was found for age. The impact of osteoarthritis appears to vary with respect to gender and social deprivation but not age. This suggests both that osteoarthritis per se does not adequately explain the health outcomes observed and that different clinical approaches may be appropriate for people of different gender and levels of deprivation. Implications of Rehabilitation:
    • The ICF model appears to vary with respect to gender and social deprivation for people with osteoarthritis.
    • The ICF model did not appear to vary with respect to age for people with osteoarthritis.
    • Different treatments and interventions for osteoarthritis may need to be targeted for specific gender and social deprivation groups.
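    The abstract names the method (multigroup structural equation modelling) but includes no analysis code. As a minimal, hypothetical sketch of the multigroup logic only, not the authors' actual SEM, the impairment-to-participation-restriction path can be compared across subgroups by fitting it per group and testing a group interaction; the file and column names below are assumptions.

        # Hypothetical sketch: does the impairment -> participation restriction
        # path differ by gender? A group interaction in OLS stands in for the
        # multigroup SEM reported in the abstract; all names are assumed.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("icf_osteoarthritis.csv")  # hypothetical dataset

        # Fit the path separately within each subgroup
        for sex, grp in df.groupby("gender"):
            fit = smf.ols("participation_restriction ~ impairment"
                          " + activity_limitation", data=grp).fit()
            print(sex, fit.params["impairment"], fit.pvalues["impairment"])

        # Equivalent single-model test: a significant interaction term means
        # the path coefficient differs between men and women
        inter = smf.ols("participation_restriction ~ impairment * C(gender)"
                        " + activity_limitation", data=df).fit()
        print(inter.summary())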

    Evaluating Retinal Function in Age-Related Maculopathy with the ERG Photostress Test

    PURPOSE. To evaluate the diagnostic potential of the electroretinogram (ERG) photostress test and the focal cone ERG in age-related maculopathy (ARM). METHODS. The cohort comprised 31 patients with ARM and 27 age-matched control subjects. The ERG photostress test was used to monitor cone adaptation after intense light adaptation. Focal 41- and 5-Hz cone ERGs were recorded monocularly (central 20°) to assess steady-state retinal function. Univariate analysis identified electrophysiological parameters that differed between groups, and receiver operating characteristic (ROC) curves were constructed to assess their diagnostic potential. Logistic regression analysis determined the diagnostic potential of a model incorporating several independent predictors of ARM. RESULTS. The rate of recovery of the ERG photostress test was reduced (recovery was slower) in subjects with ARM. The parameter exhibited good diagnostic potential (P = 0.002, area under ROC curve = 0.74). The implicit times of the 5-Hz (a-wave, P = 0.002; b-wave, P < 0.001) and the 41-Hz (P < 0.001) focal cone ERGs were increased, and the 41-Hz focal cone ERG amplitude (P = 0.003) and focal to full-field amplitude ratio (P = 0.001) were reduced in the ARM group. Logistic regression analysis identified three independent predictors of ARM, including the rate of recovery of the ERG photostress test. CONCLUSIONS. Early ARM has a marked effect on the kinetics of cone adaptation. The clinical application of the ERG photostress test increases the sensitivity and specificity of a model for the diagnosis of ARM. Improved assessment of the functional integrity of the central retina will facilitate early diagnosis and evaluation of therapeutic interventions.
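    To make the diagnostic workflow concrete, here is a minimal sketch, using simulated placeholder data, of ROC analysis for a single ERG parameter followed by a multivariable logistic-regression model; none of it is the authors' code, and real recordings would replace the random numbers.

        # Hypothetical sketch: diagnostic potential of one parameter (area under
        # the ROC curve) vs. a logistic model combining several predictors.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(58, 3))       # placeholder for 3 ERG parameters
        y = np.array([1] * 31 + [0] * 27)  # 31 ARM patients, 27 controls

        # Single-parameter AUC (the paper reports 0.74 for the photostress
        # recovery rate; random placeholder data will give roughly 0.5)
        print("single-parameter AUC:", roc_auc_score(y, X[:, 0]))

        # Multivariable model combining several independent predictors
        model = LogisticRegression().fit(X, y)
        print("combined AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))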

    Cell-free (RNA) and cell-associated (DNA) HIV-1 and postnatal transmission through breastfeeding

    Introduction - Transmission through breastfeeding remains an important route of mother-to-child transmission (MTCT) in resource-limited settings. We quantify the relationship between cell-free (RNA) and cell-associated (DNA) shedding of HIV-1 virus in breastmilk and the risk of postnatal HIV-1 transmission in the first 6 months postpartum. Materials and Methods - Thirty-six HIV-positive mothers who transmitted HIV-1 by breastfeeding were matched to 36 non-transmitting HIV-1 infected mothers in a case-control study nested in a cohort of HIV-infected women. RNA and DNA were quantified in the same breastmilk sample taken at 6 weeks and 6 months. Cox regression analysis assessed the association between cell-free and cell-associated virus levels and risk of postnatal HIV-1 transmission. Results - There were higher median levels of cell-free than cell-associated HIV-1 virus (per ml) in breastmilk at 6 weeks and 6 months. Multivariably, adjusting for antenatal CD4 count and maternal plasma viral load, at 6 weeks each 10-fold increase in cell-free or cell-associated levels (per ml) was significantly associated with HIV-1 transmission, with a stronger association for cell-associated than for cell-free levels [aHR 2.47 (95% CI 1.33–4.59) vs. 1.52 (95% CI 1.17–1.96), respectively]. At 6 months, cell-free and cell-associated levels (per ml) in breastmilk remained associated with HIV-1 transmission, but the association was stronger for cell-free than for cell-associated levels [aHR 2.53 (95% CI 1.64–3.92) vs. 1.73 (95% CI 0.94–3.19), respectively]. Conclusions - The findings suggest that cell-associated virus level (per ml) is more important for early postpartum HIV-1 transmission (at 6 weeks) than cell-free virus level. As cell-associated virus has been consistently detected in breastmilk despite antiretroviral therapy, this highlights a potential challenge for resource-limited settings in achieving the UNAIDS goal of eliminating vertical transmission by 2015. Further studies would advance knowledge of the mechanisms of HIV-1 transmission and help develop more effective drugs for use during lactation.
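    The hazard ratios above are per 10-fold increase in virus level, i.e. per unit on the log10 scale. A minimal, hypothetical sketch of such a Cox model follows, using the lifelines library; the file and column names are assumptions, not the authors' code.

        # Hypothetical sketch: Cox regression of postnatal transmission on
        # log10 breastmilk virus levels, adjusted for antenatal CD4 count and
        # maternal plasma viral load. All names are assumed.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("breastmilk_shedding.csv")  # hypothetical dataset

        # +1 on the log10 scale = a 10-fold increase, so each fitted hazard
        # ratio is directly comparable to the aHRs quoted in the abstract
        df["log10_rna"] = np.log10(df["cell_free_rna_per_ml"])
        df["log10_dna"] = np.log10(df["cell_assoc_dna_per_ml"])

        cph = CoxPHFitter()
        cph.fit(df[["weeks_to_event", "transmitted", "log10_rna",
                    "log10_dna", "antenatal_cd4", "plasma_viral_load"]],
                duration_col="weeks_to_event", event_col="transmitted")
        cph.print_summary()  # exp(coef) column gives adjusted hazard ratios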

    Seasonal variations in carbon, nitrogen and phosphorus concentrations and C:N:P stoichiometry in different organs of a Larix principis-rupprechtii Mayr. plantation in the Qinling Mountains, China

    Understanding how concentrations of elements and their stoichiometry change with plant growth and age is critical for predicting plant community responses to environmental change. We used long-term field experiments to explore how the leaf, stem and root carbon (C), nitrogen (N) and phosphorus (P) concentrations and their stoichiometry changed with growth and stand age in a L. principis-rupprechtii Mayr. plantation from 2012 to 2015 in the Qinling Mountains, China. Our results showed that the C, N and P concentrations and stoichiometric ratios in different tissues of larch stands were affected by stand age, organ type and sampling month, and displayed multiple correlations with increased stand age in different growing seasons. Generally, leaf C and N concentrations were greatest in the fast-growing season, but leaf P concentrations were greatest in the early growing season. However, no clear seasonal tendencies in the stem and root C, N and P concentrations were observed with growth. In contrast to N and P, few differences were found in organ-specific C concentrations. Leaf N:P was greatest in the fast-growing season, while C:N and C:P were greatest in the late-growing season. No clear variations were observed in stem and root C:N, C:P and N:P throughout the entire growing season, but leaf N:P was less than 14, suggesting that the growth of larch stands was limited by N in our study region. Compared to global plant element concentrations and stoichiometry, the leaves of larch stands had higher C, P, C:N and C:P but lower N and N:P, and the roots had greater P and C:N but lower N, C:P and N:P. Our study provides baseline information for describing the changes in nutritional elements with plant growth, which will facilitate plantation forest management and restoration, and makes a valuable contribution to the global data pool on leaf nutrition and stoichiometry.
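    As a small, hypothetical illustration of the stoichiometric bookkeeping behind these results, the ratios can be computed directly from element concentrations and summarized by organ and sampling month; the file and column names below are assumptions.

        # Hypothetical sketch: C:N:P mass ratios by organ and sampling month,
        # and the leaf N:P < 14 check used to infer nitrogen limitation.
        import pandas as pd

        df = pd.read_csv("larix_nutrients.csv")  # hypothetical: C, N, P in mg/g

        df["C_N"] = df["C"] / df["N"]
        df["C_P"] = df["C"] / df["P"]
        df["N_P"] = df["N"] / df["P"]

        summary = df.groupby(["organ", "month"])[["C_N", "C_P", "N_P"]].mean()
        print(summary)

        # Leaf N:P below 14 is the conventional threshold for N limitation
        print("N-limited:", (summary.loc["leaf", "N_P"] < 14).all())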

    The impact on welfare and public finances of job loss in industrial Britain

    Get PDF
    It is important to take a long view of many economic problems. This paper explains how the large-scale loss of industrial jobs in parts of Britain during the 1980s and 1990s still inflates the contemporary budget deficit in the UK. Drawing on the findings of several empirical studies by the authors, it shows that although there has been progress in regeneration, the consequences of job loss in Britain’s older industrial areas have been near-permanently higher levels of worklessness, especially on incapacity benefits, low pay, and a major claim on present-day public finances to pay for both in-work and out-of-work benefits. Furthermore, as the UK government implements reductions in welfare spending, the poorest places are being hit hardest. In effect, communities in older industrial Britain now face punishment in the form of welfare cuts for the destruction previously wrought to their industrial base.

    Impact of the introduction and withdrawal of financial incentives on the delivery of alcohol screening and brief advice in English primary health care: an interrupted time-series analysis

    Aim To evaluate the impact of the introduction and withdrawal of financial incentives on alcohol screening and brief advice delivery in English primary care. Design Interrupted time-series analysis using data from The Health Improvement Network (THIN) database. Data were split into three periods: (1) before the introduction of financial incentives (1 January 2006–31 March 2008); (2) during the implementation of financial incentives (1 April 2008–31 March 2015); and (3) after the withdrawal of financial incentives (1 April 2015–31 December 2016). Segmented regression models were fitted, with slope and step change coefficients at both intervention points. Setting England. Participants Newly registered patients (16+) in 500 primary care practices for 2006–16 (n = 4 278 723). Measurements The outcome measures were the percentage of patients each month who: (1) were screened for alcohol use; (2) screened positive for higher-risk drinking; and (3) were reported as having received brief advice on alcohol consumption. Findings There was no significant change in the percentage of newly registered patients who were screened for alcohol use when financial incentives were introduced. However, the percentage fell (P < 0.001) immediately when incentives were withdrawn, and fell by a further 2.96 [95% confidence interval (CI) = 2.21–3.70] patients per 1000 each month thereafter. After the introduction of incentives, there was an immediate increase of 9.05 (95% CI = 3.87–14.23) per 1000 patients screening positive for higher-risk drinking, but no significant further change over time. Withdrawal of financial incentives was associated with an immediate fall in screen-positive rates of 29.96 (95% CI = 19.56–40.35) per 1000 patients, followed by a rise each month thereafter of 2.14 (95% CI = 1.51–2.77) per 1000. Screen-positive patients recorded as receiving alcohol brief advice increased by 20.15 (95% CI = 12.30–28.00) per 1000 following the introduction of financial incentives, and continued to increase by 0.39 (95% CI = 0.26–0.53) per 1000 monthly until withdrawal. At this point, delivery of brief advice fell by 18.33 (95% CI = 11.97–24.69) per 1000 patients and continued to fall by a further 0.70 (95% CI = 0.28–1.12) per 1000 per month. Conclusions Removing a financial incentive for alcohol prevention in English primary care was associated with an immediate and sustained reduction in the rate of screening for alcohol use and brief advice provision. This contrasts with no, or limited, increase in screening and brief advice delivery rates following the introduction of the scheme.
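    The segmented regression described here has a standard form: a linear trend plus a step (level change) and a slope change at each intervention point. A minimal sketch follows, with assumed file and column names rather than the authors' code.

        # Hypothetical sketch: interrupted time-series (segmented) regression
        # with level and slope terms at the introduction (April 2008) and
        # withdrawal (April 2015) of the financial incentive.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("thin_screening_monthly.csv")  # hypothetical series
        df["date"] = pd.to_datetime(df["date"])
        df["t"] = range(len(df))  # months since January 2006

        df["post_intro"] = (df["date"] >= "2008-04-01").astype(int)
        df["post_withdraw"] = (df["date"] >= "2015-04-01").astype(int)
        df["t_intro"] = (df["t"] - df.loc[df["post_intro"].eq(1), "t"].min()).clip(lower=0)
        df["t_withdraw"] = (df["t"] - df.loc[df["post_withdraw"].eq(1), "t"].min()).clip(lower=0)

        # Step coefficients estimate the immediate change at each intervention;
        # t_intro / t_withdraw coefficients estimate the change in monthly trend
        fit = smf.ols("screen_rate ~ t + post_intro + t_intro"
                      " + post_withdraw + t_withdraw", data=df).fit()
        print(fit.summary())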

    Phenotypic and molecular characterization of Staphylococcus aureus isolates expressing low- and high-level mupirocin resistance in Nigeria and South Africa

    Background: Mupirocin is a topical antimicrobial agent which is used for the treatment of skin and postoperative wound infections, and the prevention of nasal carriage of methicillin-resistant Staphylococcus aureus (MRSA). However, the prevalence of mupirocin resistance in S. aureus, particularly in MRSA, has increased with the extensive and widespread use of this agent in hospital settings. This study characterized low- and high-level mupirocin-resistant S. aureus isolates obtained from Nigeria and South Africa. Methods: A total of 17 mupirocin-resistant S. aureus isolates, obtained from two previous studies in Nigeria and South Africa, were characterized by antibiogram, PCR-RFLP of the coagulase gene and PFGE. High-level mupirocin-resistant isolates were confirmed by PCR detection of the mupA gene. The genetic location of the resistance determinants was established by curing and transfer experiments. Results: All the low-level mupirocin-resistant isolates were MRSA and resistant to gentamicin, tetracycline and trimethoprim. PFGE identified a major clone in two health care institutions located in Durban and a health care facility in Pietermaritzburg, Greytown and Empangeni. Curing and transfer experiments indicated that high-level mupirocin resistance was located on a 41.1 kb plasmid in the South African strain (A15). Furthermore, the transfer of high-level mupirocin resistance was demonstrated by the conjugative transfer of the 41.1 kb plasmid alone or with the co-transfer of a plasmid encoding resistance to cadmium. The size of the mupirocin-resistance-encoding plasmid in the Nigerian strain (35 IBA) was approximately 35 kb. Conclusion: The emergence of mupirocin-resistant S. aureus isolates in Nigeria and South Africa should be of great concern to medical personnel in these countries. It is recommended that methicillin-susceptible S. aureus (MSSA) and MRSA be routinely tested for mupirocin resistance, even in facilities where the agent is not administered. Urgent measures, including judicious use of mupirocin, need to be taken to prevent clonal dissemination of mupirocin/methicillin-resistant S. aureus in KZN, South Africa, and the transfer of the conjugative plasmid encoding high-level mupirocin resistance identified in this study.

    Survival benefits of statins for primary prevention: a cohort study

    Objectives: Estimate the effect of statin prescription on mortality in the population of England and Wales with no previous history of cardiovascular disease. Methods: Primary care records from The Health Improvement Network 1987-2011 were used. Four cohorts of participants aged 60, 65, 70, or 75 years at baseline included 118,700, 199,574, 247,149, and 194,085 participants; and 1.4, 1.9, 1.8, and 1.1 million person-years of data, respectively. The exposure was any statin prescription at any time before the participant reached the baseline age (60, 65, 70 or 75) and the outcome was all-cause mortality at any age above the baseline age. The hazard of mortality associated with statin prescription was calculated by Cox's proportional hazard regressions, adjusted for sex, year of birth, socioeconomic status, diabetes, antihypertensive medication, hypercholesterolaemia, body mass index, smoking status, and general practice. Participants were grouped by QRISK2 baseline risk of a first cardiovascular event in the next ten years of <10%, 10-19%, or ≥20%. Results: There was no reduction in all-cause mortality for statin prescription initiated in participants with a QRISK2 score <10% at any baseline age, or in participants aged 60 at baseline in any risk group. Mortality was lower in participants with a QRISK2 score ≥20% if statin prescription had been initiated by age 65 (adjusted hazard ratio (HR) 0.86 (0.79-0.94)), 70 (HR 0.83 (0.79-0.88)), or 75 (HR 0.82 (0.79-0.86)). Mortality reduction was uncertain with a QRISK2 score of 10-19%: the HR was 1.00 (0.91-1.11) for statin prescription by age 65, 0.89 (0.81-0.99) by age 70, or 0.79 (0.52-1.19) by age 75. Conclusions: The current internationally recommended thresholds for statin therapy for primary prevention of cardiovascular disease in routine practice may be too low and may lead to overtreatment of younger people and those at low risk.
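    As a hedged sketch of this design (not the authors' code), the within-stratum hazard ratios can be estimated by fitting a Cox model separately in each QRISK2 risk group; the data file, column names, and numeric coding of covariates below are all assumptions.

        # Hypothetical sketch: adjusted hazard ratio for all-cause mortality by
        # baseline statin prescription, fitted within QRISK2 risk strata.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("thin_statin_cohort.csv")  # hypothetical extract for
                                                    # one baseline age

        # Group by 10-year cardiovascular risk, as in the abstract
        df["risk_group"] = pd.cut(df["qrisk2"], bins=[0, 10, 20, 100],
                                  labels=["<10%", "10-19%", ">=20%"], right=False)

        for label, grp in df.groupby("risk_group", observed=True):
            cph = CoxPHFitter()
            # Covariates are assumed to be coded numerically
            cph.fit(grp[["followup_years", "died", "statin", "sex",
                         "birth_year", "townsend", "diabetes", "bmi", "smoker"]],
                    duration_col="followup_years", event_col="died")
            print(label, "adjusted HR:", round(cph.hazard_ratios_["statin"], 2))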