
    Competition between wild and captive-bred Penaeus plebejus and implications for stock enhancement

    The mechanisms that drive density dependence are rarely studied in the applied context of population management. We examined the potential for competition for food and shelter, and the resulting demographic density dependence, to influence how well populations of the eastern king prawn Penaeus plebejus Hess can recover following marine stock enhancement programmes in which captive-bred juveniles are released into the wild. Specifically, manipulative laboratory experiments were used to quantify the differential effects of competition for food and competition for shelter on survival of wild and captive-bred P. plebejus as densities were increased and as each category of P. plebejus (wild or captive-bred) was supplemented with the alternative category. Increasing population densities when food and shelter were limited lowered survival for both categories. When food was limited, survival of both categories was unaffected by addition of the alternative category. Adding wild P. plebejus to their captive-bred counterparts when shelter was limited under laboratory conditions resulted in significantly higher mortality in captive-bred individuals. In contrast, adding captive-bred P. plebejus to wild individuals under these conditions did not affect wild P. plebejus. We conclude that if the current results can be extended to wild conditions, competition for shelter may lead to the loss of captive-bred P. plebejus, thereby reducing the intended outcomes of stock enhancement. This highlights the importance of investigating interactions between wild and captive-bred animals prior to stock enhancement to predict long-term outcomes and identify situations where stock enhancement could be an effective response to the loss of populations or recruitment limitation.

    Mapping tuberculosis treatment outcomes in Ethiopia

    Background: Tuberculosis (TB) is the leading cause of death from an infectious disease in Ethiopia, killing more than 30,000 people every year. This study aimed to determine whether the rates of poor TB treatment outcome varied geographically across Ethiopia at district and zone levels and whether such variability was associated with socioeconomic, behavioural, health care access, or climatic conditions. Methods: A geospatial analysis was conducted using national TB data reported to the health management information system (HMIS) for the period 2015-2017. The prevalence of poor TB treatment outcomes was calculated by dividing the sum of treatment failure, death and loss to follow-up by the total number of TB patients. Binomial logistic regression models were computed and a spatial analysis was performed using a Bayesian framework. Estimates of parameters were generated using Markov chain Monte Carlo (MCMC) simulation. Geographic clustering was assessed using the Getis-Ord Gi* statistic, and global and local Moran's I statistics. Results: A total of 223,244 TB patients were reported from 722 districts in Ethiopia during the study period. Of these, 63,556 (28.5%) were cured, 139,633 (62.4%) completed treatment, 6716 (3.0%) died, 1459 (0.7%) had treatment failure, and 12,200 (5.5%) were lost to follow-up. The overall prevalence of a poor TB treatment outcome was 9.0% (range, 1-58%). Hot spots and clustering of poor TB treatment outcomes were detected in districts near the international borders in the Afar, Gambella, and Somali regions, and cold spots were detected in the Oromia and Amhara regions. Spatial clustering of poor TB treatment outcomes was positively associated with the proportion of the population with a low wealth index (OR: 1.01; 95% CI: 1.0, 1.01), the proportion of the population with poor knowledge about TB (OR: 1.02; 95% CI: 1.01, 1.03), and higher annual mean temperature per degree Celsius (OR: 1.15; 95% CI: 1.08, 1.21).
Conclusions: This study showed significant spatial variation in poor TB treatment outcomes in Ethiopia that was related to underlying socioeconomic status, knowledge about TB, and climatic conditions. Clinical and public health interventions should be targeted at hot-spot areas to reduce poor TB treatment outcomes and to achieve the national End-TB Strategy targets.
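The prevalence definition used in the study above is a simple ratio of the reported counts; a minimal sketch in Python, using the national figures quoted in the abstract (the small difference from the quoted 9.0% presumably reflects rounding or exclusions in the source data):

```python
# Poor TB treatment outcome prevalence, as defined in the study:
# (treatment failure + death + loss to follow-up) / total TB patients.
# Counts are the national figures quoted in the abstract.
treatment_failure = 1459
death = 6716
loss_to_follow_up = 12200
total_patients = 223244

poor_outcomes = treatment_failure + death + loss_to_follow_up
prevalence = poor_outcomes / total_patients
print(f"{poor_outcomes} poor outcomes -> {prevalence:.1%}")
```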

    Development of a risk score for prediction of poor treatment outcomes among patients with multidrug-resistant tuberculosis

    Background Treatment outcomes among patients treated for multidrug-resistant tuberculosis (MDR-TB) are often sub-optimal. Early prediction of poor treatment outcomes may therefore be useful in patient care, especially when clinicians make treatment decisions or offer counselling or additional support to patients. The aim of this study was to develop a simple clinical risk score to predict poor treatment outcomes in patients with MDR-TB, using routinely collected data from two large countries in geographically distinct regions. Methods We used MDR-TB data collected from Hunan Chest Hospital, China and Gondar University Hospital, Ethiopia. The data were divided into derivation (n = 343; 60%) and validation (n = 227; 40%) groups. A poor treatment outcome was defined as treatment failure, loss to follow-up or death. A risk score for poor treatment outcomes was derived using a Cox proportional hazards model in the derivation group. The model was then validated in the validation group. Results The overall rate of poor treatment outcomes was 39.5% (n = 225): 40.5% (n = 139) in the derivation group and 37.9% (n = 86) in the validation group. Three variables were identified as predictors of poor treatment outcomes, and each was assigned a number of points proportional to its regression coefficient. These predictors and their points were: 1) history of taking second-line TB treatment (2 points), 2) resistance to any fluoroquinolone (3 points), and 3) smear that did not convert from positive to negative at two months (4 points). We summed these points to calculate the risk score for each patient; three risk groups were defined: low risk (0 to 2 points), medium risk (3 to 5 points), and high risk (6 to 9 points). In the derivation group, poor treatment outcomes were reported for these three groups in 14%, 27%, and 71% of patients, respectively.
The area under the receiver operating characteristic curve (AUROC) for the point system in the derivation group was 0.69 (95% CI 0.60 to 0.77) and was similar to that in the validation group (0.67; 95% CI 0.56 to 0.78; p = 0.82). Conclusion A history of second-line TB treatment, resistance to any fluoroquinolone, and smear non-conversion at two months can be used to estimate the risk of a poor treatment outcome in patients with MDR-TB with a moderate degree of accuracy (AUROC = 0.69).
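The point system described above maps directly onto a small lookup; a minimal sketch of how the published weights and cut-offs combine (function and parameter names are illustrative, not from the paper):

```python
def mdr_tb_risk_score(prior_second_line_treatment: bool,
                      fluoroquinolone_resistance: bool,
                      no_smear_conversion_2_months: bool) -> int:
    """Sum the published point weights for the three predictors."""
    score = 0
    if prior_second_line_treatment:
        score += 2  # history of taking second-line TB treatment
    if fluoroquinolone_resistance:
        score += 3  # resistance to any fluoroquinolone
    if no_smear_conversion_2_months:
        score += 4  # smear still positive at two months
    return score

def risk_group(score: int) -> str:
    """Map a 0-9 score to the study's three risk groups."""
    if score <= 2:
        return "low"     # 14% poor outcomes in the derivation group
    if score <= 5:
        return "medium"  # 27%
    return "high"        # 71%

# A patient with prior second-line treatment and smear non-conversion
# scores 2 + 4 = 6 points, placing them in the high-risk group.
print(risk_group(mdr_tb_risk_score(True, False, True)))
```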

    The association between driving time and unhealthy lifestyles: a cross-sectional, general population study of 386 493 UK Biobank participants

    Background: Driving is a common type of sedentary behaviour, an independent risk factor for poor health. This study explores whether driving is also associated with other unhealthy lifestyle factors. Methods: In a cross-sectional study of UK Biobank participants, driving time was treated as an ordinal variable and other lifestyle factors were dichotomized into low/high risk based on guidelines. The associations were explored using chi-square tests for trend and binary logistic regression. Results: Of the 386 493 participants who drove, 153 717 (39.8%) drove <1 h/day; 140 140 (36.3%) 1 h/day; 60 973 (15.8%) 2 h/day; and 31 663 (8.2%) ≥3 h/day. Following adjustment for potential confounders, driving ≥3 h/day was associated with being overweight/obese (OR = 1.74, 95% CI: 1.64–1.85), smoking (OR = 1.48, 95% CI: 1.37–1.63), insufficient sleep (OR = 1.70, 95% CI: 1.61–1.80), low fruit/vegetable intake (OR = 1.26, 95% CI: 1.18–1.35) and low physical activity (OR = 1.05, 95% CI: 1.00–1.11), with dose relationships for the first three, but was not associated with higher alcohol consumption (OR = 0.94, 95% CI: 0.87–1.02). Conclusions: Sedentary behaviour, such as driving, is known to have an independent association with adverse health outcomes. It may have additional impact mediated through its effect on other aspects of lifestyle. People with long driving times are at higher risk and might benefit from targeted interventions.

    Comparison of the validity of smear and culture conversion as a prognostic marker of treatment outcome in patients with multidrug-resistant tuberculosis

    Background The World Health Organization (WHO) has conditionally recommended the use of sputum smear microscopy and culture examination for the monitoring of multidrug-resistant tuberculosis (MDR-TB) treatment. We aimed to assess and compare the validity of smear and culture conversion at different time points during treatment for MDR-TB, as a prognostic marker for end-of-treatment outcomes. Methods We undertook a retrospective observational cohort study using data obtained from Hunan Chest Hospital, China and Gondar University Hospital, Ethiopia. The sensitivity and specificity of culture and sputum smear conversion for predicting treatment outcomes were analysed using a random-effects generalized linear mixed model. Results A total of 429 bacteriologically confirmed MDR-TB patients with culture- and smear-positive results were included. Overall, 345 (80%) patients had a successful treatment outcome and 84 (20%) had poor treatment outcomes. The sensitivity of smear and culture conversion for predicting a successful treatment outcome was, respectively: 77.9% and 68.9% at 2 months after starting treatment (difference between tests, p = 0.007); 95.9% and 92.7% at 4 months (p = 0.06); 97.4% and 96.2% at 6 months (p = 0.386); and 99.4% and 98.9% at 12 months (p = 0.412). The specificity of smear and culture non-conversion for predicting a poor treatment outcome was, respectively: 41.6% and 60.7% at 2 months (p = 0.012); 23.8% and 48.8% at 4 months (p<0.001); 20.2% and 42.8% at 6 months (p<0.001); and 15.4% and 32.1% at 12 months (p<0.001). The sensitivity of culture and smear conversion increased as the month of conversion increased, but at the cost of decreased specificity. The optimum time points after conversion to provide the best prognostic marker of a successful treatment outcome were between two and four months after treatment commencement for smear, and between four and six months for culture.
The common optimum time point for smear and culture conversion was four months. At this time point, culture conversion (AUROC = 0.71) was significantly better than smear conversion (AUROC = 0.6) in predicting successful treatment outcomes (p < 0.001). However, the validity of smear conversion (AUROC = 0.7) was equivalent to culture conversion (AUROC = 0.71) in predicting treatment outcomes when demographic and clinical factors were included in the model. The positive and negative predictive values for smear conversion were: 57.3% and 65.7% at two months, 55.7% and 85.4% at four months, and 55.0% and 88.6% at six months; for culture conversion they were: 63.7% and 66.2% at two months, 64.4% and 87.1% at four months, and 62.7% and 91.9% at six months, respectively. Conclusions The validity of smear conversion is significantly lower than that of culture conversion in predicting MDR-TB treatment outcomes. We support the WHO recommendation of using both smear and culture examination, rather than smear alone, for the monitoring of MDR-TB patients for a better prediction of successful treatment outcomes. The optimum time points to predict a future successful treatment outcome were between two and four months after treatment commencement for sputum smear conversion and between four and six months for culture conversion. The common optimum time for culture and smear conversion together was four months.
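The validity measures compared in the study above derive from a standard 2x2 table of conversion status against end-of-treatment outcome, treating conversion as a positive test for a successful outcome. A generic sketch, with illustrative counts that are not the study's data:

```python
def validity(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2 validity measures for a prognostic marker.
    tp: converted, successful outcome    fp: converted, poor outcome
    fn: not converted, successful        tn: not converted, poor outcome
    """
    return {
        "sensitivity": tp / (tp + fn),  # converters among successes
        "specificity": tn / (tn + fp),  # non-converters among poor outcomes
        "ppv": tp / (tp + fp),          # successes among converters
        "npv": tn / (tn + fn),          # poor outcomes among non-converters
    }

# Illustrative counts only, not taken from the study.
measures = validity(tp=300, fp=50, fn=45, tn=34)
print({name: round(value, 3) for name, value in measures.items()})
```

Raising the conversion cut-off month moves patients from the "not converted" to the "converted" cells, which is why the abstract reports sensitivity rising and specificity falling at later time points.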

    Associations of dietary protein intake with fat free mass and grip strength: cross-sectional study in 146,816 UK Biobank participants

    Adequate dietary protein intake is important for the maintenance of fat-free mass (FFM) and muscle strength, but optimal requirements remain unknown. The aim of the current study was to explore the associations of protein intake with FFM and grip strength. We used baseline data from the UK Biobank (146,816 participants aged 40-69 years, with data collected 2007-2010 across the UK) to examine the associations of protein intake with FFM and grip strength. Protein intake was positively associated with FFM (men 5.1% [95% CI: 5.0; 5.2] and women 7.7% [95% CI: 7.7; 7.8]) and grip strength (men 0.076 kg/kg [95% CI: 0.074; 0.078] and women 0.074 kg/kg [95% CI: 0.073; 0.076]) per 0.5 grams per kg body mass per day (g/kg/day) increment in protein intake. FFM and grip strength were higher with higher intakes across the full range of intakes, i.e. highest in those reporting consuming > 2.0 g/kg/day, independently of socio-demographics, other dietary measures, physical activity and comorbidities. FFM and grip strength were lower with age, but this association did not differ by protein intake category (P > 0.05). The current recommendation for protein intake (0.8 grams per kg body mass per day) may need to be increased for adults aged 40-69 years to optimise FFM and grip strength.

    Exogenous NG-hydroxy-l-arginine causes nitrite production in vascular smooth muscle cells in the absence of nitric oxide synthase activity

    Nitric oxide (NO) production from exogenous NG-hydroxy-l-arginine (OH-l-Arg) was investigated in rat aortic smooth muscle cells in culture by measuring nitrite accumulation in the culture medium. In addition, the interaction between OH-l-Arg and l-arginine uptake via the y+ cationic amino acid transporter was studied. In cells without NO-synthase activity, OH-l-Arg (1–1000 μM) induced a dose-dependent nitrite production with a half-maximal effective concentration (EC50) of 18.0 ± 1.5 μM (n = 4–7). This nitrite accumulation was not inhibited by the NO-synthase inhibitor NG-nitro-l-arginine methyl ester (l-NAME, 300 μM). In contrast, it was abolished by miconazole (100 μM), an inhibitor of cytochrome P450. Incubation of vascular smooth muscle cells with LPS (10 μg/ml) induced an l-NAME-inhibited nitrite accumulation, but did not enhance the OH-l-Arg-induced nitrite production. OH-l-Arg and other cationic amino acids, l-lysine and l-ornithine, competitively inhibited [3H]-l-arginine uptake in rat aortic smooth muscle cells, with inhibition constants of 195 ± 23 μM (n = 12), 260 ± 40 μM (n = 5) and 330 ± 10 μM (n = 5), respectively. These results show that OH-l-Arg is recognized by the cationic l-amino acid carrier present in vascular smooth muscle cells and can be oxidized to NO and nitrite in these cells in the absence of NO-synthase, probably by cytochrome P450 or by a reaction involving a cytochrome P450 byproduct.

    Dance training improves cytokine secretion and viability of neutrophils in diabetic patients

    Background. Evidence suggests that exercise improves neutrophil function. The decreased functional longevity of neutrophils and their increased clearance from infectious sites contribute to the increased susceptibility to, and severity of, infection observed in patients with diabetes. Objective. Herein, we investigated the effects of a dance program on neutrophil number, function, and death in type 2 diabetes mellitus (T2DM) patients and healthy volunteers. Methods. Ten patients with T2DM and twelve healthy individuals participated in a moderate-intensity dance training program for 4 months. The plasma levels of leptin, free fatty acids (FFAs), tumour necrosis factor-α (TNF-α), C-reactive protein (CRP), interleukin-1β (IL-1β), and interleukin-1 receptor antagonist (IL-1ra); neutrophil counts; extent of DNA fragmentation; cell membrane integrity; and production of TNF-α, interleukin-8 (IL-8), interleukin-6 (IL-6), and IL-1β in neutrophils were measured before and after training. Results. Training reduced plasma levels of TNF-α (1.9-fold in controls and 2.2-fold in patients with T2DM) and CRP (1.4-fold in controls and 3.4-fold in patients with T2DM). IL-1ra levels were higher in the control group (2.2-fold) after training. After training, neutrophil DNA fragmentation was decreased in patients with T2DM (90%), while the number of neutrophils increased (70% in controls and 1.1-fold in patients with T2DM). Conclusion. Dance training is a nonpharmacological strategy to reduce inflammation and improve neutrophil clearance in patients with T2DM.

    Personalised service? Changing the role of the government librarian

    Investigates the feasibility of a personalised information service in a government department. A qualitative methodology explored stakeholder opinions on the remit, marketing, resourcing and measurement of the service. A questionnaire and interviews gathered experiences of personalised provision across the government sector. Potential users were similarly surveyed to explore how the service could meet their needs. Data were analysed using coding techniques to identify emerging theory. Lessons learned from government librarians centred on clarifying requirements, balancing workloads and selective marketing. The user survey showed low usage and awareness of existing specialist services, but high levels of need and interest in services repackaged as a tailored offering. Fieldwork confirmed findings from the literature on the scope for adding value through information management advice, information skills training and substantive research assistance, and on the need to understand business processes and develop effective partnerships. Concluding recommendations focus on service definition, strategic marketing, resource utilisation and performance measurement.