Trauma history and depression predict incomplete adherence to antiretroviral therapies in a low income country.
As antiretroviral therapy (ART) for HIV becomes increasingly available in low- and middle-income countries (LMICs), understanding reasons for lack of adherence is critical to stemming the tide of infections and improving health. Understanding the effect of psychosocial experiences and mental health symptomatology on ART adherence can help maximize the benefit of expanded ART programs by indicating the types of services that could be offered in combination with HIV care. The Coping with HIV/AIDS in Tanzania (CHAT) study is a longitudinal cohort study in the Kilimanjaro Region that included randomly selected HIV-infected (HIV+) participants from two local hospital-based HIV clinics and four free-standing voluntary HIV counselling and testing sites. Baseline data were collected in 2008 and 2009; this paper used data from 36-month follow-up interviews (N = 468). Regression analyses were used to identify factors associated with incomplete self-reported adherence to ART. Incomplete ART adherence was significantly more likely to be reported amongst participants who had experienced a greater number of childhood traumatic events: sexual abuse prior to puberty and the death in childhood of an immediate family member (not from suicide or homicide) were significantly more common in the non-adherent group, and other negative childhood events trended toward being more likely. Those with incomplete adherence also had higher depressive symptom severity and more post-traumatic stress disorder (PTSD) symptoms. In multivariable analyses, childhood trauma, depression, and financial sacrifice remained associated with incomplete adherence.
This is the first study to examine the effects of childhood trauma, depression and PTSD on HIV medication adherence in a low-income country facing a significant burden of HIV. Allocating HIV/AIDS spending toward integrating mental health services with HIV care is essential to creating systems that enhance medication adherence and maximize the potential of expanded antiretroviral access to improve health and reduce new infections.
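The multivariable analysis described above can be sketched as a logistic regression of incomplete adherence on predictors such as childhood trauma count and depressive symptom score. This is an illustrative reimplementation on synthetic data, not the authors' code; the predictors, coefficients and scoring below are hypothetical.

```python
import numpy as np

def logistic_fit(X, y, iters=25):
    """Fit a logistic regression by Newton-Raphson; returns coefficients."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))            # predicted probabilities
        grad = X.T @ (y - p)                             # score vector
        hess = X.T @ (X * (p * (1.0 - p))[:, None])      # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(1)
n = 468  # sample size reported in the abstract
trauma = rng.poisson(1.5, n).astype(float)     # hypothetical trauma event counts
depression = rng.normal(10.0, 4.0, n)          # hypothetical depression scores
true_logit = -2.0 + 0.3 * trauma + 0.08 * depression
incomplete = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), trauma, depression])
beta = logistic_fit(X, incomplete)
odds_ratios = np.exp(beta[1:])  # adjusted OR per trauma event / score point
```

Exponentiating the fitted coefficients gives adjusted odds ratios of the kind reported for childhood trauma and depression.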
Bioavailability in soils
The consumption of locally produced vegetables by humans may be an important exposure pathway for soil contaminants in many urban settings and for agricultural land use. Hence, prediction of metal and metalloid uptake by vegetables from contaminated soils is an important part of the Human Health Risk Assessment procedure. The behaviour of metals (cadmium, chromium, cobalt, copper, mercury, molybdenum, nickel, lead and zinc) and metalloids (arsenic, boron and selenium) in contaminated soils depends to a large extent on the intrinsic charge, valence and speciation of the contaminant ion, and on soil properties such as pH, redox status and contents of clay and/or organic matter. However, the chemistry and behaviour of the contaminant in soil alone cannot predict soil-to-plant transfer. Root uptake, root selectivity, ion interactions, rhizosphere processes, leaf uptake from the atmosphere, and plant partitioning are important processes that ultimately govern the accumulation of metals and metalloids in edible vegetable tissues. Mechanistic models to accurately describe all these processes have not yet been developed, let alone validated under field conditions. Hence, to estimate risks from vegetable consumption, empirical models have been used to correlate concentrations of metals and metalloids in contaminated soils, soil physico-chemical characteristics, and concentrations of elements in vegetable tissues. These models should only be used within the bounds of their calibration, and often need to be re-calibrated or validated using local soil and environmental conditions on a regional or site-specific basis.
Mike J. McLaughlin, Erik Smolders, Fien Degryse, and Rene Rietr
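Empirical soil-to-plant transfer models of the kind mentioned above are often log-linear regressions of plant concentration on soil concentration and a soil property such as pH. A minimal sketch, assuming a log10-linear model form; the calibration data and the choice of element are hypothetical.

```python
import numpy as np

def fit_transfer_model(c_soil, ph, c_plant):
    """Least-squares fit of log10(C_plant) = a + b*log10(C_soil) + c*pH."""
    X = np.column_stack([np.ones(len(c_soil)), np.log10(c_soil), ph])
    coef, *_ = np.linalg.lstsq(X, np.log10(c_plant), rcond=None)
    return coef  # (a, b, c)

def predict_plant(coef, c_soil, ph):
    """Predict plant concentration from soil concentration and pH."""
    a, b, c = coef
    return 10 ** (a + b * np.log10(c_soil) + c * ph)

# hypothetical calibration data: soil Cd (mg/kg), soil pH, vegetable Cd (mg/kg)
soil_cd = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
soil_ph = np.array([5.0, 5.5, 6.0, 6.5, 7.0])
plant_cd = np.array([0.40, 0.45, 0.50, 0.55, 0.60])
coef = fit_transfer_model(soil_cd, soil_ph, plant_cd)
```

As the abstract cautions, such a fit is only valid within its calibration range and should be re-calibrated for local soil conditions.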
Long-Term Secondary Care Costs of Endometrial Cancer: A Prospective Cohort Study Nested within the United Kingdom Collaborative Trial of Ovarian Cancer Screening (UKCTOCS).
BACKGROUND: There is limited evidence on the costs of endometrial cancer (EC) by stage of disease. We estimated the long-term secondary care costs of EC according to stage at diagnosis in an English population-based cohort. METHODS: Women participating in UKCTOCS who were diagnosed with EC following enrolment (2001-2005) and prior to 31st Dec 2009 were identified through multiple sources. Survival was calculated through data linkage to the death registry. Cost estimates were derived from hospital records accessed from Hospital Episode Statistics (HES), with additional patient-level covariates derived from case notes and patient questionnaires. Missing and censored data were imputed using multiple imputation. Regression analysis of cost and survival was undertaken. RESULTS: 491 of 641 women with EC were included. Five-year total costs were strongly dependent on stage, ranging from £9,475 (diagnosis at stage IA/IB) to £26,080 (diagnosis at stage III). Stage, grade and BMI were the strongest predictors of costs. The majority of costs for stage I/II EC were incurred in the first six months after diagnosis, while for stage III/IV considerable costs accrued after the first six months. CONCLUSIONS: In addition to survival advantages, there are significant cost savings if patients with EC are detected earlier.
The analysis underpinning this study was supported with a grant from Cancer Research UK (CRUK Grant No: A16008) awarded to RL (http://www.cancerresearchuk.org/funding-for-researchers). The trial (UKCTOCS) for which the patients in this study form a subgroup was funded by the Medical Research Council, Cancer Research UK, the Department of Health and the Eve Appeal.
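Multiple imputation of the kind used for the missing and censored cost data is typically followed by pooling the per-imputation estimates with Rubin's rules. A minimal sketch of the pooling step; the five stage-III cost estimates and their within-imputation variances below are hypothetical, not values from the study.

```python
import numpy as np

def rubins_rules(estimates, variances):
    """Pool M imputation-specific estimates and variances (Rubin's rules)."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    qbar = estimates.mean()                 # pooled point estimate
    ubar = variances.mean()                 # within-imputation variance
    b = estimates.var(ddof=1)               # between-imputation variance
    total_var = ubar + (1.0 + 1.0 / m) * b  # Rubin's total variance
    return qbar, total_var

# e.g. five imputed datasets, each yielding a stage-III five-year cost (GBP)
pooled_cost, pooled_var = rubins_rules(
    [26300.0, 25800.0, 26100.0, 26500.0, 25700.0],
    [1.2e6] * 5,
)
```

The between-imputation term inflates the variance to reflect uncertainty introduced by the missing data itself.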
Genetic overlap between endometriosis and endometrial cancer: evidence from cross-disease genetic correlation and GWAS meta-analyses.
Epidemiological, biological, and molecular data suggest links between endometriosis and endometrial cancer, with recent epidemiological studies providing evidence for an association between a previous diagnosis of endometriosis and risk of endometrial cancer. We used genetic data as an alternative approach to investigate the shared biological etiology of these two diseases. Genetic correlation analysis of summary-level statistics from genome-wide association studies (GWAS) using LD Score regression revealed a moderate but significant genetic correlation (rg = 0.23, P = 9.3 × 10⁻³), and SNP effect concordance analysis provided evidence for significant SNP pleiotropy (P = 6.0 × 10⁻³) and concordance in effect direction (P = 2.0 × 10⁻³) between the two diseases. Cross-disease GWAS meta-analysis highlighted 13 distinct loci associated at P ≤ 10⁻⁵ with both endometriosis and endometrial cancer, with one locus (SNP rs2475335), located within PTPRD, associated at a genome-wide significant level (P = 4.9 × 10⁻⁸, OR = 1.11, 95% CI = 1.07-1.15). PTPRD acts in the STAT3 pathway, which has been implicated in both endometriosis and endometrial cancer. This study demonstrates the value of cross-disease genetic analysis to support epidemiological observations and to identify biological pathways of relevance to multiple diseases.
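A cross-disease GWAS meta-analysis of the kind described typically combines per-disease SNP effect estimates by fixed-effect inverse-variance weighting. A minimal sketch for a single SNP; the log-odds effect sizes and standard errors below are hypothetical, not values from the study.

```python
import math

def ivw_meta(betas, ses):
    """Fixed-effect inverse-variance-weighted meta-analysis of one SNP."""
    weights = [1.0 / se**2 for se in ses]                 # precision weights
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                    # pooled standard error
    z = beta / se                                         # test statistic
    return beta, se, z

# hypothetical per-disease log-odds ratios for one shared SNP
beta, se, z = ivw_meta([0.10, 0.12], [0.02, 0.03])
```

The pooled z-statistic is what would be converted to the cross-disease P value reported for each locus.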
Nucleases as a barrier to gene silencing in the cotton boll weevil, Anthonomus grandis.
The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study
AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only, and a sensitivity analysis explored the impact of longer delays. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and more comorbid, and to have a higher body mass index, rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delay is likely to be due to micro-metastatic disease.
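The unadjusted comparison of complete-resection rates (93.7% vs. 91.9%, P = 0.032) can be reproduced approximately with a two-proportion z-test. The event counts below are reconstructed from the reported percentages and group sizes, so they are approximate rather than the study's exact counts.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sample z-statistic for a difference in proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1.0 - p) * (1.0 / n1 + 1.0 / n2))
    return (p1 - p2) / se

# delayed: ~93.7% of 1744 patients; non-delayed: ~91.9% of 2560 patients
z = two_proportion_z(1634, 1744, 2353, 2560)
```

A z-statistic a little above 2 corresponds to a two-sided P value near 0.03, consistent with the reported unadjusted result.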