
    A complex increase in hepatitis C virus in a correctional facility: bumps in the road

    Objective: The prevalence of hepatitis C virus (HCV) among people who inject drugs in Australian correctional facilities is 60%, with disproportionate effects observed in Aboriginal and Torres Strait Islander people. Following the micro-elimination of HCV in a Queensland correctional facility (QCF), newly acquired cases began to increase in mid-2019. Here we discuss the public health response to increasing HCV in a QCF. Methods: Enhanced surveillance was performed to obtain contextual outbreak data on risk factors, including injecting drug use, sharing of personal hygiene equipment and do-it-yourself tattooing. Results: Over the sixteen-month period, there were 250 notifications of new HCV infections and re-infections among prisoners in the QCF. Qualitative data revealed that the leading transmission factor was injecting drug use. Conclusions: Drivers of increased HCV transmission in correctional facilities include boredom, waiting lists for opioid substitution programs, changes in injecting behaviours and sharing of injecting paraphernalia. Point-of-care testing combined with education and the development of a needle and syringe program may be promising ways forward for managing HCV in correctional facilities. Implications for public health: Correctional facilities are key locations for targeting sexually transmitted infection (STI) and blood-borne virus (BBV) testing and treatment, as well as health promotion, to improve the health of inmates and the communities they return to.

    Multicenter, prospective cohort study of oesophageal injuries and related clinical outcomes (MUSOIC study)

    Objective: To identify prognostic factors associated with 90-day mortality in patients with oesophageal perforation (OP), and to characterise the specific timeline from presentation to intervention and its relation to mortality. Background: OP is a rare gastro-intestinal surgical emergency with a high mortality rate. However, there is no updated evidence on its outcomes in the context of centralised oesophago-gastric services, updated consensus guidelines, and novel non-surgical treatment strategies. Methods: A multi-centre, prospective cohort study involving eight high-volume oesophago-gastric centres (January 2016 to December 2020) was undertaken. The primary outcome measure was 90-day mortality. Secondary measures included length of hospital and ICU stay, and complications requiring re-intervention or re-admission. Mortality model training was performed using random forest, support-vector machines, and logistic regression with and without elastic net regularisation. Chronological analysis was performed by examining each patient’s journey timepoint with reference to symptom onset. Results: The mortality rate for the 369 patients included was 18.9%. Patients treated conservatively, endoscopically, surgically, or with combined approaches had mortality rates of 24.1%, 23.7%, 8.7%, and 18.2%, respectively. The predictive variables for mortality were Charlson comorbidity index, haemoglobin count, leucocyte count, creatinine levels, cause of perforation, presence of cancer, hospital transfer, CT findings, whether a contrast swallow was performed, and intervention type. A stepwise interval model showed that time to diagnosis was the most significant contributor to mortality. Conclusion: Non-surgical strategies have better outcomes and may be preferred in selected cohorts to manage perforations. Outcomes can be significantly improved through better risk stratification based on the aforementioned modifiable risk factors.
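
    As a hedged illustration of the model-comparison step described in the Methods, the sketch below compares the three named model families on synthetic data with scikit-learn. The class balance mirrors the reported 18.9% mortality rate, but the features, data and hyperparameters are assumptions for illustration, not the MUSOIC dataset or analysis code.

        # Illustrative sketch only: compares random forest, SVM, and logistic
        # regression with/without elastic net on synthetic data. Features are
        # stand-ins, not the study's predictor variables.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # ~18.9% positive class, echoing the reported 90-day mortality rate
        X, y = make_classification(n_samples=369, n_features=10,
                                   weights=[0.811], random_state=0)

        models = {
            "random_forest": RandomForestClassifier(n_estimators=500, random_state=0),
            "svm": make_pipeline(StandardScaler(), SVC()),
            "logistic": make_pipeline(StandardScaler(),
                                      LogisticRegression(max_iter=5000)),
            "logistic_elastic_net": make_pipeline(
                StandardScaler(),
                LogisticRegression(penalty="elasticnet", solver="saga",
                                   l1_ratio=0.5, max_iter=5000)),
        }

        # Compare discrimination via cross-validated ROC AUC
        for name, model in models.items():
            auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
            print(f"{name}: AUC {auc.mean():.3f} +/- {auc.std():.3f}")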

    Children’s and adolescents’ rising animal-source food intakes in 1990–2018 were impacted by age, region, parental education and urbanicity

    Animal-source foods (ASF) provide nutrition for children and adolescents’ physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world’s child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.
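
    The intake estimates above come from Bayesian hierarchical modelling of survey data. A minimal sketch of that general technique is below, written with PyMC; the hierarchy (country means partially pooled within regions), the priors and the simulated intake data are all assumptions for illustration, not the Global Dietary Database model specification.

        # Illustrative sketch: country-level mean intakes partially pooled
        # within regions. All numbers and priors are made up for the example.
        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(0)
        n_countries, n_regions = 185, 7
        region_of = rng.integers(0, n_regions, n_countries)  # country -> region
        obs_country = rng.integers(0, n_countries, 2000)     # survey row -> country
        intake = rng.gamma(2.0, 1.0, 2000)                   # fake servings/day

        with pm.Model() as model:
            mu_global = pm.Normal("mu_global", 2.0, 1.0)           # global mean
            sd_region = pm.HalfNormal("sd_region", 1.0)
            mu_region = pm.Normal("mu_region", mu_global, sd_region,
                                  shape=n_regions)                 # regional means
            sd_country = pm.HalfNormal("sd_country", 1.0)
            mu_country = pm.Normal("mu_country", mu_region[region_of],
                                   sd_country, shape=n_countries)  # country means
            sd_obs = pm.HalfNormal("sd_obs", 1.0)
            pm.Normal("y", mu_country[obs_country], sd_obs, observed=intake)
            idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)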

    Incident type 2 diabetes attributable to suboptimal diet in 184 countries

    The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases were estimated to be attributable to suboptimal intake of these dietary factors, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)), and the lowest were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and among higher versus lower educated individuals, except in high-income countries, central and eastern Europe and central Asia, where burdens were larger in rural residents and in lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.
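
    The attributable-case estimates above rest on population attributable fraction (PAF) arithmetic. Below is a minimal worked sketch of that calculation using Levin's formula for a single exposure; the prevalence, relative risk and case counts are invented for illustration, not the study's inputs.

        # Illustrative PAF arithmetic; all numbers are made up.
        def paf(prevalence: float, relative_risk: float) -> float:
            """Levin's formula: fraction of cases attributable to an exposure."""
            excess = prevalence * (relative_risk - 1.0)
            return excess / (excess + 1.0)

        # e.g. if 60% of a population had insufficient whole-grain intake and
        # that exposure carried a relative risk of 1.5 for incident T2D:
        frac = paf(0.60, 1.5)             # = 0.3 / 1.3, about 23.1%
        attributable = frac * 20_000_000  # of 20M hypothetical new cases
        print(f"PAF = {frac:.1%}, ~{attributable / 1e6:.1f}M attributable cases")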

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected in the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between April and May 2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the probability of an operation lasting > 90 min rising more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
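
    A hedged sketch of the derive-then-validate workflow the Methods describe is below: fit a logistic model for long (> 90 min) operations, round the coefficients into an integer risk score, and check its discrimination on a held-out cohort with ROC AUC. The synthetic data and the coefficient-to-points scaling are assumptions for illustration, not the CholeS variables or the published score.

        # Illustrative sketch only; data and scaling are stand-ins.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # ~18% of operations "long", echoing the reported 17.7%
        X, y = make_classification(n_samples=9632, n_features=10,
                                   weights=[0.82], random_state=1)
        X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.25,
                                                      random_state=1)

        model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

        # Round coefficients to integer points, a common score-building step
        points = np.round(model.coef_[0] * 2).astype(int)
        score = X_val @ points

        print(f"external-validation AUC = {roc_auc_score(y_val, score):.3f}")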

    A comparative analysis of whole genome sequencing of esophageal adenocarcinoma pre- and post-chemotherapy

    The scientific community has avoided using tissue samples from patients who have been exposed to systemic chemotherapy to infer the genomic landscape of a given cancer. Esophageal adenocarcinoma is a heterogeneous, chemoresistant tumor for which the availability and size of pretreatment endoscopic samples are limiting. This study compares whole-genome sequencing data obtained from chemo-naive and chemo-treated samples. The quality of whole-genome sequencing data is comparable across all samples regardless of chemotherapy status. Inclusion of samples collected post-chemotherapy increased the proportion of late-stage tumors. When comparing matched pre- and post-chemotherapy samples from 10 cases, the mutational signatures, copy number, and SNV mutational profiles reflect the expected heterogeneity in this disease. Analysis of SNVs in relation to allele-specific copy-number changes pinpoints the common ancestor to a point prior to chemotherapy. For cases in which pre- and post-chemotherapy samples do show substantial differences, the timing of the divergence is near-synchronous with endoreduplication. Comparison across a large prospective cohort (62 treatment-naive, 58 chemotherapy-treated samples) reveals no significant differences in the overall mutation rate, mutation signatures, specific recurrent point mutations, or copy-number events with respect to chemotherapy status. In conclusion, whole-genome sequencing of samples obtained following neoadjuvant chemotherapy is representative of the genomic landscape of esophageal adenocarcinoma. Excluding these samples reduces the material available for cataloging and introduces a bias toward the earlier stages of cancer.

    This study was partly funded by a project grant from Cancer Research UK. R.C.F. is funded by an NIHR Professorship and receives core funding from the Medical Research Council and infrastructure support from the Biomedical Research Centre and the Experimental Cancer Medicine Centre. We acknowledge the support of The University of Cambridge, Cancer Research UK (C14303/A17197) and Hutchison Whampoa Limited.
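
    One of the cohort comparisons reported above (no significant difference in overall mutation rate by chemotherapy status) can be sketched as a simple two-sample test. The simulated mutation burdens below are placeholders, and a rank-based test is only one reasonable choice; the study's actual statistical methodology may differ.

        # Illustrative sketch: compare per-sample mutation burden between
        # chemo-naive and chemo-treated groups. Numbers are simulated.
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(0)
        naive = rng.lognormal(mean=9.0, sigma=0.5, size=62)    # 62 treatment-naive
        treated = rng.lognormal(mean=9.0, sigma=0.5, size=58)  # 58 chemo-treated

        stat, p = mannwhitneyu(naive, treated)
        print(f"Mann-Whitney U = {stat:.0f}, p = {p:.3f}")
        # A non-significant p would mirror the reported finding of no
        # difference in mutation rate by chemotherapy status.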

    CXCR5+ follicular cytotoxic T cells control viral infection in B cell follicles

    During unresolved infections, some viruses escape immunological control and establish a persistent reservoir in certain cell types, such as human immunodeficiency virus (HIV), which persists in follicular helper T cells (TFH cells), and Epstein-Barr virus (EBV), which persists in B cells. Here we identified a specialized group of cytotoxic T cells (TC cells) that expressed the chemokine receptor CXCR5, selectively entered B cell follicles and eradicated infected TFH cells and B cells. The differentiation of these cells, which we have called 'follicular cytotoxic T cells' (TFC cells), required the transcription factors Bcl6, E2A and TCF-1 but was inhibited by the transcriptional regulators Blimp1, Id2 and Id3. Blimp1 and E2A directly regulated Cxcr5 expression and, together with Bcl6 and TCF-1, formed a transcriptional circuit that guided TFC cell development. The identification of TFC cells has far-reaching implications for the development of strategies to control infections that target B cells and TFH cells and to treat B cell–derived malignancies.