69 research outputs found

    Energy density and weight change in a long-term weight-loss trial

    Background: Health risks linked to obesity and the difficulty most people have in achieving weight loss underscore the importance of identifying dietary factors that contribute to successful weight loss. Methods: This study examined the association between change in dietary energy density and weight loss over time. Subjects were 213 men and women with BMI of 30–39 kg/m² and without chronic illness, enrolled in 2004 in a randomized trial evaluating behavioral treatments for long-term weight loss. Subjects completed a 62-item food frequency questionnaire at baseline and at 6, 12, and 18 months. Results: Pearson correlations between BMI and energy density (kcal/g of solid food) at baseline were not significantly different from zero (r = -0.02, p = 0.84). In a longitudinal analysis, change in energy density was strongly related to change in BMI. The estimated β for change in BMI (kg/m²) in the quartile with the greatest decrease in energy density at 18 months, compared with the quartile with the least, was -1.95 (p = 0.006). The association was especially strong in the first six months (estimated β = -1.43), the period with the greatest weight loss (mean change in BMI = -2.50 kg/m² from 0–6 months vs. 0.23 kg/m² from 12–18 months) and the greatest contrast in change in energy density. Conclusion: Decreased energy density predicted weight loss in this 18-month weight-loss study. These findings may have important implications for individual dietary advice and for public health policies targeting weight control in the general population.
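
    The core quantity here is simple: dietary energy density is kilocalories divided by grams of solid food, and the analysis asks whether the change in that ratio tracks the change in BMI. The sketch below illustrates that calculation and a quartile-based contrast in Python; it is not the trial's analysis code, and the file and column names (ffq_visits.csv, kcal_solid, grams_solid, bmi) are hypothetical.

        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per subject per visit (months 0, 6, 12, 18); columns are hypothetical.
        ffq = pd.read_csv("ffq_visits.csv")
        ffq["ed"] = ffq["kcal_solid"] / ffq["grams_solid"]   # energy density, kcal per gram

        ed = ffq.pivot(index="subject_id", columns="month", values="ed")
        bmi = ffq.pivot(index="subject_id", columns="month", values="bmi")

        df = pd.DataFrame({
            "d_ed": ed[18] - ed[0],     # 0-18 month change in energy density
            "d_bmi": bmi[18] - bmi[0],  # 0-18 month change in BMI (kg/m^2)
        })

        # Quartiles of energy-density change: Q1 = greatest decrease, Q4 = least.
        df["ed_quartile"] = pd.qcut(df["d_ed"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

        # Contrast the greatest-decrease quartile against the least-decrease quartile,
        # loosely mirroring the reported beta of about -1.95 kg/m^2.
        model = smf.ols("d_bmi ~ C(ed_quartile, Treatment(reference='Q4'))", data=df).fit()
        print(model.params)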

    Practice change in chronic conditions care: an appraisal of theories

    Background: Management of chronic conditions can be complex and burdensome for patients, and complex and costly for health systems. Outcomes could be improved and costs reduced if proven clinical interventions were better implemented, but the complexity of chronic care services appears to make clinical change particularly challenging. Explicit use of theories may improve the success of clinical change in this area of care provision. Whilst theories to support implementation of practice change are apparent in the broad healthcare arena, the theories most applicable to the complexities of practice change in chronic care have not yet been identified. Methods: We developed criteria to review the usefulness of change implementation theories for informing chronic care management and applied them to an existing list of theories used more widely in healthcare. Results: Criteria related to the following characteristics of chronic care: breadth of the field; multi-disciplinarity; micro, meso and macro program levels; need for field-specific research on implementation requirements; and need for measurement. Six theories met the criteria to the greatest extent: the Consolidated Framework for Implementation Research; Normalization Process Theory and its extension, the General Theory of Implementation; two versions of the Promoting Action on Research Implementation in Health Services framework; and Sticky Knowledge. None fully met all criteria. Involvement of several care provision organizations and groups, involvement of patients and carers, and policy-level change are not well covered by most theories. However, adaptation may be possible to include multiple groups, including patients and carers, and separate theories may be needed on policy change. Ways of qualitatively assessing theory constructs are available, but quantitative measures are currently partial and under development for all theories. Conclusions: Theoretical bases are available to structure clinical change research in chronic condition care. Theories will, however, need to be adapted and supplemented to account for the particular features of care in this field, particularly in relation to involvement of multiple organizations and groups, including patients, and in relation to policy influence. Quantitative measurement of theory constructs may present difficulties.

    DNA mismatch repair gene MSH6 implicated in determining age at natural menopause

    The length of female reproductive lifespan is associated with multiple adverse outcomes, including breast cancer, cardiovascular disease and infertility. The biological processes that govern the timing of the beginning and end of reproductive life are not well understood. Genetic variants are known to contribute to ~50% of the variation in both age at menarche and menopause, but to date the known genes explain <15% of the genetic component. We have used genome-wide association in a bivariate meta-analysis of both traits to identify genes involved in determining reproductive lifespan. We observed significant genetic correlation between the two traits using genome-wide complex trait analysis. However, we found no robust statistical evidence for individual variants with an effect on both traits. A novel association with age at menopause was detected for the variant rs1800932 in the mismatch repair gene MSH6 (P = 1.9 × 10⁻⁹), which was also associated with altered expression levels of MSH6 mRNA in multiple tissues. This study contributes to the growing evidence that DNA repair processes play a key role in ovarian ageing and could be an important therapeutic target for infertility. Funding: UK Medical Research Council; Wellcome Trust.
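
    For orientation, the sketch below shows what a single-variant association test of the kind that yields a genome-wide significant P value looks like: regress age at natural menopause on genotype dosage with covariate adjustment and compare against the conventional 5 × 10⁻⁸ threshold, which the reported P = 1.9 × 10⁻⁹ for rs1800932 clears. This is a generic illustration, not the study's bivariate meta-analysis pipeline; the file and column names (pheno.csv, dosage_rs1800932, pc1–pc4) are hypothetical.

        import pandas as pd
        import statsmodels.formula.api as smf

        GENOME_WIDE_ALPHA = 5e-8   # conventional genome-wide significance threshold

        # One row per woman: age at natural menopause, genotype dosage, ancestry PCs.
        df = pd.read_csv("pheno.csv")
        model = smf.ols(
            "age_menopause ~ dosage_rs1800932 + pc1 + pc2 + pc3 + pc4",
            data=df,
        ).fit()

        beta = model.params["dosage_rs1800932"]
        pval = model.pvalues["dosage_rs1800932"]
        print(f"per-allele effect = {beta:.3f} years, P = {pval:.2e}")
        print("genome-wide significant" if pval < GENOME_WIDE_ALPHA else "not significant")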

    Fc-Optimized Anti-CD25 Depletes Tumor-Infiltrating Regulatory T Cells and Synergizes with PD-1 Blockade to Eradicate Established Tumors

    CD25 is expressed at high levels on regulatory T (Treg) cells and was initially proposed as a target for cancer immunotherapy. However, anti-CD25 antibodies have displayed limited activity against established tumors. We demonstrated that CD25 expression is largely restricted to tumor-infiltrating Treg cells in mice and humans. While existing anti-CD25 antibodies were observed to deplete Treg cells in the periphery, upregulation of the inhibitory Fc gamma receptor (FcγR) IIb at the tumor site prevented intra-tumoral Treg cell depletion, which may underlie the lack of anti-tumor activity previously observed in pre-clinical models. Use of an anti-CD25 antibody with enhanced binding to activating FcγRs led to effective depletion of tumor-infiltrating Treg cells, increased effector-to-Treg cell ratios, and improved control of established tumors. Combination with anti-programmed cell death protein-1 antibodies promoted complete tumor rejection, demonstrating the relevance of CD25 as a therapeutic target and a promising substrate for future combination approaches in immuno-oncology.

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
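
    As a rough illustration of the general technique described above (not the published CholeS model or its coefficients), the sketch below fits a logistic regression for operations lasting more than 90 minutes, converts the coefficients into a simple integer score, and checks discrimination on an external cohort with the area under the ROC curve. The predictor list, file names and scoring scheme are all hypothetical.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        predictors = ["asa", "age_band", "prev_admissions", "bmi_band",
                      "thick_walled_gb", "cbd_dilated"]          # hypothetical encodings
        y_col = "op_over_90min"                                  # 1 if duration > 90 min

        derivation = pd.read_csv("derivation_cohort.csv")        # hypothetical files
        validation = pd.read_csv("validation_cohort.csv")

        model = LogisticRegression(max_iter=1000)
        model.fit(derivation[predictors], derivation[y_col])

        # Convert log-odds coefficients into integer points (a simple scoring scheme).
        points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
        print("score weights:", dict(zip(predictors, points)))

        # External validation: apply the score to the second cohort and measure the AUC.
        scores = validation[predictors].to_numpy() @ points
        print(f"external validation AUC = {roc_auc_score(validation[y_col], scores):.3f}")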

    Allele-Specific HLA Loss and Immune Escape in Lung Cancer Evolution

    Immune evasion is a hallmark of cancer. Losing the ability to present neoantigens through human leukocyte antigen (HLA) loss may facilitate immune evasion. However, the polymorphic nature of the locus has precluded accurate HLA copy-number analysis. Here, we present loss of heterozygosity in human leukocyte antigen (LOHHLA), a computational tool to determine HLA allele-specific copy number from sequencing data. Using LOHHLA, we find that HLA LOH occurs in 40% of non-small-cell lung cancers (NSCLCs) and is associated with a high subclonal neoantigen burden, APOBEC-mediated mutagenesis, upregulation of cytolytic activity, and PD-L1 positivity. The focal nature of HLA LOH alterations, their subclonal frequencies, enrichment in metastatic sites, and occurrence as parallel events suggest that HLA LOH is an immune escape mechanism subject to strong microenvironmental selection pressures later in tumor evolution. Characterizing HLA LOH with LOHHLA refines neoantigen prediction and may have implications for our understanding of resistance mechanisms and immunotherapeutic approaches targeting neoantigens. Development of the bioinformatics tool LOHHLA allows precise measurement of allele-specific HLA copy number, improves the accuracy of neoantigen prediction, and uncovers insights into how immune escape contributes to tumor evolution in non-small-cell lung cancer.
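
    To make the idea of allele-specific copy number concrete, the sketch below works through a deliberately simplified version of the logic: once reads have been assigned to each of a patient's two HLA alleles, the depth-normalized tumor coverage of an allele in a tumor–normal mixture of purity ρ is roughly ρ·n + (1 − ρ)·1, which can be inverted to estimate that allele's copy number n and flag loss. This is an illustration under those assumptions, not LOHHLA's implementation, and the 0.5 loss threshold is illustrative rather than the tool's calling rule.

        def allele_copy_number(tumor_depth: float, normal_depth: float, purity: float) -> float:
            """Estimate the tumor copy number of one HLA allele.

            tumor_depth and normal_depth are depth-normalized coverages of this allele;
            the normal sample corresponds to exactly one germline copy.
            """
            r = tumor_depth / normal_depth          # observed tumor/normal coverage ratio
            return (r - (1.0 - purity)) / purity    # invert the purity mixture model

        def call_hla_loh(tumor_a, tumor_b, normal_a, normal_b, purity, loss_threshold=0.5):
            """Flag allele-specific loss if one allele's estimated copy number falls below
            the threshold while the other allele is retained (illustrative rule only)."""
            n_a = allele_copy_number(tumor_a, normal_a, purity)
            n_b = allele_copy_number(tumor_b, normal_b, purity)
            lost = min(n_a, n_b) < loss_threshold <= max(n_a, n_b)
            return n_a, n_b, lost

        # Toy numbers: coverage of allele B collapses in the tumor despite 60% purity.
        print(call_hla_loh(tumor_a=120, tumor_b=45, normal_a=100, normal_b=100, purity=0.6))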

    Phylogenetic ctDNA analysis depicts early-stage lung cancer evolution.

    The early detection of relapse following primary surgery for non-small-cell lung cancer and the characterization of emerging subclones, which seed metastatic sites, might offer new therapeutic approaches for limiting tumour recurrence. The ability to track the evolutionary dynamics of early-stage lung cancer non-invasively in circulating tumour DNA (ctDNA) has not yet been demonstrated. Here we use a tumour-specific phylogenetic approach to profile the ctDNA of the first 100 TRACERx (Tracking Non-Small-Cell Lung Cancer Evolution Through Therapy (Rx)) study participants, including one patient who was also recruited to the PEACE (Posthumous Evaluation of Advanced Cancer Environment) post-mortem study. We identify independent predictors of ctDNA release and analyse the tumour-volume detection limit. Through blinded profiling of postoperative plasma, we observe evidence of adjuvant chemotherapy resistance and identify patients who are very likely to experience recurrence of their lung cancer. Finally, we show that phylogenetic ctDNA profiling tracks the subclonal nature of lung cancer relapse and metastasis, providing a new approach for ctDNA-driven therapeutic studies.
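
    The phylogenetic tracking step can be pictured as follows: each tumour SNV is assigned to a clone from the tumour's phylogenetic tree, and a clone is called detected in a plasma sample when its variants collectively show more mutant reads than the assay's background error rate would produce. The sketch below is a schematic of that idea only; it is not the TRACERx analysis pipeline, and the error rate, significance threshold and toy read counts are illustrative assumptions.

        from collections import defaultdict
        from scipy.stats import binomtest

        BACKGROUND_ERROR = 1e-4   # assumed per-base background error rate of the assay
        ALPHA = 0.01              # illustrative significance threshold

        def detect_clones(plasma_calls, alpha=ALPHA):
            """plasma_calls: one dict per tracked SNV with keys clone, alt_reads, depth."""
            by_clone = defaultdict(lambda: [0, 0])
            for call in plasma_calls:
                by_clone[call["clone"]][0] += call["alt_reads"]
                by_clone[call["clone"]][1] += call["depth"]

            detected = {}
            for clone, (alt, depth) in by_clone.items():
                # One-sided test: more mutant reads than background error alone would give?
                p = binomtest(alt, depth, BACKGROUND_ERROR, alternative="greater").pvalue
                detected[clone] = (alt / depth if depth else 0.0, p < alpha)
            return detected

        # Toy example: the truncal clone is clearly present; a low-level subclone is not
        # distinguishable from background at this depth.
        plasma = [
            {"clone": "trunk",    "alt_reads": 40, "depth": 20000},
            {"clone": "subclone", "alt_reads": 3,  "depth": 15000},
        ]
        print(detect_clones(plasma))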

    Multiorgan MRI findings after hospitalisation with COVID-19 in the UK (C-MORE): a prospective, multicentre, observational cohort study

    Introduction: The multiorgan impact of moderate to severe coronavirus infections in the post-acute phase is still poorly understood. We aimed to evaluate the excess burden of multiorgan abnormalities after hospitalisation with COVID-19, evaluate their determinants, and explore associations with patient-related outcome measures. Methods: In a prospective, UK-wide, multicentre MRI follow-up study (C-MORE), adults (aged ≥18 years) discharged from hospital following COVID-19 who were included in Tier 2 of the Post-hospitalisation COVID-19 study (PHOSP-COVID) and contemporary controls with no evidence of previous COVID-19 (SARS-CoV-2 nucleocapsid antibody negative) underwent multiorgan MRI (lungs, heart, brain, liver, and kidneys) with quantitative and qualitative assessment of images and clinical adjudication when relevant. Individuals with end-stage renal failure or contraindications to MRI were excluded. Participants also underwent detailed recording of symptoms, and physiological and biochemical tests. The primary outcome was the excess burden of multiorgan abnormalities (two or more organs) relative to controls, with further adjustments for potential confounders. The C-MORE study is ongoing and is registered with ClinicalTrials.gov, NCT04510025. Findings: Of 2710 participants in Tier 2 of PHOSP-COVID, 531 were recruited across 13 UK-wide C-MORE sites. After exclusions, 259 C-MORE patients (mean age 57 years [SD 12]; 158 [61%] male and 101 [39%] female) who were discharged from hospital with PCR-confirmed or clinically diagnosed COVID-19 between March 1, 2020, and Nov 1, 2021, and 52 non-COVID-19 controls from the community (mean age 49 years [SD 14]; 30 [58%] male and 22 [42%] female) were included in the analysis. Patients were assessed at a median of 5·0 months (IQR 4·2–6·3) after hospital discharge. Compared with non-COVID-19 controls, patients were older, more likely to be living with obesity, and had more comorbidities. Multiorgan abnormalities on MRI were more frequent in patients than in controls (157 [61%] of 259 vs 14 [27%] of 52; p<0·0001) and independently associated with COVID-19 status (odds ratio [OR] 2·9 [95% CI 1·5–5·8]; adjusted p=0·0023) after adjusting for relevant confounders. Compared with controls, patients were more likely to have MRI evidence of lung abnormalities (p=0·0001; parenchymal abnormalities), brain abnormalities (p<0·0001; more white matter hyperintensities and regional brain volume reduction), and kidney abnormalities (p=0·014; lower medullary T1 and loss of corticomedullary differentiation), whereas cardiac and liver MRI abnormalities were similar between patients and controls. Patients with multiorgan abnormalities were older (difference in mean age 7 years [95% CI 4–10]; mean age 59·8 years [SD 11·7] with multiorgan abnormalities vs 52·8 years [SD 11·9] without; p<0·0001), more likely to have three or more comorbidities (OR 2·47 [1·32–4·82]; adjusted p=0·0059), and more likely to have had a more severe acute infection (acute CRP >5 mg/L; OR 3·55 [1·23–11·88]; adjusted p=0·025) than those without multiorgan abnormalities. Presence of lung MRI abnormalities was associated with a two-fold higher risk of chest tightness, and multiorgan MRI abnormalities were associated with severe and very severe persistent physical and mental health impairment (PHOSP-COVID symptom clusters) after hospitalisation. Interpretation: After hospitalisation for COVID-19, people are at risk of multiorgan abnormalities in the medium term. Our findings emphasise the need for proactive multidisciplinary care pathways, with the potential for imaging to guide surveillance frequency and therapeutic stratification.
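
    As a quick numerical check of the headline contrast, the crude (unadjusted) odds ratio for multiorgan abnormalities can be computed directly from the reported counts (157 of 259 patients vs 14 of 52 controls) with the standard normal approximation on the log odds ratio; it comes out higher than the published OR of 2·9 because the latter is adjusted for confounders. The sketch below only reproduces the raw 2×2-table arithmetic.

        import math

        a, b = 157, 259 - 157   # patients: with / without multiorgan abnormalities
        c, d = 14, 52 - 14      # controls: with / without

        or_crude = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_crude) - 1.96 * se_log_or)
        hi = math.exp(math.log(or_crude) + 1.96 * se_log_or)
        print(f"crude OR = {or_crude:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # ~4.2 (2.2-8.1)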

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
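
    The reported posterior probabilities of harm can be roughly recovered from the published odds ratios and 95% credible intervals by assuming the posterior of the log odds ratio is approximately normal. The sketch below is only that back-of-the-envelope reconstruction, not the trial's bayesian cumulative logistic model; it shows how an OR below 1 with a credible interval just crossing 1 translates into a roughly 95% probability of worse organ support–free days.

        import math
        from scipy.stats import norm

        def prob_harm(or_point: float, ci_low: float, ci_high: float) -> float:
            """P(OR < 1) under a normal approximation to the log-OR posterior."""
            mu = math.log(or_point)
            sigma = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
            return norm.cdf(0.0, loc=mu, scale=sigma)

        print(f"ACE inhibitor: {prob_harm(0.77, 0.58, 1.06):.1%}")  # abstract reports 94.9%
        print(f"ARB:           {prob_harm(0.76, 0.56, 1.05):.1%}")  # abstract reports 95.4%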