
    Time trends and heterogeneity in the disease burden of visual impairment due to cataract, 1990–2019: A global analysis

    Objectives: This study aimed to estimate the disease burden of cataract and evaluate the contributions of risk factors to cataract-associated disability-adjusted life years (DALYs). Materials and methods: Prevalence and DALYs of visual impairment due to cataract were extracted from the Global Burden of Disease (GBD) study 2019 to explore time trends and annual changes. Regional and country-level socioeconomic indexes were obtained from open databases. Stepwise multiple linear regression was used to evaluate associations between the age-standardized DALY rate of cataract and potential predictors. Results: The global prevalence rate of visual impairment due to cataract rose by 58.45% to 1,253.9 per 100,000 population (95% CI: 1,103.3 to 1,417.7) in 2019, and the DALY rate rose by 32.18%, from 65.3 per 100,000 population (95% CI: 46.4 to 88.2) in 1990 to 86.3 per 100,000 population (95% CI: 61.5 to 116.4) in 2019. The stepwise multiple linear regression model showed that a higher prevalence of refractive error (β = 0.036, 95% CI: 0.022 to 0.050, P < 0.001), fewer physicians per 10,000 population (β = −0.959, 95% CI: −1.685 to −0.233, P = 0.010), and a lower HDI (β = −134.93, 95% CI: −209.84 to −60.02, P = 0.001) were associated with a higher disease burden of cataract. Conclusion: Substantial increases in the prevalence of visual impairment and in DALYs due to cataract were observed from 1990 to 2019. Successful global initiatives aimed at improving cataract surgical rates and quality, especially in regions with lower socioeconomic status, are a prerequisite to combating this growing burden in an aging society.
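
    As a concrete illustration of the stepwise regression described above, here is a minimal sketch in Python using statsmodels. The column names, candidate predictor set, and p-to-enter threshold are assumptions for illustration; the abstract does not specify the exact selection criteria used in the GBD analysis.

```python
# Minimal sketch of forward stepwise multiple linear regression, as used above
# to relate country-level age-standardized cataract DALY rates to candidate
# predictors. Column names and the p-to-enter threshold are assumptions.
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(df, outcome, candidates, p_enter=0.05):
    """Greedily add the candidate with the smallest p-value until none qualify."""
    selected = []
    while True:
        remaining = [c for c in candidates if c not in selected]
        if not remaining:
            break
        pvals = {}
        for c in remaining:
            X = sm.add_constant(df[selected + [c]])
            pvals[c] = sm.OLS(df[outcome], X).fit().pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
    return selected

# Hypothetical usage: df holds one row per country.
# chosen = forward_stepwise(df, "cataract_asdr",
#                           ["refractive_error_prev", "physicians_per_10k", "hdi"])
# final = sm.OLS(df["cataract_asdr"], sm.add_constant(df[chosen])).fit()
# print(final.summary())  # betas and 95% CIs, as reported in the abstract
```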

    Cognitive Performance Concomitant With Vision Acuity Predicts 13-Year Risk for Mortality

    Objective: To assess the joint impact of cognitive performance and visual acuity on mortality over 13 years of follow-up in a representative US sample. Methods: Data from National Health and Nutrition Examination Survey (NHANES) participants (≥18 years old) were linked with death records from the National Death Index (NDI), with mortality follow-up through December 31, 2011. Cognitive performance was evaluated by the Digit Symbol Substitution Test (DSST), and cognitive performance impairment was defined as a DSST score equal to or less than the median value in the study population. Visual impairment (VI) was defined as presenting visual acuity worse than 20/40 in the better-seeing eye. Risks of all-cause and cause-specific mortality were estimated with Cox proportional hazards models after adjusting for confounders. Results: A total of 2,550 participants aged 60 years and older from two waves of NHANES (1999–2000 and 2001–2002) were included in the current analysis. Over a median follow-up of 9.92 years, 952 (35.2%) died of all causes, of whom 239 (23.1%), 224 (24.0%), and 489 (52.9%) died from cardiovascular disease (CVD), cancer, and non-CVD/non-cancer causes, respectively. Cognitive performance impairment and VI each increased the risk of mortality. Co-presence of VI among cognitively impaired elderly persons predicted a nearly threefold increased risk of all-cause mortality (hazard ratio [HR], 2.74; 95% confidence interval [CI], 2.02–3.70; P < 0.001) and an almost fourfold higher risk of non-CVD/non-cancer mortality (HR, 3.72; 95% CI, 2.30–6.00; P < 0.001) compared with having neither impairment. Conclusion: People aged 60 years and over with poorer cognitive performance were at higher risk of long-term mortality, and were especially vulnerable when VI was also present. These findings have clinical implications for early preventive interventions.
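
    A minimal sketch of the kind of Cox proportional hazards analysis behind these hazard ratios, using the lifelines library on simulated stand-in data. The four-level joint exposure coding and the single adjustment covariate are assumptions; the abstract says only that the models adjusted for confounders.

```python
# Minimal sketch: Cox model with a four-level joint exposure (neither
# impairment = reference, VI only, cognitive impairment only, both).
# All data here are simulated stand-ins, not NHANES records.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
group = rng.choice(["neither", "vi_only", "cog_only", "both"],
                   size=n, p=[0.45, 0.10, 0.35, 0.10])

df = pd.DataFrame({
    "followup_years": rng.uniform(0.1, 13.0, n),  # time to death or censoring
    "died": rng.binomial(1, 0.35, n),             # all-cause death indicator
    "age": rng.uniform(60.0, 85.0, n),            # stand-in confounder
})
# Dummy-code the exposure against "neither" as the reference level.
dummies = pd.get_dummies(pd.Series(group)).astype(float).drop(columns="neither")
df = pd.concat([df, dummies], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
# exp(coef) for the "both" column plays the role of the HR of 2.74
# (95% CI 2.02-3.70) reported for both impairments vs neither.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```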

    Disease Burden of Chronic Kidney Disease Due to Hypertension From 1990 to 2019: A Global Analysis

    Background: Although it is widely known that hypertension is an important cause of chronic kidney disease (CKD), little detailed quantitative research exists on the burden of CKD due to hypertension. Objective: To estimate the global disease burden of CKD due to hypertension and to evaluate the association between socioeconomic factors and the country-level disease burden. Methods: We extracted disability-adjusted life-year (DALY) numbers, rates, and age-standardized rates of CKD due to hypertension from the Global Burden of Disease Study 2019 database to investigate time trends in the burden of CKD due to hypertension from 1990 to 2019. Stepwise multiple linear regression analysis was performed to evaluate correlations between the age-standardized DALY rate and socioeconomic and other related factors obtained from open databases. Results: Globally, from 1990 to 2019, DALY numbers caused by CKD due to hypertension increased by 125.2% (95% confidence interval [CI], 124.6 to 125.7%). The DALY rate increased by 55.7% (55.3 to 56.0%) to 128.8 (110.9 to 149.2) per 100,000 population, while age-standardized DALYs per 100,000 population increased by 10.9% (10.3 to 11.5%). In general, males and elderly people tended to have a higher disease burden. The burden of CKD due to hypertension varies greatly among countries. In the stepwise multiple linear regression model, the inequality-adjusted human development index (IHDI) (β = −161.1, 95% CI −238.1 to −84.2, P < 0.001) and the number of physicians per 10,000 people (β = −2.91, 95% CI −4.02 to −1.80, P < 0.001) were significantly negatively correlated with the age-standardized DALY rate when adjusted for IHDI, health access and quality (HAQ), number of physicians per 10,000 people, and the share of the population with at least some secondary education. Conclusion: Improving the average achievement and equality of distribution in health, education, and income, as well as increasing the number of physicians per 10,000 people, could help reduce the burden of CKD due to hypertension. These findings may inform efforts to optimize health policies aimed at reducing this burden.
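
    The age-standardized DALY rate compared across years above is a weighted average of age-specific rates under a fixed standard population, which is what separates the 10.9% age-standardized increase from the 55.7% crude-rate increase driven partly by population aging. A minimal sketch with made-up numbers (GBD uses its own standard population and much finer age groups):

```python
# Minimal sketch of age-standardization and percent change, the quantities
# compared across years above. Rates and weights are illustrative, not GBD values.
import numpy as np

# Age-specific DALY rates per 100,000 (hypothetical, 3 broad age groups).
rates_1990 = np.array([10.0, 120.0, 900.0])
rates_2019 = np.array([11.0, 130.0, 980.0])

# Fixed standard-population weights (must sum to 1). Using the same weights
# for both years removes the effect of population aging from the comparison.
weights = np.array([0.55, 0.35, 0.10])

asr_1990 = np.sum(rates_1990 * weights)
asr_2019 = np.sum(rates_2019 * weights)
pct_change = 100.0 * (asr_2019 - asr_1990) / asr_1990
print(f"ASR 1990 = {asr_1990:.1f}, ASR 2019 = {asr_2019:.1f}, "
      f"change = {pct_change:.1f}%")
```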

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (n = 483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty (OR 2.9 [1.8–4.6] for CFS 4 vs 1–3; OR 12.4 [6.2–24.5] for CFS 8 vs 1–3). Higher CFS was associated with reduced recognition of delirium (OR 0.7 [0.3–1.9] for CFS 4 compared with OR 0.2 [0.1–0.7] for CFS 8). Both associations were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
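
    A minimal sketch of the kind of logistic regression that yields odds ratios like those above: delirium status against CFS category, adjusted for age and dementia. Data, column names, and category groupings are hypothetical; the audit's exact modelling is not specified in the abstract.

```python
# Minimal sketch: delirium (yes/no) regressed on CFS category with age and
# dementia as covariates, yielding adjusted odds ratios per CFS level.
# All data here are simulated stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "delirium": rng.binomial(1, 0.16, n),
    # "1-3" sorts first alphabetically, so it acts as the reference level,
    # matching the "vs 1-3" comparisons in the abstract.
    "cfs": rng.choice(["1-3", "4", "5-7", "8"], n),
    "age": rng.uniform(65.0, 95.0, n),
    "dementia": rng.binomial(1, 0.25, n),
})

model = smf.logit("delirium ~ C(cfs) + age + dementia", data=df).fit()
odds_ratios = np.exp(model.params)   # e.g. adjusted OR for CFS 8 vs 1-3
ci = np.exp(model.conf_int())        # 95% confidence intervals
print(pd.concat([odds_ratios, ci], axis=1))
```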

    Abstracts from the Food Allergy and Anaphylaxis Meeting 2016


    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE: Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE: To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS: In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS: Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES: The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS: On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients were 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted ORs of 0.77 [95% bayesian credible interval, 0.58–1.06] for improvement for the ACE inhibitor and 0.76 [95% credible interval, 0.56–1.05] for the ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that the ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE: In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT0273570
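
    A minimal sketch of how the reported posterior probabilities relate to the bayesian cumulative (proportional-odds) logistic model: a single odds ratio shifts the whole distribution of organ support–free days, and the probability of harm is the posterior mass below OR = 1. The posterior draws below are simulated stand-ins whose spread is chosen to roughly mimic the reported summary, not trial data.

```python
# Minimal sketch: posterior summaries from draws of a treatment odds ratio,
# mirroring the quantities quoted above. Draws are simulated stand-ins.
import numpy as np

rng = np.random.default_rng(42)

# Pretend posterior draws of the ACE-inhibitor OR, centred near the reported
# median adjusted OR of 0.77 (log-normal spread chosen for illustration).
or_draws = np.exp(rng.normal(loc=np.log(0.77), scale=0.16, size=20_000))

median_or = np.median(or_draws)
cri = np.percentile(or_draws, [2.5, 97.5])  # 95% credible interval
p_harm = np.mean(or_draws < 1.0)            # P(treatment worsened outcome)
print(f"median OR {median_or:.2f}, 95% CrI {cri[0]:.2f}-{cri[1]:.2f}, "
      f"P(OR < 1) = {p_harm:.1%}")          # ~0.77, ~0.58-1.06, ~95%

def shifted_survival(control_surv, odds_ratio):
    """Proportional-odds meaning: for every cutoff k with 0 < P(Y > k) < 1,
    odds(Y > k | treated) = OR * odds(Y > k | control)."""
    odds = control_surv / (1.0 - control_surv)
    new_odds = odds_ratio * odds  # OR > 1 raises P(more support-free days)
    return new_odds / (1.0 + new_odds)
```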

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016) : part two

    Background: The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective only in specific cohorts of patients, and it has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd. Methods: We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background. Results: First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade significantly increased the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes: TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, surface expression of PD-1 seemed to be restricted to the tumor microenvironment, while CD4+ T cells also expressed high levels of PD-1 in lymphoid organs. Interestingly, we found that PD-1 levels were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001). Conclusions: We demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.