
    Screening and Management of Coronary Artery Disease in Kidney Transplant Candidates

    Cardiovascular disease (CVD) is a major cause of morbidity and mortality in patients with chronic kidney disease (CKD), especially in end-stage renal disease (ESRD) patients and during the first year after transplantation. For these reasons, and because of the shortage of organs available for transplant, it is of utmost importance to identify patients with a good life expectancy after transplant and to minimize peri-operative transplant risk. Various conditions, such as severe pulmonary disease, recent myocardial infarction or stroke, and severe aorto-iliac atherosclerosis, need to be ruled out before a patient is added to the transplant waiting list. The effectiveness of systematic coronary artery disease (CAD) treatment before kidney transplant is still debated, and a non-tailored screening approach could lead to unnecessary invasive procedures and delay or exclude some patients from transplantation. Although several clinical guidelines on CAD screening in kidney transplant candidates exist, to date there is no universally adopted protocol. This review summarizes the key points of cardiovascular risk assessment in renal transplant candidates and addresses the role of noninvasive cardiovascular imaging tools and the impact of coronary revascularization versus best medical therapy before kidney transplant on patients' cardiovascular outcomes.

    Executive functioning and serum lipid fractions in Parkinson’s disease—a possible sex-effect: the PACOS study

    The association between dyslipidemia and cognitive performance in Parkinson's disease (PD) patients still needs to be clarified. The aim of the study was to evaluate possible associations between serum lipid fractions and executive dysfunction, also exploring the sex-specific contribution of lipid levels to cognition. Patients from the PACOS cohort who underwent complete serum lipid profile measurement (total cholesterol-TC, low-density lipoprotein cholesterol-LDL, high-density lipoprotein cholesterol-HDL and triglycerides-TG) were selected. The Adult Treatment Panel III guidelines of the National Cholesterol Education Program were used to classify normal/abnormal lipid fractions. Executive functioning was assessed with the Frontal Assessment Battery (FAB). Logistic regression was performed to assess associations between lipid fractions and FAB score, correlations between lipid fractions and FAB score were explored, and a sex-stratified analysis was performed. Three hundred and forty-eight PD patients (148 women; age 66.5 ± 9.5 years; disease duration 3.9 ± 4.9 years) were enrolled. Women presented significantly higher TC, LDL and HDL than men. In the whole sample, no association between lipid profile measures and FAB score was found. Among women, a positive association between hypertriglyceridemia and a FAB score under the cutoff was found (OR 3.4; 95%CI 1.29–9.03; p value 0.013), and a statistically significant negative correlation was found between the FAB score and triglyceride serum levels (r = −0.226; p value 0.005). Conversely, among men, statistically significant negative associations were found between hypercholesterolemia and a FAB score under the cutoff (OR 0.4; 95%CI 0.17–0.84; p value 0.018) and between high LDL levels and a FAB score under the cutoff (OR 0.4; 95%CI 0.18–0.90; p value 0.027). Our data suggest a sex-specific role of lipids in executive functioning.
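    The sex-stratified associations above are reported as odds ratios with 95% confidence intervals. As a rough illustration of how such an estimate is obtained (not the study's actual analysis), the point estimate and a Wald-type CI can be computed from a 2×2 table; the counts below are hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: hypertriglyceridemia vs. FAB score under cutoff
or_est, ci_lo, ci_hi = odds_ratio_ci(12, 18, 20, 98)
```

In the actual study the estimates came from logistic regression, which additionally allows adjustment for covariates; the 2×2 formula above is only the unadjusted special case.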

    Changes in Motor, Cognitive, and Behavioral Symptoms in Parkinson's Disease and Mild Cognitive Impairment During the COVID-19 Lockdown

    Objective: The effects of the COVID-19 lockdown on subjects with prodromal phases of dementia are unknown. The aim of this study was to evaluate motor, cognitive, and behavioral changes during the COVID-19 lockdown in Italy in patients with Parkinson's disease (PD) with and without mild cognitive impairment (PD-MCI and PD-NC) and in patients with MCI not associated with PD (MCInoPD). Methods: A total of 34 patients with PD-NC, 31 with PD-MCI, and 31 with MCInoPD, together with their caregivers, were interviewed 10 weeks after the start of the COVID-19 lockdown in Italy, and changes in cognitive, behavioral, and motor symptoms were examined. Modified standardized scales, including the Neuropsychiatric Inventory (NPI) and the Movement Disorder Society-Unified Parkinson's Disease Rating Scale (MDS-UPDRS) Parts I and II, were administered. Multivariate logistic regression was used to evaluate associated covariates by comparing PD-NC vs. PD-MCI and MCInoPD vs. PD-MCI. Results: All groups showed a worsening of cognitive symptoms (39.6%), pre-existing behavioral symptoms (37.5%), new-onset behavioral symptoms (26%), and motor symptoms (35.4%) during the COVID-19 lockdown, resulting in an increased caregiver burden in 26% of cases. After multivariate analysis, PD-MCI was significantly and positively associated with instrumental activities of daily living (IADL) lost during quarantine (OR 3.9, CI 1.61–9.58) when compared to PD-NC. In the analysis of MCInoPD vs. PD-MCI, the latter showed a statistically significant worsening of motor symptoms (OR 7.4, CI 1.09–45.44). Regarding NPI items, nighttime behaviors statistically differed between MCInoPD and PD-MCI (16.1% vs. 48.4%, p = 0.007). MDS-UPDRS Parts I and II revealed that PD-MCI showed a significantly higher frequency of cognitive impairment (p = 0.034), fatigue (p = 0.036), and speech problems (p = 0.013) than PD-NC. PD-MCI also showed significantly higher frequencies in several MDS-UPDRS items compared to MCInoPD, particularly regarding pain (p = 0.001), turning in bed (p = 0.006), getting out of bed (p = 0.001), and walking and balance (p = 0.003). Conclusion: The COVID-19 quarantine is associated with a worsening of cognitive, behavioral, and motor symptoms in subjects with PD and MCI, particularly in PD-MCI. Specific strategies are needed to contain the effects of quarantine in patients with PD and cognitive impairment and in their caregivers.
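    Group comparisons such as the nighttime-behaviors frequency (16.1% vs. 48.4%, p = 0.007) can be reproduced approximately with a two-proportion z-test; the sketch below back-calculates the counts from the reported percentages (5/31 vs. 15/31) and is illustrative only, not the study's actual analysis:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from normal tail
    return z, p_value

# 5/31 (16.1%) MCInoPD vs. 15/31 (48.4%) PD-MCI with nighttime behaviors
z_stat, p_val = two_proportion_z(5, 31, 15, 31)
```

The resulting p-value is close to the 0.007 reported; the original analysis may have used a chi-square or Fisher exact test, which differ slightly for small samples.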

    The Effects of Sacubitril/Valsartan on Clinical, Biochemical and Echocardiographic Parameters in Patients with Heart Failure with Reduced Ejection Fraction: The "Hemodynamic Recovery".

    Abstract: Background: Sacubitril/valsartan has been shown to be superior to enalapril in reducing the risks of death and hospitalization for heart failure (HF). However, knowledge of its impact on cardiac performance remains limited. We sought to evaluate the effects of sacubitril/valsartan on clinical, biochemical and echocardiographic parameters in patients with heart failure with reduced ejection fraction (HFrEF). Methods: Sacubitril/valsartan was administered to 230 HFrEF patients. Results: Among the 230 patients (mean age 59 ± 10 years, 46% with ischemic heart disease), 205 (89%) completed the study. After a follow-up of 10.49 (range 2.93–18.44) months, the percentage of patients in New York Heart Association (NYHA) class III changed from 40% to 17% (p < 0.001). N-terminal pro-B-type natriuretic peptide (NT-proBNP) decreased from 1865 ± 2318 to 1514 ± 2205 pg/mL (p = 0.01). The furosemide dose was reduced from 131.3 ± 154.5 to 120 ± 142.5 (p = 0.047). Ejection fraction (from 27 ± 5.9% to 30 ± 7.7%; p < 0.001) and E/A ratio (from 1.67 ± 1.21 to 1.42 ± 1.12; p = 0.002) improved. Moderate-to-severe mitral regurgitation decreased (from 30.1% to 17.4%; p = 0.002), and tricuspid regurgitation velocity decreased from 2.8 ± 0.55 m/s to 2.64 ± 0.59 m/s (p = 0.014). Conclusions: Sacubitril/valsartan induces a “hemodynamic recovery” and, consistently with the reduction in NT-proBNP concentrations, improves NYHA class despite diuretic dose reduction.

    Timing of Symptoms of Early-Onset Sepsis after Intrapartum Antibiotic Prophylaxis: Can It Inform the Neonatal Management?

    The effectiveness of “inadequate” intrapartum antibiotic prophylaxis (IAP administered < 4 h prior to delivery) in preventing early-onset sepsis (EOS) is debated. Italian prospective surveillance cohort data (2003–2022) were used to study the type and duration of IAP according to the timing of symptom onset in group B streptococcus (GBS) and E. coli culture-confirmed EOS cases. IAP was defined as “active” when the pathogen yielded in cultures was susceptible to the administered agent. We identified 263 EOS cases (GBS = 191; E. coli = 72). Among GBS EOS cases, 25% had received IAP (always active when beta-lactams were administered). Most IAP-exposed neonates with GBS EOS were symptomatic at birth (67%) or remained asymptomatic (25%), regardless of IAP duration. Among E. coli EOS cases, 60% were IAP-exposed. However, IAP was active in only 8% of cases, and these newborns remained asymptomatic or presented with symptoms prior to 6 h of life. In contrast, most newborns exposed to an “inactive” IAP (52%) developed symptoms from 1 to >48 h of life. The key element defining IAP as “adequate” seems to be the pathogen's antimicrobial susceptibility rather than the duration of prophylaxis. Newborns exposed to an active antimicrobial (as frequently occurs with GBS infections) who remain asymptomatic in the first 6 h of life are likely uninfected. Because E. coli isolates are often unsusceptible to beta-lactam antibiotics, IAP-exposed neonates frequently develop symptoms of EOS after birth, up to 48 h of life and beyond.

    Development and Validation of a Comprehensive Model to Estimate Early Allograft Failure among Patients Requiring Early Liver Retransplant

    Importance: Expansion of donor acceptance criteria for liver transplant has increased the risk of early allograft failure (EAF), and although EAF prediction is pivotal to optimize transplant outcomes, there is no consensus on specific EAF indicators or on the timing of EAF evaluation. Recently, the Liver Graft Assessment Following Transplantation (L-GrAFT) algorithm, based on aspartate transaminase, bilirubin, platelet, and international normalized ratio kinetics, was developed from a single-center database gathered from 2002 to 2015. Objective: To develop and validate a simplified comprehensive model estimating, at day 10 after liver transplant, the risk of EAF at day 90 (the Early Allograft Failure Simplified Estimation [EASE] score) and, secondarily, to identify early those patients with unsustainable EAF risk who are suitable for retransplant. Design, Setting, and Participants: This multicenter cohort study was designed to develop a score capturing a continuum from normal graft function to nonfunction after transplant. Both parenchymal and vascular factors, which provide an indication to list for retransplant, were included among the EAF determinants. The L-GrAFT kinetic approach was adopted and modified with fewer data entries and novel variables. The population included 1609 patients in Italy for the derivation set and 538 patients in the UK for the validation set; all were patients who underwent transplant in 2016 and 2017. Main Outcomes and Measures: Early allograft failure was defined as graft failure (codified by retransplant or death) for any reason within 90 days after transplant. Results: At day 90 after transplant, the incidence of EAF was 110 of 1609 patients (6.8%) in the derivation set and 41 of 538 patients (7.6%) in the external validation set. Median (interquartile range) ages were 57 (51-62) years in the derivation data set and 56 (49-62) years in the validation data set. 
    The EASE score was developed through 17 entries derived from 8 variables, including the Model for End-stage Liver Disease score, blood transfusion, early thrombosis of hepatic vessels, and kinetic parameters of transaminases, platelet count, and bilirubin. Donor parameters (age, donation after cardiac death, and machine perfusion) were not associated with EAF risk. Results were adjusted for transplant center volume. In receiver operating characteristic curve analyses, the EASE score outperformed the L-GrAFT, Model for Early Allograft Function, Early Allograft Dysfunction, Eurotransplant Donor Risk Index, donor age × Model for End-stage Liver Disease, and Donor Risk Index scores, estimating day 90 EAF in 87% (95% CI, 83%-91%) of cases in both the derivation data set and the internal validation data set. Patients could be stratified into 5 classes, with those in the highest class exhibiting unsustainable EAF risk. Conclusions and Relevance: This study found that the developed EASE score reliably estimated EAF risk. Knowledge of contributing factors may help clinicians to mitigate risk factors and guide them through the challenging clinical decision to allocate patients to early liver retransplant. The EASE score may be used in translational research across transplant centers.
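    The discrimination figure reported for the EASE score (87%, 95% CI 83%-91%) is an area under the receiver operating characteristic curve (AUC). As a minimal, generic illustration (toy scores, not study data), the AUC can be computed directly via its Mann–Whitney interpretation, i.e., the probability that a randomly chosen EAF case scores higher than a randomly chosen non-case:

```python
def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as P(score of a positive > score of a negative); ties count half."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy example: patients with EAF tend to receive higher risk scores
auc = auc_mann_whitney([0.9, 0.8, 0.7], [0.6, 0.5, 0.4, 0.7])
```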

    Analysis of Mycobacterium tuberculosis-Specific CD8 T-Cells in Patients with Active Tuberculosis and in Individuals with Latent Infection

    CD8 T-cells contribute to the control of Mycobacterium tuberculosis infection, but little is known about the quality of the CD8 T-cell response in subjects with latent infection and in patients with active tuberculosis disease. CD8 T-cells recognizing epitopes from 6 different proteins of Mycobacterium tuberculosis were detected by tetramer staining. Intracellular cytokine staining for specific production of IFN-γ and IL-2 was performed, complemented by phenotyping of memory markers on antigen-specific CD8 T-cells. The ex-vivo frequencies of tetramer-specific CD8 T-cells in tuberculous patients before therapy were lower than in subjects with latent infection, but increased at four months after therapy to percentages comparable to those detected in subjects with latent infection. The majority of CD8 T-cells from subjects with latent infection expressed a terminally differentiated phenotype (CD45RA+CCR7−). In contrast, only 35% of antigen-specific CD8 T-cells from tuberculous patients expressed this phenotype, with higher proportions of cells exhibiting effector memory- and central memory-like phenotypes, which did not change significantly after therapy. CD8 T-cells from subjects with latent infection showed a codominance of IL-2+/IFN-γ+ and IL-2−/IFN-γ+ T-cell populations; interestingly, only the IL-2+/IFN-γ+ population was reduced or absent in tuberculous patients, highly suggestive of a restricted functional profile of Mycobacterium tuberculosis-specific CD8 T-cells during active disease. These results suggest distinct Mycobacterium tuberculosis-specific CD8 T-cell phenotypic and functional signatures between subjects who control infection (subjects with latent infection) and those who do not (patients with active disease).

    Endo-therapies for biliary duct-to-duct anastomotic stricture after liver transplantation: outcomes of a nationwide survey

    BACKGROUND: The most appropriate endo-therapeutic approach to biliary anastomotic strictures has yet to be defined. AIM: To retrospectively report on the endo-therapy of duct-to-duct anastomotic strictures performed during 2013 in Italy. METHODS: Data were collected from 16 Endoscopy Units at the Italian Liver Transplantation Centers (BASALT study group). RESULTS: Complete endo-therapy and follow-up data are available for 181 patients: 101 treated with plastic multistenting, 26 with fully covered self-expandable metal stenting (SEMS) and 54 with single stenting. Radiological success was achieved in 145 patients (80%): 88% with plastic multistenting, 88% with SEMS and 61% with single stenting (p < 0.001 vs. plastic multistenting; p < 0.05 vs. SEMS). After first-line endo-therapy failure, patients underwent second-line endo-therapy with plastic multistenting in 25%, fully covered SEMS in 53% and single stenting in 22% of cases; radiological success was achieved in 84% overall, i.e., 100%, 85% and 63% with plastic multistenting, SEMS and single stenting, respectively (p < 0.05 vs. plastic multistenting or SEMS). Procedure-related complications occurred in 7.8% of ERCP procedures. Overall clinical success was achieved in 87% of patients after a median follow-up of 25 months. CONCLUSION: Plastic multistenting is confirmed as the preferred first-line treatment, while fully covered SEMS is the rescue option for biliary anastomotic strictures. Single stenting has sub-optimal results and should be abandoned.

    The Italian data on SARS-CoV-2 infection in transplanted patients support an organ specific immune response in liver recipients

    Introduction: The study of the immune response to SARS-CoV-2 infection in different solid organ transplant settings represents an opportunity to clarify the interplay between SARS-CoV-2 and the immune system. In our nationwide registry study from Italy, we evaluated, during the first pandemic wave, i.e., in non-vaccinated patients, the prevalence of COVID-19 infection, mortality, and lethality in liver transplant recipients (LTRs), using non-liver solid organ transplant recipients (NL-SOTRs) and the Italian general population (GP) as comparators. Methods: Case collection ran from February 21 to June 22, 2020, using data from the National Institute of Health and the National Transplant Center, whereas the data analysis was performed on September 30, 2020. To compare the sex- and age-adjusted distribution of infection, mortality, and lethality in LTRs, NL-SOTRs, and the Italian GP, we applied an indirect standardization method to determine the standardized rate. Results: Among the 43,983 Italian SOTRs with a functioning graft, LTRs accounted for 14,168 patients, of whom 89 were SARS-CoV-2 infected. In the 29,815 NL-SOTRs, 361 cases of SARS-CoV-2 infection were observed. The geographical distribution of the disease was highly variable across the Italian regions. The standardized rates of infection, mortality, and lethality in LTRs were lower than in NL-SOTRs [1.02 (95%CI 0.81-1.23) vs. 2.01 (95%CI 1.8-2.2); 1.0 (95%CI 0.5-1.5) vs. 4.5 (95%CI 3.6-5.3); 1.6 (95%CI 0.7-2.4) vs. 2.8 (95%CI 2.2-3.3), respectively] and comparable to those of the Italian GP. Discussion: In line with the most recent studies on SOTRs and SARS-CoV-2 infection, our data strongly suggest that, in contrast to what was observed in NL-SOTRs receiving similar immunosuppressive therapy, LTRs have the same risk of SARS-CoV-2 infection, mortality, and lethality observed in the general population. 
    These results suggest an immune response to SARS-CoV-2 infection in LTRs that differs from that of NL-SOTRs, probably related to the ability of the grafted liver to induce immunotolerance.
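    The indirect standardization method used above compares the observed number of events in a cohort with the number expected if stratum-specific (e.g., age- and sex-specific) general-population rates applied to the cohort's structure. A toy sketch, in which all strata, sizes, and rates are invented for illustration:

```python
def standardized_ratio(observed, cohort_strata, reference_rates):
    """Indirectly standardized ratio = observed / expected events,
    where expected = sum over strata of stratum size * reference rate."""
    expected = sum(cohort_strata[k] * reference_rates[k] for k in cohort_strata)
    return observed / expected

# Hypothetical cohort structure and general-population infection rates
cohort = {"40-59": 6000, "60-79": 8000}        # patients per age stratum
gp_rates = {"40-59": 0.002, "60-79": 0.005}    # infections per person
ratio = standardized_ratio(45, cohort, gp_rates)
```

A ratio near 1 (as found for LTRs vs. the GP) indicates that, after adjusting for the age-sex structure, the cohort experienced about as many events as the reference population would predict.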

    Association of kidney disease measures with risk of renal function worsening in patients with type 1 diabetes

    Background: Albuminuria has classically been considered a marker of kidney damage progression in diabetic patients and is routinely assessed to monitor kidney function. However, the role of a mild GFR reduction in the development of stage ≥3 CKD has been less explored in type 1 diabetes mellitus (T1DM) patients. The aim of the present study was to evaluate the prognostic role of kidney disease measures, namely albuminuria and reduced GFR, in the development of stage ≥3 CKD in a large cohort of patients affected by T1DM. Methods: A total of 4284 patients affected by T1DM followed up at 76 diabetes centers participating in the Italian Association of Clinical Diabetologists (Associazione Medici Diabetologi, AMD) initiative constitute the study population. Urinary albumin-to-creatinine ratio (ACR) and estimated GFR (eGFR) were retrieved and analyzed. The incidence of stage ≥3 CKD (eGFR < 60 mL/min/1.73 m2) or an eGFR reduction > 30% from baseline was evaluated. Results: The mean estimated GFR was 98 ± 17 mL/min/1.73 m2 and the proportion of patients with albuminuria was 15.3% (n = 654) at baseline. About 8% (n = 337) of patients developed one of the two renal endpoints during the 4-year follow-up period. Age, albuminuria (micro or macro) and a baseline eGFR < 90 mL/min/1.73 m2 were independent risk factors for stage ≥3 CKD and renal function worsening. When compared to patients with eGFR > 90 mL/min/1.73 m2 and normoalbuminuria, those with albuminuria at baseline had a 1.69-fold greater risk of reaching stage ≥3 CKD, while patients with a mild eGFR reduction (i.e. eGFR between 90 and 60 mL/min/1.73 m2) showed a 3.81-fold greater risk, which rose to 8.24-fold for patients with both albuminuria and a mild eGFR reduction at baseline. Conclusions: Albuminuria and eGFR reduction represent independent risk factors for incident stage ≥3 CKD in T1DM patients. The simultaneous occurrence of reduced eGFR and albuminuria has a synergistic effect on renal function worsening.
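    The composite renal endpoint used in this study is simple to state operationally: incident stage ≥3 CKD (eGFR < 60 mL/min/1.73 m2) or an eGFR reduction of more than 30% from baseline. A minimal sketch of that classification rule:

```python
def reached_renal_endpoint(egfr_baseline, egfr_followup):
    """Composite endpoint: incident stage >=3 CKD (eGFR < 60 mL/min/1.73 m2)
    or an eGFR reduction > 30% from baseline."""
    stage3 = egfr_followup < 60.0
    big_drop = (egfr_baseline - egfr_followup) / egfr_baseline > 0.30
    return stage3 or big_drop
```

For example, a patient starting at the cohort mean of 98 mL/min/1.73 m2 meets the endpoint either by falling below 60 or by dropping below roughly 68.6 (a > 30% loss).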