34 research outputs found

    Histone Deacetylase Inhibitors Impair Antibacterial Defenses of Macrophages

    Histone deacetylases (HDACs) control gene expression by deacetylating histones and nonhistone proteins. HDAC inhibitors (HDACi) are powerful anticancer drugs that exert anti-inflammatory and immunomodulatory activities. We recently reported a proof-of-concept study demonstrating that HDACi increase susceptibility to bacterial infections in vivo. Yet little is known about the effects of HDACi on antimicrobial innate immune defenses. Here we show that HDACi belonging to different chemical classes inhibit the response of macrophages to bacterial infection at multiple levels. HDACi reduce the phagocytosis and killing of Escherichia coli and Staphylococcus aureus by macrophages. In line with these findings, HDACi decrease the expression of phagocytic receptors and inhibit bacteria-induced production of reactive oxygen and nitrogen species by macrophages. Consistently, HDACi impair the expression of nicotinamide adenine dinucleotide phosphate (NADPH) oxidase subunits and inducible nitric oxide synthase. These data indicate that HDACi have a strong impact on critical antimicrobial defense mechanisms in macrophages.

    Solidification microstructure of centrifugally cast Inconel 625

    Centrifugal casting is a foundry process allowing the production of near-net-shaped, axially symmetrical components. The present study focuses on the microstructural characterization of centrifugally cast alloys featuring different chemical compositions, used to produce spheres for high-pressure valves made of alloy IN625. Control of the solidification microstructure is needed to assure the reliability of the castings, since a Ni-base superalloy such as IN625 should offer an outstanding combination of mechanical properties, high-temperature stability, and corrosion resistance. Alloys such as IN625 contain a large amount of alloying elements and have a wide solidification range, so they can be affected by micro-porosity defects related to the shrinkage difference between the matrix and the secondary reinforcing phases (Nb-rich carbides and Laves phase). In this study, the microstructure characterization was performed as a function of the applied heat treatments and was coupled with a calorimetric analysis in order to understand the mechanism governing the formation of micro-porosities, with the aim of assuring alloy soundness. The results show that the presence of micro-porosities is governed by the morphology and size of the secondary phases, and that the observed secondary phases are detrimental to corrosion resistance.

    Epidemiology and outcomes of medically attended and microbiologically confirmed bacterial foodborne infections in solid organ transplant recipients

    Food-safety measures are recommended to solid organ transplant (SOT) recipients. However, the burden of foodborne infections in SOT recipients has not been established. We describe the epidemiology and outcomes of bacterial foodborne infections in a nationwide cohort including 4405 SOT recipients in Switzerland between 2008 and 2018. Participants were prospectively followed for a median of 4.2 years, with systematic collection of data on infections and patient- and graft-related outcomes. We identified 151 episodes of microbiologically confirmed bacterial foodborne infection, occurring at a median of 1.6 years (IQR 0.58-3.40) after transplantation (131 [88%] Campylobacter spp. and 15 [10%] non-typhoidal Salmonella). The cumulative incidence of bacterial foodborne infections was 4% (95% CI 3.4-4.8). Standardized incidence rates were 7.4 (95% CI 6.2-8.7) and 4.6 (95% CI 2.6-7.5) for Campylobacter and Salmonella infections, respectively. Invasive infection was more common with Salmonella (33.3% [5/15]) than with Campylobacter (3.2% [4/125]; p = .001). Hospital and ICU admission rates were 47.7% (69/145) and 4.1% (6/145), respectively. A composite endpoint of acute rejection, graft loss, or death occurred within 30 days in 3.3% (5/151) of cases. In conclusion, in our cohort bacterial foodborne infections were late post-transplant infections and were associated with significant morbidity, supporting the need for implementation of food-safety recommendations.
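    As a rough illustration of how such rates are derived, a crude incidence rate is the number of episodes divided by the accumulated follow-up time. This is a minimal sketch only: the person-year total below is an assumption built from the reported cohort size and median follow-up, and the standardization against the general population used in the study is not reproduced.

    ```python
    # Minimal sketch: crude incidence rate per 1000 person-years.
    # The person-year denominator here is an assumption (patients x median
    # follow-up), not the study's exact accumulated follow-up time.

    def crude_incidence_rate(episodes: int, person_years: float, per: float = 1000.0) -> float:
        """Episodes per `per` person-years of follow-up."""
        return episodes / person_years * per

    # Illustrative numbers: 151 episodes over ~4405 patients * 4.2 years.
    print(round(crude_incidence_rate(151, 4405 * 4.2), 2))  # ~8.16 per 1000 person-years
    ```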

    Infection Risk in the First Year After ABO-incompatible Kidney Transplantation: A Nationwide Prospective Cohort Study.

    BACKGROUND ABO-incompatible (ABOi) kidney transplantation (KT) expands the kidney donor pool and may help to overcome the organ shortage. Nonetheless, concerns about infectious complications associated with ABOi-KT have been raised. METHODS In a nationwide cohort (Swiss Transplant Cohort Study), we compared the risk for infectious complications among ABOi and ABO-compatible (ABOc) renal transplant recipients. Infections needed to fulfill rigorous, prespecified criteria to be classified as clinically relevant. Unadjusted and adjusted competing risk regression models were used to compare the time to the first clinically relevant infection among ABOi-KT and ABOc-KT recipients. Inverse probability weighted generalized mixed-effects Poisson regression was used to estimate incidence rate ratios for infection. RESULTS We included 757 living-donor KT recipients (639 ABOc; 118 ABOi) and identified 717 infection episodes. The spectrum of causative pathogens and the anatomical sites affected by infections were similar between ABOi-KT and ABOc-KT recipients. There was no significant difference in time to first posttransplant infection between ABOi-KT and ABOc-KT recipients (subhazard ratio, 1.24; 95% confidence interval [CI], 0.93-1.66; P = 0.142). At 1 y, the crude infection rate was 1.11 (95% CI, 0.93-1.33) episodes per patient-year for ABOi patients and 0.94 (95% CI, 0.86-1.01) for ABOc-KT recipients. Inverse probability weighted infection rates were similar between groups (adjusted incidence rate ratio, 1.12; 95% CI, 0.83-1.52; P = 0.461). CONCLUSIONS The burden of infections during the first year posttransplant was high but not relevantly different in ABOi-KT and ABOc-KT recipients. Our results highlight that concerns regarding infectious complications should not affect the implementation of ABOi-KT programs.

    Immunogenicity of High-Dose vs. MF59-adjuvanted vs. Standard Influenza Vaccine in Solid Organ Transplant Recipients: The STOP-FLU trial.

    BACKGROUND The immunogenicity of the standard influenza vaccine is reduced in solid-organ transplant (SOT) recipients, so new vaccination strategies are needed in this population. METHODS Adult SOT recipients from nine transplant clinics in Switzerland and Spain were enrolled if they were >3 months after transplantation. Patients were randomized to receive the standard, the MF59-adjuvanted, or the high-dose influenza vaccine, with stratification by organ and time from transplant. The primary outcome was vaccine response rate, defined as a ≥4-fold increase of hemagglutination-inhibition titers to at least one vaccine strain at 28 days post-vaccination. Secondary outcomes included PCR-confirmed influenza and vaccine reactogenicity. RESULTS 619 patients were randomized, 616 received the assigned vaccines, and 598 had serum available for analysis of the primary endpoint (standard, n=198; MF59-adjuvanted, n=205; high-dose, n=195). Vaccine response rates were 42% (84/198) in the standard vaccine group, 60% (122/205) in the MF59-adjuvanted vaccine group, and 66% (129/195) in the high-dose vaccine group (difference in intervention vaccines vs. standard vaccine, 0.20 [97.5% CI 0.12-1]; p<0.001; difference in high-dose vs. standard vaccine, 0.24 [95% CI 0.16-1]; p<0.001; difference in MF59-adjuvanted vs. standard vaccine, 0.17 [97.5% CI 0.08-1]; p<0.001). Influenza occurred in 6% of the standard, 5% of the MF59-adjuvanted, and 7% of the high-dose vaccine groups. Vaccine-related adverse events occurred more frequently in the intervention vaccine groups, but most of the events were mild. CONCLUSIONS In SOT recipients, use of an MF59-adjuvanted or a high-dose influenza vaccine was safe and resulted in a higher vaccine response rate. TRIAL REGISTRATION Clinicaltrials.gov NCT03699839.
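    A minimal sketch of the primary-endpoint criterion as stated above (a ≥4-fold rise in hemagglutination-inhibition titer to at least one strain); the per-strain data structure, strain names, and titer values are hypothetical, not taken from the trial.

    ```python
    # Hypothetical sketch of the response criterion: a subject is a vaccine
    # responder if any vaccine strain shows a >=4-fold rise in HI titer at
    # day 28. Strain names and titers below are illustrative only.

    def is_vaccine_responder(pre: dict[str, int], post: dict[str, int]) -> bool:
        """True if any strain shows a >=4-fold increase in HI titer."""
        return any(post[strain] >= 4 * pre[strain] for strain in pre)

    pre_titers = {"H1N1": 10, "H3N2": 20, "B": 40}
    post_titers = {"H1N1": 80, "H3N2": 40, "B": 40}
    print(is_vaccine_responder(pre_titers, post_titers))  # True (H1N1: 8-fold rise)
    ```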

    Immune monitoring-guided vs fixed duration of antiviral prophylaxis against cytomegalovirus in solid-organ transplant recipients. A Multicenter, Randomized Clinical Trial

    BACKGROUND: The use of assays detecting cytomegalovirus (CMV)-specific T-cell-mediated immunity may individualize the duration of antiviral prophylaxis in transplant recipients. METHODS: In this open-label randomized trial, adult kidney and liver transplant recipients from six centers in Switzerland were enrolled if they were CMV-seronegative with seropositive donors or CMV-seropositive and receiving anti-thymocyte globulins. Patients were randomized to a duration of antiviral prophylaxis based on immune monitoring (intervention) or to a fixed duration (control). Patients in the control group were planned to receive 180 days (CMV-seronegative) or 90 days (CMV-seropositive) of valganciclovir. Patients were assessed monthly with a CMV-specific interferon gamma release assay (T-Track® CMV); prophylaxis in the intervention group was stopped if the assay was positive. The co-primary outcomes were the proportion of patients with clinically significant CMV infection and the reduction in days of prophylaxis. Between-group differences were adjusted for CMV serostatus. RESULTS: Overall, 193 patients were randomized (92 in the immune-monitoring group and 101 in the control group), of whom 185 had evaluation of the primary endpoint (87 and 98 patients, respectively). Clinically significant CMV infection occurred in 26/87 (adjusted percentage, 30.9%) in the immune-monitoring group and in 32/98 (adjusted percentage, 31.1%) in the control group (adjusted risk difference, -0.1%; 95% CI, -13.0% to 12.7%; p = 0.064). The duration of antiviral prophylaxis was shorter in the immune-monitoring group (adjusted difference, -26.0 days; 95% CI, -41.1 to -10.8 days; p < 0.001). CONCLUSIONS: Immune monitoring resulted in a significant reduction of antiviral prophylaxis, but we were unable to establish noninferiority of this approach on the co-primary endpoint of CMV infection.
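    As a rough, unadjusted illustration of the risk-difference arithmetic behind the infection endpoint (the study's own estimates are adjusted for CMV serostatus, so the numbers differ slightly, and the noninferiority margin is not reproduced here):

    ```python
    from math import sqrt

    # Unadjusted Wald sketch of the between-group risk difference; the trial
    # reports serostatus-adjusted values (-0.1%, 95% CI -13.0% to 12.7%).
    events_im, n_im = 26, 87      # immune-monitoring group
    events_ctrl, n_ctrl = 32, 98  # control group

    p1, p2 = events_im / n_im, events_ctrl / n_ctrl
    rd = p1 - p2
    se = sqrt(p1 * (1 - p1) / n_im + p2 * (1 - p2) / n_ctrl)
    lo, hi = rd - 1.96 * se, rd + 1.96 * se
    print(f"risk difference {rd:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
    # -> risk difference -2.8% (95% CI -16.1% to +10.6%): an interval this
    # wide cannot exclude a clinically relevant increase in CMV infection,
    # which is why noninferiority was not established.
    ```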

    Electrophysiological neuromuscular alterations and severe fatigue predict long-term muscle weakness in survivors of COVID-19 acute respiratory distress syndrome

    Introduction: Long-term weakness is common in survivors of COVID-19-associated acute respiratory distress syndrome (CARDS). We longitudinally assessed the predictors of muscle weakness in patients evaluated 6 and 12 months after intensive care unit discharge with in-person visits. Methods: Muscle strength was measured by isometric maximal voluntary contraction (MVC) of the tibialis anterior muscle. Candidate predictors of muscle weakness were follow-up time, sex, age, mechanical ventilation duration, use of steroids in the intensive care unit, the compound muscle action potential of the tibialis anterior muscle (CMAP-TA-S100), a 6-min walk test, severe fatigue, depression and anxiety, post-traumatic stress disorder, cognitive assessment, and body mass index. We also compared the clinical tools currently available for the evaluation of muscle strength (handgrip strength and Medical Research Council sum score) and electrical neuromuscular function (simplified peroneal nerve test [PENT]) with more objective and robust measures of force (MVC) and of the electrophysiological neuromuscular function of the tibialis anterior muscle (CMAP-TA-S100), chosen for its essential role in ankle control. Results: MVC improved at 12 months compared with 6 months. CMAP-TA-S100 (P = 0.016) and the presence of severe fatigue (P = 0.036) were independent predictors of MVC. MVC was strongly associated with handgrip strength, whereas CMAP-TA-S100 was strongly associated with PENT. Discussion: Electrical neuromuscular abnormalities and severe fatigue are independently associated with reduced MVC and can be used to predict the risk of long-term muscle weakness in CARDS survivors.

    Refinement of the diagnostic approach for the identification of children and adolescents affected by familial hypercholesterolemia: Evidence from the LIPIGEN study

    Background and aims: We aimed to describe the limitations of familial hypercholesterolemia (FH) diagnosis in childhood based on the presence of the typical features of FH, such as physical signs of cholesterol accumulation and a personal or family history of premature cardiovascular disease or hypercholesterolemia, by comparing their prevalence in the adult and paediatric FH populations, and to illustrate how additional information can lead to a more effective diagnosis of FH at a younger age. Methods: From the Italian LIPIGEN cohort, we selected 1188 adult (≥18 years) and 708 paediatric (<18 years) genetically confirmed heterozygous FH subjects with no missing personal FH features. The prevalence of personal and familial FH features was compared between the two groups. For a sub-group of the paediatric cohort (N = 374), data about premature coronary heart disease (CHD) in second-degree family members were also included in the evaluation. Results: The lower prevalence of typical FH features in children/adolescents vs adults was confirmed: the prevalence of tendon xanthoma was 2.1% vs 13.1%, and arcus cornealis was present in 1.6% vs 11.2% of the cohorts, respectively. No children presented a clinical history of premature CHD or cerebral/peripheral vascular disease, compared with 8.8% and 5.6% of adults, respectively. The prevalence of premature CHD in first-degree relatives was significantly higher in adults than in children/adolescents (38.9% vs 19.7%). In the sub-cohort analysis, a premature CHD event in parents was reported in 63 out of 374 subjects (16.8%), but the percentage increased to 54.0% when the evaluation was extended to second-degree relatives. Conclusions: In children, the typical FH features are clearly less informative than in adults. A more thorough data collection, adding information about second-degree relatives, could improve the diagnosis of FH at a younger age.

    Lipoprotein(a) Genotype Influences the Clinical Diagnosis of Familial Hypercholesterolemia

    Background: Evidence suggests that LPA risk genotypes are a possible contributor to the clinical diagnosis of familial hypercholesterolemia (FH). This study aimed at determining the prevalence of LPA risk variants in adult individuals with FH enrolled in the Italian LIPIGEN (Lipid Transport Disorders Italian Genetic Network) study, with (FH/M+) or without (FH/M-) a causative genetic variant. Methods and Results: An lp(a) [lipoprotein(a)] genetic score was calculated by summing the number of risk-increasing alleles inherited at the rs3798220 and rs10455872 variants. Overall, in 4.6% of the 1695 patients with clinically diagnosed FH, the phenotype was not explained by a monogenic or polygenic cause but by a genotype associated with high lp(a) levels. Among 765 subjects with FH/M- and 930 subjects with FH/M+, 133 (17.4%) and 95 (10.2%), respectively, carried 1 or 2 copies of either rs10455872 or rs3798220 (lp(a) score ≥1). Subjects with FH/M- also had lower mean levels of pretreatment low-density lipoprotein cholesterol than individuals with FH/M+ (t test for difference in means between FH/M- and FH/M+ groups, p<0.0001); however, subjects with FH/M- and lp(a) score ≥1 had higher mean (SD) pretreatment low-density lipoprotein cholesterol levels (223.47 [50.40] mg/dL) than subjects with FH/M- and lp(a) score = 0 (219.38 [54.54] mg/dL), although the difference was not statistically significant. The adjustment of low-density lipoprotein cholesterol levels based on lp(a) concentration reduced the proportion of subjects with low-density lipoprotein cholesterol ≥190 mg/dL from 68% to 42% (or from 68% to 50%, considering a more conservative formula). Conclusions: Our study supports the importance of measuring lp(a) to perform the diagnosis of FH appropriately and to exclude that the observed phenotype is driven by elevated levels of lp(a) before performing the genetic test for FH.
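    A minimal sketch of the two calculations described above: the allele-count score at the two LPA variants, and an LDL-C adjustment for the cholesterol carried on Lp(a). The 0.3 factor is the classical Dahlén approximation and is an assumption here; the study's exact formulas (including its "more conservative" variant) are not reproduced.

    ```python
    # Hypothetical sketch, not the study's exact procedure.

    def lpa_genetic_score(rs3798220_risk_alleles: int, rs10455872_risk_alleles: int) -> int:
        """Sum of risk-increasing alleles (0-2 per variant) at the two LPA variants."""
        return rs3798220_risk_alleles + rs10455872_risk_alleles

    def ldl_adjusted_for_lpa(ldl_mg_dl: float, lpa_mass_mg_dl: float, factor: float = 0.3) -> float:
        """LDL-C corrected for Lp(a)-carried cholesterol.

        factor=0.3 is the classical Dahlen approximation (an assumption here);
        the study also mentions a more conservative formula, not shown.
        """
        return max(ldl_mg_dl - factor * lpa_mass_mg_dl, 0.0)

    # Example: a subject with LDL-C 195 mg/dL and Lp(a) 60 mg/dL falls below
    # the 190 mg/dL clinical threshold after adjustment.
    print(lpa_genetic_score(1, 0))          # 1 -> lp(a) score >= 1
    print(ldl_adjusted_for_lpa(195, 60))    # 177.0
    ```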