
    Serological Response to Three, Four and Five Doses of SARS-CoV-2 Vaccine in Kidney Transplant Recipients

    Mortality from COVID-19 among kidney transplant recipients (KTR) is high, and their response to three vaccinations against SARS-CoV-2 is strongly impaired. We retrospectively analyzed the serological response to up to five doses of SARS-CoV-2 vaccine in KTR from 27 December 2020 until 31 December 2021. In particular, we analyzed the influence of different dose adjustment regimens for mycophenolic acid (MPA) on the serological response to the fourth vaccination. In total, 4277 vaccinations against SARS-CoV-2 in 1478 patients were analyzed. The serological response rate was 19.5% after 1203 basic immunizations, and increased to 29.4%, 55.6%, and 57.5% in response to 603 third, 250 fourth, and 40 fifth vaccinations, resulting in a cumulative response rate of 88.7%. In patients on calcineurin inhibitor and MPA maintenance immunosuppression, pausing MPA and adding 5 mg prednisolone equivalent before the fourth vaccination increased the serological response rate to 75%, compared with 52% without dose adjustment and 46% with dose reduction. Belatacept-treated patients had a response rate of 8.7% (4/46) after three vaccinations and 12.5% (3/25) after four vaccinations. Except in belatacept-treated patients, repeated SARS-CoV-2 vaccination of up to five doses effectively induces a serological response in kidney transplant recipients, and this response can be enhanced by pausing MPA at the time of vaccination.
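    As a rough illustration of how the per-dose rates compound into the reported cumulative rate, here is a minimal sketch; the rates are those quoted above, and the compounding assumes each additional dose is given only to prior nonresponders, which the cohort only approximates:

    ```python
    # Sketch: compounding per-dose seroconversion rates among prior nonresponders.
    # Rates are taken from the abstract; the independence assumption is illustrative.
    rates = [0.195, 0.294, 0.556, 0.575]  # basic immunization, 3rd, 4th, 5th dose

    nonresponders = 1.0
    for r in rates:
        nonresponders *= 1.0 - r

    print(f"approx. cumulative response: {1.0 - nonresponders:.1%}")
    # ~89.3%, close to the reported 88.7%; the gap reflects dropout and censoring.
    ```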

    Land Cover and Rainfall Interact to Shape Waterbird Community Composition

    Human land cover can degrade estuaries directly through habitat loss and fragmentation or indirectly through nutrient inputs that reduce water quality. Strong precipitation events are occurring more frequently, causing greater hydrological connectivity between watersheds and estuaries. Nutrient enrichment and dissolved oxygen depletion that occur following these events are known to limit populations of benthic macroinvertebrates and commercially harvested species, but the consequences for top consumers such as birds remain largely unknown. We used non-metric multidimensional scaling (MDS) and structural equation modeling (SEM) to understand how land cover and annual variation in rainfall interact to shape waterbird community composition in Chesapeake Bay, USA. The MDS ordination indicated that urban subestuaries shifted from a mixed generalist-specialist community in 2002, a year of severe drought, to a generalist-dominated community in 2003, a year of high rainfall. The SEM revealed that this change was concurrent with a sixfold increase in nitrate-N concentration in subestuaries. In the drought year of 2002, waterbird community composition depended only on the direct effect of urban development in watersheds. In the wet year of 2003, community composition depended both on this direct effect and on indirect effects associated with high nitrate-N inputs to northern parts of the Bay, particularly in urban subestuaries. Our findings suggest that increased runoff during periods of high rainfall can depress water quality enough to alter the composition of estuarine waterbird communities, and that this effect is compounded in subestuaries dominated by urban development. Estuarine restoration programs often chart progress by monitoring stressors and indicators, but rarely assess multivariate relationships among them. Estuarine management planning could be improved by tracking the structure of relationships among land cover, water quality, and waterbirds. Unraveling these complex relationships may help managers identify and mitigate ecological thresholds that occur with increasing human land cover.
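    For readers unfamiliar with the ordination step, here is a minimal non-metric MDS sketch in Python; the community matrix is hypothetical, and the study's actual software and settings are not specified in the abstract:

    ```python
    # Sketch: non-metric multidimensional scaling (NMDS) of a site-by-species
    # abundance matrix using Bray-Curtis dissimilarity. Data are hypothetical.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    rng = np.random.default_rng(0)
    abundance = rng.poisson(lam=3, size=(12, 20))  # 12 subestuaries x 20 species

    dissim = squareform(pdist(abundance, metric="braycurtis"))
    nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
               n_init=10, random_state=0)
    coords = nmds.fit_transform(dissim)

    print("stress:", nmds.stress_)  # lower stress = better 2-D representation
    print(coords[:3])               # ordination coordinates for the first sites
    ```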

    Can a decision support system accelerate rare disease diagnosis? Evaluating the potential impact of Ada DX in a retrospective study

    Background: Rare disease diagnosis is often delayed by years. A primary factor in this delay is a lack of knowledge and awareness regarding rare diseases. Probabilistic diagnostic decision support systems (DDSSs) have the potential to accelerate rare disease diagnosis by suggesting differential diagnoses to physicians based on case input and incorporated medical knowledge. We examine the DDSS prototype Ada DX and assess its potential to provide accurate rare disease suggestions early in the course of rare disease cases. Results: Ada DX suggested the correct disease earlier than the time of clinical diagnosis among the top five fit disease suggestions in 53.8% of cases (50 of 93), and as the top fit disease suggestion in 37.6% of cases (35 of 93). The median advantage of correct disease suggestions compared to the time of clinical diagnosis was 3 months or 50% for top five fit and 1 month or 21% for top fit. The correct diagnosis was suggested at the first documented patient visit in 33.3% (top five fit) and 16.1% (top fit) of cases, respectively. A Wilcoxon signed-rank test showed a significant difference between the time to clinical diagnosis and the time to correct disease suggestion for both top five fit and top fit (z-scores −6.68 and −5.71, respectively; α = 0.05, p < 0.001). Conclusion: Ada DX provided accurate rare disease suggestions in most rare disease cases. In many cases, Ada DX provided correct rare disease suggestions early in the course of the disease, sometimes at the very beginning of a patient's journey. These results indicate that Ada DX has the potential to suggest rare diseases to physicians early in the course of a case. Limitations of this study derive from its retrospective and unblinded design, data input by a single user, and optimization of the knowledge base during the course of the study. Results pertaining to the system's accuracy should therefore be interpreted cautiously. Whether the use of Ada DX reduces the time to diagnosis of rare diseases in a clinical setting should be validated in prospective studies.
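    The paired comparison reported above can be reproduced in outline with SciPy; this is a sketch with made-up paired times, not the study data:

    ```python
    # Sketch: paired Wilcoxon signed-rank test between time to clinical diagnosis
    # and time to correct DDSS suggestion (months). Values are illustrative only.
    from scipy.stats import wilcoxon

    time_to_clinical_dx = [6, 14, 3, 24, 9, 12, 18, 5, 30, 8]
    time_to_suggestion  = [3,  7, 3, 12, 4, 12,  6, 2, 15, 8]

    stat, p_value = wilcoxon(time_to_clinical_dx, time_to_suggestion)
    print(f"W = {stat}, p = {p_value:.4f}")
    # A small p-value indicates the suggestion times are systematically shorter.
    ```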

    Influence of Belatacept- vs. CNI-Based Immunosuppression on Vascular Stiffness and Body Composition

    Background: Arterial stiffness and phase angle (PhA) have gained importance as diagnostic and prognostic parameters in the management of cardiovascular disease. Few studies have examined differences in arterial stiffness and body composition between renal transplant recipients (RTRs) receiving belatacept (BELA) and those receiving calcineurin inhibitors (CNI). We therefore investigated differences in arterial stiffness and body composition between RTRs treated with different immunosuppressants, including BELA. Methods: In total, 325 RTRs were enrolled in the study (mean age 52.2 years; 62.7% male). Arterial stiffness was determined with an automated oscillometric device. Body composition parameters were assessed by bioelectrical impedance analysis (BIA), and laboratory parameters were obtained from the patients' medical files. Results: We did not detect any significant difference in arterial stiffness or PhA between RTRs on different immunosuppressive regimens based on cyclosporine A (CsA), tacrolimus (Tac), or BELA. Age was an essential risk factor for greater arterial stiffness. PhA was associated with age, BMI, time on dialysis before transplantation, and kidney graft function. Conclusion: No significant differences in arterial stiffness or PhA were observed in RTRs under different immunosuppressive regimens. While our data provide additional evidence on arterial stiffness and PhA in RTRs, more research is needed to fully explore these cardiovascular risk factors and the impact of different immunosuppressive regimens.
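    For context, the phase angle reported by BIA devices is derived from the measured resistance (R) and reactance (Xc); below is a minimal sketch of the standard formula with illustrative values, not study measurements:

    ```python
    # Sketch: phase angle (PhA) from bioelectrical impedance analysis readings.
    # PhA = arctan(Xc / R), converted from radians to degrees. Values illustrative.
    import math

    resistance_ohm = 520.0  # R: resistance at 50 kHz
    reactance_ohm = 55.0    # Xc: reactance at 50 kHz

    phase_angle_deg = math.degrees(math.atan(reactance_ohm / resistance_ohm))
    print(f"phase angle: {phase_angle_deg:.1f} degrees")  # ~6.0 degrees
    ```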

    Health economic benefits through the use of diagnostic support systems and expert knowledge

    Background: Rare diseases are difficult to diagnose. Due to their rarity, heterogeneity, and variability, rare diseases often result not only in extensive diagnostic tests and imaging studies, but also in unnecessary repetitions of examinations, which places a greater overall burden on the healthcare system. Diagnostic decision support systems (DDSS) optimized by rare disease experts and used early by primary care physicians and specialists can significantly shorten diagnostic processes. The objective of this study was to evaluate the reductions in diagnostic costs in rare disease cases brought about by rapid referral to an expert and by diagnostic decision support systems. Methods: Diagnostic costs from disease onset to diagnosis were retrospectively analyzed in 78 patient cases from the outpatient clinic for rare inflammatory systemic diseases at Hannover Medical School. From the onset of the first symptoms, all diagnostic measures related to the disease were taken from the patient files and documented for each day. The basis for the health economic calculations was the Einheitlicher Bewertungsmaßstab (EBM), the uniform assessment standard used in Germany for statutory health insurance, which assigns a fixed flat rate to each medical service. For 76 cases we also calculated the cost savings that would have been achieved by the diagnostic support system Ada DX applied by an expert. Results: The expert was able to achieve significant savings for patients with long courses of disease. On average, the expert needed only 27% of the total costs incurred in the individual treatment odysseys to make the correct diagnosis. The expert also needed significantly less time and avoided unnecessary repetitions of examinations. If a DDSS had been applied early in the 76 cases studied, only 51-68% of the total costs would have been incurred and the diagnosis would have been made earlier. Earlier diagnosis would have significantly reduced costs. Conclusion: The study showed that significant savings in the diagnostic process of rare diseases can be achieved through rapid referral to an expert and the use of DDSS. Faster diagnosis not only yields savings, but also enables the right therapy and thus an increase in patients' quality of life.
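    The savings figures can be read as ratios of costs accrued up to an earlier diagnosis against total odyssey costs; the sketch below uses hypothetical per-case EBM cost sums, chosen only to illustrate the calculation:

    ```python
    # Sketch: fraction of odyssey costs that would have been incurred had the
    # diagnosis been made earlier. Costs are hypothetical EBM flat-rate sums (EUR).
    cases = [
        # (costs up to earlier expert/DDSS diagnosis, total costs of the odyssey)
        (850.0, 3200.0),
        (420.0, 1500.0),
        (1100.0, 4100.0),
    ]

    spent = sum(early for early, _ in cases)
    total = sum(full for _, full in cases)
    print(f"share of total costs needed: {spent / total:.0%}")  # here ~27%
    ```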

    Declining Course of Humoral Immune Response in Initially Responding Kidney Transplant Recipients after Repeated SARS-CoV-2 Vaccination

    The immunogenicity of SARS-CoV-2 vaccines in kidney transplant recipients is limited, resulting in inadequately low serological response rates and low immunoglobulin (Ig) levels, which correlate with reduced protection against death and hospitalization from COVID-19. We retrospectively examined the time course of anti-SARS-CoV-2 Ig antibody levels after up to five repeated vaccinations in 644 previously nonresponding kidney transplant recipients. Using anti-SARS-CoV-2 IgG/IgA ELISA and total Ig ECLIA assays, we compared antibody levels at 1 month with levels at 2 and 4 months, respectively. Additionally, we correlated the measurements of the assays used. Between 1 and 2 months, and between 1 and 4 months, mean anti-SARS-CoV-2 Ig levels in responders decreased by 14% and 25%, respectively, depending on the assay. Absolute Ig values and the time course of antibody levels showed high interindividual variability. Ig levels decreased by at least 20% in 77 of 148 paired samples, with loss of sufficient serological protection over time occurring in 18 of 148 (12.2%). The IgG ELISA and total Ig ECLIA assays showed a strong positive correlation (Kendall's tau = 0.78), yet the two assays gave divergent results in 99 of 751 (13.2%) measurements. The IgG and IgA assays showed an overall strong correlation but divergent results in 270 of 1173 (23.0%) cases, and only a weak correlation of antibody levels in positive samples. The large interindividual variability and significant loss of serological response after 4 months support repeated serological sampling and consideration of shorter vaccination intervals in kidney transplant recipients.
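    The inter-assay agreement can be expressed in a few lines with SciPy; the antibody levels below are synthetic, not the study measurements:

    ```python
    # Sketch: Kendall's tau between two antibody assays (e.g. IgG ELISA vs.
    # total Ig ECLIA) on paired samples. Values are synthetic, for illustration.
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(1)
    elisa = rng.lognormal(mean=3.0, sigma=1.0, size=200)
    eclia = elisa * rng.lognormal(mean=0.0, sigma=0.4, size=200)  # correlated assay

    tau, p_value = kendalltau(elisa, eclia)
    print(f"Kendall's tau = {tau:.2f}, p = {p_value:.2e}")
    # A tau near 0.78 would match the strong rank agreement reported above.
    ```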

    Poor Long-Term Renal Allograft Survival in Patients with Chronic Antibody-Mediated Rejection, Irrespective of Treatment—A Single Center Retrospective Study

    The Banff 2017 report permits the diagnosis of pure chronic antibody-mediated rejection (cAMR) in the absence of microcirculation inflammation. We retrospectively investigated renal allograft function and long-term outcomes of 67 patients with cAMR, and compared patients who received antihumoral therapy (cAMR-AHT, n = 21) with patients without treatment (cAMRwo, n = 46). At baseline, the cAMR-AHT group had more concomitant T-cell-mediated rejection (TCMR) (10/21 (47.6%) vs. 9/46 (19.2%); p = 0.04), a higher g-lesion score (0.4 ± 0.5 versus 0.1 ± 0.3; p = 0.01), and a higher median eGFR decline in the six months prior to biopsy (6.6 vs. 3.0 mL/min; p = 0.04). The median eGFR decline six months after biopsy was comparable (2.6 vs. 4.9 mL/min, p = 0.61) between the groups, and three-year graft survival after biopsy was significantly lower in the cAMR-AHT group (35.0% vs. 61.0%, p = 0.03). Patients who received AHT had more infections (0.38 vs. 0.20 infections/patient; p = 0.04). Currently, antihumoral therapy is more often administered to patients with cAMR and rapidly deteriorating renal function or concomitant TCMR. However, long-term graft outcomes remain poor despite treatment.
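    Graft survival comparisons of this kind are commonly made with Kaplan-Meier estimates and a log-rank test; the abstract does not name the test used, so the sketch below, using the lifelines library and synthetic follow-up data, is only an assumed reconstruction:

    ```python
    # Sketch: comparing three-year graft survival between two groups with a
    # log-rank test, as is typical for such analyses. Data are synthetic.
    import numpy as np
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(2)
    # Months from biopsy to graft loss, censored at 36 months of follow-up.
    t_aht = np.minimum(rng.exponential(30, size=21), 36)  # cAMR-AHT group
    t_wo = np.minimum(rng.exponential(55, size=46), 36)   # untreated cAMR group
    e_aht = t_aht < 36  # event observed (graft loss) vs. censored
    e_wo = t_wo < 36

    result = logrank_test(t_aht, t_wo, event_observed_A=e_aht, event_observed_B=e_wo)
    print(f"log-rank p = {result.p_value:.3f}")
    ```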

    Development and validation of multivariable prediction models of serological response to SARS-CoV-2 vaccination in kidney transplant recipients

    Repeated vaccination against SARS-CoV-2 increases the serological response in kidney transplant recipients (KTR), with high interindividual variability. No decision support tool exists to predict the response to a third or fourth SARS-CoV-2 vaccination in KTR. We developed, and internally and externally validated, five multivariable prediction models of serological response after the third and fourth vaccine dose against SARS-CoV-2 in previously seronegative, COVID-19-naïve KTR. Using 20 candidate predictor variables, we applied statistical and machine learning approaches including logistic regression (LR), least absolute shrinkage and selection operator (LASSO)-regularized LR, random forest, and gradient boosted regression trees. For development and internal validation, data from 590 vaccinations were used. External validation was performed in four independent, international validation cohorts comprising 191, 184, 254, and 323 vaccinations, respectively. LASSO-regularized LR on the whole development dataset yielded 20- and 10-variable models. External validation showed AUC-ROC values of 0.840, 0.741, 0.816, and 0.783 for the sparser 10-variable model, yielding an overall performance of 0.812. A 10-variable LASSO-regularized LR model thus predicts vaccination response in KTR with good overall accuracy. Implemented as an online tool, it can guide decisions on whether to modulate immunosuppressive therapy before additional active vaccination, or to perform passive immunization to improve protection against COVID-19 in previously seronegative, COVID-19-naïve KTR.
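    A minimal sketch of the modeling approach named above, L1-regularized (LASSO) logistic regression evaluated by AUC-ROC on a held-out cohort; the data are synthetic stand-ins for the 20 candidate predictors, and the real performance figures are those in the abstract:

    ```python
    # Sketch: LASSO-regularized logistic regression with external-validation AUC.
    # Features and outcomes are synthetic stand-ins, not the study variables.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    X_dev = rng.normal(size=(590, 20)); y_dev = rng.integers(0, 2, size=590)
    X_ext = rng.normal(size=(191, 20)); y_ext = rng.integers(0, 2, size=191)

    scaler = StandardScaler().fit(X_dev)
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(scaler.transform(X_dev), y_dev)

    kept = np.flatnonzero(model.coef_)  # LASSO drives some weights to zero
    auc = roc_auc_score(y_ext, model.predict_proba(scaler.transform(X_ext))[:, 1])
    print(f"{kept.size} variables retained, external AUC-ROC = {auc:.3f}")
    # Random labels give chance-level AUC here; the study reported ~0.81 overall.
    ```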