26 research outputs found

    Validation of a new prognostic model to predict short and medium-term survival in patients with liver cirrhosis

    Background: The MELD score and its derivatives are used to objectify and grade the risk of liver-related death in patients with liver cirrhosis. We recently proposed a new predictive model that combines serum creatinine levels and maximum liver function capacity (LiMAx®), the CreLiMAx risk score. In this validation study we aimed to reproduce its diagnostic accuracy in patients with end-stage liver disease. Methods: Liver function of 113 patients with liver cirrhosis was prospectively investigated. The primary endpoint of the study was liver-related death within 12 months of follow-up. Results: Alcoholic liver disease was the main cause of liver disease (n = 51; 45%). Within 12 months of follow-up, 11 patients (9.7%) underwent liver transplantation and 17 (15.1%) died (13 deaths were related to liver disease, two were not). Measures of diagnostic accuracy in predicting short- and medium-term mortality risk in the overall cohort were comparable for MELD, MELD-Na, CPS and the CreLiMAx risk score. AUROCs for liver-related risk of death were: MELD [6 months 0.89 (95% CI 0.80–0.98), p < 0.001; 12 months 0.89 (95% CI 0.81–0.96), p < 0.001]; MELD-Na [6 months 0.93 (95% CI 0.85–1.00), p < 0.001; 12 months 0.89 (95% CI 0.80–0.98), p < 0.001]; CPS [6 months 0.91 (95% CI 0.85–0.97), p < 0.01; 12 months 0.88 (95% CI 0.80–0.96), p < 0.001]; and CreLiMAx score [6 months 0.80 (95% CI 0.67–0.96), p < 0.01; 12 months 0.79 (95% CI 0.64–0.94), p = 0.001]. In a subgroup analysis of patients with Child-Pugh class B cirrhosis, the CreLiMAx risk score remained the only parameter differing significantly between non-survivors and survivors. Furthermore, in these patients the proposed score had a good predictive performance. Conclusion: The CreLiMAx risk score appears to be a competitive and valid tool for estimating not only short- but also medium-term survival of patients with end-stage liver disease. Particularly in patients with Child-Pugh class B cirrhosis, the new score showed a good ability to identify patients not at risk of death.
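The MELD and MELD-Na scores compared above follow published formulas. As a minimal sketch (the capping and flooring rules shown in comments follow the commonly cited UNOS conventions and are not taken from this study):

```python
import math

def meld(creatinine_mg_dl: float, bilirubin_mg_dl: float, inr: float) -> int:
    """Laboratory MELD. By UNOS convention, values below 1.0 are floored
    at 1.0 and creatinine is capped at 4.0 mg/dL."""
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    score = (9.57 * math.log(crea)
             + 3.78 * math.log(bili)
             + 11.2 * math.log(inr)
             + 6.43)
    return round(score)

def meld_na(meld_score: int, sodium_mmol_l: float) -> int:
    """MELD-Na adjustment: sodium is clamped to 125-137 mmol/L and the
    correction is applied only when MELD exceeds 11."""
    na = min(max(sodium_mmol_l, 125.0), 137.0)
    if meld_score > 11:
        return round(meld_score
                     + 1.32 * (137 - na)
                     - 0.033 * meld_score * (137 - na))
    return meld_score
```

For example, a patient with creatinine 4.0 mg/dL, bilirubin 3.0 mg/dL and INR 2.0 scores MELD 32; hyponatremia at 130 mmol/L raises a MELD of 20 to a MELD-Na of 25.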

    Prevalence of Steatosis Hepatis in the Eurotransplant Region: Impact on Graft Acceptance Rates

    Due to the shortage of liver allografts and the rising prevalence of fatty liver disease in the general population, steatotic liver grafts are considered for transplantation. This condition is an important risk factor for the outcome after transplantation. We here analyze the characteristics of the donor pool offered to the Charité – Universitätsmedizin Berlin from 2010 to 2016 with respect to liver allograft nonacceptance and steatosis hepatis. Of the 2653 organs offered to our center, 19.9% (n = 527) were accepted for transplantation, 58.8% (n = 1561) were allocated to other centers, and 21.3% (n = 565) were eventually discarded from transplantation. In parallel to an increase in the incidence of steatosis hepatis in the donor pool from 20% in 2010 to 30% in 2016, the acceptance rate in our center for steatotic organs with less than 30% macrovesicular steatosis hepatis increased from 22.3% to 51.5% in 2016 (p < 0.001). However, by 2016, the number of canceled transplantations due to higher grades of steatosis hepatis had significantly increased from 14.7% (n = 15) to 63.6% (n = 42; p < 0.001). The rising prevalence of steatosis hepatis in the donor pool has led to higher acceptance rates of steatotic allografts. Nonetheless, steatosis hepatis remains a predominant phenomenon in discarded organs, necessitating future concepts such as organ reconditioning to increase graft utilization.

    Early Allograft Dysfunction Increases Hospital Associated Costs After Liver Transplantation—A Propensity Score–Matched Analysis

    Concepts to ameliorate the continued mismatch between demand for liver allografts and supply include the acceptance of allografts that meet extended donor criteria (ECD). ECD grafts are generally associated with an increased rate of complications such as early allograft dysfunction (EAD). The costs of liver transplantation for the health care system with respect to specific risk factors remain unclear and are subject to change. We analyzed 317 liver transplant recipients from 2013 to 2018 for outcome after liver transplantation and hospital costs in a German transplant center. In our study period, 1-year survival after transplantation was 80.1% (95% confidence interval: 75.8%-84.6%) and median hospital stay was 33 days (interquartile range: 24), with mean hospital costs of €115,924 (SD €113,347). There was a positive correlation between costs and laboratory Model for End-Stage Liver Disease (MELD) score (r(s) = 0.48, P < 0.001), and the development of EAD increased hospital costs by €26,229. ECD grafts were not associated with a higher risk of EAD in our cohort. When adjusting for recipient-associated risk factors such as laboratory MELD score, recipient age, and split liver transplantation with propensity score matching, only EAD and cold ischemia increased total costs. Conclusion: Our data show that EAD leads to significantly higher hospital costs for liver transplantation, which are primarily attributed to recipient health status. Strategies to reduce the incidence of EAD are needed to control costs in liver transplantation.
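The propensity score matching used for risk adjustment here (and in the hospitalization study in this listing) pairs each treated case with the most similar control. A minimal sketch of greedy 1:1 nearest-neighbor matching on precomputed propensity scores; the data layout and caliper value are illustrative assumptions, not the authors' implementation:

```python
def nearest_neighbor_match(treated, controls, caliper=0.05):
    """Greedy 1:1 matching on precomputed propensity scores.

    treated, controls: lists of (case_id, propensity) pairs.
    Returns matched (treated_id, control_id) pairs; each control is
    used at most once, and matches outside the caliper are discarded.
    """
    pairs = []
    available = dict(controls)  # control_id -> propensity
    for t_id, t_ps in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        # closest remaining control by absolute propensity difference
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs
```

In practice the propensity scores would first be estimated with a logistic regression of treatment status (e.g. EAD) on the recipient covariates; the matching step itself is independent of how the scores were obtained.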

    Solid organ transplantation programs facing lack of empiric evidence in the COVID‐19 pandemic: A By‐proxy Society Recommendation Consensus approach

    The ongoing severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic has a drastic impact on national health care systems. Given the overwhelming demand on facility capacity, the impact on all health care sectors has to be addressed. Solid organ transplantation represents a field with a high demand on staff, intensive care units, and follow-up facilities. The great therapeutic value of organ transplantation has to be weighed against mandatory constraints on health care capacities. In addition, the management of immunosuppressed recipients has to be reassessed during the ongoing coronavirus disease 2019 (COVID-19) pandemic. In addressing these crucial questions, transplant physicians are facing a total lack of scientific evidence. Therefore, the aim of this study was to offer consensus-based guidance derived from the individual recommendations of 22 transplant societies. Key recommendations were extracted and the degree of consensus among the different organizations was calculated. A high degree of consensus was found for temporarily suspending nonurgent transplant procedures and living donation programs. Systematic polymerase chain reaction-based testing of donors and recipients was broadly recommended. Additionally, more specific aspects (eg, screening of surgical explant teams and restricted use of marginal donor organs) were included in our analysis. This study offers a novel approach to informed guidance for health care management when a priori no scientific evidence is available.

    Hospitalization Before Liver Transplantation Predicts Posttransplant Patient Survival: A Propensity Score–Matched Analysis

    In contrast to donor factors predicting outcomes of liver transplantation (LT), few suitable recipient parameters have been identified. To this end, we performed an in-depth analysis of hospitalization status and duration prior to LT as a potential risk factor for posttransplant outcome. The pretransplant hospitalization status of all patients undergoing LT between 2005 and 2016 at the Charité-Universitätsmedizin Berlin was analyzed retrospectively using propensity score matching. At the time of organ acceptance, 226 of 1134 (19.9%) recipients were hospitalized in an intensive care unit (ICU), 146 (12.9%) in a regular ward (RW), and 762 patients (67.2%) were at home. Hospitalized patients (RW and ICU) compared with patients from home showed a dramatically shorter 3-month survival (78.7% versus 94.4%), 1-year survival (66.3% versus 87.3%), and 3-year survival (61.7% versus 81.7%; all P < 0.001). Survival further differed with the duration of pretransplant hospitalization at the 14-day cutoff (60.5% versus 51.0%, P = 0.006). In conclusion, hospitalization status before transplantation is a valuable predictor of patient survival following LT.

    Outcomes of Liver Resections after Liver Transplantation at a High-Volume Hepatobiliary Center

    Although more than one million liver transplantations have been carried out worldwide, the literature on liver resections in transplanted livers is scarce. We herein report on fourteen patients who underwent liver resection after liver transplantation (LT) between September 2004 and 2017. Hepatocellular carcinomas and biliary tree pathologies were the predominant indications for liver resection (n = 5 each); other indications were abscesses (n = 2), post-transplant lymphoproliferative disease (n = 1) and one benign tumor. Liver resection was performed at a median of 120 months (interquartile range (IQR): 56.5-199.25) after LT with a preoperative Model for End-Stage Liver Disease (MELD) score of 11 (IQR: 6.75-21). Severe complications greater than Clavien-Dindo grade III occurred in 5 out of 14 patients (36%). We compared liver resection patients who had a treatment option of retransplantation (ReLT) with actual ReLTs (excluding early graft failure or rejection, n = 44). Bearing in mind that late ReLT was carried out at a median of 117 months after first transplantation and at a median MELD of 32 (IQR: 17.5-37), three-year survival following liver resection after LT was similar to that after late ReLT (50.0% vs. 59.1%; p = 0.733). Compared to ReLT, liver resection after LT is a rare surgical procedure with significantly shorter hospital (mean 25, IQR: 8.75-49; p = 0.034) and ICU stays (mean 2, IQR: 1-8; p < 0.001), acceptable complications and survival rates.

    Brain Death Induction in Mice Using Intra-Arterial Blood Pressure Monitoring and Ventilation via Tracheostomy

    While both living donation and donation after circulatory death provide alternative opportunities for organ transplantation, donation after donor brain death (BD) still represents the major source of solid transplants. Unfortunately, the irreversible loss of brain function is known to induce multiple pathophysiological changes, including hemodynamic as well as hormonal modifications, finally leading to a systemic inflammatory response. Models that allow a systematic investigation of these effects in vivo are scarce. We present a murine model of BD induction, which could aid investigations into the devastating effects of BD on allograft quality. After implementing intra-arterial blood pressure measurement via the common carotid artery and reliable ventilation via a tracheostomy, BD is induced by steadily increasing intracranial pressure using a balloon catheter. Four hours after BD induction, organs may be harvested for analysis or for further transplantation procedures. Our strategy enables the comprehensive analysis of donor BD in a murine model, therefore allowing an in-depth understanding of BD-related effects in solid organ transplantation and potentially paving the way to optimized organ preconditioning.

    Postdural puncture headache after neuraxial anesthesia: incidence and risk factors

    Background/objective: Postdural puncture headache (PDPH) is a severe complication after neuraxial anesthesia. The aim of this study was to investigate the incidence of PDPH in two large operative cohorts, to identify possible risk factors for its occurrence, and to analyze its influence on the duration of hospital stay. Material and methods: In a retrospective analysis of the period 2010–2012, 341 orthopedic surgery (ORT) and 2113 obstetric (OBS) patients were evaluated after spinal anesthesia (SPA). Data were statistically analyzed (SPSS 23) using univariate analyses with the Mann-Whitney U-test, χ²-test and Student's t-test as well as logistic regression analysis. Results: The incidence of PDPH was 5.9% in the ORT cohort and 1.8% in the OBS cohort. Patients with PDPH in the ORT cohort were significantly younger (median 38 vs. 47 years, p = 0.011), had a lower body weight (median 70.5 kg vs. 77 kg, p = 0.006) and a lower body mass index (median 23.5 vs. 25.2, p = 0.037) than patients without PDPH. Age (odds ratio [OR] 0.963, 97.5% confidence interval [CI] 0.932–0.991, p = 0.015) and body weight (OR 0.956, 97.5% CI 0.920–0.989, p = 0.014) were identified as independent risk factors for PDPH. In OBS patients, PDPH occurred more frequently after SPA than after combined spinal-epidural anesthesia (CSE) (8.6% vs. 1.2%, p < 0.001), and the type of neuraxial anesthesia was identified as an independent risk factor for PDPH (OR 0.049, 97.5% CI 0.023–0.106, p < 0.001). In both groups PDPH was associated with a longer hospital stay (ORT 4 vs. 2 days, p = 0.001; OBS 6 vs. 4 days, p < 0.0005). Conclusion: The incidence of PDPH differed between the two groups, with a higher incidence in the ORT cohort, but was considerably lower than reported in the literature. Age, constitution and type of neuraxial anesthesia were identified as risk factors for PDPH. Considering the functional limitations (mobilization, neonatal care) and the longer hospital stay, future studies should investigate the impact of early treatment of PDPH.
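The odds ratios reported in the logistic regression above are the exponentiated model coefficients. A short sketch of that relationship, using the reported per-year OR for age of 0.963 purely as a worked example:

```python
import math

def odds_ratio(beta: float) -> float:
    """Odds ratio for a one-unit increase in a logistic regression
    predictor: OR = exp(beta)."""
    return math.exp(beta)

# A protective factor (OR < 1) corresponds to a negative coefficient:
# the reported OR of 0.963 per year of age implies beta = ln(0.963).
beta_age = math.log(0.963)

# Per-unit ORs compound multiplicatively, so over a 10-year age
# difference the odds scale by 0.963 ** 10 (about 0.69).
or_per_decade = odds_ratio(10 * beta_age)
```

Note that the OR describes a change in odds, not in probability; the two are close only when the outcome is rare, as with the PDPH incidences reported here.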

    Graft Pre-conditioning by Peri-Operative Perfusion of Kidney Allografts With Rabbit Anti-human T-lymphocyte Globulin Results in Improved Kidney Graft Function in the Early Post-transplantation Period—a Prospective, Randomized Placebo-Controlled Trial

    Introduction: Although prone to a higher degree of ischemia-reperfusion injury (IRI), the use of extended criteria donor (ECD) organs has become reality in transplantation. We therefore postulated that peri-operative perfusion of renal transplants with anti-human T-lymphocyte globulin (ATLG) ameliorates IRI and results in improved graft function. Methods: We performed a randomized, single-blinded, placebo-controlled trial involving 50 kidney transplantations (KTx). Prior to implantation, organs were perfused and incubated with ATLG (AP; n = 24 kidneys). Control organs (CP) were perfused with saline only (n = 26 kidneys). The primary endpoint was defined as graft function reflected by serum creatinine at day 7 post transplantation (post-tx). Results: AP-KTx recipients showed significantly better graft function at day 7 post-tx as reflected by lower creatinine levels, whereas no treatment effect was observed after 12 months of surveillance. During the early hospitalization phase, 16 of the 26 CP-KTx patients required dialysis during the first 7 days post-tx, whereas only 10 of the 24 AP-KTx patients underwent dialysis. No treatment-specific differences were detected for various lymphocyte subsets in the peripheral blood of patients. Additionally, mRNA analysis of 0-h biopsies post incubation with ATLG revealed no changes in intragraft inflammatory expression patterns between AP and CP organs. Conclusion: We here present the first clinical study on peri-operative organ perfusion with ATLG, illustrating improved graft function in the early period post kidney transplantation. Clinical Trial Registration: www.ClinicalTrials.gov, NCT0337728