7 research outputs found

    Induction of immunological tolerance in the pig-to-baboon xenotransplantation model: studies aimed at achieving mixed hematopoietic chimerism and preventing associated thrombotic complications

    The outcome of clinical organ transplantation has improved dramatically since the introduction of cyclosporine (CyA) in 1979 and of other, more recently introduced immunosuppressive agents such as azathioprine, mycophenolate mofetil, tacrolimus and sirolimus. Furthermore, more refined surgical techniques and perioperative management have prolonged allograft survival. Owing to this relative success, the inclusion criteria for potential organ transplant recipients have been broadened, resulting in an even greater shortage of donor organs. The number of patients with end-stage organ failure who die awaiting organ transplantation…

    Selected liver grafts from donation after circulatory death can be safely used for retransplantation – a multicenter retrospective study

    Due to the growing number of liver transplantations (LTs), there is an increasing number of patients requiring retransplantation (reLT). Data on the use of grafts from extended criteria donors (ECD), especially donation after circulatory death (DCD), for reLT are lacking. We aimed to assess the outcome of patients undergoing reLT using a DCD graft in the Netherlands between 2001 and July 2018. Propensity score matching was used to match each DCD-reLT with three DBD-reLT cases. Primary outcomes were patient and graft survival. The secondary outcome was the incidence of biliary complications, especially nonanastomotic strictures (NAS). Twenty-one DCD-reLTs were compared with 63 matched DBD-reLTs. Donors in the DCD-reLT group had a significantly lower BMI (22.4 vs. 24.7 kg/m², P = 0.02). Comparison of recipient demographics and ischemia times yielded no significant differences. Patient and graft survival rates were comparable between the two groups. However, the occurrence of nonanastomotic strictures after DCD-reLT was significantly higher (38.1% vs. 12.7%, P = 0.02). ReLT with DCD grafts does not result in inferior patient and graft survival compared with DBD grafts in selected patients. Therefore, DCD liver grafts should not routinely be declined for patients awaiting reLT.
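
    A rough, hypothetical illustration of the 1:3 propensity score matching described above: the Python sketch below fits a logistic regression to estimate each patient's propensity of receiving a DCD graft and then greedily pairs every DCD-reLT case with its three nearest DBD-reLT controls. The column names (is_dcd, donor_age, donor_bmi, recipient_meld) are placeholders, not the covariates used in the study.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_1_to_3(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    """Match each DCD case (is_dcd == 1) to its three nearest DBD controls on propensity score."""
    # Estimate the propensity of receiving a DCD graft from the chosen covariates.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["is_dcd"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    cases = df[df["is_dcd"] == 1]
    controls = df[df["is_dcd"] == 0].copy()
    matched = []
    # Greedy 1:3 nearest-neighbour matching without replacement.
    for _, case in cases.iterrows():
        nearest = (controls["pscore"] - case["pscore"]).abs().nsmallest(3).index
        matched.append(pd.concat([case.to_frame().T, controls.loc[nearest]]))
        controls = controls.drop(nearest)
    return pd.concat(matched, ignore_index=True)

# Example call with hypothetical column names:
# matched = match_1_to_3(cohort, ["donor_age", "donor_bmi", "recipient_meld"])
```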

    Donor hepatectomy time influences ischemia-reperfusion injury of the biliary tree in donation after circulatory death liver transplantation

    Background: Donor hepatectomy time is associated with graft survival after liver transplantation. The aim of this study was to identify the impact of donor hepatectomy time on biliary injury during donation after circulatory death liver transplantation. Methods: First, bile duct biopsies of livers included in (pre)clinical machine perfusion research were analyzed. Second, bile samples from the same livers were collected during normothermic machine perfusion. Lastly, a nationwide retrospective cohort study was performed including 273 adult patients undergoing donation after circulatory death liver transplantation between January 1, 2002 and January 1, 2017. The primary endpoint was the development of non-anastomotic biliary strictures within 2 years of donation after circulatory death liver transplantation. Cox proportional hazards regression…
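
    As a concrete, hedged sketch of the survival analysis hinted at above, a Cox proportional hazards model (here via the lifelines package) could relate donor hepatectomy time to non-anastomotic stricture development within two years. The toy data and column names below are invented for illustration and are not taken from the study.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented toy data: follow-up capped at 730 days (2 years); nas_event = 1 if NAS developed.
df = pd.DataFrame({
    "hepatectomy_time_min":  [35, 52, 48, 70, 41, 63, 55, 38],
    "donor_age":             [45, 60, 52, 58, 39, 66, 49, 44],
    "days_to_nas_or_censor": [730, 210, 300, 95, 730, 730, 400, 730],
    "nas_event":             [0, 1, 1, 1, 0, 0, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_nas_or_censor", event_col="nas_event")
cph.print_summary()  # hazard ratios, including one per minute of donor hepatectomy time
```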

    Donor diabetes mellitus is a risk factor for diminished outcome after liver transplantation: a nationwide retrospective cohort study

    With the growing incidence of diabetes mellitus (DM), an increasing number of organ donors with DM can be expected. We sought to investigate the association between donor DM and early post-transplant outcomes. From a national cohort of adult liver transplant recipients (1996–2016), all recipients transplanted with a liver from a DM donor (n = 69) were matched 1:2 with recipients of livers from non-DM donors (n = 138). The primary endpoints were early post-transplant outcomes, including the incidence of primary nonfunction (PNF), hepatic artery thrombosis (HAT), and 90-day graft survival. Cox regression analysis…
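
    The 90-day graft survival comparison could in principle be examined with a Kaplan-Meier estimator and a log-rank test; the sketch below (lifelines, with invented data and hypothetical column names) shows the general pattern rather than the study's actual analysis.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Invented toy data: follow-up censored at 90 days; graft_loss = 1 if the graft failed.
df = pd.DataFrame({
    "days_to_graft_loss_or_censor": [90, 14, 90, 90, 60, 90, 7, 90, 90, 35],
    "graft_loss":                   [0,  1,  0,  0,  1,  0, 1, 0,  0,  1],
    "donor_dm":                     [1,  1,  1,  1,  1,  0, 0, 0,  0,  0],
})

dm = df[df["donor_dm"] == 1]
no_dm = df[df["donor_dm"] == 0]

km = KaplanMeierFitter()
km.fit(dm["days_to_graft_loss_or_censor"], dm["graft_loss"], label="DM donor")
print(km.survival_function_)  # Kaplan-Meier estimate of graft survival in the DM-donor group

# Log-rank test for a difference in graft survival between the two donor groups.
result = logrank_test(
    dm["days_to_graft_loss_or_censor"], no_dm["days_to_graft_loss_or_censor"],
    event_observed_A=dm["graft_loss"], event_observed_B=no_dm["graft_loss"],
)
print(result.p_value)
```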

    Inhibition of matrix metalloproteinases increases PPAR-α and IL-6 and prevents dietary-induced hepatic steatosis and injury in a murine model

    Steatosis is a prominent feature of nonalcoholic fatty liver disease and a potential promoter of inflammation. Injury leading to cirrhosis is partly mediated by dysregulation of matrix protein turnover. Matrix metalloproteinase (MMP) inhibitors protect mice from lethal TNF-α-induced liver injury. We hypothesized that Marimastat, a broad-spectrum MMP and TNF-α converting enzyme (TACE) inhibitor, might modulate this injury through interruption of inflammatory pathways. Triglyceride and phospholipid levels (liver, serum) and fatty acid profiles were used to assess essential fatty acid status and de novo lipogenesis as mechanisms for hepatic steatosis. Mice receiving a fat-free, high-carbohydrate diet (HCD) for 19 days developed severe fatty liver infiltration, demonstrated by histology, magnetic resonance spectroscopy, and elevated liver function tests. Animals receiving HCD plus Marimastat (HCD+MAR) were comparable to control animals. Increased tissue levels of peroxisome proliferator-activated receptor-α (PPAR-α), higher levels of serum IL-6, and decreased levels of serum TNF-α receptor II were also seen in the HCD+MAR group compared with HCD-only. In addition, there was increased phosphorylation, and likely activation, of PPAR-α in the HCD+MAR group. PPAR-α is a transcription factor involved in β-oxidation of fatty acids, and IL-6 is a hepatoprotective cytokine. Liver triglyceride levels were higher and serum triglyceride and phospholipid levels lower with HCD-only but improved with Marimastat treatment. HCD-only and HCD+MAR groups were essential fatty acid deficient and had elevated rates of de novo lipogenesis. We therefore conclude that Marimastat reduces liver triglyceride accumulation by increasing fat oxidation and/or liver clearance of triglycerides. This may be related to increased expression and activation of PPAR-α or IL-6, respectively.

    Improving outcomes for donation after circulatory death kidney transplantation: Science of the times

    The use of kidneys donated after circulatory death (DCD) remains controversial because of concerns about high incidences of early graft loss, delayed graft function (DGF), and impaired graft survival. As these concerns are mainly based on data from historical cohorts, they are prone to time-related effects and may therefore not apply to the current timeframe. To assess the impact of time on outcomes, we performed a time-dependent comparative analysis of outcomes of DCD and donation after brain death (DBD) kidney transplantations. Data on all 11,415 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018 were collected. Based on the incidences of early graft loss, two eras were defined (1998-2008 [n = 3,499] and 2008-2018 [n = 3,781]), and potential time-related effects on outcomes were evaluated. Multivariate analyses were applied to examine associations between donor type and outcomes. Interaction tests were used to explore the presence of effect modification. Results show clear time-related effects on posttransplant outcomes. The 1998-2008 interval showed compromised outcomes for DCD procedures (higher incidences of DGF and early graft loss, impaired 1-year renal function, and inferior graft survival), whereas DBD and DCD outcome equivalence was observed for the 2008-2018 interval. This occurred despite persistently high incidences of DGF in DCD grafts and more adverse recipient and donor risk profiles (recipients were 6 years older, and the KDRI increased from 1.23 to 1.39 for DBD donors and from 1.35 to 1.49 for DCD donors). In contrast, the median cold ischaemic period decreased…
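
    The interaction tests for effect modification mentioned above can be illustrated with a logistic model containing a donor-type-by-era interaction term. The sketch below uses invented data and hypothetical variable names; the study's multivariate models would additionally adjust for donor and recipient covariates.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented toy data: dgf = delayed graft function, dcd = DCD (vs. DBD) donor,
# era_late = 1 for the 2008-2018 era, 0 for 1998-2008.
df = pd.DataFrame({
    "dgf":      [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0],
    "dcd":      [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0],
    "era_late": [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
})

# The dcd:era_late coefficient tests whether the DCD effect on DGF differs between eras.
model = smf.logit("dgf ~ dcd * era_late", data=df).fit(disp=0)
print(model.summary())
```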