Extra-anatomic aortic bypass for the treatment of a mycotic pseudoaneurysm after liver transplantation for hilar cholangiocarcinoma
Liver transplantation (LT) after neoadjuvant chemoradiotherapy in patients with unresectable hilar cholangiocarcinoma (HC) is an accepted treatment strategy [1]. Neoadjuvant therapy is associated with an increased risk of arterial and portal complications after LT [1,2]. In most cases, radiation therapy makes the use of the native hepatic artery inadvisable, and an aortic anastomosis is needed, either with or without a graft [2]. The development of a mycotic pseudoaneurysm after LT is a rare complication that is associated with a high incidence of graft failure and mortality. Radiotherapy, local infections and the use of grafts are known risk factors for the development of a mycotic pseudoaneurysm, which is always challenging to manage [3]
Removal of a migrated biliary stent using new digital cholangioscopy retrieval devices in a transplant patient
A 51-year-old man who had undergone liver transplantation developed a symptomatic anastomotic biliary stricture 23 months after surgery. Endoscopic biliary therapy via endoscopic retrograde cholangiopancreatography (ERCP) was planned. Progressive biliary balloon dilation of the stenosis was performed, with placement of three coaxial plastic stents (8.5-Fr × 12 cm, 8.5-Fr × 9 cm, and 10-Fr × 12 cm; Advanix, Boston Scientific, Natick, Massachusetts, USA). During an endoscopy to replace the stents, fluoroscopy revealed proximal migration of an 8.5-Fr plastic stent at the level of the cystic insertion (Fig. 1). Several failed extraction attempts were made using standard ERCP techniques (i.e. extractor balloon, Lasso technique, and others) [1] [2]. Single-operator peroral intraductal cholangioscopy (SpyGlass DS direct visualization system, Boston Scientific) confirmed impaction of the distal end of the proximally migrated stent, located 3 cm proximal to the duodenal papilla. An attempt to mobilize the migrated stent was made using biopsy forceps (SpyBite, Boston Scientific), without success
The etiology, incidence, and impact of preservation fluid contamination during liver transplantation
The role of contaminated preservation fluid in the development of infection after liver transplantation has not been fully elucidated. To assess the incidence and etiology of contaminated preservation fluid and to determine its impact on the subsequent development of infection after liver transplantation, we prospectively studied 50 consecutive liver transplants and cultured the following samples in each instance: preservation fluid (immediately before and at the end of the back-table procedure, and just before implantation), blood and bile from the donor, and ascitic fluid from the recipient. When any culture was positive, blood cultures were obtained and targeted antimicrobial therapy was started. We found that the incidence of contaminated preservation fluid was 92% (46 of the 50 liver transplants), but only 28% (14/50) were contaminated by recognized pathogens. Blood and bile cultures from the donor were positive in 28% and 6% of cases, respectively, whereas ascitic fluid was positive in 22%. The most frequently isolated microorganisms were coagulase-negative staphylococci. In nine cases, the microorganisms isolated from the preservation fluid matched those grown from the donor blood cultures, and in one case the isolate matched the one obtained from bile culture. No liver transplant recipient developed an infection due to transmission of an organism isolated from the preservation fluid. Our findings indicate that contamination of the preservation fluid is frequent in liver transplantation and is mainly caused by saprophytic skin flora. Transmission of infection is low, particularly among recipients given targeted antimicrobial treatment for organisms isolated from the preservation fluid
Prognostic value and risk stratification of residual disease in patients with incidental gallbladder cancer
Background and aim: given their poor prognosis, patients with residual disease (RD) in the re-resection specimen of an incidental gallbladder carcinoma (IGBC) could benefit from a better selection for surgical treatment. The Gallbladder Cancer Risk Score (GBRS) has been proposed to preoperatively identify RD risk more precisely than T-stage alone. The aim of this study was to assess the prognostic value of RD and to validate the GBRS in a retrospective series of patients. Material and methods: a prospectively collected database including 59 patients with IGBC diagnosed from December 1996 to November 2015 was retrospectively analyzed. Three locations of RD were established: local, regional, and distant. The effect of RD on overall survival (OS) was analyzed with the Kaplan-Meier method. To identify variables associated with the presence of RD, characteristics of patients with and without RD were compared using Fisher's exact test. The relative risk of RD associated with clinical and pathologic factors was studied with a univariate logistic regression analysis. Results: RD was found in 30 patients (50.8%). The presence of RD in any location was associated with worse OS (29% vs. 74.2%, p = 0.0001), even after an R0 resection (37.7% vs. 74.2%, p = 0.003). There was no significant difference in survival between patients without RD and those with local RD (74.2% vs. 64.3%, p = 0.266), nor between patients with regional RD and distant RD (16.1% vs. 20%, p = 0.411). After selecting patients in whom R0 resection was achieved (n = 44), the 5-year survival rate for patients without RD, with local RD, and with regional RD was 74.2%, 75%, and 13.9%, respectively (p = 0.0001). The GBRS could be calculated in 25 cases (42.3%), and its usefulness in predicting the presence of regional or distant RD (RDRD) was confirmed (80% in high-risk patients vs. 30% in intermediate-risk patients, p = 0.041). Conclusion: RDRD, but not local RD, represents a negative prognostic factor for OS. The GBRS was useful to preoperatively identify patients at high risk of RDRD. An R0 resection did not improve OS in patients with regional RD
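For readers who want to reproduce this kind of analysis, the sketch below outlines the statistical steps named in the abstract (Kaplan-Meier survival curves, a log-rank comparison, Fisher's exact test, and univariate logistic regression) in Python. The CSV file and column names are hypothetical placeholders, not the study's data.

```python
# Hypothetical sketch of the analysis steps named above (Kaplan-Meier overall
# survival, Fisher's exact test, univariate logistic regression). The CSV file
# and column names are illustrative placeholders, not the study's data.
import pandas as pd
import statsmodels.api as sm
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test
from scipy.stats import fisher_exact

df = pd.read_csv("igbc_cohort.csv")  # hypothetical file: one row per patient

# Kaplan-Meier overall survival, stratified by residual disease (RD: 0/1)
kmf = KaplanMeierFitter()
for label, group in df.groupby("residual_disease"):
    kmf.fit(group["os_months"], event_observed=group["death"], label=f"RD={label}")
    print(f"RD={label}: median OS =", kmf.median_survival_time_)

# Log-rank comparison of the two survival curves
rd, no_rd = df[df.residual_disease == 1], df[df.residual_disease == 0]
res = logrank_test(rd["os_months"], no_rd["os_months"],
                   event_observed_A=rd["death"], event_observed_B=no_rd["death"])
print("log-rank p =", res.p_value)

# Fisher's exact test: a binary clinical factor (e.g., T-stage >= T2) vs. RD
table = pd.crosstab(df["t_stage_ge2"], df["residual_disease"])
odds_ratio, p_value = fisher_exact(table)
print("Fisher exact p =", p_value)

# Univariate logistic regression: odds of RD for a single predictor
X = sm.add_constant(df["t_stage_ge2"])
print(sm.Logit(df["residual_disease"], X).fit(disp=0).summary())
```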
2-[18F]FDG PET/CT as a Predictor of Microvascular Invasion and High Histological Grade in Patients with Hepatocellular Carcinoma
Hepatocellular carcinoma (HCC) generally shows low avidity for 2-deoxy-2-[18F]fluoro-d-glucose (FDG) on PET/CT, although increased FDG uptake seems to be related to more aggressive biological factors. To define the prognostic value of FDG PET/CT in patients with HCC scheduled for tumor resection, 41 patients were prospectively studied. Histological factors of poor prognosis were determined, and FDG uptake in the HCC lesions was analyzed semi-quantitatively (lean body mass-corrected standardized uptake value (SUL) and tumor-to-liver ratio (TLR) at different time points). The PET metabolic parameters were related to the histological characteristics of the resected tumors and to patient outcomes. Microvascular invasion (MVI) and a poor grade of differentiation were significantly related to a worse prognosis. The lesion SULpeak 60 min post-FDG injection was the best parameter to predict MVI, whereas the TLRpeak at 60 min was better for predicting poor differentiation. Moreover, the latter parameter was also the best preoperative variable available to predict either of these two histological factors. Patients with an increased TLRpeak60 presented a significantly higher incidence of poor prognostic factors than the rest (75% vs. 28.6%, p = 0.005) and a significantly higher incidence of recurrence at 12 months (38% vs. 0%, p = 0.014). Therefore, a semi-quantitative analysis of certain metabolic parameters on PET/CT can help identify, preoperatively, patients with histological factors of poor prognosis, allowing the therapeutic strategy to be adjusted for patients at higher risk of early recurrence
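As a rough illustration of the semi-quantitative metrics referenced above, the following sketch computes a lean body mass-corrected SUV (SUL) and a tumor-to-liver ratio (TLR). The James lean body mass formula and all numeric inputs are assumptions for illustration only and may differ from the study's methodology.

```python
# Minimal sketch of the semi-quantitative PET metrics mentioned above (SUL and
# tumor-to-liver ratio). The James lean-body-mass formula is an assumption; the
# study may have used a different correction, and all inputs are illustrative.

def lean_body_mass_kg(weight_kg: float, height_cm: float, sex: str) -> float:
    """James formula for lean body mass (assumption; other formulas exist)."""
    ratio = weight_kg / height_cm
    if sex == "male":
        return 1.10 * weight_kg - 128.0 * ratio ** 2
    return 1.07 * weight_kg - 148.0 * ratio ** 2

def sul(activity_kbq_per_ml: float, injected_dose_mbq: float,
        weight_kg: float, height_cm: float, sex: str) -> float:
    """SUV normalized to lean body mass (SUL) instead of total body weight."""
    lbm_g = lean_body_mass_kg(weight_kg, height_cm, sex) * 1000.0
    dose_kbq_per_g = injected_dose_mbq * 1000.0 / lbm_g
    return activity_kbq_per_ml / dose_kbq_per_g

def tumor_to_liver_ratio(tumor_sul_peak: float, liver_sul: float) -> float:
    """TLR: lesion SULpeak divided by normal-liver background SUL."""
    return tumor_sul_peak / liver_sul

# Illustrative numbers only, not patients from the study
lesion = sul(activity_kbq_per_ml=12.0, injected_dose_mbq=300.0,
             weight_kg=78.0, height_cm=172.0, sex="male")
liver = sul(activity_kbq_per_ml=6.0, injected_dose_mbq=300.0,
            weight_kg=78.0, height_cm=172.0, sex="male")
print(f"lesion SUL = {lesion:.2f}, TLR = {tumor_to_liver_ratio(lesion, liver):.2f}")
```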
Liver resection for hepatocellular carcinoma in patients with clinically significant portal hypertension
Background & Aims: Liver resection (LR) in patients with hepatocellular carcinoma (HCC) and clinically significant portal hypertension (CSPH) defined as a hepatic venous pressure gradient (HVPG) >â10 mmHg is not encouraged. Here, we reap praised the outcomes of patients with cirrhosis and CSPH who underwent LR for HCC in highly specialised liver centres. Methods: This was a retrospective multicentre study from 1999 to 2019. Predictors for postoperative liver decompensation and textbook outcomes were identified. Results: In total, 79 patients with a median age of 65 years were included. The Child-Pugh grade was A in 99% of patients, and the median model for end-stage liver disease (MELD) score was 8. The median HVPG was 12 mmHg. Major hepatectomies and laparoscopies were performed in 28% and 34% of patients, respectively. Ninety-day mortality and severe morbidity rates were 6% and 27%, respectively. Postoperative and persistent liver decompensation occurred in 35% and 10% of patients at 3 months. Predictors of liver decompensation included increased preoperative HVPG (p = 0.004), increased serum total bilirubin (p = 0.02), and open approach (p = 0.03). Of the patients, 34% achieved a textbook outcome, of which the laparoscopic approach was the sole predictor (p = 0.004). The 5-year overall survival and recurrence-free survival rates were 55% and 43%, respectively. Conclusions: Patients with cirrhosis, HCC and HVPG >â10 mmHg can undergo LR with acceptable mortality, morbidity, and liver decompensation rates. The laparoscopic approach was the sole predictor of a textbook outcome
Patisiran treatment in patients with hereditary transthyretin-mediated amyloidosis with polyneuropathy after liver transplantation
Hereditary transthyretin-mediated (hATTR) amyloidosis, or ATTRv amyloidosis, is a progressive disease for which liver transplantation (LT) has been a long-standing treatment. However, disease progression continues post-LT. This Phase 3b, open-label trial evaluated the efficacy and safety of patisiran in patients with ATTRv amyloidosis with polyneuropathy progression post-LT. The primary endpoint was median transthyretin (TTR) reduction from baseline. Twenty-three patients received patisiran for 12 months alongside immunosuppression regimens. Patisiran elicited a rapid, sustained TTR reduction (median reduction [average of Months 6 and 12], 91.0%; 95% CI: 86.1%-92.3%); improved neuropathy, quality of life, and autonomic symptoms from baseline to Month 12 (mean change [SEM]: Neuropathy Impairment Score, -3.7 [2.7]; Norfolk Quality of Life-Diabetic Neuropathy questionnaire, -6.5 [4.9]; least-squares mean [SEM]: Composite Autonomic Symptom Score-31, -5.0 [2.6]); and stabilized disability (Rasch-built Overall Disability Scale) and nutritional status (modified body mass index). Adverse events were mild or moderate; five patients experienced ≥1 serious adverse event. Most patients had normal liver function tests. One patient experienced transplant rejection consistent with inadequate immunosuppression, remained on patisiran, and completed the study. In conclusion, patisiran reduced serum TTR, was well tolerated, and improved or stabilized key disease impairment measures in patients with ATTRv amyloidosis with polyneuropathy progression post-LT (www.clinicaltrials.gov NCT03862807)
Combined Liver-Kidney Transplantation With Preformed Anti-human Leukocyte Antigen Donor-Specific Antibodies
Introduction: the impact of preformed donor-specific anti-human leukocyte antigen (HLA) antibodies (pDSAs) after combined liver-kidney transplantation (CLKT) is still uncertain. Methods: we conducted a retrospective study in 8 European high-volume transplant centers and investigated the outcome of 166 consecutive CLKTs, including 46 patients with pDSAs. Results: patient survival was lower in those with pDSAs (5-year patient survival rate of 63% vs. 78% with or without pDSAs, respectively; P = 0.04). The presence of pDSAs with a mean fluorescence intensity (MFI) ≥ 5000 (hazard ratio 4.96; 95% confidence interval: 2.3-10.9; P < 0.001) and the presence of 3 or more pDSAs (hazard ratio 6.5; 95% confidence interval: 2.5-18.8; P = 0.05) were independently associated with death. Death-censored liver graft survival was similar in patients with and without pDSAs, and kidney graft survival was comparable in both groups (1- and 5-year death-censored graft survival rates of 91.6% and 79.5%, respectively, in patients with pDSAs and 93% and 88%, respectively, in the donor-specific antibody [DSA]-negative group; P = not significant). Despite a higher rate of kidney graft rejection in patients with pDSAs (5-year kidney graft survival rate without rejection of 87% vs. 97% with or without pDSAs, respectively; P = 0.04), kidney function did not differ statistically between the groups at 5 years post-transplantation (estimated glomerular filtration rate 45 ± 17 vs. 57 ± 29 ml/min per 1.73 m2 in patients with and without pDSAs, respectively). Five recipients with pDSAs (11.0%) experienced antibody-mediated kidney rejection, which led to graft loss in 1 patient. Conclusion: our results suggest that CLKT in patients with pDSAs is associated with lower patient survival despite good liver and kidney graft outcomes
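The hazard ratios quoted above come from a multivariable survival model; the sketch below shows how such estimates are typically obtained with a Cox proportional hazards fit. The dataset, file name, and variable names are hypothetical and are not taken from the study.

```python
# Hypothetical sketch of the kind of multivariable survival model behind the
# hazard ratios quoted above (Cox proportional hazards). The CSV file and
# column names are illustrative; they are not the study's actual dataset.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("clkt_cohort.csv")   # hypothetical: one row per recipient
df["mfi_ge_5000"] = (df["max_pdsa_mfi"] >= 5000).astype(int)
df["three_or_more_pdsa"] = (df["n_pdsa"] >= 3).astype(int)

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "death", "mfi_ge_5000", "three_or_more_pdsa"]],
    duration_col="followup_years",
    event_col="death",
)
cph.print_summary()   # the exp(coef) column gives hazard ratios with 95% CIs
```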
Everolimus plus minimized tacrolimus on kidney function in liver transplantation: REDUCE, a prospective, randomized controlled study
Background and aim: reduction in calcineurin inhibitor levels is considered crucial to decrease the incidence of kidney dysfunction in liver transplant (LT) recipients. The aim of this study was to evaluate the safety and impact of everolimus plus reduced tacrolimus (EVR + rTAC) vs. mycophenolate mofetil plus tacrolimus (MMF + TAC) on kidney function in LT recipients from Spain. Methods: the REDUCE study was a 52-week, multicenter, randomized, controlled, open-label, phase 3b study in de novo LT recipients. Eligible patients were randomized (1:1) 28 days post-transplantation to receive EVR + rTAC (TAC levels ≤ 5 ng/mL) or to continue with MMF + TAC (TAC levels 6-10 ng/mL). Mean estimated glomerular filtration rate (eGFR), clinical benefit in renal function, and safety were evaluated. Results: in the EVR + rTAC group (n = 105), eGFR increased from randomization to week 52 (82.2 [28.5] mL/min/1.73 m2 to 86.1 [27.9] mL/min/1.73 m2), whereas it decreased in the MMF + TAC group (n = 106) (88.4 [34.3] mL/min/1.73 m2 to 83.2 [25.2] mL/min/1.73 m2), with significant (p < 0.05) differences in eGFR throughout the study. However, both groups had a similar clinical benefit regarding renal function (improvement in 18.6% vs. 19.1%, and stabilization in 81.4% vs. 80.9% of patients in the EVR + rTAC vs. MMF + TAC groups, respectively). There were no significant differences in the incidence of acute rejection (5.7% vs. 3.8%), death (5.7% vs. 2.8%), or serious adverse events (51.9% vs. 44.0%) between the 2 groups. Conclusion: EVR + rTAC allows a safe reduction in tacrolimus exposure in de novo liver transplant recipients, with a significant improvement in eGFR but without significant differences in renal clinical benefit 1 year after liver transplantation
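The abstract reports eGFR in mL/min/1.73 m2 without naming the estimating equation; as a hedged illustration, the sketch below implements the CKD-EPI 2009 creatinine equation, which is one commonly used option and may not be the one applied in the REDUCE study.

```python
# The estimating equation is an assumption: the CKD-EPI 2009 creatinine
# equation is shown for illustration only and may differ from the study's.

def egfr_ckd_epi_2009(scr_mg_dl: float, age_years: float,
                      female: bool, black: bool = False) -> float:
    """Estimated GFR (mL/min/1.73 m2) from serum creatinine, CKD-EPI 2009."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Illustrative values only, not patients from the REDUCE trial
print(round(egfr_ckd_epi_2009(scr_mg_dl=1.0, age_years=58, female=False), 1))
```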
The Impact of Culturing the Organ Preservation Fluid on Solid Organ Transplantation: A Prospective Multicenter Cohort Study
Background. We analyzed the prevalence, etiology, and risk factors of culture-positive preservation fluid and their impact on the management of solid organ transplant recipients.
Methods. From July 2015 to March 2017, 622 episodes of adult solid organ transplants at 7 university hospitals in Spain were prospectively included in the study.
Results. The prevalence of culture-positive preservation fluid was 62.5% (389/622). Nevertheless, in only 25.2% (98/389) of the cases were the isolates considered "high risk" for pathogenicity. After applying a multivariate regression analysis, advanced donor age was the main associated factor for having culture-positive preservation fluid for high-risk microorganisms. Preemptive antibiotic therapy was given to 19.8% (77/389) of the cases. The incidence rate of preservation fluid-related infection was 1.3% (5 recipients); none of these patients had received preemptive therapy. Solid organ transplant (SOT) recipients with high-risk culture-positive preservation fluid receiving preemptive antibiotic therapy presented both a lower cumulative incidence of infection and a lower rate of acute rejection and graft loss compared with those who did not have high-risk culture-positive preservation fluid. After adjusting for age, sex, type of transplant, and prior graft rejection, preemptive antibiotic therapy remained a significant protective factor for 90-day infection.
Conclusions. The routine culture of preservation fluid may be considered a tool that provides information about the contamination of the transplanted organ. Preemptive therapy for SOT recipients with high-risk culture-positive preservation fluid may be useful to avoid preservation fluid-related infections and improve the outcomes of infection, graft loss, and graft rejection in transplant patients