12 research outputs found
Age at Time of Kidney Transplantation as a Predictor for Mortality, Graft Loss and Self-Rated Health Status: Results From the Swiss Transplant Cohort Study
Introduction: The effect of age on health outcomes in kidney transplantation remains inconclusive. This study aimed to analyze the relationship of age at the time of kidney transplantation with mortality, graft loss, and self-rated health status in adult kidney transplant recipients.
Methods: This study used prospective data from the Swiss Transplant Cohort Study on kidney transplant recipients transplanted between 2008 and 2017. Time-to-event analysis was performed using Cox regression and, in the case of graft loss, competing risk analysis. A random-intercept regression model was applied to analyse self-rated health status.
Results: We included 2,366 kidney transplant recipients. Age at transplantation linearly predicted mortality. It was also predictive of graft loss, though nonlinearly: recipients aged between 35 and 55 years had the lowest risk of graft loss. No relationship between age and self-rated health status was detected.
Conclusion: Higher mortality in older recipients is consistent with data from the general population. The non-linear relationship between age and graft loss, together with self-rated health status scoring higher at all follow-up time points than before transplantation regardless of age, highlights that age alone might not be an accurate measure for risk prediction and clinical decision making in kidney transplantation.
Immunogenicity of High-Dose vs. MF59-adjuvanted vs. Standard Influenza Vaccine in Solid Organ Transplant Recipients: The STOP-FLU trial.
BACKGROUND
The immunogenicity of the standard influenza vaccine is reduced in solid-organ transplant (SOT) recipients, so that new vaccination strategies are needed in this population.
METHODS
Adult SOT recipients from nine transplant clinics in Switzerland and Spain were enrolled if they were >3 months after transplantation and were randomized to receive the standard, the MF59-adjuvanted, or the high-dose influenza vaccine, with stratification by organ and time from transplant. The primary outcome was the vaccine response rate, defined as a ≥4-fold increase of hemagglutination-inhibition titers to at least one vaccine strain at 28 days post-vaccination. Secondary outcomes included PCR-confirmed influenza and vaccine reactogenicity.
RESULTS
In total, 619 patients were randomized, 616 received the assigned vaccines, and 598 had serum available for analysis of the primary endpoint (standard, n=198; MF59-adjuvanted, n=205; high-dose, n=195). Vaccine response rates were 42% (84/198) in the standard vaccine group, 60% (122/205) in the MF59-adjuvanted vaccine group, and 66% (129/195) in the high-dose vaccine group (difference, intervention vaccines vs. standard vaccine, 0.20 [97.5% CI 0.12-1]; p<0.001; high-dose vs. standard vaccine, 0.24 [95% CI 0.16-1]; p<0.001; MF59-adjuvanted vs. standard vaccine, 0.17 [97.5% CI 0.08-1]; p<0.001). Influenza occurred in 6% of patients in the standard, 5% in the MF59-adjuvanted, and 7% in the high-dose vaccine group. Vaccine-related adverse events occurred more frequently in the intervention vaccine groups, but most were mild.
CONCLUSIONS
In SOT recipients, use of an MF59-adjuvanted or a high-dose influenza vaccine was safe and resulted in a higher vaccine response rate.
TRIAL REGISTRATION
Clinicaltrials.gov NCT03699839
Immune monitoring-guided vs. fixed duration of antiviral prophylaxis against cytomegalovirus in solid-organ transplant recipients: A Multicenter, Randomized Clinical Trial.
BACKGROUND
The use of assays detecting cytomegalovirus (CMV)-specific T-cell-mediated immunity may individualize the duration of antiviral prophylaxis in transplant recipients.
METHODS
In this open-label randomized trial, adult kidney and liver transplant recipients from six centers in Switzerland were enrolled if they were CMV-seronegative with seropositive donors or CMV-seropositive receiving anti-thymocyte globulins. Patients were randomized to a duration of antiviral prophylaxis based on immune-monitoring (intervention) or a fixed duration (control). Patients in the control group were planned to receive 180 days (CMV-seronegative) or 90 days (CMV-seropositive) of valganciclovir. Patients were assessed monthly with a CMV-specific interferon gamma release assay (T-Track® CMV); prophylaxis in the intervention group was stopped if the assay was positive. The primary outcomes were the proportion of patients with clinically significant CMV infection and reduction in days of prophylaxis. Between-group differences were adjusted for CMV serostatus.
RESULTS
Overall, 193 patients were randomized (92 in the immune-monitoring and 101 in the control group), of whom 185 had evaluation of the primary endpoint (87 and 98 patients, respectively). Clinically significant CMV infection occurred in 26/87 patients (adjusted percentage, 30.9%) in the immune-monitoring group and in 32/98 patients (adjusted percentage, 31.1%) in the control group (adjusted risk difference -0.1%; 95% CI -13.0% to 12.7%; p = 0.064). The duration of antiviral prophylaxis was shorter in the immune-monitoring group (adjusted difference -26.0 days; 95% CI -41.1 to -10.8 days; p < 0.001).
CONCLUSIONS
Immune monitoring resulted in a significant reduction in the duration of antiviral prophylaxis, but we were unable to establish the noninferiority of this approach for the co-primary endpoint of clinically significant CMV infection.
Alterations of the thoracic spine in Marfan's syndrome
OBJECTIVE: The purpose of this study was to determine whether the thoracic vertebral elements are altered in patients with Marfan's syndrome.
MATERIALS AND METHODS: Thirty patients underwent helical CT of the thorax because of suspected thoracic aortic dilatation and acute dissection; 13 had Marfan's syndrome and 17 did not. Two reviewers, unaware of the final diagnosis, evaluated the images by consensus for laminar thickness, foraminal width, dural sac ratios, and vertebral scalloping for T2-T12.
RESULTS: At T9-T12, dural sac ratios at the midcorpus level (p = 0.031) and foraminal width (p = 0.0124) were significantly greater in the patients with Marfan's syndrome than in the patients without. Dural sac ratios at lower endplate levels (p = 0.0685), laminar thickness (p = 0.951), and vertebral scalloping (p = 0.24) were not significantly greater in the patients with Marfan's syndrome than in the patients without.
CONCLUSION: Because the phenotypic expression of Marfan's syndrome is variable, information on the spine from thoracic studies, in combination with the major criteria, may be helpful clinically.
Impact of Hyponatremia after Renal Transplantation on Decline of Renal Function, Graft Loss and Patient Survival: A Prospective Cohort Study
BACKGROUND
Hyponatremia is one of the most common electrolyte disorders observed in hospitalized and ambulatory patients, and it is associated with increased falls, fractures, prolonged hospitalization, and mortality. Because the clinical importance of hyponatremia in the renal transplant field is not well established, the aim of this study was to determine the relationship of hyponatremia with mortality as the main outcome, and with renal function decline and graft loss as secondary outcomes, in a prospective cohort of renal transplant recipients.
METHODS
This prospective cohort study included 1315 patients between 1 May 2008 and 31 December 2014. Hyponatremia was defined as a sodium concentration below 136 mmol/L at 6 months after transplantation. The main endpoint was mortality. A secondary composite endpoint was defined as rapid decline in renal function (eGFR decline ≥5 mL/min/1.73 m² per year), graft loss, or mortality.
RESULTS
Mean sodium was 140 ± 3.08 mmol/L. Ninety-seven patients displayed hyponatremia, with a mean of 132.9 ± 3.05 mmol/L. Hyponatremia at 6 months after transplantation was associated neither with mortality (HR: 1.02; 95% CI: 0.47-2.19; p = 0.97) nor with the composite outcome of rapid decline in renal function, graft loss, or mortality (log-rank test p = 0.9).
CONCLUSIONS
Hyponatremia at 6 months after transplantation is not associated with mortality in kidney allograft patients.
Novel rescue procedure for inferior vena cava reconstruction in living-donor liver transplantation using a vascular graft recovered 25 h after the donor's circulatory death, and systematic review
Liver transplantation is a lifesaving treatment for patients suffering from end-stage liver disease. Rarely, acute congestion of the inferior vena cava (IVC) is encountered because of tumor compression. MELD allocation does not reflect the severity of this condition because of the lack of organ failure. Herein, we present a patient undergoing urgent living-donor liver transplantation (LDLT) with IVC reconstruction for a fast-growing hepatic epithelioid hemangioendothelioma (HEH). IVC reconstruction using a venous graft recovered 25 h after a prior donor's circulatory death became necessary to compensate for severe venous congestion. Additionally, a systematic review of the literature was performed by searching MEDLINE/PubMed. The protocol and eligibility criteria were specified in advance and registered in the PROSPERO registry (CRD42013004827). Published literature on IVC reconstruction in LDLT was selected. Two reports describing IVC reconstruction with cryopreserved IVC grafts and one IVC reconstruction using an IVC graft from a donor deceased after circulatory death were included; follow-up was 12 and 13 months, respectively. Regarding graft recovery in the setting of living-related donation, the present graft remained patent during the nine-month follow-up period. This is the first report on the use of a venous graft from a circulatory-death donor not eligible for whole organ recovery. We demonstrate the feasibility of using a size- and blood-group-compatible IVC graft from a cold-stored donor, which can solve the problem of urgent IVC reconstruction in patients undergoing LDLT.
Pre-transplant donor-specific HLA antibodies and risk for poor first-year renal transplant outcomes: results from the Swiss Transplant Cohort Study.
The aim of this study was to analyze first-year renal outcomes in a nationwide prospective multicenter cohort comprising 2215 renal transplants, with a special emphasis on the presence of pre-transplant donor-specific HLA antibodies (DSA). All transplants had a complete virtual crossmatch, and DSA were detected in 19% (411/2215). The investigated composite endpoint was a poor first-year outcome, defined as (i) allograft failure, (ii) death, or (iii) poor allograft function (eGFR ≤25 ml/min/1.73 m²) at one year. Two hundred and twenty-one transplants (221/2215; 10%) showed a poor first-year outcome. Rejection (24/70; 34%) was the most common reason for graft failure. First-year patient death was rare (48/2215; 2%). There were no statistically significant differences between DSA-positive and DSA-negative transplants regarding the composite and individual endpoints, or regarding the reasons for graft failure and death. DSA-positive transplants more frequently experienced rejection episodes, mainly antibody-mediated rejection (both P < 0.0001). The combination of DSA and any first-year rejection was associated with the poorest overall death-censored allograft survival (P < 0.0001). In conclusion, the presence of pre-transplant DSA per se does not affect first-year outcomes. However, DSA-positive transplants experiencing first-year rejection are a high-risk population for poor allograft survival and may benefit from intense clinical surveillance.
Image_1_Frequency and impact on renal transplant outcomes of urinary tract infections due to extended-spectrum beta-lactamase-producing Escherichia coli and Klebsiella species.TIF
Background: Enterobacterales are often responsible for urinary tract infection (UTI) in kidney transplant recipients. Among these, extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli or Klebsiella species are emerging. However, there are only scarce data on the frequency and impact of ESBL-UTI on transplant outcomes.
Methods: We investigated the frequency and impact of first-year UTI events with ESBL-producing Escherichia coli and/or Klebsiella species in a prospective multicenter cohort of 1,482 kidney transplants performed between 2012 and 2017, focusing on the 389 kidney transplants with at least one UTI caused by Escherichia coli and/or Klebsiella species. The cohort had a median follow-up of four years.
Results: In total, 139/825 (17%) first-year UTI events in 69/389 (18%) transplant recipients were caused by ESBL-producing strains. UTI phenotypes and the proportion of ESBL-producing strains among all UTI events over time did not differ from UTI caused by non-ESBL-producing strains. However, hospitalization was more often observed for UTI with ESBL-producing strains (39% versus 26%, p = 0.04). Transplant recipients with a first-year UTI event caused by an ESBL-producing strain more frequently had recurrent UTI (33% versus 18%, p = 0.02), but there was no significant difference in one-year kidney function or in longer-term graft and patient survival between patients with and without ESBL-UTI.
Conclusion: First-year UTI events with ESBL-producing Escherichia coli and/or Klebsiella species are associated with a higher need for hospitalization but impact neither allograft function nor allograft and patient survival.