
    Early mortality after cardiac transplantation: should we do better?

    BACKGROUND: According to International Society for Heart and Lung Transplantation (ISHLT) data, 30-day survival after heart transplantation has continually improved, from 84% (1979-85) to 91% (1996-2001). This has probably been achieved by better donor/recipient selection, along with improved surgical technique and immunosuppressive therapy. On the other hand, the data concerning the early causes of death after cardiac transplantation are incomplete, because in 25% of cases an unknown cause is listed. This study investigated the incidence and causes of 30-day mortality (determined by postmortem studies) after cardiac transplantation and assessed the scope for improvement. METHODS: A retrospective study of all patients who underwent heart transplantation at Papworth Hospital from 1979 to June 2001 (n = 879) and who died within 30 days of surgery was carried out. Postmortem examination data were available for all patients. RESULTS: The mean (standard deviation) recipient and donor ages were 46 (12) and 31 (12) years, respectively. Overall, the 30-day mortality was 8.5% (n = 75): 12.1% for the 1979 to 1985 period and 6.9% for the 1996 to 2001 period. The primary causes of death were graft failure (30.7%), acute rejection (22.7%; 1.3% for the 1996-2001 era), sepsis (18.7%), gastrointestinal problems (bowel infarction and pancreatitis; 9.3%), postoperative bleeding (6.7%), and other causes (12%). CONCLUSIONS: Our 30-day mortality compares favorably with the data from the ISHLT registry, with great improvement in early mortality. Acute rejection is no longer a major cause of early mortality. Further reduction may be achieved by better protection of the donor heart against the effects of brainstem death and ischemic injury. However, the quest to improve early outcomes should not come at the expense of needy patients by being overselective.

    Are non-brain stem-dead cardiac donors acceptable donors?

    BACKGROUND: The deleterious effects of brainstem death (BSD) on donor cardiac function and endothelial integrity have been documented previously. Domino cardiac donation (the heart of a heart-lung recipient transplanted into another recipient) avoids the effects of brainstem death and may confer both short- and long-term benefits to allograft recipients. METHODS: This study evaluated short- and long-term outcomes in recipients of hearts from BSD (cadaveric) donors as compared with domino hearts explanted from patients who underwent heart-lung transplantation. RESULTS: All patients who underwent cardiac transplantation between April 1989 and August 2001 at Papworth Hospital were included (n = 571). Domino donor hearts were used in 81 (14%) of these cases. The pre-operative transpulmonary gradient was not significantly different between the two groups (p = 0.7). There was no significant difference in 30-day mortality (4.9% for domino vs 8.6% for BSD, p = 0.38) or in actuarial survival (p = 0.72). Ischemic time was significantly longer in the BSD group (p < 0.001). Acute rejection and infection episodes were not significantly different (p = 0.24 and p = 0.08, respectively). Relative to the BSD group, the risk (95% confidence interval) of acute rejection in the domino group was 0.89 (0.73 to 1.08). Similarly, the relative risk of infection was 0.78 (0.59 to 1.03). The 5-year actuarial survival rates (95% confidence interval) were 78% (69% to 87%) and 69% (65% to 73%) in the domino and BSD groups, respectively. Angiography data at 2 years were available for 50 (62%) and 254 (52%) patients in the domino and BSD groups, respectively. The rates of 2-year freedom from cardiac allograft vasculopathy (CAV) were 96% (91% to 100%) and 93% (90% to 96%), respectively. CONCLUSION: Despite avoiding brainstem death-associated endothelial cell activation and despite a shorter ischemic time, the performance of domino donor hearts was similar to that of BSD donor hearts. This may indicate a similar pathology (i.e., endothelial cell activation) in the domino donors.
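    Relative risks of the kind reported above (e.g., 0.89 with a 95% CI of 0.73 to 1.08) are conventionally computed with the log-normal approximation for a ratio of two proportions. The sketch below uses hypothetical event counts (the underlying 2x2 counts are not given in the abstract) purely to illustrate the calculation; the function name is my own.

```python
import math

def relative_risk_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs group B, with a confidence
    interval from the log-normal approximation (z=1.96 gives 95%)."""
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    rr = risk_a / risk_b
    # Standard error of log(RR) for two independent proportions
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts, not taken from the study
rr, lo, hi = relative_risk_ci(4, 81, 42, 490)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A CI that spans 1.0, as both intervals above do, is consistent with the reported lack of a significant difference between the groups.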

    Immunosuppression, eotaxin and the diagnostic changes in eosinophils that precede early acute heart allograft rejection.

    Peripheral blood eosinophil counts (EOS) are undetectable in 40% of blood samples sent for routine haematology at Papworth Hospital during the first 3 months after heart transplantation (HTx). Increases in EOS usually precede the development of allograft rejection by a median of 4 days. We compared the effects of cyclosporin (dose and total blood concentration), prednisolone (dose and both total and unbound plasma concentrations) and azathioprine, as well as plasma concentrations of the CCR-3 chemokines eotaxin and RANTES, on changes in EOS in 47 consecutive HTx recipients, with a median follow-up of 90 (IQR 85-95) days. Multivariate analysis confirmed an independent association between both prednisolone dose (P<0.0001) and eotaxin (P<0.0001) and changes in EOS. The plasma eotaxin concentration was, in turn, most closely associated with the cyclosporin dose (P<0.001) and plasma prednisolone concentration (P=0.022). The blood cyclosporin concentration (P=0.028), EOS (P=0.012) and prednisolone dose (P=0.015) were all independently associated with the risk of treated acute rejection. When prednisolone pharmacokinetic parameters were substituted for the prednisolone dose in this multivariate model, only the pharmacokinetic parameter retained a significant association with the risk of rejection. Changes in EOS preceding cardiac allograft rejection are directly associated with plasma eotaxin concentrations and indirectly with prednisolone dosage. Cyclosporin may also indirectly influence these changes by inhibiting eotaxin production. EOS, prednisolone dose and blood cyclosporin concentration were independently associated with the risk of acute rejection. The total and unbound fractions of prednisolone in plasma appear to be even more closely related to rejection but are difficult to measure. Monitoring EOS, as a surrogate measure of prednisolone immunosuppression, may be more cost-effective for controlling rejection than conventional cyclosporin monitoring in the first 6 weeks after HTx.

    Socioeconomic Deprivation and Survival After Heart Transplantation in England: An Analysis of the United Kingdom Transplant Registry.

    BACKGROUND: Socioeconomic deprivation (SED) is associated with shorter survival across a range of cardiovascular and noncardiovascular diseases. The association of SED with survival after heart transplantation in England, where there is universal healthcare provision, is unknown. METHODS AND RESULTS: Long-term follow-up data were obtained for all patients in England who underwent heart transplantation between 1995 and 2014. We used the United Kingdom Index of Multiple Deprivation (UK IMD), a neighborhood-level measure of SED, to estimate the relative degree of deprivation for each recipient. Cox proportional hazards models were used to examine the association between SED and overall survival and conditional survival (dependent on survival to 1 year after transplantation) during follow-up. Models were stratified by transplant center and adjusted for donor and recipient age and sex, ethnicity, serum creatinine, diabetes mellitus, and heart failure cause. A total of 2384 patients underwent heart transplantation. There were 1101 deaths during 17 040 patient-years of follow-up. Median overall survival was 12.6 years, and conditional survival was 15.6 years. Comparing the most deprived with the least deprived quintile, adjusted hazard ratios for all-cause mortality were 1.27 (1.04-1.55; P=0.021) and 1.59 (1.22-2.09; P=0.001) in the overall and conditional models, respectively. Median overall survival and conditional survival were 3.4 years shorter in the most deprived quintile than in the least deprived. CONCLUSIONS: Higher SED is associated with shorter survival in heart transplant recipients in England and should be considered when comparing outcomes between centers. Future research should seek to identify modifiable mediators of this association.
    No direct funding was provided for the conduct of this study. JE completed part of this work as part of an academic clinical fellowship, during which he spent time at the University of Cambridge Cardiovascular Epidemiology Unit receiving training in research methods, supported by SK and EDA. The Cardiovascular Epidemiology Unit is funded by the UK Medical Research Council (G0800270), British Heart Foundation (SP/09/002), British Heart Foundation Cambridge Cardiovascular Centre of Excellence, and UK National Institute for Health Research Cambridge Biomedical Research Centre. This is the author accepted manuscript. The final version is available from the American Heart Association via https://doi.org/10.1161/CIRCOUTCOMES.116.00265
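    The deprivation quintiles compared above are conventionally derived from IMD ranks, where rank 1 is the most deprived neighborhood. A minimal sketch, assuming rank-based quintile assignment; the 32,844-area denominator matches the 2015 English IMD and is used here only for illustration:

```python
def imd_quintile(rank, n_areas):
    """Map an IMD rank (1 = most deprived) to a quintile 1-5,
    where quintile 1 is the most deprived fifth of areas."""
    # Integer ceil(rank / (n_areas / 5)) without floating-point drift,
    # clamped to 5 for the final rank
    return min(5, (rank * 5 - 1) // n_areas + 1)

# Hypothetical ranks out of 32,844 small areas
for rank in (1, 6569, 16422, 32844):
    print(rank, imd_quintile(rank, 32844))
```

With quintiles in hand, the most and least deprived fifths can be entered as a categorical covariate in a Cox model, as the study describes.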

    Prognostic value of three iron deficiency definitions in patients with advanced heart failure

    Aims: There is uncertainty about the definition of iron deficiency (ID) and the association between ID and prognosis in patients with advanced heart failure. We evaluated three definitions of ID in patients referred for heart transplantation. Methods and results: Consecutive patients assessed for heart transplantation at a single UK centre between January 2010 and May 2022 were included. ID was defined as (1) serum ferritin concentration <100 ng/ml, or 100–299 ng/ml with transferrin saturation <20% (guideline definition); (2) serum iron concentration ≤13 μmol/L; or (3) transferrin saturation <20%. The primary outcome measure was a composite of all-cause mortality, urgent heart transplantation or need for mechanical circulatory support. Overall, 801 patients were included, and the prevalence of ID was 39–55% depending on the definition used. ID, defined by either serum iron or transferrin saturation, was an independent predictor of the primary outcome measure (hazard ratio [HR] 1.532, 95% confidence interval [CI] 1.264–1.944, and HR 1.595, 95% CI 1.323–2.033, respectively), but the same association was not seen with the guideline definition of ID (HR 1.085, 95% CI 0.8827–1.333). These findings were robust in multivariable Cox regression analysis. ID, by all definitions, was associated with lower 6-min walk distance, lower peak oxygen consumption, higher intra-cardiac filling pressures and lower cardiac output. Conclusions: Iron deficiency, when defined by serum iron concentration or transferrin saturation, was associated with an increased frequency of adverse clinical outcomes in patients with advanced heart failure. The same association was not seen with the guideline definition of ID.
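    The three definitions can be expressed as a simple classifier. This is a sketch of the published cut-offs only; the function and parameter names are my own, and it is not taken from the study's code.

```python
def iron_deficiency(ferritin_ng_ml, tsat_pct, serum_iron_umol_l):
    """Classify iron deficiency (ID) under the three definitions:
    (1) guideline: ferritin <100 ng/ml, or 100-299 ng/ml with TSAT <20%
    (2) serum iron <= 13 umol/L
    (3) transferrin saturation (TSAT) < 20%"""
    guideline = (ferritin_ng_ml < 100
                 or (100 <= ferritin_ng_ml <= 299 and tsat_pct < 20))
    by_serum_iron = serum_iron_umol_l <= 13
    by_tsat = tsat_pct < 20
    return {"guideline": guideline,
            "serum_iron": by_serum_iron,
            "tsat": by_tsat}

# Example: ferritin 150 ng/ml, TSAT 15%, serum iron 10 umol/L
print(iron_deficiency(150, 15, 10))
```

Note the definitions can disagree: a patient with ferritin 150 ng/ml and TSAT 25% is ID-negative on all three, but dropping the TSAT below 20% flips both the guideline and TSAT definitions, which is how the 39-55% prevalence spread arises.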

    Malignancy after Heart Transplantation: Analysis of 24-Year Experience at a Single Center

    WOS: 000269540900029; PubMed ID: 19486222. Background: Malignancy is an important complication after heart transplantation. The incidence, spectrum, risk factors, and clinical impact of posttransplant malignancy were investigated in a cohort of patients with long-term follow-up at a single center. Methods: Data for 835 patients who underwent heart transplantation between 1979 and 2002 and survived beyond one month were retrospectively evaluated for posttransplant skin cancer, solid organ tumors, and lymphoma. Results: One hundred thirty-nine malignancies developed in 126 patients (15.1%). Skin cancer, solid organ tumors, and lymphoma represented 49%, 27%, and 24% of the malignancies, respectively. Mean patient age at transplantation for patients developing skin cancer, solid organ tumor, and lymphoma was 50 years, 51 years, and 46 years, respectively (p = 0.024). Risk factors for skin cancer were: age greater than 40 years at transplantation, number of treated rejection episodes in the first year after transplantation, and smoking history. Variables associated with solid organ malignancy development were age and smoking history. No variable was related to the development of posttransplant lymphoma. Median survival after diagnosis of skin cancer, solid organ tumor, and lymphoma was 5.0 years, 0.3 years, and 0.7 years, respectively (p < 0.001). Conclusions: Posttransplant malignancies have different risk factors and variable clinical impact. Older age at transplantation, smoking history, and more episodes of treated rejection were related to an increased incidence of nonlymphoid malignancy after heart transplantation, whereas no variable was associated with lymphoid malignancy. Skin cancers have a benign course, while solid organ malignancies and lymphomas carry an unfavorable prognosis. doi: 10.1111/j.1540-8191.2009.00858.x (J Card Surg 2009;24:572-579).

    Does heart transplantation confer survival benefit in all risk groups?

    BACKGROUND: Over 50,000 heart transplants have been performed in the last 3 decades. The global shortage of donor organs and the relaxation of candidate selection criteria over time have resulted in recent controversy about the benefits of heart transplantation for some risk groups. We assessed the survival benefit in the Papworth Hospital heart transplant population overall, taking into account resuscitated marginal donors and high-risk recipients. METHODS: All heart transplant patients listed between 1979 and June 2002 were analyzed (n = 1,212). Of these, 931 cardiac transplantations were performed, including the use of 126 marginal donors. High-risk recipients (n = 163) were defined as patients who were in the hospital, on intravenous inotropic drugs, and/or had a high transpulmonary gradient (>15 mm Hg). Using Cox regression with transplantation as a time-dependent covariate, we assessed the survival benefit of transplantation. In our model we assumed that after transplantation the initial risk of death is high relative to continued waiting, followed by an exponential decline in risk. The crossover point (COP) is the time at which the risk of death after transplantation equals that of continued waiting (i.e., the relative risk is 1). The equity point (EP) is the time at which the early post-operative risk is offset by the later period of lower risk and, therefore, the time from which transplantation confers a survival advantage. RESULTS: Overall, the COP was at 54 days and the EP at 141 days. In the marginal donor sub-group, the COP was reached at 32 days and the EP at 72 days, indicating a survival benefit. The difference in the COP and EP between the borderline donor and normal donor sub-groups was not statistically significant. Post-transplant survival was not significantly different from that of recipients of normal cardiac allografts (p = 0.43). Likewise, for the high-risk recipient group, the COP and EP were at 72 and 203 days. Although post-operative survival was significantly shorter than in the normal-risk group, both groups achieved a survival benefit. CONCLUSION: Heart transplantation provides a survival benefit in these risk groups of recipients in our population. This reflects our active donor management protocol and rigorous donor and recipient selection process.
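    Under the model described (a roughly constant waiting-list hazard and a post-transplant relative risk that starts high and declines exponentially, RR(t) = R0·exp(−λt)), the COP has a closed form, ln(R0)/λ, and the EP is the first positive root of the cumulative excess hazard R0·(1 − exp(−λt))/λ − t. The parameters below are hypothetical, chosen only so the COP lands near the reported 54 days; they are not fitted to the Papworth data.

```python
import math

def crossover_point(r0, lam):
    """Time at which the relative risk R0*exp(-lam*t) falls to 1."""
    return math.log(r0) / lam

def equity_point(r0, lam, t_max=1000.0, step=0.01):
    """First positive time at which the early excess risk is offset
    by the later lower-risk period: solves
    R0*(1 - exp(-lam*t))/lam - t = 0 by simple scanning."""
    t = step
    while t < t_max:
        if r0 * (1 - math.exp(-lam * t)) / lam - t <= 0:
            return t
        t += step
    return None

# Hypothetical parameters (illustration only)
r0, lam = 5.0, 0.03
print(f"COP = {crossover_point(r0, lam):.0f} days")
print(f"EP  = {equity_point(r0, lam):.0f} days")
```

The EP always falls after the COP, since the excess early risk accumulated before the COP still has to be repaid by the lower-risk period that follows it.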

    Short- and long-term outcomes of combined cardiac and renal transplantation with allografts from a single donor.

    Coexisting end-stage heart and kidney failure can be treated by combined cardiac and renal transplantation. This study reviews the short- and long-term outcomes of such procedures over a 16-year period at a single institution. All patients who underwent single-donor simultaneous heart and kidney transplantation between March 1986 and April 2002 (including heart retransplantation) were included (n = 13). They were listed for combined heart and kidney transplantation because they fulfilled our criteria for irreversible end-stage organ failure. A retrospective review of patient data from the transplant database, patient case notes and post-mortem reports was carried out. The mean (SD) recipient age was 45 (12) years, and 2 recipients were female. The mean pre-operative creatinine level was 724 (415) micromol/liter, with 9 patients (69.2%) on continuous ambulatory peritoneal dialysis and 2 patients (15.4%) on hemodialysis prior to transplantation. The 30-day mortality rate was 15.4% (2 of 13). For surviving patients, the mean creatinine level at hospital discharge was 158 (93) micromol/liter. The mean number of acute cardiac rejection episodes per 100 patient-days was significantly lower (p = 0.01) than that for the heart-only transplant group (n = 760) during the same period. The median (interquartile range) post-operative survival was 1,969 (620 to 3,468) days. The actuarial survival rates (95% confidence interval) at 1 and 10 years were 77% (54% to 100%) and 67% (40% to 94%), respectively, and were not significantly different from those of the isolated heart transplant population (p = 0.68). Only 1 episode of acute renal rejection was diagnosed, on clinical grounds, and was treated accordingly. There was no renal allograft loss among the long-term survivors. Combined cardiac and renal transplantation with allografts from the same donor has acceptable short- and long-term outcomes for patients with coexisting end-stage cardiac and renal failure. This group of patients may also experience fewer acute rejection episodes post-operatively.

    Risk factors for the development and progression of dyslipidemia after heart transplantation

    Background. Hyperlipidemia is an important complication after organ transplantation and contributes to the development of posttransplant accelerated coronary artery disease. Methods. We retrospectively evaluated the relative contribution of various risk factors associated with the development and progression of hyperlipidemia in 194 heart transplant recipients, using mixed-effects multiple linear regression analysis. The demographic characteristics evaluated were primary diagnosis of ischemic heart disease (IHD), gender, and age. Postoperative characteristics included the number of treated rejections; the dosages of cyclosporine (CYA), tacrolimus (TAC), prednisolone and azathioprine; and the concentrations of serum creatinine and glucose. The effects of administration of antihypertensive agents, diuretics, and lipid-lowering agents were also studied. Results. The total cholesterol concentration increased significantly in the first 3 months posttransplant but gradually decreased thereafter. Total cholesterol and the ratio of low-density lipoprotein (LDL) cholesterol to high-density lipoprotein (HDL) cholesterol (LDL-C/HDL-C) increased to a greater extent in patients with IHD, while female transplant recipients had a greater increase in the total cholesterol concentration. Each episode of rejection increased serum cholesterol by 0.306 mmol/liter (0.258, 0.355) [mean (95% C.I.)] and serum triglyceride by 0.164 mmol/liter (0.12, 0.209), whereas switching to TAC improved total cholesterol and LDL-C/HDL-C. Administration of frusemide increased total cholesterol and LDL-C/HDL-C, whereas administration of bumetanide or metolazone increased the concentration of serum triglyceride. Serum glucose was associated with hypertriglyceridemia, whereas serum creatinine was associated with increases in total cholesterol, LDL-C/HDL-C and triglyceride. Conclusions. We have identified demographic and postoperative covariables that predispose heart transplant recipients to hyperlipidemia. Some of these risk factors, such as the effect of diuretics, have not previously been identified in this group of patients and may be amenable to modification or closer control. TAC rather than CYA may be the immunosuppressant of choice for patients at greater risk of developing hyperlipidemia.
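    As a quick use of the reported coefficients, the expected lipid rise from repeated treated rejection is a linear extrapolation of the per-episode effects, ignoring the other covariates in the mixed-effects model. A sketch with hypothetical naming:

```python
def lipid_increase(rejection_episodes):
    """Predicted rise in serum lipids (mmol/L) from treated rejection
    episodes, using the per-episode mean effects reported above.
    A linear extrapolation only; other model covariates are ignored."""
    CHOL_PER_EPISODE = 0.306  # total cholesterol, mmol/L per episode
    TG_PER_EPISODE = 0.164    # triglyceride, mmol/L per episode
    return {
        "cholesterol": rejection_episodes * CHOL_PER_EPISODE,
        "triglyceride": rejection_episodes * TG_PER_EPISODE,
    }

print(lipid_increase(3))
```

Three treated episodes would thus be expected to add roughly 0.9 mmol/L of total cholesterol, which illustrates why the number of treated rejections appears as a meaningful covariable.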