
    Presentation of the study PREDARF: preoperative detection of age-related factors


    Comparison of low potassium Euro-Collins solution and standard Euro-Collins solution in an extracorporeal rat heart-lung model

    OBJECTIVE: Euro-Collins solution (EC) is routinely used in lung transplantation. The high potassium concentration of EC, however, may damage the vascular endothelium, thereby contributing to postischemic reperfusion injury. To assess the influence of the potassium concentration on lung preservation, we evaluated the effect of a "low potassium Euro-Collins solution" (LPEC), in which the sodium and potassium concentrations were reversed. METHODS: In an extracorporeal rat heart-lung model, lungs were preserved with EC or LPEC. The heart-lung blocks (HLB) were perfused with Krebs-Henseleit solution containing washed bovine red blood cells and ventilated with room air. The lungs were perfused via the working right ventricle with deoxygenated perfusate. Oxygenation and pulmonary vascular resistance (PVR) were monitored. After baseline measurements, hearts were arrested with St. Thomas' solution and the lungs were perfused with EC or LPEC, or were not perfused (controls). The HLBs were stored at 4 degrees C for an ischemic time of 5 min or 2 h. Reperfusion and ventilation were performed for 40 min. At the end of the trial, the wet/dry ratio of the lungs was calculated and the degree of edema was assessed by light microscopy. RESULTS: After 5 min of ischemia, oxygenation was significantly better in both preserved groups than in the controls. Pulmonary vascular resistance was elevated in all three groups after 30 min of reperfusion at both ischemic times. After 2 h of ischemia, PVR in the group preserved with LPEC was significantly lower than in the EC and control groups (LPEC-5 min: 184 +/- 65 dynes * sec * cm-5, EC-5 min: 275 +/- 119 dynes * sec * cm-5, LPEC-2 h: 324 +/- 47 dynes * sec * cm-5, EC-2 h: 507 +/- 83 dynes * sec * cm-5). Oxygenation after 2 h of ischemia and 30 min of reperfusion was significantly better in the LPEC group than in the EC group and controls (LPEC: 70 +/- 17 mmHg, EC: 44 +/- 3 mmHg). The wet/dry ratio was significantly lower in the two preserved groups than in the controls (LPEC-5 min: 5.7 +/- 0.7, EC-5 min: 5.8 +/- 1.2, controls-5 min: 7.5 +/- 1.8, LPEC-2 h: 6.7 +/- 0.4, EC-2 h: 6.9 +/- 0.4, controls-2 h: 7.3 +/- 0.4). CONCLUSIONS: We conclude that LPEC results in better oxygenation and lower PVR in this lung preservation model. A low potassium concentration in lung preservation solutions may help reduce the incidence of early graft dysfunction following lung transplantation.
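
    The PVR values above are reported in dynes * sec * cm-5. The abstract does not state how PVR was derived in this circuit; the sketch below only shows the standard conversion of a pressure gradient (mmHg) and flow (L/min) into these units. The function name and example values are hypothetical, not data from the study.

```python
# Sketch (not the authors' code): standard conversion of a perfusion pressure
# gradient and flow into vascular resistance in dynes * sec * cm-5.
# All values below are hypothetical illustrations, not data from the study.

def vascular_resistance_dynes(inflow_pressure_mmhg: float,
                              outflow_pressure_mmhg: float,
                              flow_l_per_min: float) -> float:
    """R = 80 * (P_in - P_out) / Q.

    The factor 80 converts mmHg * min/L (Wood units) to dynes * sec * cm-5:
    1 mmHg = 1333.22 dyn/cm^2, 1 min = 60 s, 1 L = 1000 cm^3,
    so 1333.22 * 60 / 1000 is approximately 80.
    """
    return 80.0 * (inflow_pressure_mmhg - outflow_pressure_mmhg) / flow_l_per_min

# Hypothetical example: an 8 mmHg gradient at 2.5 L/min gives 256 dynes * sec * cm-5.
print(vascular_resistance_dynes(12.0, 4.0, 2.5))
```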

    Tricuspid valve regurgitation attributable to endomyocardial biopsies and rejection in heart transplantation

    In the present report, the prevalence, severity, and risk factors of tricuspid valve regurgitation (TR) in 251 heart transplant recipients were analyzed retrospectively. Tricuspid valve function was studied by color-flow Doppler echocardiography and annual heart catheterization. The presence and severity of TR were graded on a scale from 0 (no TR) to 4 (severe). Additional postoperative data included the rate of rejection, the number of endomyocardial biopsies, the incidence of transplant vasculopathy, and preoperative and postoperative hemodynamics. The incidence of grade 3 TR increased from 5% at 1 year to 50% at 4 years after transplantation. Multivariate analysis showed the rate of rejection and donor heart weight to be significant risk factors. The ischemic intervals and the preoperative and postoperative pulmonary hemodynamics did not affect the severity or prevalence of TR. These results indicate that various factors appear to have an impact on the development of TR and that its prevalence might be lowered by reducing the number of biopsies performed and, when possible, by oversizing donor hearts.
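
    The time course reported above (grade 3 TR in 5% of recipients at 1 year and 50% at 4 years) is the kind of figure typically obtained from a time-to-event estimate over censored follow-up. The abstract does not state the method used; the sketch below is only a minimal, hand-rolled Kaplan-Meier-style estimate on hypothetical follow-up data, to illustrate how such cumulative incidence figures can be derived.

```python
# Sketch (not the authors' analysis): Kaplan-Meier-style estimate of the
# cumulative incidence of an event (e.g. grade >= 3 TR) from right-censored
# follow-up. The follow-up data below are hypothetical.
import numpy as np

def kaplan_meier(times, events):
    """Return (event_times, survival_probability) for right-censored data.

    times  : follow-up in years
    events : 1 if the event was observed at that time, 0 if censored
    """
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events, dtype=int)[order]
    at_risk = len(times)
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times):
        mask = times == t
        observed_events = events[mask].sum()        # events at time t
        if observed_events > 0:
            surv *= 1.0 - observed_events / at_risk  # at_risk counts patients still at risk just before t
            out_t.append(t)
            out_s.append(surv)
        at_risk -= mask.sum()
    return np.array(out_t), np.array(out_s)

# Hypothetical follow-up (years to event or censoring) for ten recipients.
follow_up = [0.8, 1.0, 1.5, 2.0, 2.2, 2.5, 3.0, 3.5, 4.0, 4.0]
observed  = [1,   0,   0,   1,   0,   0,   1,   1,   0,   1]
t, s = kaplan_meier(follow_up, observed)
print(dict(zip(t, np.round(1.0 - s, 2))))  # cumulative incidence over time
```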

    Strategies for routine biopsies in heart transplantation based on 8-year results with more than 13,000 biopsies

    The endomyocardial biopsy (EMB) in heart transplant recipients has been considered the "gold standard" for the diagnosis of graft rejection (REJ). The purpose of this retrospective study was to develop long-term strategies (frequency and postoperative duration of EMB) for REJ monitoring. Between 1985 and 1992, 346 patients (mean age 44.5 years, 14% female) received 382 heart grafts. For graft surveillance, EMBs were performed according to a fixed schedule depending on the postoperative day and the results of previous biopsies. In the first year, the average number of EMBs per patient was 20, with 19% positive for REJ in the first quarter, dropping to 7% REJ/EMB by the end of the first year. The percentage of REJ/EMB declined annually from 4.7% to 4.5%, 2.2%, and less than 1% after the fifth year. Individual biopsy results in the first 3 postoperative months had little predictive value. Patients with fewer than two REJ in the first 6 postoperative months (group 1), compared with patients with two or more REJ in that period (group 2), were significantly less likely to reject in the second half of the first year (group 1: 0.29 +/- 0.6 REJ/patient; group 2: 0.83 +/- 1.3 REJ/patient; P < 0.001) and in the third postoperative year (group 1: 0.12 +/- 0.33 REJ/patient; group 2: 0.46 +/- 0.93 REJ/patient; P < 0.05). In conclusion, routine EMBs in the first 3 postoperative months have only limited predictive value; however, the number of routine EMBs can be drastically reduced later, depending on the intermediate postoperative REJ pattern.
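
    The group comparisons above report rejection episodes per patient with P-values; the abstract does not name the statistical test used. The sketch below shows one plausible way to run such a two-group comparison on hypothetical per-patient rejection counts (the group sizes, count distributions, and the choice of Welch's t-test are assumptions, not details from the study).

```python
# Sketch (not the authors' analysis): comparing rejection episodes per patient
# between two groups. All data below are synthetic and hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group1 = rng.poisson(0.3, size=150)  # hypothetical: < 2 REJ in the first 6 months
group2 = rng.poisson(0.8, size=80)   # hypothetical: >= 2 REJ in the first 6 months

t_stat, p_value = stats.ttest_ind(group1, group2, equal_var=False)  # Welch's t-test
print(f"group 1: {group1.mean():.2f} +/- {group1.std():.2f} REJ/patient")
print(f"group 2: {group2.mean():.2f} +/- {group2.std():.2f} REJ/patient")
print(f"P = {p_value:.4f}")
```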

    Low-dose cyclosporine therapy in triple-drug immunosuppression for heart transplant recipients

    The toxicity of long-term immunosuppressive therapy has become a major concern in the long-term follow-up of heart transplant recipients. In this respect, the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 micrograms/L in many centers, without direct evidence for the necessity of such high levels under triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection as well as overall mortality between groups of patients with high (250 to 350 micrograms/L) and low (150 to 250 micrograms/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; age, 44 +/- 12 years; mean follow-up, 1,122 +/- 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 micrograms/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < or = 130 micromol/L, n = 232; group II, > 130 micromol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98), with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 +/- 2.6 rejections/patient/year; group II, 3.6 +/- 2.7 rejections/patient/year; p = not significant). (ABSTRACT TRUNCATED AT 250 WORDS)

    Graft coronary vasculopathy in cardiac transplantation--evaluation of risk factors by multivariate analysis

    The development of coronary vasculopathy is the main determinant of long-term survival in cardiac transplantation. The identification of risk factors therefore seems necessary in order to identify possible treatment strategies. Ninety-five out of 397 patients undergoing orthotopic cardiac transplantation from 10/1985 to 10/1992 were evaluated retrospectively on the basis of perioperative and postoperative variables, including age, sex, diagnosis, previous operations, renal function, cholesterol levels, dosage of immunosuppressive drugs (cyclosporin A, azathioprine, steroids), incidence of rejection, and treatment with calcium channel blockers at 3, 6, 12, and 18 months postoperatively. Coronary vasculopathy was assessed by annual angiography at 1 and 2 years postoperatively. After univariate analysis, data were evaluated by stepwise multiple logistic regression analysis. Coronary vasculopathy was found in 15 patients (16%) at 1 year and in 23 patients (24%) at 2 years. On multivariate analysis, previous operations and the incidence of rejections were identified as significant risk factors (P < 0.05), whereas the underlying diagnosis had borderline significance (P = 0.058) for the development of graft coronary vasculopathy. In contrast, all other variables were not significant in our subset of patients investigated. We therefore conclude that the development of coronary vasculopathy in cardiac transplant patients depends mainly on the rejection process itself, aside from patient-dependent factors. Therapeutic measures, such as the administration of calcium channel blockers and regulation of lipid disorders, may therefore only reduce the progression of native atherosclerotic disease in the posttransplant setting.
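
    The abstract states that, after univariate screening, variables were entered into a stepwise multiple logistic regression. The original model and variable coding are not given; the sketch below is only a generic forward-stepwise logistic regression on synthetic data, with hypothetical variable names, to illustrate the technique.

```python
# Sketch (not the authors' model): forward stepwise logistic regression for a
# binary vasculopathy outcome. Variable names and all data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 95
candidates = pd.DataFrame({
    "previous_operation":    rng.integers(0, 2, n),
    "rejections_first_year": rng.poisson(3, n),
    "cholesterol_mg_dl":     rng.normal(220, 40, n),
    "recipient_age_years":   rng.normal(45, 12, n),
})
# Synthetic outcome loosely driven by two of the candidate predictors.
linpred = -2.5 + 1.2 * candidates["previous_operation"] + 0.4 * candidates["rejections_first_year"]
outcome = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

def forward_stepwise_logit(X, y, p_enter=0.05):
    """Repeatedly add the remaining predictor with the smallest p-value below p_enter."""
    selected = []
    while True:
        remaining = [c for c in X.columns if c not in selected]
        if not remaining:
            break
        pvals = {}
        for c in remaining:
            design = sm.add_constant(X[selected + [c]])
            pvals[c] = sm.Logit(y, design).fit(disp=0).pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
    return selected

print(forward_stepwise_logit(candidates, outcome))
```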