
    Delayed endovascular treatment of descending aorta stent graft collapse in a patient treated for post-traumatic aortic rupture: a case report

    Background: We report a case of delayed endovascular correction of graft collapse that occurred after emergent Thoracic Endovascular Aortic Repair (TEVAR) for traumatic aortic isthmus rupture. Case presentation: On the 7th postoperative day after emergent TEVAR for traumatic aortic isthmus rupture (Gore TAG® 28-150), a partial collapse of the endoprosthesis occurred in the descending tract, with no signs of visceral ischemia. Given the patient's clinical condition, the graft collapse was not treated at that time. When his general condition allowed reintervention, the patient refused any new treatment and was discharged. Four months later the patient complained of severe gluteal and sural claudication, erectile dysfunction and abdominal angina; endovascular correction was performed. At 18 months the graft was still patent. Discussion and Conclusion: Graft collapse after TEVAR is a rare event that should be detected and treated as soon as possible. Delayed correction of this complication can be lethal owing to the risk of visceral ischemia and limb loss.

    Clonal evolution of CD8+ T cell responses against latent viruses: relationship among phenotype, localization, and function

    Human cytomegalovirus (hCMV) infection is characterized by a vast expansion of resting effector-type virus-specific T cells in the circulation. In mice, interleukin-7 receptor α (IL-7Rα)-expressing cells contain the precursors for long-lived antigen-experienced CD8(+) T cells, but it is unclear if similar mechanisms operate to maintain these pools in humans. Here, we studied whether IL-7Rα-expressing cells obtained from peripheral blood (PB) or lymph nodes (LNs) sustain the circulating effector-type hCMV-specific pool. Using flow cytometry and functional assays, we found that the IL-7Rα(+) hCMV-specific T cell population comprises cells that have a memory phenotype and lack effector features. We used next-generation sequencing of the T cell receptor to compare the clonal repertoires of IL-7Rα(+) and IL-7Rα(-) subsets. We observed limited overlap of clones between these subsets during acute infection and after 1 year. When we compared the hCMV-specific repertoire between PB and paired LNs, we found many identical clones but also clones that were exclusively found in either compartment. New clones that were found in PB during antigenic recall were only rarely identical to the unique LN clones. Thus, although PB IL-7Rα-expressing and LN hCMV-specific CD8(+) T cells show typical traits of memory-type cells, these populations do not seem to contain the precursors for the novel hCMV-specific CD8(+) T cell pool during latency or upon antigen recall. IL-7Rα(+) PB and LN hCMV-specific memory cells form separate virus-specific compartments, and precursors for these novel PB hCMV-specific CD8(+) effector-type T cells are possibly located in other secondary lymphoid tissues or are being recruited from the naive CD8(+) T cell pool. IMPORTANCE: Insight into the self-renewal properties of long-lived memory CD8(+) T cells and their location is crucial for the development of both passive and active vaccination strategies. 
Human CMV infection is characterized by a vast expansion of resting effector-type cells. It is, however, not known how this population is maintained. Here we investigated two possible compartments for effector-type cell precursors: circulating acute-phase IL-7Rα-expressing hCMV-specific CD8(+) T cells and lymph node (LN)-residing hCMV-specific (central) memory cells. We show that new clones that appear after primary hCMV infection or during hCMV reactivation seldom originate from either compartment. Thus, although identical clones may be maintained by either memory population, the precursors of the novel clones are probably located in other (secondary) lymphoid tissues or are recruited from the naive CD8(+) T cell pool.

    Value of risk scores in the decision to palliate patients with ruptured abdominal aortic aneurysm

    Background: The aim of this study was to develop a 48-h mortality risk score, incorporating morphology data, for patients with ruptured abdominal aortic aneurysm presenting to an emergency department, and to assess its predictive accuracy and clinical effectiveness in triaging patients to immediate aneurysm repair, transfer or palliative care. Methods: Data from patients in the IMPROVE (Immediate Management of the Patient With Ruptured Aneurysm: Open Versus Endovascular Repair) randomized trial were used to develop the risk score. Variables considered included age, sex, haemodynamic markers and aortic morphology. Backwards selection was used to identify relevant predictors. Predictive performance was assessed using calibration plots and the C-statistic. Validation of the newly developed and other previously published scores was conducted in four external populations. The net benefit of treating patients based on a risk threshold compared with treating none was quantified. Results: Data from 536 patients in the IMPROVE trial were included. The final variables retained were age, sex, haemoglobin level, serum creatinine level, systolic BP, aortic neck length and angle, and acute myocardial ischaemia. The discrimination of the score for 48-h mortality in the IMPROVE data was reasonable (C-statistic 0·710, 95 per cent c.i. 0·659 to 0·760), but varied in external populations (from 0·652 to 0·761). The new score outperformed other published risk scores in some, but not all, populations. An 8 (95 per cent c.i. 5 to 11) per cent improvement in the C-statistic was estimated compared with using age alone. Conclusion: The assessed risk scores did not have sufficient accuracy to enable potentially life-saving decisions to be made regarding intervention. Focus should therefore shift to offering repair to more patients and reducing non-intervention rates, while respecting the wishes of the patient and family.
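    The C-statistic reported above measures discrimination: the probability that a randomly chosen patient who died within 48 h was assigned a higher predicted risk than a randomly chosen survivor, with ties counted as one half. A minimal sketch of that calculation is shown below; the risk values and outcomes are hypothetical toy data, not the IMPROVE score or its cohort.

    ```python
    from itertools import product

    def c_statistic(risks, died):
        """Concordance between predicted risks and a binary outcome.

        Every (case, control) pair is scored 1 if the case has the higher
        predicted risk, 0.5 on a tie, and 0 otherwise; the C-statistic is
        the average over all pairs (0.5 = chance, 1.0 = perfect ranking).
        """
        case_risks = [r for r, d in zip(risks, died) if d]
        control_risks = [r for r, d in zip(risks, died) if not d]
        pairs = concordant = 0
        for rc, rs in product(case_risks, control_risks):
            pairs += 1
            if rc > rs:
                concordant += 1
            elif rc == rs:
                concordant += 0.5
        return concordant / pairs

    # Hypothetical predicted 48-h mortality risks and outcomes (1 = died):
    risks = [0.9, 0.7, 0.6, 0.4, 0.3, 0.2]
    died = [1, 1, 0, 1, 0, 0]
    print(c_statistic(risks, died))  # ≈ 0.889 for this toy data
    ```

    The same quantity is equivalent to the area under the ROC curve for a binary outcome, which is how it is usually computed in practice.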

    Antegrade Balloon Dilatation as a Treatment Option for Posttransplant Ureteral Strictures: Case Series of 50 Patients

    Objectives: The aim of this study was to investigate the effects of antegrade balloon dilatation on ureteral strictures that developed after kidney transplant. Materials and Methods: The hospital databases of the Erasmus Medical Center (Rotterdam, The Netherlands) and the Academic Medical Center (Amsterdam, The Netherlands) were retrospectively screened for patients who underwent balloon dilatation after kidney transplant. Balloon dilatation was considered technically successful when the strictured segment could be passed with the guidewire, followed by balloon inflation; the procedure was clinically successful if no further interventions (for example, surgical revision of the ureteroneocystostomy or prolonged double-J placement) were necessary. Results: Fifty (2.4%) of 2075 kidney transplant recipients underwent antegrade balloon dilatation because of urinary outflow obstruction. Median time between transplant and balloon dilatation was 3 months (range, 0-139 mo). In 43 patients (86%), balloon dilatation was technically successful. In the remaining 7 patients (14%), it was impossible to pass the strictured segment with the guidewire. In 20 of the 43 patients (47%) with a technically successful procedure, the procedure was also clinically successful, with a median follow-up after balloon dilatation of 35.5 months (range, 0-102 mo). We did not identify any patient or stricture characteristic that influenced the outcome of treatment. Conclusions: Balloon dilatation is a good option for ureter stricture treatment after kidney transplant, as it is minimally invasive and can prevent surgical exploration in almost 50% of cases.

    Economising vein-graft surveillance programs

    Objectives: To investigate the effectiveness of two alternative vein-graft surveillance strategies. In the first strategy, surveillance was restricted to patients with a possibly higher risk of significant stenosis development, i.e. those with a moderate stenosis identified early after the operation. In the second strategy, the effects of reducing the number of duplex tests per patient were examined. Patients and Methods: In a prospective study in three vascular surgical departments, 300 patients (300 femoropopliteal or distal grafts) underwent duplex surveillance during the first year after the operation. The duplex-derived PSV-ratio was considered to represent the degree of stenosis. Arteriographic confirmation of suspected stenoses was routinely obtained, and patients without a suspected graft stenosis also underwent a consented arteriogram during the first postoperative year. The decision to perform a graft revision was taken on the basis of an arteriographic stenosis of at least 70% diameter reduction. In the first strategy, graft categories were defined on the basis of the first postoperative duplex examination: grafts with a PSV-ratio <1.5, grafts with a PSV-ratio of 1.5–2.0, grafts with a PSV-ratio of 2.0–2.5, grafts with PSV-ratios of 2.5–3.0, and grafts with PSV-ratios >3.0. The primary patency rate at 12 months was compared for these categories. In the second alternative strategy, the number of examinations and the percentage of event-causing de novo stenoses were analysed per surveillance interval. Results: The presence of moderate abnormalities at the initial duplex scan did not identify patients with a high risk of an event, as initial PSV-ratios of 1.5–2.0 and 2.0–2.5 (early mild-moderate lesions) had 12-month primary patencies comparable to those of patients with a PSV-ratio <1.5 (completely normal grafts): 63%, 73%, and 71%, respectively.
The interval incidence of event-causing de novo stenoses was 8% of the total number of duplex tests performed at 3 months, and 8% at 6 months after the operation. In patients who had had no previous intervention for stenosis and had a normal bypass during the first 6 months postoperatively, a sharp drop in this incidence was seen at 9 and 12 months, with event-causing de novo stenoses observed in only 2% and 1% of all duplex tests. Conclusions: All patients should be included in a surveillance program, as the presence of a normal vein graft at the first duplex examination does not rule out the subsequent development of graft stenosis. The duration of the surveillance period may be restricted to the first 6 months after operation in patients who have a normal bypass during that time period, as only a few stenoses will be missed by this policy.
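The PSV-ratio grading used in the first strategy can be sketched as a simple threshold classifier. The cut-offs follow the abstract; the category labels, and the assignment of exact boundary values to the higher bin, are our own assumptions, since the abstract's ranges overlap at their endpoints.

```python
def psv_category(psv_ratio):
    """Map a duplex-derived PSV-ratio to the graft categories of the study.

    Boundary values (e.g. exactly 2.0) are assigned to the higher bin;
    the study abstract does not specify this, so it is an assumption.
    """
    if psv_ratio < 1.5:
        return "normal (<1.5)"
    if psv_ratio < 2.0:
        return "mild (1.5-2.0)"
    if psv_ratio < 2.5:
        return "moderate (2.0-2.5)"
    if psv_ratio < 3.0:
        return "significant (2.5-3.0)"
    return "severe (>3.0)"

# Example ratios from a hypothetical first postoperative duplex scan:
for ratio in (1.2, 1.8, 2.6, 3.4):
    print(ratio, "->", psv_category(ratio))
```

Note that, per the Results above, the lower three categories carried comparable 12-month primary patency, which is why the restricted-surveillance strategy based on this early grading failed to identify high-risk grafts.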

    Factors influencing the development of vein-graft stenosis and their significance for clinical management

    Objectives: To assess the influence of clinical and graft factors on the development of stenotic lesions. In addition, the implications of any significant correlation for duplex surveillance schedules or surgical bypass techniques were examined. Patients and methods: In a prospective three-centre study, preoperative and peroperative data on 300 infrainguinal autologous vein grafts were analysed. All grafts were monitored by a strict duplex surveillance program and all received an angiogram in the first postoperative year. A revision was performed only if there was evidence of a stenosis of 70% diameter reduction or greater on the angiogram. Results: The minimum graft diameter was the only factor that correlated significantly with the development of a significant graft stenosis (PSV-ratio ≥2.5) during follow-up (p=0.002). Factors that correlated with the development of event-causing graft stenosis, associated with revision or occlusion, were minimal graft diameter (p=0.001), the use of a venovenous anastomosis (p=0.005) and length of the graft (p=0.025). Multivariate regression analysis revealed that the minimal graft diameter was the only independent factor that significantly correlated with an event-causing graft stenosis (p=0.009). The stenosis-free rates for grafts with a minimal diameter <3.5 mm, between 3.5 and 4.5 mm, and ≥4.5 mm were 40%, 58% and 75%, respectively (p<0.05). Composite vein and arm-vein grafts with minimal diameters ≥3.5 mm were compared with grafts consisting of a single uninterrupted greater saphenous vein with a minimal diameter <3.5 mm. One-year secondary patency rates in these categories were 94% and 76%, respectively (p=0.03). Conclusions: A minimal graft diameter <3.5 mm was the only factor that significantly correlated with the development of a graft stenosis. However, veins with larger diameters may still develop stenotic lesions. Composite vein and arm-vein grafts should be used rather than uninterrupted small-calibre saphenous veins.