253 research outputs found

    Evaluation of Intestinal Transplantation for Short Bowel Syndrome due to Gun Violence versus Other Causes: A Single Center Experience

    Introduction: Short bowel syndrome (SBS) due to trauma is a rarely examined problem with a unique solution in intestine transplantation. We discuss factors that may have contributed to variations in patient outcomes after intestinal transplantation for SBS due to gun violence. Methods: A retrospective chart review was conducted for intestinal transplant patients at an urban medical center. Two patients underwent intestinal transplants for SBS after small bowel resections for gunshot wound (GSW) injuries. Trends were noted and compared to those found among patients in the intestinal transplant recipient registry (n=26) at the same medical center. Results: Two patients were transplanted for intestinal failure related to gunshot wounds. The 24 patients in the non-GSW group were transplanted for other indications, including Crohn's disease (n=6), neuroendocrine tumor (n=5), and anatomic infarction (n=6). The average age at intestine transplant was 29.5 years in the GSW group, compared with 47.8 years in the non-GSW group. Rejection within 6 months of intestine transplant occurred in 50% of GSW patients versus 36% of non-GSW patients. 50% of GSW patients were alive at one year and at 5 years post-transplant, compared with 92% and 71% of non-GSW patients at one year and 5 years, respectively. Conclusion: Intestine transplant can be successful in short bowel syndrome due to gunshot wound. Patients should be considered based on comorbid conditions and risk factors rather than the cause of trauma. Further large, multi-center studies are needed to elucidate risk factors related to the success of intestinal transplantation for short bowel syndrome due to trauma.

    Role of surveillance biopsy frequency post intestine transplant: A tertiary care experience

    Background: With only 81 intestine transplants (IT) performed in the U.S. in 2019, the literature on this type of solid organ transplant remains scarce. Frequent surveillance biopsy is required in the first month post IT due to the high risk of acute rejection; however, the frequency of surveillance biopsy beyond 1 month post IT is often determined by physician and institutional preference. Aims: To report IT outcomes and the clinical impact of surveillance biopsy at a single tertiary care center. Methods: This is a retrospective review of patients who underwent IT between 08/2010 and 03/2020. The primary outcome was the correlation between increased protocol biopsies and mortality. Secondary outcomes included correlations between increased protocol biopsies and hospital readmissions, length of hospital stay, and the rate of biopsy-proven rejection detection. Kaplan-Meier curves were used to perform the survival analysis at 6 months, 1 year, and 2 years post-transplant. Results: A total of 35 patients (mean age 47.6 ± 12.9 years; 22 female, 63%) underwent IT for ischemic bowel (11, 31%), Crohn's disease (9, 25%), neuroendocrine tumor (6, 17%), trauma (3, 9%), and 'other' indications (6, 17%); 14 (40%) were part of a multivisceral organ transplant. During the first year post-transplant, the median number of biopsies was 12 (IQR 6-30), with evidence of definite acute graft rejection in 40%, 27%, and 41% at the 1-3, 3-6, and 6-12 month post-IT intervals, respectively. Over the duration of the study, the mortality rate was 18/35 (51%) at a median of 37 (12-60) months post IT, and 8/35 (23%) patients underwent enterectomy at a median of 12 (8-36) months post IT (Table 1). Overall, there was a survival benefit at 6 months post IT for patients who had a total of ≥10 biopsies compared with <10 biopsies (p=0.008) (Table 2). There was a non-significant trend toward longer median length of hospital stay in patients with a greater number of biopsies. Conclusion: Our results indicate a survival benefit of increased protocol biopsies. Studies with larger sample sizes are required to validate our results.
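    As a rough illustration of the survival analysis described in this abstract, the sketch below fits Kaplan-Meier curves stratified by protocol-biopsy count (≥10 versus <10) and applies a log-rank test using the lifelines library. The file and column names (it_cohort.csv, months_post_it, died, n_biopsies) are illustrative assumptions, not the study's actual dataset.

```python
# Minimal sketch of the Kaplan-Meier comparison, assuming a per-recipient
# table with illustrative column names (not the study's actual dataset).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("it_cohort.csv")          # hypothetical: one row per recipient
high = df[df["n_biopsies"] >= 10]          # >=10 protocol biopsies
low = df[df["n_biopsies"] < 10]            # <10 protocol biopsies

kmf = KaplanMeierFitter()
for group, label in [(high, ">=10 biopsies"), (low, "<10 biopsies")]:
    kmf.fit(group["months_post_it"], event_observed=group["died"], label=label)
    print(label, "estimated 6-month survival:", kmf.predict(6))

# Log-rank test for a difference between the two survival curves;
# the abstract reports p = 0.008 at the 6-month interval.
result = logrank_test(
    high["months_post_it"], low["months_post_it"],
    event_observed_A=high["died"], event_observed_B=low["died"],
)
print("log-rank p-value:", result.p_value)
```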

    Synthesis of 2-Aminofurans by Sequential [2+2] Cycloaddition-Nucleophilic Addition of 2-Propyn-1-ols with Tetracyanoethylene and Amine-Induced Transformation into 6-Aminopentafulvenes

    The synthesis of 2-aminofuran derivatives bearing an azulene or N,N-dimethylanilino substituent was established by the formal [2+2] cycloaddition-retroelectrocyclization of 3-(1-azulenyl or N,N-dimethylanilino)-2-propyn-1-ols with tetracyanoethylene, followed by intramolecular nucleophilic addition of the internal hydroxyl group, derived from the 2-propyn-1-ol, to the initially formed tetracyanobutadiene moiety. The reaction proceeds under mild conditions with a short reaction period, and the products are readily available through a simple purification procedure. The 2-aminofuran derivatives obtained by this reaction could be converted into 6-aminopentafulvene derivatives upon reaction with various amines. The structures of the 2-aminofuran and 6-aminopentafulvene bearing an N,N-dimethylanilino substituent were confirmed by single-crystal X-ray structural analysis. (Chemistry - A European Journal, 23(21):5126-5136, 2017)

    Robotic-Assisted Versus Open Techniques for Living Donor Kidney Transplant Recipients: A Comparison Using Propensity Score Analysis

    Background: Following the rapid advancements in minimally invasive urology, living donor robotic-assisted kidney transplantation (RAKT) has developed into a feasible alternative to open kidney transplantation (OKT). The procedure has been performed in multiple international programs, but there is a relative dearth of experience in the US. In this investigation, we compare RAKT to OKT using a propensity score analysis to elucidate the safety and feasibility of RAKT as a suitable alternative to OKT. Methods: A retrospective review of 101 living kidney transplants (36 RAKT, 65 OKT) performed between January 2016 and June 2018 was conducted. Selection for RAKT was based on robot availability. Recipient and donor demographic variables were collected, in addition to perioperative parameters. A propensity score analysis was conducted, matching for recipient age, gender, body mass index, race, pre-operative dialysis, preoperative serum creatinine, panel reactive antibody, and donor age. Primary outcomes assessed included perioperative factors such as estimated blood loss (EBL), cold ischemic time (CIT), warm ischemic time (WIT), and operative time, as well as several patient outcomes including length of stay, narcotics consumed on postoperative days one and two, and change in serum creatinine (SCr) at five time points (day 3, day 7, day 14, 6 months, and 1 year). The final analysis included 35 patients in each group. Results: Recipients' (N=101) mean age was 49 years (range 19-74), with RAKT recipients slightly younger than OKT recipients (46 vs 51 years). 61 recipients were male, and 62 were white (29 Black, 10 other). Average recipient BMI was 29 (range 20-40), with equivalent BMIs in the RAKT and OKT subsets. Following propensity score analysis, RAKT recipients demonstrated significantly greater WIT (49 vs 38 minutes, p<0.001) and less EBL (62.5 vs 150 mL, p<0.001). However, total operative time and overall length of stay were not significantly different between the groups. Narcotics consumed on postoperative days one and two were similar between the groups (31.8 vs 32.3 morphine equivalents). Additionally, SCr was evaluated at days 3, 7, and 14, as well as 6 months and 1 year, without significant differences between the groups. Conclusion: RAKT offers an important minimally invasive alternative to OKT, with a short learning curve and similar graft and patient outcomes. Notably, this study compares RAKT to OKT in a heterogeneous study population using propensity scoring. The largest limitation of this study is its small sample size. Interestingly, despite the significantly longer WIT in RAKT, we found equivalent SCr between groups in the early and intermediate postoperative period. Although the small sample size limits our ability to detect differences in graft and patient outcomes, trends demonstrate shorter lengths of stay, shorter operative times, and less blood loss for RAKT recipients. Additionally, trends demonstrate fewer narcotics administered by the second postoperative day. Similar to the advent of laparoscopic technology in living donor nephrectomy, early findings in RAKT demonstrate a safe and reasonable alternative for living donor kidney transplantation in various populations.
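    The propensity score analysis described here can be sketched as follows: a logistic model estimates each recipient's probability of undergoing RAKT from the matching covariates, and each RAKT case is then greedily paired with the nearest unmatched OKT control. This is a minimal sketch under assumed file and column names, not the authors' actual pipeline; categorical covariates such as gender and race are omitted for brevity and would need one-hot encoding first.

```python
# Minimal sketch of 1:1 nearest-neighbor propensity-score matching.
# Column names are illustrative; gender and race are omitted for brevity
# (categorical covariates would need one-hot encoding first).
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("rakt_okt.csv")           # hypothetical: one row per recipient
covariates = ["age", "bmi", "preop_dialysis", "preop_scr", "pra", "donor_age"]

ps_model = LogisticRegression(max_iter=1000)
ps_model.fit(df[covariates], df["rakt"])   # rakt: 1 = RAKT, 0 = OKT
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["rakt"] == 1]
controls = df[df["rakt"] == 0].copy()
pairs = []
for idx, row in treated.iterrows():
    j = (controls["ps"] - row["ps"]).abs().idxmin()   # nearest control by score
    pairs.append((idx, j))
    controls = controls.drop(j)                       # match without replacement
matched = df.loc[[i for pair in pairs for i in pair]]
print(matched.groupby("rakt")["ps"].describe())       # check balance post-match
```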

    Robotic-assisted Versus Open Technique for Living Donor Kidney Transplantation: A Comparison Using Propensity Score Matching for Intention to Treat

    Living donor robotic-assisted kidney transplantation (RAKT) is an alternative to open kidney transplantation (OKT), but experience with this technique is limited in the United States. METHODS: A retrospective review of living donor kidney transplants performed between 2016 and 2018 compared RAKT with OKT with regard to recipient, donor, and perioperative parameters. A 1:1 propensity score matching was performed on recipient/donor age, sex, body mass index, race, preoperative dialysis, and calculated panel reactive antibodies. RESULTS: Outcomes of patient survival, graft survival, and postoperative complications were assessed for 139 transplants (47 RAKT and 92 OKT). Propensity score analysis (47:47) showed that RAKT recipients had longer warm ischemic times (49 versus 40 min; P < 0.001) and less blood loss (100 versus 150 mL; P = 0.005). Operative time and length of stay were similar between groups. Postoperative serum creatinine was similar during a 2-y follow-up. Post hoc analysis excluding 4 open conversions showed lower operative time with RAKT (297 versus 320 min; P = 0.04) and lower rates of 30-d (4.7% versus 23.4%; P = 0.02) and 90-d (7% versus 27.7%; P = 0.01) Clavien-Dindo grade ≥3 complications. CONCLUSIONS: Our findings suggest that RAKT is a safe alternative to OKT.
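    As a small worked check of the post hoc complication comparison, Fisher's exact test can be run on a 2x2 table of 30-day Clavien-Dindo grade ≥3 events. The counts below are approximate back-calculations from the quoted percentages (4.7% of the 43 RAKT cases remaining after excluding conversions, 23.4% of 47 OKT), not the study's raw data.

```python
# Fisher's exact test on the 30-day Clavien-Dindo >=3 comparison.
# Counts are approximate back-calculations from the quoted percentages.
from scipy.stats import fisher_exact

#          events  no events
table = [[2, 41],   # RAKT: ~4.7% of 43 (after excluding 4 conversions)
         [11, 36]]  # OKT:  ~23.4% of 47
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")  # near the reported 0.02
```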

    Improved Survival With Higher-risk Donor Grafts in Liver Transplant With Acute-on-chronic Liver Failure

    Use of higher-risk grafts in liver transplantation for patients with acute-on-chronic liver failure (ACLF) has been associated with poor outcomes. This study analyzes trends in liver transplantation outcomes for ACLF over time based on the donor risk index (DRI). Methods: Using the Organ Procurement and Transplantation Network and the United Network for Organ Sharing registry, 17 300 ACLF patients who underwent liver transplantation between 2002 and 2019 were evaluated. Based on DRI, adjusted hazard ratios for 1-y patient death were analyzed across 3 eras: Era 1 (2002-2007, n = 4032), Era 2 (2008-2013, n = 6130), and Era 3 (2014-2019, n = 7138). DRI groups were defined as <1.2, 1.2 to 1.6, 1.6 to 2.0, and >2.0. Results: ACLF patients had significantly lower risks of patient death within 1 y in Era 2 (adjusted hazard ratio, 0.69; 95% confidence interval, 0.61-0.78; P < 0.001) and Era 3 (adjusted hazard ratio, 0.48; 95% confidence interval, 0.42-0.55; P < 0.001) than in Era 1. All DRI groups showed lower hazards in Era 3 than in Era 1. Improvements in posttransplant outcomes were found in both ACLF-1/2 and ACLF-3 patients. In ACLF-1/2, DRI 1.2 to 1.6 and >2.0 had lower adjusted risks in Era 3 than in Era 1. In ACLF-3, DRI 1.2 to 2.0 had lower risk in Era 3. In the overall ACLF cohort, the 2 categories with DRI >1.6 had significantly higher adjusted risks of 1-y patient death than DRI <1.2. When analyzing hazards within each era, DRI >2.0 carried significantly higher adjusted risks in Eras 1 and 3, whereas DRI 1.2 to 2.0 had similar adjusted risks throughout the eras. A similar tendency was found in ACLF-1/2. In the non-ACLF cohort, steady improvement in posttransplant outcomes was obtained in all DRI categories. Similar results were obtained when only hepatitis C virus-uninfected ACLF patients were evaluated. Conclusions: In ACLF patients, posttransplant outcomes have significantly improved, and outcomes with higher-risk organs have improved in all ACLF grades. These results might encourage the use of higher-risk donors in ACLF patients and provide improved access to transplant.
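    The era-by-era adjusted hazard ratios reported here come from the kind of Cox proportional hazards model sketched below, with transplant era and DRI category as covariates and follow-up censored at 1 year. The variable names and the deliberately minimal covariate set are assumptions; the registry analysis adjusts for many more recipient and donor factors.

```python
# Minimal sketch of a Cox model for 1-year patient death with era and DRI
# category as covariates. Variable names and the small covariate set are
# assumptions; the registry analysis adjusts for many more factors.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("aclf_lt.csv")            # hypothetical registry extract
# Censor follow-up at 12 months to match the 1-y patient-death endpoint.
df["time"] = df["months_to_death_or_censor"].clip(upper=12)
df["event"] = ((df["died"] == 1) &
               (df["months_to_death_or_censor"] <= 12)).astype(int)

# One-hot encode era (reference: Era 1) and DRI group (reference: DRI < 1.2).
model_df = pd.get_dummies(df[["time", "event", "era", "dri_group"]],
                          columns=["era", "dri_group"],
                          drop_first=True, dtype=int)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time", event_col="event")
cph.print_summary()   # the exp(coef) column gives adjusted hazard ratios
```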

    Directed Differentiation of Patient-Specific Induced Pluripotent Stem Cells Identifies the Transcriptional Repression and Epigenetic Modification of NKX2-5, HAND1, and NOTCH1 in Hypoplastic Left Heart Syndrome

    The genetic basis of hypoplastic left heart syndrome (HLHS) remains unknown, and the lack of animal models that reconstitute the cardiac maldevelopment has hampered the study of this disease. This study investigated the altered control of transcriptional and epigenetic programs that may affect the development of HLHS by using disease-specific induced pluripotent stem (iPS) cells. Cardiac progenitor cells (CPCs) were isolated from patients with congenital heart diseases to generate patient-specific iPS cells. Comparative gene expression analysis of HLHS- and biventricle (BV) heart-derived iPS cells was performed to dissect the complex genetic circuits that may promote the disease phenotype. Both HLHS- and BV heart-derived CPCs were reprogrammed to generate disease-specific iPS cells, which showed characteristic human embryonic stem cell signatures, expressed pluripotency markers, and could give rise to cardiomyocytes. However, HLHS-iPS cells exhibited lower cardiomyogenic differentiation potential than BV-iPS cells. Quantitative gene expression analysis demonstrated that HLHS-derived iPS cells showed transcriptional repression of NKX2-5, reduced levels of TBX2 and NOTCH/HEY signaling, and inhibited HAND1/2 transcripts compared with control cells. Although both HLHS-derived CPCs and iPS cells showed reduced SRE and TNNT2 transcriptional activation compared with BV-derived cells, co-transfection of NKX2-5, HAND1, and NOTCH1 into HLHS-derived cells resulted in synergistic restoration of the activation of these promoters. Notably, gain- and loss-of-function studies revealed that NKX2-5 had a predominant impact on NPPA transcriptional activation. Moreover, differentiated HLHS-derived iPS cells showed reduced H3K4 dimethylation and histone H3 acetylation, but increased H3K27 trimethylation, inhibiting transcriptional activation at the NKX2-5 promoter. These findings suggest that patient-specific iPS cells may provide molecular insights into complex transcriptional and epigenetic mechanisms, at least in part through combinatorial expression of NKX2-5, HAND1, and NOTCH1, that coordinately contribute to cardiac malformations in HLHS.

    Improvements in liver transplant outcomes in patients with HCV/HIV coinfection after the introduction of direct-acting antiviral therapies

    BACKGROUND: In recipients with HCV/HIV coinfection, the impact that the wider use of direct-acting antivirals (DAAs) has had on post-liver transplant (LT) outcomes has not been evaluated. We investigated the impact of the introduction of DAAs on post-LT outcomes in patients with HCV/HIV coinfection. METHODS: Using Organ Procurement and Transplantation Network/United Network for Organ Sharing data, we compared post-LT outcomes in patients with HCV and/or HIV across two eras: pre-DAA (2008-2012) and post-DAA (2014-2019). To study the impact of the introduction of DAAs, inverse probability of treatment weighting was used to adjust for patient characteristics. RESULTS: A total of 17 215 LT recipients were eligible for this study (HCV/HIV coinfection, n = 160; HIV mono-infection, n = 188; HCV mono-infection, n = 16 867). Patients with HCV/HIV coinfection and HCV mono-infection had significantly lower hazards of 1- and 3-year graft loss post-DAA compared with pre-DAA (1-year: adjusted hazard ratio [aHR] 0.29, 95% confidence interval [CI] 0.16-0.53 in HCV/HIV and aHR 0.58, 95% CI 0.54-0.63, respectively; 3-year: aHR 0.30, 95% CI 0.14-0.61 and aHR 0.64, 95% CI 0.58-0.70, respectively). The hazards of 1- and 3-year graft loss post-DAA in HIV mono-infection were comparable to those pre-DAA. HCV/HIV coinfection also had significantly lower patient mortality post-DAA compared with pre-DAA (1-year: aHR 0.30, 95% CI 0.17-0.55; 3-year: aHR 0.31, 95% CI 0.15-0.63). CONCLUSIONS: Post-LT outcomes in patients with coinfection improved significantly and became comparable to those with HCV mono-infection after the introduction of DAA therapy. The introduction of DAAs supports the use of LT in the setting of HCV/HIV coinfection.
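    Inverse probability of treatment weighting, as used here to balance patient characteristics across eras, can be sketched as follows: a logistic model estimates each recipient's probability of falling in the post-DAA era, and stabilized weights are formed from the inverse of the probability of the era actually observed. The file name, column names, and covariates below are illustrative assumptions, not the registry's actual variables.

```python
# Minimal sketch of stabilized inverse probability of treatment weighting
# (IPTW) across eras. Column names and covariates are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("hcv_hiv_lt.csv")         # hypothetical registry extract
covariates = ["age", "meld", "donor_age", "cold_ischemia_hours"]

model = LogisticRegression(max_iter=1000)
model.fit(df[covariates], df["post_daa"])  # post_daa: 1 = 2014-2019, 0 = 2008-2012
p_post = model.predict_proba(df[covariates])[:, 1]

# Stabilized weight = P(era) / P(era | covariates) for the era each
# recipient actually belongs to; stabilization tempers extreme weights.
marginal = df["post_daa"].mean()
df["iptw"] = (df["post_daa"] * marginal / p_post
              + (1 - df["post_daa"]) * (1 - marginal) / (1 - p_post))
print(df.groupby("post_daa")["iptw"].describe())   # sanity-check the weights
```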