

    Outbreak of encephalitic listeriosis in red-legged partridges (Alectoris rufa)

    An outbreak of neurological disease was investigated in red-legged partridges between 8 and 28 days of age. Clinical signs included torticollis, head tilt and incoordination, and over an initial eight-day period approximately 30–40 fatalities occurred per day. No significant gross post-mortem findings were detected. Histopathological examination of the brain and bacterial cultures followed by partial sequencing confirmed a diagnosis of encephalitis due to Listeria monocytogenes. Further isolates were obtained from follow-up carcasses, environmental samples and pooled tissue samples of newly imported day-old chicks prior to placement on the farm. These isolates had the same antibiotic resistance pattern as the isolate from the initial post-mortem submission and belonged to the same fluorescent amplified fragment length polymorphism (fAFLP) subtype. This suggested that the isolates were very closely related or identical and that the pathogen had entered the farm with the imported day-old chicks, resulting in disease manifestation in partridges between 8 and 28 days of age. Reports of outbreaks of encephalitic listeriosis in avian species are rare and this is, to the best of our knowledge, the first reported outbreak in red-legged partridges.

    Blood Cell Salvage and Autotransfusion Does Not Worsen Oncologic Outcomes Following Liver Transplantation with Incidental Hepatocellular Carcinoma: A Propensity Score-Matched Analysis

    BACKGROUND: Intraoperative blood cell salvage and autotransfusion (IBSA) during liver transplantation (LT) for hepatocellular carcinoma (HCC) is controversial because of concern that it may adversely impact oncologic outcomes. OBJECTIVE: We aimed to evaluate the long-term oncologic outcomes of patients who underwent LT with incidentally discovered HCC and received IBSA compared with those who did not receive IBSA. METHODS: Patients undergoing LT (January 2001-October 2018) with incidental HCC on explant pathology were retrospectively identified. A 1:1 propensity score matching (PSM) was performed. HCC recurrence and patient survival were compared. Kaplan-Meier survival analyses and univariable Cox proportional hazards analyses were performed for risks of recurrence and death. RESULTS: Overall, 110 patients were identified (IBSA, n = 76 [69.1%]; non-IBSA, n = 34 [30.9%]). Before matching, the groups were similar in terms of demographics, transplant, and tumor characteristics. Overall survival was similar for IBSA and non-IBSA at 1, 3, and 5 years (96.0%, 88.4%, 83.0% vs. 97.1%, 91.1%, 87.8%, respectively; p = 0.79). Similarly, the recurrence rate at 1, 3, and 5 years was not statistically different (IBSA 0%, 1.8%, 1.8% vs. non-IBSA 0%, 3.2%, 3.2%, respectively; p = 0.55). After 1:1 matching (26 IBSA, 26 non-IBSA), Cox proportional hazards analysis demonstrated similar risk of death and recurrence between the groups (IBSA hazard ratio [HR] of death 1.26, 95% confidence interval [CI] 0.52-3.05, p = 0.61; and HR of recurrence 2.64, 95% CI 0.28-25.30, p = 0.40). CONCLUSIONS: IBSA does not appear to adversely impact oncologic outcomes in patients undergoing LT with incidental HCC. This evidence further supports the need for randomized trials evaluating the impact of IBSA use in LT for HCC.
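The 1:1 propensity score matching described above pairs each IBSA patient with the non-IBSA patient whose estimated probability of receiving IBSA is closest. As a minimal illustration only (the abstract does not describe its matching implementation; the fixed logistic weights, the caliper width, and the greedy nearest-neighbour strategy below are all assumptions for the sketch), one common form of PSM can be written in pure Python:

```python
import math

def propensity_scores(features, weights, bias):
    """Logistic model: P(treated | x) = sigmoid(w . x + b).
    The weights/bias are illustrative fixed values; in a real
    analysis they would be fitted to the cohort data."""
    return [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(weights, x)) + bias)))
            for x in features]

def greedy_match_1to1(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    treated/controls: lists of (id, score). Each control is used at
    most once; pairs farther apart than the caliper are discarded."""
    pairs = []
    available = dict(controls)
    for tid, ts in sorted(treated, key=lambda p: p[1]):
        best, best_d = None, caliper
        for cid, cs in available.items():
            d = abs(ts - cs)
            if d <= best_d:
                best, best_d = cid, d
        if best is not None:
            pairs.append((tid, best))
            del available[best]          # without replacement
    return pairs
```

Matching within a caliper and without replacement is one standard design choice; established tools such as R's MatchIt fit the propensity model to the data rather than using fixed weights, and may use optimal rather than greedy matching.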

    Perception versus reality: A National Cohort Analysis of the surgery-first approach for resectable pancreatic cancer

    INTRODUCTION: Although surgical resection is necessary, it is not sufficient for long-term survival in pancreatic ductal adenocarcinoma (PDAC). We sought to evaluate survival after up-front surgery (UFS) in anatomically resectable PDAC in the context of three critical factors: (A) margin status; (B) CA19-9; and (C) receipt of adjuvant chemotherapy. METHODS: The National Cancer Data Base (2010-2015) was reviewed for clinically resectable (stage 0/I/II) PDAC patients. Surgical margins, pre-operative CA19-9, and receipt of adjuvant chemotherapy were evaluated. Patient overall survival was stratified based on these factors and their respective combinations. Outcomes after UFS were compared with equivalently staged patients after neoadjuvant chemotherapy on an intention-to-treat (ITT) basis. RESULTS: In total, 12,089 patients were included (n = 9197 UFS, n = 2892 ITT neoadjuvant). In the UFS cohort, only 20.4% had all three factors (median OS = 31.2 months). Nearly one-third (32.7%) of UFS patients had none or only one factor, with concomitant worst survival (median OS = 14.7 months). Survival after UFS decreased with each failing factor (two factors: 23 months, one factor: 15.5 months, no factors: 7.9 months) and this persisted after adjustment. Overall survival was superior in the ITT-neoadjuvant cohort than after UFS (27.9 vs. 22 months). CONCLUSION: Despite the perceived benefit of UFS, only 1-in-5 UFS patients actually realize maximal survival when known factors highly associated with outcomes are assessed. Patients are proportionally more likely to do worst, rather than best, after UFS treatment. Similarly staged patients undergoing ITT-neoadjuvant therapy achieve survival superior to the majority of UFS patients. Patients and providers should be aware of the false perception of 'optimal' survival benefit with UFS in anatomically resectable PDAC.

    National time trends in mortality and graft survival following liver transplantation from circulatory death or brainstem death donors.

    BACKGROUND: Despite high waiting list mortality rates, concern still exists about the appropriateness of using livers donated after circulatory death (DCD). We compared mortality and graft loss in recipients of livers donated after circulatory or brainstem death (DBD) across two successive time periods. METHODS: Observational multinational data from the United Kingdom and Ireland were partitioned into two time periods (2008-2011 and 2012-2016). Cox regression methods were used to estimate hazard ratios (HRs) comparing the impact of periods on post-transplant mortality and graft failure. RESULTS: A total of 1176 DCD recipients and 3749 DBD recipients were included. Three-year patient mortality rates decreased markedly from 19.6 per cent in time period 1 to 10.4 per cent in time period 2 (adjusted HR 0.43, 95 per cent c.i. 0.30 to 0.62; P < 0.001) for DCD recipients but only decreased from 12.8 to 11.3 per cent (adjusted HR 0.96, 95 per cent c.i. 0.78 to 1.19; P = 0.732) in DBD recipients (P for interaction = 0.001). No time period-specific improvements in 3-year graft failure were observed for DCD (adjusted HR 0.80, 95% c.i. 0.61 to 1.05; P = 0.116) or DBD recipients (adjusted HR 0.95, 95% c.i. 0.79 to 1.14; P = 0.607). A slight increase in retransplantation rates occurred between time periods 1 and 2 in those who received a DCD liver (from 7.3 to 11.8 per cent; P = 0.042), but there was no change in those receiving a DBD liver (from 4.9 to 4.5 per cent; P = 0.365). In time period 2, no difference in mortality rates between those receiving a DCD liver and those receiving a DBD liver was observed (adjusted HR 0.78, 95% c.i. 0.56 to 1.09; P = 0.142). CONCLUSION: Mortality rates more than halved in recipients of a DCD liver over a decade and eventually became similar to those in recipients of a DBD liver. Regions with high waiting list mortality may mitigate this by use of DCD livers.
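The 3-year mortality rates reported above are estimates from time-to-event data in which many recipients are censored (still alive at last follow-up). As a minimal illustration of how such a survivor curve is computed (the Kaplan-Meier product-limit estimator; the toy data below are invented for the sketch and are not the registry data), in pure Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivor function from paired (time, event) data.
    events[i] = 1 for death/failure, 0 for censoring.
    Returns a list of (time, survival) steps at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            if data[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk       # product-limit update
            curve.append((t, surv))
        n_at_risk -= deaths + censored
    return curve
```

A Cox model, as used in the study for adjusted HRs, goes beyond this by relating the hazard underlying such curves to covariates; mature implementations (e.g. the lifelines Python package or R's survival package) also provide confidence intervals and handle ties more carefully.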

    The first transformation method for the thermo-acidophilic archaeon Thermoplasma acidophilum

    A transformation method yielding up to 10^4 transformants per µg of circular DNA was developed for Thermoplasma acidophilum. The method is based on a natural DNA uptake process in which T. acidophilum cells keep their integrity and become competent at pH 3.5 and 58 °C. Shuttle vector maintenance could not be detected, since the Nov^R gyrase B gene used integrated into its chromosomal counterpart by homologous recombination.

    Disparities in the Use of Older Donation after Circulatory Death Liver Allografts in the United States Versus the United Kingdom

    BACKGROUND: This study aimed to assess the differences between the United States and the United Kingdom in the characteristics and posttransplant survival of patients who received donation after circulatory death (DCD) liver allografts from donors aged >60 y. METHODS: Data were collected from the UK Transplant Registry and the United Network for Organ Sharing databases. Cohorts were dichotomized into donor age subgroups (donor >60 y [D >60]; donor ≤60 y [D ≤60]). Study period: January 1, 2001, to December 31, 2015. RESULTS: A total of 1157 DCD LTs were performed in the United Kingdom versus 3394 in the United States. Only 13.8% of US DCD donors were aged >50 y, compared with 44.3% in the United Kingdom. D >60 accounted for 22.6% in the United Kingdom versus 2.4% in the United States. In the United Kingdom, 64.2% of D >60 clustered in 2 metropolitan centers. In the United States, there was marked inter-regional variation. A total of 78.3% of the US DCD allografts were used locally. One- and 5-y unadjusted DCD graft survival was higher in the United Kingdom versus the United States (87.3% versus 81.4%, and 78.0% versus 71.3%, respectively; P < 0.001). One- and 5-y D >60 graft survival was higher in the United Kingdom (87.3% versus 68.1%, and 77.9% versus 51.4%, United Kingdom versus United States, respectively; P < 0.001). In both groups, grafts from donors ≤30 y had the best survival. Survival was similar for donors aged 41 to 50 versus 51 to 60 in both cohorts. CONCLUSIONS: Compared with the United Kingdom, older DCD LT utilization remained low in the United States, with worse D >60 survival. Nonetheless, present data indicate similar survival for older donors aged ≤60, supporting an extension of the current US DCD age cutoff.