Prognostic Relevance of the Eighth Edition of TNM Classification for Resected Perihilar Cholangiocarcinoma
Objectives: In our study, we evaluated and compared the prognostic value and performance of the 6th, 7th, and 8th editions of the American Joint Committee on Cancer (AJCC) staging system in patients undergoing surgery for perihilar cholangiocarcinoma (PHC). Methods: Patients undergoing liver surgery with curative intention for PHC between 2002 and 2019 were identified from a prospective database. Histopathological parameters and stage of the PHC were assessed according to the 6th, 7th, and 8th editions of the tumor node metastasis (TNM) classification. The prognostic accuracy of the staging systems was compared using the area under the receiver operating characteristic curve (AUC). Results: Data for a total of 95 patients undergoing liver resection for PHC were analyzed. The median overall survival time was 21 months (95% CI 8.1–33.9), and the three- and five-year survival rates were 46.1% and 36.2%, respectively. Staging according to the 8th edition vs. the 7th edition resulted in the reclassification of 25 patients (26.3%). The log-rank p-values for the 7th and 8th editions were highly statistically significant (p ≤ 0.01) compared to the 6th edition (p = 0.035). The AJCC 8th edition staging system showed a trend toward better discrimination, with an AUC of 0.69 (95% CI: 0.52–0.84) compared to 0.61 (95% CI: 0.51–0.73) for the 7th edition. Multivariate survival analysis revealed male gender, age >65 years, positive resection margins, presence of distant metastases, poor tumor differentiation, lymph node involvement, and the absence of caudate lobe resection as independent predictors of poor survival (p < 0.05). Conclusions: In the current study, the newly released 8th edition of the AJCC staging system showed no significant benefit compared to the previous 7th edition in predicting the prognosis of patients undergoing liver resection for perihilar cholangiocarcinoma. Further research, for instance identifying new prognostic markers or staging criteria, may help to improve the prognostic value of the AJCC staging system for PHC and the prediction of individual patient outcomes.
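As a rough illustration of the staging comparison above, the following Python sketch runs a log-rank test across stage groups and estimates discrimination as an AUC. All column names, the CSV file, and the crude handling of censoring are hypothetical assumptions, not the authors' actual analysis.

```python
# Hypothetical sketch: log-rank test across stage groups and a crude AUC
# comparison of two TNM editions. Column names and file are assumptions.
import pandas as pd
from lifelines.statistics import multivariate_logrank_test
from sklearn.metrics import roc_auc_score

df = pd.read_csv("phc_cohort.csv")  # one row per resected patient (hypothetical)

# Log-rank test: does overall survival differ across 8th-edition stage groups?
lr = multivariate_logrank_test(
    df["survival_months"],   # follow-up time in months
    df["stage_8th"],         # ordinal stage code under the 8th edition
    df["death_observed"],    # 1 = death observed, 0 = censored
)
print(f"log-rank p (8th edition): {lr.p_value:.4f}")

# Crude discrimination: AUC of the ordinal stage for 3-year mortality, limited
# to patients with complete 3-year follow-up (ignores censoring for simplicity).
complete = df[(df["death_observed"] == 1) | (df["survival_months"] >= 36)]
died_3y = ((complete["death_observed"] == 1)
           & (complete["survival_months"] < 36)).astype(int)
for edition in ("stage_7th", "stage_8th"):
    print(edition, roc_auc_score(died_3y, complete[edition]))
```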
Influence of Intraoperative Hemodynamic Parameters on Outcome in Simultaneous Pancreas–Kidney Transplant Recipients
Objectives: Adequate organ perfusion, as well as appropriate blood pressure levels at the time of unclamping, is crucial for early and long-term graft function and outcome in simultaneous pancreas–kidney transplantation (SPKT). However, the optimal intraoperative mean arterial pressure (MAP) level has not been well defined. Methods: From a prospectively collected database, the medical data of 105 patients undergoing SPKT at our center were retrospectively analyzed. A receiver operating characteristic (ROC) analysis was first performed to determine the optimal cut-off value for MAP at reperfusion for predicting early pancreatic graft function. Based on these results, we divided the patients according to their MAP values at reperfusion into a ≤91 mmHg group and a >91 mmHg group (n = 58 patients). Clinicopathological characteristics and outcomes, as well as early graft function and long-term survival, were retrospectively analyzed. Results: Donor and recipient characteristics were comparable between both groups. Rates of postoperative complications were significantly higher in the ≤91 mmHg group (vascular thrombosis of the pancreas: 7 (14%) versus 2 (3%); p = 0.03; pancreatitis/intraabdominal abscess: 10 (21%) versus 4 (7%); p = 0.03; renal delayed graft function (DGF): 11 (23%) versus 5 (9%); p = 0.03; postreperfusion urine output: 106 ± 50 mL versus 195 ± 45 mL; p = 0.04). There were no significant differences in intraoperative volume repletion, central venous pressure (CVP), use of vasoactive inotropic agents, or metabolic outcome. Five-year pancreas graft survival was significantly higher in the >91 mmHg group (82%). Conclusion: A MAP above 91 mmHg at the time point of reperfusion was associated with a reduced rate of postoperative complications and with better recovery and preservation of long-term graft function, thus increasing long-term survival in SPKT recipients.
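The cut-off determination the abstract describes, an ROC analysis with the threshold chosen at the maximum Youden index (J = sensitivity + specificity - 1), can be sketched as follows; the data loading and column names are hypothetical, and the original analysis may differ in detail.

```python
# Hypothetical sketch: ROC analysis with a Youden-index-optimal MAP cut-off.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("spkt_cohort.csv")      # hypothetical per-patient table
y = df["good_early_graft_function"]      # 1 = good early pancreas graft function
score = df["map_at_reperfusion_mmHg"]    # intraoperative MAP at unclamping

fpr, tpr, thresholds = roc_curve(y, score)
youden_j = tpr - fpr                     # Youden index at each candidate threshold
best = np.argmax(youden_j)

print(f"AUC: {roc_auc_score(y, score):.2f}")
print(f"optimal MAP cut-off: {thresholds[best]:.0f} mmHg "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```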
Securing Wireless Communication in Critical Infrastructure: Challenges and Opportunities
Critical infrastructure constitutes the foundation of every society. While
it traditionally relied solely on dedicated cable-based communication, this
infrastructure is rapidly transforming into highly digitized and interconnected
systems which increasingly rely on wireless communication. Besides providing
tremendous benefits, especially affording the easy, cheap, and flexible
interconnection of a large number of assets spread over larger geographic
areas, wireless communication in critical infrastructure also raises unique
security challenges. Most importantly, the shift from dedicated private wired
networks to heterogeneous wireless communication over public and shared
networks requires significantly more involved security measures. In this paper,
we identify the most relevant challenges resulting from the use of wireless
communication in critical infrastructure and use those to identify a
comprehensive set of promising opportunities to preserve the high security
standards of critical infrastructure even when switching from wired to wireless
communication.
Comment: Author's version of a paper accepted for publication in Proceedings of the 20th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MobiQuitous 2023).
Fitness, physical activity, and exercise in multiple sclerosis
Background
A moderate to high level of physical activity, including regular exercise, represents an established behavioral and rehabilitative approach for persons with multiple sclerosis (pwMS). Although exercise is increasingly proposed to limit disease activity and progression, high-quality evidence for such effects is lacking.
Objective
The objective of the study is to provide valuable information for MS clinicians and researchers by systematically evaluating the current state of evidence on (i) whether exercise interventions affect established clinical measures of disease activity and progression in pwMS (i.e., EDSS, relapse rate, lesion load, brain volume, MSFC) and (ii) how physical activity and fitness levels interact with these measures.
Methods
A literature search was conducted in MEDLINE, EMBASE, CINAHL, and SPORTDiscus. Evidence quality was evaluated according to the standards published by the American Academy of Neurology.
Results
It is likely that exercise improves the MSFC score, whereas the EDSS score, lesion load, and brain volume are likely to remain unchanged over the intervention period. It is possible that exercise decreases the relapse rate. Results from cross-sectional studies indicate beneficial effects of a high physical activity or fitness level on clinical measures; however, this is not corroborated by high-quality evidence.
Conclusions
A (supportive) disease-modifying effect of exercise in pwMS cannot be concluded. The rather low quality of evidence from existing RCTs underlines the need to conduct more well-designed studies assessing different measures of disease activity or progression as primary end points. A major limitation of existing studies is the short intervention duration, which limits the potential for meaningful exercise-induced effects on most disability measures. Findings from cross-sectional studies are difficult to contextualize regarding clinical importance due to their solely associative character and low evidence quality.
Impact of Body Mass Index on Tumor Recurrence in Patients Undergoing Liver Resection for Perihilar Cholangiocarcinoma (pCCA)
Background: The association of body mass index (BMI) with the long-term prognosis and outcome of patients with perihilar cholangiocarcinoma (pCCA) has not been well defined. The aim of this study was to evaluate the clinicopathologic and oncologic outcomes of patients with pCCA undergoing resection, according to their BMI. Methods: Patients undergoing liver resection with curative intention for pCCA at a tertiary German hepatobiliary (HPB) center were identified from a prospective database. Patients were classified as normal weight (BMI 18.5–24.9 kg/m2), overweight (BMI 25.0–29.9 kg/m2) and obese (BMI >30 kg/m2) according to their BMI. The impact of clinical and histopathological characteristics on recurrence-free survival (RFS) was assessed using Cox proportional hazard regression analysis among patients of all BMI groups. Results: Among a total of 95 patients undergoing liver resection with curative intention for pCCA in the analytic cohort, 48 patients (50.5%) had normal weight, 33 (34.7%) were overweight and 14 patients (14.7%) were obese. After a median follow-up of 4.3 ± 2.9 years, recurrence was observed in a total of 53 patients (56%). The cumulative recurrence probability was higher in obese and overweight patients than in normal weight patients (5-year recurrence rate: obese: 82% versus overweight: 81% versus normal weight: 58%; p = 0.02). Overall, the 1-, 3-, 5- and 10-year recurrence-free survival rates were 68.5%, 44.6%, 28.9% and 13%, respectively. On multivariable analysis, increased BMI (HR 1.08, 95% CI: 1.01–1.16; p = 0.021), poor/moderate tumor differentiation (HR 2.49, 95% CI: 1.2–5.2; p = 0.014), positive lymph node status (HR 2.01, 95% CI: 1.11–3.65; p = 0.021), positive resection margins (HR 1.89, 95% CI: 1.02–3.4; p = 0.019) and positive perineural invasion (HR 2.92, 95% CI: 1.02–8.3; p = 0.045) were independent prognostic risk factors for inferior RFS. Conclusion: Our study shows that a high BMI is significantly associated with an increased risk of recurrence after liver resection with curative intention for pCCA. This factor should be considered in future studies to better predict patients' individual prognosis and outcome based on their BMI.
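A minimal sketch of the multivariable Cox proportional hazards analysis of RFS described above, assuming hypothetical column names as stand-ins for the covariates named in the abstract:

```python
# Hypothetical sketch: multivariable Cox model for recurrence-free survival.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pcca_cohort.csv")  # hypothetical per-patient table

cph = CoxPHFitter()
cph.fit(
    df[[
        "rfs_years",             # time to recurrence or last follow-up
        "recurrence",            # 1 = recurrence observed, 0 = censored
        "bmi",                   # continuous BMI (HR per 1 kg/m2 increase)
        "poor_differentiation",  # 1 = poor/moderate tumor differentiation
        "node_positive",         # 1 = positive lymph node status
        "margin_positive",       # 1 = positive resection margin
        "perineural_invasion",   # 1 = perineural invasion present
    ]],
    duration_col="rfs_years",
    event_col="recurrence",
)
cph.print_summary()  # hazard ratios with 95% CIs and p-values
```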
Influence of Multiple Donor Renal Arteries on the Outcome and Graft Survival in Deceased Donor Kidney Transplantation
Aim: Complex arterial reconstruction in kidney transplantation (KT) using kidneys from deceased donors (DD) warrants additional study, since little is known about its effects on the mid- and long-term outcome and graft survival. Methods: A total of 451 patients receiving deceased donor KT in our department between 1993 and 2017 were included in our study. Patients were divided into three groups according to the number of arteries and anastomoses: (A) 1 renal artery, 1 arterial anastomosis (N = 369); (B) >1 renal artery, 1 arterial anastomosis (N = 47); and (C) >1 renal artery, >1 arterial anastomosis (N = 35). Furthermore, the influence of the localization of the arterial anastomosis (common iliac artery (CIA) versus non-CIA) was analyzed. Clinicopathological characteristics, outcome, and graft and patient survival of all groups were compared retrospectively. Results: With growing vascular complexity, warm ischemia time increased significantly (groups A, B, and C: 40 ± 19 min, 45 ± 19 min, and 50 ± 17 min, respectively; p = 0.006). Furthermore, the duration of operation was prolonged, although this did not reach significance (groups A, B, and C: 175 ± 98 min, 180 ± 35 min, and 210 ± 43 min, respectively; p = 0.352). There were no significant differences regarding surgical complications, post-transplant kidney function (delayed graft function, initial non-function, episodes of acute rejection), or long-term graft survival. Regarding the localization of the arterial anastomosis, non-CIA placement was an independent prognostic factor for deep vein thrombosis in multivariate analysis (CIA versus non-CIA: OR 11.551; 95% CI, 1.218–109.554; p = 0.033). Conclusion: Multiple donor renal arteries should not be considered a contraindication to deceased donor KT, as morbidity rates and long-term outcomes seem to be comparable with grafts with single arteries and less complex anastomoses.
Predictive Value of HAS-BLED Score Regarding Bleeding Events and Graft Survival following Renal Transplantation
Objective: Due to the high prevalence and incidence of cardio- and cerebrovascular diseases
among dialysis-dependent patients with end-stage renal disease (ESRD) scheduled for kidney
transplantation (KT), the use of antiplatelet therapy (APT) and/or anticoagulant drugs in this patient
population is common. However, these patients share a high risk of complications, either due to
thromboembolic or bleeding events, which makes adequate peri- and post-transplant anticoagulation
management challenging. Predictive clinical models, such as the HAS-BLED score developed for
predicting major bleeding events in patients under anticoagulation therapy, could be helpful tools for
the optimization of antithrombotic management and could reduce peri- and postoperative morbidity
and mortality. Methods: Data from 204 patients undergoing kidney transplantation (KT) between
2011 and 2018 at the University Hospital Leipzig were retrospectively analyzed. Patients were
stratified and categorized postoperatively into the prophylaxis group (group A)—patients without
pretransplant anticoagulation/antiplatelet therapy and receiving postoperative heparin in prophylactic
doses—and into the (sub)therapeutic group (group B)—patients with postoperative continued
use of their pretransplant antithrombotic medication in (sub)therapeutic doses. The primary outcome
was the incidence of postoperative bleeding events, which was evaluated for a possible association
with the use of antithrombotic therapy. Secondary analyses were conducted for the associations of
other potential risk factors, specifically the HAS-BLED score, with allograft outcome. Univariate and
multivariate logistic regression as well as a Cox proportional hazard model were used to identify risk
factors for long-term allograft function, outcome and survival. The calibration and prognostic accuracy
of the risk models were evaluated using the Hosmer–Lemeshow test (HLT) and the area under
the receiver operating characteristic curve (AUC). Results: In total, 94 of 204 (47%) patients received
(sub)therapeutic antithrombotic therapy after transplantation and 108 (53%) patients received
prophylactic antithrombotic therapy. A total of 61 (29%) patients showed signs of postoperative
bleeding. The incidence (p < 0.01) and timepoint of bleeding (p < 0.01) varied significantly between
the different antithrombotic treatment groups. After applying multivariate analyses, pre-existing
cardiovascular disease (CVD) (OR 2.89 (95% CI: 1.02–8.21); p = 0.04), procedure-specific complications
(blood loss (OR 1.03 (95% CI: 1.0–1.05); p = 0.014), Clavien–Dindo classification > grade II (OR 1.03
(95% CI: 1.0–1.05); p = 0.018)), HAS-BLED score (OR 1.49 (95% CI: 1.08–2.07); p = 0.018), vitamin K antagonists
(VKA) (OR 5.89 (95% CI: 1.10–31.28); p = 0.037), the combination of APT and therapeutic
heparin (OR 5.44 (95% CI: 1.33–22.31); p = 0.018) as well as postoperative therapeutic heparin (OR 3.37
(95% CI: 1.37–8.26); p < 0.01) were independently associated with an increased risk for bleeding. The
intraoperative use of heparin, prior antiplatelet therapy, and APT in combination with prophylactic heparin were not associated with an increased bleeding risk. A higher recipient body mass index (BMI)
(OR 0.32 per 10 kg/m2 increase in BMI (95% CI: 0.12–0.91); p = 0.023) as well as living donor KT
(OR 0.43 (95% CI: 0.18–0.94); p = 0.036) were associated with a decreased risk for bleeding. Regarding
bleeding events and graft failure, the HAS-BLED risk model demonstrated good calibration (bleeding
and graft failure: HLT: chi-square: 4.572, p = 0.802, versus chi-square: 6.52, p = 0.18, respectively) and
moderate predictive performance (bleeding AUC: 0.72 (0.63–0.79); graft failure: AUC: 0.7 (0.6–0.78)).
Conclusions: In our current study, we could demonstrate the HAS-BLED risk score as a helpful tool
with acceptable predictive accuracy regarding bleeding events and graft failure following KT. The
intensified monitoring and precise stratification/assessment of bleeding risk factors may help
to identify patients at higher risk of bleeding, improve individualized anticoagulation decisions,
and guide the choice of antithrombotic therapy in order to optimize outcome after kidney transplantation.
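The Hosmer–Lemeshow calibration check used above has no standard scikit-learn implementation; a common decile-of-risk variant can be sketched as follows. The grouping scheme and inputs are assumptions, not the authors' exact procedure.

```python
# Hypothetical sketch: Hosmer-Lemeshow goodness-of-fit test for a binary
# risk model (e.g., HAS-BLED-derived bleeding probabilities vs. outcomes).
import pandas as pd
from scipy.stats import chi2

def hosmer_lemeshow(y_true, y_prob, n_groups=10):
    """Return (chi-square statistic, p-value) over decile-of-risk groups."""
    df = pd.DataFrame({"y": y_true, "p": y_prob})
    df["group"] = pd.qcut(df["p"], q=n_groups, duplicates="drop")
    stat = 0.0
    for _, g in df.groupby("group", observed=True):
        observed = g["y"].sum()   # observed events in the group
        expected = g["p"].sum()   # expected events = sum of predicted risks
        n = len(g)
        stat += (observed - expected) ** 2 / (expected * (1 - expected / n))
    dof = df["group"].nunique() - 2
    return stat, chi2.sf(stat, dof)

# Hypothetical usage: y = bleeding events (0/1), p = model-predicted risk
# stat, p_value = hosmer_lemeshow(y, p)
```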
Correlation of Different Serum Biomarkers with Prediction of Early Pancreatic Graft Dysfunction Following Simultaneous Pancreas and Kidney Transplantation
Background: Despite recent advances and refinements in the perioperative management of simultaneous pancreas–kidney transplantation (SPKT), early pancreatic graft dysfunction (ePGD) remains a critical problem with serious impairment of early and long-term graft function and outcome. Hence, we evaluated a panel of classical blood serum markers for their value in predicting early graft dysfunction in patients undergoing SPKT. Methods: From a prospectively collected database, the medical data of 105 patients undergoing SPKT between 1998 and 2018 at our center were retrospectively analyzed. The primary study outcome was the occurrence of early pancreatic graft dysfunction (ePGD); the secondary study outcome was early renal graft dysfunction (eRGD), as well as all other outcome parameters associated with graft function. In this context, ePGD was defined as pancreas graft-related complications including graft pancreatitis, pancreatic abscess/peritonitis, delayed graft function, graft thrombosis, bleeding, rejection and the consecutive need for re-laparotomy due to graft-related complications within 3 months. With regard to analyzing ePGD, serum levels of white blood cell count (WBC), C-reactive protein (CRP), procalcitonin (PCT) and pancreatic lipase, as well as the neutrophil–lymphocyte ratio (NLR) and platelet–lymphocyte ratio (PLR), were measured preoperatively and on postoperative days (POD) 1, 2, 3 and 5. Further, peak serum levels of CRP and lipase during the first 72 h were evaluated. Receiver operating characteristic (ROC) curve analyses were performed to assess their predictive value for ePGD and eRGD, and cut-off levels were calculated with the Youden index. Significant diagnostic biochemical cut-offs as well as other prognostic clinical factors were tested in a multivariate logistic regression model. Results: Of the 105 patients included, 43 patients (41%) and 28 patients (27%) developed ePGD and eRGD following SPKT, respectively. The mean WBC, PCT, NLR, PLR, CRP and lipase levels were significantly higher on most PODs in the ePGD group compared to the non-ePGD group. ROC analysis indicated that peak lipase (AUC: 0.82) and peak CRP levels (AUC: 0.89) were highly predictive for ePGD after SPKT, and the combination of both achieved the highest AUC (0.92; p < 0.01). In the multivariate model, peak lipase levels > 150 IU/L (OR 2.9 (95% CI: 1.2–7.13), p = 0.021), CRP levels ≥ 180 ng/mL on POD 2 (OR 3.6 (95% CI: 1.54–8.34), p < 0.01) and CRP levels > 150 ng/mL on POD 3 (OR 4.5 (95% CI: 1.7–11.4), p < 0.01) were revealed as independent biochemical predictive variables for ePGD after transplantation. Conclusions: In the current study, the combination of peak lipase and CRP levels was highly effective in predicting the development of early pancreatic graft dysfunction following SPKT. In contrast, for early renal graft dysfunction the predictive value of these parameters was less sensitive. Intensified monitoring of these parameters may be helpful for identifying patients at a higher risk of pancreatic ischemia-reperfusion injury and various IRI-associated postoperative complications leading to ePGD and thus a deteriorated outcome.
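The combined lipase + CRP predictor reported above can be illustrated by fitting a logistic regression on both markers and scoring its predicted probabilities with an AUC; all file and column names below are hypothetical.

```python
# Hypothetical sketch: combining two biomarkers into one risk score and
# comparing single-marker vs. combined discrimination (AUC).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("spkt_biomarkers.csv")        # hypothetical per-patient table
X = df[["peak_lipase", "peak_crp"]]            # peak values in the first 72 h
y = df["epgd"]                                 # 1 = early pancreatic graft dysfunction

model = LogisticRegression().fit(X, y)
combined_score = model.predict_proba(X)[:, 1]  # combined risk score

print(f"peak lipase alone: AUC {roc_auc_score(y, df['peak_lipase']):.2f}")
print(f"peak CRP alone:    AUC {roc_auc_score(y, df['peak_crp']):.2f}")
print(f"combined model:    AUC {roc_auc_score(y, combined_score):.2f}")
```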
The Value of Graft Implantation Sequence in Simultaneous Pancreas-Kidney Transplantation on the Outcome and Graft Survival
Background/Objectives: The sequence of graft implantation in simultaneous pancreas-kidney transplantation (SPKT) warrants additional study and more targeted focus, since little is known about its short- and long-term effects on the outcome and graft survival after transplantation. Material and methods: A total of 103 patients receiving SPKT in our department between 1999 and 2015 were included in the study. Patients were divided according to the sequence of graft implantation into pancreas-first (PF, n = 61) and kidney-first (KF, n = 42) groups. Clinicopathological characteristics, outcome and survival were reviewed retrospectively. Results: Donor and recipient characteristics were similar. Rates of post-operative complications and graft dysfunction were significantly higher in the PF group compared with the KF group (episodes of acute rejection within the first year after SPKT: 11 (18%) versus 2 (4.8%); graft pancreatitis: 18 (18%) versus 2 (4.8%), p = 0.04; vascular thrombosis of the pancreas: 9 (14.8%) versus 1 (2.4%), p = 0.03; and delayed graft function of the kidney: 12 (19.6%) versus 2 (4.8%), p = 0.019). The three-month pancreas graft survival was significantly higher in the KF group (PF: 77% versus KF: 92.1%; p = 0.037). No significant difference was observed in pancreas graft survival five years after transplantation (PF: 71.6% versus KF: 84.8%; p = 0.104). Kidney graft survival was similar between the two groups. Multivariate analysis revealed the order of graft implantation as an independent prognostic factor for graft survival three months (HR 2.6, 1.3–17.1, p = 0.026) and five years (HR 3.7, 2.1–23.4, p = 0.040) after SPKT. Conclusion: Our data indicate that implantation of the pancreas prior to the kidney during SPKT particularly influences the early post-operative outcome and the survival rate of pancreas grafts.
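Group-wise graft survival comparisons like the PF-versus-KF analysis above are typically done with Kaplan-Meier estimates and a log-rank test; a sketch under hypothetical column names follows.

```python
# Hypothetical sketch: Kaplan-Meier estimates and a log-rank test for
# pancreas-first (PF) vs. kidney-first (KF) pancreas graft survival.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("spkt_sequence.csv")   # hypothetical per-patient table
pf = df[df["sequence"] == "PF"]         # pancreas implanted first
kf = df[df["sequence"] == "KF"]         # kidney implanted first

kmf = KaplanMeierFitter()
for name, grp in (("PF", pf), ("KF", kf)):
    kmf.fit(grp["graft_survival_months"], grp["graft_loss"], label=name)
    print(name, "3-month graft survival:", kmf.predict(3))

result = logrank_test(
    pf["graft_survival_months"], kf["graft_survival_months"],
    event_observed_A=pf["graft_loss"], event_observed_B=kf["graft_loss"],
)
print(f"log-rank p: {result.p_value:.3f}")
```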
Detailed analysis of metagenome datasets obtained from biogas-producing microbial communities residing in biogas reactors does not indicate the presence of putative pathogenic microorganisms
Background: In recent years, biogas plants in Germany have been suspected of being involved in the amplification and
dissemination of pathogenic bacteria causing severe infections in humans and animals. In particular, biogas plants
have been discussed as contributing to the spread of Escherichia coli infections in humans or chronic botulism in cattle
caused by Clostridium botulinum. Metagenome datasets of microbial communities from an agricultural biogas plant
as well as from anaerobic lab-scale digesters operating at different temperatures and conditions were analyzed for
the presence of putative pathogenic bacteria and virulence determinants by various bioinformatic approaches.
Results: All datasets featured a low abundance of reads that were taxonomically assigned to the genus Escherichia
or further selected genera comprising pathogenic species. Higher numbers of reads were taxonomically assigned to
the genus Clostridium. However, only very few sequences were predicted to originate from pathogenic clostridial
species. Moreover, mapping of metagenome reads to complete genome sequences of selected pathogenic
bacteria revealed that not the pathogenic species themselves, but only species more or less closely related to
pathogenic ones, are present in the fermentation samples analyzed. Likewise, known virulence determinants could hardly be
detected. Only a marginal number of reads showed similarity to sequences described in the Microbial Virulence
Database MvirDB such as those encoding protein toxins, virulence proteins or antibiotic resistance determinants.
Conclusions: The findings of this first study of metagenomic sequence reads of biogas-producing microbial
communities suggest that the risk of disseminating pathogenic bacteria by applying digestates from biogas
fermentations as fertilizers is low, because the obtained results do not indicate the presence of putative pathogenic
microorganisms in the samples analyzed.
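The read-level screening described above can be illustrated with a toy counting step over a per-read taxonomic assignment table. The two-column input format and file name are assumptions; real classifier outputs are richer and genus counts alone do not establish the presence of pathogenic species.

```python
# Hypothetical sketch: count metagenome reads assigned to genera of interest
# from a simple "read_id <tab> genus" table (assumed format).
from collections import Counter

GENERA_OF_INTEREST = {"Escherichia", "Clostridium", "Salmonella", "Listeria"}

counts = Counter()
total = 0
with open("read_assignments.tsv") as handle:
    for line in handle:
        read_id, genus = line.rstrip("\n").split("\t")
        total += 1
        if genus in GENERA_OF_INTEREST:
            counts[genus] += 1

for genus, n in counts.most_common():
    print(f"{genus}: {n} reads ({100 * n / total:.3f}% of {total})")
```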