
    A Systematic Review of the Perforated Duodenal Diverticula: Lessons Learned from the Last Decade

    Background: The perforated duodenal diverticulum remains a rare clinical entity, the optimal management of which has not been well established. Historically, primary surgery has been the preferred treatment modality. This was called into question during the last decade, with the successful application of non-operative therapy in selected patients. The aim of this systematic review is to identify cases of perforated duodenal diverticula published over the past decade and to assess any subsequent evolution in treatment. Methods: A systematic review of English and non-English articles reporting on perforated duodenal diverticula using MEDLINE (2008-2020) was performed. Only cases of perforated duodenal diverticula in adults (> 18 years) that reported on diagnosis and treatment were included. Results: Some 328 studies were identified, of which 31 articles met the inclusion criteria. These studies included a total of 47 patients with perforated duodenal diverticula. This series suggests a trend towards conservative management, with 34% (16/47) of patients managed non-operatively. In 31% (5/16) of patients initially managed conservatively, a step-up approach to surgical intervention was required. Conclusion: Under careful observation, conservative treatment of perforated duodenal diverticula appears to be an acceptable and safe treatment strategy in stable patients without signs of peritonitis. For patients who fail to respond to conservative treatment, a step-up approach to percutaneous drainage or surgery can be applied. If surgery is required, competence in techniques ranging from simple diverticulectomy to Roux-en-Y gastric diversion or even Whipple's procedure may be needed, depending on tissue friability and diverticular collar size. Keywords: Duodenal diverticulum; Duodenum; Management; Perforation

    Pre-Emptive Endoluminal Negative Pressure Therapy at the Anastomotic Site in Minimally Invasive Transthoracic Esophagectomy (the preSPONGE Trial): Study Protocol for a Multicenter Randomized Controlled Trial

    Introduction: Anastomotic leakage (AL) accounts for a significant proportion of morbidity following oesophagectomy. Endoluminal negative pressure (ENP) therapy via a specifically designed polyurethane foam (EsoSponge®, B.Braun Medical, Melsungen, Germany) has become the standard of care for AL in many specialised centres. The prophylactic application of this technique (pENP) aims to reduce postoperative morbidity and is a novel approach that has not yet been investigated in a prospective study. The aim of this study is therefore to assess the effect of pENP at the anastomotic site in high-risk patients undergoing minimally invasive transthoracic Ivor Lewis oesophagectomy. Methods and analysis: The study is a prospective, multi-centre, two-arm, parallel-group, randomised controlled trial and will be conducted in two phases. Phase one is a randomised feasibility and safety pilot trial involving 40 consecutive patients. After a definitive sample size calculation, additional patients will be included accordingly during phase two. The primary outcome of the study will be the postoperative length of hospitalisation until previously defined "fit for discharge" criteria are reached. Secondary outcomes will include postoperative morbidity, mortality and postoperative AL rates based on 90-day follow-up. A confirmatory intention-to-treat analysis will be performed. Ethics and dissemination: The ethics committee of the University of Zurich approved this study (2019-00562), which has been registered with ClinicalTrials.gov on 14.11.2019 (NCT04162860) and the Swiss National Clinical Trials Portal (SNCTP000003524). The results of the study will be published and presented at appropriate conferences.
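    Since the protocol defers the definitive sample size calculation until after the pilot phase, the following Python sketch only illustrates a standard two-proportion calculation such as might inform the secondary AL-rate endpoint; the 15% vs. 5% leak rates, alpha, and power are invented placeholders, not trial parameters.

        from math import ceil
        from statistics import NormalDist

        def two_proportion_sample_size(p1, p2, alpha=0.05, power=0.80):
            # Per-arm n for a two-sided comparison of two proportions
            # (normal approximation).
            z = NormalDist().inv_cdf
            z_a = z(1 - alpha / 2)  # ~1.96 for alpha = 0.05
            z_b = z(power)          # ~0.84 for 80% power
            pooled_var = p1 * (1 - p1) + p2 * (1 - p2)
            return ceil((z_a + z_b) ** 2 * pooled_var / (p1 - p2) ** 2)

        # Hypothetical AL rates: 15% with standard care vs. 5% with pENP.
        print(two_proportion_sample_size(0.15, 0.05))  # -> 138 per arm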

    Expert opinion on NSCLC small specimen biomarker testing - Part 2: Analysis, reporting, and quality assessment

    The diagnostic work-up for non-small cell lung cancer (NSCLC) requires biomarker testing to guide therapy choices. This article is the second of a two-part series. In Part 1, we summarised evidence-based recommendations for obtaining and processing small specimen samples (i.e. pre-analytical steps) from patients with advanced NSCLC. Here, in Part 2, we summarise evidence-based recommendations relating to the analytical steps of biomarker testing (and associated reporting and quality assessment) of small specimen samples in NSCLC. As the number of biomarkers for actionable (genetic) targets and approved targeted therapies continues to increase, simultaneous testing of multiple actionable oncogenic drivers using next-generation sequencing (NGS) becomes imperative, as set forth in European Society for Medical Oncology guidelines. This is particularly relevant in advanced NSCLC, where tissue specimens are typically limited and NGS may help avoid tissue exhaustion compared with sequential biomarker testing. Despite guideline recommendations, significant discrepancies in access to NGS persist across Europe, primarily due to reimbursement constraints. The use of increasingly complex testing methods also has implications for the reporting of results. Molecular testing reports should include clinical interpretation, with additional commentary on sample adequacy as appropriate. Molecular tumour boards are recommended to facilitate the interpretation of complex genetic information arising from NGS, and to collaboratively determine the optimal treatment for patients with NSCLC. Finally, whichever testing modality is employed, it is essential that adequate internal and external validation and quality control measures are implemented.

    Expert opinion on NSCLC small specimen biomarker testing - Part 1: Tissue collection and management

    Biomarker testing is crucial for treatment selection in advanced non-small cell lung cancer (NSCLC). However, the quantity of available tissue often presents a key constraint for patients with advanced disease, where minimally invasive tissue biopsy typically returns small samples. In Part 1 of this two-part series, we summarise evidence-based recommendations relating to small sample processing for patients with NSCLC. Generally, tissue biopsy techniques that deliver the greatest quantity and quality of tissue with the least risk to the patient should be selected. Rapid on-site evaluation can help to ensure sufficient sample quality and quantity. Sample processing should be managed according to biomarker testing requirements, because tissue fixation methodology influences downstream nucleic acid, protein and morphological analyses. Accordingly, 10% neutral buffered formalin is recommended as an appropriate fixative, and the duration of fixation should not exceed 24-48 h. Tissue-sparing techniques, including the 'one biopsy per block' approach and small sample cutting protocols, can help preserve tissue. Cytological material (formalin-fixed paraffin-embedded [FFPE] cytology blocks and non-FFPE samples such as smears and touch preparations) can be an excellent source of nucleic acid, providing either primary or supplementary patient material to complete morphological and molecular diagnoses. Considerations on biomarker testing, reporting and quality assessment are discussed in Part 2.

    Variation in pre-PCR processing of FFPE samples leads to discrepancies in BRAF and EGFR mutation detection: a diagnostic RING trial.

    Aims: Mutation detection accuracy has been described extensively; however, it is surprising that pre-PCR processing of formalin-fixed paraffin-embedded (FFPE) samples has not been systematically assessed in a clinical context. We designed a RING trial to (i) investigate pre-PCR variability, (ii) correlate pre-PCR variation with EGFR/BRAF mutation testing accuracy and (iii) investigate causes of the observed variation. Methods: 13 molecular pathology laboratories were recruited. 104 blinded FFPE curls, including engineered FFPE curls, cell-negative FFPE curls and control FFPE tissue samples, were distributed to participants for pre-PCR processing and mutation detection. Follow-up analysis was performed to assess sample purity, DNA integrity and DNA quantitation. Results: The rate of mutation detection failure was 11.9%. Of these failures, 80% were attributed to pre-PCR error. Significant differences in DNA yields across all samples were seen using analysis of variance (p<0.0001), whereas yield variation from engineered samples was not significant (p=0.3782). Two laboratories failed to extract DNA from samples, which may be attributed to operator error. The DNA extraction protocols themselves were not found to contribute significant variation. 10/13 laboratories reported yields averaging 235.8 ng (95% CI 90.7 to 380.9) from cell-negative samples, which was attributed to issues with spectrophotometry. DNA measurements using Qubit fluorometry demonstrated a median fivefold overestimation of DNA quantity by NanoDrop spectrophotometry. DNA integrity and PCR inhibition were not found to contribute significant variation. Conclusions: In this study, we provide evidence demonstrating that variation in pre-PCR steps is prevalent and may detrimentally affect the patient's ability to receive critical therapy. We provide recommendations for preanalytical workflow optimisation that may reduce errors in downstream sequencing and next-generation sequencing library generation.
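    To make the reported quantification discrepancy concrete, here is a minimal Python sketch of the kind of paired comparison behind the median fivefold figure; the readings are invented for illustration and are not data from the trial.

        from statistics import median

        # Paired DNA quantification readings (ng/uL) for the same FFPE
        # extracts -- illustrative values only, not RING trial data.
        nanodrop = [52.0, 38.4, 61.2, 45.5, 70.1]
        qubit    = [10.1,  8.0, 12.5,  9.3, 13.8]

        # Per-sample fold-difference, then the median across samples.
        fold = [n / q for n, q in zip(nanodrop, qubit)]
        print(f"median fold-overestimation: {median(fold):.1f}x")  # ~4.9x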

    Sediment Cores from White Pond, South Carolina, contain a Platinum Anomaly, Pyrogenic Carbon Peak, and Coprophilous Spore Decline at 12.8 ka

    A widespread platinum (Pt) anomaly was recently documented in Greenland ice and 11 North American sedimentary sequences at the onset of the Younger Dryas (YD) event (~12,800 cal yr BP), consistent with the YD Impact Hypothesis. We report high-resolution analyses of a 1-meter section of a lake core from White Pond, South Carolina, USA. After developing a Bayesian age-depth model that brackets the late Pleistocene through early Holocene, we analyzed and quantified the following: (1) Pt and palladium (Pd) abundance, (2) geochemistry of 58 elements, (3) coprophilous spores, (4) sedimentary organic matter (OC and sedaDNA), (5) stable isotopes of C (δ13C) and N (δ15N), (6) soot, (7) aciniform carbon, (8) cryptotephra, (9) mercury (Hg), and (10) magnetic susceptibility. We identified large Pt and Pt/Pd anomalies within a 2-cm section dated to the YD onset (12,785 ± 58 cal yr BP). These anomalies precede a decline in coprophilous spores and correlate with an abrupt peak in soot and C/OC ratios, indicative of large-scale regional biomass burning. We also observed a relatively large excursion in δ15N values, indicating rapid climatic and environmental/hydrological changes at the YD onset. Our results are consistent with the YD Impact Hypothesis and with impact-related environmental and ecological changes.
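    As an illustration of what an "anomaly" means operationally, the Python sketch below flags core levels whose Pt abundance exceeds a background mean by three standard deviations. This is a generic approach with invented numbers, not the authors' actual statistical procedure or data.

        from statistics import mean, stdev

        # Pt abundance (ppb) keyed by core depth (cm); illustrative values only.
        pt_by_depth = {100: 0.8, 102: 0.9, 104: 1.0, 106: 7.5, 108: 1.1, 110: 0.7}

        # Estimate background from levels outside the candidate interval.
        background = [v for d, v in pt_by_depth.items() if d != 106]
        mu, sigma = mean(background), stdev(background)

        # Flag depths exceeding background mean + 3 sigma.
        anomalies = {d: v for d, v in pt_by_depth.items() if v > mu + 3 * sigma}
        print(anomalies)  # {106: 7.5}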

    Packaging Health Services When Resources Are Limited: The Example of a Cervical Cancer Screening Visit

    BACKGROUND: Increasing evidence supporting the value of screening women for cervical cancer once in their lifetime, coupled with mounting interest in scaling up successful screening demonstration projects, presents challenges to public health decision makers seeking to take full advantage of the single-visit opportunity to provide additional services. We present an analytic framework for packaging multiple interventions during a single point of contact, explicitly taking into account a budget and scarce human resources, constraints acknowledged as significant obstacles to the provision of health services in poor countries. METHODS AND FINDINGS: We developed a binary integer programming (IP) model capable of identifying an optimal package of health services to be provided during a single visit for a particular target population. Inputs to the IP model are derived using state-transition models, which compute the lifetime costs and health benefits associated with each intervention. In a simplified example of a single lifetime cervical cancer screening visit, we identified packages of interventions among six diseases that maximized disability-adjusted life years (DALYs) averted, subject to budget and human resource constraints, in four resource-poor regions. Data were obtained from regional reports and surveys from the World Health Organization, international databases, the published literature, and expert opinion. With only a budget constraint, interventions for depression and iron deficiency anemia were packaged with cervical cancer screening, while the more costly breast cancer and cardiovascular disease interventions were not. Including personnel constraints resulted in a shifting of the interventions included in the package, not only across diseases but also between low- and high-intensity intervention options within diseases. CONCLUSIONS: The results of our example suggest several key themes: packaging other interventions during a one-time visit has the potential to increase health gains; the shortage of personnel represents a real-world constraint that can affect the optimal package of services; and the shortage of different types of personnel may influence the contents of the package. Our methods provide a general framework to enhance a decision maker's ability to simultaneously consider costs, benefits, and important nonmonetary constraints. We encourage analysts working on real-world problems to shift from considering the costs and benefits of interventions for a single disease to exploring what synergies might be achievable by thinking across disease burdens.
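    A minimal Python sketch of the kind of binary integer program described above: choose a subset of interventions maximizing DALYs averted subject to budget and personnel constraints. All numbers are invented placeholders; the paper derives its inputs from state-transition models and a real analysis would hand the model to an IP solver rather than brute-force it.

        from itertools import product

        # (name, cost, personnel-hours, DALYs averted) -- illustrative values only.
        interventions = [
            ("cervical cancer screening", 5.0, 0.50, 40.0),
            ("depression care",           2.0, 0.30, 15.0),
            ("iron supplementation",      1.0, 0.10, 10.0),
            ("breast cancer screening",   9.0, 0.60, 20.0),
            ("CVD risk management",       8.0, 0.40, 18.0),
            ("diabetes screening",        4.0, 0.35, 12.0),
        ]
        BUDGET, PERSONNEL = 10.0, 1.0  # per-visit resource caps (assumed units)

        best_value, best_pick = -1.0, ()
        # Enumerate all 2^6 binary decision vectors and keep the feasible best.
        for x in product((0, 1), repeat=len(interventions)):
            cost  = sum(xi * c for xi, (_, c, _, _) in zip(x, interventions))
            hours = sum(xi * h for xi, (_, _, h, _) in zip(x, interventions))
            dalys = sum(xi * d for xi, (_, _, _, d) in zip(x, interventions))
            if cost <= BUDGET and hours <= PERSONNEL and dalys > best_value:
                best_value, best_pick = dalys, x

        chosen = [name for xi, (name, *_) in zip(best_pick, interventions) if xi]
        print(chosen, best_value)

    With these placeholder numbers the search happens to package the depression and iron interventions with cervical screening while excluding the costlier breast cancer and CVD options, mirroring the qualitative pattern reported above.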

    The Future is Big Graphs! A Community View on Graph Processing Systems

    Graphs are by nature unifying abstractions that can leverage interconnectedness to represent, explore, predict, and explain real- and digital-world phenomena. Although real users and consumers of graph instances and graph workloads understand these abstractions, future problems will require new abstractions and systems. What needs to happen in the next decade for big graph processing to continue to succeed?
    Comment: 12 pages, 3 figures; a collaboration between the large-scale systems and data management communities; work started at the Dagstuhl Seminar 19491 on Big Graph Processing Systems; to be published in the Communications of the ACM.