159 research outputs found

    Salivary stones of the Glandula submandibularis: stone removal with preservation of the gland


    Improving IBD diagnosis and monitoring by understanding preanalytical, analytical and biological fecal calprotectin variability

    BACKGROUND: The appropriate clinical use of fecal calprotectin (fCal) might be compromised by incomplete harmonization between assays and within- and between-subjects variability. Our aim was to investigate the analytical and biological variability of fCal in order to provide tools for interpreting fCal in the clinical setting. METHODS: Experiments were conducted to investigate the effects of temperature and storage time on fCal. Thirty-nine controls were enrolled to verify biological variability, and a case-control study was conducted on 134 controls and 110 IBD patients to compare the clinical effectiveness of three different fCal assays: ELISA, CLIA and turbidimetry. RESULTS: A 12% decline in fCal levels was observed within 24 h following stool collection, irrespective of storage temperature. Samples became unstable after longer storage at room temperature. Within- and between-subjects fCal biological variability, at 31% and 72% respectively, resulted in a reference change value (RCV) in the region of 100%. fCal sensitivity in distinguishing between controls and IBD patients is satisfactory (68%), and the specificity high (93%), among young (<65 years) but not among older (≥65 years) subjects (ROC area: 0.584; 95% CI: 0.399-0.769). Among the young, assays have different optimal thresholds (120 μg/g for ELISA, 50 μg/g for CLIA and 100 μg/g for turbidimetry). CONCLUSIONS: We recommend a standardized preanalytical protocol for fCal, avoiding storage at room temperature for more than 24 h. Different cutoffs are recommended for different fCal assays. In monitoring, the difference between two consecutive measurements appears clinically significant when higher than 100%, i.e. the RCV derived from fCal biological variability.
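
    The RCV cited above follows from the standard combination of analytical and within-subject biological variation. A minimal sketch, assuming the conventional bidirectional formula RCV = √2 · Z · √(CV_A² + CV_I²) with Z = 1.96 and an illustrative analytical CV (the abstract does not report the analytical imprecision):

    import math

    def reference_change_value(cv_analytical: float, cv_within_subject: float, z: float = 1.96) -> float:
        """Bidirectional RCV (%) from analytical and within-subject biological CVs (%)."""
        return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

    # Within-subject biological variability reported in the abstract: 31%.
    # The analytical CV below is an assumed placeholder, not a value from the study.
    cv_i = 31.0   # % within-subject biological variation (from the abstract)
    cv_a = 15.0   # % analytical variation (hypothetical, for illustration only)

    print(f"RCV ~ {reference_change_value(cv_a, cv_i):.0f}%")  # ~95%, i.e. "in the region of 100%"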

    Quality of head and neck ultrasound reporting at university hospitals – a random sample

    Background: Ultrasound is widely used and is a standard diagnostic tool for radiologists, otolaryngologists, and oral and maxillofacial surgeons in the work-up of numerous pathologies. There is agreement that digital documentation is urgently needed to improve and standardize the quality of sonographic documentation, and publications on the implementation of structured reporting in imaging diagnostics, including head and neck sonography, are accumulating. Objective: The present work aims to assess, on a random-sample basis, the quality of routine head and neck sonography reports at a selection of German university otorhinolaryngology (ENT) departments according to the criteria of the Bavarian Association of Statutory Health Insurance Physicians (KVB). Materials and methods: A total of 70 randomly selected, anonymized written reports including image documentation from seven ENT departments were retrospectively analyzed by an experienced KVB examiner with respect to fulfilment of the KVB criteria. The data were evaluated descriptively. Results: Of the 70 reports, 69 were eligible for evaluation. Mean documentation completeness was 80.6%. Nine reports (13%) were fully and correctly documented. The documentation completeness of the individual departments ranged from 68.1% to 93%. Departments using a structured report showed higher completeness than those that did not (88.5% vs. 75%). In 75% of cases, the departments with structured reports also had digital solutions for reporting and image archiving. Conclusion: Overall, there is room for improvement in the completeness and quality of routine head and neck sonography reports at the selected university ENT departments. The implementation of structured reporting templates, the conversion of analogue documentation to digital solutions, and digital integration with hospital information systems and picture archiving and communication systems (PACS) should be promoted. Senior physicians should regularly review the reports of less experienced colleagues and, as part of specialist training, work toward compliance with standards such as the KV ultrasound agreement.

    New Capabilities of the FLUKA Multi-Purpose Code

    FLUKA is a general-purpose Monte Carlo code able to describe the transport and interaction of any particle and nucleus type in complex geometries over an energy range extending from thermal neutrons to ultrarelativistic hadron collisions. It has many different applications in accelerator design, detector studies, dosimetry, radiation protection, medical physics, and space research. In 2019, CERN and INFN, as FLUKA copyright holders, decided together to end their formal collaboration framework, allowing them henceforth to pursue different pathways aimed at meeting the evolving requirements of the FLUKA user community and at ensuring the long-term sustainability of the code. To this end, CERN set up the FLUKA.CERN Collaboration. This paper illustrates the physics processes that have been newly released or are currently being implemented in the code distributed by the FLUKA.CERN Collaboration under new licensing conditions that are meant to further facilitate access to the code, as well as intercomparisons. The description of coherent effects experienced by high-energy hadron beams in crystal devices, relevant to promising beam manipulation techniques, and charged-particle tracking in vacuum regions subject to an electric field, which addresses a previous limitation, have already been made available to users. Other features, namely the different kinds of low-energy deuteron interactions as well as synchrotron radiation emission during charged-particle transport in vacuum regions subject to magnetic fields, are currently undergoing systematic testing and benchmarking prior to release. FLUKA is widely used to evaluate radiobiological effects, with the powerful support of the Flair graphical interface, whose new generation (available at http://flair.cern) now offers additional capabilities, e.g., advanced 3D visualization with photorealistic rendering and support for industry-standard volume visualization of medical phantoms. FLUKA has also been playing an extensive role in the characterization of radiation environments in which electronics operate. In parallel, it has been used to evaluate the response of electronics to a variety of conditions not included in radiation testing guidelines and standards for space and accelerators, and not accessible through conventional ground-level testing. Instructive results have been obtained from Single Event Effects (SEE) simulations and, where possible, benchmarks for various radiation types and energies. The code has reached a high level of maturity, from which the FLUKA.CERN Collaboration is planning a substantial evolution of its present architecture. Moving towards a modern programming language will make it possible to overcome fundamental constraints that have limited development options. Our long-term goal, in addition to improving and extending the physics performance with even more rigorous scientific oversight, is to modernize the code's structure so as to integrate independent contributions more easily and to formalize quality assurance through state-of-the-art software deployment techniques. This includes a continuous integration pipeline to automatically validate the codebase, as well as automatic processing and analysis of a tailored suite of physics test cases. With regard to these objectives, several paths are currently envisaged, such as finding synergies with Geant4 at both the core-structure and interface level, thereby offering the user the possibility to run different Monte Carlo codes with the same input and cross-check the results.

    Morphometric characterization of the Epaminondas watershed using SRTM data in SWAT+ and GRASS GIS

    Morphometric characterization helps in understanding the dynamics of water within the catchment area of a watershed, generating information that supports the management of water resources. The Epaminondas sub-basin is the largest contributing area of the Santa Barbara reservoir, which provides a significant share of the drinking water supply of the municipality of Pelotas-RS, making it a target of studies aimed at protecting and understanding the region. The objective was to determine the morphometric characteristics of the basin, using the equations related to basin shape and to the drainage network, based on data produced by hydrological models and compared with the data provided by FEPAM-RS. QGIS was used, and the hydrological models were SWAT+ and GRASS GIS, which automatically generated the drainage network and delimited the basin. The equations used were the form factor, conformity index, drainage density, and mean overland flow length. Relative to the data provided by FEPAM-RS, the model whose drainage segments came closest to the official source in terms of area was SWAT+. The differences with respect to the FEPAM-RS mapping may be related to the DEM source used and to the predominance of flat and gently undulating areas with small elevation differences, which can lead to differences in how the models interpret the terrain. The theoretical results of the equations were similar, indicating that the basin has a low tendency to generate floods and has regular drainage owing to its gently rolling relief. XXV SBRH
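
    As a rough illustration of the shape and drainage-network equations named in the abstract, the sketch below uses commonly cited definitions (Horton's form factor A/L², drainage density L_t/A, and mean overland flow length 1/(4·D_d)); the conformity index is omitted, and all numeric inputs are hypothetical, not values from the study.

    def form_factor(area_km2: float, axial_length_km: float) -> float:
        """Horton's form factor Kf = A / L^2 (dimensionless)."""
        return area_km2 / axial_length_km**2

    def drainage_density(total_channel_length_km: float, area_km2: float) -> float:
        """Dd = total channel length / basin area (km per km^2)."""
        return total_channel_length_km / area_km2

    def mean_overland_flow_length(dd_km_per_km2: float) -> float:
        """l = 1 / (4 * Dd): average distance runoff travels before reaching a channel (km)."""
        return 1.0 / (4.0 * dd_km_per_km2)

    # Hypothetical basin dimensions (the abstract does not report them):
    A, L, Lt = 45.0, 12.0, 60.0   # area (km^2), axial length (km), total channel length (km)
    dd = drainage_density(Lt, A)
    print(f"Kf = {form_factor(A, L):.2f}, Dd = {dd:.2f} km/km^2, l = {mean_overland_flow_length(dd):.2f} km")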

    Positive biodiversity-productivity relationship predominant in global forests

    The biodiversity-productivity relationship (BPR) is foundational to our understanding of the global extinction crisis and its impacts on ecosystem functioning. Understanding BPR is critical for the accurate valuation and effective conservation of biodiversity. Using ground-sourced data from 777,126 permanent plots, spanning 44 countries and most terrestrial biomes, we reveal a globally consistent positive concave-down BPR, showing that continued biodiversity loss would result in an accelerating decline in forest productivity worldwide. The value of biodiversity in maintaining commercial forest productivity alone - US$166 billion to 490 billion per year according to our estimation - is more than twice what it would cost to implement effective global conservation. This highlights the need for a worldwide reassessment of biodiversity values, forest management strategies, and conservation priorities. Peer Reviewed
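
    To make the "accelerating decline" point concrete: with a positive, concave-down BPR, each successive loss of species removes more productivity than the previous one. A minimal sketch, assuming a power-law relation P ∝ S^θ with 0 < θ < 1 purely for illustration (the abstract does not specify the fitted functional form):

    # Illustrative concave-down biodiversity-productivity relationship: P(S) = S**theta, 0 < theta < 1.
    theta = 0.3  # hypothetical exponent; any value in (0, 1) gives a positive, concave-down curve

    def relative_productivity(relative_richness: float) -> float:
        """Productivity relative to the intact state, as a function of the remaining species fraction."""
        return relative_richness**theta

    for remaining in (1.0, 0.9, 0.8, 0.7):
        print(f"{remaining:.0%} of species -> {relative_productivity(remaining):.1%} of productivity")
    # Each further 10% loss of species costs more productivity than the one before it.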

    Benchmarking LHC background particle simulation with the CMS triple-GEM detector

    In 2018, a system of large-size triple-GEM demonstrator chambers was installed in the CMS experiment at CERN's Large Hadron Collider (LHC). The demonstrator's design mimics that of the final detector, installed for Run-3. A successful Monte Carlo (MC) simulation of the collision-induced background hit rate in this system in proton-proton collisions at 13 TeV is presented. The MC predictions are compared to CMS measurements recorded at an instantaneous luminosity of 1.5 × 10^{34} cm^{-2} s^{-1}. The simulation framework uses a combination of the FLUKA and GEANT4 packages. FLUKA simulates the radiation environment around the GE1/1 chambers. The particle flux from FLUKA covers energy spectra ranging from 10^{-11} to 10^{4} MeV for neutrons, 10^{-3} to 10^{4} MeV for γ's, 10^{-2} to 10^{4} MeV for e^{±}, and 10^{-1} to 10^{4} MeV for charged hadrons. GEANT4 provides an estimate of the detector response (sensitivity) based on an accurate description of the detector geometry, the material composition, and the interaction of particles with the detector layers. The detector hit rate, as obtained from the simulation using FLUKA and GEANT4, is estimated as a function of the perpendicular distance from the beam line and agrees with data within the assigned uncertainties in the range 13.7-14.5%. This simulation framework can be used to obtain a reliable estimate of the background rates expected at the High Luminosity LHC.
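
    The rate estimate described here amounts to folding the FLUKA-predicted particle flux with the GEANT4-derived detector sensitivity for each particle species and summing over species. A minimal sketch of that folding, with made-up binned spectra and sensitivities (the actual FLUKA/GEANT4 outputs are detailed histograms, not the toy arrays used here):

    import numpy as np

    def hit_rate_hz_per_cm2(flux_per_cm2_s: np.ndarray, sensitivity: np.ndarray) -> float:
        """Fold a binned particle flux (Hz/cm^2 per energy bin) with a dimensionless
        per-bin detection sensitivity and sum over bins."""
        return float(np.sum(flux_per_cm2_s * sensitivity))

    # Hypothetical binned fluxes (Hz/cm^2 per bin) and per-bin sensitivities for two species;
    # none of these numbers come from the paper.
    flux = {
        "neutron": np.array([5e3, 2e3, 8e2]),
        "photon":  np.array([1e4, 4e3, 1e3]),
    }
    sens = {
        "neutron": np.array([1e-3, 2e-3, 3e-3]),  # GEANT4-style sensitivity estimates (assumed)
        "photon":  np.array([5e-3, 6e-3, 8e-3]),
    }

    total = sum(hit_rate_hz_per_cm2(flux[p], sens[p]) for p in flux)
    print(f"total background hit rate ~ {total:.1f} Hz/cm^2")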

    Modeling the triple-GEM detector response to background particles for the CMS Experiment

    An estimate of environmental background hit rate on triple-GEM chambers is performed using Monte Carlo (MC) simulation and compared to data taken by test chambers installed in the CMS experiment (GE1/1) during Run-2 at the Large Hadron Collider (LHC). The hit rate is measured using data collected with proton-proton collisions at 13 TeV and a luminosity of 1.5×10^{34} cm^{-2} s^{-1}. The simulation framework uses a combination of the FLUKA and Geant4 packages to obtain the hit rate. FLUKA provides the radiation environment around the GE1/1 chambers, which is comprised of the particle flux with momentum direction and energy spectra ranging from 10^{-11} to 10^{4} MeV for neutrons, 10^{-3} to 10^{4} MeV for γ's, 10^{-2} to 10^{4} MeV for e^{±}, and 10^{-1} to 10^{4} MeV for charged hadrons. Geant4 provides an estimate of detector response (sensitivity) based on an accurate description of detector geometry, material composition and interaction of particles with the various detector layers. The MC simulated hit rate is estimated as a function of the perpendicular distance from the beam line and agrees with data within the assigned uncertainties of 10-14.5%. This simulation framework can be used to obtain a reliable estimate of background rates expected at the High Luminosity LHC. Comment: 16 pages, 9 figures, 6 tables