
    Validation of Deep Convolutional Generative Adversarial Networks for High Energy Physics Calorimeter Simulations

    In particle physics, the simulation of particle transport through detectors requires an enormous amount of computational resources, utilizing more than 50% of the resources of the CERN Worldwide LHC Computing Grid. This challenge has motivated the investigation of different, faster approaches for replacing the standard Monte Carlo simulations. Deep Learning Generative Adversarial Networks are among the most promising alternatives. Previous studies showed that they achieve the necessary level of accuracy while decreasing the simulation time by orders of magnitude. In this paper we present a newly developed neural network architecture which reproduces a three-dimensional problem employing 2D convolutional layers, and we compare its performance with an earlier architecture consisting of 3D convolutional layers. The performance evaluation relies on direct comparison to Monte Carlo simulations in terms of the physics quantities usually employed to quantify the detector response. We show that our new neural network architecture reaches a higher level of accuracy than the 3D convolutional GAN while reducing the necessary computational resources. Calorimeters are among the most expensive detectors in terms of simulation time; we therefore focus our study on an electromagnetic calorimeter prototype with a regular, highly granular geometry, as an example of future calorimeters.
    Comment: AAAI-MLPS 2021 Spring Symposium at Stanford University
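
    The abstract does not spell the architecture out, but the core idea of reproducing a 3D shower with 2D convolutions can be sketched as follows: one spatial axis is mapped onto the channel dimension, so that cheaper 2D operations still produce a full 3D energy grid. This is a minimal illustrative sketch, not the authors' network; the 25x25x25 grid, latent size, and layer widths are assumptions.

```python
# Minimal sketch (not the paper's exact architecture): a generator that
# produces a 3D energy-deposition grid using only 2D convolutions, by
# mapping one spatial axis onto the channel dimension. Grid size and
# layer widths are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

LATENT_DIM = 100        # size of the random input vector (assumption)
GRID = 25               # target calorimeter grid: GRID x GRID x GRID cells

def build_generator():
    z = layers.Input(shape=(LATENT_DIM,))
    x = layers.Dense(7 * 7 * 64, activation="relu")(z)
    x = layers.Reshape((7, 7, 64))(x)
    # 2D transposed convolutions upsample the two transverse axes ...
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Cropping2D(((2, 1), (2, 1)))(x)          # 28x28 -> 25x25
    # ... while the channel axis plays the role of the third (depth) axis.
    x = layers.Conv2D(GRID, 3, padding="same", activation="relu")(x)
    y = layers.Reshape((GRID, GRID, GRID))(x)           # full 3D energy grid
    return tf.keras.Model(z, y, name="conv2d_generator")

gen = build_generator()
fake_shower = gen(tf.random.normal((1, LATENT_DIM)))    # shape (1, 25, 25, 25)
```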

    Phylogenetic relationships of the Cumacea (Crustacea: Peracarida) and genetic variability of two Antarctic species of the family Leuconidae

    Phylogenetic hypotheses for the peracarid order Cumacea are scarce and have not fully resolved the group's relationships. In the present study, a fragment of the mitochondrial 16S rDNA was used to construct a phylogenetic hypothesis for three cumacean families, Diastylidae, Bodotriidae and Leuconidae, along with the intra-family relationships of the latter. The Cumacea resolved as monophyletic, with tanaids and isopods as outgroup taxa. The Diastylidae were the only family with good support for monophyly. The genus Leucon resolved as paraphyletic, whereas the subgenus Crymoleucon was monophyletic. Furthermore, the genetic structure was analysed for two leuconid species, Leucon antarcticus Zimmer, 1907 and L. intermedius Mühlenhardt-Siegel, 1996, from the Weddell Sea and the Ross Sea. The two species showed different patterns of intraspecific genetic variability: in contrast to L. intermedius, a bimodal distribution of pairwise genetic distances was observed for L. antarcticus, which correlates with geographical and depth distributions between the Ross Sea and the Weddell Sea. Although a clear evaluation of cryptic speciation in these species requires additional work on more specimens from more geographic regions and broader depth ranges, the differences shown in the 16S rDNA sequences can only be explained by genetic separation of the Weddell Sea and Ross Sea populations over an extended period of time.
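
    The bimodality finding rests on the distribution of pairwise genetic distances between sequences. As a rough illustration of that quantity (not the authors' pipeline, which would typically use dedicated phylogenetics software), here is a sketch computing uncorrected p-distances from aligned fragments; the input sequences are invented toy data.

```python
# Illustrative sketch only: computing the distribution of pairwise
# (uncorrected p-) distances from aligned 16S fragments, the quantity
# whose bimodal shape is reported for L. antarcticus. The sequences
# below are toy data, not the study's alignments.
from itertools import combinations

def p_distance(a, b):
    """Fraction of differing sites, ignoring alignment gaps."""
    pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    return sum(x != y for x, y in pairs) / len(pairs)

def pairwise_distances(seqs):
    """All pairwise p-distances for a list of aligned sequences."""
    return [p_distance(a, b) for a, b in combinations(seqs, 2)]

# A bimodal histogram of these values would suggest two genetically
# separated groups of haplotypes.
seqs = ["ACGTACGTAC", "ACGTACGTAT", "TCGAACGTCC", "TCGAACGTCA"]
print(sorted(round(d, 2) for d in pairwise_distances(seqs)))
```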

    Precise Image Generation on Current Noisy Quantum Computing Devices

    The Quantum Angle Generator (QAG) is a new fully quantum machine learning model designed to generate accurate images on current Noisy Intermediate-Scale Quantum (NISQ) devices. Variational quantum circuits form the core of the QAG model, and various circuit architectures are evaluated. In combination with the so-called MERA-upsampling architecture, the QAG model achieves excellent results, which are analyzed and evaluated in detail. To our knowledge, this is the first time that a quantum model has achieved such accurate results. To explore the robustness of the model to noise, an extensive quantum noise study is performed. In this paper, it is demonstrated that the model trained on a physical quantum device learns the noise characteristics of the hardware and generates outstanding results. It is verified that even a hardware calibration change of up to 8% during training can be well tolerated. For demonstration, the model is employed in the indispensable high energy physics simulations required to measure particle energies and, ultimately, to discover unknown particles at the Large Hadron Collider at CERN.
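
    As a rough illustration of the variational-circuit idea at the core of such a model (not the actual QAG architecture), the sketch below simulates a two-qubit parameterized circuit in plain numpy and reads out qubit expectation values as pixel intensities; the circuit layout and readout convention are assumptions.

```python
# Minimal numpy sketch of a variational quantum circuit used generatively:
# trainable rotation angles parameterize a small circuit whose qubit
# expectation values are read out as pixel intensities. Illustrative only.
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, basis order |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)

def circuit(thetas):
    """Two-qubit circuit: RY on each qubit, then an entangling CNOT."""
    state = np.zeros(4); state[0] = 1.0                 # start in |00>
    state = np.kron(ry(thetas[0]), ry(thetas[1])) @ state
    return CNOT @ state

def pixel_values(thetas):
    """Read out <Z> of each qubit and rescale to [0, 1] as 'pixels'."""
    probs = circuit(thetas) ** 2                        # measurement probabilities
    z0 = probs[0] + probs[1] - probs[2] - probs[3]      # <Z> on qubit 0
    z1 = probs[0] - probs[1] + probs[2] - probs[3]      # <Z> on qubit 1
    return (1 - np.array([z0, z1])) / 2

print(pixel_values(np.array([0.3, 1.1])))               # two generated "pixels"
```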

    Restoration of 1325 teeth with partial-coverage crowns manufactured from high noble metal alloys: a retrospective case series 18.8 years after prosthetic delivery

    Objectives: To evaluate the long-term survival and success rates of conventionally cemented partial-coverage crowns (PCCs) manufactured from high noble metal alloys (hn). Materials and methods: Restoration-, periodontal- and tooth-related criteria of patients restored with one or more conventionally cemented hnPCCs in a private dental office were collected from existing patient records. With regard to the semi-annual follow-ups, data from the most recent clinical evaluations were considered. Kaplan-Meier and log-rank tests were used for statistical analyses. The level of significance was set at p ≤ .05. Results: Between 09/1983 and 09/2009, 1325 hnPCCs were conventionally cemented on 1325 teeth in 266 patients (mean age: 44.5 ± 10.7 years). In total, 81 hnPCCs showed complications of various kinds, corresponding to a success rate of 93.9% after a mean observation period of 18.8 ± 5.7 years. Of these, an additional 14 restorations were counted as surviving, resulting in a survival rate of 94.9%. The most frequent complications were periodontal issues (n = 29, 35.8%). Significantly higher success rates were documented for hnPCCs in patients aged between 37 and 51 years (p = .012). Conclusion: Partial-coverage crowns made from high noble metal alloys showed excellent survival and success rates after a mean observation period of 18.8 ± 5.7 years. Higher patient age was one of the risk factors. Clinical relevance: According to the results of this study, hnPCCs still represent an excellent therapeutic option, even in modern dentistry.
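
    For readers unfamiliar with the statistics used here, the following sketch shows a Kaplan-Meier estimate and a log-rank comparison between two age groups using the lifelines package; the data below are invented toy values, not the study's records.

```python
# Sketch of the survival analysis described above (Kaplan-Meier estimate
# plus a log-rank comparison between two age groups) with the lifelines
# package. All data here are invented for illustration.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
years = rng.exponential(20.0, size=60)           # observation time per crown (toy)
failed = rng.random(60) < 0.3                    # True = complication occurred
mid_age = rng.random(60) < 0.5                   # True = patient aged 37-51

kmf = KaplanMeierFitter()
kmf.fit(years, event_observed=failed)
print(kmf.survival_function_.tail())             # estimated success over time

# Compare the two age groups, analogous to the reported p = .012 finding.
res = logrank_test(years[mid_age], years[~mid_age],
                   event_observed_A=failed[mid_age],
                   event_observed_B=failed[~mid_age])
print(f"log-rank p-value: {res.p_value:.3f}")
```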

    Reduced Precision Strategies for Deep Learning: A High Energy Physics Generative Adversarial Network Use Case

    Deep learning is finding its way into high energy physics by replacing traditional Monte Carlo simulations. However, deep learning still requires an excessive amount of computational resources. A promising approach to making deep learning more efficient is to quantize the parameters of the neural networks to reduced precision. Reduced precision computing is extensively used in modern deep learning and results in lower inference time, a smaller memory footprint and lower memory bandwidth requirements. In this paper we analyse the effects of low precision inference on a complex deep generative adversarial network model. The use case we are addressing is calorimeter detector simulation of subatomic particle interactions in accelerator-based high energy physics. We employ the novel Intel low precision optimization tool (iLoT) for quantization and compare the results to the quantized model from TensorFlow Lite. In the performance benchmark we gain a speed-up of 1.73x on Intel hardware for the quantized iLoT model compared to the initial, unquantized model. Using different self-developed, physics-inspired metrics, we validate that the quantized iLoT model shows a lower loss of physical accuracy than the TensorFlow Lite model.
    Comment: Submitted to ICPRAM 2021; from the CERN openlab - Intel collaboration
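
    The TensorFlow Lite side of such a comparison can be sketched with standard post-training quantization; the tiny model below is a stand-in for illustration only (the paper quantizes a GAN, and iLoT itself is not shown here).

```python
# Hedged sketch of TensorFlow Lite post-training quantization of a trained
# Keras model. The stand-in model replaces the paper's GAN generator.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(100,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(625, activation="relu"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # dynamic-range (8-bit weight) quantization
tflite_model = converter.convert()

# Run the quantized model through the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], tf.random.normal((1, 100)).numpy())
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)      # (1, 625)
```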

    Comparison of 6 % hydroxyethyl starch and 5 % albumin for volume replacement therapy in patients undergoing cystectomy (CHART): study protocol for a randomized controlled trial

    Background: The use of artificial colloids is currently controversial, especially in Central Europe. Several studies demonstrated a worse outcome in intensive care unit patients with the use of hydroxyethyl starch, which recently even led to a drug warning about the use of hydroxyethyl starch products in patients admitted to the intensive care unit. The data on hydroxyethyl starch in non-critically ill patients are insufficient to support perioperative use. Methods/Design: We are conducting a single-center, open-label, randomized, comparative trial with two parallel patient groups to compare human albumin 5 % (test drug) with hydroxyethyl starch 6 % 130/0.4 (comparator). The primary endpoint is the cystatin C ratio, calculated as the cystatin C value at day 90 after surgery divided by the preoperative value. Secondary objectives include, inter alia, evaluating the influence of human albumin and hydroxyethyl starch on further laboratory chemistry and clinical parameters, glycocalyx shedding, intensive care unit and hospital stay, and acute kidney injury as defined by the RIFLE criteria (risk of renal dysfunction, injury to the kidney, failure of kidney function, loss of kidney function, and end-stage kidney disease). Discussion: There is a general lack of evidence on the relative safety and effects of hydroxyethyl starch compared with human albumin for volume replacement in a perioperative setting. Previously conducted studies of surgical patients comparing different hydroxyethyl starch products included too few patients to properly evaluate clinically important outcomes such as renal function. In the present study, in a high-risk patient population undergoing a major surgical intervention, we will determine whether perioperative fluid replacement with human albumin 5 % has a long-term advantage over a third-generation hydroxyethyl starch 130/0.4 with regard to the progression of renal dysfunction up to 90 days after surgery.
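
    The primary endpoint is a simple per-patient quotient; a minimal sketch of the calculation, with illustrative values only:

```python
# Minimal sketch of the primary endpoint as defined above: the cystatin C
# ratio of the day-90 value to the preoperative value per patient.
# The example values are illustrative, not trial data.

def cystatin_c_ratio(preoperative_mg_l: float, day90_mg_l: float) -> float:
    """Ratio > 1 indicates a rise in cystatin C (worsening renal function)."""
    return day90_mg_l / preoperative_mg_l

print(cystatin_c_ratio(0.95, 1.14))   # 1.2 -> a 20% increase over baseline
```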

    The Meta-data-Database of a Next Generation Sustainability Web-Platform for Language Resources

    Our goal is to provide a web-based platform for the long-term preservation and distribution of a heterogeneous collection of linguistic resources. We discuss the corpus preprocessing and normalisation phase, which results in sets of multi-rooted trees. At the same time, we transform the original metadata records, like the corpora themselves (which are annotated using different annotation approaches and exhibit different levels of granularity), into the all-encompassing and highly flexible eTEI format, for which we present editing and parsing tools. We also discuss the architecture of the sustainability platform. Its primary components are an XML database that contains the corpus and metadata files and an SQL database that contains user accounts and access control lists. A staging area, whose structure, contents, and consistency can be checked using tools, is used to make sure that new resources about to be imported into the platform have the correct structure.
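
    As an illustration of the kind of staging-area consistency check described (the platform's actual tools are not specified here, so the directory layout and file names below are assumptions):

```python
# Illustrative sketch of a staging-area consistency check: verify that each
# corpus directory contains a metadata record and only well-formed XML files
# before import. Layout and file names are assumptions.
import sys
import xml.etree.ElementTree as ET
from pathlib import Path

def check_staging_area(root: Path) -> list[str]:
    """Return a list of problems found under the staging directory."""
    problems = []
    for corpus_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        meta = corpus_dir / "metadata.xml"        # assumed eTEI metadata record
        if not meta.exists():
            problems.append(f"{corpus_dir.name}: missing metadata.xml")
        for xml_file in corpus_dir.glob("*.xml"):
            try:
                ET.parse(xml_file)                # well-formedness check
            except ET.ParseError as err:
                problems.append(f"{xml_file.name}: {err}")
    return problems

if __name__ == "__main__":
    issues = check_staging_area(Path(sys.argv[1]))
    print("\n".join(issues) or "staging area is consistent")
```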