Synthesis of Biodegradable Star-Shaped Polymers Using Bio-Based Polyols via Ring-Opening Polymerization of β-Butyrolactone with Amido-Oxazolinate Zinc Catalysts
A new series of star-shaped and highly branched poly(β-hydroxybutyrates) (PHBs) with distinct structures was synthesized by ring-opening polymerization (ROP) of β-butyrolactone (BBL) with amido-oxazolinate zinc catalysts. The ROP of BBL using multifunctional hydroxyl-terminated initiators is an efficient methodology that allowed the preparation of PHBs with well-controlled molecular weights (Mn) and narrow dispersities (Đ) as well as well-defined end-functional groups. Furthermore, modification of the star-shaped and highly branched PHBs gave access to various architectures with properties tailored for many applications, particularly in the biomedical field. The star-shaped structures were constructed via a core-first approach, in which multifunctional alcohols serve as the initiators for the ROP. Specifically, three-, four-, and multi-armed star polymers were obtained and thoroughly characterized by nuclear magnetic resonance (NMR) spectroscopy and gel permeation chromatography (GPC), and their thermal and mechanical properties were investigated by thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC). These properties were compared with those of linear PHBs, since branched architectures are expected to behave differently. Moreover, the impact of arm number and arm length on the thermal properties of the obtained star-shaped PHBs was studied; these properties were found to depend strongly on both.
Getting High: High Fidelity Simulation of High Granularity Calorimeters with High Speed
Accurate simulation of physical processes is crucial for the success of modern particle physics. However, simulating the development and interaction of particle showers with calorimeter detectors is a time consuming process and drives the computing needs of large experiments at the LHC and future colliders. Recently, generative machine learning models based on deep neural networks have shown promise in speeding up this task by several orders of magnitude. We investigate the use of a new architecture -- the Bounded Information Bottleneck Autoencoder -- for modelling electromagnetic showers in the central region of the Silicon-Tungsten calorimeter of the proposed International Large Detector. Combined with a novel second post-processing network, this approach achieves an accurate simulation of differential distributions, including for the first time the shape of the minimum-ionizing-particle peak, compared to a full GEANT4 simulation for a high-granularity calorimeter with 27k simulated channels. The results are validated by comparing to established architectures. Our results further strengthen the case of using generative networks for fast simulation and demonstrate that physically relevant differential distributions can be described with high accuracy.
Comment: 17 pages, 12 figures
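As a rough illustration of the generate-then-refine pattern described in this abstract (a latent-variable generator whose output is corrected by a second post-processing network), the following PyTorch sketch may help. It is not the BIB-AE implementation: all layer widths, the 30x30x30 cell grid, and the 50 GeV conditioning value are assumptions made purely for illustration.

```python
# Minimal sketch, assuming PyTorch: a decoder conditioned on the incident energy,
# followed by a small post-processing network that refines the per-cell energies
# (e.g. to recover features such as the MIP peak). Sizes are illustrative only.
import torch
import torch.nn as nn

LATENT_DIM = 24
SHOWER_CELLS = 30 * 30 * 30  # hypothetical flattened cell grid, not the ILD geometry

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT_DIM + 1, 512), nn.ReLU(),
                                 nn.Linear(512, SHOWER_CELLS), nn.ReLU())

    def forward(self, z, energy):
        # Condition on the incident energy by concatenating it to the latent vector
        return self.net(torch.cat([z, energy], dim=-1))

class PostProcessor(nn.Module):
    """Refines the generated per-cell energies in a second, separate step."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(SHOWER_CELLS + 1, 512), nn.ReLU(),
                                 nn.Linear(512, SHOWER_CELLS), nn.ReLU())

    def forward(self, shower, energy):
        return self.net(torch.cat([shower, energy], dim=-1))

decoder, post = Decoder(), PostProcessor()
energy = torch.full((8, 1), 50.0)            # incident energy in GeV (example value)
z = torch.randn(8, LATENT_DIM)               # latent noise for 8 showers
showers = post(decoder(z, energy), energy)   # generate, then refine
print(showers.shape)                         # torch.Size([8, 27000])
```

In the actual paper the networks operate on the full 3D cell structure and the training involves adversarial and latent-regularisation terms; the sketch only shows the data flow at generation time.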
CaloClouds II: Ultra-Fast Geometry-Independent Highly-Granular Calorimeter Simulation
Fast simulation of the energy depositions in high-granular detectors is needed for future collider experiments with ever-increasing luminosities. Generative machine learning (ML) models have been shown to speed up and augment the traditional simulation chain in physics analysis. However, the majority of previous efforts were limited to models relying on fixed, regular detector readout geometries. A major advancement is the recently introduced CaloClouds model, a geometry-independent diffusion model, which generates calorimeter showers as point clouds for the electromagnetic calorimeter of the envisioned International Large Detector (ILD).

In this work, we introduce CaloClouds II, which features a number of key improvements. This includes continuous-time score-based modelling, which allows for 25-step sampling with comparable fidelity to CaloClouds while yielding a speed-up over Geant4 on a single CPU as well as over CaloClouds. We further distill the diffusion model into a consistency model, allowing for accurate sampling in a single step and resulting in an additional speed-up, both over Geant4 and over CaloClouds. This constitutes the first application of consistency distillation for the generation of calorimeter showers.
Comment: 30 pages, 7 figures, 3 tables, code available at https://github.com/FLC-QU-hep/CaloClouds-
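The practical point of the consistency distillation mentioned above is that a full shower point cloud is produced by a single network evaluation instead of an iterative denoising loop. A minimal PyTorch sketch of that single-step sampling pattern follows; the network, the per-point features, the noise scale and the conditioning value are all illustrative assumptions, not the CaloClouds II code (which is available at the repository linked in the abstract).

```python
# Minimal sketch, assuming PyTorch: single-step sampling with a consistency-style
# model that maps a noisy point cloud at the maximal noise level directly to a
# clean shower. A real model would use a permutation-equivariant architecture;
# a simple per-point MLP stands in here purely for illustration.
import torch
import torch.nn as nn

N_POINTS, FEATURES = 1000, 4   # (x, y, z, energy) per point; illustrative sizes
SIGMA_MAX = 80.0               # assumed maximal noise scale of the diffusion process

class ConsistencyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(FEATURES + 2, 128), nn.ReLU(),
                                 nn.Linear(128, FEATURES))

    def forward(self, noisy, sigma, energy):
        n = noisy.shape[0]
        # Condition every point on the noise level and the incident energy
        cond = torch.cat([noisy,
                          torch.full((n, 1), sigma),
                          torch.full((n, 1), energy)], dim=-1)
        return self.net(cond)

model = ConsistencyNet()
noise = SIGMA_MAX * torch.randn(N_POINTS, FEATURES)   # start from pure noise
shower = model(noise, SIGMA_MAX, 50.0)                # one forward pass, one step
print(shower.shape)                                   # torch.Size([1000, 4])
```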
New Angles on Fast Calorimeter Shower Simulation
The demands placed on computational resources by the simulation requirements of high energy physics experiments motivate the development of novel simulation tools. Machine learning based generative models offer a solution that is both fast and accurate. In this work we extend the Bounded Information Bottleneck Autoencoder (BIB-AE) architecture, designed for the simulation of particle showers in highly granular calorimeters, in two key directions. First, we generalise the model to a multi-parameter conditioning scenario, while retaining a high degree of physics fidelity. In a second step, we perform a detailed study of the effect of applying a state-of-the-art particle flow-based reconstruction procedure to the generated showers. We demonstrate that the performance of the model remains high after reconstruction. These results are an important step towards creating a more general simulation tool, where maintaining physics performance after reconstruction is the ultimate target.
Comment: 26 pages, 19 figures
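The multi-parameter conditioning described above amounts to feeding several shower parameters, such as incident energy and angle, into the generator together with the latent noise. A minimal, purely illustrative PyTorch sketch of that pattern follows; the layer sizes, normalisations and parameter values are assumptions, not taken from the paper.

```python
# Minimal sketch, assuming PyTorch: conditioning a generator on two parameters
# (incident energy and polar angle) by concatenating them to the latent vector.
# All sizes, normalisations and values below are illustrative assumptions.
import torch
import torch.nn as nn

LATENT_DIM, N_CELLS = 24, 27000   # 27k channels as in the abstract above

decoder = nn.Sequential(nn.Linear(LATENT_DIM + 2, 512), nn.ReLU(),
                        nn.Linear(512, N_CELLS), nn.ReLU())

z = torch.randn(4, LATENT_DIM)                           # latent noise for 4 showers
energy = torch.tensor([[20.0], [50.0], [80.0], [90.0]])  # GeV, example values
theta = torch.tensor([[85.0], [90.0], [40.0], [60.0]])   # degrees, example values
cond = torch.cat([z, energy / 100.0, theta / 90.0], dim=-1)  # crude normalisation
showers = decoder(cond)
print(showers.shape)   # torch.Size([4, 27000]): one shower per conditioning point
```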
The timeframes of works council consultation
The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user-defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measurement directions, local-to-global coordinate transformations, and material properties. The material properties, essential for the correct treatment of multiple scattering and energy loss effects in charged particle reconstruction, are automatically averaged from the detailed geometry model along the normal of the surface. Additionally, a generic interface allows the user to query material properties at any given point or between any two points in the detector's world volume. In this paper we will present DDREC and how it is used together with the linear collider tracking software and the particle-flow package PANDORAPFA for full event reconstruction of the ILC detector concepts ILD and SiD, and of CLICdp. This flexible tool chain is also well suited for other future accelerator projects such as FCC and CEPC.
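To make the material-averaging idea in this abstract more concrete, here is a small, self-contained Python sketch of averaging a radiation length between two points by stepping along the connecting line. It deliberately does not use the DD4HEP/DDREC API; the toy material lookup and the step count are assumptions for illustration only.

```python
# Minimal sketch (not the DD4HEP/DDREC API): the path between two points is sampled
# in small steps and the radiation length is combined as a thickness-weighted
# harmonic mean, as one would do when accumulating material for multiple-scattering
# and energy-loss estimates.
import numpy as np

def material_at(point):
    """Toy lookup: silicon-like layer for |z| < 1 mm, air elsewhere (X0 in mm)."""
    return 93.7 if abs(point[2]) < 1.0 else 3.04e5

def averaged_x0(p0, p1, n_steps=1000):
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    step = (p1 - p0) / n_steps
    length = np.linalg.norm(p1 - p0)
    # Accumulate (segment length / X0) over the sampled segments
    nx0 = sum(np.linalg.norm(step) / material_at(p0 + (i + 0.5) * step)
              for i in range(n_steps))
    return length / nx0   # effective averaged radiation length

print(averaged_x0((0, 0, -5), (0, 0, 5)))   # dominated by the thin silicon layer
```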
Radio Galaxy Classification with wGAN-Supported Augmentation
Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical sources in images is challenging. Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data. Here, we demonstrate the use of generative models, specifically Wasserstein GANs (wGAN), to generate artificial data for different classes of radio galaxies. Subsequently, we augment the training data with images from our wGAN. We find that a simple fully-connected neural network for classification can be improved significantly by including generated images into the training set.
Comment: 10 pages, 6 figures; accepted to ml.astro; v2: matches published version
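The augmentation step described above is conceptually simple: images sampled from the generator are added to the labelled training set before the classifier is trained. The PyTorch sketch below shows only that data flow; the (untrained) generator, image size, class count and label assignment are stand-ins chosen for illustration, not the setup of the paper.

```python
# Minimal sketch, assuming PyTorch: combine real labelled images with images drawn
# from a generator, then train a simple fully-connected classifier on the union.
# All sizes and the random stand-in data are illustrative assumptions.
import torch
import torch.nn as nn

N_REAL, N_GEN, IMG, N_CLASSES = 500, 500, 64, 4   # illustrative sizes

generator = nn.Sequential(nn.Linear(100, 512), nn.ReLU(),
                          nn.Linear(512, IMG * IMG), nn.Tanh())

real_images = torch.rand(N_REAL, IMG * IMG)            # stand-in for real, labelled data
real_labels = torch.randint(0, N_CLASSES, (N_REAL,))

with torch.no_grad():                                  # sample artificial images
    gen_images = generator(torch.randn(N_GEN, 100))
gen_labels = torch.randint(0, N_CLASSES, (N_GEN,))     # class requested from the generator

# Augmented training set for the fully-connected classifier
images = torch.cat([real_images, gen_images])
labels = torch.cat([real_labels, gen_labels])
classifier = nn.Sequential(nn.Linear(IMG * IMG, 256), nn.ReLU(),
                           nn.Linear(256, N_CLASSES))
loss = nn.CrossEntropyLoss()(classifier(images), labels)
print(loss.item())   # one training step's loss on the augmented set
```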
LCIO: A persistency framework and event data model for HEP
Abstract not provided.
Towards podio v1.0 - A first stable release of the EDM toolkit
A performant and easy-to-use event data model (EDM) is a key component of any HEP software stack. The podio EDM toolkit provides a user-friendly way of generating such a performant implementation in C++ from a high-level description in YAML format. With a few important developments being finalized, we are in the final stretch towards release v1.0 of podio, a stable release with backward compatibility for data files written with podio from then on. We present an overview of the podio basics and go into slightly more technical detail on the most important topics and developments. These include schema evolution for generated EDMs, multithreading with podio-generated EDMs and their implementation, as well as the basics of I/O. Using EDM4hep, the common and shared EDM of the Key4hep project, we highlight a few of the smaller features in action as well as some lessons learned during the development of EDM4hep and podio. Finally, we show how podio has been integrated into the Gaudi-based event processing framework used by Key4hep, before we conclude with a brief outlook on potential developments after v1.0.
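To illustrate the generation step described in the abstract (a C++ data model produced from a high-level YAML description), here is a small Python sketch that turns a podio-style datatype definition into a plain C++ struct. The YAML layout is only loosely modelled on podio's schema, and the emitted code is a tiny fraction of what the real generator produces (handles, collections, I/O and schema-evolution support), so treat names and structure as assumptions.

```python
# Minimal sketch of the idea behind generating C++ from a YAML data model
# description. The layout is podio-inspired, not necessarily the exact schema.
import yaml

description = """
datatypes:
  ExampleHit:
    Description: "A toy calorimeter hit"
    Members:
      - double energy    // energy deposit [GeV]
      - int    cellID    // detector cell identifier
"""

model = yaml.safe_load(description)
for name, spec in model["datatypes"].items():
    lines = [f"// {spec['Description']}", f"struct {name} {{"]
    for member in spec["Members"]:
        decl, comment = (member.split("//") + [""])[:2]
        lines.append(f"  {decl.strip()}; // {comment.strip()}")
    lines.append("};")
    print("\n".join(lines))   # emits a plain C++ struct per datatype
```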
HEP Community White Paper on Software trigger and event reconstruction
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.
Comment: Editors Vladimir Vava Gligorov and David Lang