376 research outputs found

    Synthesis of Biodegradable Star-Shaped Polymers Using Bio-Based Polyols via Ring-Opening Polymerization of β-Butyrolactone with Amido-Oxazolinate Zinc Catalysts

    Get PDF
    A new series of star-shaped and highly branched poly(β-hydroxybutyrates) (PHBs) with distinct structures was synthesized by ring-opening polymerization (ROP) of β-butyrolactone (BBL) with amido-oxazolinate zinc catalysts. ROP of BBL using multifunctional hydroxyl-terminated initiators is an efficient methodology that allowed the preparation of PHBs with well-controlled molecular weights (Mn) and narrow dispersities (Đ), as well as well-defined end-functional groups. Furthermore, modification of the star-shaped and highly branched PHBs produced various architectures with properties tailored to many applications, particularly in the biomedical field. The star-shaped structures were constructed via a core-first approach, in which multifunctional alcohols serve as the initiators for ROP. Specifically, three-, four-, and multi-armed star polymers were obtained and thoroughly characterized by techniques such as nuclear magnetic resonance (NMR) spectroscopy and gel permeation chromatography (GPC), and their thermal properties were investigated by thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC). These properties were compared with those of linear PHBs, since branched architectures are expected to behave differently. Moreover, the thermal properties of the obtained star-shaped PHBs were found to depend strongly on the number and length of their arms.
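    For reference, the dispersity Đ quoted above is the standard IUPAC measure of the breadth of a molecular-weight distribution, the ratio of the weight-average to the number-average molecular weight; values close to 1 indicate the well-controlled polymerization claimed here.

```latex
\text{\DJ} = \frac{M_w}{M_n} \ge 1, \qquad
\text{\DJ} \to 1 \ \text{for a uniform (monodisperse) polymer}
```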

    Getting High: High Fidelity Simulation of High Granularity Calorimeters with High Speed

    Full text link
    Accurate simulation of physical processes is crucial for the success of modern particle physics. However, simulating the development and interaction of particle showers with calorimeter detectors is a time-consuming process and drives the computing needs of large experiments at the LHC and future colliders. Recently, generative machine learning models based on deep neural networks have shown promise in speeding up this task by several orders of magnitude. We investigate the use of a new architecture -- the Bounded Information Bottleneck Autoencoder -- for modelling electromagnetic showers in the central region of the Silicon-Tungsten calorimeter of the proposed International Large Detector. Combined with a novel second post-processing network, this approach achieves an accurate simulation of differential distributions, including for the first time the shape of the minimum-ionizing-particle peak, compared to a full GEANT4 simulation for a high-granularity calorimeter with 27k simulated channels. The results are validated by comparison with established architectures. Our results further strengthen the case for using generative networks for fast simulation and demonstrate that physically relevant differential distributions can be described with high accuracy.
    Comment: 17 pages, 12 figures
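    The two-stage idea described above (a generative network produces a shower, and a second network refines it) can be sketched compactly. The Python below is illustrative only: the real BIB-AE combines GAN- and VAE-style objectives, whereas this sketch uses plain feed-forward stand-ins; all layer sizes, names, and the energy-conditioning scheme are assumptions, with only the 27k channel count taken from the abstract.

```python
import torch
import torch.nn as nn

N_CELLS = 27_000   # flattened calorimeter cells (27k channels, as above)
LATENT = 512       # assumed latent dimensionality
HIDDEN = 256       # assumed hidden width, kept small for the sketch

class Generator(nn.Module):
    """Stand-in for the generative network: latent + energy -> shower."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT + 1, HIDDEN), nn.ReLU(),   # +1 for incident energy
            nn.Linear(HIDDEN, N_CELLS), nn.ReLU(),      # non-negative cell energies
        )

    def forward(self, z, energy):
        return self.net(torch.cat([z, energy], dim=1))

class PostProcessor(nn.Module):
    """Stand-in for the second network that refines per-cell energies,
    e.g. to recover fine structure such as the MIP peak."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_CELLS + 1, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, N_CELLS), nn.ReLU(),
        )

    def forward(self, shower, energy):
        return self.net(torch.cat([shower, energy], dim=1))

gen, post = Generator(), PostProcessor()
z = torch.randn(4, LATENT)             # batch of 4 latent samples
e = torch.full((4, 1), 50.0)           # 50 GeV incident energy (illustrative)
refined = post(gen(z, e), e)           # refined showers, shape (4, N_CELLS)
```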

    New Angles on Fast Calorimeter Shower Simulation

    Full text link
    The demands placed on computational resources by the simulation requirements of high energy physics experiments motivate the development of novel simulation tools. Machine learning based generative models offer a solution that is both fast and accurate. In this work we extend the Bounded Information Bottleneck Autoencoder (BIB-AE) architecture, designed for the simulation of particle showers in highly granular calorimeters, in two key directions. First, we generalise the model to a multi-parameter conditioning scenario, while retaining a high degree of physics fidelity. Second, we perform a detailed study of the effect of applying a state-of-the-art particle-flow-based reconstruction procedure to the generated showers. We demonstrate that the performance of the model remains high after reconstruction. These results are an important step towards creating a more general simulation tool, where maintaining physics performance after reconstruction is the ultimate target.
    Comment: 26 pages, 19 figures
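    The first extension, multi-parameter conditioning, amounts to feeding the generator a conditioning vector rather than a single scalar. A minimal sketch, under the assumption of two conditions (incident energy and polar angle, both invented here, as are all names and shapes):

```python
import torch
import torch.nn as nn

LATENT, N_CELLS, HIDDEN, N_COND = 512, 27_000, 256, 2  # 2 conditions: E, theta

class ConditionalGenerator(nn.Module):
    """Generator conditioned on a vector of parameters, not a single scalar."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT + N_COND, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, N_CELLS), nn.ReLU(),
        )

    def forward(self, z, cond):
        return self.net(torch.cat([z, cond], dim=1))

def normalise(energy_gev, theta_deg):
    # Map both conditioning variables to O(1) ranges before feeding the
    # network; the scale factors are illustrative choices.
    return torch.stack([energy_gev / 100.0, theta_deg / 90.0], dim=1)

gen = ConditionalGenerator()
z = torch.randn(8, LATENT)
cond = normalise(torch.full((8,), 60.0), torch.full((8,), 85.0))
showers = gen(z, cond)   # shape (8, N_CELLS)
```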

    The Timing of Works Council Consultation

    Get PDF
    The DD4hep detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDRec provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDRec there is no need to define an additional, separate reconstruction geometry, as is often done in HEP; instead, the existing detailed simulation model can be transparently extended for use in reconstruction. Based on the extension mechanism of DD4hep, DDRec allows one to attach user-defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high-level view onto the detectors, describing their physical properties such as measurement layers, point resolutions, and cell sizes. For the purpose of charged-particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measurement directions, local-to-global coordinate transformations, and material properties. The material properties, essential for the correct treatment of multiple scattering and energy-loss effects in charged-particle reconstruction, are automatically averaged from the detailed geometry model along the normal of the surface. Additionally, a generic interface allows the user to query material properties at any given point or between any two points in the detector's world volume. In this paper we present DDRec and how it is used together with the linear collider tracking software and the particle-flow package PandoraPFA for full event reconstruction of the ILC detector concepts ILD and SiD, and of CLICdp. This flexible tool chain is also well suited for other future accelerator projects such as FCC and CEPC.
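    The surface concept above can be illustrated with a small sketch. The Python below does not correspond to the real DD4hep/DDRec API (which is C++); every class, field, and numeric value here is an invented stand-in, meant only to show the kind of information a reconstruction-level surface carries and the point-to-point material query the abstract describes.

```python
from dataclasses import dataclass

@dataclass
class Material:
    name: str
    x0_mm: float        # radiation length (illustrative values, mm)
    lambda_mm: float    # nuclear interaction length (illustrative, mm)

@dataclass
class Surface:
    origin: tuple[float, float, float]   # global position of the surface
    u: tuple[float, float, float]        # first measurement direction
    v: tuple[float, float, float]        # second measurement direction
    inner: Material                      # averaged along the inward normal
    outer: Material                      # averaged along the outward normal

# Surfaces attached to detector elements in the geometry hierarchy
# (the path string and numbers are invented for illustration).
detector = {
    "VertexBarrel/layer0": Surface(
        origin=(0.0, 16.0, 0.0),
        u=(1.0, 0.0, 0.0),
        v=(0.0, 0.0, 1.0),
        inner=Material("silicon", 93.7, 465.0),
        outer=Material("carbon", 193.0, 427.0),
    ),
}

def material_between(p0, p1):
    """Stand-in for the generic point-to-point material query.

    A real implementation would average over all volumes crossed
    between p0 and p1 in the detailed geometry model.
    """
    return Material("averaged", 100.0, 450.0)

avg = material_between((0.0, 0.0, 0.0), (0.0, 100.0, 0.0))
```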

    Radio Galaxy Classification with wGAN-Supported Augmentation

    Full text link
    Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical sources in images is challenging. Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data. Here, we demonstrate the use of generative models, specifically Wasserstein GANs (wGAN), to generate artificial data for different classes of radio galaxies. Subsequently, we augment the training data with images from our wGAN. We find that a simple fully-connected neural network for classification can be improved significantly by including generated images into the training set.
    Comment: 10 pages, 6 figures; accepted to ml.astro; v2: matches published version
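    The augmentation step lends itself to a compact sketch. The Python below is illustrative only: wgan_generate stands in for sampling from an already-trained wGAN generator, and the class labels, image size, and sample counts are invented, not taken from the paper.

```python
import numpy as np

def wgan_generate(class_label: int, n: int) -> np.ndarray:
    """Stand-in for drawing n images of one class from a trained wGAN."""
    rng = np.random.default_rng(class_label)
    return rng.random((n, 128, 128))     # placeholder 128x128 "radio images"

def augment(x_real, y_real, n_extra_per_class):
    """Concatenate real training images with wGAN-generated ones per class."""
    xs, ys = [x_real], [y_real]
    for label, n in n_extra_per_class.items():
        xs.append(wgan_generate(label, n))
        ys.append(np.full(n, label))
    return np.concatenate(xs), np.concatenate(ys)

# e.g. two morphological classes, 500 generated images each (invented numbers)
x_train, y_train = augment(
    np.zeros((100, 128, 128)),           # placeholder real images
    np.zeros(100, dtype=int),            # placeholder real labels
    {0: 500, 1: 500},
)
```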

    LCIO: A persistency framework and event data model for HEP

    Full text link
    Abstract not provided.

    HEP Community White Paper on Software trigger and event reconstruction

    Get PDF
    Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.
    Comment: Editors Vladimir Vava Gligorov and David Lange

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Get PDF
    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
    Peer reviewed