
    Value Stream Mapping and Simulation Modelling for Healthcare Transactional Process Improvement

    Lean management philosophy originated in Japan with the Toyota production system. Its central idea is to identify and eliminate waste. The concept of end-to-end value allows organizations to gain competitive advantage by delivering the best quality products and services at minimum operational cost. Today there is much to be gained by applying lean to services and transactional processes. Lean facilitators face challenges when trying to transform an organization into a lean enterprise: what is achievable in production systems is not as straightforward in the services and transactional sectors. Among the challenges in the service sector are complex and mixed value streams, the processing of information and people rather than parts, and the central role of human interaction

    Design and construction of the MicroBooNE Cosmic Ray Tagger system

    The MicroBooNE detector utilizes a liquid argon time projection chamber (LArTPC) with an 85 t active mass to study neutrino interactions along the Booster Neutrino Beam (BNB) at Fermilab. Because it is deployed near ground level, the detector records many cosmic muon tracks in each beam-related detector trigger, which can be misidentified as signals of interest. To reduce these cosmogenic backgrounds, we have designed and constructed a TPC-external Cosmic Ray Tagger (CRT). This sub-system was developed by the Laboratory for High Energy Physics (LHEP), Albert Einstein Center for Fundamental Physics, University of Bern. The system utilizes plastic scintillation modules to provide precise time and position information for TPC-traversing particles. Successful matching of TPC tracks and CRT data will allow us to reduce the cosmogenic background and to better characterize the light collection system and LArTPC data using cosmic muons. In this paper we describe the design and installation of the MicroBooNE CRT system and provide an overview of a series of tests performed to verify the proper operation of the system and its components during installation, commissioning, and physics data-taking
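
    The abstract mentions matching TPC tracks with CRT data to flag cosmogenic activity; as a toy illustration of the time-coincidence part of that idea (the real matching also has the CRT's position information available, and the collaboration's actual algorithm is not described here), the sketch below pairs track times with CRT hit times that agree within a configurable window. The function name, units, and window value are assumptions.

    # Toy time-coincidence matching; not the MicroBooNE matching algorithm.
    def match_by_time(tpc_track_times_us, crt_hit_times_us, window_us=1.0):
        """Return (track_index, hit_index) pairs whose times agree within window_us."""
        matches = []
        for i, t_tpc in enumerate(tpc_track_times_us):
            for j, t_crt in enumerate(crt_hit_times_us):
                if abs(t_tpc - t_crt) < window_us:
                    matches.append((i, j))
        return matches

    # Tracks coincident in time with a CRT hit become cosmic-ray candidates
    cosmic_candidates = match_by_time([3.2, 7.9, 12.4], [3.1, 12.6])
    print(cosmic_candidates)    # [(0, 0), (2, 1)]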

    A Deep Neural Network for Pixel-Level Electromagnetic Particle Identification in the MicroBooNE Liquid Argon Time Projection Chamber

    We have developed a convolutional neural network (CNN) that, for the first time, can make a pixel-level prediction of objects in image data recorded by a liquid argon time projection chamber (LArTPC). We describe the network design, training techniques, and software tools developed to train this network. The goal of this work is to develop a complete deep-neural-network-based data reconstruction chain for the MicroBooNE detector. We show the first demonstration of the network's validity on real LArTPC data using MicroBooNE collection plane images. The demonstration is performed on stopping-muon and νμ charged-current neutral pion data samples
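
    The abstract does not describe the network architecture in detail; purely as an illustration of what pixel-level (semantic-segmentation-style) prediction means, the sketch below builds a small fully convolutional encoder-decoder in PyTorch that maps a single-plane image to a class score for every pixel. The layer sizes, class count, and PyTorch choice are assumptions for illustration and do not reproduce the published network.

    # Minimal sketch, not the MicroBooNE network: every pixel of a
    # single-channel LArTPC-style image receives a class score.
    import torch
    import torch.nn as nn

    class PixelLevelCNN(nn.Module):
        def __init__(self, n_classes=3):             # e.g. background / track / shower
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                     # downsample by 2
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                     # downsample by 4
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
                nn.ConvTranspose2d(16, 16, 2, stride=2), nn.ReLU(),
                nn.Conv2d(16, n_classes, 1),         # 1x1 conv: per-pixel class scores
            )

        def forward(self, x):                        # x: (N, 1, H, W)
            return self.decoder(self.encoder(x))     # (N, n_classes, H, W)

    net = PixelLevelCNN()
    scores = net(torch.zeros(1, 1, 512, 512))        # one collection-plane-sized image
    labels = scores.argmax(dim=1)                    # predicted class for each pixel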

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10³ pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype
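
    The abstract names the tooling (Python translated to CUDA kernels with Numba) but reproduces no code; the sketch below is a minimal, hedged illustration of that workflow, with one GPU thread computing a toy "induced current" value for one pixel. The kernel body, the exponential toy response, and all names are assumptions rather than the DUNE Near Detector simulator's actual physics; running it requires the numba package and a CUDA-capable GPU.

    # Illustrative Numba CUDA kernel: one thread per pixel, toy physics only.
    import math
    import numpy as np
    from numba import cuda

    @cuda.jit
    def induced_current_kernel(charge, distance, current):
        i = cuda.grid(1)                               # global thread index = pixel index
        if i < current.size:
            # Toy response: deposited charge attenuated with drift distance
            current[i] = charge[i] * math.exp(-distance[i])

    n_pixels = 1000
    charge = np.random.rand(n_pixels).astype(np.float32)
    distance = np.random.rand(n_pixels).astype(np.float32)

    d_charge = cuda.to_device(charge)                  # explicit host-to-device copies
    d_distance = cuda.to_device(distance)
    d_current = cuda.device_array(n_pixels, dtype=np.float32)

    threads_per_block = 128
    blocks = (n_pixels + threads_per_block - 1) // threads_per_block
    induced_current_kernel[blocks, threads_per_block](d_charge, d_distance, d_current)
    current = d_current.copy_to_host()                 # bring the result back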

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6 % and 84.1 ± 0.6 %, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation
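
    The quoted uncertainties on the efficiencies are of the size expected from simple binomial counting errors; purely as an illustration of that arithmetic (the counts below are made up, and the paper's actual statistical treatment may differ), the sketch computes an efficiency eps = k/n and its binomial uncertainty sqrt(eps*(1-eps)/n).

    # Hypothetical counts chosen only to land near the quoted 86.1 +/- 0.6 %
    import math

    def efficiency_with_error(k, n):
        eps = k / n
        err = math.sqrt(eps * (1.0 - eps) / n)          # binomial uncertainty
        return eps, err

    eps, err = efficiency_with_error(k=2915, n=3386)    # made-up pion counts
    print(f"efficiency = {100 * eps:.1f} +/- {100 * err:.1f} %")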