
    The COMPASS Experiment at CERN

    The COMPASS experiment makes use of the CERN SPS high-intensity muon and hadron beams for the investigation of the nucleon spin structure and the spectroscopy of hadrons. One or more outgoing particles are detected in coincidence with the incoming muon or hadron. A large polarized target inside a superconducting solenoid is used for the measurements with the muon beam. Outgoing particles are detected by a two-stage, large angle and large momentum range spectrometer. The setup is built using several types of tracking detectors, according to the expected incident rate, required space resolution and the solid angle to be covered. Particle identification is achieved using a RICH counter and both hadron and electromagnetic calorimeters. The setup has been successfully operated from 2002 onwards using a muon beam. Data with a hadron beam were also collected in 2004. This article describes the main features and performance of the spectrometer in 2004; a short summary of the 2006 upgrade is also given. Comment: 84 pages, 74 figures
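
    As a rough illustration of the event-level kinematics reconstructed from such coincidence measurements with the muon beam, the sketch below computes the standard deep-inelastic-scattering variables from the incoming and scattered muon energies and the scattering angle. It is a minimal sketch only: the function name and the numerical inputs are hypothetical and are not taken from the COMPASS analysis software; the 160 GeV value is simply a typical SPS muon beam energy.

        import math

        M_PROTON = 0.9383  # proton mass in GeV

        def dis_kinematics(e_beam, e_scattered, theta):
            """DIS variables from incoming/scattered muon energy (GeV) and
            scattering angle (rad), neglecting the muon mass."""
            q2 = 2.0 * e_beam * e_scattered * (1.0 - math.cos(theta))  # four-momentum transfer squared
            nu = e_beam - e_scattered                                  # energy transfer in the lab frame
            x = q2 / (2.0 * M_PROTON * nu)                             # Bjorken scaling variable
            y = nu / e_beam                                            # fractional energy transfer
            return q2, nu, x, y

        # Example: 160 GeV incoming muon, 140 GeV scattered muon at 10 mrad.
        print(dis_kinematics(160.0, 140.0, 0.010))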

    Development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport

    Monte Carlo simulation is the most accurate method for absorbed dose calculations in radiotherapy. Its efficiency still requires improvement for routine clinical applications, especially for online adaptive radiotherapy. In this paper, we report our recent development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport. We have implemented the Dose Planning Method (DPM) Monte Carlo dose calculation package (Sempau et al., Phys. Med. Biol. 45 (2000) 2263-2291) on GPU architecture under the CUDA platform. The implementation has been tested against the original sequential DPM code on CPU in phantoms with water-lung-water or water-bone-water slab geometries. A 20 MeV mono-energetic electron point source or a 6 MV photon point source is used in our validation. The results demonstrate adequate accuracy of our GPU implementation for both electron and photon beams in the radiotherapy energy range. Speed-up factors of about 5.0-6.6 have been observed using an NVIDIA Tesla C1060 GPU card against a 2.27 GHz Intel Xeon CPU. Comment: 13 pages, 3 figures, and 1 table. Paper revised. Figures updated.
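
    To make the validation setup more concrete, the following toy sketch transports photons through a 1D water-lung-water slab phantom and scores deposited energy per material. It is a grossly simplified analogue model, not the DPM or GPU implementation described above: the attenuation coefficients, slab thicknesses and the "deposit half the energy per interaction" rule are placeholders, and secondary electron transport is ignored. The comment at the end points at the property the paper exploits: histories are independent, so they map naturally onto one GPU thread each.

        import random

        # Illustrative 1D water-lung-water slab phantom: (material, thickness in cm,
        # linear attenuation coefficient in 1/cm). Placeholder values, not data
        # from the paper.
        SLABS = [("water", 5.0, 0.07), ("lung", 5.0, 0.02), ("water", 5.0, 0.07)]

        def material_at(depth):
            """Return (name, mu) of the slab containing `depth`, or None outside the phantom."""
            z = 0.0
            for name, thickness, mu in SLABS:
                z += thickness
                if depth < z:
                    return name, mu
            return None

        def one_history(energy=1.0):
            """Grossly simplified analogue photon history along the beam axis:
            sample exponential free paths and score energy at each interaction."""
            scored = {}
            depth = 0.0
            while energy > 0.01:
                here = material_at(depth)
                if here is None:                      # photon escaped the phantom
                    break
                depth += random.expovariate(here[1])  # free path length ~ Exp(mu)
                target = material_at(depth)
                if target is None:
                    break
                deposited = 0.5 * energy              # pretend half the energy is absorbed locally
                scored[target[0]] = scored.get(target[0], 0.0) + deposited
                energy -= deposited
            return scored

        # Histories are independent, which is what makes a one-thread-per-history
        # GPU mapping attractive; here we simply sum a few thousand of them serially.
        totals = {"water": 0.0, "lung": 0.0}
        for _ in range(5000):
            for material, energy in one_history().items():
                totals[material] += energy
        print(totals)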

    Infrastructure for Detector Research and Development towards the International Linear Collider

    The EUDET project was launched to create an infrastructure for developing and testing new and advanced detector technologies to be used at a future linear collider. The aim was to make possible experimentation and data analysis for institutes that could otherwise not carry them out due to a lack of resources. The infrastructure comprised an analysis and software network, and instrumentation infrastructures for tracking detectors as well as for calorimetry. Comment: 54 pages, 48 pictures

    PPF - A Parallel Particle Filtering Library

    We present the parallel particle filtering (PPF) software library, which enables hybrid shared-memory/distributed-memory parallelization of particle filtering (PF) algorithms by combining the Message Passing Interface (MPI) with multithreading for multi-level parallelism. The library is implemented in Java and relies on OpenMPI's Java bindings for inter-process communication. It includes dynamic load balancing, multi-thread balancing, and several algorithmic improvements for PF, such as input-space domain decomposition. The PPF library hides the difficulties of efficient parallel programming of PF algorithms and provides application developers with the necessary tools for parallel implementation of PF methods. We demonstrate the capabilities of the PPF library using two distributed PF algorithms in two scenarios with different numbers of particles. The PPF library runs a 38-million-particle problem, corresponding to more than 1.86 GB of particle data, on 192 cores with 67% parallel efficiency. To the best of our knowledge, the PPF library is the first open-source software that offers a parallel framework for PF applications. Comment: 8 pages, 8 figures; will appear in the proceedings of the IET Data Fusion & Target Tracking Conference 201
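
    For readers unfamiliar with the underlying algorithm, the sketch below shows one sequential-importance-resampling step of a bootstrap particle filter for a 1D random-walk model: propagate, weight by the observation likelihood, resample. This is a minimal sketch of the generic PF recursion that such libraries parallelize, not the PPF library's Java API; all names and parameters are hypothetical. In a distributed setting, the resampling step is the part that requires inter-process communication, which is where load balancing and domain decomposition come in.

        import random, math

        def gaussian_pdf(x, mean, sigma):
            return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

        def bootstrap_pf_step(particles, observation, process_sigma=1.0, obs_sigma=0.5):
            """One sequential-importance-resampling step for a 1D random-walk model."""
            # 1. Propagate each particle through the (random-walk) process model.
            proposed = [x + random.gauss(0.0, process_sigma) for x in particles]
            # 2. Weight each particle by the likelihood of the new observation.
            weights = [gaussian_pdf(observation, x, obs_sigma) for x in proposed]
            total = sum(weights) or 1.0
            weights = [w / total for w in weights]
            # 3. Multinomial resampling back to an unweighted particle set.
            return random.choices(proposed, weights=weights, k=len(proposed))

        # Toy run: track a slowly drifting state from noisy observations.
        particles = [random.gauss(0.0, 1.0) for _ in range(1000)]
        for t, obs in enumerate([0.2, 0.7, 1.1, 1.6]):
            particles = bootstrap_pf_step(particles, obs)
            print(t, sum(particles) / len(particles))   # posterior mean estimate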

    Performance of the EUDET-type beam telescopes

    Test beam measurements at the test beam facilities of DESY have been conducted to characterise the performance of the EUDET-type beam telescopes originally developed within the EUDET project. The beam telescopes are equipped with six sensor planes using MIMOSA26 monolithic active pixel devices. A programmable Trigger Logic Unit provides trigger logic and time-stamp information on particle passage. Both a data acquisition framework and offline reconstruction software packages are available. User devices are easily integrable into the data acquisition framework via predefined interfaces. The biased residual distribution is studied as a function of the beam energy, plane spacing and sensor threshold. Its standard deviation at the two centre pixel planes, using all six planes for tracking in a 6 GeV electron/positron beam, is measured to be (2.88 ± 0.08) µm. Iterative track fits using the formalism of General Broken Lines are performed to estimate the intrinsic resolution of the individual pixel planes. The mean intrinsic resolution over the six sensors used is found to be (3.24 ± 0.09) µm. With a 5 GeV electron/positron beam, the track resolution halfway between the two inner pixel planes, using an equidistant plane spacing of 20 mm, is estimated to be (1.83 ± 0.03) µm assuming the measured intrinsic resolution. Towards lower beam energies the track resolution deteriorates due to increasing multiple scattering. Threshold studies show an optimal working point of the MIMOSA26 sensors at a sensor threshold of between five and six times their RMS noise. Measurements at different plane spacings are used to calibrate the amount of multiple scattering in the material traversed and allow for corrections to the predicted angular scattering for electron beams.
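
    The quoted degradation of the track resolution towards lower beam energies follows from multiple Coulomb scattering in the sensor planes, whose RMS projected angle scales roughly as 1/p. The sketch below evaluates the standard Highland parameterisation for a few beam momenta; it is illustrative only, and the material budget used (about 5e-4 radiation lengths for a ~50 µm silicon sensor) is a rough placeholder rather than the telescope's actual per-plane budget from the paper.

        import math

        def highland_theta0(p_gev, x_over_x0, beta=1.0, charge=1):
            """Highland/PDG parameterisation of the RMS projected multiple-scattering
            angle (rad) for momentum p (GeV/c) and a thickness of x/X0 radiation lengths."""
            return (0.0136 / (beta * p_gev)) * abs(charge) * math.sqrt(x_over_x0) * (
                1.0 + 0.038 * math.log(x_over_x0)
            )

        # Rough material budget: ~50 um of silicon is about 5e-4 X0 (X0(Si) ~ 9.37 cm);
        # the real planes carry additional material, so treat this as a placeholder.
        for p in (1.0, 3.0, 6.0):
            print(f"{p} GeV: {highland_theta0(p, 5e-4) * 1e6:.0f} urad per plane")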

    Fast Monte Carlo Simulation for Patient-specific CT/CBCT Imaging Dose Calculation

    Recently, X-ray imaging dose from computed tomography (CT) or cone beam CT (CBCT) scans has become a serious concern. Patient-specific imaging dose calculation has been proposed for the purpose of dose management. While Monte Carlo (MC) dose calculation can be quite accurate for this purpose, it suffers from low computational efficiency. In response to this problem, we have successfully developed an MC dose calculation package, gCTD, on GPU architecture under the NVIDIA CUDA platform for fast and accurate estimation of the X-ray imaging dose received by a patient during a CT or CBCT scan. Techniques have been developed particularly for the GPU architecture to achieve high computational efficiency. Dose calculations using CBCT scanning geometry in a homogeneous water phantom and a heterogeneous Zubal head phantom have shown good agreement between gCTD and EGSnrc, indicating the accuracy of our code. In terms of improved efficiency, it is found that gCTD attains a speed-up of ~400 times in the homogeneous water phantom and ~76.6 times in the Zubal phantom compared to EGSnrc. As for absolute computation time, imaging dose calculation for the Zubal phantom can be accomplished in ~17 s with an average relative standard deviation of 0.4%. Though our gCTD code has been developed and tested in the context of CBCT scans, with a simple modification of the geometry it can be used for assessing imaging dose in CT scans as well. Comment: 18 pages, 7 figures, and 1 table
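
    The 0.4% figure quoted above is a statistical (type A) uncertainty of the scored dose. One common way to estimate such a quantity is from independent batches of histories, as in the minimal sketch below; the batch values and the function name are made up for illustration and are not taken from gCTD.

        import math, random

        def relative_std_of_mean(batch_doses):
            """Relative standard deviation of the mean dose in one voxel, estimated
            from independent batches of Monte Carlo histories."""
            n = len(batch_doses)
            mean = sum(batch_doses) / n
            var_of_mean = sum((d - mean) ** 2 for d in batch_doses) / (n * (n - 1))
            return math.sqrt(var_of_mean) / mean

        # Made-up per-batch dose scores for a single voxel (arbitrary units).
        batches = [random.gauss(2.00, 0.05) for _ in range(10)]
        print(f"relative sigma of the mean: {100 * relative_std_of_mean(batches):.2f}%")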