
    Jet properties from di-hadron correlations in p+p collisions at $\sqrt{s}$ = 200 GeV

    An analysis of high-$p_T$ hadron spectra associated with high-$p_T$ $\pi^0$ particles in p+p collisions at $\sqrt{s}$ = 200 GeV is presented. The shape of the azimuthal angular correlation is used to determine the partonic intrinsic transverse momentum $\sqrt{\langle k_T^2 \rangle} = 2.68 \pm 0.07(\mathrm{stat}) \pm 0.34(\mathrm{sys})$ GeV/$c$. The effect of $k_T$-smearing on the inclusive $\pi^0$ cross section is discussed.
    Comment: To appear in the proceedings of the 2nd International Conference on Hard and Electromagnetic Probes of High-Energy Nuclear Collisions (Hard Probes 2006), Asilomar, Pacific Grove, California, 9-16 Jun 2006
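    A width of this kind is typically extracted by fitting the near- and away-side peaks of the $\Delta\phi$ distribution. Below is a minimal Python sketch of a common two-Gaussian-plus-pedestal fit, not necessarily the exact procedure of this analysis; the histogram inputs (dphi_centers, yields, errors) and starting values are placeholders, not data from the paper.

    import numpy as np
    from scipy.optimize import curve_fit

    # Two-Gaussian model for the di-hadron azimuthal correlation:
    # a near-side peak at dphi = 0, an away-side peak at dphi = pi,
    # and a flat combinatorial pedestal.
    def correlation(dphi, n_near, s_near, n_away, s_away, ped):
        return (ped
                + n_near * np.exp(-0.5 * (dphi / s_near) ** 2)
                + n_away * np.exp(-0.5 * ((dphi - np.pi) / s_away) ** 2))

    # dphi_centers, yields, errors = ...  # histogrammed correlation data
    # popt, _ = curve_fit(correlation, dphi_centers, yields,
    #                     sigma=errors, p0=[1.0, 0.2, 1.0, 0.4, 0.1])
    # The fitted away-side width popt[3] carries the kT-smearing
    # information from which sqrt(<kT^2>) is extracted.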

    The Google Settlement One Year Later


    Collisional energy loss and the suppression of high-$p_T$ hadrons

    We calculate the nuclear suppression factor $R_{AA}$ for light hadrons by taking only the elastic processes into account and argue that, in the measured $p_T$ domain at RHIC, collisional rather than radiative processes are the dominant mechanism for partonic energy loss.
    Comment: Presented at the International Conference on Strong and Electroweak Matter 2006, May 10-13, Brookhaven National Laboratory
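    For context, the nuclear suppression factor quoted above has the standard definition

    \[
    R_{AA}(p_T) = \frac{dN^{AA}/dp_T}{\langle N_{\mathrm{coll}} \rangle \, dN^{pp}/dp_T},
    \]

    so that $R_{AA} = 1$ corresponds to a nucleus-nucleus collision behaving as an incoherent superposition of $\langle N_{\mathrm{coll}} \rangle$ nucleon-nucleon collisions, while $R_{AA} < 1$ signals suppression such as that caused by partonic energy loss.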

    STAR inner tracking upgrade - A performance study

    Anisotropic flow measurements have demonstrated the development of partonic collectivity in $200\,\mathrm{GeV}$ Au+Au collisions at RHIC. To understand the partonic EOS, thermalization must be addressed. Collective motion of heavy-flavor (c, b) quarks can be used to indicate the degree of thermalization of the light-flavor quarks (u, d, s). Measurement of heavy-flavor quark collectivity requires direct reconstruction of heavy-flavor hadrons in the low-$p_T$ region, while measurement of open-charm spectra up to high $p_T$ can be used to investigate heavy-quark energy loss and medium properties. The Heavy Flavor Tracker (HFT), a proposed upgrade to the STAR experiment at midrapidity, will measure $v_2$ of open-charm hadrons down to very low $p_T$ by reconstructing their displaced decay vertices. The innermost part of the HFT is the PIXEL detector (made of two low-mass monolithic active pixel sensor layers), which delivers a high-precision position measurement close to the collision vertex. The Intermediate Silicon Tracker (IST), a one-layer strip detector, is essential to improve hit identification in the PIXEL detector when running at full RHIC-II luminosity. Using a full GEANT simulation, the open-charm measurement capabilities of STAR with the HFT will be shown. Its performance over a broad $p_T$ range will be demonstrated on $v_2$ ($p_T > 0.5\,\mathrm{GeV}/c$) and $R_{CP}$ ($p_T < 10\,\mathrm{GeV}/c$) measurements of the $D$ meson. Results of the reconstruction of the $\Lambda_c$ baryon in heavy-ion collisions are presented.
    Comment: to appear in EPJ C (Hot Quarks 2008 conference volume)
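    For reference, the two observables used in this performance study have standard definitions: $v_2$ is the second Fourier coefficient of the azimuthal particle distribution relative to the reaction plane $\Psi_{\mathrm{RP}}$, and $R_{CP}$ compares binary-collision-scaled yields in central and peripheral events,

    \[
    \frac{dN}{d\phi} \propto 1 + 2 v_2 \cos\!\left[2(\phi - \Psi_{\mathrm{RP}})\right],
    \qquad
    R_{CP}(p_T) = \frac{\left[\, dN/dp_T \,/\, \langle N_{\mathrm{coll}} \rangle \,\right]_{\mathrm{central}}}{\left[\, dN/dp_T \,/\, \langle N_{\mathrm{coll}} \rangle \,\right]_{\mathrm{peripheral}}}.
    \]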

    EDS tomographic reconstruction regularized by total nuclear variation joined with HAADF-STEM tomography

    Energy-dispersive X-ray spectroscopy (EDS) tomography is an advanced technique for characterizing the composition of nanostructures in three dimensions (3D). Its application is hindered, however, by poor image quality caused by low signal-to-noise ratios and a limited number of tilts, both of which are fundamentally constrained by the insufficient number of X-ray counts. In this paper, we explore how to make accurate EDS reconstructions from such data. We propose to augment EDS tomography by joining with it a more accurate high-angle annular dark-field STEM (HAADF-STEM) tomographic reconstruction, for which a larger number of tilt images is usually feasible. This augmentation is realized through total nuclear variation (TNV) regularization, which encourages the joint EDS and HAADF reconstructions to have not only sparse gradients but also common edges and parallel (or antiparallel) gradients. Our experiments show that the reconstructed images are more accurate than the non-regularized and total variation regularized reconstructions, even when the number of tilts is small or the X-ray counts are low.
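    To make the TNV idea concrete: at each pixel, the gradients of the two reconstructions are stacked into a small Jacobian matrix, and the penalty is the sum of its singular values (the nuclear norm), which is small when the gradients are sparse and aligned. The following Python sketch evaluates this penalty for a pair of co-registered 2D images; it illustrates the regularizer only, not the authors' full reconstruction algorithm.

    import numpy as np

    def tnv_penalty(eds: np.ndarray, haadf: np.ndarray) -> float:
        """Total nuclear variation of two co-registered 2D images."""
        def grad(img):
            # Forward differences; the last row/column gradient is zero.
            gy = np.diff(img, axis=0, append=img[-1:, :])
            gx = np.diff(img, axis=1, append=img[:, -1:])
            return gy, gx

        ey, ex = grad(eds)
        hy, hx = grad(haadf)
        # Per-pixel 2x2 Jacobian: rows are channels, columns are (y, x).
        J = np.stack([np.stack([ey, ex], axis=-1),
                      np.stack([hy, hx], axis=-1)], axis=-2)
        s = np.linalg.svd(J, compute_uv=False)  # batched SVD -> (H, W, 2)
        # Nuclear norm per pixel = sum of singular values; parallel or
        # antiparallel gradients give a rank-1 Jacobian and hence a
        # smaller penalty than misaligned ones.
        return float(s.sum())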

    Improving ICD-based semantic similarity by accounting for varying degrees of comorbidity

    Finding similar patients is a common objective in precision medicine, facilitating treatment outcome assessment and clinical decision support. Choosing widely available patient features and appropriate mathematical methods for similarity calculations is crucial. International Statistical Classification of Diseases and Related Health Problems (ICD) codes are used worldwide to encode diseases and are available for nearly all patients. Aggregated as sets consisting of primary and secondary diagnoses, they can display a degree of comorbidity and reveal comorbidity patterns. It is possible to compute the similarity of patients based on their ICD codes by using semantic similarity algorithms. These algorithms have traditionally been evaluated using a single-term, expert-rated data set. However, real-world patient data often display varying degrees of documented comorbidities that might impair algorithm performance. To account for this, we present a scale term that considers documented comorbidity variance. In this work, we compared the performance of 80 combinations of established algorithms in terms of semantic similarity based on ICD-code sets. The sets were extracted from patients with a C25.X (pancreatic cancer) primary diagnosis and provide a variety of different combinations of ICD codes. Using our scale term, we obtained the best results with a combination of level-based information content, Leacock & Chodorow concept similarity, and bipartite graph matching for the set similarities, reaching a correlation of 0.75 with our expert's ground truth. Our results highlight the importance of accounting for comorbidity variance while demonstrating how well current semantic similarity algorithms perform.
    Comment: 11 pages, 6 figures, 1 table
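    As an illustration of the set-level step, the bipartite graph matching mentioned above can be phrased as a linear assignment problem over pairwise concept similarities. The sketch below assumes an externally supplied pair_sim(a, b) function (e.g. Leacock & Chodorow over the ICD hierarchy, not implemented here) and should be read as a schematic, not the authors' exact pipeline; in particular, it omits the proposed comorbidity-variance scale term.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def set_similarity(codes_a, codes_b, pair_sim):
        """Bipartite-matching similarity between two ICD code sets.

        pair_sim(a, b) -> float is any concept-level similarity;
        it is passed in as an assumption rather than implemented.
        """
        S = np.array([[pair_sim(a, b) for b in codes_b] for a in codes_a])
        # Match codes one-to-one so the total similarity is maximal.
        rows, cols = linear_sum_assignment(S, maximize=True)
        # Normalize by the larger set so unmatched codes lower the score.
        return S[rows, cols].sum() / max(len(codes_a), len(codes_b))

    # Example with a toy pair_sim (same ICD chapter letter -> similar):
    # sim = set_similarity(["C25.0", "E11.9"], ["C25.1", "I10"],
    #                      pair_sim=lambda a, b: float(a[0] == b[0]))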