Towards the 3D-Imaging of Sources
Geometric details of a nuclear reaction zone at the time of particle
emission can be restored, via imaging, from correlations between particles at
low relative velocity. Some source details are erased in the emission process
and give rise to instabilities in the imaging; these can be tamed by
discretized optimization of the restored sources. So far it has been possible
to produce one-dimensional emission-source images, corresponding to reactions
averaged over all spatial directions. Efforts are currently under way to
restore angular details.
Comment: Talk given at the Int. Workshop on Hot and Dense Matter in
Relativistic Heavy Ion Collisions, March 24-27, 2004, Budapest; 10 pages, 6
figures
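The imaging step described above amounts to inverting a linear relation between the measured correlation and the source: schematically, for angle-averaged identical-boson pairs without final-state interactions, C(q) - 1 = Int dr 4 pi r^2 K(q,r) S(r) with K(q,r) = sin(2qr)/(2qr). Discretizing r turns this into a matrix equation whose instabilities can be damped by regularized (discretized-optimization) fitting. The sketch below is a minimal Python illustration of that idea, not the authors' actual imaging code: the kernel, grids, and Tikhonov regularizer are all illustrative assumptions.

```python
import numpy as np

def kernel(q, r):
    # Angle-averaged free-boson kernel K0(q, r) = sin(2 q r) / (2 q r);
    # np.sinc(t) = sin(pi t) / (pi t), so pass x / pi.
    x = 2.0 * q * r
    return np.sinc(x / np.pi)

q = np.linspace(0.005, 0.1, 40)   # relative momentum grid, GeV/c (illustrative)
r = np.linspace(0.5, 30.0, 60)    # source radius grid, fm (illustrative)
dr = r[1] - r[0]

# Synthetic "true" source: Gaussian of radius R = 5 fm, normalized so that
# its integral over d^3r is 1.
R = 5.0
S_true = np.exp(-r**2 / (4 * R**2)) / (4 * np.pi * R**2)**1.5

# hbar*c = 0.1973 GeV fm converts q from GeV/c to fm^-1.
hbarc = 0.1973
K = np.array([[kernel(qi / hbarc, rj) * 4 * np.pi * rj**2 * dr
               for rj in r] for qi in q])
Cm1 = K @ S_true   # the "measured" C(q) - 1

# Discretized optimization with a Tikhonov penalty:
# minimize ||K S - (C - 1)||^2 + lam ||S||^2 to tame the instabilities.
lam = 1e-6
A = np.vstack([K, np.sqrt(lam) * np.eye(len(r))])
b = np.concatenate([Cm1, np.zeros(len(r))])
S_rec, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With a small regularization weight the reconstruction reproduces the synthetic correlation; in practice the regularization strength controls the trade-off between stability and resolution of the restored source.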
Exploring Lifetime Effects in Femtoscopy
We investigate the role of lifetime effects from resonances and
emission-duration tails in femtoscopy at RHIC using two Blast-Wave models. We
find that the non-Gaussian components compare well with published
source-imaging data, but that the value of R_out obtained from Gaussian fits
is sensitive to the non-Gaussian contributions when realistic acceptance cuts
are applied to the models.
Comment: 5 pages, 2 figures
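The sensitivity of the Gaussian R_out to non-Gaussian tails can be seen in a toy example: build an outward correlation from a Gaussian core plus a wider component standing in for a resonance / emission-duration tail, then do a single-Gaussian fit. Everything below is illustrative (toy radii and fractions, a simple log-linearized fit), not the Blast-Wave models of the abstract.

```python
import numpy as np

# Two-component source: Gaussian core (R1 = 4 fm) plus a wider component
# (R2 = 10 fm) mimicking a long emission-duration tail; f is the core fraction.
f, R1, R2 = 0.7, 4.0, 10.0
q = np.linspace(0.02, 0.30, 50)          # fm^-1
Cm1 = f * np.exp(-(q * R1) ** 2) + (1 - f) * np.exp(-(q * R2) ** 2)

# Single-Gaussian fit C(q) - 1 = lam * exp(-R^2 q^2), linearized as
# ln(C - 1) = ln(lam) - R^2 q^2 and solved by least squares in q^2.
slope, intercept = np.polyfit(q ** 2, np.log(Cm1), 1)
R_fit, lam_fit = np.sqrt(-slope), np.exp(intercept)
```

The fitted R_fit lands strictly between the core and tail radii, so the Gaussian R_out reflects the non-Gaussian tail rather than the core alone; which value it lands on depends on the q-range, i.e. on the acceptance cuts.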
Ferrenberg-Swendsen Analysis of LLNL and NYBlue BG/L p4rhmc Data
These results are from the continuing Lattice Quantum Chromodynamics runs on BG/L: the Ferrenberg-Swendsen analysis [?] of the combined data from the LLNL and NYBlue BG/L 32^3 x 8 runs with the p4rhmc v2.0 QMP-MPI.X code (semi-optimized p4 code using QMP over MPI). The jobs include beta values ranging from 3.525 to 3.535, with an alternate analysis extending to 3.540. The NYBlue data sets comprise 9k trajectories from October 2007, and the LLNL data come from two independent streams of ~5k trajectories each, taken from the July 2007 runs. The following outputs are produced by the fs-2+1-chiub.c program. All outputs have had checksums produced by the addCks.pl perl script and checked by the checkCks.pl perl script after scanning.
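Ferrenberg-Swendsen reweighting uses an ensemble generated at coupling beta to estimate observables at a nearby beta' by reweighting each configuration i with exp[(beta' - beta) S_i]. The sketch below is a self-contained toy of the single-histogram step (Gaussian-distributed action, weight convention e^{beta S}; the sign convention depends on how the action is defined), not the fs-2+1-chiub.c analysis itself.

```python
import numpy as np

def reweight(obs, action, dbeta):
    """Single-histogram Ferrenberg-Swendsen estimate of <obs> at beta + dbeta,
    assuming the Boltzmann weight is proportional to e^{beta * S}."""
    w = np.exp(dbeta * (action - action.max()))  # subtract max for stability
    return float(np.sum(w * obs) / np.sum(w))

# Toy ensemble: Gaussian "action" S ~ N(mu, sigma^2).  Analytically, the
# reweighted mean of S itself is mu + dbeta * sigma^2.
rng = np.random.default_rng(0)
mu, sigma, dbeta = 5.0, 1.0, 0.3
S = rng.normal(mu, sigma, 200_000)
shifted_mean = reweight(S, S, dbeta)
```

In a combined LLNL + NYBlue analysis, weights of this form would also enter the multi-histogram combination across the beta = 3.525-3.540 ensembles; the toy above checks only the single-histogram step against its analytic answer.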
Kaon and production vs Participants in Nuclear Collisions
Data on kaon and production in nuclear collisions as a function of
centrality are analysed in both the AGS and SPS energy ranges. We compare the
results of several experiments, looking for a common trend in `participant
scaling' of the production yields. We find a smooth description of the scaled
kaon and yields as a function of participant density. We also show a
participant-density dependence of the kaons and produced in the forward
hemisphere in proton-nucleus collisions.
Comment: Proceedings of the International Conference on Strangeness in Quark
Matter, 20-25 July 2000, Berkeley, CA. To appear in Journal of Physics G:
Nuclear and Particle Physics
Review of Systematic Investigations of the R_out/R_side Ratio in HBT at RHIC
We review the significant difference in the ratio R_out/R_side between experiment and theory in heavy-ion collisions at RHIC. This ratio is expected to be strongly correlated with the pion emission duration. Hydrodynamic models typically calculate a value that is approximately 1.5 and moderately dependent on k_T, whereas the experiments report a value close to unity and independent of k_T. We review those calculations in which systematic variations in the theoretical assumptions were reported. We find that the scenario of a second-order phase transition or cross-over has been given insufficient attention and may play an important role in resolving this discrepancy.
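The connection between this ratio and the emission duration is usually quoted, for Gaussian sources, as R_out^2 ~ R_side^2 + beta_T^2 * dtau^2, with beta_T the pair transverse velocity (c = 1, radii in fm, dtau in fm/c). A small sketch of the implied duration estimate, with purely illustrative numbers:

```python
import math

def emission_duration(r_out, r_side, beta_t):
    """Estimate the emission duration (fm/c) from
    R_out^2 ~ R_side^2 + beta_t^2 * dtau^2; returns 0 when R_out <= R_side."""
    diff = r_out**2 - r_side**2
    return math.sqrt(diff) / beta_t if diff > 0 else 0.0

# A hydrodynamic-like ratio R_out/R_side = 1.5 implies a sizeable duration,
# while the experimental ratio ~ 1 implies a very short one.
hydro = emission_duration(6.0, 4.0, 0.7)   # ratio 1.5
data = emission_duration(4.0, 4.0, 0.7)    # ratio 1.0
```

This is why the measured ratio near unity is read as evidence for a short (explosive) emission, while the hydrodynamic value of ~1.5 implies a long one.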
LQCD Phase 1 Runs with P4RHMC
These results represent the first set of runs at 10 {beta} values, ranging from 2000 to 7000 trajectories, with the p4rhmc code. This initial run sequence spanned roughly two weeks in late January and early February 2007. Three scripts manage the submission of dependent jobs: subSet.pl submits a set of dependent jobs for a single run; rmSet.pl removes a set of dependent jobs in reverse order of submission; and statSet.pl runs the pstat command and prints parsed output along with the directory contents. The results of running the statSet.pl command are printed for three different times during the start-up of the next sequence of runs using the milc code.
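The dependent-job pattern that subSet.pl manages can be sketched as a chain in which each submission names the previous job's id as a dependency (as with the psub -d option used for these runs). The Python below is an illustration of that chain logic, not the actual perl script; the submit callable is injected so the sketch runs without a real scheduler, and any psub arguments beyond -d are assumptions.

```python
def submit_chain(script, njobs, submit):
    """Submit njobs copies of `script`, each depending on the previous one,
    in the spirit of subSet.pl chaining jobs via `psub -d <jobid>`."""
    job_ids = []
    for _ in range(njobs):
        cmd = ["psub"]
        if job_ids:                   # all but the first depend on the prior job
            cmd += ["-d", job_ids[-1]]
        cmd.append(script)
        job_ids.append(submit(cmd))   # submit returns the new job id
    return job_ids

# Stand-in submitter that records commands instead of calling psub.
submitted = []
def fake_submit(cmd):
    submitted.append(cmd)
    return f"job{len(submitted)}"

ids = submit_chain("runp4rhmc.tcsh", 3, fake_submit)
```

Removing such a chain safely is the reverse traversal rmSet.pl performs: cancel the last job first so no dependent job is released early.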
First LQCD Physics Runs with MILC and P4RHMC
An initial series of physics LQCD runs was submitted to the BG/L science bank with the milc and p4rhmc codes. Both runs were for lattice dimensions of 32^3 x 8. The p4 calculation was performed with v2.0 QMP_MPI.X (semi-optimized p4 code using QMP over MPI) and milc v7.2, also using RHMC but not specifically optimized for BlueGene. Calculations were performed along lines of constant physics, with the light quark masses 2-3 times their physical values and the strange quark mass set by m_ud = 0.1 m_s. Job submission was performed using the standard milc and p4 scripts provided on the ubgl cluster. Initial thermalized lattices for each code were also provided in this way. The only modifications for running on BG/L were to the directory names and to the mT parameter, which determines job durations (24 hrs on BG/L vs. 4 hrs on ubgl). The milc scripts were set to resubmit themselves 10 times, and the p4 scripts were submitted serially using the "psub -d" job-dependency option. The runp4rhmc.tcsh script could not be used to resubmit due to the 30-minute time limit imposed on interactive jobs. Most jobs were submitted to the smallest, 512-node partitions, but both codes could also run on the 1024-node partitions with a gain of only 30-50%. The majority of jobs ran without error. Stalled jobs were often indicative of a communication gap within a partition, which LC was able to fix quickly. On some occasions a zero-length lattice file was deleted to allow jobs to restart successfully. Approximately 1000 trajectories were calculated for each beta value (see Table ). The analysis was performed with the standard analysis scripts for each code, make_summary.pl for milc and analysis.tcsh for p4rhmc. All lattices, log files, and job submission scripts have been archived to permanent storage for subsequent analysis.
Constraining the initial temperature and shear viscosity in a hybrid hydrodynamic model of sqrt(s_NN) = 200 GeV Au+Au collisions using pion spectra, elliptic flow, and femtoscopic radii
A new framework for evaluating hydrodynamic models of relativistic heavy-ion
collisions has been developed. This framework, a Comprehensive Heavy Ion Model
Evaluation and Reporting Algorithm (CHIMERA), has been implemented by
augmenting the UVH2+1 viscous hydrodynamic model with eccentricity
fluctuations, pre-equilibrium flow, and the Ultra-relativistic Quantum
Molecular Dynamics (UrQMD) hadronic cascade. A range of initial temperatures
and shear-viscosity-to-entropy ratios was evaluated for four initial profiles,
with and without pre-equilibrium flow. The model results were compared to pion
spectra, elliptic flow, and femtoscopic radii from 200 GeV Au+Au collisions
for the 0-20% centrality range. Two sets of initial density profiles, one with
pre-equilibrium flow and one without, were shown to provide a consistent
description of all three measurements.
Comment: 21 pages, 32 figures; version 3 includes additional text for
clarification, division of figures into more manageable units, and placement
of chi-squared values in tables for ease of viewing
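The evaluation the abstract describes, scanning initial temperature and eta/s against three observables, reduces to a combined chi-squared over data points, minimized on a parameter grid. The following is a schematic grid scan with toy model functions and synthetic "data"; none of the functional forms or parameter values are those of the actual CHIMERA code.

```python
import numpy as np

pt = np.linspace(0.2, 1.0, 9)   # illustrative pT grid (GeV/c)

def toy_model(T, eta_s):
    """Toy stand-ins for the three observables as functions of pT."""
    spectra = np.exp(-pt / T)               # pion spectra
    v2 = 0.2 * pt / (1.0 + 5.0 * eta_s)     # elliptic flow
    radii = 5.0 * T / np.sqrt(1.0 + pt)     # femtoscopic radii
    return [spectra, v2, radii]

def combined_chi2(model, data, sigma=0.01):
    # Sum chi^2 over spectra, elliptic flow, and HBT radii.
    return sum(float(np.sum(((m - d) / sigma) ** 2))
               for m, d in zip(model, data))

# Synthetic "data" from a known parameter point, then a grid scan.
data = toy_model(0.35, 0.16)
grid = [(T, e) for T in (0.30, 0.35, 0.40) for e in (0.08, 0.16, 0.24)]
best = min(grid, key=lambda p: combined_chi2(toy_model(*p), data))
```

Reporting the chi-squared values per observable, as the paper's tables do, shows which measurement drives the constraint at each grid point rather than just which point wins overall.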