Towards the 3D-Imaging of Sources
Geometric details of a nuclear reaction zone at the time of particle
emission can be restored from low relative-velocity particle correlations
through imaging. Some of the source details get erased and are a potential
cause of problems in the imaging, in the form of instabilities. These can be
coped with by following the method of discretized optimization for the restored
sources. So far it has been possible to produce one-dimensional emission-source
images, corresponding to the reactions averaged over all possible spatial
directions. Efforts are currently in progress to restore angular details.
Comment: Talk given at the Int. Workshop on Hot and Dense Matter in
Relativistic Heavy Ion Collisions, March 24-27, 2004, Budapest; 10 pages, 6
figures
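The restoration step described above can be sketched numerically. In this toy model (all grids, the kernel, and the R = 4 fm test source are invented for illustration), the correlation is a linear map of the source, C(q) - 1 = Σ_j K[i][j] S(r_j) dr, and the source is restored by a discretized least-squares inversion; the kernel is a sinc-like stand-in, not the true imaging kernel.

```python
import numpy as np

# Toy 1D imaging sketch: build a linear kernel, generate a "measured"
# correlation from a known Gaussian source, then restore the source by
# discretized least squares. With noisy data, regularization would be
# needed to tame the instabilities mentioned in the abstract.

nq, nr = 40, 20
q = np.linspace(0.01, 0.20, nq)   # relative momentum grid (GeV/c), hypothetical
r = np.linspace(0.5, 20.0, nr)    # pair separation grid (fm), hypothetical
dr = r[1] - r[0]

hbarc = 0.197                     # GeV*fm
x = np.outer(q, r) / hbarc
K = np.sin(x) / x * dr            # illustrative oscillatory kernel, shape (nq, nr)

s_true = np.exp(-r**2 / (2 * 4.0**2))  # Gaussian test source, R = 4 fm
c = K @ s_true                          # noiseless "measured" C(q) - 1

# Discretized optimization: minimize ||K s - c||^2 over the source values.
s_restored, *_ = np.linalg.lstsq(K, c, rcond=None)

err = float(np.max(np.abs(s_restored - s_true)))
print(err)  # small: the noiseless toy problem is restored accurately
```

With noiseless synthetic data the inversion recovers the input source; real correlation data would require a regularization term, which is where the instabilities discussed in the abstract enter.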
Exploring Lifetime Effects in Femtoscopy
We investigate the role of lifetime effects from resonances and emission
duration tails in femtoscopy at RHIC in two Blast-Wave models. We find that the
non-Gaussian components compare well with published imaged-source data, but that
the value of R_out obtained from Gaussian fits is sensitive to the
non-Gaussian contributions when realistic acceptance cuts are applied to the
models.
Comment: 5 pages, 2 figures
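A back-of-the-envelope sketch (all numbers invented) of why a Gaussian radius is sensitive to a non-Gaussian tail: take a one-dimensional source that is a Gaussian core plus an exponential tail from long-lived resonances, and compare the core radius with the Gaussian-equivalent radius obtained from the second moment. A real Gaussian fit weights the core more heavily, so this moment estimate only illustrates the direction of the effect.

```python
import math

# Hypothetical mixture: Gaussian core (R = 5 fm) plus an exponential
# resonance tail; the second moment gives a Gaussian-equivalent radius.
f_tail = 0.3      # fraction of pairs in the resonance tail (hypothetical)
R_core = 5.0      # Gaussian core radius, fm (hypothetical)
tau = 15.0        # exponential tail scale, fm (hypothetical)

# Variance of the mixture: a Gaussian contributes R^2; a symmetric
# exponential exp(-|r|/tau) contributes 2*tau^2.
var = (1 - f_tail) * R_core**2 + f_tail * 2 * tau**2
R_gauss_equiv = math.sqrt(var)

print(round(R_gauss_equiv, 2))  # considerably larger than R_core = 5 fm
```

Even a modest tail fraction more than doubles the moment-based radius, which is the sensitivity the abstract refers to.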
Kaon and Λ Production vs. Participants in Nuclear Collisions
Data on kaon and Λ production in nuclear collisions as a function of
centrality are analysed in both the AGS and SPS energy ranges. We compare the
results of several experiments, looking for a common trend in `participant
scaling' of the production yields. We find a smooth description of the scaled
kaon and Λ yields as a function of participant density. We also show the
participant-density dependence of kaons and Λ produced in the forward
hemisphere in proton-nucleus collisions.
Comment: Proceedings of the International Conference on Strangeness in Quark
Matter, 20-25 July 2000, Berkeley, CA. To appear in Journal of Physics G:
Nuclear and Particle Physics
Review of Systematic Investigations of the Rout/Rside ratio in HBT at RHIC
We review the significant difference between experiment and theory in the ratio R_out/R_side in heavy-ion collisions at RHIC. This ratio is expected to be strongly correlated with the pion emission duration. Hydrodynamic models typically calculate a value approximately equal to 1.5 and moderately dependent on k_T, whereas the experiments report a value close to unity and independent of k_T. We review those calculations in which systematic variations of the theoretical assumptions were reported. We find that the scenario of a second-order phase transition or cross-over has been given insufficient attention and may play an important role in resolving this discrepancy.
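The correlation with emission duration can be made concrete with the standard schematic relation R_out² ≈ R_side² + β_T²·Δτ², which holds only under simplifying assumptions (e.g. neglecting flow-induced space-time correlations). The numbers below are illustrative, not fits to data.

```python
import math

# Solve the schematic relation for the emission duration implied by a
# given R_out/R_side ratio. All input values are illustrative.
m_pi = 0.1396                                 # pion mass, GeV
p_T = 0.30                                    # transverse momentum, GeV/c (hypothetical)
beta_T = p_T / math.sqrt(p_T**2 + m_pi**2)    # pair transverse velocity

R_side = 4.0                                  # fm (hypothetical)
ratio = 1.5                                   # hydrodynamic-model-like R_out/R_side
R_out = ratio * R_side

# Emission duration in fm/c implied by the ratio:
dtau = math.sqrt(R_out**2 - R_side**2) / beta_T
print(round(dtau, 2))                         # ~5 fm/c; a ratio of 1 implies dtau = 0
```

A model ratio of 1.5 thus corresponds to an emission duration of several fm/c, while the experimental ratio near unity implies a very short duration, which is the puzzle the review addresses.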
LQCD Phase 1 Runs with P4RHMC
These results represent the first set of runs at 10 β values, ranging from 2000 to 7000 trajectories each, with the p4rhmc code. This initial run sequence spanned roughly two weeks in late January and early February 2007. Three scripts manage the submission of dependent jobs: subSet.pl submits a set of dependent jobs for a single run; rmSet.pl removes a set of dependent jobs in reverse order of submission; and statSet.pl runs the pstat command and prints parsed output along with directory contents. The results of running the statSet.pl command are printed for three different times during the start-up of the next sequence of runs using the milc code.
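The chaining behavior of a subSet.pl-style helper can be sketched as follows. The report does not give the script internals, so this Python sketch only mirrors the described behavior: each job is submitted with a dependency on the previous one (here via a `psub -d <jobid>`-style option); the job ids and script name are invented, and submission is stubbed out so the logic can be inspected without a batch system.

```python
# Dry-run sketch of dependent-job chaining: build the submission commands
# without actually calling the batch system.
def build_dependent_chain(script, n_jobs, first_jobid=1000):
    """Return the psub command lines for a chain of n_jobs dependent jobs."""
    cmds = []
    jobid = None
    for i in range(n_jobs):
        if jobid is None:
            cmds.append(f"psub {script}")             # first job: no dependency
        else:
            cmds.append(f"psub -d {jobid} {script}")  # wait for the previous job
        jobid = first_jobid + i   # stand-in for the id the batch system returns
    return cmds

cmds = build_dependent_chain("run_p4rhmc.csh", 3)
print("\n".join(cmds))
```

Removing such a chain in reverse order of submission, as rmSet.pl does, avoids triggering a dependent job when its predecessor is deleted.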
First LQCD Physics Runs with MILC and P4RHMC
An initial series of physics LQCD runs was submitted to the BG/L science bank with the milc and p4rhmc codes. Both runs were for lattice dimensions of 32^3 x 8. The p4 calculation was performed with v2.0 QMP_MPI.X (semi-optimized p4 code using qmp over mpi) and milc v7.2, also using RHMC but not specifically optimized for BlueGene. Calculations were performed along lines of constant physics, with the light quark masses 2-3 times their physical values and related to the strange quark mass by m_ud = 0.1 m_s. Job submission was performed using the standard milc and p4 scripts provided on the ubgl cluster. Initial thermalized lattices for each code were also provided in this way. The only modifications for running on BG/L were to the directory names and the mT parameter, which determines job durations (24 hrs on BG/L vs. 4 hrs on ubgl). The milc scripts were set to resubmit themselves 10 times, and the p4 scripts were submitted serially using the 'psub -d' job dependency option. The runp4rhmc.tcsh script could not be used to resubmit due to the 30-minute time limit imposed on interactive jobs. Most jobs were submitted to the smallest, 512-node partitions, but both codes could also run on the 1024-node partitions, with a gain of only 30-50%. The majority of jobs ran without error. Stalled jobs were often indicative of a communication gap within a partition, which LC was able to fix quickly. On some occasions a zero-length lattice file was deleted to allow jobs to restart successfully. Approximately 1000 trajectories were calculated for each beta value, see Table . The analysis was performed with the standard analysis scripts for each code, make_summary.pl for milc and analysis.tcsh for p4rhmc. All lattices, log files, and job submission scripts have been archived to permanent storage for subsequent analysis.
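The rough arithmetic behind the partition choice: doubling from 512 to 1024 nodes while gaining only 30-50% in throughput corresponds to a parallel efficiency of 65-75% on the larger partition, consistent with most jobs being run on the 512-node partitions.

```python
# Parallel efficiency implied by the quoted 30-50% gain when doubling
# the partition size from 512 to 1024 nodes.
for gain in (0.30, 0.50):
    speedup = 1.0 + gain        # throughput relative to the 512-node run
    efficiency = speedup / 2.0  # per-node efficiency at twice the nodes
    print(f"gain {gain:.0%}: efficiency {efficiency:.0%}")
```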
Constraining the initial temperature and shear viscosity in a hybrid hydrodynamic model of √s_NN = 200 GeV Au+Au collisions using pion spectra, elliptic flow, and femtoscopic radii
A new framework for evaluating hydrodynamic models of relativistic heavy ion
collisions has been developed. This framework, a Comprehensive Heavy Ion Model
Evaluation and Reporting Algorithm (CHIMERA), has been implemented by augmenting
the UVH2+1D viscous hydrodynamic model with eccentricity fluctuations,
pre-equilibrium flow, and the Ultra-relativistic Quantum Molecular Dynamics
(UrQMD) hadronic cascade. A range of initial temperatures and shear-viscosity-to-entropy
ratios was evaluated for four initial profiles, scaling with and without
pre-equilibrium flow. The model results were compared to pion spectra, elliptic
flow, and femtoscopic radii from 200 GeV Au+Au collisions for the 0-20%
centrality range. Two sets of initial density profiles, scaling with
pre-equilibrium flow and scaling without, were shown to provide a consistent
description of all three measurements.
Comment: 21 pages, 32 figures; version 3 includes additional text for
clarification, division of figures into more manageable units, and placement
of chi-squared values in tables for ease of viewing.
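The model-data comparison such a framework performs can be sketched as a chi-squared scan: for each grid point in (initial temperature, η/s), sum the chi-squared contributions from the three observables and keep the best point. Everything below (observable values, errors, the toy model, the grid) is invented for illustration; it is not the CHIMERA implementation.

```python
# Toy chi-squared grid scan over (T_i, eta/s) against three mock observables.
def chi2(model, data, err):
    return sum((m - d) ** 2 / e ** 2 for m, d, e in zip(model, data, err))

# Mock "data": pion spectra points, elliptic flow, femtoscopic radii.
data = {"spectra": [1.0, 0.8], "v2": [0.06], "radii": [5.0, 4.0, 6.0]}
err  = {"spectra": [0.1, 0.1], "v2": [0.005], "radii": [0.3, 0.3, 0.3]}

def model_prediction(T_i, eta_s):
    # Toy model: predictions drift linearly away from the data as the
    # parameters move off (T_i = 0.35, eta_s = 0.08); purely illustrative.
    dT, de = T_i - 0.35, eta_s - 0.08
    return {
        "spectra": [1.0 + dT, 0.8 - dT],
        "v2": [0.06 - 0.2 * de],
        "radii": [5.0 + 10 * dT, 4.0, 6.0 + 5 * de],
    }

best = None
for T_i in [0.30, 0.35, 0.40]:          # initial temperature grid, GeV (hypothetical)
    for eta_s in [0.08, 0.16, 0.24]:    # eta/s grid (hypothetical)
        pred = model_prediction(T_i, eta_s)
        total = sum(chi2(pred[k], data[k], err[k]) for k in data)
        if best is None or total < best[0]:
            best = (total, T_i, eta_s)

print(best)  # (chi2, T_i, eta/s) of the best-fitting grid point
```

Requiring a single parameter set to describe spectra, flow, and radii simultaneously is what gives the combined constraint described in the abstract.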
Lattice QCD Thermodynamics First 5000 Trajectories
These results represent the first LQCD analysis for approximately 5000 trajectories with each of the p4rhmc and milc codes, with some of the lower-temperature runs having fewer. Both runs were for lattice dimensions of 32^3 x 8. Some 32^4 T=0 jobs were also run for p4rhmc. The p4 calculation was performed with v2.0 QMP_MPI.X (semi-optimized p4 code using qmp over mpi) and the milc version of the su3_rhmc_susc_eos executable dated Mar 1, 2007, on ubgl in the /usr/gapps/hip/qcd/milc/bin subdirectory (svn revision 28). As with previous runs, calculations were performed along lines of constant physics, with the light quark masses 2-3 times their physical values and related to the strange quark mass by m_ud = 0.1 m_s. Job submissions were performed using a new subSet.pl job submission script that locates current jobs and submits additional jobs with the same beta value as pending. Note that after reaching a limit of about 35 jobs, subsequent submissions are delayed and will not be submitted directly from that state. The job submission script was used to submit revised versions of the milc and p4rhmc csh scripts. Initial thermalized lattices were also provided for milc (taken from the firstPhys runs), but the p4rhmc runs include thermalization. The only modifications for running on BG/L were to the directory names and the mT parameter, which determines job durations (24 hrs on BG/L vs. 4 hrs on ubgl). All finite-temperature jobs were submitted to the 512-node partitions, and all T=0 runs were submitted to 2048-node partitions. This set of runs was plagued by filesystem errors on lscratch1 and lscratch2. Many jobs had to be reset (deleting the most recent output file for milc and/or lattice for p4) and resubmitted. The analysis was performed with a new set of scripts that produce a more condensed output for scanning. All scans were verified with checksums, which have been retained in the output along with the line numbers.
All lattices, log files, and job submission scripts have been archived to permanent storage in the 5k subdirectory for subsequent analysis.
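The checksum bookkeeping described above can be sketched as follows. The report does not specify the checksum scheme, so md5 and the file contents here are assumptions; the point is only that recording a digest at scan time lets a later re-scan detect corruption, such as from the filesystem errors mentioned.

```python
import hashlib
import os
import tempfile

# Record a digest per scanned log file; a later scan that reproduces the
# same digest confirms the file is intact.
def file_digest(path, algo="md5"):
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

with tempfile.TemporaryDirectory() as d:
    log = os.path.join(d, "run_b3.60.log")     # hypothetical log file name
    with open(log, "w") as f:
        f.write("trajectory 1 plaquette 0.5432\n")   # invented log content
    recorded = file_digest(log)                # digest stored at scan time
    assert file_digest(log) == recorded        # re-scan matches: file intact

print(recorded)
```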
The BlueGene/L Supercomputer and Quantum ChromoDynamics
In summary, our update contains: (1) Perfect speedup, sustaining 19.3% of peak for the Wilson D-slash Dirac operator. (2) Measurements of the full Conjugate Gradient (CG) inverter that inverts the Dirac operator. The CG inverter contains two global sums over the entire machine; nevertheless, our measurements retain perfect speedup scaling, demonstrating the robustness of our methods. (3) We ran on the largest BG/L system, the LLNL 64-rack BG/L supercomputer, and obtained a sustained speed of 59.1 TFlops. Furthermore, the speedup scaling of the Dirac operator and of the CG inverter is perfect all the way up to the full size of the machine, 131,072 cores (please see Figure II). The local lattice is rather small (4 x 4 x 4 x 16), while the total lattice is of a size that has long been a lattice QCD vision for thermodynamic studies (a total of 128 x 128 x 256 x 32 lattice sites). This speed is about five times the speed we quoted in our submission. As we pointed out in our paper, QCD is notoriously sensitive to network and memory latencies, has a relatively high communication-to-computation ratio which cannot be overlapped on BG/L in virtual node mode, and as an application is in a class of its own. The above results are thrilling to us and realize a 30-year dream for lattice QCD.
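The quoted decomposition can be cross-checked directly: splitting the 128 x 128 x 256 x 32 global lattice into 4 x 4 x 4 x 16 local lattices should account for exactly the 131,072 cores of the full 64-rack machine.

```python
# Cross-check: number of local sub-lattices tiling the global lattice,
# one per core (figures taken from the summary above).
global_lat = (128, 128, 256, 32)
local_lat = (4, 4, 4, 16)

domains = 1
for g, l in zip(global_lat, local_lat):
    assert g % l == 0        # the local lattice must tile the global one
    domains *= g // l

print(domains)  # 131072, one domain per core
```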