16 research outputs found
A search for strange quark matter in the -0.75T field setting of E864
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Physics, 1998. Vita. Includes bibliographical references (p. 170-172). E864 was designed as a high-sensitivity search for strange quark matter produced in heavy ion collisions at the AGS accelerator at Brookhaven National Laboratory. In this thesis, we analyze data taken at the -0.75T field setting of the experiment, which is optimal for finding negatively charged strangelet candidates, from Au + Pt collisions at 11.6 GeV/c per nucleon. A measurement of the antiproton invariant multiplicities is made and is in reasonable agreement with previous E864 antiproton measurements. No conclusive strangelet candidates are found, and upper limits are set on the production of charge -1 and charge -2 strangelets of approximately 1 x 10^{-8} and 4 x 10^{-9} per 10% central collision, respectively. These represent the world's best limits to date. By Gene Edward Van Buren. Ph.D.
Negatively Charged Strangelet Search using the E864 Spectrometer at the AGS
We provide a status report on the progress of searching for negatively
charged strangelets using the E864 spectrometer at the AGS. About 200 million
recorded events representing approximately 14 billion 10% central interactions
of Au + Pt at 11.5 GeV/c taken during the 1996-1997 run of the experiment are
used in the analysis. No strangelet candidates are seen for charges Z=-1 and
Z=-2, corresponding to 90% confidence level upper limits on strangelet
production of ~1 x 10^{-8} and ~4 x 10^{-9} per central collision, respectively.
The limits are nearly uniform over a wide range of masses and are valid only
for strangelets that are stable or have lifetimes greater than ~50 ns. Comment: 6 pages, 4 figures; Talk at SQM'98, Padova, Italy (July 20-24, 1998)
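For orientation, a limit of this kind follows from simple counting statistics: with zero candidates observed, the 90% confidence level Poisson upper limit is 2.3 events, which is divided by the number of sampled central collisions times the overall acceptance and efficiency. A minimal Python sketch of that arithmetic, with the acceptance-times-efficiency factor chosen purely for illustration (it is not quoted in the abstract):

    # 90% CL upper limit per central collision, assuming zero observed candidates.
    n_upper_90cl = 2.3                     # Poisson 90% CL upper limit for zero observed events
    n_central_sampled = 14e9               # ~14 billion 10% central interactions sampled
    acceptance_efficiency = 0.016          # hypothetical overall acceptance x efficiency (illustrative only)
    limit = n_upper_90cl / (n_central_sampled * acceptance_efficiency)
    print(f"upper limit ~ {limit:.1e} per central collision")   # ~1e-8 for this illustrative choice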
Spectra and radial flow at RHIC with Tsallis statistics in a Blast-Wave description
We have implemented the Tsallis statistics in a Blast-Wave model and applied
it to mid-rapidity transverse-momentum spectra of identified particles measured
at RHIC. This new Tsallis Blast-Wave function fits the RHIC data very well for
transverse momenta up to about 3 GeV/c. We observed that the collective flow velocity starts from zero
in p+p and peripheral Au+Au collisions, growing to 0.470 ± 0.009 (stat.) in
central Au+Au collisions. The parameter (q-1), which characterizes the degree
of non-equilibrium in a system, decreases from p+p to
central Au+Au collisions, indicating an evolution from a
highly non-equilibrated system in p+p collisions toward an almost thermalized
system in central Au+Au collisions. The temperature and collective velocity are
well described by a quadratic dependence on (q-1). Two sets of parameters in
our Tsallis Blast-Wave model are required to describe the meson and baryon
groups separately in p+p collisions, while one set of parameters appears to fit
all spectra in central Au+Au collisions. Comment: 6 pages, 3 figures; updated text and references
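For readers who want to reproduce the shape of such a fit, the following is a minimal sketch of a Tsallis-type blast-wave spectrum at mid-rapidity. It assumes a linear transverse flow profile and omits the rapidity-smearing integral of the full formula used in the paper; the normalization is arbitrary and the parameter values are illustrative, not the fitted ones.

    import numpy as np
    from scipy.integrate import dblquad

    def tsallis_blast_wave(pT, mass, T, beta_s, q, R=1.0):
        # Simplified mid-rapidity Tsallis blast-wave shape (arbitrary normalization).
        # rho(r) = atanh(beta_s * r / R) is an assumed linear transverse flow profile.
        mT = np.sqrt(pT**2 + mass**2)
        def integrand(phi, r):
            rho = np.arctanh(beta_s * r / R)
            arg = 1.0 + (q - 1.0) / T * (mT * np.cosh(rho) - pT * np.sinh(rho) * np.cos(phi))
            return r * mT * arg ** (-1.0 / (q - 1.0))
        value, _ = dblquad(integrand, 0.0, R, 0.0, 2.0 * np.pi)
        return value

    # Example: pion spectrum shape for illustrative parameters (T in GeV, pT in GeV/c).
    for pT in (0.5, 1.0, 2.0, 3.0):
        print(pT, tsallis_blast_wave(pT, mass=0.139, T=0.12, beta_s=0.5, q=1.02))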
Improving the dE/dx calibration of the STAR TPC for the high-pT hadron identification
We derive a method to improve particle identification (PID) at high
transverse momentum (pT) using the relativistic rise of the ionization
energy loss (dE/dx) when charged particles traverse the Time Projection
Chamber (TPC) at STAR. Electrons triggered and identified by the Barrel
Electro-Magnetic Calorimeter (BEMC), together with pure protons and pions from Λ (Λ̄) and K0S
decays, are used to obtain the dE/dx value and
its width at a given βγ. We found that the measured dE/dx can deviate
from the Bichsel function in the p+p
collision data at √s = 200 GeV taken and subsequently calibrated in
2005. The deviation is approximately a function of βγ alone, independent of
particle species, and can be described by a common parametrization in βγ. The deviations obtained with this method are used to
re-calibrate the p+p data sample for physics analyses of
identified hadron spectra and their correlations up to a transverse momentum of
15 GeV/c. The ratio of electrons to positrons (dominantly from γ-conversion) is
also used to correct the residual asymmetry between negatively and positively
charged hadrons due to momentum distortion in the STAR TPC. Comment: 18 pages, 10 figures
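The procedure can be summarized as follows: tracks of known species fix βγ = p/m, the measured dE/dx is compared with the expected Bichsel mean, and the deviation is fit as a smooth function of βγ and then applied as a correction when recomputing the PID variables. A minimal Python sketch under these assumptions, where bichsel_mean and the polynomial form in log10(βγ) are stand-ins for STAR's internal parametrization:

    import numpy as np

    def fit_dedx_correction(p, mass, dedx_measured, bichsel_mean, deg=4):
        # Tracks of known species (BEMC electrons, protons/pions from weak decays)
        # fix beta*gamma = p/m; fit the relative deviation of the measured dE/dx
        # from the expected Bichsel mean as a polynomial in log10(beta*gamma).
        betagamma = p / mass
        expected = bichsel_mean(betagamma)
        deviation = (dedx_measured - expected) / expected
        coeffs = np.polyfit(np.log10(betagamma), deviation, deg)
        return np.poly1d(coeffs)

    def corrected_nsigma(dedx, dedx_width, p, mass_hypothesis, bichsel_mean, correction):
        # Recompute the n-sigma PID variable under a mass hypothesis after shifting
        # the expected mean by the fitted deviation (illustrative interface only).
        betagamma = p / mass_hypothesis
        expected = bichsel_mean(betagamma) * (1.0 + correction(np.log10(betagamma)))
        return (dedx - expected) / dedx_width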
The Habitable Exoplanet Observatory (HabEx) Mission Concept Study Final Report
The Habitable Exoplanet Observatory, or HabEx, has been designed to be the
Great Observatory of the 2030s. For the first time in human history,
technologies have matured sufficiently to enable an affordable space-based
telescope mission capable of discovering and characterizing Earthlike planets
orbiting nearby bright sunlike stars in order to search for signs of
habitability and biosignatures. Such a mission can also be equipped with
instrumentation that will enable broad and exciting general astrophysics and
planetary science not possible from current or planned facilities. HabEx is a
space telescope with unique imaging and multi-object spectroscopic capabilities
at wavelengths ranging from ultraviolet (UV) to near-IR. These capabilities
allow for a broad suite of compelling science that cuts across the entire NASA
astrophysics portfolio. HabEx has three primary science goals: (1) Seek out
nearby worlds and explore their habitability; (2) Map out nearby planetary
systems and understand the diversity of the worlds they contain; (3) Enable new
explorations of astrophysical systems from our own solar system to external
galaxies by extending our reach in the UV through near-IR. This Great
Observatory science will be selected through a competed GO program, and will
account for about 50% of the HabEx primary mission. The preferred HabEx
architecture is a 4m, monolithic, off-axis telescope that is
diffraction-limited at 0.4 microns and is in an L2 orbit. HabEx employs two
starlight suppression systems: a coronagraph and a starshade, each with its
own dedicated instrument. Comment: Full report: 498 pages. Executive Summary: 14 pages. More information
about HabEx can be found here: https://www.jpl.nasa.gov/habex
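As a quick check on the quoted optical specification, a 4 m aperture that is diffraction-limited at 0.4 microns corresponds to an angular resolution of roughly 25 milliarcseconds at the Rayleigh criterion:

    import math

    wavelength = 0.4e-6                       # m, diffraction-limit wavelength quoted above
    aperture = 4.0                            # m, primary mirror diameter
    theta_rad = 1.22 * wavelength / aperture  # Rayleigh criterion
    theta_mas = theta_rad * (180.0 / math.pi) * 3600.0 * 1000.0
    print(f"~{theta_mas:.0f} mas")            # ~25 milliarcseconds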
Dealing with High Background Rates in the STAR Heavy Flavor Tracker in Simulation: Embedding Simulation into Real Events
The STAR Heavy Flavor Tracker (HFT) has enabled a rich physics program, providing important insights into heavy quark behavior in heavy ion collisions. Acquiring data during the 2014 through 2016 runs at the Relativistic Heavy Ion Collider (RHIC), the HFT consisted of four layers of precision silicon sensors. Used in concert with the Time Projection Chamber (TPC), the HFT enables the reconstruction and topological identification of tracks arising from charmed hadron decays. The ultimate understanding of the detector efficiency and resolution demands large quantities of high-quality simulations, accounting for the precise alignment of sensors and the detailed response of the detectors and electronics to the incident tracks. The background environment presented additional challenges, as simulating the significant rates from pileup events accumulated during the long integration times of the tracking detectors could have quickly exceeded the available computational resources, and the relative contributions from different sources were unknown. STAR has long addressed these issues by embedding simulations into background events directly sampled during data taking at the experiment. This technique has the advantage of providing a completely realistic picture of the dynamic background environment while introducing minimal additional computational overhead compared to simulation of the primary collision alone, thus scaling to any luminosity. We will discuss how STAR has applied this technique to the simulation of the HFT, and will show how the careful consideration of misalignment of precision detectors and calibration uncertainties results in the detailed reproduction of basic observables, such as track projection to the primary vertex. We will further summarize the experience and lessons learned in applying these techniques to heavy-flavor simulations and discuss recent results.
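Schematically, the embedding technique works as follows: the detector response is simulated for the signal particles only, the resulting hits are overlaid on the hits of a real event sampled during data taking, and the standard reconstruction runs on the combined event so that efficiencies are evaluated in a realistic background environment. A minimal Python sketch with hypothetical interfaces (simulate_response, reconstruct, and match are placeholders, not STAR software):

    def embedding_efficiency(mc_particles, real_event_hits,
                             simulate_response, reconstruct, match):
        # Simulate the detector response for the embedded MC particles only, so the
        # cost scales with the signal rather than with the background luminosity.
        sim_hits = simulate_response(mc_particles)
        # Overlay the simulated hits on hits recorded in a real event, inheriting the
        # true pileup and background environment at no extra simulation cost.
        combined_hits = list(real_event_hits) + list(sim_hits)
        # Run the standard reconstruction on the combined event and match the
        # reconstructed tracks back to the embedded MC particles.
        reco_tracks = reconstruct(combined_hits)
        matched = match(reco_tracks, mc_particles)
        return len(matched) / len(mc_particles)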