Spacetime Coarse Grainings in the Decoherent Histories Approach to Quantum Theory
We investigate the possibility of assigning consistent probabilities to sets
of histories characterized by whether they enter a particular subspace of the
Hilbert space of a closed system during a given time interval. In particular,
we investigate the case in which this subspace is a region of configuration space.
This corresponds to a particular class of coarse grainings of spacetime
regions. We consider the arrival time problem and the problem of time in
reparametrization-invariant theories, such as canonical quantum
gravity. Decoherence conditions and probabilities for these applications are
derived. The resulting decoherence condition does not depend on the explicit
form of the restricted propagator, a dependence that has been problematic for
generalizations such as applications in quantum cosmology. Closely related are
the problem of tunnelling time and the quantum Zeno effect. Some
interpretational comments conclude, and we discuss the applicability of this
formalism to the arrival time problem.
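For orientation, the standard decoherent-histories quantities this abstract refers to can be sketched as follows. This is textbook background in generic notation, not the paper's specific spacetime coarse graining:

```latex
% Class operator for a history \alpha = (\alpha_1, \dots, \alpha_n):
% Heisenberg-picture projections applied at times t_1 < \dots < t_n
C_\alpha = P_{\alpha_n}(t_n) \cdots P_{\alpha_1}(t_1),
\qquad
P_{\alpha_k}(t_k) = e^{iHt_k/\hbar}\, P_{\alpha_k}\, e^{-iHt_k/\hbar}.

% Decoherence functional for an initial state \rho, the decoherence
% (consistency) condition, and the probabilities it licenses
D(\alpha,\alpha') = \mathrm{Tr}\!\left[C_\alpha\, \rho\, C_{\alpha'}^{\dagger}\right],
\qquad
D(\alpha,\alpha') \approx 0 \ \ (\alpha \neq \alpha'),
\qquad
p(\alpha) = D(\alpha,\alpha).
```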
Consistent histories, the quantum Zeno effect, and time of arrival
We present a decomposition of the general quantum mechanical evolution
operator that corresponds to the path decomposition expansion, and interpret
its constituents in terms of the quantum Zeno effect (QZE). This decomposition
is applied to a finite dimensional example and to the case of a free particle
on the real line, where the possibility of boundary conditions more general
than those hitherto considered in the literature is shown. We reinterpret the
assignment of consistent probabilities to different regions of spacetime in
terms of the QZE. Comparing the consistent-histories approach to the time of
arrival problem with the solution provided by Kijowski's probability
distribution shows the strength of the latter point of view.
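For reference, the Zeno limit that underlies this reinterpretation is the standard Misra-Sudarshan result, sketched here in generic notation (it holds under suitable technical conditions on H and the projector P):

```latex
% N projective checks with projector P during [0, t] confine the
% evolution to the subspace P\mathcal{H} as N grows
\lim_{N\to\infty} \left( P\, e^{-iHt/(N\hbar)}\, P \right)^{N}
  = P\, e^{-i\, P H P\, t/\hbar}.
```

The right-hand side is the restricted (Zeno) propagator generated by PHP, which is how the QZE connects to the restricted propagators appearing in the path decomposition expansion.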
Time-of-arrival distributions from position-momentum and energy-time joint measurements
The position-momentum quasi-distribution obtained from an Arthurs and Kelly
joint measurement model is used to obtain indirectly an "operational"
time-of-arrival (TOA) distribution following a quantization procedure proposed
by Kochański and Wódkiewicz [Phys. Rev. A 60, 2689 (1999)]. This TOA
distribution is not time covariant. The procedure is generalized by using other
phase-space quasi-distributions, and sufficient conditions are provided for
time covariance that limit the possible phase-space quasi-distributions
essentially to the Wigner function, which, however, provides a non-positive TOA
quasi-distribution. These problems are remedied with a different quantization
procedure which, on the other hand, does not guarantee normalization. Finally,
an Arthurs and Kelly measurement model for TOA and energy (valid also for
arbitrary conjugate variables when one of the variables is bounded from below)
is worked out. The marginal TOA distribution so obtained, a distorted version
of Kijowski's distribution, is time covariant, positive, and normalized.
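For reference, the time-covariance property at issue, and one standard form of Kijowski's distribution for a free particle of mass m with right-moving momentum amplitude ψ̃(k) arriving at point x (editor's notation, stated for orientation):

```latex
% Time covariance: evolving the state by t_0 shifts the TOA distribution
\Pi\big(t;\, e^{-iHt_0/\hbar}\psi\big) = \Pi(t+t_0;\,\psi).

% Kijowski's arrival-time distribution at x for right-movers,
% normalized on the right-moving sector
\Pi_K(t) = \frac{\hbar}{2\pi m}
  \left|\int_0^\infty dk\,\sqrt{k}\,\tilde\psi(k)\,
  e^{\,ikx - i\hbar k^2 t/(2m)}\right|^2,
\qquad
\int_{-\infty}^{\infty} \Pi_K(t)\,dt = \int_0^\infty |\tilde\psi(k)|^2\,dk.
```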
Spontaneous Creation of Inflationary Universes and the Cosmic Landscape
We study some gravitational instanton solutions that offer a natural
realization of the spontaneous creation of inflationary universes in the brane
world context in string theory. Decoherence due to couplings of higher
(perturbative) modes of the metric as well as matter fields modifies the
Hartle-Hawking wavefunction for de Sitter space. Generalizing this new
wavefunction to string theory, we propose a principle that we hope will lead
us to the particular vacuum we live in, thus avoiding the anthropic principle.
As an illustration of this idea, we give a
phenomenological analysis of the probability of quantum tunneling to various
stringy vacua. We find that the preferred tunneling is to an inflationary
universe (like our early universe), not to a universe with a very small
cosmological constant (i.e., like today's universe) and not to a 10-dimensional
uncompactified de Sitter universe. Such preferred solutions are interesting as
they offer a cosmological mechanism for the stabilization of extra dimensions
during the inflationary epoch.
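For context, the unmodified semiclassical nucleation probabilities usually contrasted in this setting are the following (units with ħ = c = 1; sign and prefactor conventions vary across the literature, and these are the standard results, not the decoherence-corrected wavefunction of this paper):

```latex
% Nucleation probabilities for a de Sitter vacuum of energy density \rho_\Lambda
P_{\mathrm{HH}} \propto \exp\!\Big(+\frac{3}{8 G^2 \rho_\Lambda}\Big)
\ \ \text{(Hartle--Hawking)},
\qquad
P_{\mathrm{T}} \propto \exp\!\Big(-\frac{3}{8 G^2 \rho_\Lambda}\Big)
\ \ \text{(tunneling)}.
```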
SUSY Breaking and Moduli Stabilization from Fluxes in Gauged 6D Supergravity
We construct the 4D N=1 supergravity which describes the low-energy limit of
6D supergravity compactified on a sphere with a monopole background a la Salam
and Sezgin. This provides a simple setting sharing the main properties of
realistic string compactifications such as flat 4D spacetime, chiral fermions
and N=1 supersymmetry as well as Fayet-Iliopoulos terms induced by the
Green-Schwarz mechanism. The matter content of the resulting theory is a
supersymmetric SO(3)xU(1) gauge model with two chiral multiplets, S and T. The
expectation value of T is fixed by the classical potential, and S describes a
flat direction to all orders in perturbation theory. We consider possible
perturbative corrections to the Kähler potential in inverse powers of ⟨S⟩
and ⟨T⟩, and find that under certain circumstances, and when taken together
with low-energy gaugino condensation, these can lift the degeneracy of the
flat direction for ⟨S⟩. The resulting vacuum breaks supersymmetry at moderately
low energies in comparison with the compactification scale, with positive
cosmological constant. It is argued that the 6D model might itself be obtained
from string compactifications, giving rise to realistic string
compactifications on non-Ricci-flat manifolds. Possible phenomenological and
cosmological applications are briefly discussed.
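As a schematic illustration of the stabilization mechanism described above, in generic 4D N=1 notation (the coefficients c_n, A, a, W_0 and the -2 ln(T+T̄) normalization are illustrative assumptions, not the paper's results):

```latex
% Tree-level Kahler potential for the moduli S and T, with assumed
% perturbative corrections in inverse powers, plus a gaugino
% condensation superpotential along the classically flat direction S
K = -\ln(S+\bar S) - 2\ln(T+\bar T)
    + \sum_{n\ge 1} \frac{c_n}{(S+\bar S)^n} + \cdots,
\qquad
W = W_0 + A\, e^{-aS}.

% F-term potential whose minimum can fix Re S once the corrections
% and the condensate are both included
V = e^{K}\!\left( K^{i\bar\jmath}\, D_i W\, \overline{D_j W} - 3|W|^2 \right),
\qquad
D_i W = \partial_i W + (\partial_i K)\, W.
```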
The PHENIX Experiment at RHIC
The physics emphases of the PHENIX collaboration and the design and current
status of the PHENIX detector are discussed. The plan of the collaboration for
making the most effective use of the available luminosity in the first years of
RHIC operation is also presented. Further details of the PHENIX physics
program are available at http://www.rhic.bnl.gov/phenix
On the modeling of the 2010 Gulf of Mexico Oil Spill
► Two oil particle trajectory forecasting systems for the 2010 Deepwater Horizon Oil Spill in the Gulf of Mexico are presented.
► A Monte Carlo method was used to model oil removal processes.
► Results were sensitive to initial conditions.
► Data-assimilative models produced the most accurate trajectories.
► About 25% of the oil remains in the water column, and most of the oil is below 800 m, after three months of simulation.

Two oil particle trajectory forecasting systems were developed and applied to the 2010 Deepwater Horizon Oil Spill in the Gulf of Mexico. Both systems use ocean current fields from high-resolution numerical ocean circulation model simulations, Lagrangian stochastic models to represent unresolved sub-grid scale variability in the advection of oil particles, and Monte Carlo-based schemes for representing uncertain biochemical and physical processes. The first system assumes two-dimensional particle motion at the ocean surface, a single oil state, and particle removal modeled as a Monte Carlo process parameterized by a single removal rate. Oil particles are seeded using both initial conditions based on observations and particles released at the location of the Macondo well. The initial conditions (ICs) of oil particle location for the two-dimensional surface oil trajectory forecasts are based on a fusion of all available information, including satellite-based analyses. The resulting oil map is digitized into a shape file, within which polygon-filling software generates longitudes and latitudes with a particle density that varies with the amount of oil present in the observations for the IC. The more complex system assumes three oil states (light, medium, heavy), each with a different removal rate in the Monte Carlo process, three-dimensional particle motion, and a particle size-dependent oil mixing model.

Simulations from the two-dimensional forecast system produced results that qualitatively agreed with the uncertain "truth" fields. These simulations validated the use of our Monte Carlo scheme for representing oil removal by evaporation and other weathering processes. Eulerian velocity fields from data-assimilative models produced better particle trajectory distributions than a free-running model with no data assimilation. Monte Carlo simulations of the three-dimensional oil particle trajectories were run with ensembles generated by perturbing the size of the oil particles and the fraction in a given size range released at depth, the two largest unknowns in this problem. Thirty-six realizations of the model were run with only subsurface oil releases. Averaging these results indicates that after three months, about 25% of the oil remains in the water column and most of the oil is below 800 m.
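A minimal sketch of the kind of two-dimensional surface scheme described above: particles are advected by a prescribed current field, a random-walk term stands in for unresolved sub-grid variability, and weathering is a Monte Carlo removal at a single rate. The velocity field, parameter values, and removal rate are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def current(x, y, t):
    """Stand-in for an Eulerian velocity field interpolated from an
    ocean-model simulation; a steady analytic gyre is used here."""
    return 0.1 * np.sin(y), -0.1 * np.sin(x)

def step(x, y, t, dt, kh=1e-3, removal_rate=0.03):
    """One step per particle: advection by the current field, a random
    walk for unresolved sub-grid variability, then Monte Carlo removal
    (weathering) at a single exponential rate."""
    u, v = current(x, y, t)
    sigma = np.sqrt(2.0 * kh * dt)          # random-walk step size
    x = x + u * dt + sigma * rng.standard_normal(x.size)
    y = y + v * dt + sigma * rng.standard_normal(y.size)
    keep = rng.random(x.size) > removal_rate * dt   # P(removed) ~ rate*dt
    return x[keep], y[keep]

# Seed all particles at one release point (cf. the well location) and run.
n0 = 10_000
x, y = np.zeros(n0), np.zeros(n0)
t, dt = 0.0, 0.1
for _ in range(1000):
    x, y = step(x, y, t, dt)
    t += dt
print(f"{x.size} of {n0} particles remain at t = {t:.0f}")
```

With these toy numbers the expected survival fraction after the run is exp(-removal_rate * t) ≈ 5%, which is the single-rate exponential removal the first system parameterizes; the paper's more complex system replaces this with three oil states, each with its own rate.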