Applications of a New Proposal for Solving the "Problem of Time" to Some Simple Quantum Cosmological Models
We apply a recent proposal for defining states and observables in quantum gravity to simple models. First, we consider a Klein-Gordon particle in an external potential in Minkowski space and compare our proposal to the theory obtained by deparametrizing with respect to a time slicing prior to quantization. We show explicitly that the dynamics of the deparametrization approach depends on the time slicing. Our proposal yields a dynamics independent of the choice of time slicing at intermediate times, but after the potential is turned off, the dynamics does not return to the free-particle dynamics. Next we apply our proposal to the closed Robertson-Walker quantum cosmology with a massless scalar field, taking the size of the universe as our time variable, so the only dynamical variable is the scalar field. We show that the resulting theory exhibits semi-classical behavior up to the classical turning point from expansion to contraction: given a classical solution which expands for much longer than the Planck time, there is a quantum state whose dynamical evolution closely approximates this classical solution during the expansion. However, when the "time" gets larger than the classical maximum, the scalar field becomes "frozen" at its value at the maximum expansion. We also obtain similar results in the Taub model. In an Appendix we derive the form of the Wheeler-DeWitt equation for the Bianchi models by performing a proper quantum reduction of the momentum constraints; this equation differs from the usual one obtained by solving the momentum constraints classically, prior to quantization.
Comment: 30 pages, LaTeX, 3 figures (postscript file or hard copy) available upon request, BUTP-94/1
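As a hedged illustration of the object derived in the Appendix, the Wheeler-DeWitt equation on a minisuperspace with coordinates q^A takes the following schematic form; this is the generic textbook shape only, not the paper's actual reduced equation, whose operator ordering and potential depend on the Bianchi class:

```latex
% Schematic minisuperspace Wheeler-DeWitt equation (illustrative only):
% G^{AB}(q) is the DeWitt metric on minisuperspace and U(q) the
% minisuperspace potential; the operator ordering and the precise
% form of U are model-dependent.
\left[ -\tfrac{1}{2}\, G^{AB}(q)\,
       \frac{\partial^{2}}{\partial q^{A}\,\partial q^{B}}
       + U(q) \right] \Psi(q) = 0
```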
Mining metrics for buried treasure
The same but different: That might describe two metrics. On the surface
CLASSI may show two metrics are locally equivalent, but buried beneath one may
be a wealth of further structure. This was beautifully described in a paper by
M.A.H. MacCallum in 1998. Here I will illustrate the effect with two flat
metrics -- one describing ordinary Minkowski spacetime and the other describing
a three-parameter family of Gal'tsov-Letelier-Tod spacetimes. I will dig out
the beautiful hidden classical singularity structure of the latter (a structure
first noticed by Tod in 1994) and then show how quantum considerations can
illuminate the riches. I will then discuss how quantum structure can help us
understand classical singularities and metric parameters in a variety of exact
solutions mined from the Exact Solutions book.
Comment: 16 pages, no figures, minor grammatical changes, submitted to Proceedings of the Malcolm@60 Conference (London, July 2004)
Generalized Quantum Theory of Recollapsing Homogeneous Cosmologies
A sum-over-histories generalized quantum theory is developed for homogeneous
minisuperspace type A Bianchi cosmological models, focussing on the particular
example of the classically recollapsing Bianchi IX universe. The decoherence
functional for such universes is exhibited. We show how the probabilities of
decoherent sets of alternative, coarse-grained histories of these model
universes can be calculated. We consider in particular the probabilities for
classical evolution defined by a suitable coarse-graining. For a restricted
class of initial conditions and coarse grainings we exhibit the approximate
decoherence of alternative histories in which the universe behaves classically
and those in which it does not. For these situations we show that the
probability is near unity for the universe to recontract classically if it
expands classically. We also determine the relative probabilities of
quasi-classical trajectories for initial states of WKB form, recovering for
such states a precise form of the familiar heuristic "J d\Sigma" rule of
quantum cosmology, as well as a generalization of this rule to generic initial
states.
Comment: 41 pages, 4 eps figures, revtex 4. Modest revisions throughout. Physics unchanged. To appear in Phys. Rev.
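As a hedged reminder of the heuristic rule the abstract refers to, the "J dΣ" prescription, standard in the quantum-cosmology literature (this is the generic form, not necessarily the precise version derived in the paper), assigns relative probabilities from the flux of a conserved Klein-Gordon-type current through a hypersurface Σ in minisuperspace:

```latex
% Heuristic "J dSigma" rule (illustrative; sign and normalization
% conventions vary between authors). For a WKB state \Psi, the
% conserved current on minisuperspace is
j^{A} = \tfrac{i}{2}\, G^{AB}
        \left( \Psi\,\partial_{B}\Psi^{*} - \Psi^{*}\,\partial_{B}\Psi \right),
% and the relative probability assigned to a region R of a
% hypersurface \Sigma is its flux through that region:
P(R) \;\propto\; \int_{R} j^{A}\, d\Sigma_{A}.
```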
Exact Hypersurface-Homogeneous Solutions in Cosmology and Astrophysics
A framework is introduced which explains the existence and similarities of
most exact solutions of the Einstein equations with a wide range of sources for
the class of hypersurface-homogeneous spacetimes which admit a Hamiltonian
formulation. This class includes the spatially homogeneous cosmological models
and the astrophysically interesting static spherically symmetric models as well
as the stationary cylindrically symmetric models. The framework involves
methods for finding and exploiting hidden symmetries and invariant submanifolds
of the Hamiltonian formulation of the field equations. It unifies, simplifies
and extends most known work on hypersurface-homogeneous exact solutions. It is
shown that the same framework is also relevant to gravitational theories with a
similar structure, like Brans-Dicke or higher-dimensional theories.
Comment: 41 pages, REVTEX/LaTeX 2.09 file (don't use LaTeX2e!!!). Accepted for publication in Phys. Rev.
Can induced gravity isotropize Bianchi I, V, or IX Universes?
We analyze if Bianchi I, V, and IX models in the Induced Gravity (IG) theory
can evolve to a Friedmann--Roberson--Walker (FRW) expansion due to the
non--minimal coupling of gravity and the scalar field. The analytical results
that we found for the Brans-Dicke (BD) theory are now applied to the IG theory
which has ( being the square ratio of the Higgs to
Planck mass) in a cosmological era in which the IG--potential is not
significant. We find that the isotropization mechanism crucially depends on the
value of . Its smallness also permits inflationary solutions. For the
Bianch V model inflation due to the Higgs potential takes place afterwads, and
subsequently the spontaneous symmetry breaking (SSB) ends with an effective FRW
evolution. The ordinary tests of successful cosmology are well satisfied.Comment: 24 pages, 5 figures, to be published in Phys. Rev. D1
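For context, a hedged sketch of the kind of induced-gravity action involved, in which a non-minimal φ²R coupling replaces the Einstein-Hilbert term; signs, the value of the dimensionless coupling ε, and the potential V(φ) are convention-dependent and not necessarily those of this paper:

```latex
% Typical induced-gravity action (illustrative; conventions vary):
% \epsilon is the dimensionless non-minimal coupling and V(\phi)
% the Higgs-type symmetry-breaking potential.
S = \int d^{4}x \, \sqrt{-g}
    \left[ \frac{\epsilon}{2}\, \phi^{2} R
         - \frac{1}{2}\, g^{\mu\nu}\, \partial_{\mu}\phi\, \partial_{\nu}\phi
         - V(\phi) \right]
```

Gravity is "induced" in the sense that an effective Planck mass arises from the scalar field's vacuum expectation value after spontaneous symmetry breaking.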
Computational analysis of protein sequence and structure
SIGLE. Available from British Library Document Supply Centre (DSC:DXN024988 / BLDSC), United Kingdom
The development of the graphics-decoding proficiency instrument
The Graphics-Decoding Proficiency (G-DP) instrument was developed as a screening test for the purpose of measuring students’ (aged 8-11 years) capacity to solve graphics-based mathematics tasks. These tasks include number lines, column graphs, maps and pie charts. The instrument was developed within a theoretical framework which highlights the various types of information graphics commonly presented to students in large-scale national and international assessments. The instrument provides researchers, classroom teachers and test designers with an assessment tool which measures students’ graphics-decoding proficiency across and within five broad categories of information graphics. The instrument has implications for a number of stakeholders in an era where graphics have become an increasingly important way of representing information.