A reconstruction of the initial conditions of the Universe by optimal mass transportation
Reconstructing the density fluctuations in the early Universe that evolved
into the distribution of galaxies we see today is a challenge of modern
cosmology [ref.]. An accurate reconstruction would allow us to test
cosmological models by simulating the evolution starting from the reconstructed
state and comparing it to the observations. Several reconstruction techniques
have been proposed [8 refs.], but they all suffer from lack of uniqueness
because the velocities of galaxies are usually not known. Here we show that
reconstruction can be reduced to a well-determined problem of optimisation, and
present a specific algorithm that provides excellent agreement when tested
against data from N-body simulations. By applying our algorithm to the new
redshift surveys now under way [ref.], we will be able to recover reliably the
properties of the primeval fluctuation field of the local Universe and to
determine accurately the peculiar velocities (deviations from the Hubble
expansion) and the true positions of many more galaxies than is feasible by any
other method.
A version of the paper with higher-quality figures is available at
http://www.obs-nice.fr/etc7/nature.pdf
Comment: LaTeX, 4 pages, 3 figures
Henri Poincaré: The Status of Mechanical Explanations and the Foundations of Statistical Mechanics
The first goal of this paper is to show the evolution of Poincaré's opinion on the mechanistic reduction of the principles of thermodynamics, placing it in the context of the science of his time. The second is to present some of his work in 1890 on the foundations of statistical mechanics. He became interested first in thermodynamics and its relation with mechanics, drawing on the work of Helmholtz on monocyclic systems. After a period of skepticism concerning the kinetic theory, he read some of Maxwell's memoirs and contributed to the foundations of statistical mechanics. I also show that Poincaré's contributions to the foundations of statistical mechanics are closely linked to his work in celestial mechanics and to his interest in probability theory and its role in physics.
Historical roots of gauge invariance
Gauge invariance is the basis of the modern theory of electroweak and strong
interactions (the so called Standard Model). The roots of gauge invariance go
back to the year 1820 when electromagnetism was discovered and the first
electrodynamic theory was proposed. Subsequent developments led to the
discovery that different forms of the vector potential result in the same
observable forces. The partial arbitrariness of the vector potential A brought
forth various restrictions on it. The condition div A = 0 was proposed by
J. C. Maxwell; 4-div A = 0 was proposed by L. V. Lorenz in the mid-1860s. In most of the
modern texts the latter condition is attributed to H. A. Lorentz, who half a
century later was one of the key figures in the final formulation of classical
electrodynamics. In 1926 a relativistic quantum-mechanical equation for charged
spinless particles was formulated by E. Schrodinger, O. Klein, and V. Fock. The
latter discovered that this equation is invariant with respect to
multiplication of the wave function by a phase factor exp(ieX/hc) with the
accompanying additions to the scalar potential of -dX/cdt and to the vector
potential of grad X. In 1929 H. Weyl proclaimed this invariance as a general
principle and called it Eichinvarianz in German and gauge invariance in
English. The present era of non-abelian gauge theories started in 1954 with the
paper by C. N. Yang and R. L. Mills.
Comment: final-final, 34 pages, 1 figure, 106 references (one added with footnote since v.2); to appear in July 2001 Rev. Mod. Phys.
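The phase transformation described in the abstract can be written compactly (a sketch in Gaussian units, writing χ for the abstract's gauge function X; the terms match the stated additions to the scalar and vector potentials):

```latex
\psi \;\to\; e^{ie\chi/\hbar c}\,\psi,
\qquad
\mathbf{A} \;\to\; \mathbf{A} + \nabla\chi,
\qquad
\varphi \;\to\; \varphi - \frac{1}{c}\,\frac{\partial\chi}{\partial t}
```

The physical fields E and B, and hence the Klein-Gordon equation with minimal coupling, are unchanged under this simultaneous substitution, which is the invariance Fock discovered and Weyl later elevated to a general principle.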
Reconstruction of the early Universe as a convex optimization problem
We show that the deterministic past history of the Universe can be uniquely
reconstructed from the knowledge of the present mass density field, the latter
being inferred from the 3D distribution of luminous matter, assumed to be
tracing the distribution of dark matter up to a known bias. Reconstruction
ceases to be unique below those scales -- a few Mpc -- where multi-streaming
becomes significant. Above 6 Mpc/h we propose and implement an effective
Monge-Ampère-Kantorovich method of unique reconstruction. At such scales the
Zel'dovich approximation is well satisfied and reconstruction becomes an
instance of optimal mass transportation, a problem which goes back to Monge
(1781). After discretization into N point masses one obtains an assignment
problem that can be handled by effective algorithms with not more than cubic
time complexity in N and reasonable CPU time requirements. Testing against
N-body cosmological simulations yields exact reconstruction for over 60% of the points.
We apply several interrelated tools from optimization theory that were not
used in cosmological reconstruction before, such as the Monge-Ampère equation,
its relation to the mass transportation problem, the Kantorovich duality and
the auction algorithm for optimal assignment. Self-contained discussion of
relevant notions and techniques is provided.
Comment: 26 pages, 14 figures; accepted to MNRAS. Version 2: numerous minor clarifications in the text, additional material on the history of the Monge-Ampère equation, improved description of the auction algorithm, updated bibliography. Version 3: several misprints corrected.
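The discrete step described in the abstract, i.e. pairing N present-day point masses with N initial positions so that the total quadratic transport cost is minimal, can be sketched as follows. This is a toy illustration on synthetic data, not the paper's implementation: it uses SciPy's Hungarian-type solver `linear_sum_assignment` in place of the auction algorithm the authors employ, and the grid size and perturbation scale are arbitrary choices.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
N = 16

# Initial positions q_j: a uniform 4x4 grid on the unit square
# (the nearly homogeneous early Universe).
g = (np.arange(4) + 0.5) / 4
q = np.array([(a, b) for a in g for b in g])

# Present-day positions x_i: grid points plus small synthetic
# "displacements" standing in for gravitational evolution.
x = q + 0.05 * rng.standard_normal((N, 2))

# Quadratic cost matrix C[i, j] = |x_i - q_j|^2, the cost the
# Monge-Ampère-Kantorovich approach minimises.
C = ((x[:, None, :] - q[None, :, :]) ** 2).sum(axis=-1)

# Optimal assignment: present-day point i -> initial position cols[i].
rows, cols = linear_sum_assignment(C)
total_cost = C[rows, cols].sum()
print(total_cost)
```

Because the optimal pairing is a permutation, its total cost can never exceed that of any fixed guess (for instance mapping each point to the grid site it started from), which is the sense in which the discretised reconstruction becomes a well-posed assignment problem.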