Skeleton as a probe of the cosmic web: the 2D case
We discuss the skeleton as a probe of the filamentary structures of a 2D
random field. It can be defined for a smooth field as the ensemble of pairs of
field lines departing from saddle points, initially aligned with the major axis
of local curvature and connecting them to local maxima. This definition is thus
non-local and makes analytical predictions difficult, so we propose a local
approximation: the local skeleton is given by the set of points where the
gradient is aligned with the local curvature major axis and where the second
component of the local curvature is negative.
We perform a statistical analysis of the length of the total local skeleton,
chosen for simplicity as the set of all points of space where the gradient is
either parallel or orthogonal to the main curvature axis. In all our numerical
experiments, which include Gaussian and various non-Gaussian realizations such
as \chi^2 fields and Zel'dovich maps, the differential length is found within a
normalization factor to be very close to the probability distribution function
of the smoothed field. This is in fact explicitly demonstrated in the Gaussian
case.
This result might be discouraging for using the skeleton as a probe of
non-Gaussianity, but our analyses assume that the total length of the skeleton is a
free, adjustable parameter. This total length could in fact be used to
constrain cosmological models, in CMB maps but also in 3D galaxy catalogs,
where it estimates the total length of filaments in the Universe. Making the
link with other works, we also show how the skeleton can be used to study the
dynamics of large-scale structure.
Comment: 15 pages, 11 figures, submitted to MNRAS
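As an illustration of the local criterion described in this abstract, the sketch below (an editor's illustration, not code from the paper; the function name, smoothing scale, and tolerance are arbitrary, and unit pixel spacing is assumed) flags the pixels of a 2D map where the gradient is an eigenvector of the Hessian and the smaller curvature eigenvalue is negative:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def local_skeleton_mask(field, smoothing=4.0, tol=None):
        # Smooth the field, then flag pixels close to the local skeleton
        # criterion: grad f is an eigenvector of the Hessian (parallel or
        # orthogonal to the major curvature axis) and the smaller Hessian
        # eigenvalue is negative (filament-like regions).
        f = gaussian_filter(field, smoothing)
        gy, gx = np.gradient(f)          # df/dy, df/dx
        hxy, hxx = np.gradient(gx)       # d2f/dxdy, d2f/dx2
        hyy, _ = np.gradient(gy)         # d2f/dy2, d2f/dydx
        # S vanishes wherever grad f is aligned with a curvature axis
        S = gx * gy * (hxx - hyy) + hxy * (gy**2 - gx**2)
        # smaller eigenvalue of the symmetric 2x2 Hessian
        lam2 = 0.5 * (hxx + hyy) - np.sqrt(0.25 * (hxx - hyy)**2 + hxy**2)
        if tol is None:
            tol = 0.1 * np.std(S)
        return (np.abs(S) < tol) & (lam2 < 0.0)

    # toy usage on white noise (a crude stand-in for a Gaussian random field)
    mask = local_skeleton_mask(np.random.default_rng(0).standard_normal((256, 256)))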
Void Statistics and Hierarchical Scaling in the Halo Model
We study scaling behaviour of statistics of voids in the context of the halo
model of nonlinear large-scale structure. The halo model allows us to
understand why the observed galaxy void probability obeys hierarchical scaling,
even though the premise from which the scaling is derived is not satisfied. We
argue that the commonly observed negative binomial scaling is not fundamental,
but merely the result of the specific values of bias and number density for
typical galaxies. The model implies quantitative relations between void
statistics measured for two populations of galaxies, such as SDSS red and blue
galaxies, and their number density and bias.
Comment: 11 pages, 11 figures, accepted for publication in MNRAS
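For reference, the "negative binomial scaling" mentioned above corresponds to a specific hierarchical scaling function for the void probability, P0 = exp[-Nbar chi(Nbar xibar)] with chi(x) = ln(1 + x)/x. A minimal sketch under that standard parametrisation (function names and numbers are illustrative, not taken from the paper):

    import numpy as np

    def chi_negative_binomial(x):
        # Hierarchical scaling function chi(Nbar*xibar) of the negative
        # binomial model: chi = ln(1 + x)/x, with chi -> 1 as x -> 0.
        x = np.asarray(x, dtype=float)
        return np.where(x > 0.0, np.log1p(x) / np.where(x > 0.0, x, 1.0), 1.0)

    def void_probability(nbar, xibar):
        # P0 = exp(-Nbar * chi(Nbar * xibar)) for a cell with mean galaxy
        # count Nbar and volume-averaged two-point correlation xibar.
        return np.exp(-nbar * chi_negative_binomial(nbar * xibar))

    # two galaxy populations with different number densities and clustering
    # (illustrative values, not the SDSS measurements discussed in the paper)
    print(void_probability(nbar=5.0, xibar=2.0), void_probability(nbar=1.0, xibar=4.0))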
A cloudy Vlasov solution
We propose to integrate the Vlasov-Poisson equations giving the evolution of
a dynamical system in phase-space using a continuous set of local basis
functions. In practice, the method decomposes the density in phase-space into
small smooth units having compact support. We call these small units ``clouds''
and choose them to be Gaussians of elliptical support. Fortunately, the
evolution of these clouds in the local potential has an analytical solution
that can be used to evolve the whole system during a significant fraction of a
dynamical time. In the process, the clouds, initially round, change shape and
get elongated. At some point, the system needs to be remapped on round clouds
once again. This remapping can be performed optimally using a small number of
Lucy iterations. The remapped solution can be evolved again with the cloud
method, and the process can be iterated a large number of times without showing
significant diffusion. Our numerical experiments show that it is possible to
follow the two-dimensional phase-space distribution during a large number of
dynamical times with excellent accuracy. The main limitation to this accuracy
is the finite size of the clouds, which results in coarse graining the
structures smaller than the clouds and induces small aliasing effects at these
scales. However, it is shown in this paper that this method is consistent with
an adaptive refinement algorithm which allows one to track the evolution of the
finer structure in phase space. It is also shown that the generalization of the
cloud method to the four-dimensional and the six-dimensional phase space is quite
natural.
Comment: 46 pages, 25 figures, submitted to MNRAS
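The core of the cloud idea can be sketched in one dimension of position and one of velocity: each Gaussian cloud is carried by the local flow, and its covariance is transported by the linearized (tangent) map, so an initially round cloud gets sheared into an elongated one. The sketch below is an editor's illustration only; it replaces the paper's analytical local solution by a locally harmonic approximation, and all names and step sizes are arbitrary.

    import numpy as np

    def local_tangent_map(k, dt):
        # Linearized phase-space flow over a time dt for a locally quadratic
        # potential Phi ~ 0.5 * k * x^2 (position-velocity pair in 1D).
        if k > 0.0:
            w = np.sqrt(k)
            c, s = np.cos(w * dt), np.sin(w * dt)
            return np.array([[c, s / w], [-w * s, c]])
        return np.array([[1.0, dt], [0.0, 1.0]])   # free streaming if k = 0

    def evolve_cloud(mean, cov, k, dt):
        # The cloud centre follows the flow; the covariance is transported
        # by the tangent map, which elongates an initially round cloud.
        M = local_tangent_map(k, dt)
        return M @ mean, M @ cov @ M.T

    # toy usage: a round cloud in a shallow well gets progressively sheared
    mean, cov = np.array([1.0, 0.0]), 0.01 * np.eye(2)
    for _ in range(200):
        mean, cov = evolve_cloud(mean, cov, k=0.25, dt=0.05)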
A "metric" semi-Lagrangian Vlasov-Poisson solver
We propose a new semi-Lagrangian Vlasov-Poisson solver. It employs elements
of metric to follow locally the flow and its deformation, allowing one to find
quickly and accurately the initial phase-space position of any test
particle, by expanding the geometry of the motion to second order in the
vicinity of the closest element. It is thus possible to reconstruct the
phase-space distribution function accurately at any time and position by
proper interpolation of the initial conditions, following Liouville's theorem.
When the distortion of the elements of metric becomes too large, new initial
conditions are created along with isotropic elements, and the procedure is
repeated until the next resampling. To speed up the process, interpolation of
the phase-space distribution is performed at second order during the transport
phase, while third-order splines are used at the moments of
remapping. We also show how to compute accurately the region of influence of
each element of metric with the proper percolation scheme. The algorithm is
tested here in the framework of one-dimensional gravitational dynamics but is
implemented in such a way that it can be extended easily to four or
six-dimensional phase space. It can also be trivially generalised to plasmas.
Comment: 32 pages, 14 figures, accepted for publication in Journal of Plasma
Physics, Special issue: The Vlasov equation, from space to laboratory plasmas
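For orientation, the generic backward semi-Lagrangian step that this method builds on looks as follows in 1D-1V: trace each phase-space grid point back along its characteristic and interpolate the distribution function at the departure point, since f is conserved along characteristics (Liouville). This is only a hedged sketch of the standard step with cubic splines; the paper's metric elements, second-order expansion of the motion, and percolation scheme are not reproduced here, and all names and grids are illustrative.

    import numpy as np
    from scipy.interpolate import RectBivariateSpline

    def semi_lagrangian_step(f, x, v, accel, dt):
        # One backward step for f(x, v): find the departure point of each
        # grid point along the characteristics (simple first-order estimate
        # of the trajectory) and evaluate f there with cubic splines.
        spline = RectBivariateSpline(x, v, f, kx=3, ky=3)
        X, V = np.meshgrid(x, v, indexing="ij")
        V0 = V - dt * accel(X)               # backward kick
        X0 = X - dt * 0.5 * (V + V0)         # backward drift (midpoint velocity)
        L = (x[-1] - x[0]) + (x[1] - x[0])   # periodic box in x
        X0 = x[0] + np.mod(X0 - x[0], L)
        V0 = np.clip(V0, v[0], v[-1])        # clipped velocity boundaries
        return spline.ev(X0, V0)

    # toy usage: free streaming of a Gaussian blob (zero acceleration)
    x = np.linspace(0.0, 1.0, 64, endpoint=False)
    v = np.linspace(-1.0, 1.0, 65)
    f = np.exp(-((x[:, None] - 0.5) ** 2 + v[None, :] ** 2) / 0.02)
    for _ in range(10):
        f = semi_lagrangian_step(f, x, v, accel=lambda q: np.zeros_like(q), dt=0.01)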
Transformation seismology: composite soil lenses for steering surface elastic Rayleigh waves.
Metamaterials are artificially structured media that exhibit properties beyond those usually encountered in nature. Typically they are developed for electromagnetic waves at millimetric down to nanometric scales, or for acoustics, at centimeter scales. By applying ideas from transformation optics we can steer Rayleigh surface waves that are solutions of the vector Navier equations of elastodynamics. As a paradigm of the conformal geophysics that we are creating, we design a square arrangement of Luneburg lenses to reroute Rayleigh waves around a building with the dual aim of protection and minimizing the effect on the wavefront (cloaking). To show that this is practically realisable, we deliberately choose to use readily available material parameters: the metalens consists of a composite soil structured with buried pillars made of a softer material. The regular lattice of inclusions is homogenized to give an effective material with a radially varying velocity profile, and hence a radially varying refractive index for the lens. We develop the theory and then use full 3D numerical simulations to conclusively demonstrate, at frequencies of seismological relevance (3–10 Hz) and for low-speed sedimentary soil (v_s: 300–500 m/s), that the vibration of a structure is reduced by up to 6 dB at its resonance frequency.
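The graded index of a Luneburg lens has a standard closed form, n(r) = sqrt(2 - (r/R)^2) for r <= R, so the target wave speed inside each lens is simply the background Rayleigh speed divided by n. A small sketch with illustrative numbers (not the actual lens dimensions or soil parameters of the study, where the graded index is realised by varying the fill fraction of soft pillars):

    import numpy as np

    def luneburg_velocity(r, radius, v_background):
        # Luneburg refractive index n(r) = sqrt(2 - (r/R)^2) inside the lens,
        # n = 1 outside, hence a local wave speed v_background / n(r).
        r = np.asarray(r, dtype=float)
        n = np.where(r <= radius,
                     np.sqrt(np.maximum(2.0 - (r / radius) ** 2, 0.0)),
                     1.0)
        return v_background / n

    # e.g. a 10 m lens in soil with a 400 m/s background Rayleigh speed:
    # the required speed is ~283 m/s at the centre and 400 m/s at the rim
    print(luneburg_velocity([0.0, 5.0, 10.0], radius=10.0, v_background=400.0))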
Using Local Volume data to constrain Dark Matter dynamics
Peculiar velocity reconstruction methods allow one to gain deeper insight
into the distribution of dark matter: both to measure the mean matter
density and to recover the primordial density fluctuations. We present here the
Monge-Ampere-Kantorovitch method applied to mock catalogues mimicking real
galaxy catalogues in both redshift and distance. After discussing the results obtained
for a class of biases that may be corrected for, we focus on the systematics
coming from the unknown distribution of unobserved mass and from the
statistical relationship between mass and luminosity. We then show how to use
these systematics to put constraints on the dark matter distribution. Finally, a
preliminary application to an extended version (cz < 3000 km/s) of the
Neighbour Galaxy Catalogue is presented. We recover the peculiar velocities in
our neighbourhood and present a preliminary measurement of the local Omega_M.
Comment: 4 pages, 2 figures. To be published in the proceedings of ``Galaxies
in the Local Volume'', Sydney 8 to 13 July 200
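At its core, the Monge-Ampere-Kantorovitch reconstruction is a quadratic-cost assignment problem: pair the observed (final) positions with the points of an initially uniform grid so that the total squared displacement is minimal; peculiar velocities then follow from the displacements in linear theory. A toy sketch of that assignment step (illustrative only; it ignores the masses, selection effects, and redshift-space issues handled by the actual method):

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def mak_assignment(final_pos, initial_pos):
        # Solve the quadratic-cost assignment between final positions and
        # points of an initially uniform grid (minimal total squared
        # displacement); returns the grid index paired with each object.
        diff = final_pos[:, None, :] - initial_pos[None, :, :]
        cost = np.sum(diff ** 2, axis=-1)
        _, cols = linear_sum_assignment(cost)
        return cols

    # toy usage on an 8^3 grid with small random displacements
    rng = np.random.default_rng(1)
    q = np.stack(np.meshgrid(*(np.arange(8.0),) * 3, indexing="ij"), -1).reshape(-1, 3)
    x = q + 0.3 * rng.standard_normal(q.shape)
    idx = mak_assignment(x, q)
    displacement = x - q[idx]   # proportional to peculiar velocity in linear theory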
Extended Perturbation Theory for the Local Density Distribution Function
Perturbation theory makes it possible to calculate the probability
distribution function (PDF) of the large scale density field in the small
variance limit. For top hat smoothing and scale-free Gaussian initial
fluctuations, the result depends only on the linear variance, sigma_linear, and
its logarithmic derivative with respect to the filtering scale
-(n_linear+3)=dlog sigma_linear^2/dlog L (Bernardeau 1994). In this paper, we
measure the PDF and its low-order moments in scale-free simulations evolved
well into the nonlinear regime and compare the results with the above
predictions, assuming that the spectral index and the variance are adjustable
parameters, n_eff and sigma_eff=sigma, where sigma is the true, nonlinear
variance. With these additional degrees of freedom, results from perturbation
theory provide a good fit of the PDFs, even in the highly nonlinear regime. The
value of n_eff is of course equal to n_linear when sigma << 1, and it decreases
with increasing sigma. A nearly flat plateau is reached when sigma >> 1. In
this regime, the difference between n_eff and n_linear increases when n_linear
decreases. For initial power-spectra with n_linear=-2,-1,0,+1, we find n_eff ~
-9,-3,-1,-0.5 when sigma^2 ~ 100.
Comment: 13 pages, 6 figures, Latex (MN format), submitted to MNRAS
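For the skewness this effective-index recipe is explicit: leading-order perturbation theory for a top-hat smoothed field with a power-law spectrum gives S3 = 34/7 - (n + 3) (Bernardeau 1994), and extended perturbation theory evaluates the same expression at n_eff instead of n_linear. A one-line sketch (illustrative, using the n_eff value quoted above for n_linear = -1):

    def s3_tophat(n):
        # Leading-order perturbation-theory skewness of the top-hat smoothed
        # density field for a power-law spectrum of index n (Bernardeau 1994).
        return 34.0 / 7.0 - (n + 3.0)

    # e.g. n_linear = -1 gives S3 ~ 2.86 in the weakly nonlinear regime,
    # while the effective index n_eff ~ -3 at sigma^2 ~ 100 gives S3 ~ 4.86
    print(s3_tophat(-1.0), s3_tophat(-3.0))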