Statistics and geometry of cosmic voids
We introduce new statistical methods for the study of cosmic voids, focusing
on the statistics of largest size voids. We distinguish three different types
of distributions of voids, namely, Poisson-like, lognormal-like and Pareto-like
distributions. The last two distributions are connected with two types of
fractal geometry of the matter distribution. Scaling voids with Pareto
distribution appear in fractal distributions with box-counting dimension
smaller than three (its maximum value), whereas the lognormal void distribution
corresponds to multifractals with box-counting dimension equal to three.
Moreover, voids of the former type persist in the continuum limit, namely, as
the number density of observable objects grows, giving rise to lacunar
fractals, whereas voids of the latter type disappear in the continuum limit,
giving rise to non-lacunar (multi)fractals. We propose both lacunar and
non-lacunar multifractal models of the cosmic web structure of the Universe. A
non-lacunar multifractal model is supported by current galaxy surveys as well
as cosmological N-body simulations. This model suggests, in particular, that
small dark matter halos and, arguably, faint galaxies are present in cosmic
voids.
Comment: 39 pages, 8 EPS figures, supersedes arXiv:0802.038
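As a quick illustration of the box-counting dimension invoked above, the sketch below applies a generic box-counting estimator (not the statistical machinery of the paper) to a uniform 3-D point sample, whose dimension should come out close to three, the maximum value; a lacunar fractal sample would yield a smaller slope. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Box-counting dimension of a 3-D point set: count occupied boxes N(eps) at a
# few box sizes eps and fit the slope of log N(eps) against log(1/eps).
pts = rng.random((100000, 3))                 # uniform sample: expect D close to 3
eps = np.array([1/2, 1/4, 1/8, 1/16])
counts = []
for e in eps:
    nb = int(round(1 / e))
    H, _ = np.histogramdd(pts, bins=(nb, nb, nb), range=[(0, 1)] * 3)
    counts.append(int((H > 0).sum()))         # number of occupied boxes
dim = np.polyfit(np.log(1 / eps), np.log(counts), 1)[0]
```

For a dense uniform sample every box is occupied at these scales, so the fitted slope is essentially exactly 3; applying the same estimator to a fractal point set (e.g. a Cantor-like construction) would reveal a dimension below 3.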
Noise parametric identification and whitening for LIGO 40-meter interferometer data
We report the analysis we performed on data taken by the Caltech 40-meter
prototype interferometer to identify the noise power spectral density and to
whiten the noise sequence. We concentrate our study on data taken in November
1994; in particular, we analyzed two frames of data: the 18nov94.2.frame and
the 19nov94.2.frame.
We show that it is possible to whiten these data, to a good degree of
whiteness, using a high-order whitening filter. Moreover, we can choose to
whiten only a restricted band of frequencies around the region of interest,
obtaining a higher level of whiteness.
Comment: 11 pages, 15 figures, accepted for publication by Physical Review
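The parametric identification and whitening described here can be illustrated schematically. The sketch below is a minimal toy version, assuming a synthetic AR(2) colored-noise process in place of real interferometer data: it identifies an AR model via the Yule-Walker equations and applies the corresponding prediction-error (whitening) filter. The actual analysis uses far higher filter orders on the 40-meter data; the model orders and coefficients here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy colored noise: a stable AR(2) process standing in for interferometer noise.
a1_true, a2_true = 1.3, -0.6
n, p = 50000, 8
w = rng.normal(size=n)                       # white driving noise, unit variance
x = np.zeros(n)
for t in range(2, n):
    x[t] = a1_true * x[t - 1] + a2_true * x[t - 2] + w[t]

# Parametric identification: Yule-Walker equations for an order-p AR model.
r = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])   # autocovariance
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
a = np.linalg.solve(R, r[1:p + 1])                            # AR coefficients

# Whitening = prediction-error filtering with the identified model.
e = x[p:] - sum(a[k - 1] * x[p - k:n - k] for k in range(1, p + 1))

# The residual should be nearly white: unit variance, negligible correlation.
rho1 = (e[:-1] @ e[1:]) / (e @ e)            # lag-1 autocorrelation of residual
```

Restricting the whitening to a band of frequencies, as in the paper, would correspond to applying such a filter after band-limiting the data.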
New constraints on elemental and Pb and Nd isotope compositions of South American and Southern African aerosol sources to the South Atlantic Ocean
Improving the geochemical database available for characterising potential natural and anthropogenic aerosol sources from South America and Southern Africa is a critical precondition for studies aimed at understanding trace metal controls on the marine biogeochemical cycles of the South Atlantic Ocean. Here we present new elemental and isotopic data for a wide range of sample types from South America and Southern Africa that are potentially important aerosol sources. This includes road dust from Buenos Aires and lichen samples from Johannesburg, soil dust from Patagonia, volcanic ash from the Andean volcanic belt, and aerosol samples from São Paulo. All samples were investigated for major (Al, Ca, Fe, Mg, Na, K, Mn) and trace element (Cd, Co, Cr, Cu, Ni, Pb, REE, Sc, Th, Y, V, Zn) concentrations and Nd and Pb isotopic compositions. We show that diagrams of 208Pb/207Pb vs. εNd, 208Pb/207Pb vs. Pb/Al, 1/[Pb], Zn/Al, Cd/Al, Cu/Al, and εNd vs. Pb/Al, and 1/[Nd] are best suited to separate South American and South African source regions as well as natural and anthropogenic sources. A subset of samples from Patagonia and the Andes was additionally subjected to separation of a fine (<5 μm) fraction and compared to the composition of the bulk sample. We show that differences in the geochemical signature of bulk samples between individual regions and source types are significantly larger than between grain sizes. Together, these findings represent an important step towards a quantitative assessment of aeolian trace metal inputs to the South Atlantic Ocean.
The uses of coherent structure (Dryden Lecture)
The concept of coherent structure in turbulent flow is a revolutionary idea which is being developed by evolutionary means. The main objective of this review is to list some solid achievements, showing what can be done by using the concept of coherent structure that cannot be done without it. The nature of structure is described in terms of some related concepts, including celerity,
topology, and the phenomenon of coalescence and splitting of structure. The main emphasis is on the mixing layer, as the one flow whose structure is well enough understood so that technical applications are now being made in problems of mixing and chemistry. An attempt is made to identify some conceptual and experimental obstacles that stand in the way of progress in other technically important flows, particularly the turbulent boundary layer. A few comments are included about the role of structure in numerical simulations and in current work on manipulation and control of turbulent flow. Some recent developments are cited which suggest that the time is nearly right for corresponding advances to occur in turbulence modeling.
Random Convex Hulls and Extreme Value Statistics
In this paper we study the statistical properties of convex hulls of
random points in a plane chosen according to a given distribution. The points
may be chosen independently or they may be correlated. After a non-exhaustive
survey of the somewhat sporadic literature and diverse methods used in the
random convex hull problem, we present a unifying approach, based on the notion
of support function of a closed curve and the associated Cauchy's formulae,
that allows us to compute exactly the mean perimeter and the mean area enclosed
by the convex polygon in the case of independent as well as correlated points.
Our method demonstrates a beautiful link between the random convex hull problem
and the subject of extreme value statistics. As an example of correlated
points, we study here in detail the case when the points represent the vertices
of $n$ independent random walks. In the continuum-time limit this reduces to
$n$ independent planar Brownian trajectories, for which we compute exactly, for
all $n$, the mean perimeter and the mean area of their global convex hull. Our
results have relevant applications in ecology in estimating the home range of a
herd of animals. Some of these results were announced recently in a short
communication [Phys. Rev. Lett. {\bf 103}, 140602 (2009)].
Comment: 61 pages (pedagogical review); invited contribution to the special issue of J. Stat. Phys. celebrating the 50 years of the Yeshiva/Rutgers meetings
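The support-function approach via Cauchy's formula described above can be demonstrated numerically. The sketch below is a minimal Monte Carlo version, assuming a discretized planar Brownian path of duration T = 1 (a single walker): it computes the hull perimeter by integrating the support function M(θ) over directions, and the sample mean can be compared to the known exact value sqrt(8πT) ≈ 5.013. The grid sizes and sample counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Cauchy's formula: the perimeter of a convex hull equals the integral over
# theta in [0, 2*pi) of the support function M(theta) = max_i (x_i cos(theta)
# + y_i sin(theta)).
theta = np.linspace(0.0, 2 * np.pi, 360, endpoint=False)
dirs = np.stack([np.cos(theta), np.sin(theta)])          # shape (2, 360)

n_steps, n_paths = 4000, 300
perims = []
for _ in range(n_paths):
    steps = rng.normal(scale=np.sqrt(1.0 / n_steps), size=(n_steps, 2))
    path = np.cumsum(steps, axis=0)                      # Brownian path, T = 1
    M = (path @ dirs).max(axis=0)                        # support function M(theta)
    perims.append(M.sum() * (2 * np.pi / theta.size))    # L = integral of M d(theta)
mean_L = float(np.mean(perims))
# Exact mean perimeter for a planar Brownian path of duration T = 1:
# sqrt(8 * pi) ~ 5.013; the discrete walk approaches this from below.
```

The same support-function machinery extends directly to several independent paths (take the maximum projection over all walkers) and, via a companion Cauchy formula, to the enclosed area.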
Detector Description and Performance for the First Coincidence Observations between LIGO and GEO
For 17 days in August and September 2002, the LIGO and GEO interferometer
gravitational wave detectors were operated in coincidence to produce their
first data for scientific analysis. Although the detectors were still far from
their design sensitivity levels, the data can be used to place better upper
limits on the flux of gravitational waves incident on the earth than previous
direct measurements. This paper describes the instruments and the data in some
detail, as a companion to analysis papers based on the first data.
Comment: 41 pages, 9 figures. 17 Sept 03: author list amended, minor editorial change
Analysis of LIGO data for gravitational waves from binary neutron stars
We report on a search for gravitational waves from coalescing compact binary
systems in the Milky Way and the Magellanic Clouds. The analysis uses data
taken by two of the three LIGO interferometers during the first LIGO science
run and illustrates a method of setting upper limits on inspiral event rates
using interferometer data. The analysis pipeline is described with particular
attention to data selection and coincidence between the two interferometers. We
establish an observational upper limit on the coalescence rate of $1.7 \times 10^{2}$ per year per Milky Way Equivalent Galaxy, with 90% confidence, for systems with component masses between $1$ and $3\,M_\odot$.
Comment: 17 pages, 9 figures
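The basic counting ingredient of such a rate limit can be sketched as follows: a classical Poisson upper limit on the mean number of events, which for zero observed events reduces to μ = −ln(1 − CL) ≈ 2.303 at 90% confidence, divided by the effective exposure. This is only the elementary kernel of the method; the paper's full pipeline (matched filtering, coincidence, detection-efficiency estimation) is considerably more involved, and the function name below is illustrative.

```python
import math

def poisson_upper_limit(n_obs, cl=0.90):
    """Classical upper limit mu_UL on a Poisson mean given n_obs counts:
    the mu at which P(N <= n_obs | mu) = 1 - cl, found by bisection."""
    lo, hi = 0.0, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        p = sum(math.exp(-mid) * mid**k / math.factorial(k)
                for k in range(n_obs + 1))
        if p > 1 - cl:          # P(N <= n_obs) still too large: mu can grow
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With zero candidate events over an observation time T (years) covering N_G
# Milky-Way-equivalent galaxies, the 90% rate limit would be mu_UL / (T * N_G).
mu0 = poisson_upper_limit(0)    # -ln(0.10), about 2.303
```

For one observed event the limit rises to about 3.89 expected events, the solution of exp(−μ)(1 + μ) = 0.1.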
Use of the 'characteristic function' for modelling repeatability precision
The 'characteristic function' is a two-parameter function relating precision or uncertainty in analytical results to the concentration of the analyte. In previous papers in this series, it has been shown to provide a good model of precision measured (a) under reproducibility conditions and (b) under 'instrumental' conditions. The present study shows that it is also a valuable model for precision estimated under repeatability conditions. The study data were large sets of duplicated results obtained for quality-control purposes on typical test materials in routine analysis. As the analytes exhibited concentration ranges encompassing between one and three orders of magnitude, there was ample scope to demonstrate goodness of fit to the function under different circumstances.
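The two-parameter fit described above can be sketched as follows, assuming the commonly used form σ(c) = √(α² + (βc)²) for the characteristic function; the parameter values are hypothetical. Squaring both sides makes the model linear in c², so ordinary least squares recovers the parameters directly.

```python
import numpy as np

# Characteristic function: sigma(c) = sqrt(alpha^2 + (beta * c)^2), where
# alpha is the (constant) detection-limit-region SD and beta the asymptotic
# relative SD at high concentration. Hypothetical parameter values:
alpha_true, beta_true = 0.05, 0.02
c = np.logspace(0.0, 3.0, 40)               # concentrations spanning 3 decades
sigma = np.sqrt(alpha_true**2 + (beta_true * c)**2)

# In practice sigma would be estimated from duplicated QC results; here we fit
# the noiseless model values just to show the linearised procedure. Squaring
# gives sigma^2 = alpha^2 + beta^2 * c^2, linear in c^2.
A = np.column_stack([np.ones_like(c), c**2])
coef, *_ = np.linalg.lstsq(A, sigma**2, rcond=None)
alpha_hat, beta_hat = np.sqrt(coef[0]), np.sqrt(coef[1])
```

With real duplicate data the squared SD estimates are heteroscedastic, so a weighted or iteratively reweighted fit would be preferable to plain least squares.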