The Beylkin-Cramer Summation Rule and A New Fast Algorithm of Cosmic Statistics for Large Data Sets
Based on the Beylkin-Cramer summation rule, we introduce a new fast algorithm
that enables us to explore high-order statistics efficiently in large data
sets. Central to this technique is the decomposition of both fields and
operators within the framework of multi-resolution analysis (MRA), and the
realization of their discrete representations. Accordingly, a homogeneous
point process can be equivalently described by the operation of a Toeplitz
matrix on a vector, which is accomplished using the fast Fourier transform
(FFT). The algorithm can be applied widely in cosmic statistics to tackle
large data sets. In particular, we demonstrate this technique on spherical,
cubic, and cylindrical counts-in-cells. Numerical tests show that the
algorithm produces excellent agreement with the expected results.
Moreover, the algorithm naturally introduces a sharp filter, which is capable
of suppressing shot noise in weak signals. In its numerical procedures, the
algorithm is somewhat similar to particle-mesh (PM) methods in N-body
simulations. Owing to its FFT-based scaling, it is significantly faster than
current particle-based methods, and its computational cost does not depend on
the shape or size of the sampling cells. In addition, based on this technique,
we further propose a simple, fast scheme to compute second-order statistics of
cosmic density fields and validate it using simulation samples. The technique
developed here should allow a comprehensive study of the non-Gaussianity of
cosmic fields in high-precision cosmology. A specific implementation of the
algorithm is publicly available upon request to the author.
Comment: 27 pages, 9 figures included. Revised version; changes include (a) a
new fast algorithm for second-order statistics, (b) more numerical tests,
including counts in asymmetric cells, two-point correlation functions, and
second-order variances, and (c) more discussion of the technique.
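The core numerical step described above, applying a Toeplitz matrix to a vector via the FFT, can be illustrated in a few lines. The sketch below is not the authors' implementation; it is a generic circulant-embedding routine (function names are ours) showing how the O(n log n) product works.

```python
import numpy as np

def toeplitz_matvec(first_col, first_row, x):
    """Compute T @ x for the n x n Toeplitz matrix T defined by its
    first column and first row, in O(n log n) via circulant embedding."""
    n = len(x)
    # Embed T in a 2n x 2n circulant matrix: its defining vector is the
    # first column, a padding zero, then the reversed tail of the first row.
    c = np.concatenate([first_col, [0.0], first_row[:0:-1]])
    # A circulant matrix is diagonalized by the DFT, so the product is a
    # zero-padded circular convolution computed with FFTs.
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x, len(c)))
    return y[:n].real

# Quick check against the dense O(n^2) product.
rng = np.random.default_rng(0)
n = 512
col = rng.standard_normal(n)
row = rng.standard_normal(n)
row[0] = col[0]                      # shared corner entry must agree
x = rng.standard_normal(n)
T = np.array([[col[i - j] if i >= j else row[j - i] for j in range(n)]
              for i in range(n)])
assert np.allclose(toeplitz_matvec(col, row, x), T @ x)
```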
Efficient Cosmological Parameter Estimation from Microwave Background Anisotropies
We revisit the issue of cosmological parameter estimation in light of current
and upcoming high-precision measurements of the cosmic microwave background
power spectrum. Physical quantities which determine the power spectrum are
reviewed, and their connection to familiar cosmological parameters is
explicated. We present a set of physical parameters, analytic functions of the
usual cosmological parameters, upon which the microwave background power
spectrum depends linearly (or with some other simple dependence) over a wide
range of parameter values. With such a set of parameters, microwave background
power spectra can be estimated with high accuracy and negligible computational
effort, vastly increasing the efficiency of cosmological parameter error
determination. The techniques presented here allow microwave background power
spectra to be calculated far faster than comparably accurate direct codes
(after precomputing a handful of power spectra). We discuss various
issues of parameter estimation, including parameter degeneracies, numerical
precision, mapping between physical and cosmological parameters, and systematic
errors, and illustrate these considerations with an idealized model of the MAP
experiment.
Comment: 22 pages, 12 figures.
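The speed-up rests on the (near-)linear dependence of the spectrum on the physical parameters. A minimal sketch of that idea, assuming a first-order Taylor expansion around a fiducial model built from finite differences of precomputed spectra; all array names and values here are placeholders, not the authors' actual parameter set:

```python
import numpy as np

def build_emulator(cl_fid, p_fid, cl_plus, cl_minus, dp):
    """Finite-difference derivatives dC_l/dp_i from spectra precomputed
    at p_fid +/- dp_i along each parameter direction."""
    derivs = (cl_plus - cl_minus) / (2.0 * dp[:, None])  # (n_par, n_ell)
    def emulate(p):
        # Linear extrapolation from the fiducial spectrum: near-instant,
        # compared with a full Boltzmann-code call.
        return cl_fid + (p - p_fid) @ derivs
    return emulate

# Toy usage with fabricated spectra (2 parameters, multipoles 2..1500):
n_ell = 1499
p_fid = np.array([0.14, 0.022])           # e.g. omega_m h^2, omega_b h^2
dp = np.array([0.005, 0.001])             # finite-difference step sizes
cl_fid = np.ones(n_ell)                   # stand-in for a real C_l
cl_plus = 1.01 * np.ones((2, n_ell))
cl_minus = 0.99 * np.ones((2, n_ell))
emulate = build_emulator(cl_fid, p_fid, cl_plus, cl_minus, dp)
cl_est = emulate(np.array([0.142, 0.0225]))
```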
ASCR/HEP Exascale Requirements Review Report
This draft report summarizes and details the findings, results, and
recommendations derived from the ASCR/HEP Exascale Requirements Review meeting
held in June 2015. The main conclusions are as follows. 1) Larger, more
capable computing and data facilities are needed to support HEP science goals
in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of
demand on the 2025 timescale is at least two orders of magnitude greater than
what is currently available, and in some cases more. 2) The growth rate of
data produced by simulations is overwhelming the current ability of both
facilities and researchers to store and analyze it. Additional resources and
new techniques for data analysis are urgently needed. 3) Data rates and
volumes from HEP experimental facilities are also straining the capacity to
store and analyze large and complex datasets. Appropriately configured
leadership-class facilities can play a transformational role in enabling
scientific discovery from these datasets. 4) A close integration of HPC
simulation and data analysis will aid greatly in interpreting results from HEP
experiments. Such an integration will minimize data movement and facilitate
interdependent workflows. 5) Long-range planning between HEP and ASCR will be
required to meet HEP's research needs. To make the best use of ASCR HPC
resources, the experimental HEP program needs a) an established long-term plan
for access to ASCR computational and data resources, b) the ability to map
workflows onto HPC resources, c) ASCR facilities able to accommodate workflows
run by collaborations that can have thousands of individual members, d) a path
for transitioning codes to the next-generation HPC platforms that will be
available at ASCR facilities, and e) a workforce trained to develop and use
simulations and analysis in support of HEP scientific research on
next-generation systems.
Comment: 77 pages, 13 figures; draft report, subject to further revision.
The New Horizon Run Cosmological N-Body Simulations
We present two large cosmological N-body simulations, called Horizon Run 2
(HR2) and Horizon Run 3 (HR3), made using 6000^3 = 216 billion and 7210^3 =
374 billion particles, spanning volumes of (7.200 Gpc/h)^3 and (10.815
Gpc/h)^3, respectively. These simulations improve on our previous Horizon Run
1 (HR1) by up to a factor of 4.4 in volume, and range from 2600 to over 8800
times the volume of the Millennium Run. In addition, they achieve a
considerably finer mass resolution, down to 1.25x10^11 M_sun/h, allowing us to
resolve galaxy-size halos with mean particle separations of 1.2 Mpc/h and 1.5
Mpc/h,
respectively. We have measured the power spectrum, correlation function, mass
function, and basic halo properties with percent-level accuracy, and verified
that they correctly reproduce the LCDM theoretical expectations, in excellent
agreement with linear perturbation theory. Our unprecedentedly large-volume
N-body simulations can be used for a variety of studies in cosmology and
astrophysics, ranging from large-scale structure topology, baryon acoustic
oscillations, dark energy and the characterization of the expansion history of
the Universe, to galaxy formation science, in connection with the new
SDSS-III. To this end, we made a total of 35 all-sky mock surveys along the
past light cone out to z=0.7 (8 from the HR2 and 27 from the HR3), to simulate
the BOSS geometry. The simulations and mock surveys are already publicly
available at http://astro.kias.re.kr/Horizon-Run23/.
Comment: 18 pages, 10 figures. Added clarification on Fig. 6. Published in the
Journal of the Korean Astronomical Society (JKAS). The paper with
high-resolution figures is available at
http://jkas.kas.org/journals/2011v44n6/v44n6.ht
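As an illustration of the kind of measurement quoted above, the sketch below estimates the matter power spectrum P(k) of a particle snapshot by gridding the particles and Fourier transforming the density contrast. It is not the Horizon Run pipeline: nearest-grid-point assignment keeps it short, whereas production codes use higher-order (CIC/TSC) assignment with window deconvolution.

```python
import numpy as np

def power_spectrum(pos, box, n_grid=128, n_bins=20):
    """pos: (N, 3) particle coordinates in [0, box); returns (k, P(k))."""
    # Density contrast delta = rho/rho_bar - 1 on a regular mesh (NGP).
    idx = np.floor(pos / box * n_grid).astype(int) % n_grid
    counts = np.zeros((n_grid,) * 3)
    np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
    delta = counts / counts.mean() - 1.0
    # Standard estimator: P(k) ~ |delta_k|^2 * V / N_cells^2.
    delta_k = np.fft.rfftn(delta)
    power = np.abs(delta_k) ** 2 * box**3 / n_grid**6
    # Spherically average in bins of k = |k_vec|.
    k = 2 * np.pi * np.fft.fftfreq(n_grid, d=box / n_grid)
    kz = 2 * np.pi * np.fft.rfftfreq(n_grid, d=box / n_grid)
    kmag = np.sqrt(k[:, None, None]**2 + k[None, :, None]**2
                   + kz[None, None, :]**2)
    bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), n_bins + 1)
    which = np.digitize(kmag.ravel(), bins)
    pk = np.array([power.ravel()[which == i].mean()
                   if np.any(which == i) else np.nan
                   for i in range(1, n_bins + 1)])
    return 0.5 * (bins[1:] + bins[:-1]), pk
```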
Including parameter dependence in the data and covariance for cosmological inference
The final step of most large-scale structure analyses involves the comparison
of power spectra or correlation functions to theoretical models. It is clear
that the theoretical models have parameter dependence, but frequently the
measurements and the covariance matrix depend upon some of the parameters as
well. We show that a very simple interpolation scheme on an unstructured mesh
provides an efficient way to include this parameter dependence
self-consistently in the analysis at modest computational expense. We describe
two schemes for covariance matrices. The scheme that uses the geometric
structure of such matrices performs roughly twice as well as the simplest
scheme, though both perform very well.
Comment: 17 pages, 4 figures; matches the version published in JCAP.
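A hedged sketch of the kind of machinery the abstract describes: precompute the data vector and covariance at an unstructured set of parameter points, then barycentrically interpolate between them. The abstract does not specify its two covariance schemes; interpolating Cholesky factors, as below, is one natural way to respect the geometric structure (positive-definiteness) of covariance matrices, and every name here is our assumption, not the paper's code.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator
from scipy.linalg import cholesky

def build_interpolators(theta_nodes, data_nodes, cov_nodes):
    """theta_nodes: (n_node, n_par) unstructured parameter points;
    data_nodes: (n_node, n_data); cov_nodes: (n_node, n_data, n_data)."""
    data_itp = LinearNDInterpolator(theta_nodes, data_nodes)
    # Interpolate lower Cholesky factors: a convex (barycentric) combination
    # of factors with positive diagonals stays nonsingular, so L @ L.T is
    # positive-definite at every interpolated point.
    n_node, n_data = data_nodes.shape
    chols = np.array([cholesky(c, lower=True) for c in cov_nodes])
    chol_itp = LinearNDInterpolator(theta_nodes, chols.reshape(n_node, -1))
    def cov_at(theta):
        L = chol_itp(theta).reshape(n_data, n_data)
        return L @ L.T                # NaN outside the convex hull of nodes
    return data_itp, cov_at

# Toy usage: 2 parameters, a 3-point data vector, 5 precomputed nodes.
rng = np.random.default_rng(1)
theta = rng.uniform(size=(5, 2))
data = rng.standard_normal((5, 3))
covs = np.array([(1.0 + t[0]) * np.eye(3) for t in theta])
data_itp, cov_at = build_interpolators(theta, data, covs)
d = data_itp(theta.mean(axis=0))      # query a point inside the hull
c = cov_at(theta.mean(axis=0))
```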