Computing challenges of the Cosmic Microwave Background
The Cosmic Microwave Background (CMB) encodes information on the origin and evolution of the universe, buried in a fractional anisotropy of one part in 100,000 on angular scales from arcminutes to tens of degrees. We await the coming onslaught of data from experiments measuring the microwave sky from the ground, from balloons, and from space. However, we are faced with the harsh reality that current algorithms for extracting cosmological information cannot handle data sets of the size and complexity expected even in the next few years. Here we review the challenges involved in understanding these data: making maps from time-ordered data, removing the foreground contaminants, and finally estimating the power spectrum and cosmological parameters from the CMB map. If handled naively, the global nature of the analysis problem renders these tasks effectively impossible given the volume of the data. We discuss possible techniques for overcoming these issues and outline the many other challenges that remain to be addressed.
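As a toy illustration of the map-making step mentioned above (not the review's algorithm), the following Python sketch bins simulated time-ordered data into sky pixels under the simplifying assumptions of white noise and unit weights; the data sizes and pointing are invented for the example.

import numpy as np

def bin_map(tod, pixel_index, npix):
    # Binned map-maker: average all time-ordered samples falling in each pixel
    # (equivalent to least squares with unit noise weights).
    signal_sum = np.bincount(pixel_index, weights=tod, minlength=npix)
    hits = np.bincount(pixel_index, minlength=npix)
    sky_map = np.zeros(npix)
    observed = hits > 0
    sky_map[observed] = signal_sum[observed] / hits[observed]
    return sky_map, hits

# Hypothetical scan: one million samples covering 10,000 pixels.
rng = np.random.default_rng(0)
npix = 10_000
true_map = rng.normal(size=npix)
pointing = rng.integers(0, npix, size=1_000_000)
tod = true_map[pointing] + 0.1 * rng.normal(size=pointing.size)  # signal plus white noise
recovered, hit_count = bin_map(tod, pointing, npix)

A realistic map-maker must additionally down-weight or solve for the correlated noise component, which is where the computational difficulty discussed in the abstract arises.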
Using hybrid GPU/CPU kernel splitting to accelerate spherical convolutions
We present a general method for accelerating by more than an order of
magnitude the convolution of pixelated functions on the sphere with a
radially-symmetric kernel. Our method splits the kernel into a compact
real-space component and a compact spherical harmonic space component. These
components can then be convolved in parallel using an inexpensive commodity GPU
and a CPU. We provide models for the computational cost of both real-space and
spherical harmonic space convolutions and an estimate of the approximation error. Using
these models we can determine the optimum split that minimizes the wall clock
time for the convolution while satisfying the desired error bounds. We apply
this technique to the problem of simulating a cosmic microwave background (CMB)
anisotropy sky map at the resolution typical of the high resolution maps
produced by the Planck mission. For the main Planck CMB science channels we
achieve a speedup of over a factor of ten, assuming an acceptable fractional
rms error of order 10^-5 in the power spectrum of the output map.
Comment: 9 pages, 11 figures, 1 table, accepted by Astronomy & Computing w/ minor revisions. arXiv admin note: substantial text overlap with arXiv:1211.355
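To illustrate the split-selection idea described in the abstract, here is a hedged Python sketch that models the CPU (harmonic-space) and GPU (real-space) costs as functions of a single split multipole and picks the split minimizing the parallel wall-clock time. The scalings, cost constants, and kernel-support model are hypothetical placeholders, not the paper's measured fits.

import numpy as np

NSIDE = 2048                 # Planck-like map resolution
NPIX = 12 * NSIDE**2
LMAX = 4000

def cpu_time(l_split):
    # Harmonic-space cost assumed to scale roughly as NSIDE * l_split^2 (toy constant).
    return 5e-9 * NSIDE * l_split**2

def gpu_time(l_split):
    # Assume the real-space remainder has angular support ~ pi / l_split radians, so its
    # cost scales with the number of pixel pairs inside that radius (toy constant).
    support_fraction = (np.pi / l_split) ** 2 / (4 * np.pi)
    return 1e-10 * NPIX * (NPIX * support_fraction)

def wall_clock(l_split):
    # The two components are convolved concurrently, so wall-clock time is the slower of the two.
    return max(cpu_time(l_split), gpu_time(l_split))

splits = np.arange(100, LMAX, 10)
best = min(splits, key=wall_clock)
print(f"toy optimum at l_split = {best}, modeled wall-clock ~ {wall_clock(best):.2f} s")

In the paper the split must additionally satisfy an error bound; that constraint is omitted here for brevity.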
CMB-S4 Science Book, First Edition
This book lays out the scientific goals to be addressed by the
next-generation ground-based cosmic microwave background experiment, CMB-S4,
envisioned to consist of dedicated telescopes at the South Pole, the high
Chilean Atacama plateau and possibly a northern hemisphere site, all equipped
with new superconducting cameras. CMB-S4 will dramatically advance cosmological
studies by crossing critical thresholds in the search for the B-mode
polarization signature of primordial gravitational waves, in the determination
of the number and masses of the neutrinos, in the search for evidence of new
light relics, in constraining the nature of dark energy, and in testing general
relativity on large scales.
Status of CMB observations in 2015
The 2.725 K cosmic microwave background has played a key role in the
development of modern cosmology by providing a solid observational foundation
for constraining possible theories of what happened at very large redshifts and
theoretical speculation reaching back almost to the would-be big bang initial
singularity. After recounting some of the lesser known history of this area, I
summarize the current observational situation and also discuss some exciting
challenges that lie ahead: the search for B modes, the precision mapping of the
CMB gravitational lensing potential, and the ultra-precise characterization of
the CMB frequency spectrum, which would allow the exploitation of spectral
distortions to probe new physics.
Comment: 17 pages, 3 figures, LaTeX, conference proceeding based on a talk at CosPA 2015 in Daejeon, South Korea, in October 2015; minor typos corrected
ASCR/HEP Exascale Requirements Review Report
This draft report summarizes and details the findings, results, and
recommendations derived from the ASCR/HEP Exascale Requirements Review meeting
held in June 2015. The main conclusions are as follows. 1) Larger, more
capable computing and data facilities are needed to support HEP science goals
in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of
the demand at the 2025 timescale is at least two orders of magnitude larger than what is currently available -- and in some cases greater. 2) The growth rate of data
produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new
techniques for data analysis are urgently needed. 3) Data rates and volumes
from HEP experimental facilities are also straining the ability to store and analyze these large and complex datasets. Appropriately configured
leadership-class facilities can play a transformational role in enabling
scientific discovery from these datasets. 4) A close integration of HPC
simulation and data analysis will aid greatly in interpreting results from HEP
experiments. Such an integration will minimize data movement and facilitate
interdependent workflows. 5) Long-range planning between HEP and ASCR will be
required to meet HEP's research needs. To best use ASCR HPC resources the
experimental HEP program needs a) an established long-term plan for access to
ASCR computational and data resources, b) an ability to map workflows onto HPC
resources, c) the ability for ASCR facilities to accommodate workflows run by
collaborations that can have thousands of individual members, d) to transition
codes to the next-generation HPC platforms that will be available at ASCR
facilities, e) to build up and train a workforce capable of developing and
using simulations and analysis to support HEP scientific research on
next-generation systems.
Comment: 77 pages, 13 figures; draft report, subject to further revision
Mapping Cosmic Dawn and Reionization: Challenges and Synergies
Cosmic dawn and the Epoch of Reionization (EoR) are among the least explored
observational eras in cosmology: a time at which the first galaxies and
supermassive black holes formed and reionized the cold, neutral Universe of the
post-recombination era. With current instruments, only a handful of the
brightest galaxies and quasars from that time are detectable as individual
objects, due to their extreme distances. Fortunately, a multitude of
multi-wavelength intensity mapping measurements, ranging from the redshifted 21
cm background in the radio to the unresolved X-ray background, contain a
plethora of synergistic information about this elusive era. The coming decade
will likely see direct detections of inhomogeneous reionization with CMB and 21
cm observations, and a slew of other probes covering overlapping areas and
complementary physical processes will provide crucial additional information
and cross-validation. To maximize scientific discovery and return on
investment, coordinated survey planning and joint data analysis should be a
high priority, closely coupled to computational models and theoretical
predictions.
Comment: 5 pages, 1 figure, submitted to the Astro2020 Decadal Survey Science White Paper call
Decaying dark energy in light of the latest cosmological dataset
Decaying Dark Energy models modify the background evolution of the most
common observables, such as the Hubble function, the luminosity distance and
the Cosmic Microwave Background temperature-redshift scaling relation. We use
the most recent observationally-determined datasets, including Supernovae Type
Ia and Gamma Ray Bursts data, along with Cosmic Microwave Background temperature-versus-redshift data and the reduced Cosmic Microwave Background parameters, to improve the previous constraints on these models. We perform a Markov Chain Monte Carlo analysis to constrain the parameter space on the basis of two distinct methods. In the first method, the Hubble constant
and the matter density are left to vary freely. In this case, our results are
compatible with previous analyses associated with decaying Dark Energy models,
as well as with the most recent description of the cosmological background. In the second method, we set the Hubble constant and the matter density to
their best-fit values obtained by the Planck satellite, reducing the
parameter space to two dimensions, and improving the existing constraints on
the model's parameters. Our results suggest that the accelerated expansion of
the Universe is well described by the cosmological constant, and we argue that
forthcoming observations will play a decisive role in constraining or ruling out decaying Dark Energy.
Comment: 15 pages, 3 figures, 2 tables. Accepted in the Special Issue "Cosmological Inflation, Dark Matter and Dark Energy" of the journal Symmetry
Destriping Cosmic Microwave Background Polarimeter data
Destriping is a well-established technique for removing low-frequency
correlated noise from Cosmic Microwave Background (CMB) survey data. In this
paper we present a destriping algorithm tailored to data from a polarimeter,
i.e. an instrument where each channel independently measures the polarization
of the input signal.
We also describe a fully parallel implementation in Python released as Free
Software, and analyze its results and performance on simulated datasets, both for the design case of signal and correlated noise and with additional systematic
effects.
Finally we apply the algorithm to 30 days of 37.5 GHz polarized microwave
data gathered from the B-Machine experiment, developed at UCSB. The B-Machine
data and destriped maps are made publicly available.
The aim is to develop a scalable software tool to be applied to the upcoming 12 months of temperature and polarization data from LATTE (Low frequency All sky TemperaTure Experiment) at 8 GHz, and to even larger datasets.
Comment: Submitted to Astronomy and Computing on 15th August 2013, published 7th November 201
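As a rough illustration of the destriping idea (not the released implementation), the Python sketch below models the correlated noise as one constant offset per chunk of samples and alternates between binning a sky map and re-estimating the offsets from the residuals; the chunk length, data sizes, and noise levels are invented for the example.

import numpy as np

def destripe(tod, pixels, npix, baseline_length, n_iter=20):
    # Alternate between (1) binning the offset-cleaned TOD into a map and
    # (2) re-estimating one offset ("baseline") per chunk from the map residuals.
    nbase = int(np.ceil(tod.size / baseline_length))
    base_index = np.arange(tod.size) // baseline_length
    baselines = np.zeros(nbase)
    for _ in range(n_iter):
        cleaned = tod - baselines[base_index]
        hits = np.bincount(pixels, minlength=npix)
        sky = np.bincount(pixels, weights=cleaned, minlength=npix)
        sky[hits > 0] /= hits[hits > 0]
        residual = tod - sky[pixels]
        sums = np.bincount(base_index, weights=residual, minlength=nbase)
        counts = np.bincount(base_index, minlength=nbase)
        baselines = sums / counts
        baselines -= baselines.mean()   # remove the degenerate global offset
    return sky, baselines

# Toy usage with synthetic data.
rng = np.random.default_rng(2)
npix, nsamp, blen = 2_000, 200_000, 500
sky_in = rng.normal(size=npix)
pix = rng.integers(0, npix, size=nsamp)
drift = np.repeat(rng.normal(scale=2.0, size=nsamp // blen), blen)   # low-frequency noise
tod = sky_in[pix] + drift + 0.1 * rng.normal(size=nsamp)
destriped_map, offsets = destripe(tod, pix, npix, blen)

The paper's algorithm additionally handles polarized data and runs fully in parallel; production destripers typically solve the corresponding least-squares system directly or with conjugate gradients.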
Modeling and replicating statistical topology, and evidence for CMB non-homogeneity
Under the banner of `Big Data', the detection and classification of structure
in extremely large, high-dimensional data sets is one of the central
statistical challenges of our times. Among the most intriguing approaches to
this challenge is `TDA', or `Topological Data Analysis', one of the primary
aims of which is providing non-metric, but topologically informative,
pre-analyses of data sets which make later, more quantitative analyses
feasible. While TDA rests on strong mathematical foundations from Topology, in
applications it has faced challenges due to an inability to handle issues of
statistical reliability and robustness and, most importantly, an inability
to make scientific claims with verifiable levels of statistical confidence. We
propose a methodology for the parametric representation, estimation, and
replication of persistence diagrams, the main diagnostic tool of TDA. The power
of the methodology lies in the fact that even if only one persistence diagram
is available for analysis -- the typical case for big data applications --
replications can be generated to allow for conventional statistical hypothesis
testing. The methodology is conceptually simple and computationally practical,
and provides a broadly effective statistical procedure for persistence diagram
TDA analysis. We demonstrate the basic ideas on a toy example, and the power of
the approach in a novel and revealing analysis of CMB non-homogeneity.
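The Python sketch below illustrates only the replication-for-testing idea in simplified form: fit simple parametric distributions (an assumed Normal for births and an Exponential for lifetimes) to a single persistence diagram, draw replicate diagrams, and compare an observed summary statistic against the replicate distribution. The paper's actual parametric representation of persistence diagrams is more elaborate than this toy model.

import numpy as np

def fit_diagram(births, deaths):
    # Toy parametric fit: births ~ Normal(mu, sd), lifetimes ~ Exponential(mean_life).
    lifetimes = deaths - births
    return births.mean(), births.std(ddof=1), lifetimes.mean()

def replicate_diagram(params, n_points, rng):
    mu_b, sd_b, mean_life = params
    births = rng.normal(mu_b, sd_b, size=n_points)
    lifetimes = rng.exponential(mean_life, size=n_points)
    return births, births + lifetimes

def total_persistence(births, deaths):
    # Summary statistic used for the Monte Carlo test.
    return np.sum(deaths - births)

# Hypothetical observed diagram: 50 (birth, death) pairs.
rng = np.random.default_rng(3)
obs_births = rng.normal(0.5, 0.1, size=50)
obs_deaths = obs_births + rng.exponential(0.2, size=50)

params = fit_diagram(obs_births, obs_deaths)
obs_stat = total_persistence(obs_births, obs_deaths)
rep_stats = np.array([
    total_persistence(*replicate_diagram(params, obs_births.size, rng))
    for _ in range(2000)
])
# Two-sided Monte Carlo p-value for the observed statistic under the fitted model.
p_value = np.mean(np.abs(rep_stats - rep_stats.mean()) >= np.abs(obs_stat - rep_stats.mean()))
print(f"replicate-based p-value for the total-persistence statistic: {p_value:.3f}")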
- …