CRDTs: Consistency without concurrency control
A CRDT is a data type whose operations commute when they are concurrent.
Replicas of a CRDT eventually converge without any complex concurrency control.
As an existence proof, we exhibit a non-trivial CRDT: a shared edit buffer
called Treedoc. We outline the design, implementation and performance of
Treedoc. We discuss how the CRDT concept can be generalised, and its
limitations.
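The defining CRDT property can be illustrated with a much simpler example than Treedoc: a grow-only counter. This is a hedged sketch, not the edit buffer from the paper; it shows how commuting, idempotent merges let replicas converge without any concurrency control.

```python
# Minimal sketch of a simple CRDT: a grow-only counter (G-Counter).
# Illustrative only -- not Treedoc. Each replica tracks per-replica
# increment totals; merging takes the entry-wise maximum, which is
# commutative, associative and idempotent, so replicas converge
# regardless of merge order.

class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica id -> total increments at that replica

    def increment(self, n=1):
        # A replica only ever increases its own entry.
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        # Entry-wise max: safe to apply in any order, any number of times.
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)

# Two replicas diverge, then converge after merging in either order.
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```

The same pattern, commuting concurrent operations plus an order-insensitive merge, is what lets richer CRDTs such as the Treedoc edit buffer converge without locks or consensus.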
Hyper-efficient model-independent Bayesian method for the analysis of pulsar timing data
A new model-independent method is presented for the analysis of pulsar timing
data and the estimation of the spectral properties of an isotropic
gravitational wave background (GWB). We show that by rephrasing the likelihood
we are able to eliminate the most costly aspects of computation normally
associated with this type of data analysis. When applied to the International
Pulsar Timing Array Mock Data Challenge data sets this results in speedups of
approximately 2 to 3 orders of magnitude compared to established methods. We
present three applications of the new likelihood. In the low signal to noise
regime we sample directly from the power spectrum coefficients of the GWB
signal realization. In the high signal to noise regime, where the data can
support a large number of coefficients, we sample from the joint probability
density of the power spectrum coefficients for the individual pulsars and the
GWB signal realization. Critically in both these cases we need make no
assumptions about the form of the power spectrum of the GWB, or the individual
pulsars. Finally we present a method for characterizing the spatial correlation
between pulsars on the sky, making no assumptions about the form of that
correlation, and therefore providing the only truly general Bayesian method of
confirming a GWB detection from pulsar timing data.
Comment: 9 pages, 4 figures
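The idea of sampling spectral coefficients directly can be sketched in a much-simplified single-dataset setting. Below is a hedged toy example, not the paper's method: data are modelled as a Fourier-basis signal plus white noise, and the conditional posterior of the coefficients is a Gaussian that can be sampled in closed form. All names and sizes here are illustrative assumptions.

```python
import numpy as np

# Toy linear-Gaussian model d = F a + n, with F a Fourier design matrix,
# a the signal coefficients and n white noise of known level sigma.
# This illustrates direct sampling of coefficients rather than
# marginalising them; it is NOT the paper's likelihood or notation.
rng = np.random.default_rng(0)
n_obs, n_coef, sigma = 200, 10, 1.0

t = np.linspace(0.0, 1.0, n_obs)
freqs = np.arange(1, n_coef // 2 + 1)
# Sine and cosine basis columns at each frequency.
F = np.column_stack([np.sin(2 * np.pi * f * t) for f in freqs] +
                    [np.cos(2 * np.pi * f * t) for f in freqs])

a_true = rng.normal(size=n_coef)
d = F @ a_true + sigma * rng.normal(size=n_obs)

# With a broad Gaussian prior a ~ N(0, tau^2 I), the conditional posterior
# for the coefficients is Gaussian with known mean and covariance.
tau2 = 100.0
Sigma_inv = F.T @ F / sigma**2 + np.eye(n_coef) / tau2
Sigma = np.linalg.inv(Sigma_inv)
mean = Sigma @ (F.T @ d) / sigma**2

# Draw posterior samples of the coefficients directly.
samples = rng.multivariate_normal(mean, Sigma, size=1000)
```

In the real analysis the basis, noise model and priors are far richer (many pulsars, red noise, a correlated GWB term), but the computational point is the same: working in coefficient space replaces expensive dense-covariance operations with small Gaussian updates.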
A Gravitational Wave Background from Reheating after Hybrid Inflation
The reheating of the universe after hybrid inflation proceeds through the
nucleation and subsequent collision of large concentrations of energy density
in the form of bubble-like structures moving at relativistic speeds. This
generates a significant fraction of energy in the form of a stochastic
background of gravitational waves, whose time evolution is determined by the
successive stages of reheating: First, tachyonic preheating makes the amplitude
of gravity waves grow exponentially fast. Second, bubble collisions add a new
burst of gravitational radiation. Third, turbulent motions finally set the end
of gravitational wave production. From then on, these waves propagate
unimpeded to us. We find that the fraction of energy density today in these
primordial gravitational waves could be significant for GUT-scale models of
inflation, although well beyond the frequency range sensitivity of
gravitational wave observatories like LIGO, LISA or BBO. However, low-scale
models could still produce a detectable signal at frequencies accessible to BBO
or DECIGO. For comparison, we have also computed the analogous gravitational
wave background from some chaotic inflation models and obtained results similar
to those found by other groups. The discovery of such a background would open a
new observational window into the very early universe, where the details of the
process of reheating, i.e. the Big Bang, could be explored. Moreover, it could
also serve in the future as a new experimental tool for testing the
Inflationary Paradigm.
Comment: 22 pages, 18 figures, uses revtex
LIGO's "Science Reach"
Technical discussions of the Laser Interferometer Gravitational Wave
Observatory (LIGO) sensitivity often focus on its effective sensitivity to
gravitational waves in a given band; nevertheless, the goal of the LIGO Project
is to ``do science.'' Exploiting this new observational perspective to explore
the Universe is a long-term goal, toward which LIGO's initial instrumentation
is but a first step. Nevertheless, the first generation LIGO instrumentation is
sensitive enough that even non-detection --- in the form of an upper limit ---
is also informative. In this brief article I describe in quantitative terms
some of the science we can hope to do with first and future generation LIGO
instrumentation: in short, the ``science reach'' of the detector we are
building and the ones we hope to build.
Comment: 13 pages, including 1 inlined figure
Neural Network Aided Glitch-Burst Discrimination and Glitch Classification
We investigate the potential of neural-network based classifiers for
discriminating gravitational wave bursts (GWBs) of a given canonical family
(e.g. core-collapse supernova waveforms) from typical transient instrumental
artifacts (glitches), in the data of a single detector. The further
classification of glitches into typical sets is explored. In order to provide
a proof of concept, we use the core-collapse supernova waveform catalog
produced by H. Dimmelmeier and co-workers, and the database of glitches
observed in Laser Interferometer Gravitational Wave Observatory (LIGO) data
maintained by P. Saulson and co-workers to construct datasets of (windowed)
transient waveforms (glitches and bursts) in additive (Gaussian and
compound-Gaussian) noise with different signal-to-noise ratios (SNR). Principal
component analysis
(PCA) is next implemented for reducing data dimensionality, yielding results
consistent with, and extending those in the literature. Then, a multilayer
perceptron is trained by a backpropagation algorithm (MLP-BP) on a data subset,
and used to classify the transients as glitch or burst. A Self-Organizing Map
(SOM) architecture is finally used to classify the glitches. The glitch/burst
discrimination and glitch classification abilities are gauged in terms of the
related truth tables. Preliminary results suggest that the approach is
effective and robust throughout the SNR range of practical interest.
Prospective applications pertain both to distributed (network, multisensor)
detection of GWBs, where some intelligence at the single-node level can be
introduced, and to instrument diagnostics/optimization, where spurious
transients can be identified, classified and hopefully traced back to their
entry point.
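The PCA-plus-perceptron stage of the pipeline can be sketched on synthetic data. This is a hedged illustration: the toy "burst" and "glitch" waveforms, the component count and the network size below are all assumptions for demonstration, not the catalogs or architecture used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_per_class, n_samples = 200, 256
t = np.linspace(0.0, 1.0, n_samples)

# Toy "bursts": damped sinusoids of varying frequency.
bursts = np.array([np.exp(-8 * t) * np.sin(2 * np.pi * rng.uniform(20, 40) * t)
                   for _ in range(n_per_class)])
# Toy "glitches": short Gaussian transients at varying positions.
glitches = np.array([np.exp(-((t - rng.uniform(0.3, 0.7)) ** 2) / 1e-4)
                     for _ in range(n_per_class)])
X = np.vstack([bursts, glitches])
X += 0.05 * rng.normal(size=X.shape)          # additive noise
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = burst, 1 = glitch

# PCA reduces dimensionality; a small multilayer perceptron trained by
# backpropagation then discriminates glitch from burst.
clf = make_pipeline(PCA(n_components=10),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
idx = rng.permutation(2 * n_per_class)
train, test = idx[:300], idx[300:]
clf.fit(X[train], y[train])
acc = clf.score(X[test], y[test])
```

On well-separated synthetic classes like these, the classifier should score highly; the interesting regime in the paper is real detector noise at low SNR, where the PCA features and truth tables quantify robustness.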