Non-Gaussian Foreground Residuals of the WMAP First Year Maps
We investigate the effect of foreground residuals in the WMAP data (Bennett et
al. 2004) by adding foreground contamination to Gaussian ensembles of CMB
signal and noise maps. We evaluate a set of non-Gaussian estimators on the
contaminated ensembles to determine with what accuracy any residual in the data
can be constrained using higher order statistics. We apply the estimators to
the raw and cleaned Q, V, and W band first year maps. The foreground
subtraction method applied to clean the data in Bennett et al. (2004a) appears
to have induced a correlation between the power spectra and normalized
bispectra of the maps which is absent in Gaussian simulations. It also appears
to increase the correlation between the dl=1 inter-l bispectrum of the cleaned
maps and the foreground templates. In a number of cases the significance of the
effect is above the 98% confidence level.
Comment: 9 pages, 4 figures
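The injection-and-ensemble strategy described above can be illustrated with a one-dimensional toy version, using sample skewness as the simplest non-Gaussian estimator in place of the paper's normalized-bispectrum statistics. The map size, template shape, and contamination amplitude below are all illustrative, not values from the paper:

```python
import random
import statistics

def skewness(xs):
    """Sample skewness: a minimal non-Gaussian estimator, standing in
    for the bispectrum-based statistics used in the paper."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)

random.seed(1)
npix = 20000
# a positively skewed toy "foreground template"
template = [random.random() ** 3 for _ in range(npix)]
# one Gaussian "signal + noise" realization, and a contaminated copy
clean = [random.gauss(0.0, 1.0) for _ in range(npix)]
contaminated = [g + 3.0 * f for g, f in zip(clean, template)]

print(abs(skewness(clean)) < 0.1)          # True: consistent with Gaussian
print(abs(skewness(contaminated)) > 0.15)  # True: the estimator flags the residual
```

Evaluating the same estimator on many such contaminated realizations is what calibrates how accurately a residual can be constrained.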
Primeval symmetries
A detailed examination of the Killing equations in Robertson-Walker
coordinates shows how the addition of matter and/or radiation to a de Sitter
Universe breaks the symmetry generated by four of its Killing fields. The
product U = (a^2)(dH/dt) of the squared scale parameter by the time-derivative
of the Hubble function encapsulates the relationship between the two cases: the
symmetry is maximal when U is a constant, and reduces to the six-parameter
symmetry of a generic Friedmann-Robertson-Walker model when it is not. As the
fields' physical interpretation is not clear in these coordinates, comparison is
made with the Killing fields in static coordinates, whose interpretation is
made clearer by their direct relationship to the Poincaré group generators via
Wigner-Inönü contractions.
Comment: 16 pages, 2 tables; published version
Log Skeletons: A Classification Approach to Process Discovery
To test the effectiveness of process discovery algorithms, a Process
Discovery Contest (PDC) has been set up. This PDC uses a classification
approach to measure this effectiveness: The better the discovered model can
classify whether or not a new trace conforms to the event log, the better the
discovery algorithm is supposed to be. Unfortunately, even the state-of-the-art
fully-automated discovery algorithms score poorly on this classification. Even
the best of these algorithms, the Inductive Miner, scored only 147 correctly
classified traces out of 200 on the PDC of 2017. This paper introduces
the rule-based log skeleton model, which is closely related to the Declare
constraint model, together with a way to classify traces using this model. This
classification using log skeletons is shown to score better on the PDC of 2017
than state-of-the-art discovery algorithms: 194 out of 200. As a result, one
can argue that the fully-automated algorithm to construct (or: discover) a log
skeleton from an event log outperforms existing state-of-the-art
fully-automated discovery algorithms.
Comment: 16 pages with 9 figures, followed by an appendix of 14 pages with 17 figures
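The classification idea (mine declarative relations from a log, then reject traces that violate them) can be sketched with one deliberately simplified toy relation. This is a hypothetical stand-in, not the actual log-skeleton relations or Declare semantics from the paper:

```python
def mine_always_after(log):
    """Mine pairs (a, b) such that in every trace containing a,
    activity b occurs somewhere after the last occurrence of a.
    A crude stand-in for one log-skeleton-style relation."""
    activities = {act for trace in log for act in trace}
    relation = set()
    for a in activities:
        for b in activities - {a}:
            holds = all(
                b in trace[max(i for i, e in enumerate(trace) if e == a) + 1:]
                for trace in log if a in trace
            )
            if holds:
                relation.add((a, b))
    return relation

def conforms(trace, relation):
    """Classify a trace: it conforms iff it violates no mined pair."""
    for a, b in relation:
        if a in trace:
            last_a = max(i for i, e in enumerate(trace) if e == a)
            if b not in trace[last_a + 1:]:
                return False
    return True

log = [("register", "check", "pay"), ("register", "pay")]
relation = mine_always_after(log)  # {("register", "pay"), ("check", "pay")}
print(conforms(("register", "pay"), relation))    # True
print(conforms(("register", "check"), relation))  # False: "pay" never follows
```

A real log skeleton combines several such relations (equivalence, always-before, never-together, and others), which is what makes its classification much sharper.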
Nonextensivity in the solar magnetic activity during the increasing phase of solar Cycle 23
In this paper we analyze the behavior of the daily Sunspot Number from the
Sunspot Index Data Center (SIDC), the mean Magnetic Field strength from the
National Solar Observatory/Kitt Peak (NSO/KP) and Total Solar Irradiance means
from Virgo/SoHO, in the context of the q-Triplet which emerges within
nonextensive statistical mechanics. Distributions for the mean solar Magnetic
Field show two different behaviors, with a q-Gaussian for scales of 1 to 16
days and a Gaussian for scales longer than 32 days. The latter corresponds to
an equilibrium state. Distributions for Total Solar Irradiance also show two
different behaviors: approximately Gaussian for scales of 128 days and longer,
consistent with statistical equilibrium, and q-Gaussian for scales shorter than
128 days. Distributions for the Sunspot Number show a q-Gaussian independent of
timescales, consistent with a nonequilibrium state. The q-Triplet values
obtained demonstrate that the Gaussian or q-Gaussian behavior of the
aforementioned data depends significantly on timescales. These results point to
strong multifractal behavior of the datasets analyzed, with the multifractal
level decreasing from Sunspot Number to Total Solar Irradiance. In addition, we
found a numerically satisfied dual relation between two of the q-Triplet indices.
Comment: 6 pages, 4 figures
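For readers unfamiliar with the q-Gaussian family, the following sketch uses the standard Tsallis q-exponential definition; the parameter values are illustrative, not the fitted triplet values from the paper:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)), reducing to e^x as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian(x, q, beta=1.0):
    """Unnormalized q-Gaussian, the distribution family fitted in the abstract."""
    return q_exp(-beta * x * x, q)

# At q = 1 the q-Gaussian is the ordinary Gaussian e^{-beta x^2};
# for q > 1 the tails are heavier, the signature of nonequilibrium.
print(round(q_gaussian(1.5, 1.0), 6))  # exp(-2.25) = 0.105399
print(round(q_gaussian(1.5, 1.5), 6))  # 2.125^(-2) = 0.221453, heavier tail
```

Fitting q for each timescale is what distinguishes the equilibrium (Gaussian, q = 1) and nonequilibrium (q-Gaussian, q > 1) regimes described above.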
Non-Chern-Simons Topological Mass Generation in (2+1) Dimensions
By dimensional reduction of a massive BF theory, a new topological field
theory is constructed in (2+1) dimensions. Two different topological terms, one
involving a scalar field and a Kalb-Ramond field and another one equivalent to
the four-dimensional BF term, are present. We construct two actions with these
topological terms and show that a topological mass generation mechanism can be
implemented. Using the non-Chern-Simons topological term, an action is proposed
leading to a classical duality relation between Klein-Gordon and Maxwell
actions. We also show that an action in (2+1) dimensions with the
Kalb-Ramond field is related by Buscher's duality transformation to a massive
gauge-invariant Stückelberg-type theory.
Comment: 8 pages, no figures, RevTeX
Principal Component Analysis as a Tool for Characterizing Black Hole Images and Variability
We explore the use of principal component analysis (PCA) to characterize
high-fidelity simulations and interferometric observations of the millimeter
emission that originates near the horizons of accreting black holes. We show
mathematically that the Fourier transforms of eigenimages derived from PCA
applied to an ensemble of images in the spatial domain are identical to the
eigenvectors of PCA applied to the ensemble of the Fourier transforms of the
images, which suggests that this approach may be applied to modeling the sparse
interferometric Fourier-visibilities produced by an array such as the Event
Horizon Telescope (EHT). We also show that the simulations in the spatial
domain themselves can be compactly represented with a PCA-derived basis of
eigenimages, allowing for detailed comparisons between variable observations and
time-dependent models, as well as for detection of outliers or rare events
within a time series of images. Furthermore, we demonstrate that the spectrum
of PCA eigenvalues is a diagnostic of the power spectrum of the structure and,
hence, of the underlying physical processes in the simulated and observed
images.
Comment: 16 pages, 17 figures, submitted to ApJ
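The claimed equivalence between the two routes to the eigenbasis (the FFT of each spatial eigenimage versus the eigenvectors of PCA applied to the FFTs) can be checked numerically on a toy ensemble; the random data and image size here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_img, npix = 50, 8  # toy ensemble; real image libraries are far larger
imgs = rng.normal(size=(n_img, npix, npix))
imgs -= imgs.mean(axis=0)  # center the ensemble, as PCA requires

# PCA in the image (spatial) domain via SVD of the data matrix
X = imgs.reshape(n_img, -1)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
top_eigenimage = Vt[0].reshape(npix, npix)  # leading principal component

# PCA in the Fourier domain: eigenvectors of the complex covariance
# sum_k f_k f_k^H built from the unitary FFTs of the same images
F = np.fft.fft2(imgs, norm="ortho").reshape(n_img, -1)
cov = F.T @ F.conj()
evals, evecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
v = evecs[:, -1]                    # eigenvector of the largest eigenvalue

# The FFT of the spatial eigenimage should equal v up to a global phase,
# so the magnitude of their inner product is 1
ft_eig = np.fft.fft2(top_eigenimage, norm="ortho").ravel()
overlap = abs(np.vdot(ft_eig, v))
print(round(overlap, 6))  # 1.0
```

The unitary (`norm="ortho"`) FFT is what makes the two covariance problems similar, so the same construction carries over to sparse visibility samples.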