2,479 research outputs found
High-dimensional Ising model selection using $\ell_1$-regularized logistic regression
We consider the problem of estimating the graph associated with a binary
Ising Markov random field. We describe a method based on $\ell_1$-regularized
logistic regression, in which the neighborhood of any given node is estimated
by performing logistic regression subject to an $\ell_1$-constraint. The method
is analyzed under high-dimensional scaling in which both the number of nodes
$p$ and the maximum neighborhood size $d$ are allowed to grow as a function of the
number of observations $n$. Our main results provide sufficient conditions on
the triple $(n, p, d)$ and the model parameters for the method to succeed in
consistently estimating the neighborhood of every node in the graph
simultaneously. With coherence conditions imposed on the population Fisher
information matrix, we prove that consistent neighborhood selection can be
obtained for sample sizes $n = \Omega(d^{3} \log p)$ with exponentially decaying
error. When these same conditions are imposed directly on the sample matrices,
we show that a reduced sample size of $n = \Omega(d^{2} \log p)$ suffices for the
method to estimate neighborhoods consistently. Although this paper focuses on
binary graphical models, we indicate how a generalization of the method of the
paper would apply to general discrete Markov random fields.
Comment: Published in at http://dx.doi.org/10.1214/09-AOS691 the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
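As a concrete illustration of the neighborhood-selection step, here is a minimal sketch in Python using scikit-learn's $\ell_1$-penalized logistic regression. The data layout, the "OR" symmetrization rule, and the mapping between the penalty weight `lam` and scikit-learn's `C` are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ising_neighborhoods(X, lam=0.1):
    """Estimate the graph of a binary Ising model by l1-regularized logistic
    regression: regress each node on all the others and take the nonzero
    coefficients as its neighborhood (sketch; lam is a hypothetical penalty).
    X : (n, p) array of samples with entries in {-1, +1}."""
    n, p = X.shape
    neighborhoods = {}
    for s in range(p):
        y = (X[:, s] == 1).astype(int)       # node s is the response
        Z = np.delete(X, s, axis=1)          # the other p-1 nodes are covariates
        # sklearn minimizes ||w||_1 + C * (sum of log-losses), so C ~ 1/(lam*n)
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0 / (lam * n))
        clf.fit(Z, y)
        others = [t for t in range(p) if t != s]
        neighborhoods[s] = {others[j] for j in np.flatnonzero(clf.coef_[0])}
    # symmetrize with the "OR" rule: keep an edge if either endpoint selects it
    edges = {(min(s, t), max(s, t)) for s, nb in neighborhoods.items() for t in nb}
    return edges
```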
Comparison of History Effects in Magnetization in Weakly Pinned Crystals of High-$T_c$ and Low-$T_c$ Superconductors
A comparison of the history effects in weakly pinned single crystals of a
high-$T_c$ YBa$_2$Cu$_3$O$_{7-\delta}$ (for $H \parallel c$) and a low-$T_c$
Ca$_3$Rh$_4$Sn$_{13}$, which show anomalous variations in critical current
density $J_c(H)$, is presented via tracings of the minor magnetization
hysteresis loops using a vibrating sample magnetometer. The sample histories
focused on are: (i) field cooled (FC), (ii) zero field cooled (ZFC), and
(iii) an isothermal reversal of the field from the normal state. An understanding
of the results in terms of the modulation in the plastic deformation of the
elastic vortex solid and supercooling across the order-disorder transition is
sought.
Comment: Presented in IWCC-200
Violent Hard X-ray Variability of Mrk 421 Observed by NuSTAR in 2013 April
The well studied blazar Markarian 421 (Mrk 421, $z = 0.031$) was the subject of
an intensive multi-wavelength campaign when it flared in 2013 April. The
recorded X-ray and very high energy (VHE, $E > 100$ GeV) $\gamma$-ray fluxes are
the highest ever measured from this object. At the peak of the activity, it was
monitored by the hard X-ray focusing telescope {\it Nuclear Spectroscopic
Telescope Array} ({\it NuSTAR}) and the {\it Swift} X-Ray Telescope (XRT). In this
work, we present a detailed variability analysis of {\it NuSTAR} and {\it
Swift}-XRT observations of Mrk 421 during this flaring episode. We obtained the
shortest flux doubling time of $14.01 \pm 5.03$ minutes, which is the shortest
hard X-ray (3-79 keV) variability ever recorded from Mrk 421 and is on the
order of the light crossing time of the black hole's event horizon. A pattern
of extremely fast variability events superposed on slowly varying flares is
found in most of the {\it NuSTAR} observations. We suggest that these peculiar
variability patterns may be explained by magnetic energy dissipation and
reconnection in a fast moving compact emission region within the jet. Based on
the fast variability, we derive a lower limit on the magnetic field strength
(in gauss) that scales as $\delta_{1}^{-1/3}\,\nu_{18}^{1/3}$, where $\delta_{1}$ is the
Doppler factor in units of 10, and $\nu_{18}$ is the characteristic X-ray
synchrotron frequency in units of $10^{18}$ Hz.
Comment: 23 pages, 5 figures, 2 tables, to appear in the Astrophysical Journal
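For reference, a flux doubling time is conventionally defined by $F(t_2) = F(t_1)\,2^{(t_2 - t_1)/\tau}$. A small sketch of how such times are computed from a light curve follows; this is the standard definition, not necessarily the paper's exact estimator.

```python
import numpy as np

def doubling_times(t, flux):
    """Flux doubling/halving times tau between consecutive light-curve points,
    from F(t2) = F(t1) * 2**((t2 - t1) / tau), i.e.
    tau = (t2 - t1) * ln 2 / |ln(F2 / F1)|."""
    t, flux = np.asarray(t, float), np.asarray(flux, float)
    with np.errstate(divide="ignore"):
        tau = np.diff(t) * np.log(2.0) / np.abs(np.log(flux[1:] / flux[:-1]))
    return tau  # same units as t; inf where the flux did not change

# the fastest variability timescale quoted in such analyses is tau.min()
```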
A Voting-Based System for Ethical Decision Making
We present a general approach to automating ethical decisions, drawing on
machine learning and computational social choice. In a nutshell, we propose to
learn a model of societal preferences, and, when faced with a specific ethical
dilemma at runtime, efficiently aggregate those preferences to identify a
desirable choice. We provide a concrete algorithm that instantiates our
approach; some of its crucial steps are informed by a new theory of
swap-dominance efficient voting rules. Finally, we implement and evaluate a
system for ethical decision making in the autonomous vehicle domain, using
preference data collected from 1.3 million people through the Moral Machine
website.
Comment: 25 pages; paper has been reorganized, related work and discussion
sections have been expanded
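For flavor, here is a toy version of the aggregation step as a plain Borda count over explicit rankings. The paper aggregates *learned* preference models using swap-dominance-efficient rules, so this is only an illustrative stand-in, and the alternatives shown are hypothetical.

```python
from collections import defaultdict

def borda_winner(rankings):
    """Aggregate voters' rankings by Borda count: in a ranking of m
    alternatives, position i (0 = best) earns m - 1 - i points;
    the alternative with the highest total wins."""
    scores = defaultdict(int)
    for ranking in rankings:
        m = len(ranking)
        for i, alt in enumerate(ranking):
            scores[alt] += m - 1 - i
    return max(scores, key=scores.get)

# three hypothetical "voters" ranking outcomes of an ethical dilemma
print(borda_winner([["swerve", "stay"], ["swerve", "stay"], ["stay", "swerve"]]))
# -> swerve
```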
Black Hole Mass Limits for Optically Dark X-ray Bright Sources in Elliptical Galaxies
Estimation of the black hole mass in bright X-ray sources of nearby galaxies
is crucial to the understanding of these systems and their formation. However,
the presently allowed black hole mass range spans five orders of magnitude (10 Msun
< M < 10^5 Msun), with the upper limit obtained from dynamical friction
arguments. We show that the absence of a detectable optical counterpart for
some of these sources can provide a much more stringent upper limit. The
argument is based only on the assumption that the outer regions of their
accretion disks are standard. Moreover, such optically dark X-ray sources
cannot be foreground stars or background active galactic nuclei, and hence must
be accreting systems residing within their host galaxies. As a demonstration we
search for candidates among the point-like X-ray sources detected with Chandra
in thirteen nearby elliptical galaxies. We use a novel technique to search for
faint optical counterparts in the HST images whereby we subtract the bright
galaxy light based on isophotal modeling of the surface brightness. We show
that six sources with no detectable optical emission at the 3sigma
level have black hole masses M_{BH} < 5000 Msun. In particular, an
ultra-luminous X-ray source (ULX) in NGC 4486 has M_{BH} < 1244 Msun. We
discuss the potential of this method to provide stringent constraints on
black hole masses, and the implications for the physical nature of these
sources.
Comment: 11 pages, 1 figure, accepted for publication in ApJ
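The galaxy-light subtraction the authors describe follows the standard isophote-modeling workflow; below is a minimal sketch using photutils. The package choice, the function name `subtract_galaxy`, and all parameters are assumptions for illustration, not the authors' pipeline.

```python
from photutils.isophote import Ellipse, EllipseGeometry, build_ellipse_model

def subtract_galaxy(image, x0, y0, sma0=10.0, eps0=0.2, pa0=0.0):
    """Model and subtract the smooth galaxy light by fitting elliptical
    isophotes, so that faint point sources stand out in the residual.
    image : 2-D array; (x0, y0) : galaxy center in pixels."""
    # initial guess for the isophote geometry: center, semi-major axis,
    # ellipticity, and position angle (radians)
    geometry = EllipseGeometry(x0=x0, y0=y0, sma=sma0, eps=eps0, pa=pa0)
    ellipse = Ellipse(image, geometry=geometry)
    isolist = ellipse.fit_image()                       # isophotes at increasing radii
    model = build_ellipse_model(image.shape, isolist)   # smooth galaxy model
    return image - model                                # galaxy-subtracted residual
```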
Coreference detection of low quality objects
The problem of record linkage is a widely studied problem that aims to identify coreferent (i.e. duplicate) records in a structured data source. As indicated by Winkler, a solution to the record linkage problem is only possible if the error rate is sufficiently low. In other words, in order to successfully deduplicate a database, the objects in the database must be of sufficient quality. However, this assumption does not always hold. In this paper, we investigate how merging several low quality objects into one high quality object can improve the process of record linkage. This general idea is illustrated in the context of string comparison, where strings of low quality (i.e. with a high typographical error rate) are merged into a string of high quality by using an n-dimensional Levenshtein distance matrix and computing the optimal alignment between the dirty strings. Results are presented and possible refinements are proposed.
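The n-dimensional Levenshtein matrix is exponential in the number of strings; the sketch below conveys the same merging idea more cheaply, aligning each dirty string to a pivot by ordinary two-string dynamic programming and taking a per-position majority vote. This is a simplification of the idea, not the paper's exact algorithm.

```python
from collections import Counter

def align(a, b):
    """Levenshtein alignment of two strings by dynamic programming.
    Returns gap-padded versions of a and b ('-' marks a gap)."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]   # d[i][j] = dist(a[:i], b[:j])
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + sub)
    pa, pb, i, j = [], [], m, n                 # backtrack to recover the alignment
    while i > 0 or j > 0:
        if i > 0 and j > 0 and d[i][j] == d[i - 1][j - 1] + (a[i - 1] != b[j - 1]):
            pa.append(a[i - 1]); pb.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and d[i][j] == d[i - 1][j] + 1:
            pa.append(a[i - 1]); pb.append("-"); i -= 1
        else:
            pa.append("-"); pb.append(b[j - 1]); j -= 1
    return "".join(reversed(pa)), "".join(reversed(pb))

def merge(strings):
    """Consensus of noisy strings: align each one to a pivot and take a
    majority vote per pivot position (insertions vs. the pivot are dropped)."""
    pivot = max(strings, key=len)
    votes = [[] for _ in pivot]
    for s in strings:
        ga, gs = align(pivot, s)
        k = 0
        for ca, cs in zip(ga, gs):
            if ca != "-":
                votes[k].append(cs)
                k += 1
    merged = [Counter(v).most_common(1)[0][0] for v in votes]
    return "".join(ch for ch in merged if ch != "-")

print(merge(["levenshtein", "levenstein", "levenshtien"]))  # -> "levenshtein"
```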
Minimum and maximum against k lies
A neat 1972 result of Pohl asserts that [3n/2]-2 comparisons are sufficient,
and also necessary in the worst case, for finding both the minimum and the
maximum of an n-element totally ordered set. The set is accessed via an oracle
for pairwise comparisons. More recently, the problem has been studied in the
context of the Renyi-Ulam liar games, where the oracle may give up to k false
answers. For large k, an upper bound due to Aigner shows that (k+O(\sqrt{k}))n
comparisons suffice. We improve on this by providing an algorithm with at most
(k+1+C)n+O(k^3) comparisons for some constant C. The known lower bounds are of
the form (k+1+c_k)n-D, for some constant D, where c_0=0.5, c_1=23/32=0.71875,
and c_k=\Omega(2^{-5k/4}) as k goes to infinity.
Comment: 11 pages, 3 figures
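For the k = 0 baseline, the pair trick behind Pohl's bound is easy to state in code: order each incoming pair with one comparison, then let only the smaller element challenge the running minimum and the larger challenge the running maximum, giving 3 comparisons per 2 elements. This is a sketch of the classical scheme, without the liar-tolerant machinery of the paper.

```python
def min_max(xs):
    """Return (min, max) of a non-empty sequence using about 3n/2 - 2
    comparisons: one to order each pair, then one each against the
    running minimum and maximum."""
    n = len(xs)
    if n == 0:
        raise ValueError("empty sequence")
    if n % 2:                                   # odd length: seed with one element
        lo = hi = xs[0]
        start = 1
    else:                                       # even length: seed with first pair
        lo, hi = (xs[0], xs[1]) if xs[0] < xs[1] else (xs[1], xs[0])
        start = 2
    for i in range(start, n, 2):
        a, b = xs[i], xs[i + 1]
        if b < a:                               # 1 comparison orders the pair
            a, b = b, a
        if a < lo:                              # smaller element vs. running min
            lo = a
        if b > hi:                              # larger element vs. running max
            hi = b
    return lo, hi

print(min_max([7, 2, 9, 4, 4, 8, 1]))           # -> (1, 9)
```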
