New Acceleration of Nearly Optimal Univariate Polynomial Root-finders
Univariate polynomial root-finding has been studied for four millennia and is
still the subject of intensive research. Hundreds of efficient algorithms for
this task have been proposed. Two of them are nearly optimal. The first one,
proposed in 1995, relies on recursive factorization of a polynomial, is quite
involved, and has never been implemented. The second one, proposed in 2016,
relies on subdivision iterations, was implemented in 2018, and promises to be
practically competitive, although the users' current choice for univariate
polynomial root-finding is the package MPSolve, proposed in 2000, revised in
2014, and based on Ehrlich's functional iterations. By proposing and
incorporating some novel techniques we significantly accelerate both
subdivision and Ehrlich's iterations. Moreover, our acceleration of the known
subdivision root-finders is dramatic in the case of sparse input polynomials.
Our techniques can be of some independent interest for the design and analysis
of polynomial root-finders.
Comment: 89 pages, 5 figures, 2 tables
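For context, Ehrlich's functional iterations mentioned above (also known as the Aberth-Ehrlich method) update all root approximations simultaneously using a Newton step damped by pairwise repulsion terms. The following is a minimal NumPy sketch of the classical textbook scheme, not the accelerated variants proposed in the paper; the function name and test polynomial are our own illustration:

```python
import numpy as np

def ehrlich_roots(coeffs, tol=1e-12, max_iter=100):
    """Approximate all roots of a polynomial by classical Ehrlich (Aberth-Ehrlich)
    iterations. coeffs are given highest-degree first, as for numpy.poly1d."""
    coeffs = np.asarray(coeffs, dtype=complex)
    p = np.poly1d(coeffs)
    dp = p.deriv()
    n = p.order
    # Initial guesses: n points on a circle whose radius is the Cauchy bound.
    radius = 1 + np.max(np.abs(coeffs[1:] / coeffs[0]))
    z = radius * np.exp(2j * np.pi * (np.arange(n) + 0.5) / n)
    for _ in range(max_iter):
        newton = p(z) / dp(z)                      # Newton corrections N_i
        diff = z[:, None] - z[None, :]
        np.fill_diagonal(diff, 1.0)                # dummy value, removed below
        inv = 1.0 / diff
        np.fill_diagonal(inv, 0.0)
        repulse = inv.sum(axis=1)                  # S_i = sum_{j != i} 1/(z_i - z_j)
        correction = newton / (1 - newton * repulse)
        z = z - correction                         # z_i <- z_i - N_i / (1 - N_i S_i)
        if np.max(np.abs(correction)) < tol:
            break
    return z
```

For a polynomial with well-separated simple roots, such as (z-1)(z-2)(z-3), the iteration converges rapidly from the circle of initial guesses.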
A Tensor Approach to Learning Mixed Membership Community Models
Community detection is the task of detecting hidden communities from observed
interactions. Guaranteed community detection has so far been mostly limited to
models with non-overlapping communities such as the stochastic block model. In
this paper, we remove this restriction, and provide guaranteed community
detection for a family of probabilistic network models with overlapping
communities, termed as the mixed membership Dirichlet model, first introduced
by Airoldi et al. This model allows for nodes to have fractional memberships in
multiple communities and assumes that the community memberships are drawn from
a Dirichlet distribution. Moreover, it contains the stochastic block model as a
special case. We propose a unified approach to learning these models via a
tensor spectral decomposition method. Our estimator is based on low-order
moment tensor of the observed network, consisting of 3-star counts. Our
learning method is fast and is based on simple linear algebraic operations,
e.g. singular value decomposition and tensor power iterations. We provide
guaranteed recovery of community memberships and model parameters and present a
careful finite sample analysis of our learning method. As an important special
case, our results match the best known scaling requirements for the
(homogeneous) stochastic block model.
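The tensor power iterations mentioned above extract the components of a symmetric third-order moment tensor, much as matrix power iteration extracts a leading eigenvector. The sketch below is a generic illustration of one such iteration on a small symmetric tensor; it is not the paper's full estimator (which builds the tensor from 3-star counts and whitens it first), and the function name and test tensor are our own:

```python
import numpy as np

def tensor_power_iteration(T, n_iter=100, tol=1e-10, rng=None):
    """Find one robust eigenpair (lam, v) of a symmetric 3rd-order tensor T
    of shape (k, k, k) via the tensor power method: v <- T(I, v, v) / ||.||"""
    rng = np.random.default_rng(0) if rng is None else rng
    k = T.shape[0]
    v = rng.standard_normal(k)
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        # Contract T along two modes: w_i = sum_{j,l} T[i,j,l] v_j v_l
        w = np.einsum('ijk,j,k->i', T, v, v)
        w /= np.linalg.norm(w)
        if np.linalg.norm(w - v) < tol:
            v = w
            break
        v = w
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # associated eigenvalue
    return lam, v
```

On a tensor with an orthogonal decomposition, repeated runs with deflation recover all components, which in the community-model setting correspond to community membership parameters.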
Differential fast fixed-point algorithms for underdetermined instantaneous and convolutive partial blind source separation
This paper concerns underdetermined linear instantaneous and convolutive
blind source separation (BSS), i.e., the case when the number of observed mixed
signals is lower than the number of sources. We propose partial BSS methods,
which separate supposedly nonstationary sources of interest (while keeping
residual components for the other, supposedly stationary, "noise" sources).
These methods are based on the general differential BSS concept that we
introduced before. In the instantaneous case, the approach proposed in this
paper consists of a differential extension of the FastICA method (which does
not apply to underdetermined mixtures). In the convolutive case, we extend our
recent time-domain fast fixed-point C-FICA algorithm to underdetermined
mixtures. Both proposed approaches thus keep the attractive features of the
FastICA and C-FICA methods. Our approaches are based on differential sphering
processes, followed by the optimization of the differential nonnormalized
kurtosis that we introduce in this paper. Experimental tests show that these
differential algorithms are much more robust to noise sources than the standard
FastICA and C-FICA algorithms.
Comment: this paper describes our differential FastICA-like algorithms for
linear instantaneous and convolutive underdetermined mixtures.
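To illustrate the fixed-point principle that the proposed methods build on, here is a minimal sketch of the standard one-unit kurtosis-based FastICA update on whitened (sphered) data. It shows the classical contrast optimization, not the authors' differential sphering or differential nonnormalized kurtosis; the function name and test mixture are our own illustration:

```python
import numpy as np

def fastica_kurtosis(Z, n_iter=200, tol=1e-8, rng=None):
    """One-unit kurtosis-based FastICA on whitened data Z (dims x samples).
    Fixed-point update: w <- E[z (w^T z)^3] - 3 w, then renormalize."""
    rng = np.random.default_rng(1) if rng is None else rng
    d, _ = Z.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ Z
        w_new = (Z * y**3).mean(axis=1) - 3 * w
        w_new /= np.linalg.norm(w_new)
        # The sign of w is arbitrary, so test convergence up to sign.
        if min(np.linalg.norm(w_new - w), np.linalg.norm(w_new + w)) < tol:
            w = w_new
            break
        w = w_new
    return w
```

Applied after sphering a determined two-source mixture, the extracted projection w @ Z recovers one independent source up to sign and scale; the differential algorithms of the paper replace the plain statistics here with differential ones to handle the underdetermined, partial-separation case.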