Multivariate soft rank via entropic optimal transport: sample efficiency and generative modeling
The framework of optimal transport has been leveraged to extend the notion of
rank to the multivariate setting while preserving desirable properties of the
resulting goodness-of-fit (GoF) statistics. In particular, the rank energy (RE)
and rank maximum mean discrepancy (RMMD) are distribution-free under the null,
exhibit high power in statistical testing, and are robust to outliers. In this
paper, we point to and alleviate some of the practical shortcomings of these
proposed GoF statistics, namely their high computational cost, high statistical
sample complexity, and lack of differentiability with respect to the data. We
show that all these practically important issues are addressed by considering
entropy-regularized optimal transport maps in place of the rank map, which we
refer to as the soft rank. We consequently propose two new statistics, the soft
rank energy (sRE) and soft rank maximum mean discrepancy (sRMMD), which exhibit
several desirable properties. Given n sample data points, we provide
non-asymptotic convergence rates for the sample estimate of the entropic
transport map to its population version that are essentially of the order
n^{-1/2}. This compares favorably to non-regularized estimates, which
typically suffer from the curse of dimensionality and converge at a rate
that deteriorates exponentially with the data dimension. We leverage this
fast convergence rate to
demonstrate that the sample estimates of the proposed statistics converge
rapidly to their population versions, enabling efficient rank-based GoF statistical
computation, even in high dimensions. Our statistics are differentiable and
amenable to popular machine learning frameworks that rely on gradient methods.
We leverage these properties towards showcasing the utility of the proposed
statistics for generative modeling on two important problems: image generation
and generating valid knockoffs for controlled feature selection.

Comment: 43 pages, 10 figures. Replacement note: title change, author changes,
new theoretical results, revised and expanded experimental evaluation.
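As a rough illustration of the entropic-map idea behind the soft rank, the sketch below computes a Sinkhorn transport plan between two empirical samples and takes its barycentric projection as a smooth surrogate rank map. The function names and the parameters (eps, n_iter) are illustrative assumptions, not the paper's implementation; in particular the reference sample Y stands in for the target measure of the rank map.

```python
import numpy as np

def sinkhorn_plan(X, Y, eps=0.1, n_iter=200):
    """Entropic OT plan between uniform empirical measures on X and Y, both (n, d)."""
    n = X.shape[0]
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)  # squared-distance cost
    K = np.exp(-C / eps)                                     # Gibbs kernel
    a = b = np.full(n, 1.0 / n)                              # uniform marginals
    u = v = np.full(n, 1.0 / n)
    for _ in range(n_iter):                                  # Sinkhorn scaling iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]                       # transport plan, sums to 1

def soft_rank(X, Y, eps=0.1):
    """Barycentric projection of the entropic plan: a smooth stand-in for the rank map."""
    P = sinkhorn_plan(X, Y, eps)
    return (P @ Y) / P.sum(axis=1, keepdims=True)            # each row: convex comb. of Y
```

Because each soft rank is a convex combination of reference points, the map is differentiable in the data, which is the property the abstract exploits for gradient-based generative modeling.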
Rapid computation of far-field statistics for random obstacle scattering
In this article, we consider the numerical approximation of far-field
statistics for acoustic scattering problems in the case of random obstacles. In
particular, we consider the computation of the expected far-field pattern and
the expected scattered wave away from the scatterer as well as the computation
of the corresponding variances. To that end, we introduce an artificial
interface, which almost surely contains all realizations of the random
scatterer. At this interface, we directly approximate the second-order
statistics, i.e., the expectation and the variance, of the Cauchy data by means
of boundary integral equations. From these quantities, we are able to rapidly
evaluate statistics of the scattered wave everywhere in the exterior domain,
including the expectation and the variance of the far-field. By employing a
low-rank approximation of the Cauchy data's two-point correlation function, we
drastically reduce the cost of the computation of the scattered wave's
variance. Numerical results are provided to demonstrate the feasibility of
the proposed approach.
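The low-rank compression step can be illustrated in a simplified, discretized setting: given a sampled two-point correlation matrix C, a truncated eigendecomposition C ≈ L L^T lets one evaluate the variance of a linear functional of the field in O(nk) rather than O(n^2) operations. This is a minimal sketch under that matrix-discretization assumption, not the boundary-integral formulation of the article.

```python
import numpy as np

def low_rank_factor(C, k):
    """Truncated eigendecomposition C ~ L @ L.T with L of shape (n, k)."""
    w, V = np.linalg.eigh(C)                 # C symmetric positive semi-definite
    idx = np.argsort(w)[::-1][:k]            # keep the k largest eigenvalues
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))

def variance_of_functional(L, a):
    """Var(a^T u) = a^T C a ~ ||L^T a||^2, costing O(nk) instead of O(n^2)."""
    return float(np.sum((L.T @ a) ** 2))
```

When the two-point correlation is (numerically) low rank, as the article exploits, the truncation is exact up to the discarded eigenvalues, so variances of far-field quantities can be evaluated from the factor alone.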
Training linear ranking SVMs in linearithmic time using red-black trees
We introduce an efficient method for training the linear ranking support
vector machine. The method combines cutting-plane optimization with a
red-black-tree-based approach to subgradient calculations, and has
O(m*s + m*log(m)) time complexity, where m is the number of training
examples and s is the average
number of non-zero features per example. Best previously known training
algorithms achieve the same efficiency only for restricted special cases,
whereas the proposed approach allows any real valued utility scores in the
training data. Experiments demonstrate the superior scalability of the proposed
approach compared to the fastest existing RankSVM implementations.

Comment: 20 pages, 4 figures.
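The order-statistics idea behind the fast subgradient computation can be illustrated by counting mis-ordered pairs in O(m log m). The sketch below uses a Fenwick tree as a stand-in for a balanced search tree such as a red-black tree, and assumes distinct utilities and scores for simplicity; it demonstrates the counting primitive, not the paper's full training algorithm.

```python
class Fenwick:
    """Binary indexed tree supporting point insert and prefix count, 1-indexed."""
    def __init__(self, n):
        self.t = [0] * (n + 1)
    def add(self, i):
        while i < len(self.t):
            self.t[i] += 1
            i += i & -i
    def prefix(self, i):                     # number of inserted ranks <= i
        s = 0
        while i > 0:
            s += self.t[i]
            i -= i & -i
        return s

def count_violated_pairs(utilities, scores):
    """Pairs (i, j) with utilities[i] < utilities[j] but scores[i] > scores[j],
    counted in O(m log m); assumes all values are distinct."""
    m = len(scores)
    order = sorted(range(m), key=lambda i: utilities[i])      # sweep in utility order
    rank = {s: r + 1 for r, s in enumerate(sorted(scores))}   # compress scores to ranks
    fen = Fenwick(m)
    total = 0
    for seen, i in enumerate(order):
        r = rank[scores[i]]
        total += seen - fen.prefix(r)        # earlier (lower-utility) items with larger score
        fen.add(r)
    return total
```

A naive double loop over pairs costs O(m^2); keeping the scores seen so far in an order-statistics structure reduces each query to O(log m), which is the mechanism that makes the linearithmic training time possible.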