Tree-Independent Dual-Tree Algorithms
Dual-tree algorithms are a widely used class of branch-and-bound algorithms.
Unfortunately, developing dual-tree algorithms for use with different trees and
problems is often complex and burdensome. We introduce a four-part logical
split: the tree, the traversal, the point-to-point base case, and the pruning
rule. We provide a meta-algorithm which allows development of dual-tree
algorithms in a tree-independent manner and easy extension to entirely new
types of trees. Representations are provided for five common algorithms; for
k-nearest neighbor search, this leads to a novel, tighter pruning bound. The
meta-algorithm also allows straightforward extensions to massively parallel
settings.
Comment: accepted in ICML 201
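As a rough illustration of the four-part split described in the abstract, the sketch below implements a dual-tree nearest-neighbor search in Python: a tree (here a simple median-split kd-style tree with bounding boxes), a depth-first dual traversal, a point-to-point base case, and a pruning rule that compares the minimum distance between node boxes against the queries' current best distances. All class and function names are invented for illustration; this is not the paper's API, and the pruning bound shown is the basic one, not the tighter bound the paper derives.

```python
import math
import random

class Node:
    """Median-split tree node holding its points and a bounding box."""
    def __init__(self, pts, depth=0, leaf_size=8):
        self.pts = pts
        d = len(pts[0])
        self.lo = [min(p[i] for p in pts) for i in range(d)]
        self.hi = [max(p[i] for p in pts) for i in range(d)]
        self.children = []
        if len(pts) > leaf_size:
            ax = depth % d
            pts.sort(key=lambda p: p[ax])
            mid = len(pts) // 2
            self.children = [Node(pts[:mid], depth + 1, leaf_size),
                             Node(pts[mid:], depth + 1, leaf_size)]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def min_box_dist(a, b):
    """Lower bound on the distance between any point in a and any point in b."""
    s = 0.0
    for lo1, hi1, lo2, hi2 in zip(a.lo, a.hi, b.lo, b.hi):
        gap = max(lo2 - hi1, lo1 - hi2, 0.0)
        s += gap * gap
    return math.sqrt(s)

def dual_tree_nn(qnode, rnode, best):
    """Dual depth-first traversal; best maps id(query) -> (distance, neighbor)."""
    # Pruning rule: skip the node pair if no reference point in rnode
    # can improve any query point's current nearest-neighbor distance.
    if min_box_dist(qnode, rnode) > max(best[id(q)][0] for q in qnode.pts):
        return
    if not qnode.children and not rnode.children:
        # Point-to-point base case.
        for q in qnode.pts:
            for r in rnode.pts:
                d = dist(q, r)
                if d < best[id(q)][0]:
                    best[id(q)] = (d, r)
    else:
        for qc in (qnode.children or [qnode]):
            for rc in (rnode.children or [rnode]):
                dual_tree_nn(qc, rc, best)
```

Because the traversal and pruning rule only touch nodes through their bounding boxes, swapping in a different tree type only requires reimplementing `Node` and `min_box_dist`, which is the tree-independence the abstract describes.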
Dual Geometric Worm Algorithm for Two-Dimensional Discrete Classical Lattice Models
We present a dual geometrical worm algorithm for two-dimensional Ising
models. The existence of such dual algorithms was first pointed out by
Prokof'ev and Svistunov \cite{ProkofevClassical}. The algorithm is defined on
the dual lattice and is formulated in terms of bond-variables and can therefore
be generalized to other two-dimensional models that can be formulated in terms
of bond-variables. We also discuss two related algorithms formulated on the
direct lattice, applicable in any dimension. These latter algorithms turn out
to be less efficient but of considerable intrinsic interest. We show how such
algorithms quite generally can be "directed" by minimizing the probability for
the worms to erase themselves. Explicit proofs of detailed balance are given
for all the algorithms. In terms of computational efficiency, the dual
geometrical worm algorithm is comparable to well-known cluster algorithms such
as the Swendsen-Wang and Wolff algorithms; however, it is quite different in
structure and allows for a very simple and efficient implementation. The dual
algorithm also allows for a very elegant way of calculating the domain-wall
free energy.
Comment: 12 pages, 6 figures, RevTeX
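For intuition about the direct-lattice worm algorithms the abstract mentions, here is a hypothetical minimal sketch of a Metropolis worm for the 2D Ising model in its high-temperature (bond) representation on a periodic square lattice: the worm head performs moves that flip bonds with probability min(1, tanh(beta)^(±1)), and a configuration is completed when the head returns to the tail. All names are invented for illustration; this is the undirected direct-lattice variant, not the paper's dual-lattice algorithm, and no directing of the worm is attempted.

```python
import math
import random

def worm_update(L, beta, nsweeps=200, seed=1):
    """Run worm updates on an L x L periodic lattice; return the bond config.

    Bonds are keyed by ((x, y), d) with d = 0 for the +x bond and
    d = 1 for the +y bond leaving site (x, y); values are 0 or 1.
    """
    rng = random.Random(seed)
    t = math.tanh(beta)  # high-temperature expansion weight per bond
    bonds = {}

    def bond_key(x, y, d):
        # Map a direction (0:+x, 1:+y, 2:-x, 3:-y) to the canonical bond key.
        if d == 0: return ((x, y), 0)
        if d == 1: return ((x, y), 1)
        if d == 2: return (((x - 1) % L, y), 0)
        return ((x, (y - 1) % L), 1)

    def step(x, y, d):
        if d == 0: return ((x + 1) % L, y)
        if d == 1: return (x, (y + 1) % L)
        if d == 2: return ((x - 1) % L, y)
        return (x, (y - 1) % L)

    for _ in range(nsweeps * L * L):
        tail = (rng.randrange(L), rng.randrange(L))
        head = tail  # open a worm; head and tail coincide initially
        while True:
            d = rng.randrange(4)
            k = bond_key(head[0], head[1], d)
            occ = bonds.get(k, 0)
            # Metropolis ratio: adding a bond costs tanh(beta) < 1,
            # removing one is always accepted.
            p = t if occ == 0 else 1.0 / t
            if rng.random() < min(1.0, p):
                bonds[k] = 1 - occ
                head = step(head[0], head[1], d)
            if head == tail:
                break  # worm closed; every site again has even bond degree
    return bonds
```

Each closed worm flips bonds along a closed walk, so it changes every site's bond degree by an even amount; starting from the empty configuration, the sampled configurations therefore always satisfy the even-degree constraint of the high-temperature expansion.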
Totally Corrective Multiclass Boosting with Binary Weak Learners
In this work, we propose a new optimization framework for multiclass boosting.
In the literature, AdaBoost.MO and AdaBoost.ECC are two successful multiclass
boosting algorithms that can use binary weak learners.
We explicitly derive these two algorithms' Lagrange dual problems based on
their regularized loss functions. We show that the Lagrange dual formulations
enable us to design totally-corrective multiclass algorithms by using the
primal-dual optimization technique. Experiments on benchmark data sets suggest
that our multiclass boosting achieves generalization performance comparable to
the state of the art, while converging much faster than stage-wise
gradient-descent boosting. In other words, the new totally corrective
algorithms can maximize the margin more aggressively.
Comment: 11 pages
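To illustrate what "totally corrective" means in contrast to stage-wise boosting, here is a hypothetical minimal sketch for the binary case with decision stumps and the exponential loss: after each new weak learner is added, the weights of all weak learners selected so far are re-optimized jointly (here by plain gradient descent), rather than fixing past weights as AdaBoost does. All names are invented, and this simplified corrective step stands in for the paper's Lagrange primal-dual technique, which it does not reproduce.

```python
import math

def stump_predictions(X, feature, thresh):
    """Decision stump: +1 if the feature exceeds the threshold, else -1."""
    return [1 if x[feature] > thresh else -1 for x in X]

def totally_corrective_boost(X, y, rounds=5, lr=0.5, inner=100):
    n = len(X)
    stumps, w = [], []
    for _ in range(rounds):
        # Example weights from the current exponential loss.
        margins = [sum(wi * h[i] for wi, h in zip(w, stumps)) * y[i]
                   for i in range(n)]
        d = [math.exp(-m) for m in margins]
        # Weak-learner step: pick the stump with the largest weighted edge.
        best = None
        for f in range(len(X[0])):
            for t in sorted(set(x[f] for x in X)):
                h = stump_predictions(X, f, t)
                edge = abs(sum(di * y[i] * h[i] for i, di in enumerate(d)))
                if best is None or edge > best[0]:
                    best = (edge, h)
        stumps.append(best[1])
        w.append(0.0)
        # Totally corrective step: re-optimize ALL weights jointly.
        for _ in range(inner):
            margins = [sum(wi * h[i] for wi, h in zip(w, stumps)) * y[i]
                       for i in range(n)]
            d = [math.exp(-m) for m in margins]
            for j, h in enumerate(stumps):
                g = -sum(d[i] * y[i] * h[i] for i in range(n)) / n
                w[j] -= lr * g
    return stumps, w

def predict(stumps, w, x_index):
    """Predict for training example x_index (stumps cache predictions on X)."""
    s = sum(wi * h[x_index] for wi, h in zip(w, stumps))
    return 1 if s >= 0 else -1
```

Because every round re-fits the full weight vector against the current loss, the ensemble drives training margins up much faster per round than a stage-wise update that freezes past weights, which is the behavior the abstract attributes to the totally corrective algorithms.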
