On Euclidean Norm Approximations
Euclidean norm calculations arise frequently in scientific and engineering
applications. Several approximations for this norm with differing complexity
and accuracy have been proposed in the literature. Earlier approaches were
based on minimizing the maximum error. Recently, Seol and Cheun proposed an
approximation based on minimizing the average error. In this paper, we first
examine these approximations in detail, show that they fit into a single
mathematical formulation, and compare their average and maximum errors. We then
show that the maximum errors given by Seol and Cheun are significantly
optimistic.
Comment: 9 pages, 1 figure, Pattern Recognition
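To make the max-error-minimizing family concrete, here is the classic 2-D "alpha max plus beta min" Euclidean norm approximation. This is a generic illustrative sketch with the well-known minimax coefficient pair, not the specific approximations compared in the paper:

```python
import math

def alpha_max_beta_min(x, y, alpha=0.96043387, beta=0.39782473):
    """Approximate sqrt(x^2 + y^2) without a square root.

    The coefficients are the standard minimax choices for the
    alpha*max + beta*min form in 2-D (maximum relative error about 4%);
    they are illustrative, not taken from any paper discussed above.
    """
    ax, ay = abs(x), abs(y)
    return alpha * max(ax, ay) + beta * min(ax, ay)

# Compare against the exact norm on a few sample points.
for x, y in [(3.0, 4.0), (1.0, 1.0), (5.0, 0.0)]:
    exact = math.hypot(x, y)
    approx = alpha_max_beta_min(x, y)
    print(f"({x}, {y}): exact={exact:.4f} approx={approx:.4f} "
          f"rel_err={abs(approx - exact) / exact:.4%}")
```

Approximations of this type trade the square root for a comparison and two multiplications, which is why their maximum and average errors are the natural figures of merit compared in the paper.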
The geometry of optimal degree reduction of Bezier curves
Optimal degree reductions, i.e. best approximations of n-th degree Bezier curves
by Bezier curves of degree n - 1, with respect to different norms are studied. It
is shown that for any L_p-norm the euclidean degree reduction, where the norm is applied to the euclidean distance function of two curves, is identical to componentwise degree reduction. The Bezier points of the degree reductions are found to lie on parallel lines through the Bezier points of any Taylor expansion of degree n - 1 of the original curve. This geometric situation is shown to hold also in the case of constrained degree reduction. The Bezier points of the degree reduction are explicitly given in the unconstrained case for p = 1 and p = 2 and in the constrained case for p = 2.
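The L2 case (p = 2) can be illustrated numerically: the best degree-(n-1) approximation of a degree-n Bezier curve can be found by least squares on a dense parameter sample, and because least squares acts column by column it is automatically componentwise, matching the equivalence stated in the abstract. This is a numerical sketch, not the paper's closed-form construction; the function names are ours:

```python
import numpy as np
from math import comb

def bezier_eval(ctrl, ts):
    """Evaluate a Bezier curve with control points ctrl (shape (k+1, d))
    at the parameter values ts, via the Bernstein basis."""
    ctrl = np.asarray(ctrl, dtype=float)
    n = len(ctrl) - 1
    B = np.array([[comb(n, i) * t**i * (1 - t)**(n - i)
                   for i in range(n + 1)] for t in ts])
    return B @ ctrl

def degree_reduce_l2(ctrl, samples=200):
    """Numerically approximate the L2-optimal degree reduction (n -> n-1)
    by least squares on a dense sample of the curve."""
    n = len(ctrl) - 1
    ts = np.linspace(0.0, 1.0, samples)
    target = bezier_eval(ctrl, ts)
    # Bernstein basis of degree n-1 at the sample parameters.
    B = np.array([[comb(n - 1, i) * t**i * (1 - t)**(n - 1 - i)
                   for i in range(n)] for t in ts])
    reduced, *_ = np.linalg.lstsq(B, target, rcond=None)
    return reduced

# Reduce a quadratic (degree 2) to the best-fitting line (degree 1).
quad = [[0.0, 0.0], [1.0, 2.0], [2.0, 0.0]]
print(degree_reduce_l2(quad))
```

Note that `lstsq` solves each coordinate independently, so reducing the x- and y-components separately would give the same control points as the joint solve.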
Comments on "On Approximating Euclidean Metrics by Weighted t-Cost Distances in Arbitrary Dimension"
Mukherjee (Pattern Recognition Letters, vol. 32, pp. 824-831, 2011) recently
introduced a class of distance functions called weighted t-cost distances that
generalize m-neighbor, octagonal, and t-cost distances. He proved that weighted
t-cost distances form a family of metrics and derived an approximation for the
Euclidean norm in Z^n. In this note we compare this approximation to
two previously proposed Euclidean norm approximations and demonstrate that the
empirical average errors given by Mukherjee are significantly optimistic in
R^n. We also propose a simple normalization scheme that improves the
accuracy of his approximation substantially with respect to both average and
maximum relative errors.
Comment: 7 pages, 1 figure, 3 tables. arXiv admin note: substantial text
overlap with arXiv:1008.487
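A small sketch of the distance family under discussion, under the assumption (our reading, to be checked against Mukherjee's definitions) that the t-cost distance of a difference vector is the sum of its t largest absolute coordinates, and that the Euclidean approximation uses weights w_t = 1/sqrt(t):

```python
import math

def t_cost(u, t):
    """t-cost distance of the difference vector u: the sum of the t
    largest absolute coordinates (assumed definition, see lead-in)."""
    mags = sorted((abs(c) for c in u), reverse=True)
    return sum(mags[:t])

def weighted_t_cost(u, weights=None):
    """Weighted t-cost distance: max over t of w_t * t_cost(u, t).

    With w_t = 1/sqrt(t), each term is bounded by the Euclidean norm
    (Cauchy-Schwarz), so the maximum is a lower-bound approximation."""
    n = len(u)
    if weights is None:
        weights = [1.0 / math.sqrt(t) for t in range(1, n + 1)]
    return max(weights[t - 1] * t_cost(u, t) for t in range(1, n + 1))

# Compare with the exact norm on a sample vector.
u = (3.0, 4.0)
print(weighted_t_cost(u), math.hypot(*u))
```

With this weighting the approximation never exceeds the true norm, which is why a normalization (rescaling) step of the kind proposed in the note can reduce both the average and the maximum relative error.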
Projecting Ising Model Parameters for Fast Mixing
Inference in general Ising models is difficult, due to high treewidth making
tree-based algorithms intractable. Moreover, when interactions are strong,
Gibbs sampling may take exponential time to converge to the stationary
distribution. We present an algorithm to project Ising model parameters onto a
parameter set that is guaranteed to be fast mixing, under several divergences.
We find that Gibbs sampling using the projected parameters is more accurate
than with the original parameters when interaction strengths are strong and
when limited time is available for sampling.
Comment: Advances in Neural Information Processing Systems 2013
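For reference, the sampler whose mixing behavior the abstract discusses is the standard single-site Gibbs sampler for an Ising model; the sketch below shows that sampler generically (it is not the paper's projection algorithm, and the parameter names are ours):

```python
import math
import random

def gibbs_ising(J, h, n_sweeps=1000, seed=0):
    """Single-site Gibbs sampling for an Ising model with symmetric
    coupling matrix J (zero diagonal), fields h, and spins in {-1, +1}.

    Returns the final state and the per-sweep mean magnetization.
    """
    rng = random.Random(seed)
    n = len(h)
    x = [rng.choice([-1, 1]) for _ in range(n)]
    mags = []
    for _ in range(n_sweeps):
        for i in range(n):
            # The conditional of spin i given the rest is logistic
            # in its local field h_i + sum_j J_ij x_j.
            field = h[i] + sum(J[i][j] * x[j] for j in range(n) if j != i)
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
            x[i] = 1 if rng.random() < p_plus else -1
        mags.append(sum(x) / n)
    return x, mags
```

When the couplings J are strong, successive states are highly correlated and the chain can take exponentially long to mix, which is the failure mode the projection onto a fast-mixing parameter set is designed to avoid.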