Static/Dynamic Filtering for Mesh Geometry
The joint bilateral filter, which enables feature-preserving signal smoothing
according to the structural information from a guidance, has been applied to
various tasks in geometry processing. Existing methods either rely on a static
guidance that may be inconsistent with the input and lead to unsatisfactory
results, or on a dynamic guidance that is automatically updated but sensitive to
noise and outliers. Inspired by recent advances in image filtering, we propose
a new geometry filtering technique called static/dynamic filter, which utilizes
both static and dynamic guidances to achieve state-of-the-art results. The
proposed filter is based on a nonlinear optimization that enforces smoothness
of the signal while preserving variations that correspond to features of
certain scales. We develop an efficient iterative solver for the problem, which
unifies existing filters that are based on static or dynamic guidances. The
filter can be applied to mesh face normals, followed by a vertex position
update, to achieve scale-aware and feature-preserving filtering of mesh
geometry. It also works well for other types of signals defined on mesh
surfaces, such as texture colors. Extensive experimental results demonstrate
the effectiveness of the proposed filter for various geometry processing
applications, including mesh denoising, geometry feature enhancement, and
texture color filtering.
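The abstract does not spell out the solver, but the guided (joint bilateral) averaging of face normals that it builds on is easy to sketch. Below is a minimal, hypothetical Python sketch of one such iteration; the function name `joint_bilateral_normal_filter`, the parameters `sigma_s`/`sigma_r`, and the `neighbors` adjacency structure are illustrative assumptions, not the paper's actual optimization-based formulation.

```python
import numpy as np

def joint_bilateral_normal_filter(centroids, normals, guidance, neighbors,
                                  sigma_s=0.5, sigma_r=0.3, iters=5):
    """Sketch of guided normal filtering: each face normal is replaced by a
    weighted average of its neighbors' normals, with spatial weights from
    face-centroid distances and range weights from a guidance normal field.

    centroids : (F, 3) face centroid positions
    normals   : (F, 3) unit face normals (the noisy signal)
    guidance  : (F, 3) unit guidance normals
    neighbors : list of index arrays; neighbors[i] holds the faces adjacent
                to face i (including i itself for stability)
    """
    filtered = normals.copy()
    for _ in range(iters):
        updated = np.empty_like(filtered)
        for i, nbrs in enumerate(neighbors):
            d = centroids[nbrs] - centroids[i]
            # spatial kernel on centroid distance
            w_s = np.exp(-np.sum(d * d, axis=1) / (2 * sigma_s ** 2))
            # range kernel on guidance-normal similarity (the "joint" term)
            g = guidance[nbrs] - guidance[i]
            w_r = np.exp(-np.sum(g * g, axis=1) / (2 * sigma_r ** 2))
            w = w_s * w_r
            avg = (w[:, None] * filtered[nbrs]).sum(axis=0)
            updated[i] = avg / (np.linalg.norm(avg) + 1e-12)
        filtered = updated
        guidance = filtered  # dynamic update; comment out for static guidance
    return filtered
```

Keeping `guidance` fixed across iterations corresponds to a purely static scheme, while re-assigning it to the current estimate (as in the last line) gives a purely dynamic one; bridging these two regimes is exactly the distinction the proposed static/dynamic filter addresses.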
On the Effectiveness of Richardson Extrapolation in Machine Learning
Richardson extrapolation is a classical technique from numerical analysis that can improve the approximation error of an estimation method by linearly combining several estimates obtained from different values of one of its hyperparameters, without needing to know in detail the inner structure of the original estimation method. The main goal of this paper is to study when Richardson extrapolation can be used within machine learning, beyond the existing applications to step-size adaptations in stochastic gradient descent. We identify two situations where Richardson extrapolation can be useful: (1) when the hyperparameter is the number of iterations of an existing iterative optimization algorithm, with applications to averaged gradient descent and Frank-Wolfe algorithms (where we asymptotically obtain rates of $O(1/k^2)$ on polytopes, where $k$ is the number of iterations), and (2) when it is a regularization parameter, with applications to Nesterov smoothing techniques for minimizing non-smooth functions (where we asymptotically obtain rates close to $O(1/k^2)$ for non-smooth functions) and to ridge regression. In all these cases, we show that extrapolation techniques come with no significant loss in performance, but sometimes with strong gains, and we provide theoretical justifications based on asymptotic expansions for such gains, as well as empirical illustrations on classical problems from machine learning.
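As a concrete illustration of the mechanism (a toy example, not taken from the paper), the sketch below applies Richardson extrapolation to a central-difference derivative estimate, whose leading error term is $O(h^2)$; the helper name `richardson` and the step ratio `r` are hypothetical choices for this sketch.

```python
import math

def richardson(estimate, h, p=2, r=2.0):
    """Richardson extrapolation: if an estimator satisfies
    A(h) = A + c*h^p + o(h^p), then the linear combination
    (r^p * A(h/r) - A(h)) / (r^p - 1) cancels the leading error term."""
    a_h = estimate(h)
    a_hr = estimate(h / r)
    return (r ** p * a_hr - a_h) / (r ** p - 1)

# Estimate the derivative of exp at 0 (true value 1) by central differences,
# whose error expansion starts at h^2, so p = 2.
f = math.exp
d = lambda h: (f(h) - f(-h)) / (2 * h)

h = 0.1
print(abs(d(h) - 1.0))              # ~1.7e-3: the raw O(h^2) error
print(abs(richardson(d, h) - 1.0))  # ~2e-7: leading term cancelled, O(h^4)
```

The same one-line combination underlies the paper's uses: the hyperparameter $h$ plays the role of the inverse number of iterations or the regularization parameter, and the combination is formed from two runs of the base method at different hyperparameter values.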