On the Convergence of (Stochastic) Gradient Descent with Extrapolation for Non-Convex Optimization
Extrapolation is a well-known technique for solving convex optimization
problems and variational inequalities, and it has recently attracted attention
for non-convex optimization. Several recent works have empirically shown its
success on machine learning tasks. However, it has not been analyzed for
non-convex minimization, and a gap remains between theory and practice. In
this paper, we analyze gradient descent and stochastic gradient descent with
extrapolation for finding an approximate first-order stationary point of
smooth non-convex optimization problems. Our convergence upper bounds show
that the algorithms with extrapolation converge faster than their counterparts
without it.
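The extrapolated update analyzed here can be illustrated with an
extragradient-style step: evaluate the gradient at a look-ahead point, then
update from the current iterate. The following is a minimal sketch, not the
paper's exact algorithm; the step size, objective, and stopping rule are
illustrative assumptions.

```python
import numpy as np

def gd_with_extrapolation(grad, x0, lr=0.01, steps=500):
    """Gradient descent with an extrapolation (look-ahead) step.

    Each iteration first takes a tentative gradient step to an
    extrapolated point y, then updates x using the gradient at y.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        y = x - lr * grad(x)   # extrapolation: look-ahead gradient step
        x = x - lr * grad(y)   # update using the gradient at the extrapolated point
    return x

# Toy smooth non-convex objective f(x) = x^4 - 3x^2 (illustrative only);
# its non-zero stationary points are at x = ±sqrt(1.5).
f_grad = lambda x: 4 * x**3 - 6 * x
x_star = gd_with_extrapolation(f_grad, x0=np.array([2.0]))
```

The iterate settles at an approximate first-order stationary point, which is
exactly the guarantee the abstract is concerned with.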
Adaptive Tag Selection for Image Annotation
Not all tags are relevant to an image, and the number of relevant tags is
image-dependent. Although many methods have been proposed for image
auto-annotation, the question of how to determine the number of tags to be
selected per image remains open. The main challenge is that for a large tag
vocabulary, there is often a lack of ground truth data for acquiring optimal
cutoff thresholds per tag. In contrast to previous works that pre-specify the
number of tags to be selected, we propose in this paper adaptive tag selection.
The key insight is to divide the vocabulary into two disjoint subsets, namely a
seen set consisting of tags having ground truth available for optimizing their
thresholds and a novel set consisting of tags without any ground truth. Such a
division allows us to estimate how many tags shall be selected from the novel
set according to the tags that have been selected from the seen set. The
effectiveness of the proposed method is justified by our participation in the
ImageCLEF 2014 image annotation task. On a set of 2,065 test images with ground
truth available for 207 tags, the benchmark evaluation shows that, compared to
the popular top-k strategy, which obtains an F-score of 0.122, adaptive tag
selection achieves a higher F-score of 0.223. Moreover, by treating the
underlying image annotation system as a black box, the new method can be used
as an easy plug-in to boost the performance of existing systems.
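The seen/novel split described above can be sketched as follows. This is a
hypothetical simplification, not the paper's estimator: it selects seen tags by
their learned per-tag thresholds and, as a stand-in for the paper's estimation
step, takes as many top-scoring novel tags as were selected from the seen set.
The tag names and scores are invented for illustration.

```python
def adaptive_tag_select(scores, seen_thresholds, novel_tags):
    """Sketch of adaptive tag selection over a split vocabulary.

    scores          -- dict tag -> confidence from a black-box annotator
    seen_thresholds -- dict tag -> cutoff learned on ground-truth data
    novel_tags      -- tags with no ground truth available
    """
    # Seen set: keep tags whose score clears their optimized threshold.
    selected_seen = [t for t, th in seen_thresholds.items()
                     if scores.get(t, 0.0) >= th]
    # Novel set: assumed estimator -- select as many novel tags as
    # were just selected from the seen set, highest scores first.
    k = len(selected_seen)
    ranked_novel = sorted(novel_tags,
                          key=lambda t: scores.get(t, 0.0), reverse=True)
    return selected_seen + ranked_novel[:k]
```

Because the function only consumes scores, it treats the underlying annotation
system as a black box, matching the plug-in usage the abstract describes.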
Helicity hardens the gas
A screw generally works better than a nail, and a complicated rope knot better
than a simple one, at fastening solid matter, but a gas is harder to tame. A
flow, however, carries a physical quantity, helicity, which measures the
screwing strength of the velocity field and the degree of knottedness of the
vorticity ropes. It is shown that helicity favors the partition of energy into
the vortical modes over others, such as the dilatational and pressure modes of
turbulence; that is, helicity stiffens the flow, with nontrivial implications
for aerodynamics, including aeroacoustics, and for conducting fluids, among
others.