Distributionally Robust Learning with Weakly Convex Losses: Convergence Rates and Finite-Sample Guarantees
We consider a distributionally robust stochastic optimization problem and
formulate it as a stochastic two-level composition optimization problem with
the use of the mean--semideviation risk measure. In this setting, we consider a
single time-scale algorithm, involving two versions of the inner function value
tracking: linearized tracking of a continuously differentiable loss function,
and SPIDER tracking of a weakly convex loss function. We adopt the norm of the
gradient of the Moreau envelope as our measure of stationarity and show that
the same order of sample complexity is achievable in both cases, with only a
larger constant in the second case. Finally, we demonstrate the performance of
our algorithm on a robust learning example and on a weakly convex, non-smooth
regression example.
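Since the loss may be weakly convex and non-smooth, the usual gradient norm is unavailable as a stationarity measure; the Moreau envelope supplies a smooth surrogate. A standard formulation (with smoothing parameter $\lambda > 0$; notation ours, not necessarily the paper's) is:

```latex
% Moreau envelope of f with parameter \lambda:
f_\lambda(x) \;=\; \min_{y}\Big\{ f(y) \;+\; \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\|\nabla f_\lambda(x)\| \;=\; \tfrac{1}{\lambda}\,\big\| x - \operatorname{prox}_{\lambda f}(x) \big\|.
```

A small value of $\|\nabla f_\lambda(x)\|$ certifies that $x$ is close to a near-stationary point of $f$ itself, which is why this norm serves as the convergence criterion for weakly convex problems.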
Primary singularities of vector fields on surfaces
Unless otherwise stated, we work in the C∞ category and manifolds have empty boundary. Let X and Y be vector fields on a manifold M. We say that Y tracks X if [Y, X] = fX for some continuous function f: M → R. A subset K of the zero set Z(X) is an essential block for X if it is non-empty, compact, open in Z(X), and its Poincaré–Hopf index does not vanish. One says that X is non-flat at p if its ∞-jet at p is non-trivial. A point p of Z(X) is called a primary singularity of X if any vector field defined about p and tracking X vanishes at p. This is our main result: consider an essential block K of a vector field X defined on a surface M. Assume that X is non-flat at every point of K. Then K contains a primary singularity of X. As a consequence, if M is a compact surface with non-zero Euler characteristic and X is nowhere flat, then there exists a primary singularity of X.
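The final consequence rests on the classical Poincaré–Hopf theorem, which guarantees an essential block whenever the Euler characteristic is non-zero. For a compact surface M it can be stated as:

```latex
% Poincaré–Hopf: summing over the connected components (blocks) K
% of the zero set Z(X),
\sum_{K \subset Z(X)} \operatorname{ind}_K(X) \;=\; \chi(M).
% If \chi(M) \neq 0, at least one compact open block K satisfies
% ind_K(X) \neq 0, i.e. K is an essential block; the main theorem
% (with X nowhere flat) then produces a primary singularity in K.
```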
Deep Network Flow for Multi-Object Tracking
Data association problems are an important component of many computer vision
applications, with multi-object tracking being one of the most prominent
examples. A typical approach to data association involves finding a graph
matching or network flow that minimizes a sum of pairwise association costs,
which are often either hand-crafted or learned as linear functions of fixed
features. In this work, we demonstrate that it is possible to learn features
for network-flow-based data association via backpropagation, by expressing the
optimum of a smoothed network flow problem as a differentiable function of the
pairwise association costs. We apply this approach to multi-object tracking
with a network flow formulation. Our experiments demonstrate that we are able
to successfully learn all cost functions for the association problem in an
end-to-end fashion, which outperform hand-crafted costs in all settings. The
integration and combination of various sources of inputs becomes easy and the
cost functions can be learned entirely from data, alleviating tedious
hand-designing of costs.

Comment: Accepted to CVPR 201
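The key step the abstract describes is smoothing the network flow problem so that its optimum becomes differentiable in the pairwise association costs, allowing the cost functions to be learned by backpropagation. The sketch below is not the paper's formulation; it illustrates the same smoothing idea on a simpler bipartite association between detections in two frames, using entropy regularization (Sinkhorn scaling). The feature tensor `phi`, the weights `w`, and all function names here are illustrative assumptions, and costs are linear in the features as the abstract mentions.

```python
import numpy as np

def soft_assignment(cost, temperature=0.5, n_iters=100):
    """Entropy-smoothed relaxation of a min-cost bipartite matching.

    Returns a (near) doubly-stochastic matrix that concentrates on the
    optimal hard matching as temperature -> 0.  Unlike the hard argmin,
    this output is differentiable in `cost`, so gradients can flow back
    into the parameters that produced the costs.
    """
    K = np.exp(-cost / temperature)      # Gibbs kernel of the costs
    u = np.ones(cost.shape[0])
    v = np.ones(cost.shape[1])
    for _ in range(n_iters):             # alternating row/column scaling
        u = 1.0 / (K @ v)
        v = 1.0 / (K.T @ u)
    return (u[:, None] * K) * v[None, :]

# Toy setup: cost[i, j] = w . phi[i, j], a linear function of pairwise
# features (phi is random here purely for illustration).
rng = np.random.default_rng(0)
phi = rng.normal(size=(3, 3, 4))         # pairwise feature vectors
w = np.array([1.0, -0.5, 0.2, 0.3])      # learnable cost weights
cost = phi @ w

P = soft_assignment(cost)
print(np.round(P, 2))                    # soft association matrix
```

In a learning loop, a loss on `P` against ground-truth associations would be differentiated through the scaling iterations to update `w`, which is the end-to-end training pattern the abstract refers to; the paper itself differentiates through a smoothed network flow rather than this bipartite toy.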