A Family of Maximum Margin Criterion for Adaptive Learning
In recent years, pattern analysis has played an important role in data mining and
recognition, and many variants have been proposed to handle complicated
scenarios. High-dimensional data samples are familiar in the literature, and both
high dimensionality and large data volumes have become the norm in real-world
applications. In this work, an improved maximum margin criterion (MMC) method is
first introduced. Building on the new definition of MMC, several variants,
including random MMC, layered MMC, and 2D^2 MMC, are designed to make adaptive
learning applicable. In particular, an MMC network is developed to learn deep
features of images in the manner of simple deep networks. Experimental results on
a diverse collection of data sets demonstrate that the proposed MMC methods have
sufficient discriminant ability to be adopted in complicated application
scenarios.
Comment: 14 pages
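For readers unfamiliar with the criterion these variants build on, the sketch below shows the classical MMC projection, which maximizes tr(W^T (S_b - S_w) W) over orthonormal W. This is background on the standard criterion only, not the paper's improved definition; the function and variable names are illustrative.

```python
import numpy as np

def mmc_projection(X, y, n_components):
    """Classical maximum margin criterion: project onto the leading
    eigenvectors of S_b - S_w (between- minus within-class scatter)."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    S_b = np.zeros((d, d))
    S_w = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all)[:, None]
        S_b += len(Xc) * (diff @ diff.T)         # between-class scatter
        S_w += (Xc - mean_c).T @ (Xc - mean_c)   # within-class scatter
    # MMC maximizes tr(W^T (S_b - S_w) W) over orthonormal W, so W is
    # given by the eigenvectors of S_b - S_w with the largest eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)
    order = np.argsort(eigvals)[::-1]
    W = eigvecs[:, order[:n_components]]
    return X @ W, W

# Example: project toy 2-class data from 5 dimensions down to 2.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
Z, W = mmc_projection(X, y, 2)
```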
Geodesics on Flat Surfaces
This short survey illustrates the ideas of Teichmüller dynamics. As a model
application we consider the asymptotic topology of generic geodesics on a
"flat" surface and count closed geodesics and saddle connections. This survey
is based on joint papers with A. Eskin and H. Masur and with M. Kontsevich.
Comment: 25 pages, 5 figures. Based on the talk at ICM 2006 in Madrid; see
Proceedings of the ICM, Madrid, Spain, 2006, EMS, 121-146 for the final
version. For a more detailed survey see the paper "Flat Surfaces",
arXiv.math.DS/060939
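For context on the counting problem mentioned above, the Eskin-Masur results underlying this survey give quadratic asymptotics for saddle connections. The statement below is a standard formulation written from general knowledge, not quoted from the survey itself.

```latex
% Quadratic growth of saddle connections on a generic flat surface S:
% the number of saddle connections of length at most L satisfies
\[
  N_{\mathrm{sc}}(S, L) \sim c \cdot \pi L^{2}, \qquad L \to \infty,
\]
% where c is a Siegel-Veech constant depending only on the stratum
% of flat surfaces containing S.
```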
Convergence Analysis of the Fast Subspace Descent Methods for Convex Optimization Problems
The full approximation storage (FAS) scheme is a widely used multigrid method
for nonlinear problems. In this paper, a new framework to design and analyze
FAS-like schemes for convex optimization problems is developed. The new method,
the Fast Subspace Descent (FASD) scheme, generalizes classical FAS and can
be recast as an inexact version of nonlinear multigrid methods based on space
decomposition and subspace correction. The local problem in each subspace can
be simplified to be linear, and one gradient descent iteration (with an
appropriate step size) suffices to ensure global linear (geometric)
convergence of FASD.
Comment: 33 pages
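As a concrete illustration of the subspace-correction view described above, here is a minimal sketch in which each local problem is handled by a single gradient step rather than an exact solve. The two-level decomposition, step-size choice, and names are illustrative assumptions, not the paper's actual multilevel construction.

```python
import numpy as np

def fasd_like_sweep(grad_f, x, subspaces, alpha):
    """One sweep of successive subspace correction.

    Instead of solving each local problem exactly (as in classical FAS),
    take a single gradient descent step restricted to each subspace.

    grad_f    : callable returning the gradient of the convex objective
    subspaces : list of (d, k_i) matrices with orthonormal columns
    alpha     : gradient step size
    """
    for P in subspaces:
        g = P.T @ grad_f(x)        # gradient restricted to the subspace
        x = x - alpha * (P @ g)    # one inexact local correction
    return x

# Illustrative use on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
G = rng.standard_normal((8, 8))
A = G @ G.T + np.eye(8)            # symmetric positive definite
b = rng.standard_normal(8)
grad_f = lambda x: A @ x - b

# Toy two-level decomposition: a coarse subspace plus the full fine space.
P_coarse = np.eye(8)[:, ::2]
P_fine = np.eye(8)
alpha = 1.0 / np.linalg.eigvalsh(A).max()

x = np.zeros(8)
for _ in range(300):
    x = fasd_like_sweep(grad_f, x, [P_coarse, P_fine], alpha)
print(np.linalg.norm(A @ x - b))   # residual shrinks geometrically
```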