
    The geometry of nonlinear least squares with applications to sloppy models and optimization

    Parameter estimation by nonlinear least squares minimization is a common problem with an elegant geometric interpretation: the possible parameter values of a model induce a manifold in the space of data predictions. The minimization problem is then to find the point on the manifold closest to the data. We show that the model manifolds of a large class of models, known as sloppy models, have many universal features; they are characterized by a geometric series of widths, extrinsic curvatures, and parameter-effects curvatures. A number of common difficulties in optimizing least squares problems are due to this shared structure. First, algorithms tend to run into the boundaries of the model manifold, causing parameters to diverge or become unphysical. We introduce the model graph as an extension of the model manifold to remedy this problem. We argue that appropriate priors can remove the boundaries and improve convergence rates. We show that typical fits will have many evaporated parameters. Second, bare model parameters are usually ill-suited to describing model behavior; cost contours in parameter space tend to form hierarchies of plateaus and canyons. Geometrically, we understand this inconvenient parametrization as an extremely skewed coordinate basis and show that it induces a large parameter-effects curvature on the manifold. Using coordinates based on geodesic motion, these narrow canyons are in many cases transformed into a single quadratic, isotropic basin. We interpret the modified Gauss-Newton and Levenberg-Marquardt fitting algorithms as Euler approximations to geodesic motion in these natural coordinates on the model manifold and the model graph, respectively. By adding a geodesic acceleration adjustment to these algorithms, we alleviate the difficulties caused by parameter-effects curvature, improving both efficiency and success rates at finding good fits.
    Comment: 40 pages, 29 figures
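
    The geodesic-acceleration idea translates into a short NumPy sketch. This is a minimal illustration, not the authors' reference implementation: residuals and jacobian are user-supplied callables, identity damping and the 0.75 acceptance threshold on |a|/|v| are conventional choices, and the second directional derivative of the residuals along the velocity is estimated by a central finite difference.

    import numpy as np

    def lm_geodesic(residuals, jacobian, theta, lam=1e-3, h=0.1,
                    max_iter=200, tol=1e-10):
        """Levenberg-Marquardt with a geodesic-acceleration correction."""
        theta = np.asarray(theta, dtype=float)
        r = residuals(theta)
        cost = 0.5 * r @ r
        for _ in range(max_iter):
            J = jacobian(theta)
            g = J.T @ r
            if np.linalg.norm(g) < tol:
                break
            H = J.T @ J + lam * np.eye(theta.size)
            v = np.linalg.solve(H, -g)              # first-order LM "velocity"
            # second directional derivative of r along v, by finite differences
            Avv = (residuals(theta + h * v) - 2.0 * r
                   + residuals(theta - h * v)) / h**2
            a = np.linalg.solve(H, -J.T @ Avv)      # geodesic "acceleration"
            # trust the correction only while it stays small relative to v
            if np.linalg.norm(a) < 0.75 * np.linalg.norm(v):
                step = v + 0.5 * a
            else:
                step = v
            r_new = residuals(theta + step)
            cost_new = 0.5 * r_new @ r_new
            if cost_new < cost:                     # accept step, relax damping
                theta, r, cost, lam = theta + step, r_new, cost_new, lam / 3.0
            else:                                   # reject step, raise damping
                lam *= 3.0
        return theta

    The acceleration solve reuses the same damped matrix H as the velocity solve, so the correction costs only two extra residual evaluations and one extra back-substitution per step.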

    Classifying Network Data with Deep Kernel Machines

    Inspired by a growing interest in analyzing network data, we study the problem of node classification on graphs, focusing on approaches based on kernel machines. Conventionally, kernel machines are linear classifiers in the implicit feature space. We argue that linear classification in the feature space of kernels commonly used for graphs is often not enough to produce good results. When this is the case, one naturally considers nonlinear classifiers in the feature space. We show that repeating this process produces something we call "deep kernel machines." We provide some examples where deep kernel machines can make a big difference in classification performance, and point out connections to the recent literature on deep architectures in artificial intelligence and machine learning.
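
    The "nonlinear classifier on top of a kernel, repeated" construction can be made concrete. Below is a minimal sketch, not the paper's method: it relies only on the standard identity that squared feature-space distance is K_ii + K_jj - 2*K_ij, and the diffusion-kernel base, the names A, tr, te, and all parameter values in the usage comment are hypothetical.

    import numpy as np
    from sklearn.svm import SVC

    def deepen_kernel(K, sigma=1.0, layers=2):
        """Stack a Gaussian nonlinearity on top of a base kernel matrix K.

        Squared distance in the implicit feature space of K is
        d2(i, j) = K_ii + K_jj - 2*K_ij; a Gaussian of that distance is
        again a valid kernel, and repeating the construction "deepens" it.
        """
        for _ in range(layers):
            diag = np.diag(K)
            d2 = diag[:, None] + diag[None, :] - 2.0 * K
            K = np.exp(-d2 / (2.0 * sigma**2))
        return K

    # Hypothetical usage for node classification with a diffusion-kernel base:
    #   L = np.diag(A.sum(axis=1)) - A       # Laplacian of adjacency matrix A
    #   K = deepen_kernel(scipy.linalg.expm(-beta * L), sigma=0.5, layers=3)
    #   clf = SVC(kernel="precomputed").fit(K[np.ix_(tr, tr)], y[tr])
    #   y_pred = clf.predict(K[np.ix_(te, tr)])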

    Geometrical complexity of data approximators

    There are many methods for approximating a cloud of vectors embedded in a high-dimensional space by simpler objects, ranging from principal points and linear manifolds to self-organizing maps, neural gas, elastic maps, and various types of principal curves and principal trees. For each type of approximator, a corresponding measure of complexity has also been developed. Such measures are needed to balance accuracy against complexity and to define the optimal approximation of a given type. We propose a measure of complexity (geometrical complexity) which is applicable to approximators of several types and which allows comparing data approximations of different types.
    Comment: 10 pages, 3 figures, minor correction and extension
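
    As a toy illustration of trading accuracy against complexity for one approximator type (this is not the paper's actual measure), a polyline or principal-curve approximator can be scored by its bending energy:

    import numpy as np

    def polyline_complexity(nodes):
        """Bending energy of a polyline approximator: the sum of squared
        second differences of its nodes (0 for a straight segment,
        growing with the sharpness of the kinks)."""
        nodes = np.asarray(nodes, dtype=float)
        d2 = nodes[2:] - 2.0 * nodes[1:-1] + nodes[:-2]
        return float((d2 ** 2).sum())

    def approximation_error(data, nodes):
        """Mean squared distance from each data point to its nearest node."""
        dist2 = ((data[:, None, :] - nodes[None, :, :]) ** 2).sum(axis=-1)
        return float(dist2.min(axis=1).mean())

    # Candidate approximators can then be compared on the accuracy/complexity
    # plane: approximation_error(data, nodes) vs. polyline_complexity(nodes).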

    On the parametric dependences of a class of non-linear singular maps

    We discuss a two-parameter family of maps that generalize piecewise linear, expanding maps of the circle. One parameter measures the effect of a nonlinearity which bends the branches of the linear map; the second rotates points by a fixed angle. For small values of the nonlinearity parameter, we compute the invariant measure and show that it has a singular density to first order in the nonlinearity parameter. Its Fourier modes have forms similar to the Weierstrass function. We discuss the consequences of this singularity for the Lyapunov exponents and for the transport properties of the corresponding multibaker map. For larger nonlinearities, the map becomes non-hyperbolic and exhibits a series of period-adding bifurcations.
    Comment: 17 pages, 13 figures, to appear in Discrete and Continuous Dynamical Systems, Series B. Higher-resolution versions of Figures 5 downloadable at http://www.glue.umd.edu/~jrd
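
    A numerical experiment along these lines is easy to set up. The concrete map below is a hypothetical stand-in for the family described (the circle doubling map bent by a nonlinearity parameter eps and rotated by omega, expanding for eps < 1); iterating one long orbit yields the Lyapunov exponent and a histogram estimate of the invariant density:

    import numpy as np

    TWO_PI = 2.0 * np.pi

    def f(x, eps, omega):
        """Doubling map bent by a nonlinearity eps and rotated by omega."""
        return (2.0 * x + (eps / TWO_PI) * np.sin(TWO_PI * x) + omega) % 1.0

    def df(x, eps):
        return 2.0 + eps * np.cos(TWO_PI * x)   # |df| > 1 whenever eps < 1

    def lyapunov_and_density(eps, omega, n=200_000, burn=1_000, bins=200):
        """Estimate the Lyapunov exponent and the invariant density
        from a single long orbit."""
        x = 0.1234567
        for _ in range(burn):                   # discard the transient
            x = f(x, eps, omega)
        lyap, orbit = 0.0, np.empty(n)
        for i in range(n):
            lyap += np.log(abs(df(x, eps)))
            orbit[i] = x
            x = f(x, eps, omega)
        density, _ = np.histogram(orbit, bins=bins, range=(0.0, 1.0),
                                  density=True)
        return lyap / n, density

    Sweeping eps from 0 upward and inspecting the returned density shows the singular fine structure developing away from the uniform measure of the eps = 0 doubling map.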