
    The geometry of nonlinear least squares with applications to sloppy models and optimization

    Parameter estimation by nonlinear least squares minimization is a common problem with an elegant geometric interpretation: the possible parameter values of a model induce a manifold in the space of data predictions. The minimization problem is then to find the point on the manifold closest to the data. We show that the model manifolds of a large class of models, known as sloppy models, have many universal features; they are characterized by a geometric series of widths, extrinsic curvatures, and parameter-effects curvatures. A number of common difficulties in optimizing least squares problems are due to this shared structure. First, algorithms tend to run into the boundaries of the model manifold, causing parameters to diverge or become unphysical. We introduce the model graph as an extension of the model manifold to remedy this problem, and we argue that appropriate priors can remove the boundaries and improve convergence rates. We show that typical fits will have many evaporated parameters. Second, bare model parameters are usually ill-suited to describing model behavior; cost contours in parameter space tend to form hierarchies of plateaus and canyons. Geometrically, we understand this inconvenient parametrization as an extremely skewed coordinate basis and show that it induces a large parameter-effects curvature on the manifold. Using coordinates based on geodesic motion, these narrow canyons are in many cases transformed into a single quadratic, isotropic basin. We interpret the modified Gauss-Newton and Levenberg-Marquardt fitting algorithms as Euler approximations to geodesic motion in these natural coordinates on the model manifold and the model graph, respectively. By adding a geodesic acceleration adjustment to these algorithms, we alleviate the difficulties arising from parameter-effects curvature, improving both efficiency and success rates at finding good fits.
    Comment: 40 pages, 29 figures
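
    The geodesic-acceleration step lends itself to a compact numerical sketch. The Python fragment below shows one Levenberg-Marquardt update augmented with the second-order correction described in the abstract: the acceleration solves the same damped linear system as the usual step, with a right-hand side built from a finite-difference estimate of the second directional derivative of the residuals. This is a minimal sketch of the idea, not the authors' implementation; the step size `h`, the 0.75 acceptance threshold, and the (omitted) damping-update policy are illustrative assumptions.

```python
import numpy as np

def lm_geodesic_step(residuals, jac, theta, lam, h=0.1):
    """One Levenberg-Marquardt step with a geodesic-acceleration correction.

    residuals: callable theta -> r, the m-vector of residuals
    jac:       callable theta -> J, the m x n Jacobian dr/dtheta
    lam:       LM damping parameter (its update policy is omitted here)
    h:         finite-difference step for the directional second derivative
    """
    r = residuals(theta)
    J = jac(theta)
    A = J.T @ J + lam * np.eye(len(theta))

    # First-order "velocity" step: the standard damped Gauss-Newton update.
    v = np.linalg.solve(A, -J.T @ r)

    # Second directional derivative of the residuals along v, estimated from
    # one extra evaluation: r(theta + h v) ~ r + h J v + (h^2 / 2) r_vv.
    rvv = (2.0 / h) * ((residuals(theta + h * v) - r) / h - J @ v)

    # Geodesic "acceleration": same linear system, new right-hand side.
    a = np.linalg.solve(A, -J.T @ rvv)

    # Accept the correction only while it is small relative to the velocity;
    # otherwise fall back to the plain LM step (an illustrative heuristic).
    if np.linalg.norm(a) <= 0.75 * np.linalg.norm(v):
        return theta + v + 0.5 * a
    return theta + v
```

    The correction costs a single extra residual evaluation per step, which is why it can improve efficiency as well as robustness.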

    Growth of Sobolev norms for the quintic NLS on $\mathbb{T}^2$

    We study the quintic nonlinear Schrödinger equation on a two-dimensional torus and exhibit orbits whose Sobolev norms grow with time. The main point is to reduce the problem to a sufficiently simple toy model, similar in many ways to the one used in the case of the cubic NLS. This reduction requires an accurate combinatorial analysis.
    Comment: 41 pages, 5 figures. arXiv admin note: text overlap with arXiv:0808.1742 by other authors
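
    For orientation, the equation and the norm whose growth is exhibited can be written out explicitly; the sign and normalization conventions below are one common choice and may differ from the paper's.

```latex
% Quintic NLS on the two-torus and the Sobolev norm in question.
% Conventions (sign of the nonlinearity, normalization of the norm) vary.
\begin{align*}
  -i\,\partial_t u + \Delta u &= |u|^4 u,
  \qquad u(t,\cdot)\colon \mathbb{T}^2 \to \mathbb{C},\\[2pt]
  \|u(t)\|_{H^s(\mathbb{T}^2)}^2 &= \sum_{n \in \mathbb{Z}^2}
  \langle n \rangle^{2s}\,\bigl|\widehat{u}(n,t)\bigr|^2,
  \qquad \langle n \rangle = \bigl(1 + |n|^2\bigr)^{1/2}.
\end{align*}
% "Growth" means: for suitable s > 1 and arbitrarily large K, there are
% solutions with \|u(T)\|_{H^s} \ge K\,\|u(0)\|_{H^s} at some time T.
```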

    On Invariance and Selectivity in Representation Learning

    We discuss data representations that can be learned automatically from data, are invariant to transformations, and are at the same time selective, in the sense that two points have the same representation only if one is a transformation of the other. The mathematical results here sharpen some of the key claims of i-theory, a recent theory of feedforward processing in sensory cortex.
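
    A toy version of the invariant-yet-selective signature underlying i-theory can be sketched directly: pool the dot products of a signal with each template over a transformation group, and keep the empirical distribution of those products rather than a single summary statistic. The Python sketch below uses cyclic shifts as the group; the templates, thresholds, and group choice are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def signature(x, templates, shifts, thresholds):
    """Group-averaged signature in the spirit of i-theory.

    For each template t, compute the dot products <g.x, t> over a group
    of transformations g (here: cyclic shifts) and record the empirical
    CDF of those products at fixed thresholds. Pooling over the group
    yields invariance; keeping the whole distribution, not just one
    moment, is what preserves selectivity.
    """
    sig = []
    for t in templates:
        dots = np.array([np.dot(np.roll(x, s), t) for s in shifts])
        sig.extend([(dots <= b).mean() for b in thresholds])
    return np.array(sig)

rng = np.random.default_rng(0)
x = rng.standard_normal(16)
templates = rng.standard_normal((3, 16))
shifts = range(16)                       # the full cyclic group
thresholds = np.linspace(-3.0, 3.0, 5)

s1 = signature(x, templates, shifts, thresholds)
s2 = signature(np.roll(x, 5), templates, shifts, thresholds)
assert np.allclose(s1, s2)               # invariant to any cyclic shift
```

    Invariance holds because a shifted input traces out the same group orbit, so the pooled set of dot products is unchanged; selectivity is, roughly, the converse claim that with enough templates the distribution of projections pins down the orbit.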

    Asymptotic W-symmetries in three-dimensional higher-spin gauge theories

    We discuss how to systematically compute the asymptotic symmetry algebras of generic three-dimensional bosonic higher-spin gauge theories in backgrounds that are asymptotically AdS. We apply these techniques to a one-parameter family of higher-spin gauge theories that can be considered as large-N limits of SL(N) x SL(N) Chern-Simons theories, and we provide a closed formula for the structure constants of the resulting infinite-dimensional nonlinear W-algebras. Along the way we also obtain a closed formula for the structure constants of all classical W_N algebras. In both examples the higher-spin generators of the W-algebras are Virasoro primaries. Finally, we discuss how to relate our basis to a non-primary quadratic basis that was previously discussed in the literature.
    Comment: 61 pages
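
    For reference, the spin-2 ancestor of this computation is the Brown-Henneaux result: with asymptotically AdS boundary conditions, three-dimensional gravity (the SL(2) x SL(2) case) has an asymptotic symmetry algebra consisting of two copies of the Virasoro algebra. The W-algebras obtained here extend this by higher-spin generators, at the price of nonlinear terms in the brackets.

```latex
% Spin-2 prototype: each chiral half of the asymptotic symmetry algebra
% of AdS_3 gravity is a Virasoro algebra with the Brown-Henneaux
% central charge (l = AdS radius, G = Newton's constant).
\begin{equation*}
  [L_m, L_n] = (m - n)\,L_{m+n} + \frac{c}{12}\,m\,(m^2 - 1)\,\delta_{m+n,0},
  \qquad c = \frac{3\ell}{2G}.
\end{equation*}
% Higher-spin theories add generators W^{(s)}_m with s > 2, whose
% brackets close only nonlinearly; the paper's closed formula fixes
% the structure constants of these W-algebras.
```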