    The Sampling-and-Learning Framework: A Statistical View of Evolutionary Algorithms

    Evolutionary algorithms (EAs), a large class of general-purpose optimization algorithms inspired by natural phenomena, are widely used in industrial optimization tasks and often show excellent performance. This paper presents an attempt to reveal their general power from a statistical view. By summarizing a large range of EAs into the sampling-and-learning framework, we show that the framework directly admits a general analysis of the probable-absolute-approximate (PAA) query complexity. We particularly focus on the framework with the learning subroutine restricted to binary classification, which results in the sampling-and-classification (SAC) algorithms. With the help of learning theory, we obtain a general upper bound on the PAA query complexity of SAC algorithms. We further compare SAC algorithms with uniform search in different situations. Under the error-target independence condition, we show that SAC algorithms can achieve polynomial, but not super-polynomial, speedup over uniform search. Under the one-side-error condition, we show that super-polynomial speedup can be achieved. This work only touches the surface of the framework; its power under other conditions is still open.
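
    The SAC loop itself is simple to sketch. Below is a minimal, illustrative Python rendition, assuming a box-shaped search space and using an axis-aligned bounding box around the best samples as the binary classifier; the function and parameter names are placeholders for exposition, not the paper's notation or a specific EA.

        import numpy as np

        def sac_minimize(f, lower, upper, iters=50, batch=30, top_k=5,
                         explore=0.1, seed=0):
            rng = np.random.default_rng(seed)
            lower = np.asarray(lower, dtype=float)
            upper = np.asarray(upper, dtype=float)
            d = lower.size
            # Step 1: initial uniform sampling of the search box.
            X = rng.uniform(lower, upper, size=(batch, d))
            y = np.array([f(x) for x in X])
            for _ in range(iters):
                # Step 2 ("classification"): fit an axis-aligned box around
                # the top_k best samples; the box is the positive region of
                # a very simple binary classifier.
                top = X[np.argsort(y)[:top_k]]
                lo, hi = top.min(axis=0), top.max(axis=0)
                # Step 3: sample mostly from the positive region, keeping a
                # small uniform-exploration probability.
                inside = rng.uniform(lo, hi, size=(batch, d))
                uniform = rng.uniform(lower, upper, size=(batch, d))
                mask = rng.random((batch, 1)) < explore
                X_new = np.where(mask, uniform, inside)
                y_new = np.array([f(x) for x in X_new])
                # Keep the best `batch` samples from the pooled population.
                X = np.vstack([X, X_new])
                y = np.concatenate([y, y_new])
                keep = np.argsort(y)[:batch]
                X, y = X[keep], y[keep]
            i = int(np.argmin(y))
            return X[i], y[i]

        # Usage: minimize the sphere function on [-1, 1]^5.
        x_best, f_best = sac_minimize(lambda x: float(np.sum(x ** 2)),
                                      [-1.0] * 5, [1.0] * 5)

    The uniform-exploration mixture mirrors the role of uniform search in the paper's comparison: with probability `explore` a sample is drawn uniformly from the whole box rather than from the learned positive region.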

    Kinematic Basis of Emergent Energetics of Complex Dynamics

    A stochastic kinematic description of a complex dynamics is shown to dictate an energetic and thermodynamic structure. An energy function $\varphi(x)$ emerges as the limit of the generalized, nonequilibrium free energy of a Markovian dynamics with vanishing fluctuations. In terms of $\nabla\varphi$ and its orthogonal field $\gamma(x)\perp\nabla\varphi$, a general vector field $b(x)$ can be decomposed into $-D(x)\nabla\varphi+\gamma$, where $\nabla\cdot\big(\omega(x)\gamma(x)\big)=-\nabla\omega\cdot D(x)\nabla\varphi$. The matrix $D(x)$ and scalar $\omega(x)$, two characteristics additional to $b(x)$ itself, represent the local geometry and density of states intrinsic to the statistical motion in the state space at $x$. $\varphi(x)$ and $\omega(x)$ are interpreted as the emergent energy and degeneracy of the motion, with an energy balance equation $d\varphi(x(t))/dt=\gamma D^{-1}\gamma-bD^{-1}b$ reflecting the geometric identity $\|D\nabla\varphi\|^2+\|\gamma\|^2=\|b\|^2$. The partition function employed in statistical mechanics and J. W. Gibbs' method of ensemble change arise naturally; a fluctuation-dissipation theorem is established via the two leading-order asymptotics of the entropy production as $\epsilon\to 0$. The present theory provides a mathematical basis for P. W. Anderson's view of emergent behavior in the hierarchical structure of complexity science.
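
    The decomposition is easy to verify numerically in the simplest setting. The sketch below (a toy case chosen for illustration, not the paper's general construction) assumes $D=I$, constant $\omega$, $\varphi(x)=\|x\|^2/2$, and a purely rotational $\gamma$; it checks the orthogonality $\gamma\perp\nabla\varphi$, the norm identity, and the energy balance along the flow $\dot{x}=b(x)$.

        import numpy as np

        # Toy check of b = -D*grad(phi) + gamma with D = I (identity),
        # phi(x) = |x|^2 / 2, gamma(x) = c * (-x2, x1). These choices are
        # illustrative assumptions, not the paper's general construction.
        c = 0.7

        def grad_phi(x):
            return x                               # gradient of |x|^2 / 2

        def gamma(x):
            return c * np.array([-x[1], x[0]])     # rotation, orthogonal to grad_phi

        def b(x):
            return -grad_phi(x) + gamma(x)         # the decomposed vector field

        x = np.array([0.3, -1.2])
        # Orthogonality: gamma . grad_phi = 0.
        print(np.dot(gamma(x), grad_phi(x)))
        # Norm identity: |D grad_phi|^2 + |gamma|^2 = |b|^2 (D = I here).
        print(np.dot(grad_phi(x), grad_phi(x)) + np.dot(gamma(x), gamma(x)),
              np.dot(b(x), b(x)))
        # Energy balance along dx/dt = b: d(phi)/dt = grad_phi . b, which
        # should equal gamma.D^{-1}.gamma - b.D^{-1}.b.
        print(np.dot(grad_phi(x), b(x)),
              np.dot(gamma(x), gamma(x)) - np.dot(b(x), b(x)))

    With $D=I$ the $D^{-1}$-weighted norms reduce to Euclidean ones, so the three printed checks return zero, two equal norms, and two equal energy rates, respectively.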

    ZOOpt: Toolbox for Derivative-Free Optimization

    Recent advances in derivative-free optimization allow efficient approximation of the global optima of sophisticated functions, such as functions with many local optima and non-differentiable or non-continuous functions. This article describes the ZOOpt toolbox (https://github.com/eyounx/ZOOpt), which provides efficient derivative-free solvers and is designed to be easy to use. ZOOpt provides a Python package for single-thread optimization, as well as a lightweight distributed version, built with the help of the Julia language, for functions described in Python. The toolbox particularly focuses on optimization problems in machine learning, addressing high-dimensional, noisy, and large-scale problems, and is maintained as a ready-to-use tool for real-world machine learning tasks.
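
    A typical use of the toolbox follows the Dimension/Objective/Parameter/Opt pattern from its README. The sketch below minimizes a simple sphere function; it is based on the documented interface, and exact class signatures may vary across ZOOpt versions.

        from zoopt import Dimension, Objective, Parameter, Opt

        def sphere(solution):
            # Objective functions receive a Solution object and return a
            # scalar value to be minimized.
            x = solution.get_x()
            return sum(v ** 2 for v in x)

        dim_size = 10
        dim = Dimension(dim_size,              # number of dimensions
                        [[-1, 1]] * dim_size,  # search range per dimension
                        [True] * dim_size)     # True = continuous dimension
        objective = Objective(sphere, dim)
        solution = Opt.min(objective, Parameter(budget=100 * dim_size))
        print(solution.get_x(), solution.get_value())

    The `budget` parameter is the number of objective evaluations the solver may spend, which is the natural cost measure for the derivative-free setting.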