
    A Framework for Algorithm Stability

    We say that an algorithm is stable if small changes in the input result in small changes in the output. This kind of algorithm stability is particularly relevant when analyzing and visualizing time-varying data. Stability in general plays an important role in a wide variety of areas, such as numerical analysis, machine learning, and topology, but is poorly understood in the context of (combinatorial) algorithms. In this paper we present a framework for analyzing the stability of algorithms. We focus in particular on the tradeoff between the stability of an algorithm and the quality of the solution it computes. Our framework allows for three types of stability analysis with increasing degrees of complexity: event stability, topological stability, and Lipschitz stability. We demonstrate the use of our stability framework by applying it to kinetic Euclidean minimum spanning trees.
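
    The notion can be made concrete with a small experiment (a minimal sketch, not taken from the paper): perturb the input points of a Euclidean minimum spanning tree slightly and count how many tree edges change. The point set, perturbation size, and Kruskal-based MST routine below are all illustrative choices.

```python
# Minimal sketch (not from the paper): quantify how much a Euclidean MST
# changes when the input points move slightly -- the intuition behind
# "small changes in the input result in small changes in the output".
import numpy as np
from itertools import combinations

def emst_edges(points):
    """Kruskal's algorithm on the complete Euclidean graph; returns the tree as a set of edges."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted(combinations(range(n), 2),
                   key=lambda e: np.linalg.norm(points[e[0]] - points[e[1]]))
    tree = set()
    for i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.add((i, j))
            if len(tree) == n - 1:
                break
    return tree

rng = np.random.default_rng(0)
pts = rng.random((50, 2))                                # illustrative point set
moved = pts + 1e-3 * rng.standard_normal(pts.shape)      # small "kinetic" displacement

changed = emst_edges(pts) ^ emst_edges(moved)            # symmetric difference of edge sets
print("MST edges changed:", len(changed))                # few (often zero) for a small perturbation
```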

    Fast and Accurate Coarsening Simulation with an Unconditionally Stable Time Step

    We present Cahn-Hilliard and Allen-Cahn numerical integration algorithms that are unconditionally stable and so provide significantly faster accuracy-controlled simulation. Our stability analysis is based on Eyre's theorem and unconditional von Neumann stability analysis, both of which we present. Numerical tests confirm the accuracy of the von Neumann approach, which is straightforward and should be widely applicable in phase-field modeling. We show that accuracy can be controlled with an unbounded time step Delta-t that grows with time t as Delta-t ~ t^alpha. We develop a classification scheme for the step exponent alpha and demonstrate that a class of simple linear algorithms gives alpha = 1/3. For this class the speed-up relative to a fixed time step grows with the linear size of the system as N/log N, and we estimate conservatively that an 8192^2 lattice can be integrated 300 times faster than with the Euler method.
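
    A back-of-the-envelope sketch (illustration only, not the paper's integrator) of why a step that grows as Delta-t ~ t^(1/3) pays off: the number of steps needed to reach a final time T grows far more slowly than with a fixed step. The prefactor, initial step, and fixed step size below are made-up values.

```python
# Back-of-the-envelope sketch (illustration only, not the paper's integrator):
# count the steps needed to reach a final time T with an accuracy-controlled
# step Delta-t = A * t^(1/3) versus a fixed Euler-like step.
def steps_growing(T, A=0.1, alpha=1.0 / 3.0, dt0=1e-3):
    t, n = dt0, 1
    while t < T:
        t += max(A * t**alpha, dt0)   # unbounded step that grows as t^alpha
        n += 1
    return n

def steps_fixed(T, dt=1e-3):
    return int(T / dt)

for T in (1e2, 1e4, 1e6):
    print(f"T = {T:.0e}   growing step: {steps_growing(T):8d}   fixed step: {steps_fixed(T):12d}")
```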

    Polynomial Optimization with Applications to Stability Analysis and Control - Alternatives to Sum of Squares

    In this paper, we explore the merits of various algorithms for polynomial optimization problems, focusing on alternatives to sum of squares programming. While we refer to the advantages and disadvantages of Quantifier Elimination, Reformulation-Linearization Techniques, Blossoming and Groebner basis methods, our main focus is on algorithms defined by Polya's theorem, Bernstein's theorem and Handelman's theorem. We first formulate polynomial optimization problems as verifying the feasibility of semi-algebraic sets. Then, we discuss how Polya's algorithm, Bernstein's algorithm and Handelman's algorithm reduce the intractable problem of feasibility of semi-algebraic sets to linear and/or semi-definite programming. We apply these algorithms to different problems in robust stability analysis and stability of nonlinear dynamical systems. As one contribution of this paper, we apply Polya's algorithm to the problem of H_infinity control of systems with parametric uncertainty. Numerical examples are provided to compare the accuracy of these algorithms with other polynomial optimization algorithms in the literature.
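
    The flavour of these reductions can be seen in a tiny Polya-style positivity certificate (a sketch, not the paper's algorithm, which builds LPs/SDPs from such certificates): if a homogeneous polynomial p is strictly positive on the simplex, then (x + y)^d * p has only nonnegative coefficients for some finite degree d, and exhibiting such a d certifies positivity. The example polynomial and the use of sympy are illustrative choices.

```python
# Sketch of the Polya-style certificate underlying these reductions
# (illustration only): find a degree d such that (x + y)^d * p has only
# nonnegative coefficients, which certifies positivity of p on the simplex.
import sympy as sp

x, y = sp.symbols('x y')
p = x**2 - x*y + y**2            # positive on the simplex {x, y >= 0, x + y = 1}

def polya_degree(p, variables, d_max=20):
    s = sum(variables)
    for d in range(d_max + 1):
        coeffs = sp.Poly(sp.expand(s**d * p), *variables).coeffs()
        if all(c >= 0 for c in coeffs):
            return d
    return None                  # no certificate found up to degree d_max

print("certifying degree d =", polya_degree(p, [x, y]))   # d = 1: (x + y)*p = x^3 + y^3
```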

    Entropic lattice Boltzmann methods

    We present a general methodology for constructing lattice Boltzmann models of hydrodynamics with certain desired features of statistical physics and kinetic theory. We show how a technique from linear programming theory, known as Fourier-Motzkin elimination, provides an important tool for visualizing the state space of lattice Boltzmann algorithms that conserve a given set of moments of the distribution function. We show how such models can be endowed with a Lyapunov functional, analogous to Boltzmann's H, resulting in unconditional numerical stability. Using the Chapman-Enskog analysis and numerical simulation, we demonstrate that such entropically stabilized lattice Boltzmann algorithms, while fully explicit and perfectly conservative, may achieve remarkably low values for transport coefficients, such as viscosity. Indeed, the lowest such attainable values are limited only by considerations of accuracy, rather than stability. The method thus holds promise for high-Reynolds-number simulations of the Navier-Stokes equations.
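
    The role of the H functional can be illustrated with a generic sketch (not necessarily the construction used in this paper): evaluate H = sum_i f_i ln(f_i / w_i) along a BGK-type relaxation towards the local equilibrium on a standard D2Q9 lattice and observe that it decreases. The lattice, equilibrium, and perturbation below are conventional textbook choices, assumed for illustration.

```python
# Generic sketch of the role of the H functional (not necessarily this paper's
# construction): evaluate H = sum_i f_i ln(f_i / w_i) along a BGK-type
# relaxation towards the local equilibrium on a standard D2Q9 lattice.
import numpy as np

w = np.array([4/9] + [1/9]*4 + [1/36]*4)                 # D2Q9 weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])       # D2Q9 velocities

def H(f):
    """Boltzmann-like H functional; a Lyapunov candidate for the collision step."""
    return float(np.sum(f * np.log(f / w)))

def feq(rho, u):
    """Standard second-order D2Q9 equilibrium (assumed here for illustration)."""
    cu = c @ u
    return rho * w * (1.0 + 3.0*cu + 4.5*cu**2 - 1.5*(u @ u))

# A non-equilibrium state, then the equilibrium sharing its density and velocity.
rng = np.random.default_rng(1)
f = feq(1.0, np.array([0.05, 0.02])) * (1.0 + 0.1 * rng.standard_normal(9))
rho, u = f.sum(), (c.T @ f) / f.sum()
fe = feq(rho, u)

# H is convex and, at low Mach number, essentially minimized by the equilibrium
# among states with these moments, so it decreases along the path f -> feq.
for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"s = {s:4.2f}   H = {H(f + s * (fe - f)):.6f}")
```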

    Stability Analysis of Artificial Bee Colony Optimization Algorithm

    Theoretical analysis of swarm intelligence and evolutionary algorithms is a relatively underexplored area of research. Stability and convergence analysis of swarm intelligence and evolutionary algorithms can help researchers fine-tune parameter values. This paper presents the stability analysis of the well-known Artificial Bee Colony (ABC) optimization algorithm using the von Neumann stability criterion for a two-level finite difference scheme. Parameter selection for the ABC algorithm is recommended based on the obtained stability conditions. The findings are also validated through numerical experiments on test problems.
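
    For reference, the recursion whose dynamics such a stability analysis concerns is the standard ABC position update v_ij = x_ij + phi_ij (x_ij - x_kj), with phi drawn uniformly from [-1, 1]; phi is the kind of parameter the derived conditions constrain. The sketch below implements only the employed-bee phase on a toy objective, with illustrative settings.

```python
# Sketch of the standard ABC employed-bee recursion (employed-bee phase only,
# on a toy objective with illustrative settings); the scaling factor phi is
# drawn uniformly from [-1, 1].
import numpy as np

def abc_employed_phase(X, f, rng):
    """One employed-bee pass over the food sources X (one row per source)."""
    n, dim = X.shape
    for i in range(n):
        k = rng.choice([s for s in range(n) if s != i])   # random partner source
        j = rng.integers(dim)                             # random coordinate
        phi = rng.uniform(-1.0, 1.0)
        v = X[i].copy()
        v[j] = X[i, j] + phi * (X[i, j] - X[k, j])        # candidate solution
        if f(v) < f(X[i]):                                # greedy selection
            X[i] = v
    return X

rng = np.random.default_rng(0)
sphere = lambda z: float(np.sum(z**2))
X = rng.uniform(-5.0, 5.0, size=(10, 3))
for _ in range(200):
    X = abc_employed_phase(X, sphere, rng)
print("best objective value:", min(sphere(x) for x in X))
```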

    A Novel Family of Adaptive Filtering Algorithms Based on The Logarithmic Cost

    We introduce a novel family of adaptive filtering algorithms based on a relative logarithmic cost. The new family intrinsically combines the higher- and lower-order measures of the error into a single continuous update based on the error magnitude. We introduce important members of this family of algorithms such as the least mean logarithmic square (LMLS) and least logarithmic absolute difference (LLAD) algorithms that improve the convergence performance of the conventional algorithms. However, our approach and analysis are generic such that they cover other well-known cost functions as described in the paper. The LMLS algorithm achieves convergence performance comparable to the least mean fourth (LMF) algorithm and extends the stability bound on the step size. The LLAD and least mean square (LMS) algorithms demonstrate similar convergence performance in impulse-free noise environments, while the LLAD algorithm is robust against impulsive interference and outperforms the sign algorithm (SA). We analyze the transient, steady-state and tracking performance of the introduced algorithms and demonstrate the agreement between the theoretical analyses and simulation results. We show the extended stability bound of the LMLS algorithm and analyze the robustness of the LLAD algorithm against impulsive interference. Finally, we demonstrate the performance of our algorithms in different scenarios through numerical examples.
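
    A sketch of how such a logarithmic-cost update blends error orders (the scale parameter, constants, and exact cost functions here are illustrative; see the paper for the precise definitions): an LMLS-style update behaves like LMF (cubic in the error) for small errors and like LMS (linear) for large ones, while an LLAD-style update behaves like LMS for small errors and like the sign algorithm for large, impulsive ones.

```python
# Sketch with illustrative constants (the precise cost functions and update
# forms are defined in the paper): LMLS-style and LLAD-style updates in a
# system-identification setting with occasional impulsive interference.
import numpy as np

def lmls_step(w, x, d, mu=0.01, alpha=1.0):
    e = d - w @ x
    return w + mu * e * (alpha * e**2) / (1.0 + alpha * e**2) * x   # ~e^3 small, ~e large

def llad_step(w, x, d, mu=0.01, alpha=1.0):
    e = d - w @ x
    return w + mu * np.sign(e) * (alpha * abs(e)) / (1.0 + alpha * abs(e)) * x  # ~e small, ~sign(e) large

rng = np.random.default_rng(0)
w_true = rng.standard_normal(8)
w1, w2 = np.zeros(8), np.zeros(8)
for _ in range(20000):
    x = rng.standard_normal(8)
    noise = 0.1 * rng.standard_normal()
    if rng.random() < 0.01:                 # sparse, large impulses
        noise += 50.0 * rng.standard_normal()
    d = w_true @ x + noise
    w1, w2 = lmls_step(w1, x, d), llad_step(w2, x, d)

print("LMLS misalignment:", np.linalg.norm(w1 - w_true))
print("LLAD misalignment:", np.linalg.norm(w2 - w_true))
```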

    On computing minimal realizations of periodic descriptor systems

    We propose computationally efficient and numerically reliable algorithms to compute minimal realizations of periodic descriptor systems. The main computational tool employed for the structural analysis of periodic descriptor systems (i.e., reachability and observability) is the orthogonal reduction of periodic matrix pairs to Kronecker-like forms. Specializations of a general reduction algorithm are employed for particular types of systems. For one of the proposed minimal realization transformations, backward numerical stability can be proved.
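
    The paper treats periodic descriptor systems; as a far simpler point of reference (a sketch under that simplification, not the paper's algorithm), the snippet below removes unreachable and unobservable states from an ordinary discrete-time state-space model using orthonormal bases of the reachable subspace and, via duality, the observable one, preserving the Markov parameters.

```python
# Much simpler point of reference (not the paper's periodic descriptor
# algorithm): minimal realization of an ordinary discrete-time model (A, B, C)
# via orthogonal projection onto the reachable subspace and its dual.
import numpy as np

def reachable_basis(A, B, tol=1e-10):
    """Orthonormal basis of the reachable subspace, via SVD of the controllability matrix."""
    n = A.shape[0]
    blocks, M = [B], B
    for _ in range(n - 1):
        M = A @ M
        blocks.append(M)
    U, s, _ = np.linalg.svd(np.hstack(blocks), full_matrices=False)
    r = int(np.sum(s > tol * s[0]))
    return U[:, :r]

def remove_unreachable(A, B, C):
    V = reachable_basis(A, B)
    return V.T @ A @ V, V.T @ B, C @ V

def minimal_realization(A, B, C):
    A, B, C = remove_unreachable(A, B, C)             # drop unreachable states
    Ad, Bd, Cd = remove_unreachable(A.T, C.T, B.T)    # duality: drop unobservable states
    return Ad.T, Cd.T, Bd.T                           # transpose back from the dual

# A 4-state model whose last two states are neither reachable nor observable.
A = np.diag([0.5, -0.2, 0.9, 0.1])
B = np.array([[1.0], [1.0], [0.0], [0.0]])
C = np.array([[1.0, 2.0, 0.0, 0.0]])
Am, Bm, Cm = minimal_realization(A, B, C)
print("minimal order:", Am.shape[0])                  # 2
print(np.allclose([C @ np.linalg.matrix_power(A, k) @ B for k in range(6)],
                  [Cm @ np.linalg.matrix_power(Am, k) @ Bm for k in range(6)]))
```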