    Two novel classes of arbitrary high-order structure-preserving algorithms for canonical Hamiltonian systems

    In this paper, we systematically construct two classes of structure-preserving schemes with arbitrary order of accuracy for canonical Hamiltonian systems. The first class consists of symplectic schemes and contains two new families of parameterized symplectic schemes, derived from the generating function method and the symmetric composition method, respectively. Each member of these families is symplectic for any fixed parameter. A more general form of generating functions is introduced, which generalizes the three classical generating functions widely used to construct symplectic algorithms. The second class is a novel family of schemes that preserve the energy and quadratic invariants, devised by adjusting the parameter in the parameterized symplectic schemes to guarantee energy conservation at each time step. The existence of solutions of these schemes is verified. Numerical experiments corroborate the theoretical analysis and the conservation properties of the proposed schemes.
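    The symmetric composition method mentioned in the abstract can be illustrated with a standard textbook construction (not the paper's parameterized families, whose details are not reproduced here): composing Störmer–Verlet steps with the classical Yoshida coefficients yields a fourth-order symplectic scheme. A minimal sketch for the harmonic oscillator H(q, p) = (p² + q²)/2, where all function names and parameter values are illustrative assumptions:

```python
import numpy as np

def verlet_step(q, p, h):
    """One Stormer-Verlet (leapfrog) step for H = p**2/2 + q**2/2."""
    p = p - 0.5 * h * q   # half kick: dp/dt = -dH/dq = -q
    q = q + h * p         # drift:     dq/dt =  dH/dp =  p
    p = p - 0.5 * h * q   # half kick
    return q, p

def yoshida4_step(q, p, h):
    """Symmetric composition S(w1*h) o S(w0*h) o S(w1*h): order 4."""
    w1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    w0 = 1.0 - 2.0 * w1   # = -2**(1/3) / (2 - 2**(1/3))
    for w in (w1, w0, w1):
        q, p = verlet_step(q, p, w * h)
    return q, p

q, p, h = 1.0, 0.0, 0.1
H0 = 0.5 * (p**2 + q**2)
for _ in range(10_000):
    q, p = yoshida4_step(q, p, h)
# The energy error stays bounded over long times (no secular drift),
# the hallmark of a symplectic integrator.
print(abs(0.5 * (p**2 + q**2) - H0))
```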

    Inference and Model Parameter Learning for Image Labeling by Geometric Assignment

    Image labeling is a fundamental problem in low-level image analysis. In this work, we present novel approaches to maximum a posteriori (MAP) inference and to model parameter learning for image labeling. Both approaches are formulated in a smooth geometric setting whose respective solution space is a simple Riemannian manifold. Optimization consists of multiplicative updates that geometrically integrate the resulting Riemannian gradient flow. Our novel approach to MAP inference is based on discrete graphical models. By utilizing local Wasserstein distances to couple assignment measures across edges of the underlying graph, we smoothly approximate a given discrete objective function and restrict it to the assignment manifold. A corresponding update scheme combines geometric integration of the resulting gradient flow with rounding to integral solutions that represent valid labelings. This formulation constitutes an inner relaxation of the discrete labeling problem, i.e., throughout this process the local marginalization constraints known from the established linear programming relaxation are satisfied. Furthermore, we study the inverse problem of model parameter learning using the linear assignment flow and training data with ground truth. This is accomplished by a Riemannian gradient flow on the manifold of parameters that determine the regularization properties of the assignment flow. This smooth formulation enables us to tackle model parameter learning from the perspective of parameter estimation for dynamical systems. Using symplectic partitioned Runge–Kutta methods for numerical integration, we show that deriving the sensitivity conditions of the parameter learning problem commutes with its discretization. A favorable property of our approach is that learning is based on exact inference.
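    The multiplicative, simplex-respecting updates described above can be sketched generically: one explicit geometric-Euler step of a replicator-type flow on the assignment manifold (rows = pixels, columns = labels, each row in the open probability simplex). The similarity matrix F and step size below are stand-ins, not the authors' exact assignment-flow construction:

```python
import numpy as np

def geometric_step(W, F, h):
    """Multiplicative update: scale each row by exp(h*F), renormalize.

    This integrates a replicator-type flow while keeping every row of
    W strictly inside the probability simplex.
    """
    V = W * np.exp(h * F)                     # multiplicative update
    return V / V.sum(axis=1, keepdims=True)   # back to the simplex

rng = np.random.default_rng(0)
n_pixels, n_labels = 4, 3
W = np.full((n_pixels, n_labels), 1.0 / n_labels)  # barycenter init
F = rng.standard_normal((n_pixels, n_labels))      # stand-in similarity
for _ in range(50):
    W = geometric_step(W, F, h=0.1)
print(W.argmax(axis=1))  # rounding to an integral labeling
```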

    A class of symplectic partitioned Runge–Kutta methods

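    As background for this class of methods: a partitioned Runge–Kutta pair with coefficients (a_ij, b_i) for the q-variables and (â_ij, b̂_i) for the p-variables is symplectic for general Hamiltonians when b_i b̂_j = b_i â_ij + b̂_j a_ji and b_i = b̂_i. A minimal check of this condition for the standard Lobatto IIIA–IIIB pair, whose two-stage member is the Störmer–Verlet method (textbook coefficients, not drawn from this output's unavailable full text):

```python
import numpy as np

# Symplecticity condition for a partitioned Runge-Kutta pair:
#   b_i * bh_j = b_i * ah[i, j] + bh_j * a[j, i]   (with b = bh)

a  = np.array([[0.0, 0.0], [0.5, 0.5]])   # Lobatto IIIA (q-variables)
ah = np.array([[0.5, 0.0], [0.5, 0.0]])   # Lobatto IIIB (p-variables)
b  = np.array([0.5, 0.5])
bh = np.array([0.5, 0.5])

lhs = np.outer(b, bh)                        # b_i * bh_j
rhs = b[:, None] * ah + bh[None, :] * a.T    # b_i*ah[i,j] + bh_j*a[j,i]
print(np.allclose(lhs, rhs))  # True: the pair is symplectic
```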