
    Geometric Numerical Integration of the Assignment Flow

    The assignment flow is a smooth dynamical system that evolves on an elementary statistical manifold and performs contextual data labeling on a graph. We derive and introduce the linear assignment flow, which evolves nonlinearly on the manifold but is governed by a linear ODE on the tangent space. Various numerical schemes adapted to the mathematical structure of these two models are designed and studied for the geometric numerical integration of both flows: embedded Runge-Kutta-Munthe-Kaas schemes for the nonlinear flow, and adaptive Runge-Kutta schemes and exponential integrators for the linear flow. All algorithms are parameter-free, except for a tolerance value that controls adaptive step size selection by monitoring the local integration error, and the dimension of the Krylov subspace approximation. These algorithms provide a basis for applying the assignment flow to machine learning scenarios beyond supervised labeling, including unsupervised labeling and learning from controlled assignment flows.
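    Because the linear assignment flow is governed by a linear ODE on the tangent space, one exponential-integrator step can be computed from the action of a matrix exponential instead of by stepping a nonlinear system. The sketch below shows such a step for a generic linear ODE dv/dt = Av + b; the function name exp_integrator_step, the augmentation trick, and the generic A, b are illustrative assumptions, and SciPy's expm_multiply stands in for the paper's Krylov subspace approximation.

```python
import numpy as np
from scipy.sparse.linalg import expm_multiply

def exp_integrator_step(A, v0, b, h):
    """One exact step of the linear ODE dv/dt = A v + b over step size h.

    Uses the standard augmentation trick: the exponential of the block
    matrix [[h*A, h*b], [0, 0]] applied to [v0; 1] yields
    exp(h*A) v0 + h * phi_1(h*A) b, i.e. the exact solution v(h).
    expm_multiply evaluates this exponential action without forming
    exp(h*A) densely (a Krylov approximation would serve the same purpose).
    """
    n = v0.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = h * A
    M[:n, n] = h * b
    w = np.concatenate([v0, [1.0]])
    return expm_multiply(M, w)[:n]

# Toy usage with a random stable system (illustrative only).
rng = np.random.default_rng(0)
A = -np.eye(4) + 0.1 * rng.standard_normal((4, 4))
v = exp_integrator_step(A, np.zeros(4), rng.standard_normal(4), h=0.5)
print(v)
```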

    Inference and Model Parameter Learning for Image Labeling by Geometric Assignment

    Image labeling is a fundamental problem in the area of low-level image analysis. In this work, we present novel approaches to maximum a posteriori (MAP) inference and model parameter learning for image labeling, respectively. Both approaches are formulated in a smooth geometric setting whose respective solution space is a simple Riemannian manifold. Optimization consists of multiplicative updates that geometrically integrate the resulting Riemannian gradient flow. Our novel approach to MAP inference is based on discrete graphical models. By utilizing local Wasserstein distances for coupling assignment measures across edges of the underlying graph, we smoothly approximate a given discrete objective function and restrict it to the assignment manifold. A corresponding update scheme combines geometric integration of the resulting gradient flow with rounding to integral solutions that represent valid labelings. This formulation constitutes an inner relaxation of the discrete labeling problem, i.e. throughout this process the local marginalization constraints known from the established linear programming relaxation are satisfied. Furthermore, we study the inverse problem of model parameter learning using the linear assignment flow and training data with ground truth. This is accomplished by a Riemannian gradient flow on the manifold of parameters that determine the regularization properties of the assignment flow. This smooth formulation enables us to tackle the model parameter learning problem from the perspective of parameter estimation for dynamical systems. Using symplectic partitioned Runge-Kutta methods for numerical integration, we show that deriving the sensitivity conditions of the parameter learning problem commutes with its discretization. A favorable property of our approach is that learning is based on exact inference.
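    To make the notion of multiplicative updates concrete, the following sketch shows one geometric descent step for a Riemannian gradient flow on a single probability simplex with the Fisher-Rao geometry; the function name and the toy linear objective are illustrative assumptions, and the actual setting of the work couples many such simplices across graph edges via local Wasserstein distances.

```python
import numpy as np

def multiplicative_step(p, grad, h=0.5):
    """One multiplicative update of exponentiated-gradient type, which
    geometrically integrates the replicator (Fisher-Rao) gradient descent
    flow dp/dt = -R_p[grad f(p)] on the probability simplex: entries stay
    positive and are renormalized back onto the simplex."""
    q = p * np.exp(-h * grad)
    return q / q.sum()

# Toy usage: minimize a linear cost f(p) = <c, p> over the simplex
# (illustrative assumption; mass concentrates on the cheapest label).
c = np.array([0.3, 1.0, 2.0])
p = np.full(3, 1.0 / 3.0)
for _ in range(50):
    p = multiplicative_step(p, c)
print(np.round(p, 4))
```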

    Nonlocal Graph-PDEs and Riemannian Gradient Flows for Image Labeling

    In this thesis, we focus on the image labeling problem, i.e. the task of making unique pixel-wise label decisions that simplify an image while reducing its redundant information. We build upon a recently introduced geometric approach for data labeling by assignment flows [APSS17], which comprises a smooth dynamical system for data processing on weighted graphs. We pursue two lines of research that provide new application-oriented and theoretical insights into the underlying segmentation task. Using the example of Optical Coherence Tomography (OCT), the most widely used non-invasive acquisition method for large volumetric scans of human retinal tissue, we demonstrate how incorporating constraints on the geometry of the statistical manifold yields a novel, purely data-driven geometric approach for order-constrained segmentation of volumetric data in any metric space. In particular, diagnostic analysis of human eye diseases requires decisive information in the form of exact measurements of retinal layer thicknesses, which must be obtained for each patient separately; this is a demanding and time-consuming task. To ease clinical diagnosis, we introduce a fully automated segmentation algorithm with high segmentation accuracy and a high level of built-in parallelism. As opposed to many established retinal layer segmentation methods, we use only local information as input, without additional global shape priors. Instead, we enforce the physiological order of retinal cell layers and membranes through a new formulation of ordered pairs of distributions in a smoothed energy term. This systematically avoids bias pertaining to global shape and is hence suited for detecting anatomical changes of retinal tissue structure. To assess the performance of our approach, we compare two different choices of features on a data set of manually annotated 3D OCT volumes of healthy human retina, and evaluate our method against the state of the art in automatic retinal layer segmentation as well as against manually annotated ground truth data using different metrics. Furthermore, we generalize the recent work [SS21] on a variational perspective on assignment flows and introduce a novel nonlocal graph partial difference equation (G-PDE) for labeling metric data on graphs. The G-PDE is derived as a nonlocal reparametrization of the assignment flow approach introduced in J. Math. Imaging & Vision 58(2), 2017. Due to this parametrization, solving the G-PDE numerically is shown to be equivalent to computing the Riemannian gradient flow with respect to a nonconvex potential. We devise an entropy-regularized difference-of-convex-functions (DC) decomposition of this potential and show that the basic geometric Euler scheme for integrating the assignment flow is equivalent to solving the G-PDE by an established DC programming scheme. Moreover, the viewpoint of geometric integration reveals a basic way to exploit higher-order information of the vector field that drives the assignment flow, leading to a novel accelerated DC programming scheme. A detailed convergence analysis of both numerical schemes is provided and illustrated by numerical experiments.
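    The following sketch illustrates the kind of DC (difference-of-convex) iteration alluded to above for an entropy-regularized potential on a single probability simplex: the concave part is linearized at the current iterate, and the remaining entropy-regularized convex subproblem has a closed-form softmax solution. The split J = g - h, the symbols eps and grad_h_at_p, and the toy quadratic potential are illustrative assumptions, not the thesis' exact decomposition.

```python
import numpy as np

def dc_step(p, grad_h_at_p, eps=1.0):
    """One DC-programming step for a potential split as J(p) = g(p) - h(p),
    with convex part g(p) = eps * <p, log p> (negative entropy) on the simplex.
    Linearizing the concave part -h at p, the subproblem
        min_q  eps * <q, log q> - <grad_h_at_p, q>   s.t.  q in the simplex
    has the closed-form softmax solution returned below."""
    q = np.exp(grad_h_at_p / eps)
    return q / q.sum()

# Toy usage: h(p) = 0.5 * <p, W p> with a symmetric positive semidefinite
# similarity matrix W (illustrative assumption), so grad h(p) = W p.
W = np.array([[2.0, 0.5, 0.1],
              [0.5, 1.0, 0.2],
              [0.1, 0.2, 0.5]])
p = np.full(3, 1.0 / 3.0)
for _ in range(30):
    p = dc_step(p, W @ p, eps=0.1)
print(np.round(p, 4))
```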