    Joint Reconstruction of Multi-channel, Spectral CT Data via Constrained Total Nuclear Variation Minimization

    We explore the use of the recently proposed "total nuclear variation" (TNV) as a regularizer for reconstructing multi-channel, spectral CT images. This convex penalty is a natural extension of the total variation (TV) to vector-valued images and has the advantage of encouraging common edge locations and a shared gradient direction among image channels. We show how it can be incorporated into a general, data-constrained reconstruction framework and derive update equations based on the first-order, primal-dual algorithm of Chambolle and Pock. Early simulation studies based on the numerical XCAT phantom indicate that the inter-channel coupling introduced by the TNV leads to better preservation of image features at high levels of regularization, compared to independent, channel-by-channel TV reconstructions. Comment: Submitted to Physics in Medicine and Biology
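    As an illustrative sketch of the penalty itself (not the paper's constrained Chambolle-Pock reconstruction), the TNV of a discrete multi-channel image can be evaluated by taking, at each pixel, the nuclear norm of the channels-by-gradient Jacobian; the function names and the forward-difference discretization below are assumptions made for illustration:

```python
import numpy as np

def total_nuclear_variation(img):
    """TNV of a multi-channel image of shape (H, W, C): the sum over
    pixels of the nuclear norm (sum of singular values) of the per-pixel
    C x 2 Jacobian of channel gradients. Forward differences with a
    replicated boundary; an illustrative discretization, not the paper's."""
    gx = np.diff(img, axis=1, append=img[:, -1:, :])  # horizontal differences
    gy = np.diff(img, axis=0, append=img[-1:, :, :])  # vertical differences
    J = np.stack([gx, gy], axis=-1)                   # (H, W, C, 2) Jacobians
    s = np.linalg.svd(J, compute_uv=False)            # batched singular values
    return s.sum()

def channelwise_tv(img):
    """Independent isotropic TV summed over channels, for comparison."""
    gx = np.diff(img, axis=1, append=img[:, -1:, :])
    gy = np.diff(img, axis=0, append=img[-1:, :, :])
    return np.sqrt(gx**2 + gy**2).sum()
```

    When the channels share edge locations and gradient directions, the per-pixel Jacobian is (near) rank one, so its nuclear norm is smaller than the sum of per-channel gradient magnitudes; this is the mechanism by which the TNV favors common edges across channels.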

    Slope and Geometry in Variational Mathematics

    Structure permeates both theory and practice in modern optimization. To make progress, optimizers often presuppose a particular algebraic description of the problem at hand, namely whether the functional components are affine, polynomial, smooth, sparse, etc., together with a qualification (transversality) condition guaranteeing that the components do not interact wildly. This thesis also deals with structure, but in an intrinsic and geometric sense, independent of functional representation.

    On the one hand, we emphasize the slope - the fastest instantaneous rate of decrease of a function - as an elegant and powerful tool for studying nonsmooth phenomena. The slope yields a verifiable condition for the existence of exact error bounds - a Lipschitz-like dependence of a function's sublevel sets on its values. This relationship, in particular, is key for the convergence analysis of the method of alternating projections and for the existence theory of steepest descent curves (appropriately defined in the absence of differentiability). On the other hand, the slope and the derived concept of the subdifferential may be of limited use in general due to various pathologies: for example, the subdifferential graph may be large (full-dimensional in the ambient space), or the critical value set may be dense in the image space. Such pathologies, however, rarely appear in practice. Semi-algebraic functions - those whose graphs are composed of finitely many sets, each defined by finitely many polynomial inequalities - nicely represent concrete functions arising in optimization and are free of such pathologies. To illustrate, we show that semi-algebraic subdifferential graphs are, in a precise mathematical sense, small. Moreover, using the slope in tandem with semi-algebraic techniques, we significantly strengthen the convergence theory of the method of alternating projections and prove new regularity properties of steepest descent curves in the semi-algebraic setting. In particular, under reasonable conditions, bounded steepest descent curves of semi-algebraic functions have finite length and converge to local minimizers - properties that decisively fail in the absence of semi-algebraicity.

    We conclude the thesis with a fresh look at active sets in optimization from the perspective of representation independence. The underlying idea is simple: around a solution of an optimization problem, an "identifiable" subset of the feasible region is one containing all nearby solutions after small perturbations to the problem. A quest for only the most essential ingredients of sensitivity analysis leads us to consider identifiable sets that are "minimal". In the context of standard nonlinear programming, this concept reduces to the classical active-set philosophy. Identifiability, however, is much broader, being independent of the functional representation of the problem. This new notion lays a broad and intuitive variational-analytic foundation for optimality conditions, sensitivity analysis, and active-set methods. In the last chapter of the thesis, we illustrate the robustness of the concept in the context of eigenvalue optimization.
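    The method of alternating projections discussed above can be sketched in a simple convex setting in the plane; the thesis's results cover far more general (semi-algebraic, possibly nonconvex) situations, and the two sets chosen below are illustrative assumptions:

```python
import numpy as np

def project_disk(x, center=np.zeros(2), r=1.0):
    """Euclidean nearest-point projection onto a closed disk."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= r else center + r * d / n

def project_line(x, height=0.5):
    """Euclidean nearest-point projection onto the line x2 = height."""
    return np.array([x[0], height])

def alternating_projections(x0, iters=100):
    """Alternate the two projections; for closed convex sets with
    nonempty intersection the iterates converge to an intersection point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project_line(project_disk(x))
    return x
```

    Starting from a point outside both sets, the iterates land in the intersection of the unit disk and the line, illustrating the convergence behavior whose quantitative analysis (via the slope and error bounds) the thesis strengthens.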

    International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book

    The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions. This book comprises the full conference program: it presents the scientific program both in survey form and in full detail, together with information on the social program, the venue, special meetings, and more.