757 research outputs found

    Coupling techniques for nonlinear hyperbolic equations. IV. Multi-component coupling and multidimensional well-balanced schemes

    This series of papers is devoted to the formulation and the approximation of coupling problems for nonlinear hyperbolic equations. The coupling across an interface in the physical space is formulated in terms of an augmented system of partial differential equations. In an earlier work, this strategy allowed us to develop a regularization method based on a thick interface model in one space variable. In the present paper, we significantly extend this framework and, in addition, encompass equations in several space variables. This new formulation includes the coupling of several distinct conservation laws and allows for a possible overlap in space. Our main contributions are, on the one hand, the design and analysis of a well-balanced finite volume method on general triangulations and, on the other hand, a proof of convergence of this method toward entropy solutions, extending Coquel, Cockburn, and LeFloch's theory (restricted to a single conservation law without coupling). The core of our analysis is, first, the derivation of entropy inequalities as well as a discrete entropy dissipation estimate and, second, a proof of convergence toward the entropy solution of the coupling problem. Comment: 37 pages.
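
    A minimal one-dimensional sketch of the coupling idea described above, under illustrative assumptions: the two flux laws, the Lax-Friedrichs numerical flux, and all grid parameters below are chosen only for demonstration and are not the paper's well-balanced scheme on general triangulations.

```python
import numpy as np

# Illustrative 1D coupling sketch (not the paper's scheme): two scalar
# conservation laws, u_t + f_L(u)_x = 0 for x < 0 and u_t + f_R(u)_x = 0 for
# x > 0, coupled at the interface x = 0 inside a finite volume update with a
# Lax-Friedrichs numerical flux.  Flux laws, grid, and time step are
# assumptions made only for this demonstration.

def f_left(u):   # flux law assumed on the left of the interface (Burgers)
    return 0.5 * u**2

def f_right(u):  # flux law assumed on the right of the interface (advection)
    return 1.0 * u

def lax_friedrichs(flux, ul, ur, dx, dt):
    # Classical two-point Lax-Friedrichs numerical flux.
    return 0.5 * (flux(ul) + flux(ur)) - 0.5 * (dx / dt) * (ur - ul)

N, dx, dt, T = 400, 0.01, 0.004, 0.4
x = (np.arange(N) + 0.5) * dx - N * dx / 2   # cell centers
u = np.where(x < 0.0, 1.0, 0.0)              # Riemann-type initial data

t = 0.0
while t < T:
    unew = u.copy()
    for i in range(1, N - 1):
        # Choose the flux law according to the side of the interface on
        # which each cell boundary lies: this is the (naive) coupling.
        flux_l = f_left if x[i] - 0.5 * dx < 0.0 else f_right
        flux_r = f_left if x[i] + 0.5 * dx < 0.0 else f_right
        F_l = lax_friedrichs(flux_l, u[i - 1], u[i], dx, dt)
        F_r = lax_friedrichs(flux_r, u[i], u[i + 1], dx, dt)
        unew[i] = u[i] - dt / dx * (F_r - F_l)
    u = unew
    t += dt

print("u range after coupled run:", u.min(), u.max())
```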

    Characterization of transport optimizers via graphs and applications to Stackelberg-Cournot-Nash equilibria

    We introduce graphs associated to transport problems between discrete marginals, which allow us to characterize the set of all optimizers given one primal optimizer. In particular, we establish that connectivity of those graphs is a necessary and sufficient condition for uniqueness of the dual optimizers. Moreover, we provide an algorithm that can efficiently compute the dual optimizer that is the limit, as the regularization parameter goes to zero, of the dual entropic optimizers. Our results find an application in a Stackelberg-Cournot-Nash game, for which we obtain existence and characterization of the equilibria.
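
    A rough illustration of the graph construction mentioned above (the marginals, cost matrix, and LP solver are assumptions for this sketch, not the paper's algorithm): build the bipartite support graph of one primal optimal plan and test its connectivity, the property the abstract identifies as the criterion for uniqueness of the dual optimizers.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

# Illustrative sketch: given a primal optimal transport plan between discrete
# marginals, build the bipartite graph whose vertices are the source and
# target atoms and whose edges are the entries of the plan carrying positive
# mass, then check whether it is connected.

rng = np.random.default_rng(0)
n, m = 4, 5
mu = np.full(n, 1.0 / n)            # source marginal (uniform, assumed)
nu = np.full(m, 1.0 / m)            # target marginal (uniform, assumed)
C = rng.random((n, m))              # cost matrix (random, assumed)

# Kantorovich LP: minimize <C, P> subject to P 1 = mu, P^T 1 = nu, P >= 0.
A_eq = np.zeros((n + m, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0          # row sums equal mu
for j in range(m):
    A_eq[n + j, j::m] = 1.0                   # column sums equal nu
res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([mu, nu]),
              bounds=(0, None), method="highs")
P = res.x.reshape(n, m)

# Bipartite support graph: source i -- target j whenever P[i, j] > 0.
tol = 1e-10
adj = np.zeros((n + m, n + m))
adj[:n, n:] = P > tol
adj = adj + adj.T
n_comp, _ = connected_components(csr_matrix(adj), directed=False)

# A single connected component corresponds to dual potentials that are unique
# up to the usual additive constant normalization.
print("support graph components:", n_comp)
print("dual optimizer unique (up to a constant):", n_comp == 1)
```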

    Low Complexity Regularization of Linear Inverse Problems

    Inverse problems and regularization theory are a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This has proved efficient in many problems routinely encountered in imaging sciences, statistics and machine learning. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low-complexity. These priors encompass as popular examples sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low-rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models that can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward the understanding of the theoretical properties of the so-regularized solutions. It covers a large spectrum including: (i) recovery guarantees and stability to noise, both in terms of $\ell^2$-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solve the corresponding large-scale regularized optimization problem.
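
    A minimal sketch of the forward-backward proximal splitting scheme mentioned in item (iii), applied here to the $\ell^1$ (sparsity) prior: the forward step is a gradient step on the smooth data-fidelity term and the backward step is the proximal operator (soft-thresholding). Problem sizes, noise level, and regularization weight are assumptions made for this illustration.

```python
import numpy as np

# Forward-backward (ISTA-type) splitting for the sparse recovery problem
#     min_x  0.5 * ||A x - y||^2 + lam * ||x||_1
# The smooth term is handled by a gradient step, the nonsmooth low-complexity
# prior by its proximal operator.  All problem parameters are illustrative.

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

rng = np.random.default_rng(1)
n, p, k = 60, 200, 5
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.05
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
step = 1.0 / L

x = np.zeros(p)
for _ in range(500):
    grad = A.T @ (A @ x - y)                            # forward (gradient) step
    x = soft_threshold(x - step * grad, step * lam)     # backward (proximal) step

# Model (support) identification: the support of x typically matches x_true.
print("recovered support:", np.flatnonzero(np.abs(x) > 1e-3))
print("true support:     ", np.flatnonzero(x_true))
```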