112 research outputs found

    A Fully Equivalent Global Pressure Formulation for Three-Phase Compressible Flow

    We introduce a new global pressure formulation for immiscible three-phase compressible flows in porous media which is fully equivalent to the original equations, unlike the one introduced in \cite{CJ86}. In this formulation, the total volumetric flow of the three fluids and the global pressure follow a classical Darcy law, which simplifies the resolution of the pressure equation. However, this global pressure formulation exists only for Total Differential (TD) three-phase data, which depend only on two functions of the saturations and the global pressure: the global capillary pressure and the global mobility. Hence we introduce a class of interpolations which construct such TD three-phase data from any set of three two-phase data (one for each pair of fluids) that satisfy a TD-compatibility condition.
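    In symbols (assumed notation for illustration, not taken from the paper), the classical Darcy law satisfied by the total volumetric flow q and the global pressure P would read:

```latex
% Assumed notation: q = total volumetric flow, P = global pressure,
% K = absolute permeability tensor, g = gravity,
% d = global mobility, \rho = an averaged density,
% S_1, S_2 = two independent saturations.
\[
  q \;=\; -\, d(S_1, S_2, P)\, K \bigl( \nabla P - \rho(S_1, S_2, P)\, g \bigr).
\]
```

    The global mobility d above, together with the global capillary pressure, are the two functions of saturations and global pressure that the TD condition constrains.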

    The output least squares identifiability of the diffusion coefficient from an H^1-observation in a 2-D elliptic equation

    Output least squares stability for the diffusion coefficient in an elliptic equation in dimension two is analyzed. This guarantees Lipschitz stability of the solution of the least squares formulation with respect to perturbations in the data, independently of their attainability. The analysis shows the influence of the flow direction on the parameter to be estimated. A scale analysis for multi-scale resolution of the unknown parameter is provided.
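    In the standard notation for such problems (assumed here, not taken from the paper), the output least squares formulation and the Lipschitz stability it enjoys can be written as:

```latex
% Assumed notation: a = diffusion coefficient, u(a) = solution of the
% elliptic equation for coefficient a, z = the H^1 observation.
\[
  \hat a(z) \;=\; \operatorname*{arg\,min}_{a \in \mathcal A}\;
  \tfrac12\,\| u(a) - z \|_{H^1(\Omega)}^2,
  \qquad
  \| \hat a(z_1) - \hat a(z_2) \| \;\le\; C\,\| z_1 - z_2 \|_{H^1(\Omega)}.
\]
```

    The second estimate is what "stability independently of attainability" refers to: z need not lie in the range of a ↦ u(a).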

    Image Segmentation with Multidimensional Refinement Indicators

    We transpose an optimal control technique to the image segmentation problem. The idea is to consider image segmentation as a parameter estimation problem, where the parameter to estimate is the color of the pixels of the image. We use the adaptive parameterization technique, which iteratively builds an optimal representation of the parameter as uniform regions that form a partition of the domain, hence corresponding to a segmentation of the image. We minimize an error function during the iterations, and the partition of the image into regions is optimally driven by the gradient of this error. The resulting segmentation algorithm inherits desirable properties from its optimal control origin: soundness, robustness, and flexibility.
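    As a toy illustration of this error-driven refinement into uniform regions (not the paper's refinement-indicator algorithm; the greedy bisection rule and all names below are assumptions), one could write:

```python
import numpy as np

def segment(image, n_iters=8):
    """Greedy region-splitting sketch: each region carries one color
    (its mean); at every iteration the region whose bisection most
    reduces the squared error is divided in two."""
    h, w = image.shape
    regions = [(0, 0, h, w)]  # (top, left, height, width) boxes

    def sse(top, left, hh, ww):
        block = image[top:top + hh, left:left + ww]
        return ((block - block.mean()) ** 2).sum()

    for _ in range(n_iters):
        best = None
        for i, (t, l, hh, ww) in enumerate(regions):
            if hh < 2 and ww < 2:
                continue  # 1x1 regions cannot be split
            cands = []
            if hh >= 2:  # horizontal bisection
                m = hh // 2
                cands.append(((t, l, m, ww), (t + m, l, hh - m, ww)))
            if ww >= 2:  # vertical bisection
                m = ww // 2
                cands.append(((t, l, hh, m), (t, l + m, hh, ww - m)))
            for a, b in cands:
                gain = sse(t, l, hh, ww) - sse(*a) - sse(*b)
                if best is None or gain > best[0]:
                    best = (gain, i, a, b)
        if best is None:
            break
        _, i, a, b = best
        regions[i:i + 1] = [a, b]  # replace region by its two halves

    # paint each region with its mean color
    out = np.empty_like(image, dtype=float)
    for t, l, hh, ww in regions:
        out[t:t + hh, l:l + ww] = image[t:t + hh, l:l + ww].mean()
    return out, regions
```

    On a piecewise-constant image the first split already recovers the partition; the paper's refinement indicators play the role of the greedy gain above, but are derived from the gradient of the error functional.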

    A block approach to group-sparse PCA: the sparsePCA package

    International audience.

    Generalized sentinels defined via least squares

    We address the problem of monitoring a linear functional (c, x)_E on a Hilbert space E, the available data being the observation z, in a Hilbert space F, of a vector Ax depending linearly on x through some known operator A ∈ L(E, F). When E = E1 × E2, c = (c1, 0), and A is injective and defined through the solution of a partial differential equation, J.-L. Lions (1988, 1990) introduced sentinels s ∈ F such that (s, Ax)_F is sensitive to x1 ∈ E1 but insensitive to x2 ∈ E2. In this paper, we prove the existence, in the general case, of a generalized sentinel (s, η) ∈ F̃ × E, where F̃ ⊃ F with F dense in F̃, such that for any a priori guess x0 of x one has (s, z) + (η, x0)_E = (c, x̂)_E, where x̂ is the least squares estimate of x closest to x0, together with a family of regularized sentinels (s_n, η_n) ∈ F × E which converge to (s, η). Generalized sentinels unify the least squares approach (by construction!) and the sentinel approach (when A is injective), and provide a general framework for the construction of "sentinels with special sensitivity" in the sense of J.-L. Lions (1990).

    A maximum curvature step and geodesic displacement for nonlinear least squares descent algorithms

    We address in this paper the choice of both the step and the curve of the parameter space to be used in the line search part of descent algorithms for the minimization of least squares objective functions. Our analysis is based on the curvature of the path of the data space followed during the line search. We first define a new, easy-to-compute maximum curvature step, which gives a guaranteed value of the residual at the next iterate and satisfies a linear decrease condition. Then we optimize (i.e. minimize!) the guaranteed residual by performing the line search along a curve such that the corresponding path in the data space is a geodesic of the output set. An inexpensive implementation using a second order approximation to the geodesic is proposed. Preliminary numerical comparisons of the proposed algorithm with two versions of the Gauss-Newton algorithm show that it works properly over a wide range of nonlinearity, and tends to outperform its competitors in strongly nonlinear situations.
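    As background for the comparison mentioned above, a minimal damped Gauss-Newton method with an Armijo backtracking line search (an illustration of the competitor algorithms, not of the maximum curvature step itself; all names are assumptions) might look like:

```python
import numpy as np

def gauss_newton(F, J, x0, tol=1e-10, max_iter=100):
    """Damped Gauss-Newton for min 0.5 * ||F(x)||^2: at each iterate,
    solve the linearized least-squares problem for the direction, then
    backtrack along it until an Armijo-type decrease condition holds."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        A = J(x)
        d, *_ = np.linalg.lstsq(A, -r, rcond=None)  # GN direction
        phi0 = 0.5 * r @ r
        slope = r @ (A @ d)  # derivative of 0.5||F(x+td)||^2 at t=0
        t = 1.0
        while t > 1e-12:
            r_new = F(x + t * d)
            if 0.5 * r_new @ r_new <= phi0 + 1e-4 * t * slope:
                break  # sufficient decrease reached
            t *= 0.5
        x = x + t * d
        if np.linalg.norm(t * d) < tol:
            break
    return x
```

    The maximum curvature step of the abstract would replace the halving loop by a step derived from the curvature of the data-space path t ↦ F(x + t d), and the geodesic variant would bend the search curve itself.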

    A remark on the extension by symmetry of variational formulations and its application to vectorization

    Abstract available in the attached files.

    Quasiconvex sets and the size × curvature condition. Application to nonlinear inversion

    Abstract available in the attached files.

    On the uniqueness of local minima for general abstract nonlinear least squares problems

    Abstract available in the attached files.

    Least-Squares, Sentinels and Subtractive Optimally Localized Average

    We present, with unified notations, three approaches to linear parameter estimation: least-squares, sentinels, and Subtractive Optimally Localized Average (SOLA). It then becomes obvious that the last two approaches correspond to the very same mathematical problem. This brings a new interpretation to sentinels, new computational tools to SOLA, and makes clear their link to the classical least-squares approach.
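    The coincidence of the approaches can be checked numerically in a finite-dimensional toy setting (all names below are assumptions): evaluating the functional on the least-squares estimate of x gives the same number as applying the minimum-norm sentinel s, with Aᵀs = c, directly to the data.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 30, 5
A = rng.standard_normal((m, n))   # injective forward operator
c = rng.standard_normal(n)        # functional to monitor: (c, x)
x = rng.standard_normal(n)
z = A @ x                         # noise-free observation

# Least-squares route: estimate x, then evaluate the functional.
x_hat, *_ = np.linalg.lstsq(A, z, rcond=None)
v_ls = c @ x_hat

# Sentinel/SOLA route: minimum-norm s with A^T s = c, then read the
# functional directly off the data as (s, z).
s = np.linalg.pinv(A.T) @ c
v_sent = s @ z

print(abs(v_ls - v_sent))  # the two routes agree
```

    Algebraically, s = A(AᵀA)⁻¹c, so (s, z) = cᵀ(AᵀA)⁻¹Aᵀz, which is exactly the functional applied to the least-squares estimate.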