115 research outputs found
From explained variance of correlated components to PCA without orthogonality constraints
Block Principal Component Analysis (Block PCA) of a data matrix A, where the
loadings Z are determined by maximizing $\|AZ\|^2$ over unit-norm orthogonal
loadings, is difficult to use for the design of sparse PCA by $\ell_1$
regularization, because of the difficulty of handling simultaneously the
orthogonality constraint on the loadings and the non-differentiable $\ell_1$
penalty. Our objective in this paper is to relax the orthogonality constraint
on the loadings by introducing new objective functions expvar(Y) which measure
the part of the variance of the data matrix A explained by correlated
components Y = AZ. We first propose a comprehensive study of the mathematical
and numerical properties of expvar(Y) for two existing definitions, Zou et al.
[2006] and Shen and Huang [2008], and four new ones. We then show that only two
of these explained variances are fit for use as objective functions in block
PCA formulations of A rid of the orthogonality constraints.
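For concreteness, here is a minimal NumPy sketch of one of the existing definitions mentioned above, the QR-based adjusted variance of Zou et al. [2006] for correlated components Y = AZ. The function name, the omission of the 1/(n-1) sample-variance normalization, and the comparison with ordinary PCA loadings are illustrative choices, not part of the paper.

```python
import numpy as np

def adjusted_expvar(A, Z):
    """Adjusted variance of the correlated components Y = A Z
    (QR-based definition in the spirit of Zou et al. [2006];
    the 1/(n-1) normalization is omitted here)."""
    Y = A @ Z                        # components, possibly correlated
    R = np.linalg.qr(Y, mode='r')    # Y = Q R with Q orthonormal
    return np.sum(np.diag(R) ** 2)   # sum of squared diagonal entries of R

# Illustrative usage: for orthogonal PCA loadings the adjusted variance
# coincides with the usual explained variance of ordinary PCA.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
A -= A.mean(axis=0)                            # column-centered data matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Z_pca = Vt[:2].T                               # first two (orthogonal) loadings
print(adjusted_expvar(A, Z_pca), np.sum(s[:2] ** 2))   # approximately equal
```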
A Fully Equivalent Global Pressure Formulation for Three-Phase Compressible Flow
We introduce a new global pressure formulation for immiscible three-phase
compressible flows in porous media which is fully equivalent to the original
equations, unlike the one introduced in \cite{CJ86}. In this formulation, the
total volumetric flow of the three fluids and the global pressure follow a
classical Darcy law, which simplifies the resolution of the pressure equation.
However, this global pressure formulation exists only for Total Differential
(TD) three-phase data, which depend only on two functions of saturations and
global pressure: the global capillary pressure and the global mobility. Hence
we introduce a class of interpolations which construct such TD three-phase data
from any set of three two-phase data (one for each pair of fluids) satisfying a
TD-compatibility condition.
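As a schematic reminder of the structure claimed above (generic notation only, not the precise coefficients constructed in the paper: K is the absolute permeability, d a global mobility, ρ an averaged density, g the gravity vector and S the saturations), a Darcy-type law for the total volumetric flow q and the global pressure P reads:

```latex
q \;=\; -\,K\,d(S,P)\,\bigl(\nabla P \;-\; \rho(S,P)\,g\bigr).
```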
The output least squares identifiability of the diffusion coefficient from an $H^1$-observation in a 2-D elliptic equation
Output least squares stability for the diffusion coefficient in an elliptic equation in dimension
two is analyzed. This guarantees Lipschitz stability of the solution of the least squares
formulation with respect to perturbations in the data independently of their attainability.
The analysis shows the influence of the flow direction on the parameter to be estimated.
A scale analysis for multi-scale resolution of the unknown parameter is provided.
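For orientation, a schematic output least squares formulation in this setting (generic symbols, not the exact functional of the paper: a is the diffusion coefficient, u(a) the solution of the state equation, z the observation, K an admissible set, and the observation norm is left unspecified):

```latex
-\nabla\!\cdot\bigl(a\,\nabla u(a)\bigr) = f \ \text{in } \Omega,
\qquad
\hat a \in \operatorname*{arg\,min}_{a \in K}\ \tfrac12\,\|u(a) - z\|^2 .
```

Lipschitz stability of the least squares formulation means that $\hat a$ depends Lipschitz-continuously on the data z, whether or not z is attainable as u(a) for some admissible a.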
Image Segmentation with Multidimensional Refinement Indicators
We transpose an optimal control technique to the image segmentation problem.
The idea is to consider image segmentation as a parameter estimation problem.
The parameter to estimate is the color of the pixels of the image. We use the
adaptive parameterization technique which builds iteratively an optimal
representation of the parameter into uniform regions that form a partition of
the domain, hence corresponding to a segmentation of the image. We minimize an
error function during the iterations, and the partition of the image into
regions is optimally driven by the gradient of this error. The resulting
segmentation algorithm inherits desirable properties from its optimal control
origin: soundness, robustness, and flexibility.
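A toy sketch of the kind of iterative refinement loop described above. This is an illustrative simplification, not the authors' algorithm: regions are split greedily by the squared-error reduction of a mean-threshold split, standing in for the gradient-based refinement indicators of the optimal control formulation, and the function name, stopping rule and grayscale restriction are assumptions.

```python
import numpy as np

def refine_segmentation(image, n_regions=8):
    """Iteratively split the region whose refinement indicator (here, the
    squared-error reduction of a mean-threshold split) is largest, producing
    a piecewise-constant approximation of a grayscale image."""
    labels = np.zeros(image.shape, dtype=int)          # start from one region
    while labels.max() + 1 < n_regions:
        best = None                                    # (gain, region, split mask)
        for r in np.unique(labels):
            mask = labels == r
            vals = image[mask]
            thr = vals.mean()
            low, high = vals[vals <= thr], vals[vals > thr]
            if low.size == 0 or high.size == 0:
                continue
            gain = (vals.var() * vals.size
                    - low.var() * low.size - high.var() * high.size)
            if best is None or gain > best[0]:
                best = (gain, r, mask & (image > thr))
        if best is None:
            break
        labels[best[2]] = labels.max() + 1             # split the chosen region
    return labels
```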
A block approach to group-sparse PCA: the sparsePCA package
Generalized sentinels defined via least squares
We address the problem of monitoring a linear functional $(c,x)_E$ of a Hilbert space E, the available data being the observation z, in a Hilbert space F, of a vector Ax depending linearly on x through some known operator $A \in \mathcal{L}(E,F)$. When $E = E_1 \times E_2$, $c = (c_1, 0)$ and A is injective and defined through the solution of a partial differential equation, J.L. Lions (1988, 1990) introduced sentinels $s \in F$ such that $(s, Ax)_F$ is sensitive to $x_1 \in E_1$ but insensitive to $x_2 \in E_2$. In this paper, we prove the existence, in the general case, of a generalized sentinel $(s, \eta) \in \bar F \times E$, where $\bar F \supset F$ with F dense in $\bar F$, such that for any a priori guess $x_0$ of x one has $\langle s, z \rangle_{\bar F, F} + (\eta, x_0)_E = (c, \hat x)_E$, where $\hat x$ is the least squares estimate of x closest to $x_0$, together with a family of regularized sentinels $(s_n, \eta_n) \in F \times E$ which converges to $(s, \eta)$. Generalized sentinels unify the least squares approach (by construction!) and the sentinel approach (when A is injective), and provide a general framework for the construction of "sentinels with special sensitivity" in the sense of J.L. Lions (1990).
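In display form (schematic; the spaces and the pairing follow the abstract's notation only loosely), the least squares estimate and the generalized sentinel identity read:

```latex
\hat x \;=\; \text{the minimizer of } \|Ax - z\|_F^2 \text{ over } E \text{ closest to } x_0,
\qquad
\langle s, z\rangle_{\bar F, F} \;+\; (\eta, x_0)_E \;=\; (c, \hat x)_E .
```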
A maximum curvature step and geodesic displacement for nonlinear least squares descent algorithms
We address in this paper the choice of both the step and the curve of the parameter space to be used in the line search part of descent algorithms for the minimization of least squares objective functions. Our analysis is based on the curvature of the path followed in the data space during the line search. We first define a new and easy-to-compute maximum curvature step, which gives a guaranteed value of the residual at the next iterate and satisfies a linear decrease condition. Then we optimize (i.e. minimize!) the guaranteed residual by performing the line search along a curve such that the corresponding path in the data space is a geodesic of the output set. An inexpensive implementation using a second-order approximation to the geodesic is proposed. Preliminary numerical comparisons of the proposed algorithm with two versions of the Gauss-Newton algorithm show that it works properly over a wide range of nonlinearity and tends to outperform its competitors in strongly nonlinear situations.
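For context, here is a minimal sketch of the kind of baseline the paper compares against: a damped Gauss-Newton iteration with a standard Armijo backtracking line search along a straight line in parameter space. This is generic reference code, not the proposed maximum curvature step or geodesic displacement; all names, tolerances and the test problem are illustrative.

```python
import numpy as np

def gauss_newton_backtracking(residual, jacobian, x0, n_iters=50, tol=1e-12):
    """Damped Gauss-Newton: line search along the Gauss-Newton direction
    with Armijo backtracking on f(x) = 0.5 * ||residual(x)||^2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        r, J = residual(x), jacobian(x)
        dx = np.linalg.lstsq(J, -r, rcond=None)[0]   # Gauss-Newton direction
        f0, slope = 0.5 * r @ r, (J.T @ r) @ dx      # slope < 0 for a descent direction
        t = 1.0
        while (0.5 * np.sum(residual(x + t * dx) ** 2)
               > f0 + 1e-4 * t * slope and t > 1e-12):
            t *= 0.5                                  # Armijo backtracking
        x = x + t * dx
        if np.linalg.norm(t * dx) < tol:
            break
    return x

# Illustrative usage on a small nonlinear least squares problem:
data_t = np.linspace(0.0, 1.0, 20)
data_y = np.exp(-1.3 * data_t) + 0.01
res = lambda p: np.exp(-p[0] * data_t) + p[1] - data_y
jac = lambda p: np.column_stack([-data_t * np.exp(-p[0] * data_t),
                                 np.ones_like(data_t)])
print(gauss_newton_backtracking(res, jac, np.array([0.5, 0.0])))
```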
A remark on the extension by symmetry of variational formulations and its application to vectorization
Abstract available in the attached files.
On the uniqueness of local minima for general abstract non-linear least squares problems
Abstract available in the attached files.
Quasiconvex sets and size × curvature condition. Application to nonlinear inversion
Abstract available in the attached files.
- …