
    Weighted projections into closed subspaces

    In this paper we study A-projections, i.e., operators on a Hilbert space H which act as projections when a seminorm is considered on H. A-projections were introduced by Mitra and Rao [MitRao74] for finite-dimensional spaces. We relate this concept to the theory of compatibility between positive operators and closed subspaces of H. We also study the relationship between weighted least squares problems and compatibility.
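    As a finite-dimensional illustration of the idea (a hedged sketch, not the paper's construction): if a positive semidefinite weight A induces the seminorm ||x||_A = (x' A x)^(1/2), the A-projection onto the span of the columns of B can be computed from the weighted normal equations, assuming B' A B is invertible — a compatibility-type condition. All matrices below are made-up examples.

```python
import numpy as np

def weighted_projection(A, B, x):
    """Minimize ||x - B c||_A over c; assumes B.T @ A @ B is invertible."""
    M = B.T @ A @ B
    c = np.linalg.solve(M, B.T @ A @ x)
    return B @ c

A = np.diag([1.0, 2.0, 0.0])         # seminorm: third coordinate is "invisible"
B = np.array([[1.0], [1.0], [0.0]])  # subspace spanned by (1, 1, 0)
x = np.array([3.0, 0.0, 5.0])
p = weighted_projection(A, B, x)
# The residual x - p is A-orthogonal to the subspace, as for a true projection.
```

    Note that because A is only positive semidefinite, the map behaves like a projection only with respect to the seminorm: components in the kernel of A are simply ignored.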

    Gradient-Based Estimation of Uncertain Parameters for Elliptic Partial Differential Equations

    This paper addresses the estimation of uncertain distributed diffusion coefficients in elliptic systems based on noisy measurements of the model output. We formulate the parameter identification problem as an infinite-dimensional constrained optimization problem for which we establish existence of minimizers as well as first-order necessary conditions. A spectral approximation of the uncertain observations allows us to approximate the infinite-dimensional problem by a smooth, albeit high-dimensional, deterministic optimization problem, the so-called finite noise problem in the space of functions with bounded mixed derivatives. We prove convergence of finite noise minimizers to the appropriate infinite-dimensional ones, and devise a stochastic augmented Lagrangian method for locating these numerically. Lastly, we illustrate our method with three numerical examples.
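    The core mechanism — fitting an uncertain coefficient by gradient descent on a least-squares misfit — can be shown on a drastically simplified toy (a single scalar coefficient in a 1D elliptic problem; this is an illustrative assumption, not the paper's stochastic, infinite-dimensional method). For a constant coefficient a in -(a u')' = f with homogeneous Dirichlet data, the solution is u(a) = u_1 / a, where u_1 solves the problem with a = 1, so the misfit gradient is available in closed form.

```python
import numpy as np

n = 50
h = 1.0 / (n + 1)
f = np.ones(n)
# Tridiagonal stiffness matrix for -u'' with a = 1 (standard finite differences)
K = (np.diag(2 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
u1 = np.linalg.solve(K, f)

a_true = 2.0
rng = np.random.default_rng(0)
data = u1 / a_true + 1e-4 * rng.standard_normal(n)  # noisy observations

a = 1.0  # initial guess
for _ in range(200):
    r = u1 / a - data          # residual u(a) - data
    grad = -(r @ u1) / a**2    # d/da of 0.5 * ||u1/a - data||^2
    a -= 5.0 * grad            # fixed step size, tuned for this toy problem
```

    In the paper's setting the coefficient is a distributed random field and the gradient comes from adjoint calculus rather than a closed form, but the estimation loop has the same shape.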

    A representer theorem for deep kernel learning

    In this paper we provide a finite-sample and an infinite-sample representer theorem for the concatenation of (linear combinations of) kernel functions of reproducing kernel Hilbert spaces. These results serve as a mathematical foundation for the analysis of machine learning algorithms based on compositions of functions. As a direct consequence in the finite-sample case, the corresponding infinite-dimensional minimization problems can be recast as (nonlinear) finite-dimensional minimization problems, which can be tackled with nonlinear optimization algorithms. Moreover, we show how concatenated machine learning problems can be reformulated as neural networks and how our representer theorem applies to a broad class of state-of-the-art deep learning methods.
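    A hedged sketch of the finite-sample consequence: writing each layer of a two-layer concatenated kernel model as a finite kernel expansion over the training inputs reduces the problem to optimizing two coefficient vectors. The Gaussian kernel, toy data, and crude finite-difference optimizer below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.linspace(-1, 1, 8)
y = np.sin(3 * X)

k = lambda u, v: np.exp(-(u[:, None] - v[None, :])**2)  # Gaussian kernel

def model(c1, c2, x):
    g = k(x, X) @ c1    # inner layer: g(x) = sum_j c1_j k(x, X_j)
    gX = k(X, X) @ c1   # inner-layer values at the training points
    return k(g, gX) @ c2  # outer layer composed with the inner one

def loss(c):
    c1, c2 = c[:8], c[8:]
    return np.mean((model(c1, c2, X) - y)**2)

# Crude finite-difference gradient descent with backtracking over the
# concatenated coefficient vector (16 finite-dimensional parameters in total).
c = 0.1 * rng.standard_normal(16)
loss0 = loss(c)
step = 0.5
for _ in range(200):
    grad = np.array([(loss(c + 1e-6 * e) - loss(c - 1e-6 * e)) / 2e-6
                     for e in np.eye(16)])
    while loss(c - step * grad) > loss(c):
        step /= 2
    c -= step * grad
```

    The point is structural rather than numerical: by the representer theorem, no minimizer outside this 16-parameter family needs to be considered.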

    Total least squares problems on infinite dimensional spaces

    We study weighted total least squares problems on infinite dimensional spaces. We present some necessary and sufficient conditions for the regularized problem to have a solution. The existence of a solution can also be assured for the regularized minimization problem constrained to special subsets. Furthermore, we show that regularization in infinite dimensional total least squares problems is necessary, since in most cases the problem without regularization does not admit a solution.
    Authors: Maximiliano Contino (Instituto Argentino de Matemática Alberto Calderón, CONICET, Argentina); Guillermina Fongi (CIFASIS, CONICET – Universidad Nacional de Rosario, Argentina); Alejandra Laura Maestripieri (Instituto Argentino de Matemática Alberto Calderón, CONICET, Argentina); Luis Santiago Miguel Muro (CIFASIS, CONICET – Universidad Nacional de Rosario, Argentina).
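    For intuition, the finite-dimensional analogue is instructive (the paper's point is that in infinite dimensions regularization becomes essential rather than optional): classical total least squares for A x ≈ b via the SVD of the augmented matrix, shown alongside a Tikhonov-regularized least squares solve. The data and the value of lam are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(20)

# Classical TLS: the smallest right singular vector of [A | b] gives the
# solution, provided its last component is nonzero.
_, _, Vt = np.linalg.svd(np.column_stack([A, b]))
v = Vt[-1]
x_tls = -v[:3] / v[3]

# Tikhonov-regularized least squares: the kind of stabilization that the
# infinite-dimensional theory shows cannot, in general, be dispensed with.
lam = 1e-3
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ b)
```

    In finite dimensions both solves succeed for generic data; in the infinite-dimensional setting of the paper, the unregularized problem typically has no minimizer at all.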

    Shape deformation analysis from the optimal control viewpoint

    A crucial problem in shape deformation analysis is to determine a deformation of a given shape into another one that is optimal for a certain cost; this has numerous applications, in particular in medical imaging. In this article we provide a new general approach to shape deformation analysis within the framework of optimal control theory, in which a deformation is represented as the flow of diffeomorphisms generated by time-dependent vector fields. Using reproducing kernel Hilbert spaces of vector fields, the general shape deformation analysis problem is specified as an infinite-dimensional optimal control problem with state and control constraints. In this problem, the states are diffeomorphisms and the controls are vector fields, both of them subject to constraints. The functional to be minimized is the sum of a first term, defined as the geometric norm of the control (the kinetic energy of the deformation), and a data attachment term providing a geometric distance to the target shape.

    This point of view has several advantages. First, it allows one to model general constrained shape analysis problems, which opens new issues in this field. Second, using an extension of the Pontryagin maximum principle, one can characterize the optimal solutions of the shape deformation problem in a very general way as the solutions of constrained geodesic equations. Finally, recasting general algorithms of optimal control into shape analysis yields new efficient numerical methods in shape deformation analysis. Overall, the optimal control point of view unifies and generalizes different theoretical and numerical approaches to shape deformation problems, and also allows us to design new approaches. The optimal control problems that result from this construction are infinite-dimensional and involve constraints, and thus are nonstandard. In this article we also provide a rigorous and complete analysis of the infinite-dimensional shape space problem with constraints and of its finite-dimensional approximations.
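    A toy landmark-matching problem conveys the flavor of this optimal-control formulation (a drastically simplified, hypothetical sketch, not the authors' algorithm): points q flow under a velocity field v(x) = sum_j a_j K(x, q_j) built from an RKHS kernel, and one minimizes an RKHS kinetic-energy term plus a data attachment term. Controls are held constant in time and the flow is integrated by a crude Euler scheme.

```python
import numpy as np

K = lambda x, y: np.exp(-(x[:, None] - y[None, :])**2)  # RKHS kernel

q0 = np.array([0.0, 1.0])      # initial landmarks (1D for simplicity)
target = np.array([0.5, 1.8])  # target landmarks
steps, dt, lam = 10, 0.1, 0.1  # Euler steps, step size, energy weight

def cost(a):
    q, energy = q0.copy(), 0.0
    for _ in range(steps):
        G = K(q, q)
        v = G @ a                   # velocity of the landmarks
        energy += dt * (a @ G @ a)  # RKHS kinetic energy of the field
        q = q + dt * v              # Euler step of the flow
    return lam * energy + np.sum((q - target)**2)

# Finite-difference gradient descent on the control coefficients; the full
# problem would instead use the Pontryagin maximum principle / geodesic
# equations described in the abstract.
a = np.zeros(2)
for _ in range(200):
    g = np.array([(cost(a + 1e-6 * e) - cost(a - 1e-6 * e)) / 2e-6
                  for e in np.eye(2)])
    a -= 0.2 * g
```

    Even this toy exhibits the structure the abstract describes: a state (the landmark positions) driven by a control (the kernel coefficients), with the cost trading deformation energy against distance to the target.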