
    Function approximation via the subsampled Poincaré inequality

    Function approximation and recovery from sampled data have long been studied across applied mathematics and statistics. Analytic tools such as the Poincaré inequality are useful for estimating approximation errors at different scales. The purpose of this paper is to study a generalized Poincaré inequality in which the measurement function is of subsampled type, with a small but non-zero lengthscale that will be made precise. Our analysis identifies this inequality as a basic tool for function recovery problems. We discuss and demonstrate the optimality of the inequality with respect to the subsampled lengthscale, connecting it to existing results in the literature. Applied to function approximation problems, the inequality yields approximation accuracy estimates for different basis functions and under different regularity assumptions. We observe that the error bound blows up as the subsampled lengthscale approaches zero, because the underlying function is not regular enough to have well-defined pointwise values. A weighted version of the Poincaré inequality is proposed to address this problem; its optimality is also discussed.
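
    For orientation, the classical mean-zero Poincaré inequality that the paper generalizes can be stated as follows; the subsampled variant, whose measurement functional is supported on a set with small but non-zero lengthscale, is defined precisely in the paper:

    \left\| u - \frac{1}{|\Omega|} \int_\Omega u \, dx \right\|_{L^2(\Omega)}
        \le C(\Omega) \, \| \nabla u \|_{L^2(\Omega)},
    \qquad u \in H^1(\Omega).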

    Optimal Recovery of Local Truth

    Probability mass curves the data space with horizons. Let f be a multivariate probability density function with continuous second-order partial derivatives. Consider the problem of estimating the true value of f(z) > 0 at a single point z from n independent observations. It is shown that the fastest possible estimators (such as the k-nearest-neighbor and kernel estimators) attain their minimum asymptotic mean square errors when the space of observations is regarded as conformally curved. The optimal metric is shown to be generated by the Hessian of f in the regions where the Hessian is definite. Thus, the peaks and valleys of f are surrounded by singular horizons where the Hessian changes signature from Riemannian to pseudo-Riemannian. Adaptive estimators based on the optimal variable metric show considerable theoretical and practical improvements over traditional methods. The formulas simplify dramatically when the dimension of the data space is 4. The similarities with General Relativity are striking but possibly illusory at this point. However, these results suggest that nonparametric density estimation may have something new to say about current physical theory.
    Comment: To appear in Proceedings of Maximum Entropy and Bayesian Methods 1999. See also: http://omega.albany.edu:8008
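
    As a baseline for the estimators discussed above, the following minimal Python sketch evaluates a classical fixed-bandwidth Gaussian kernel density estimate at a single point; the function name, bandwidth, and flat Euclidean metric are illustrative assumptions, not the paper's adaptive variable-metric construction:

    import numpy as np

    def kde_at_point(X, z, h):
        """Classical fixed-bandwidth Gaussian kernel estimate of f(z).

        X : (n, d) array of i.i.d. observations
        z : (d,) query point
        h : bandwidth; the paper's adaptive estimator would instead
            vary the metric using the Hessian of f (this is the baseline).
        """
        n, d = X.shape
        # squared Euclidean distances from z to every observation
        sq = np.sum((X - z) ** 2, axis=1)
        # normalized Gaussian kernel values
        k = np.exp(-sq / (2 * h ** 2)) / ((2 * np.pi) ** (d / 2) * h ** d)
        return k.sum() / n

    # usage: estimate a 2-D standard normal density at the origin
    rng = np.random.default_rng(0)
    X = rng.standard_normal((10_000, 2))
    print(kde_at_point(X, np.zeros(2), h=0.3))  # true value is 1/(2*pi) ~ 0.159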

    Applying a phase field approach for shape optimization of a stationary Navier-Stokes flow

    We apply a phase field approach to a general shape optimization problem for a stationary Navier-Stokes flow. To be precise, we add a multiple of the Ginzburg-Landau energy as a regularization to the objective functional and relax the non-permeability of the medium outside the fluid region. The resulting diffuse interface problem can be shown to be well-posed, and optimality conditions are derived. We state suitable assumptions on the problem in order to derive a sharp interface limit for the minimizers and the optimality conditions. Additionally, we derive a necessary optimality system for the sharp interface problem by geometric variations, without imposing additional regularity assumptions on the minimizing set.
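
    For reference, the Ginzburg-Landau energy used as the regularization term typically takes the form below, with a double-well potential \psi (the paper's exact choice of potential and scaling may differ):

    E_\varepsilon(\varphi) = \int_\Omega \frac{\varepsilon}{2} |\nabla \varphi|^2
        + \frac{1}{\varepsilon} \, \psi(\varphi) \, dx,
    \qquad \text{e.g. } \psi(\varphi) = \frac{1}{4} (1 - \varphi^2)^2.

    As \varepsilon \to 0 this energy converges (in the sense of \Gamma-convergence, up to a constant) to the perimeter of the interface, which is the mechanism behind the sharp interface limit mentioned above.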

    On the Optimal Recovery of Graph Signals

    Learning a smooth graph signal from partially observed data is a well-studied task in graph-based machine learning. We consider this task from the perspective of optimal recovery, a mathematical framework for learning a function from observational data that adopts a worst-case perspective tied to model assumptions on the function to be learned. Earlier work in the optimal recovery literature has shown that minimizing a regularized objective produces optimal solutions for a general class of problems, but did not fully identify the regularization parameter. Our main contribution provides a way to compute regularization parameters that are optimal or near-optimal (depending on the setting), specifically for graph signal processing problems. Our results offer a new interpretation of classical optimization techniques in graph-based learning and also yield new insights for hyperparameter selection. We illustrate the potential of our methods in numerical experiments on several semi-synthetic graph signal processing datasets.
    Comment: Accepted by the 14th International Conference on Sampling Theory and Applications (SampTA 2023).
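
    To make the regularized objective concrete, the sketch below performs Laplacian-regularized recovery for a fixed regularization parameter tau; recover_graph_signal is an illustrative name, and choosing tau (near-)optimally is precisely the paper's contribution:

    import numpy as np

    def recover_graph_signal(L, obs_idx, y, tau):
        """Solve  min_x ||x[obs_idx] - y||^2 + tau * x^T L x.

        L       : (n, n) graph Laplacian
        obs_idx : indices of observed vertices
        y       : observed values at those vertices
        tau     : regularization parameter (assumed given here)
        """
        n = L.shape[0]
        M = np.zeros((len(obs_idx), n))
        M[np.arange(len(obs_idx)), obs_idx] = 1.0  # sampling operator
        # normal equations of the quadratic objective
        A = M.T @ M + tau * L
        return np.linalg.solve(A, M.T @ y)

    # usage: recover a signal on a 5-vertex path graph from 2 samples
    A = np.diag(np.ones(4), 1); A = A + A.T   # path-graph adjacency
    L = np.diag(A.sum(1)) - A                 # combinatorial Laplacian
    x_hat = recover_graph_signal(L, np.array([0, 4]), np.array([0.0, 1.0]), tau=0.1)

    For a connected graph with at least one observed vertex, the system matrix is positive definite, so the linear solve is well-posed.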

    Constrained optimization in classes of analytic functions with prescribed pointwise values

    We consider an overdetermined problem for the Laplace equation on a disk with partial boundary data, where additional pointwise data inside the disk have to be taken into account. After reformulation, this ill-posed problem reduces to a bounded extremal problem: best norm-constrained approximation of partial L2 boundary data by traces of holomorphic functions that satisfy given pointwise interpolation conditions. The problem of best norm-constrained approximation of a given L2 function on a subset of the circle by the trace of an H2 function was considered in [Baratchart & Leblond, 1998]. In the present work, we extend that formulation to the case where additional interpolation conditions are imposed. We also obtain new results applicable to the original problem: we carry out a stability analysis and propose a novel method for evaluating the approximation and blow-up rates of the solution in terms of a Lagrange parameter, leading to a highly efficient computational algorithm for solving the problem.
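
    Schematically (our notation, not the paper's), the bounded extremal problem with interpolation conditions reads:

    \min \Big\{ \, \| f - g \|_{L^2(I)} \; : \; g \in H^2, \;
        \| g \|_{L^2(J)} \le M, \; g(z_k) = \omega_k, \; k = 1, \dots, m \Big\},

    where I is the subset of the unit circle carrying the given boundary data f, J is its complement, M > 0 is the norm constraint, and the z_k are the interior points with prescribed values \omega_k.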