
    Optimal discretization of inverse problems in Hilbert scales. Regularization and self-regularization of projection methods

    We study the efficiency of approximate solutions of ill-posed problems based on discretized observations, which we assume to be given beforehand. We restrict ourselves to problems that can be formulated in Hilbert scales. Within this framework we quantify the degree of ill-posedness and provide general conditions on projection schemes to achieve the best possible order of accuracy. We pay particular attention to the problem of self-regularization versus Tikhonov regularization. Moreover, we study the information complexity: asymptotically, any method that achieves the best possible order of accuracy must use at least that number of noisy observations. We complement our study with two specific problems, Abel's integral equation and the recovery of continuous functions from noisy coefficients with respect to a given orthonormal system, both classical ill-posed problems
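    The contrast between Tikhonov regularization and self-regularization by projection can be illustrated on a small discretized problem. The operator, noise level, truncation level, and regularization parameter below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Toy discretized ill-posed problem; all constants are illustrative.
rng = np.random.default_rng(0)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 1.0 / np.arange(1, n + 1) ** 2          # polynomially decaying singular values
A = U @ np.diag(s) @ V.T

x_true = V[:, 0] + 0.5 * V[:, 1]            # smooth true solution
y = A @ x_true + 1e-6 * rng.standard_normal(n)   # noisy observations

# Tikhonov regularization: x_alpha = (A^T A + alpha I)^{-1} A^T y
alpha = 1e-8
x_tik = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

# Self-regularization: project onto the first m singular subspaces only
m = 10
x_proj = V[:, :m] @ ((U[:, :m].T @ y) / s[:m])

print(np.linalg.norm(x_tik - x_true), np.linalg.norm(x_proj - x_true))
```

    Both filters damp the amplification of noise by the small singular values; in the projection variant the truncation level m itself plays the role of the regularization parameter.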

    Self-regularization of projection methods with a posteriori discretization level choice for severely ill-posed problems

    It is well known that projection schemes for certain linear ill-posed problems Ax = y can be regularized by a proper choice of the discretization level alone, so that no additional regularization is needed. The previous study of this self-regularization phenomenon was restricted to the case of so-called moderately ill-posed problems, i.e., when the singular values σ_k(A), k = 1, 2, ..., of the operator A tend to zero at a polynomial rate. The main accomplishment of the present paper is a new strategy for the discretization level choice that provides optimal-order accuracy also for severely ill-posed problems, i.e., when σ_k(A) tend to zero exponentially. The proposed strategy requires no a priori information about the solution smoothness or the exact rate of σ_k(A)
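    A minimal sketch of an a posteriori discretization level choice in this spirit: increase the truncation level m until the residual drops below a multiple of the noise level. The exponentially decaying operator, the constant τ, and the stopping rule itself are illustrative assumptions, not the paper's strategy:

```python
import numpy as np

# Severely ill-posed toy problem: singular values decay exponentially.
rng = np.random.default_rng(1)
n = 40
s = np.exp(-np.arange(1, n + 1) / 4.0)
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(s) @ V.T

x_true = V[:, :3] @ np.array([1.0, -0.5, 0.25])
delta = 1e-6                                   # assumed known noise level
y = A @ x_true + delta * rng.standard_normal(n)

# A posteriori choice: grow m until the residual is of the order of the noise.
tau = 2.0
for m in range(1, n + 1):
    x_m = V[:, :m] @ ((U[:, :m].T @ y) / s[:m])   # projection solution at level m
    if np.linalg.norm(A @ x_m - y) <= tau * delta * np.sqrt(n):
        break
print(m, np.linalg.norm(x_m - x_true))
```

    No knowledge of the solution smoothness or of the exact decay rate of the singular values enters the stopping test, only the noise level.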

    The discretized discrepancy principle under general source conditions

    We discuss adaptive strategies for choosing regularization parameters in Tikhonov–Phillips regularization of discretized linear operator equations. Two rules turn out to be based entirely on data from the underlying regularization scheme. Among them, only the discrepancy principle allows us to search for the optimal regularization parameter starting from the easiest problem. This potential advantage cannot be achieved by the standard projection scheme. We present a modified scheme in which the discretization level varies with the successive regularization parameters, and which does achieve the advantage mentioned above
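    The discrepancy principle for Tikhonov regularization can be sketched as follows: walk down a geometric sequence of regularization parameters and stop at the first α whose residual falls below a multiple of the noise level. The operator and all constants below are illustrative assumptions, and the paper's modified scheme additionally varies the discretization level, which this sketch omits:

```python
import numpy as np

# Toy discretized problem for illustrating the discrepancy principle.
rng = np.random.default_rng(2)
n = 60
s = 1.0 / np.arange(1, n + 1) ** 1.5
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(s) @ V.T

x_true = V[:, :2] @ np.array([1.0, 0.5])
delta = 1e-4                                   # assumed known noise level
y = A @ x_true + delta * rng.standard_normal(n)

# Geometric sequence alpha, q*alpha, q^2*alpha, ...; stop when the
# residual is of the order of the noise (discrepancy principle).
tau, q = 1.5, 0.5
alpha = 1.0
while True:
    x_a = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)
    if np.linalg.norm(A @ x_a - y) <= tau * delta * np.sqrt(n):
        break
    alpha *= q
print(alpha, np.linalg.norm(x_a - x_true))
```

    Each trial α requires solving a regularized system; the scheme in the abstract saves work by coupling a coarser discretization to the larger, "easier" values of α.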

    Mini-Workshop: Statistical Methods for Inverse Problems

    Inverse problems appear naturally in a broad range of applications. Numerical analysis and statistics have – often independently – developed methods for regularisation and inversion. The aim of this mini-workshop is to bring together these methods and to consider their use in applications, with a focus on mathematical finance

    Iterative Regularization in Nonparametric Instrumental Regression

    We consider the nonparametric regression model with an additive error that is correlated with the explanatory variables. We suppose the existence of instrumental variables that are used in this model for the identification and estimation of the regression function. Nonparametric estimation by instrumental variables is an ill-posed linear inverse problem with an unknown but estimable operator. We provide a new estimator of the regression function using an iterative regularization method (the Landweber–Fridman method). The optimal number of iterations and the convergence of the mean square error of the resulting estimator are derived under both mild and severe degrees of ill-posedness. A Monte Carlo exercise shows the impact of some parameters on the estimator and confirms the reasonable finite-sample performance of the new estimator.

    Keywords: nonparametric estimation; instrumental variable; ill-posed inverse problem; iterative method; estimation by projection
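    A minimal sketch of the Landweber iteration on a toy linear ill-posed problem; the abstract's setting involves an estimated operator and a statistical noise model, which this sketch ignores, and the operator, step size, and iteration count are assumptions:

```python
import numpy as np

# Toy ill-posed problem for illustrating Landweber iteration.
rng = np.random.default_rng(3)
n = 50
s = 1.0 / np.arange(1, n + 1)
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(s) @ V.T

x_true = V[:, 0]
delta = 1e-4
y = A @ x_true + delta * rng.standard_normal(n)

# Landweber iteration: x_{k+1} = x_k + mu * A^T (y - A x_k),
# with step size 0 < mu < 2 / ||A||^2 (here ||A|| = 1).
mu = 1.0
x = np.zeros(n)
errors = []
for k in range(200):
    x = x + mu * A.T @ (y - A @ x)
    errors.append(np.linalg.norm(x - x_true))
print(min(errors))
```

    The number of iterations acts as the regularization parameter: stopping early keeps the noise amplification in check, which is why the abstract derives an optimal number of iterations.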
