
    Iterative Estimation of Solutions to Noisy Nonlinear Operator Equations in Nonparametric Instrumental Regression

    This paper discusses the solution of nonlinear integral equations with noisy integral kernels as they appear in nonparametric instrumental regression. We propose a regularized Newton-type iteration and establish convergence and convergence rate results. Particular emphasis is placed on instrumental regression models in which the usual conditional mean assumption is replaced by a stronger independence assumption. For the case of a binary instrument, we demonstrate that our approach allows the correct estimation of regression functions that are not identifiable under the standard model. This is illustrated in computed examples with simulated data.
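    To make the type of iteration concrete, the following is a minimal finite-dimensional sketch of a regularized (Gauss-)Newton update of the kind described above, applied to a hypothetical toy operator rather than the paper's instrumental-regression setting; the function names, the toy operator, and the geometric decay of the regularization parameter are illustrative assumptions, not the paper's estimator.

        import numpy as np

        def irgnm(F, dF, g_obs, u0, alpha0=1.0, q=0.5, n_iter=15):
            """Iteratively regularized Gauss-Newton sketch for F(u) = g (toy, finite-dimensional)."""
            u, alpha = u0.copy(), alpha0
            for _ in range(n_iter):
                J = dF(u)                                        # Jacobian F'(u)
                rhs = J.T @ (g_obs - F(u)) + alpha * (u0 - u)
                u = u + np.linalg.solve(J.T @ J + alpha * np.eye(u.size), rhs)
                alpha *= q                                       # decrease regularization geometrically
            return u

        # Toy nonlinear operator F(u) = A u + 0.1 u^3 (componentwise cube), used only for illustration
        rng = np.random.default_rng(0)
        A = rng.normal(size=(20, 20))
        A = A @ A.T + 0.1 * np.eye(20)
        F = lambda u: A @ u + 0.1 * u**3
        dF = lambda u: A + np.diag(0.3 * u**2)
        u_true = rng.normal(size=20)
        g_noisy = F(u_true) + 0.01 * rng.normal(size=20)
        print(np.linalg.norm(irgnm(F, dF, g_noisy, np.zeros(20)) - u_true) / np.linalg.norm(u_true))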

    Regularized Newton Methods for X-ray Phase Contrast and General Imaging Problems

    Like many other advanced imaging methods, x-ray phase contrast imaging and tomography require mathematical inversion of the observed data to obtain real-space information. While an accurate forward model describing the generally nonlinear image formation from a given object to the observations is often available, explicit inversion formulas are typically not known. Moreover, the measured data might be insufficient for stable image reconstruction, in which case it has to be complemented by suitable a priori information. In this work, regularized Newton methods are presented as a general framework for the solution of such ill-posed nonlinear imaging problems. As a proof of principle, the approach is applied to x-ray phase contrast imaging in the near-field propagation regime. Simultaneous recovery of the phase and amplitude from a single near-field diffraction pattern without homogeneity constraints is demonstrated for the first time. The presented methods further permit all-at-once phase contrast tomography, i.e. simultaneous phase retrieval and tomographic inversion. We demonstrate the potential of this approach by three-dimensional imaging of a colloidal crystal at 95 nm isotropic resolution.
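    As a point of reference, the nonlinear forward map that such a Newton scheme linearizes in the near-field regime can be sketched as Fresnel propagation of the object's exit wave followed by taking the intensity. The sketch below uses the standard paraxial transfer-function form with illustrative parameter names and units; it is an assumption-laden toy, not the exact operator or discretization used in the paper.

        import numpy as np

        def near_field_intensity(phase, absorption, wavelength, distance, pixel_size):
            """Sketch of a near-field forward model: exit wave -> Fresnel propagation -> hologram intensity."""
            psi0 = np.exp(1j * phase - absorption)               # object exit wave (no homogeneity constraint)
            n = phase.shape[0]
            f = np.fft.fftfreq(n, d=pixel_size)
            fx, fy = np.meshgrid(f, f, indexing="ij")
            kernel = np.exp(-1j * np.pi * wavelength * distance * (fx**2 + fy**2))  # paraxial Fresnel kernel
            psi_z = np.fft.ifft2(np.fft.fft2(psi0) * kernel)     # propagate to the detector plane
            return np.abs(psi_z) ** 2                            # measured diffraction pattern

        # Illustrative call with hypothetical values (metres): 1 nm wavelength, 1 cm distance, 100 nm pixels
        y, x = np.indices((256, 256))
        phase = 0.5 * np.exp(-((x - 128)**2 + (y - 128)**2) / (2 * 20.0**2))  # toy Gaussian phase object
        absorption = np.zeros((256, 256))
        hologram = near_field_intensity(phase, absorption, 1e-9, 1e-2, 1e-7)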

    Iteratively regularized Newton-type methods for general data misfit functionals and applications to Poisson data

    We study Newton-type methods for inverse problems described by nonlinear operator equations F(u) = g in Banach spaces, where the Newton equations F'(u_n; u_{n+1} - u_n) = g - F(u_n) are regularized variationally using a general data misfit functional and a convex regularization term. This generalizes the well-known iteratively regularized Gauss-Newton method (IRGNM). We prove convergence and convergence rates as the noise level tends to 0, both for an a priori stopping rule and for a Lepskiĭ-type a posteriori stopping rule. Our analysis includes previous order-optimal convergence rate results for the IRGNM as special cases. The main focus of this paper is on inverse problems with Poisson data, where the natural data misfit functional is given by the Kullback-Leibler divergence. Two examples of such problems are discussed in detail: an inverse obstacle scattering problem with amplitude data of the far-field pattern and a phase retrieval problem. The performance of the proposed method for these problems is illustrated in numerical examples.
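    For concreteness, a Newton step of this kind applies the Kullback-Leibler misfit to the linearized forward model and adds a convex penalty. The finite-dimensional sketch below uses a quadratic penalty, a generic optimizer, and a toy positive-valued operator, all of which are illustrative assumptions rather than the paper's exact choices.

        import numpy as np
        from scipy.optimize import minimize

        def kl_misfit(g_obs, g_model, eps=1e-12):
            """Kullback-Leibler data misfit for Poisson-type data, up to terms independent of g_model."""
            g_model = np.maximum(g_model, eps)                   # crude guard against non-positive values
            return np.sum(g_model - g_obs * np.log(g_model))

        def newton_step_kl(F, dF, g_obs, u_n, u0, alpha):
            """One regularized Newton step: minimize KL(g_obs, F(u_n) + F'(u_n)(u - u_n)) + alpha*||u - u0||^2."""
            Fn, Jn = F(u_n), dF(u_n)
            def objective(u):
                linearized = Fn + Jn @ (u - u_n)                 # Newton linearization of the forward operator
                return kl_misfit(g_obs, linearized) + alpha * np.sum((u - u0) ** 2)
            return minimize(objective, u_n, method="L-BFGS-B").x

        # Toy usage with a positive-valued forward map and simulated Poisson counts
        rng = np.random.default_rng(1)
        A = np.abs(rng.normal(size=(30, 10)))
        F = lambda u: A @ np.exp(u)
        dF = lambda u: A * np.exp(u)                             # Jacobian: column j of A scaled by exp(u_j)
        u_true = rng.normal(scale=0.3, size=10)
        g_obs = rng.poisson(50 * F(u_true)) / 50.0
        u1 = newton_step_kl(F, dF, g_obs, np.zeros(10), np.zeros(10), alpha=0.1)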

    Convergence rates for variational regularization of inverse problems in exponential families

    We consider inverse problems with statistical noise. Using regularization methods, one can approximate the true solution of the inverse problem by a regularized solution. Previously established convergence rates for variational regularization with Poisson and empirical process data are shown to be suboptimal. In this thesis we obtain improved convergence rates for variational regularization methods for nonlinear ill-posed inverse problems with certain stochastic noise models described by exponential families, and we derive better reconstruction error bounds by applying deviation inequalities for stochastic processes in suitable function spaces. Furthermore, we also consider an iteratively regularized Newton method as an alternative when the operator is nonlinear. Due to the difficulty of deriving suitable deviation inequalities for stochastic processes in these function spaces, we are currently not able to obtain optimal convergence rates for variational regularization, so we state the desired result as a conjecture. If the conjecture holds true, the desired rates follow immediately.
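    For orientation, variational regularization estimators of the kind discussed above can be written in the generic Tikhonov-type form below; this is a sketch in standard notation, since the concrete misfit and penalty depend on the exponential-family noise model and are not spelled out in the abstract:

        \hat{u}_\alpha \in \operatorname*{argmin}_{u} \; \mathcal{S}\bigl(g^{\mathrm{obs}}; F(u)\bigr) + \alpha\, \mathcal{R}(u)

    where \mathcal{S} is the data misfit induced by the noise model (the Kullback-Leibler divergence in the Poisson case), \mathcal{R} is a convex penalty, and the regularization parameter \alpha > 0 balances data fidelity against stability.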