On the Regularizing Property of Stochastic Gradient Descent
Stochastic gradient descent is one of the most successful approaches for
solving large-scale problems, especially in machine learning and statistics. At
each iteration, it employs an unbiased estimator of the full gradient computed
from one single randomly selected data point. Hence, it scales well with
problem size, is very attractive for truly massive datasets, and holds
significant potential for solving large-scale inverse problems. In the recent
machine learning literature, it has been empirically observed that, when
equipped with early stopping, it has a regularizing property. In this work, we
rigorously establish its regularizing property (under an \textit{a priori}
early stopping rule), and also prove convergence rates under the canonical
sourcewise condition, for minimizing the quadratic functional for linear
inverse problems. This is achieved by combining tools from classical
regularization theory and stochastic analysis. Further, we analyze the
preasymptotic weak and strong convergence behavior of the algorithm. The
theoretical findings shed insight into the performance of the algorithm, and
are complemented with illustrative numerical experiments. Comment: 22 pages,
better presentation
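The scheme described in the abstract can be sketched in a few lines: plain SGD on the least-squares functional, with the a priori stopping index acting as the regularization parameter. The problem sizes, step size, and stopping index below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem A x = y with noisy data (sizes and noise level illustrative).
n, d = 200, 50
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:5] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(n)

def sgd_early_stop(A, y, eta=5e-3, T=5000):
    """SGD on the quadratic functional (1/2n)||Ax - y||^2, stopped after
    an a priori fixed number of iterations T (the regularization parameter)."""
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(T):
        i = rng.integers(n)               # one randomly selected data point
        grad = (A[i] @ x - y[i]) * A[i]   # unbiased estimate of the full gradient (up to the 1/n factor)
        x -= eta * grad
    return x

x_hat = sgd_early_stop(A, y)
```

Stopping at a fixed T rather than iterating to convergence is what supplies the regularization: running longer would eventually fit the noise in y.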
An Analysis of Finite Element Approximation in Electrical Impedance Tomography
We present a finite element analysis of electrical impedance tomography for
reconstructing the conductivity distribution from electrode voltage
measurements by means of Tikhonov regularization. Two popular choices of the
penalty term, i.e., a smoothness norm penalty and the total variation
seminorm penalty, are considered. A piecewise linear finite element method is
employed for discretizing the forward model, i.e., the complete electrode
model, the conductivity, and the penalty functional. The convergence of the
finite element approximations for the Tikhonov model on both polyhedral and
smooth curved domains is established. This provides rigorous justification for
the ad hoc discretization procedures in the literature. Comment: 20 pages
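The Tikhonov formulation used above can be illustrated on a toy linear forward map standing in for the nonlinear complete electrode model (which would require a full FEM solver); the operator, discrete smoothness penalty, and regularization parameter below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear forward map K (stand-in for the complete electrode model).
m, d = 30, 100
K = rng.standard_normal((m, d)) / np.sqrt(m)
sigma_true = np.exp(-np.linspace(-3, 3, d) ** 2)   # smooth "conductivity"
V = K @ sigma_true + 0.01 * rng.standard_normal(m)  # noisy "voltage" data

# Discrete first-difference matrix: a smoothness (seminorm-type) penalty.
D = np.diff(np.eye(d), axis=0)

# Normal equations of min_s ||K s - V||^2 + alpha ||D s||^2
alpha = 1e-2
sigma_hat = np.linalg.solve(K.T @ K + alpha * D.T @ D, K.T @ V)
```

The same structure carries over to the paper's setting, with K replaced by the FEM-discretized electrode model and D by the discretized penalty functional; the convergence analysis concerns exactly this discretization step.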
Iterative Soft/Hard Thresholding with Homotopy Continuation for Sparse Recovery
In this note, we analyze an iterative soft/hard thresholding algorithm with
homotopy continuation for recovering a sparse signal from noisy data of a
given noise level. Under suitable regularity and sparsity conditions, we
design a path along which the algorithm can find a solution with a sharp
reconstruction error, at an iteration complexity depending on the problem
dimensionality and on a parameter controlling the length of the path.
Numerical examples are given to illustrate its performance. Comment: 5 pages,
4 figures
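The soft-thresholding variant with continuation can be sketched as iterative shrinkage along a geometrically decreasing threshold path; the instance sizes, decrease factor gamma, and inner iteration count below are illustrative assumptions (gamma plays the role of the path-length parameter).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sparse recovery instance (sizes, sparsity, and noise level illustrative).
n, d, s = 100, 256, 5
A = rng.standard_normal((n, d)) / np.sqrt(n)     # columns have roughly unit norm
x_true = np.zeros(d)
support = rng.choice(d, size=s, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=s)
y = A @ x_true + 0.001 * rng.standard_normal(n)

def soft(z, t):
    """Soft-thresholding (shrinkage) operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ist_homotopy(A, y, lam_min, gamma=0.8, inner=50):
    """Iterative soft thresholding along the decreasing threshold path
    lam_0 * gamma^k; gamma controls the length of the path."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # gradient step below 1/||A||_2^2
    x = np.zeros(A.shape[1])
    lam = np.max(np.abs(A.T @ y))            # at this threshold, x = 0 is optimal
    while lam > lam_min:
        for _ in range(inner):               # a few warm-started inner iterations per level
            x = soft(x + step * A.T @ (y - A @ x), step * lam)
        lam *= gamma
    return x

x_hat = ist_homotopy(A, y, lam_min=0.01)
```

The hard-thresholding variant replaces `soft` with a keep-or-kill rule on entries above the threshold; the continuation structure is the same.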
- …