Estimating Nuisance Parameters in Inverse Problems
Many inverse problems include nuisance parameters which, while not of direct
interest, are required to recover primary parameters. Structure present in
these problems allows efficient optimization strategies - a well-known example
is variable projection, where nonlinear least squares problems that are linear
in some parameters can be optimized very efficiently. In this paper, we extend
the idea of projecting out a subset of the variables to a broad class of
maximum likelihood (ML) and maximum a posteriori (MAP) problems with
nuisance parameters, such as variance or degrees of freedom. As a result, we
are able to incorporate nuisance parameter estimation into large-scale
constrained and unconstrained inverse problem formulations. We apply the
approach to a variety of problems, including estimation of unknown variance
parameters in the Gaussian model, degree of freedom (d.o.f.) parameter
estimation in the context of robust inverse problems, automatic calibration,
and optimal experimental design. Using numerical examples, we demonstrate
improvement in recovery of primary parameters for several large-scale inverse
problems. The proposed approach is compatible with a wide variety of algorithms
and formulations, and its implementation requires only minor modifications to
existing algorithms.
Comment: 16 pages, 5 figures
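The projection idea described above can be illustrated on the simplest case the abstract mentions: a Gaussian model with an unknown variance as the nuisance parameter. The following is a minimal sketch (not the paper's code; the toy linear model and all names are illustrative): for a fixed primary parameter x, the variance that maximizes the likelihood has the closed form sigma^2 = ||r(x)||^2 / n, and substituting it back yields a reduced objective in x alone.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))          # toy forward operator
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true + 0.3 * rng.normal(size=50)
n = len(y)

def neg_log_likelihood(params):
    # full ML objective over (x, log sigma^2); the variance is a nuisance
    x, log_s2 = params[:3], params[3]
    r = y - A @ x
    return 0.5 * n * log_s2 + 0.5 * np.exp(-log_s2) * (r @ r)

def reduced_objective(x):
    # variance projected out in closed form: sigma^2_hat(x) = ||r||^2 / n
    r = y - A @ x
    return 0.5 * n * np.log(r @ r / n) + 0.5 * n

res = minimize(reduced_objective, np.zeros(3))
r = y - A @ res.x
s2_hat = r @ r / n                    # recover the nuisance parameter afterwards
```

Because the reduced objective is the full negative log-likelihood evaluated at the optimal variance, both formulations agree at the solution; only the dimension of the search space changes.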
Separable nonlinear least squares fitting with linear bound constraints and its application in magnetic resonance spectroscopy data quantification
An application in magnetic resonance spectroscopy quantification models a signal as a linear combination of nonlinear functions. This leads to a separable nonlinear least squares fitting problem with linear bound constraints on some variables. The variable projection (VARPRO) technique can be applied to this problem, but needs to be adapted in several respects. If only the nonlinear variables are subject to constraints, then the Levenberg–Marquardt minimization algorithm that is classically used by the VARPRO method should be replaced with a version that can incorporate those constraints. If some of the linear variables are also constrained, then they cannot be projected out via a closed-form expression as is the case for the classical VARPRO technique. We show how quadratic programming problems can be solved instead, and we provide details on efficient function and approximate Jacobian evaluations for the inequality-constrained VARPRO method.
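The adaptation described in this abstract, replacing the closed-form projection by a bound-constrained linear solve inside a bound-constrained outer iteration, can be sketched as follows. This is an illustrative toy (a two-exponential model with nonnegative amplitudes, not the MRS model or the authors' code), using SciPy's `lsq_linear` for the inner bounded linear fit and `least_squares` for the bounded outer nonlinear fit:

```python
import numpy as np
from scipy.optimize import least_squares, lsq_linear

# toy separable model: y(t) ~ c1*exp(-a1*t) + c2*exp(-a2*t), with c >= 0
rng = np.random.default_rng(1)
t = np.linspace(0, 4, 80)
y = 2.0 * np.exp(-0.7 * t) + 1.0 * np.exp(-2.5 * t) + 0.01 * rng.normal(size=t.size)

def basis(alpha):
    # columns are the nonlinear basis functions evaluated on the grid
    return np.column_stack([np.exp(-a * t) for a in alpha])

def residual(alpha):
    # inner problem: a bound-constrained linear solve replaces the
    # closed-form projection of classical VARPRO
    Phi = basis(alpha)
    c = lsq_linear(Phi, y, bounds=(0, np.inf)).x
    return Phi @ c - y

# outer problem: bounds on the nonlinear decay rates
sol = least_squares(residual, x0=[0.5, 3.0], bounds=([0, 0], [10, 10]))
alpha_hat = sol.x
c_hat = lsq_linear(basis(alpha_hat), y, bounds=(0, np.inf)).x
```

The nested solve is the simplest substitute for the quadratic-programming approach the paper develops; the paper additionally treats efficient function and approximate Jacobian evaluations, which this sketch omits.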
A Method for Computing Inverse Parametric PDE Problems with Random-Weight Neural Networks
We present a method for computing the inverse parameters and the solution
field to inverse parametric PDEs based on randomized neural networks. This
extends the local extreme learning machine technique originally developed for
forward PDEs to inverse problems. We develop three algorithms for training the
neural network to solve the inverse PDE problem. The first algorithm (NLLSQ)
determines the inverse parameters and the trainable network parameters all
together by the nonlinear least squares method with perturbations
(NLLSQ-perturb). The second algorithm (VarPro-F1) eliminates the inverse
parameters from the overall problem by variable projection to attain a reduced
problem about the trainable network parameters only. It solves the reduced
problem first by the NLLSQ-perturb algorithm for the trainable network
parameters, and then computes the inverse parameters by the linear least
squares method. The third algorithm (VarPro-F2) eliminates the trainable
network parameters from the overall problem by variable projection to attain a
reduced problem about the inverse parameters only. It solves the reduced
problem for the inverse parameters first, and then computes the trainable
network parameters afterwards. VarPro-F1 and VarPro-F2 are reciprocal to each
other in a sense. The presented method produces accurate results for inverse
PDE problems, as shown by the numerical examples herein. For noise-free data,
the errors for the inverse parameters and the solution field decrease
exponentially as the number of collocation points or the number of trainable
network parameters increases, and can reach a level close to the machine
accuracy. For noisy data, the accuracy degrades compared with the case of
noise-free data, but the method remains quite accurate. The presented method
has been compared with the physics-informed neural network method.
Comment: 40 pages, 8 figures, 34 tables
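The building block shared by all three algorithms above is the random-weight (extreme-learning-machine-style) network: the hidden-layer weights are fixed at random values, so the output layer is linear in its coefficients and can be eliminated by a single linear least squares solve, exactly the structure variable projection exploits. A minimal sketch in plain NumPy (a forward function-fitting toy, not the paper's inverse-PDE code; the target function and layer sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 200)[:, None]   # collocation points
u = np.sin(np.pi * x).ravel()          # target field to represent

# random-weight network: hidden weights and biases are fixed random
# values and never trained
W = rng.uniform(-3, 3, size=(1, 100))
b = rng.uniform(-3, 3, size=100)
H = np.tanh(x @ W + b)                 # hidden-layer features, shape (200, 100)

# the output weights enter linearly, so they are obtained by one
# linear least squares solve (the "projection" step)
beta, *_ = np.linalg.lstsq(H, u, rcond=None)
u_hat = H @ beta
```

In the paper's inverse setting this linear solve is one half of the alternation; the remaining (inverse or network) parameters are handled by the nonlinear least squares outer loop.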
Assessment of a Variable Projection Algorithm for Trace Gas Retrieval in the Short-Wave Infrared
An important part of atmospheric remote sensing is monitoring the composition of the atmosphere, which can be retrieved from radiance measurements, e.g. in the short-wave infrared (SWIR). For deriving trace gas concentrations in the SWIR spectral region, a radiative transfer model is fitted to observations by least squares optimization. The aim of this thesis is to present the well-established variable projection method for solving separable nonlinear least squares problems and to examine and configure it for trace gas retrieval. For this, a Python implementation of the algorithm, called varpro.py, will be outlined and later utilized in retrievals with real satellite observations. These retrievals are meant to assess the efficiency, accuracy and robustness of three iterative algorithms for nonlinear least squares problems that have been built into varpro.py. Furthermore, a new feature - applying bounds to the nonlinear fit parameters - will be included in the implementation and evaluated for its quality and usefulness. As a result of these tests, a new 'default' configuration will be suggested, based on the algorithm with the best performance for trace gas retrieval. In addition, analysis and testing strategies which could lead to further insights will be proposed. Finally, possible future applications for trace gas retrieval will be motivated and suggestions for further research and modifications of varpro.py will be made.
Representing complex data using localized principal components with application to astronomical data
Often the relation between the variables constituting a multivariate data
space might be characterized by one or more of the terms: ``nonlinear'',
``branched'', ``disconnected'', ``bended'', ``curved'', ``heterogeneous'', or,
more generally, ``complex''. In these cases, simple principal component analysis
(PCA) as a tool for dimension reduction can fail badly. Of the many alternative
approaches proposed so far, local approximations of PCA are among the most
promising. This paper will give a short review of localized versions of PCA,
focusing on local principal curves and local partitioning algorithms.
Furthermore we discuss projections other than the local principal components.
When performing local dimension reduction for regression or classification
problems it is important to focus not only on the manifold structure of the
covariates, but also on the response variable(s). Local principal components
only achieve the former, whereas localized regression approaches concentrate on
the latter. Local projection directions derived from the partial least squares
(PLS) algorithm offer an interesting trade-off between these two objectives. We
apply these methods to several real data sets. In particular, we consider
simulated astrophysical data from the future Galactic survey mission Gaia.
Comment: 25 pages. In "Principal Manifolds for Data Visualization and
Dimension Reduction", A. Gorban, B. Kegl, D. Wunsch, and A. Zinovyev (eds),
Lecture Notes in Computational Science and Engineering, Springer, 2007, pp.
180--204.
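The local-partitioning flavor of localized PCA that the abstract reviews can be sketched compactly: partition the data (here with a few crude k-means iterations) and compute one leading principal direction per partition. This is an illustrative toy in plain NumPy (a bent two-arm data set and hand-picked initial centers, not the chapter's algorithms or data):

```python
import numpy as np

rng = np.random.default_rng(3)
# a bent ("complex") 2-D data set: two noisy, nearly orthogonal arms
n = 200
s = rng.uniform(0, 1, n)
arm1 = np.column_stack([s, 0.05 * rng.normal(size=n)])                 # along x
arm2 = np.column_stack([0.05 * rng.normal(size=n), s]) + [1.0, 0.0]    # along y
X = np.vstack([arm1, arm2])

def local_pca(X, centers, iters=20):
    # crude local partitioning: k-means assignment, then one leading
    # principal direction per partition
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2)
        lab = d.argmin(axis=1)
        centers = np.array([X[lab == k].mean(axis=0) for k in range(len(centers))])
    dirs = []
    for k in range(len(centers)):
        Xc = X[lab == k] - centers[k]
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        dirs.append(Vt[0])       # leading local principal component
    return centers, np.array(dirs), lab

centers0 = np.array([[0.2, 0.0], [1.0, 0.8]])   # one seed per arm
centers, dirs, lab = local_pca(X, centers0)
```

A single global PCA of X would average the two arm directions away; the local version recovers one direction per arm, which is the failure mode and remedy the abstract describes.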
A Note on Separable Nonlinear Least Squares Problem
A separable nonlinear least squares (SNLS) problem is a special class of
nonlinear least squares (NLS) problem whose objective function is a mixture of
linear and nonlinear functions. Such problems have many applications in
different areas, especially in operations research and computer science, and
are difficult to solve with the infinity-norm metric. In this short note on
the separable nonlinear least squares problem, we discuss the unseparated
scheme for NLS and propose an algorithm for solving the mixed
linear-nonlinear minimization problem, which results in solving a series of
separable least squares problems.
Comment: 3 pages; IEEE, 2011 International Conference on Future Computer
Sciences and Application (ICFCSA 2011), Jun. 18-19, 2011, Hong Kong
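The contrast between the unseparated scheme and the separable one can be shown on the smallest possible SNLS instance. The following sketch (an illustrative single-exponential toy, not taken from the note) fits the same model both ways: jointly over the linear and nonlinear parameters, and with the linear coefficient eliminated in closed form at each trial value of the nonlinear one:

```python
import numpy as np
from scipy.optimize import least_squares

# toy SNLS model: y(t) ~ c * exp(-a * t), linear in c, nonlinear in a
rng = np.random.default_rng(4)
t = np.linspace(0, 3, 60)
y = 1.5 * np.exp(-0.9 * t) + 0.02 * rng.normal(size=t.size)

# unseparated scheme: optimize (c, a) jointly
joint = least_squares(lambda p: p[0] * np.exp(-p[1] * t) - y, x0=[1.0, 1.0])

# separable scheme: eliminate c in closed form for each trial a
def reduced(a):
    phi = np.exp(-a[0] * t)
    c = (phi @ y) / (phi @ phi)   # 1-D linear least squares in closed form
    return c * phi - y

sep = least_squares(reduced, x0=[1.0])
```

Both schemes minimize the same residual, so they agree at the solution; the separable one searches a space of half the dimension, which is the payoff the SNLS literature is built on.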