Improving non-linear fits
In these notes we describe an algorithm for non-linear fitting which
incorporates some of the features of linear least squares into a general
minimization, and we provide a pure Python implementation of the algorithm.
It combines the variable projection method (VarPro) with a Newton
optimizer, stabilized by steepest descent with an adaptive step.
The algorithm includes a term to account for Bayesian priors. We performed
tests of the algorithm using simulated data. This method is suitable, for
example, for fitting sums of exponentials, as is often needed in Lattice
Quantum Chromodynamics.
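As a minimal illustration of the variable projection idea this abstract describes, the sketch below fits a sum of two exponentials: for fixed decay rates the amplitudes enter linearly and are eliminated by linear least squares, so only the rates are optimized in the outer loop. The synthetic data, starting point, and the off-the-shelf Nelder-Mead optimizer (standing in for the notes' Newton/steepest-descent scheme) are illustrative assumptions, not taken from the notes.

```python
import numpy as np
from scipy.optimize import minimize

def varpro_residual(b, t, y):
    """Projected objective: solve the linear subproblem in closed form.

    For fixed decay rates b, the model y ~ Phi(b) @ a is linear in the
    amplitudes a, so a is eliminated via linear least squares (the
    VarPro idea); only the rates remain for the outer optimizer."""
    Phi = np.exp(-np.outer(t, b))                 # one column per exponential
    a, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # closed-form amplitudes
    r = y - Phi @ a
    return 0.5 * r @ r, a

def fit_sum_of_exponentials(t, y, b0):
    """Minimize the projected objective over the nonlinear rates only."""
    res = minimize(lambda b: varpro_residual(b, t, y)[0], b0,
                   method="Nelder-Mead")
    _, a = varpro_residual(res.x, t, y)
    return a, res.x

# Synthetic two-exponential data (hypothetical example, not from the notes).
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 200)
y = (2.0 * np.exp(-0.5 * t) + 1.0 * np.exp(-3.0 * t)
     + 1e-4 * rng.standard_normal(t.size))
a, b = fit_sum_of_exponentials(t, y, b0=np.array([0.3, 2.0]))
```

The outer search is only two-dimensional here, which is exactly the payoff of projecting out the linear amplitudes.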
Estimating Nuisance Parameters in Inverse Problems
Many inverse problems include nuisance parameters which, while not of direct
interest, are required to recover primary parameters. Structure present in
these problems allows efficient optimization strategies; a well-known example
is variable projection, where nonlinear least squares problems that are linear
in some parameters can be optimized very efficiently. In this paper, we extend
the idea of projecting out a subset of the variables to a broad class of
maximum likelihood (ML) and maximum a posteriori (MAP) problems with
nuisance parameters, such as variance or degrees of freedom. As a result, we
are able to incorporate nuisance parameter estimation into large-scale
constrained and unconstrained inverse problem formulations. We apply the
approach to a variety of problems, including estimation of unknown variance
parameters in the Gaussian model, degree of freedom (d.o.f.) parameter
estimation in the context of robust inverse problems, automatic calibration,
and optimal experimental design. Using numerical examples, we demonstrate
improvement in recovery of primary parameters for several large-scale inverse
problems. The proposed approach is compatible with a wide variety of algorithms
and formulations, and its implementation requires only minor modifications to
existing algorithms. (Comment: 16 pages, 5 figures)
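The variance case mentioned above can be sketched concretely: for any fixed regression weights, the ML estimate of the Gaussian noise variance has a closed form, and substituting it back yields a concentrated objective in the primary parameters alone. The linear model and data below are hypothetical, chosen only to check that the concentrated objective is minimized by the ordinary least squares solution.

```python
import numpy as np

def concentrated_nll(w, X, y):
    """Negative log-likelihood with the variance nuisance parameter
    projected out: for fixed w the ML variance is ||r||^2 / n, and
    substituting it back leaves (n/2) * log(||r||^2 / n) + const."""
    r = y - X @ w
    n = y.size
    return 0.5 * n * np.log(r @ r / n)

# Hypothetical linear-Gaussian data (not from the paper): the concentrated
# objective should be minimized at the ordinary least squares solution.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(50)
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2_hat = np.sum((y - X @ w_ols) ** 2) / y.size  # projected-out variance
```

Because the log is monotone, minimizing the concentrated objective over w is equivalent to ordinary least squares, while the variance estimate falls out for free.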
Projective Bundle Adjustment from Arbitrary Initialization Using the Variable Projection Method
Bundle adjustment is used in structure-from-motion pipelines as the final refinement stage, and it requires a sufficiently good initialization to reach a useful local minimum. Starting from an arbitrary initialization, it almost always gets trapped in a poor minimum. In this work we aim to obtain an initialization-free approach that returns global minima from a large proportion of purely random starting points. Our key inspiration lies in the success of the Variable Projection (VarPro) method for affine factorization problems, where it has a close to 100% chance of reaching a global minimum from random initialization. We find empirically that this desirable behaviour does not carry over directly to the projective case, and we consequently design and evaluate strategies to overcome this limitation. Also, by unifying the affine and the projective camera settings, we obtain numerically better conditioned reformulations of the original bundle adjustment algorithms.
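A toy version of the VarPro factorization setup the abstract credits with near-100% success can be sketched as follows: the linear factor V is eliminated in closed form inside the objective, and a generic quasi-Newton optimizer runs over U from a random start. The matrix sizes, the rank, and the use of L-BFGS-B are illustrative assumptions; the paper's actual solver and its projective extension are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def reduced_cost(u_flat, W, m, r):
    """VarPro-reduced objective for a factorization W ~ U V: the linear
    factor V is solved in closed form for each U, so the outer optimizer
    only sees the nonlinear variables U."""
    U = u_flat.reshape(m, r)
    V, *_ = np.linalg.lstsq(U, W, rcond=None)  # inner linear solve
    R = W - U @ V
    return 0.5 * np.sum(R * R)

# Hypothetical rank-2 test matrix; purely random starting point, as in the
# random-initialization setting the abstract discusses.
rng = np.random.default_rng(2)
m, n, r = 10, 8, 2
W = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
u0 = rng.standard_normal(m * r)
res = minimize(reduced_cost, u0, args=(W, m, r), method="L-BFGS-B",
               options={"maxiter": 2000})
```

For this fully observed affine-style problem the reduced objective has no spurious local minima, which is consistent with the high success rate from random starts that the abstract reports.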
Separable nonlinear least squares fitting with linear bound constraints and its application in magnetic resonance spectroscopy data quantification
An application in magnetic resonance spectroscopy quantification models a signal as a linear combination of nonlinear functions. It leads to a separable nonlinear least squares fitting problem, with linear bound constraints on some variables. The variable projection (VARPRO) technique can be applied to this problem, but needs to be adapted in several respects. If only the nonlinear variables are subject to constraints, then the Levenberg–Marquardt minimization algorithm that is classically used by the VARPRO method should be replaced with a version that can incorporate those constraints. If some of the linear variables are also constrained, then they cannot be projected out via a closed-form expression as is the case for the classical VARPRO technique. We show how quadratic programming problems can be solved instead, and we provide details on efficient function and approximate Jacobian evaluations for the inequality constrained VARPRO method.
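A minimal sketch of the inequality-constrained inner step described above: with the nonlinear variable fixed, the bound-constrained linear subproblem is a small quadratic program, solved here with SciPy's bounded least squares. The single-exponential model and the nonnegativity bound on the amplitude are illustrative assumptions, not the paper's magnetic resonance spectroscopy model.

```python
import numpy as np
from scipy.optimize import lsq_linear, minimize_scalar

def projected_objective(b, t, y, lb, ub):
    """Inner step of an inequality-constrained VarPro sketch: with the
    nonlinear rate b fixed, the bounded linear subproblem
        min ||Phi(b) a - y||^2  s.t.  lb <= a <= ub
    is a small quadratic program, solved with bounded least squares."""
    Phi = np.exp(-np.outer(t, np.atleast_1d(b)))
    sol = lsq_linear(Phi, y, bounds=(lb, ub))
    return sol.cost, sol.x

# Toy noiseless single-exponential signal with a nonnegative amplitude.
t = np.linspace(0, 4, 100)
y = 1.5 * np.exp(-1.2 * t)
res = minimize_scalar(lambda b: projected_objective(b, t, y, 0.0, np.inf)[0],
                      bounds=(0.1, 5.0), method="bounded")
cost, a = projected_objective(res.x, t, y, 0.0, np.inf)
```

Unlike classical VARPRO, the constrained linear variables have no closed-form elimination, so each outer evaluation pays for one small QP solve, mirroring the adaptation the abstract describes.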
Structured least squares problems and robust estimators
A novel approach is proposed to provide robust and
accurate estimates for linear regression problems when both the
measurement vector and the coefficient matrix are structured and
subject to errors or uncertainty. A new analytic formulation is developed
in terms of the gradient flow of the residual norm to analyze
and provide estimates to the regression. The presented analysis
enables us to establish theoretical performance guarantees to compare
with existing methods and also offers a criterion to choose the
regularization parameter autonomously. Theoretical results and
simulations in applications such as blind identification, multiple
frequency estimation and deconvolution show that the proposed
technique outperforms alternative methods in mean-squared error
for a significant range of signal-to-noise ratio values.