16 research outputs found

    On the calculation of minimum variance estimators for unobservable dependent variables

    The determination of minimum variance estimators in an unusual context is considered. The problem arises from an attempt to perform a regression with an unobservable dependent variable. The required minimum variance estimator is shown to satisfy a linear system of equations where the coefficient matrix has a simple structure. Uniqueness of the estimator is established by determining necessary and sufficient conditions on the data which guarantee positive definiteness of this coefficient matrix. Numerical aspects of the method of computation are also briefly explored
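
    The abstract does not reproduce the linear system itself, but the computational pattern it describes is familiar. Below is a minimal sketch (with placeholder data, not the paper's estimator) of solving a symmetric system A x = b by Cholesky factorisation, which succeeds exactly when the coefficient matrix is positive definite.

        # Illustrative only: solve A x = b for symmetric positive definite A.
        # The Cholesky factorisation doubles as the positive definiteness test,
        # since numpy raises LinAlgError when A is not positive definite.
        import numpy as np

        def solve_spd(A, b):
            L = np.linalg.cholesky(A)       # A = L L^T, fails if A is not PD
            y = np.linalg.solve(L, b)       # solve L y = b
            return np.linalg.solve(L.T, y)  # solve L^T x = y

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # placeholder data
        b = np.array([1.0, 2.0])
        print(solve_spd(A, b))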

    Direct search methods for nonlinearly constrained optimization using filters and frames

    Abstract. A direct search method for nonlinear optimization problems with nonlinear inequality constraints is presented. A filter based approach is used, which allows infeasible starting points. The constraints are assumed to be continuously differentiable, and approximations to the constraint gradients are used. For simplicity it is assumed that the active constraint normals are linearly independent at all points of interest on the boundary of the feasible region. An infinite sequence of iterates is generated, some of which are surrounded by sets of points called bent frames. An infinite subsequence of these iterates is identified, and its convergence properties are studied by applying Clarke's non-smooth calculus to the bent frames. It is shown that each cluster point of this subsequence is a Karush-Kuhn-Tucker point of the optimization problem under mild conditions which include strict differentiability of the objective function at each cluster point. This permits the objective function to be non-smooth, infinite, or undefined away from these cluster points. When the objective function is only locally Lipschitz at these cluster points it is shown that certain directions still have interesting properties at these cluster points
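
    For readers unfamiliar with filter methods, the core acceptance rule can be stated in a few lines. The sketch below is a generic Fletcher-Leyffer style filter test on objective value f and constraint violation h, not the frame-based search of the paper; it shows how infeasible points can still be accepted provided no stored pair dominates them.

        # Generic filter acceptance test: a trial point (f, h) is acceptable
        # if no stored pair has both a smaller-or-equal objective and a
        # smaller-or-equal constraint violation.
        def acceptable(f, h, filter_pairs):
            return all(not (fi <= f and hi <= h) for fi, hi in filter_pairs)

        def add_to_filter(f, h, filter_pairs):
            # discard entries dominated by the new pair, then insert it
            kept = [(fi, hi) for fi, hi in filter_pairs if not (f <= fi and h <= hi)]
            kept.append((f, h))
            return kept

        filt = [(1.0, 0.5), (2.0, 0.1)]
        print(acceptable(1.5, 0.3, filt))   # True: neither entry dominates it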

    The management and outcome for patients with chronic subdural hematoma: a prospective, multicenter, observational cohort study in the United Kingdom

    Symptomatic chronic subdural hematoma (CSDH) will become an increasingly common presentation in neurosurgical practice as the population ages, but quality evidence is still lacking to guide the optimal management of these patients. The British Neurosurgical Trainee Research Collaborative (BNTRC) was established by neurosurgical trainees in 2012 to improve research by combining the efforts of trainees in each of the United Kingdom's (UK) and Ireland's neurosurgical units (NSUs). The authors present the first study by the BNTRC that describes current management and outcomes for patients with CSDH throughout the UK and Ireland. This provides a resource both for current clinical practice and future clinical research on CSDH

    Circle fitting by linear and nonlinear least squares

    The problem of determining the circle of best fit to a set of points in the plane (or the obvious generalisation to n dimensions) is easily formulated as a nonlinear total least squares problem which may be solved using a Gauss-Newton minimisation algorithm. This straightforward approach is shown to be inefficient and extremely sensitive to the presence of outliers. An alternative formulation allows the problem to be reduced to a linear least squares problem which is trivially solved. The recommended approach is shown to have the added advantage of being much less sensitive to outliers than the nonlinear least squares approach
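
    One standard linear reduction of this kind (often attributed to Kåsa; the paper's exact formulation may differ) rewrites (x-a)^2 + (y-b)^2 = r^2 as x^2 + y^2 = 2ax + 2by + c with c = r^2 - a^2 - b^2, which is linear in (a, b, c). A hedged sketch:

        # Algebraic circle fit by ordinary linear least squares; centre (a, b)
        # and radius r are recovered from the fitted coefficients.
        import numpy as np

        def fit_circle_linear(x, y):
            A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
            rhs = x**2 + y**2
            (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
            return a, b, np.sqrt(c + a**2 + b**2)

        theta = np.linspace(0.0, 2.0 * np.pi, 20)
        x = 3.0 + 2.0 * np.cos(theta) + 0.01 * np.random.randn(20)
        y = -1.0 + 2.0 * np.sin(theta) + 0.01 * np.random.randn(20)
        print(fit_circle_linear(x, y))   # approximately (3, -1, 2)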

    A conjugate direction implementation of the BFGS algorithm with automatic scaling

    A new implementation of the BFGS algorithm for unconstrained optimization is reported which utilizes a conjugate factorization of the approximating Hessian matrix. The implementation is especially useful when gradient information is estimated by finite difference formulae and it is well suited to machines which are able to exploit parallel processing
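
    One detail mentioned in the abstract is easy to illustrate: the extra function values needed for a forward-difference gradient are mutually independent, so they can be computed concurrently. The sketch below shows only that gradient-estimation step, not the paper's conjugate factorisation of the approximating Hessian.

        # Forward-difference gradient with the n perturbed evaluations done in
        # parallel; f is assumed safe to call from multiple threads.
        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def fd_grad_parallel(f, x, h=1e-6):
            fx = f(x)
            points = [x + h * e for e in np.eye(x.size)]
            with ThreadPoolExecutor() as pool:
                vals = list(pool.map(f, points))
            return (np.array(vals) - fx) / h

        f = lambda z: (z[0] - 1.0)**2 + 3.0 * z[1]**2
        print(fd_grad_parallel(f, np.array([0.0, 2.0])))   # roughly (-2, 12)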

    A modified BFGS formula maintaining positive definiteness with Armijo-Goldstein steplengths

    The line search subproblem in unconstrained optimization is concerned with finding an acceptable steplength satisfying certain standard conditions. The conditions proposed in the early work of Armijo and Goldstein are sometimes replaced by those recommended by Wolfe because these latter conditions automatically allow positive definiteness of some popular quasi-Newton updates to be maintained. It is shown that a slightly modified form of quasi-Newton update allows positive definiteness to be maintained even if line searches based on the Armijo-Goldstein conditions are used
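
    The issue can be made concrete. An Armijo-style backtracking search guarantees sufficient decrease but not the curvature condition s^T y > 0, so the plain BFGS update can lose positive definiteness. Powell's damping, sketched below, is one well-known remedy; it is not necessarily the modification proposed in the paper.

        # Armijo backtracking plus a damped BFGS update of the Hessian
        # approximation B; damping y toward B s keeps s^T y bounded away
        # from zero, so B stays positive definite.
        import numpy as np

        def armijo_step(f, x, p, g, c1=1e-4, alpha=1.0):
            # p is assumed to be a descent direction (g @ p < 0)
            while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
                alpha *= 0.5
            return alpha

        def damped_bfgs_update(B, s, y, theta_min=0.2):
            sBs = s @ B @ s
            if s @ y < theta_min * sBs:
                theta = (1.0 - theta_min) * sBs / (sBs - s @ y)
                y = theta * y + (1.0 - theta) * (B @ s)
            return B - np.outer(B @ s, B @ s) / sBs + np.outer(y, y) / (s @ y)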

    The rise and fall of the vector epsilon algorithm

    The performance of the vector epsilon algorithm is governed by two important mathematical theorems which are briefly reviewed in context. We note that the performance of the vector epsilon algorithm is inevitably qualitatively incorrect for sequences whose generating functions have poles near unity. This difficulty is avoided by the use of hybrid vector Padé approximants
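
    For context, Wynn's vector epsilon algorithm builds a table of vector columns using the Samelson inverse v / ||v||^2 in place of a scalar reciprocal; the even-numbered columns provide the accelerated estimates. The sketch below shows the basic recursion only, not the hybrid vector Padé approximants recommended in the abstract.

        # Vector epsilon algorithm: returns column 2k of the epsilon table.
        import numpy as np

        def vector_epsilon(seq, k=1):
            inv = lambda v: v / (v @ v)                        # Samelson inverse
            prev = [np.zeros_like(seq[0])] * (len(seq) + 1)    # column eps_{-1}
            curr = [np.asarray(s, dtype=float) for s in seq]   # column eps_0
            for _ in range(2 * k):
                curr, prev = ([prev[i + 1] + inv(curr[i + 1] - curr[i])
                               for i in range(len(curr) - 1)], curr)
            return curr

        # linearly convergent vector sequence with limit (1, 2)
        s = [np.array([1.0, 2.0]) + 0.5**n * np.array([1.0, -1.0]) for n in range(8)]
        print(vector_epsilon(s, k=1)[-1])   # very close to (1, 2)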

    A convergent variant of the Nelder-Mead algorithm

    The Nelder-Mead algorithm (1965) for unconstrained optimization has been used extensively to solve parameter estimation (and other) problems. Despite its age it is still the method of choice for many practitioners in the fields of statistics, engineering, and the physical and medical sciences because it is easy to code and very easy to use. It belongs to a class of methods which do not require derivatives and which are often claimed to be robust for problems with discontinuities or where the function values are noisy. Recently (1998) it has been shown that the method can fail to converge or converge to non-solutions on certain classes of problems. Only very limited convergence results exist for a restricted class of problems in one or two dimensions. In this paper, a provably convergent variant of the Nelder-Mead simplex method is presented and analysed. Numerical results are included to show that the modified algorithm is effective in practice
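
    The standard Nelder-Mead method (not the convergent variant analysed in the paper) is available off the shelf, which is part of why it remains so widely used. A minimal usage sketch with SciPy:

        # Minimise the Rosenbrock function with the classic Nelder-Mead simplex
        # method as implemented in SciPy.
        import numpy as np
        from scipy.optimize import minimize

        rosenbrock = lambda z: (1.0 - z[0])**2 + 100.0 * (z[1] - z[0]**2)**2
        res = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
        print(res.x)   # close to (1, 1)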

    A projected Lagrangian algorithm for semi-infinite programming

    A globally convergent algorithm is presented for the solution of a wide class of semi-infinite programming problems. The method is based on the solution of a sequence of equality constrained quadratic programming problems, and usually has a second order convergence rate. Numerical results illustrating the effectiveness of the method are given
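
    For readers outside the area, the problem class has a compact standard form: a semi-infinite program has finitely many variables but infinitely many constraints, indexed by a (typically continuous) set T,

        \min_{x \in \mathbb{R}^n} f(x)
        \quad \text{subject to} \quad
        g(x, t) \le 0 \ \text{ for all } t \in T .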