
    OPTIMIZATION FOR STRUCTURAL EQUATION MODELING: APPLICATIONS TO SUBSTANCE USE DISORDERS

    Substance abuse is a serious issue in both modern and traditional societies. Besides health complications such as depression, cancer, and HIV, social complications such as loss of concentration, loss of employment, and legal problems are among the numerous hazards that substance use disorders impose on society. Understanding the causes of substance abuse and preventing its negative effects continue to be the focus of much research. Substance use behaviors, symptoms, and signs are usually measured in the form of ordinal data, which are often modeled under threshold models in Structural Equation Modeling (SEM). In this dissertation, we have developed a general nonlinear optimizer for the software package OpenMx, an SEM package in widespread use in the fields of psychology and genetics. The optimizer solves nonlinearly constrained optimization problems using a Sequential Quadratic Programming (SQP) algorithm. We tested the performance of our optimizer on ordinal data and compared the results with two other SQP-based optimizers available in the OpenMx package. While all three optimizers reach the same minimum, our new optimizer is faster than the other two. We then applied OpenMx with our optimization engine to a very large population-based drug abuse dataset, collected in Sweden from over one million pairs, to investigate the effects of genetic and environmental factors on liability to drug use. Finally, we investigated the reasons behind the better performance of our optimizer by profiling all three optimizers and analyzing their memory consumption. We found that objective function evaluation is the most expensive task for all three optimizers, and that our optimizer needs fewer calls to this function to find the minimum. In terms of memory consumption, the three optimizers use the same amount of memory.
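
    The dissertation's optimizer lives inside OpenMx (an R package), but the underlying idea, minimizing a fit function under nonlinear constraints with SQP, can be illustrated with SciPy's SLSQP, another SQP implementation. A minimal sketch, assuming a toy objective and constraint that merely stand in for an SEM fit function; nothing below is taken from the dissertation itself:

        # Toy nonlinearly constrained minimization via SciPy's SLSQP,
        # an SQP implementation analogous to the optimizers compared in
        # the dissertation. Objective and constraint are illustrative.
        import numpy as np
        from scipy.optimize import minimize

        def objective(x):
            # Stand-in for a model fit function (e.g., -2 log-likelihood).
            return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

        def constraint(x):
            # SciPy expects inequality constraints in the form c(x) >= 0.
            return x[0] ** 2 + x[1] ** 2 - 1.0

        result = minimize(
            objective,
            x0=np.array([2.0, 0.0]),
            method="SLSQP",
            constraints=[{"type": "ineq", "fun": constraint}],
        )
        # result.nfev counts objective evaluations -- the cost the
        # dissertation's profiling identifies as dominant.
        print(result.x, result.fun, result.nfev)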

    On the convergence of mirror descent beyond stochastic convex programming

    In this paper, we examine the convergence of mirror descent in a class of stochastic optimization problems that are not necessarily convex (or even quasi-convex), and which we call variationally coherent. Since the standard technique of "ergodic averaging" offers no tangible benefits beyond convex programming, we focus directly on the algorithm's last generated sample (its "last iterate"), and we show that it converges with probability 1 if the underlying problem is coherent. We further consider a localized version of variational coherence which ensures local convergence of stochastic mirror descent (SMD) with high probability. These results contribute to the landscape of non-convex stochastic optimization by showing that (quasi-)convexity is not essential for convergence to a global minimum: rather, variational coherence, a much weaker requirement, suffices. Finally, building on the above, we reveal an interesting insight regarding the convergence speed of SMD: in problems with sharp minima (such as generic linear programs or concave minimization problems), SMD reaches a minimum point in a finite number of steps (a.s.), even in the presence of persistent gradient noise. This result is to be contrasted with existing black-box convergence rate estimates that are only asymptotic.
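
    Since the update rule itself is simple, a sketch may help: with the entropic mirror map on the probability simplex, the SMD step reduces to an exponentiated-gradient update followed by renormalization. The quadratic objective and Gaussian gradient noise below are illustrative assumptions, not the setting analyzed in the paper:

        # Stochastic mirror descent on the simplex with the entropic
        # mirror map; the Bregman step becomes a multiplicative update.
        # The objective and noise model are illustrative only.
        import numpy as np

        rng = np.random.default_rng(0)
        d = 5
        target = rng.dirichlet(np.ones(d))   # minimizer of f below

        def noisy_gradient(x):
            # Gradient of f(x) = 0.5 * ||x - target||^2, plus noise.
            return (x - target) + 0.1 * rng.standard_normal(d)

        x = np.full(d, 1.0 / d)              # uniform initial point
        for t in range(1, 2001):
            eta = 0.5 / np.sqrt(t)           # decreasing step size
            x = x * np.exp(-eta * noisy_gradient(x))  # entropic mirror step
            x /= x.sum()                     # renormalize onto the simplex

        # The paper's guarantees concern this last iterate, not an average.
        print(np.linalg.norm(x - target))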

    An Alternating Trust Region Algorithm for Distributed Linearly Constrained Nonlinear Programs, Application to the AC Optimal Power Flow

    A novel trust region method for solving linearly constrained nonlinear programs is presented. The proposed technique is amenable to a distributed implementation, as its salient ingredient is an alternating projected gradient sweep in place of the Cauchy point computation. It is proven that the algorithm yields a sequence that globally converges to a critical point. As a result of some changes to the standard trust region method, namely a proximal regularisation of the trust region subproblem, it is shown that the local convergence rate is linear with an arbitrarily small ratio. Thus, convergence is locally almost superlinear under standard regularity assumptions. The proposed method is successfully applied to compute local solutions to alternating current optimal power flow problems in transmission and distribution networks. Moreover, the new mechanism for computing a Cauchy point compares favourably against the standard projected search in terms of its activity detection properties.
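
    To see where the paper's modification slots in, the sketch below runs a textbook trust region loop whose step is the classical Cauchy point; the paper replaces exactly this computation with an alternating projected gradient sweep to make the method distributable. The quadratic test problem is an illustrative assumption, not taken from the paper:

        # Basic trust region iteration with the classical Cauchy point
        # (model-minimizing steepest-descent step clipped to the radius).
        # The paper swaps this step for an alternating projected gradient
        # sweep; this sketch only shows the surrounding mechanics.
        import numpy as np

        H = np.array([[3.0, 0.5], [0.5, 1.0]])   # illustrative quadratic data
        b = np.array([1.0, -2.0])

        def f(x):
            return 0.5 * x @ H @ x - b @ x

        x, delta = np.zeros(2), 1.0              # iterate and radius
        for _ in range(50):
            g = H @ x - b
            gn = np.linalg.norm(g)
            if gn < 1e-10:
                break
            gHg = g @ H @ g
            tau = 1.0 if gHg <= 0 else min(1.0, gn**3 / (delta * gHg))
            step = -(tau * delta / gn) * g       # Cauchy point
            predicted = -(g @ step + 0.5 * step @ H @ step)
            rho = (f(x) - f(x + step)) / predicted  # model agreement ratio
            if rho > 0.25:
                x = x + step                     # accept the step
            if rho > 0.75:
                delta *= 2.0                     # expand the region
            elif rho < 0.25:
                delta *= 0.5                     # shrink the region

        print(x, np.linalg.solve(H, b))          # compare to exact minimizer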

    Parameter Selection and Pre-Conditioning for a Graph Form Solver

    In a recent paper, Parikh and Boyd describe a method for solving a convex optimization problem, where each iteration involves evaluating a proximal operator and projecting onto a subspace. In this paper we address the critical practical issues of how to select the proximal parameter in each iteration, and how to scale the original problem variables, so as to achieve reliable practical performance. The resulting method has been implemented as an open-source software package called POGS (Proximal Graph Solver), which targets multi-core and GPU-based systems and has been tested on a wide variety of practical problems. Numerical results show that POGS can solve very large problems (with, say, more than a billion coefficients in the data) to modest accuracy in a few tens of seconds. As just one example, a radiation treatment planning problem with around 100 million coefficients in the data can be solved in a few seconds, compared to around one hour with an interior-point method.
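
    The iteration being tuned is easy to state: split the graph form problem, minimize f(y) + g(x) subject to y = Ax, into proximal steps on f and g plus a projection onto the graph of A. A minimal sketch in the style of such a solver, on a toy lasso instance with the proximal parameter rho held fixed (selecting and rescaling it per iteration is precisely the paper's contribution); data and sizes are illustrative:

        # Graph-form ADMM sketch on a toy lasso problem:
        # minimize 0.5*||y - b||^2 + lam*||x||_1  subject to  y = A x.
        # rho is fixed here; choosing it well is the paper's topic.
        import numpy as np

        rng = np.random.default_rng(1)
        m, n = 30, 10
        A = rng.standard_normal((m, n))
        b = rng.standard_normal(m)
        lam, rho = 0.5, 1.0

        # Cached factor for projecting (c, d) onto {(x, y) : y = A x}:
        # x = (I + A^T A)^{-1} (c + A^T d), then y = A x.
        K = np.linalg.inv(np.eye(n) + A.T @ A)

        def soft_threshold(v, k):
            return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

        x, xt = np.zeros(n), np.zeros(n)
        y, yt = np.zeros(m), np.zeros(m)
        for _ in range(300):
            x_half = soft_threshold(x - xt, lam / rho)    # prox of lam*||.||_1
            y_half = (rho * (y - yt) + b) / (1.0 + rho)   # prox of 0.5*||.-b||^2
            x = K @ ((x_half + xt) + A.T @ (y_half + yt)) # graph projection
            y = A @ x
            xt += x_half - x                              # dual updates
            yt += y_half - y

        print(x)  # approximately sparse solution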