
    A Partially Feasible Distributed SQO Method for Two-block General Linearly Constrained Smooth Optimization

    This paper discusses a class of two-block smooth large-scale optimization problems with both linear equality and linear inequality constraints, which have a wide range of applications, such as economic power dispatch, data mining, and signal processing. Our goal is to develop a novel partially feasible distributed (PFD) sequential quadratic optimization (SQO) method (PFD-SQO method) for this class of problems. The design of the method is based on the ideas of the SQO method, an augmented Lagrangian Jacobian splitting scheme, and the feasible direction method, and it decomposes the quadratic optimization (QO) subproblem into two small-scale QOs that can be solved independently and in parallel. A novel disturbance contraction term that can be suitably adjusted is introduced into the inequality constraints so that the feasible step size along the search direction can be increased to 1. New iterates are generated by an Armijo line search, with the partially augmented Lagrangian function containing only the equality constraints serving as the merit function; the iterates always satisfy all the inequality constraints of the problem. The theoretical properties of the proposed PFD-SQO method, including global convergence, iteration complexity, and superlinear and quadratic rates of convergence, are analyzed under appropriate assumptions. Finally, the numerical effectiveness of the method is tested on a class of academic examples and an economic power dispatch problem, which shows that the proposed method is quite promising.
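
    As a rough illustration of the line-search ingredient the abstract describes, here is a minimal Python sketch of an Armijo backtracking step measured by a partially augmented Lagrangian merit function that penalizes only the equality constraints. The functions merit and armijo_step and the toy problem data are hypothetical stand-ins; this is not the paper's PFD-SQO method, which additionally splits the QO subproblem across two blocks and handles the inequality constraints via the disturbance contraction term.

    import numpy as np

    def merit(x, f, A, b, lam, rho):
        # Partially augmented Lagrangian: only the equality constraints
        # A x = b enter the penalty; the inequality constraints are kept
        # feasible by the method itself and never appear here.
        r = A @ x - b
        return f(x) + lam @ r + 0.5 * rho * (r @ r)

    def armijo_step(x, d, f, grad_merit, A, b, lam, rho,
                    beta=0.5, sigma=1e-4, max_backtracks=30):
        # Backtrack t = 1, beta, beta^2, ... until the Armijo condition
        # merit(x + t d) <= merit(x) + sigma t <grad merit(x), d> holds.
        phi0 = merit(x, f, A, b, lam, rho)
        slope = grad_merit(x) @ d  # assumed negative (descent direction)
        t = 1.0
        for _ in range(max_backtracks):
            if merit(x + t * d, f, A, b, lam, rho) <= phi0 + sigma * t * slope:
                break
            t *= beta
        return t

    # Toy usage: minimize ||x||^2 subject to x1 + x2 = 1.
    A = np.array([[1.0, 1.0]]); b = np.array([1.0])
    lam = np.zeros(1); rho = 10.0
    f = lambda x: x @ x
    grad_merit = lambda x: 2 * x + A.T @ lam + rho * A.T @ (A @ x - b)
    x = np.array([2.0, -2.0])
    d = -grad_merit(x)  # steepest descent on the merit function
    t = armijo_step(x, d, f, grad_merit, A, b, lam, rho)
    print("Armijo step size:", t, "new iterate:", x + t * d)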

    A feasible sequential linear equation method for inequality constrained optimization


    Retraction-based first-order feasible methods for difference-of-convex programs with smooth inequality and simple geometric constraints

    In this paper, we propose first-order feasible methods for difference-of-convex (DC) programs with smooth inequality and simple geometric constraints. Our strategy for maintaining feasibility of the iterates is based on a "retraction" idea adapted from the manifold optimization literature. When the constraints are convex, we establish the global subsequential convergence of the sequence generated by our algorithm under a strict feasibility condition, and analyze its convergence rate when the objective is, in addition, convex, according to the Kurdyka-Lojasiewicz (KL) exponent of the extended objective (i.e., the sum of the objective and the indicator function of the constraint set). We also show that the extended objective of a large class of Euclidean-norm (and, more generally, group LASSO penalty) regularized convex optimization problems is a KL function with exponent 1/2; consequently, our algorithm is locally linearly convergent when applied to these problems. We then extend our method to solve DC programs with a single specially structured nonconvex constraint. Finally, we discuss how our algorithms can be applied to two concrete optimization problems, namely group-structured compressed sensing with Gaussian measurement noise and compressed sensing with Cauchy measurement noise, and illustrate the empirical performance of our algorithms.
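
    To give a flavor of the retraction idea in the simplest possible setting, the Python sketch below runs one first-order feasible iteration for a DC objective f = g - h on the unit ball, using Euclidean projection onto the ball as a simple stand-in for the retraction. The step size, the projection-based retraction, and the toy functions g and h are illustrative assumptions, not the paper's algorithm; the point is only that every iterate stays feasible because each trial point is retracted back onto the constraint set.

    import numpy as np

    def retract_to_ball(y, radius=1.0):
        # Euclidean projection onto {x : ||x|| <= radius}; here it plays
        # the role of a retraction, pulling a trial point back onto the
        # feasible set after a first-order step.
        n = np.linalg.norm(y)
        return y if n <= radius else (radius / n) * y

    def dc_feasible_step(x, grad_g, grad_h, step=0.1):
        # One iteration for f = g - h: linearize the concave part -h at x
        # (the standard DC trick), take a gradient step on the resulting
        # convex surrogate, and retract back to the feasible set.
        direction = -(grad_g(x) - grad_h(x))
        return retract_to_ball(x + step * direction)

    # Toy DC objective: g(x) = ||x - a||^2 and h(x) = 0.5 ||x||^2 are both
    # convex, so f = g - h is a difference-of-convex function; the unit
    # ball serves as the "simple geometric constraint".
    a = np.array([2.0, 0.0])
    grad_g = lambda x: 2 * (x - a)
    grad_h = lambda x: x
    x = np.zeros(2)  # feasible starting point
    for _ in range(50):
        x = dc_feasible_step(x, grad_g, grad_h)
    print("approximate solution:", x)  # feasible at every iteration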