
    Positive trigonometric polynomials for strong stability of difference equations

    We follow a polynomial approach to analyse strong stability of linear difference equations with rationally independent delays. Applying the Hermite stability criterion to the discrete-time homogeneous characteristic polynomial reduces the assessment of strong stability to deciding positive definiteness of a multivariate trigonometric polynomial matrix. The latter problem is addressed with a converging hierarchy of linear matrix inequalities (LMIs). Numerical experiments indicate that certificates of strong stability can be obtained at a reasonable computational cost for state dimensions and numbers of delays not exceeding 4 or 5.
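    The LMI hierarchy itself requires a semidefinite-programming solver, but the object it certifies is easy to probe numerically. The sketch below (function names are illustrative, not from the paper; plain NumPy) evaluates a Hermitian trigonometric polynomial matrix on a grid of the torus. This gives only a necessary condition for positive definiteness: grid sampling can refute positivity, but, unlike the paper's LMI certificates, it cannot prove it.

```python
import numpy as np

def trig_poly_matrix(theta, coeffs):
    """Evaluate P(theta) = sum_k C_k * exp(1j * <k, theta>) at the point
    theta, where `coeffs` maps integer multi-indices k (tuples) to
    matrices C_k with C_{-k} = C_k^H, so that P(theta) is Hermitian."""
    n = next(iter(coeffs.values())).shape[0]
    P = np.zeros((n, n), dtype=complex)
    for k, C in coeffs.items():
        P = P + C * np.exp(1j * np.dot(k, theta))
    return P

def grid_pd_check(coeffs, num=50):
    """Necessary condition for P(theta) > 0 on the whole torus: check
    positive definiteness on a uniform grid. Sampling can only refute
    positivity; a certificate (e.g. the LMI hierarchy) is needed to
    prove it."""
    d = len(next(iter(coeffs)))            # number of frequency variables
    grid = np.linspace(0.0, 2 * np.pi, num, endpoint=False)
    for idx in np.ndindex(*([num] * d)):
        theta = grid[list(idx)]
        if np.linalg.eigvalsh(trig_poly_matrix(theta, coeffs)).min() <= 0:
            return False                   # found a point refuting positivity
    return True
```

    For example, the univariate matrix polynomial P(theta) = (2 + cos theta) * I passes the grid check, while P(theta) = (1 + 2 cos theta) * I fails it, since the latter is indefinite near theta = pi.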

    An infeasible interior-point method for the P_*-matrix linear complementarity problem based on a trigonometric kernel function with full-Newton step

    An infeasible interior-point algorithm for solving the P_*-matrix linear complementarity problem based on a kernel function with a trigonometric barrier term is analyzed. Each (main) iteration of the algorithm consists of a feasibility step and several centrality steps, where the feasibility step is induced by a trigonometric kernel function. The complexity result coincides with the best known result for infeasible interior-point methods for the P_*-matrix linear complementarity problem.

    Novel Representations of Semialgebraic Sets Arising in Planning and Control

    The mathematical notion of a set arises frequently in planning and control of autonomous systems. A common challenge is how best to represent a given set in a manner that is efficient, accurate, and amenable to the computational tools of interest. For example, ensuring a vehicle does not collide with an obstacle can be posed generically in multiple ways using techniques from optimization or computational geometry. However, these representations generally rely on executing algorithms rather than evaluating closed-form expressions. This presents an issue when we wish to represent an obstacle avoidance condition within a larger motion planning problem solved by nonlinear optimization, since such tools generally accept only smooth, closed-form expressions. As such, our available representations of obstacle avoidance conditions, while accurate, are not amenable to the relevant tools. A related problem is how to represent a set in a compact form without sacrificing accuracy. For example, we may be presented with point-cloud data representing the boundary of an object that our vehicle must avoid. Using the obstacle avoidance conditions directly on the point-cloud data would require performing these calculations with respect to each point individually. A more efficient approach is to first approximate the data with simple geometric shapes and perform later analysis with the approximation. Common shapes include bounding boxes, ellipsoids, and superquadrics. These shapes are convenient in that they have a compact representation, and we have good heuristic objectives for fitting the data. However, their primitive nature means accuracy of representation may suffer; most notably, their inherent symmetry makes them ill-suited for representing asymmetric shapes. In theory we could consider more complicated shapes given by an implicit function, but we lack reliable methods for ensuring a good fit.
    This thesis proposes novel approaches to these problems based on tools from convex optimization and convex analysis. Throughout, the sets of interest are described by polynomial inequalities, making them semialgebraic.
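    As a concrete instance of the kind of smooth, closed-form condition the abstract calls for, the sketch below (illustrative only, not taken from the thesis) encodes avoidance of a single ellipsoidal obstacle as one polynomial inequality. The limitation noted above applies: an ellipsoid is symmetric, so this primitive cannot capture asymmetric shapes.

```python
import numpy as np

def ellipsoid_clearance(x, center, Q):
    """Signed clearance from the ellipsoidal obstacle
    {p : (p - center)^T Q (p - center) <= 1}, with Q symmetric positive
    definite. Returns a smooth, closed-form quantity that is positive
    outside the obstacle and negative inside, so the inequality
    ellipsoid_clearance(x, c, Q) >= 0 can be handed directly to a
    nonlinear-programming solver as a constraint on x."""
    d = np.asarray(x, dtype=float) - center
    return float(d @ Q @ d - 1.0)
```

    For the unit disk (center at the origin, Q the identity), a point at (3, 0) has clearance 8 and a point at (0.5, 0) has clearance -0.75, so the constraint correctly separates exterior from interior points.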

    Two dimensional search algorithms for linear programming

    Linear programming is one of the most important classes of optimization problems. These mathematical models have been used by academics and practitioners to solve numerous real world applications. Quickly solving linear programs impacts decision makers from both the public and private sectors. Substantial research has been performed to solve this class of problems faster, and the vast majority of the solution techniques can be categorized as one dimensional search algorithms. That is, these methods successively move from one solution to another by solving a one dimensional subspace linear program at each iteration. This dissertation proposes novel algorithms that move between solutions by repeatedly solving a two dimensional subspace linear program. Computational experiments demonstrate the potential of these newly developed algorithms and show an average improvement of nearly 25% in solution time when compared to the corresponding one dimensional search version. The core concept of this dissertation's research is a fast technique to determine an optimal basis and an optimal solution to linear programs with only two variables. This method, called the slope algorithm, compares the slope formed by the objective function with the slope formed by each constraint to determine a pair of constraints that intersect at an optimal basis and an optimal solution. The slope algorithm is implemented within a simplex framework to perform two dimensional searches, resulting in the double pivot simplex method. Unlike the well-known simplex method, the double pivot simplex method simultaneously pivots up to two basic variables with two nonbasic variables at each iteration. The theoretical computational complexity of the double pivot simplex method is identical to that of the simplex method.
Computational results show that this new algorithm reduces the number of pivots to solve benchmark instances by approximately 40% when compared to the classical implementation of the simplex method, and 20% when compared to the primal simplex implementation of CPLEX, a high performance mathematical programming solver. Solution times of some random linear programs are also improved by nearly 25% on average. This dissertation also presents a novel technique, called the ratio algorithm, to find an optimal basis and an optimal solution to linear programs with only two constraints. When the ratio algorithm is implemented within a simplex framework to perform two dimensional searches, it results in the double pivot dual simplex method. In this case, the double pivot dual simplex method behaves similarly to the dual simplex method, but two variables are exchanged at every step. Two dimensional searches are also implemented within an interior point framework. This dissertation creates a set of four two dimensional search interior point algorithms derived from primal and dual affine scaling and logarithmic barrier search directions. Each iteration of these techniques quickly solves a two dimensional subspace linear program formed by the intersection of two search directions and the feasible region of the linear program. Search directions are derived by orthogonally partitioning the objective function vector, which allows these novel methods to improve the objective function value at each step by at least as much as the corresponding one dimensional search version. Computational experiments performed on benchmark linear programs demonstrate that these two dimensional search interior point algorithms improve the average solution time by approximately 12% and the average number of iterations by 15%. In conclusion, this dissertation provides a change of paradigm in linear programming optimization algorithms. 
    Implementing two dimensional searches within both a simplex and an interior point framework typically reduces the computational time and the number of iterations needed to solve linear programs. Furthermore, this dissertation sets the stage for future research on multidimensional search algorithms to solve not only linear programs but also other critical classes of optimization problems. Consequently, this dissertation's research can become one of the first steps toward changing how commercial and open source mathematical programming software solves optimization problems.
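    To make the two-variable subproblem concrete, the sketch below solves it by brute-force enumeration of constraint-pair intersections (function names are illustrative, not from the dissertation). The slope algorithm reaches the same optimal pair far more cheaply, by comparing the objective's slope with each constraint's slope instead of enumerating all pairs.

```python
import itertools
import numpy as np

def two_var_lp_bruteforce(c, A, b, tol=1e-9):
    """Solve max c.x subject to A x <= b with x in R^2 by enumerating all
    constraint-pair intersections and keeping the best feasible one.
    Returns (x, value, basis), where `basis` is the pair of constraint
    indices active at the optimum. This O(m^2) enumeration is only a
    baseline; the slope algorithm identifies the optimal pair directly
    via slope comparisons."""
    c, A, b = np.asarray(c, float), np.asarray(A, float), np.asarray(b, float)
    best_x, best_val, best_basis = None, -np.inf, None
    for i, j in itertools.combinations(range(len(b)), 2):
        M = A[[i, j]]
        if abs(np.linalg.det(M)) < tol:
            continue                       # parallel constraints: no vertex
        x = np.linalg.solve(M, b[[i, j]])
        if np.all(A @ x <= b + tol):       # intersection is a feasible vertex
            val = float(c @ x)
            if val > best_val:
                best_x, best_val, best_basis = x, val, (i, j)
    return best_x, best_val, best_basis
```

    For example, maximizing x1 + x2 over the box 0 <= x1 <= 2, 0 <= x2 <= 3 returns the vertex (2, 3) with value 5, with the two upper-bound constraints forming the optimal basis.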

    On Vibration Analysis and Reduction for Damped Linear Systems

