
    Structural reliability under uncertainty in moments: distributionally-robust reliability-based design optimization

    This paper considers structural optimization under a reliability constraint where the input distribution is only partially known. Specifically, when we know only that the expected value vector and the variance-covariance matrix of the input distribution belong to a given convex set, we require that, for every distribution consistent with this information, the failure probability of the structure be no greater than a specified target value. We show that this distributionally-robust reliability constraint can be reduced equivalently to deterministic constraints. Using this reduction, a reliability-based design optimization problem under the distributionally-robust reliability constraint can be treated within the framework of deterministic optimization, specifically nonlinear semidefinite programming. Two numerical examples are solved to illustrate the relation between the optimal value and either the target reliability or the uncertainty magnitude.
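    As a rough illustration of this constraint (with assumed notation not taken from the paper: design vector x, uncertain input \xi, limit-state function g with failure meaning g(x,\xi) \le 0, moment uncertainty set \mathcal{U}, and target failure probability \bar{p}), the distributionally-robust reliability constraint can be written as

        \sup_{P \,:\, (\mathbb{E}_P[\xi],\, \mathrm{Var}_P[\xi]) \in \mathcal{U}} \; \Pr\nolimits_P\!\left[ g(x,\xi) \le 0 \right] \;\le\; \bar{p},

    and the reduction described in the abstract replaces this worst-case probabilistic constraint by deterministic constraints on x, leading to a nonlinear semidefinite programming problem.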

    A mixed 0-1 programming approach to topology-finding of tensegrity structures

    In this paper we propose an optimization-based approach to finding a tensegrity structure, based on the ground structure method. We first solve a problem that maximizes the number of struts subject to the self-equilibrium condition and the discontinuity condition of struts. Subsequently, we minimize the number of cables in order to remove redundant self-equilibrium modes. The optimization problem at each step can be formulated as a mixed integer programming (MIP) problem. The method does not require any connectivity information of cables and struts to be known in advance, while the obtained tensegrity structure is guaranteed to satisfy the discontinuity condition of struts rigorously.
    Ehara, S.; Kanno, Y. (2009). A mixed 0-1 programming approach to topology-finding of tensegrity structures. Editorial Universitat Politècnica de València, p. 569-576. http://hdl.handle.net/10251/654
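    A schematic MIP formulation in the spirit of the abstract could read as follows; the notation is an assumption, not the paper's: A is the equilibrium matrix of the ground structure, q_e the axial force in member e, s_e and c_e binaries selecting member e as a strut or a cable, E(v) the members incident to node v, M a big-M constant, and \varepsilon > 0 a minimum force magnitude.

        \begin{aligned}
        \text{Step 1:}\quad \max_{q,\,s,\,c}\ & \sum_{e} s_e \\
        \text{s.t.}\ & A q = 0, \\
        & \varepsilon\, c_e - M s_e \;\le\; q_e \;\le\; M c_e - \varepsilon\, s_e, \quad s_e + c_e \le 1, \quad s_e, c_e \in \{0,1\}, \\
        & \sum_{e \in E(v)} s_e \le 1 \quad \text{for each node } v.
        \end{aligned}

    The bound constraints force q_e < 0 for struts, q_e > 0 for cables, and q_e = 0 for unused members; the last constraint is the strut discontinuity condition (at most one strut per node). Step 2 then minimizes \sum_e c_e over the same constraint set with the number of struts fixed at the Step 1 optimum, which removes redundant self-equilibrium modes.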

    A feasible smoothing accelerated projected gradient method for nonsmooth convex optimization

    Smoothing accelerated gradient methods achieve faster convergence rates than the subgradient method for some nonsmooth convex optimization problems. However, Nesterov's extrapolation may require gradients at infeasible points, and thus these methods cannot be applied to some structural optimization problems. We introduce a variant of smoothing accelerated projected gradient methods in which every iterate is feasible. An O(k^{-1}\log k) convergence rate is obtained using a Lyapunov function. We conduct a numerical experiment on the robust compliance optimization of a truss structure.
    Comment: 6 pages, 2 figures
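    The sketch below is a minimal illustration of the idea, not the authors' algorithm: log-sum-exp smoothing of a nonsmooth objective combined with an accelerated projected gradient step in which the extrapolated point is also projected, so gradients are only ever evaluated at feasible points. The toy problem (a piecewise-linear maximum over a box), the smoothing schedule mu_k = 1/k, and the step size are illustrative assumptions.

    ```python
    # Minimal sketch (assumed setup, not the paper's method): feasible smoothing
    # accelerated projected gradient for  min_x max_i (A x + b)_i  s.t. lo <= x <= hi.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 20, 50
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    lo, hi = -np.ones(n), np.ones(n)

    def project(x):
        """Euclidean projection onto the box [lo, hi]."""
        return np.clip(x, lo, hi)

    def smoothed_grad(x, mu):
        """Gradient of the smoothing  f_mu(x) = mu * log(sum_i exp((A x + b)_i / mu))."""
        z = (A @ x + b) / mu
        z -= z.max()                      # shift for numerical stability
        w = np.exp(z)
        w /= w.sum()                      # softmax weights
        return A.T @ w

    L_A = np.linalg.norm(A, 2) ** 2       # grad f_mu is (L_A / mu)-Lipschitz

    x = project(np.zeros(n))
    x_prev = x.copy()
    for k in range(1, 2001):
        mu = 1.0 / k                      # smoothing parameter driven to zero
        step = mu / L_A                   # 1 / (Lipschitz constant of grad f_mu)
        # Extrapolate, then project, so the gradient is evaluated at a feasible point.
        y = project(x + (k - 1) / (k + 2) * (x - x_prev))
        x_prev = x
        x = project(y - step * smoothed_grad(y, mu))

    print("objective:", np.max(A @ x + b))
    ```

    Projecting the extrapolated point y is the simplest way to keep every point feasible; the paper's variant and its O(k^{-1}\log k) guarantee may rely on a different construction.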