12 research outputs found

    An Algorithm for Minimizing a Certain Class of Quasidifferentiable Functions

    One interesting and important class of nondifferentiable functions is that produced by smooth compositions of max-type functions. Such functions are of practical value and have been studied extensively by several researchers. We treat them as quasidifferentiable functions and analyze them using quasidifferential calculus. One special subclass of these functions (namely, the sum of a max-type function and a min-type function) has been studied by T.I. Sivelina. The main feature of the algorithm described in the present paper is that at each step it is necessary to consider a bundle of auxiliary directions and points, only one of which can be chosen for the next step. This requirement seems to arise from the intrinsic nature of nondifferentiable functions.
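    As a minimal illustration of the quasidifferential calculus involved (the standard Demyanov-Rubinov rules, not the paper's algorithm itself), consider Sivelina's subclass: f(x) = max_{i in I} f_i(x) + min_{j in J} g_j(x) with smooth f_i, g_j. Its quasidifferential is the pair Df(x) = [underline-partial f(x), overline-partial f(x)] with

    ```latex
    \underline{\partial} f(x) = \operatorname{conv}\Bigl\{ \nabla f_i(x) : f_i(x) = \max_{k \in I} f_k(x) \Bigr\},
    \qquad
    \overline{\partial} f(x) = \operatorname{conv}\Bigl\{ \nabla g_j(x) : g_j(x) = \min_{k \in J} g_k(x) \Bigr\}.
    ```

    A descent direction must then account for pairings of subgradients and supergradients from these two convex hulls, which is one way to see where the "bundle of auxiliary directions" in the algorithm comes from.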

    Fréchet approach in second-order optimization

    We state a certain second-order sufficient optimality condition for functions defined on infinite-dimensional spaces by means of a generalized Fréchet approach to second-order differentiability. Moreover, we show that this condition generalizes a certain second-order condition obtained in finite-dimensional spaces.
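    The finite-dimensional condition that the paper generalizes is the classical one: a zero gradient together with a positive definite Hessian at a point is sufficient for a strict local minimum. A minimal sketch of that classical test (the function name `is_strict_local_min` and the example function are illustrative, not from the paper):

    ```python
    import numpy as np

    def is_strict_local_min(grad, hess, tol=1e-8):
        # Classical second-order sufficient condition in R^n:
        # f'(x*) = 0 and f''(x*) positive definite.
        stationary = np.linalg.norm(grad) <= tol
        # eigvalsh: eigenvalues of a symmetric matrix; all must exceed tol.
        positive_definite = np.all(np.linalg.eigvalsh(hess) > tol)
        return bool(stationary and positive_definite)

    # f(x, y) = x**2 + 3*y**2 at the stationary point (0, 0):
    grad = np.array([0.0, 0.0])
    hess = np.array([[2.0, 0.0], [0.0, 6.0]])
    print(is_strict_local_min(grad, hess))  # True

    # A saddle (indefinite Hessian) fails the test:
    saddle = np.array([[2.0, 0.0], [0.0, -1.0]])
    print(is_strict_local_min(grad, saddle))  # False
    ```

    In infinite dimensions, positive definiteness must be strengthened (e.g. to a uniform coercivity bound on the second-order term), which is part of what the generalized Fréchet approach handles.
    
    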

    Solution of feasibility problems via non-smooth optimization

    Ankara: The Department of Industrial Engineering and the Institute of Engineering and Sciences of Bilkent University, 1990. Thesis (Master's) -- Bilkent University, 1990. Includes bibliographical references (leaves 33-34).
    In this study we present a penalty function approach for linear feasibility problems. Our aim is to find an effective algorithm based on an exterior method. Any given feasibility problem (for a set of linear inequalities) is first transformed into the unconstrained minimization of a penalty function, and the problem is thereby reduced to minimizing a convex, non-smooth, quadratic function. Owing to the non-differentiability of the penalty function, gradient-type methods cannot be applied directly, so a modified nonlinear programming technique is used to overcome the difficulties at the break points. In this research we present a new algorithm for minimizing this non-smooth penalty function. By dropping the nonnegativity constraints and using the conjugate gradient method, we compute a maximal set of conjugate directions and then perform line searches along these directions to minimize the penalty function. Whenever the optimality criterion is not satisfied and the improvements in all directions are insufficient, we compute a new set of conjugate directions by the conjugate Gram-Schmidt process, with one of the directions taken as an element of the subdifferential at the current point.
    Ouveysi, Iradj. M.S.
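    A minimal sketch of the exterior-penalty idea for linear feasibility (not the thesis's conjugate-direction algorithm): here the penalty is the piecewise-linear sum of constraint violations, which is zero exactly on the feasible set, and a plain subgradient method with diminishing steps drives it to zero. All names and the step rule are illustrative assumptions.

    ```python
    import numpy as np

    def penalty(A, b, x):
        # Exterior penalty for the system A x <= b:
        # P(x) = sum_i max(0, a_i . x - b_i); zero iff x is feasible.
        return np.maximum(0.0, A @ x - b).sum()

    def subgradient(A, b, x):
        # One element of the subdifferential of P at x: the sum of the
        # rows of A for violated constraints (ties at break points are
        # resolved arbitrarily, which is legitimate for a subgradient).
        violated = (A @ x - b) > 0
        return A[violated].sum(axis=0) if violated.any() else np.zeros(A.shape[1])

    def solve_feasibility(A, b, x0, steps=5000, tol=1e-9):
        x = x0.astype(float)
        for k in range(1, steps + 1):
            if penalty(A, b, x) <= tol:
                break                      # feasible point found
            x -= (1.0 / k) * subgradient(A, b, x)  # diminishing step size
        return x

    # Example: find x with x1 + x2 <= 1, x1 >= 0, x2 >= 0.
    A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
    b = np.array([1.0, 0.0, 0.0])
    x = solve_feasibility(A, b, np.array([2.0, 2.0]))
    ```

    The thesis instead minimizes along conjugate directions with exact line searches, which avoids the slow convergence typical of plain subgradient steps.
    
    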

    Minimizing and stationary sequences.

    by Wong Oi Ping. Thesis (M.Phil.)--Chinese University of Hong Kong, 1999. Includes bibliographical references (leaves 77-79). Abstracts in English and Chinese.
    Chapter 1 --- LP-minimizing and Stationary Sequences --- p.8
    Chapter 1.1 --- Residual function --- p.8
    Chapter 1.2 --- Minimizing sequences --- p.14
    Chapter 1.3 --- Stationary sequences --- p.17
    Chapter 1.4 --- On the equivalence of minimizing and stationary sequence --- p.21
    Chapter 1.5 --- Complementarity conditions --- p.25
    Chapter 1.6 --- Subdifferential-based stationary sequence --- p.29
    Chapter 1.7 --- Convergence of an Iterative Algorithm --- p.32
    Chapter 2 --- Minimizing And Stationary Sequences In Nonsmooth Optimization --- p.38
    Chapter 2.1 --- Subdifferential --- p.38
    Chapter 2.2 --- Stationary and minimizing sequences --- p.40
    Chapter 2.3 --- C-convex and BC-convex function --- p.43
    Chapter 2.4 --- Minimizing sequences in terms of sublevel sets --- p.44
    Chapter 2.5 --- Critical function --- p.48
    Chapter 3 --- Optimization Conditions --- p.52
    Chapter 3.1 --- Introduction --- p.52
    Chapter 3.2 --- Second-order necessary and sufficient conditions without constraint --- p.55
    Chapter 3.3 --- The Lagrange and G-functions in constrained problems --- p.63
    Chapter 3.4 --- Second-order necessary conditions for constrained problems --- p.73
    Chapter 3.5 --- Sufficient conditions for constrained problems --- p.74
    Bibliography

    Nonsmooth analysis and optimization.

    Huang Liren. Thesis (Ph.D.)--Chinese University of Hong Kong, 1993. Includes bibliographical references (leaves 96).
    Abstract --- p.1
    Introduction --- p.2
    References --- p.5
    Chapter 1 --- Some elementary results in nonsmooth analysis and optimization --- p.6
    Chapter 1.1 --- Some properties for "lim sup" and "lim inf" --- p.6
    Chapter 1.2 --- The directional derivative of the sup-type function --- p.8
    Chapter 1.3 --- Some results in nonsmooth analysis and optimization --- p.12
    References --- p.19
    Chapter 2 --- On generalized second-order derivatives and Taylor expansions in nonsmooth optimization --- p.20
    Chapter 2.1 --- Introduction --- p.20
    Chapter 2.2 --- Dini directional derivatives, Clarke's directional derivatives and generalized second-order directional derivatives --- p.20
    Chapter 2.3 --- On Cominetti and Correa's conjecture --- p.28
    Chapter 2.4 --- Generalized second-order Taylor expansion --- p.36
    Chapter 2.5 --- Detailed proof of Theorem 2.4.2 --- p.40
    Chapter 2.6 --- Corollaries of Theorem 2.4.2 and Theorem 2.4.3 --- p.43
    Chapter 2.7 --- Some applications in optimization --- p.46
    References --- p.51
    Chapter 3 --- Second-order necessary and sufficient conditions in nonsmooth optimization --- p.53
    Chapter 3.1 --- Introduction --- p.53
    Chapter 3.2 --- Second-order necessary and sufficient conditions without constraint --- p.56
    Chapter 3.3 --- Second-order necessary conditions with constraints --- p.66
    Chapter 3.4 --- Sufficient conditions theorem with constraints --- p.77
    References --- p.87
    Appendix --- p.89
    References --- p.9

    A contribution to the solving of non-linear estimation problems
