10,652 research outputs found

    Multidimensional Constrained Global Optimization in Domains with Computable Boundaries

    A multidimensional constrained global optimization problem with an objective function satisfying the Lipschitz condition and constraints generating a feasible domain with computable boundaries is considered. To solve this problem, a dimensionality reduction approach based on the nested optimization scheme is used. This scheme reduces the initial multidimensional problem to a family of one-dimensional subproblems and allows univariate methods to be applied to multidimensional optimization. Sequential and parallel modifications of well-known information-statistical methods of Lipschitz optimization are proposed for solving the univariate subproblems arising inside the nested scheme in the case of domains with computable boundaries. A comparison with the classical penalty function method, the traditional means of taking the constraints into account, is carried out. The results of the experiments demonstrate a significant advantage of the proposed methods over the penalty function method.
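    The nested optimization scheme can be made concrete with a short sketch. The Python snippet below is a minimal illustration under assumptions of my own, not the authors' code: it handles a two-dimensional problem whose feasible domain is given by hypothetical computable boundary functions lo(x1) and hi(x1), and a plain uniform-grid sampler stands in for the information-statistical univariate methods the abstract refers to.

```python
# Minimal sketch of the nested optimization scheme for a 2D problem:
#     min_{x1 in [a1, b1]}  min_{x2 in [lo(x1), hi(x1)]}  f(x1, x2)
# A uniform-grid univariate solver stands in for the information-statistical
# methods mentioned in the abstract; lo/hi are hypothetical boundary functions.

import numpy as np

def solve_1d(phi, a, b, n=200):
    """Minimize a univariate function on [a, b] by uniform sampling."""
    xs = np.linspace(a, b, n)
    vals = np.array([phi(x) for x in xs])
    i = int(np.argmin(vals))
    return xs[i], vals[i]

def nested_minimize(f, a1, b1, lo, hi, n=200):
    """Nested scheme: the outer 1D problem in x1 calls an inner 1D problem in x2
    whose computable bounds lo(x1), hi(x1) depend on the outer variable."""
    def outer(x1):
        # Inner subproblem: minimize f(x1, .) over the slice [lo(x1), hi(x1)].
        _, v = solve_1d(lambda x2: f(x1, x2), lo(x1), hi(x1), n)
        return v
    x1_best, f_best = solve_1d(outer, a1, b1, n)
    x2_best, _ = solve_1d(lambda x2: f(x1_best, x2), lo(x1_best), hi(x1_best), n)
    return (x1_best, x2_best), f_best

if __name__ == "__main__":
    # Hypothetical example: a disc-shaped feasible domain described by
    # computable boundary functions lo(x1), hi(x1).
    f = lambda x1, x2: (x1 - 0.3) ** 2 + (x2 + 0.2) ** 2
    lo = lambda x1: -np.sqrt(max(1.0 - x1 ** 2, 0.0))
    hi = lambda x1: np.sqrt(max(1.0 - x1 ** 2, 0.0))
    print(nested_minimize(f, -1.0, 1.0, lo, hi))
```

    The same recursion extends to more dimensions by letting each level of the scheme call the level below it over a slice of the feasible domain.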

    Lipschitz gradients for global optimization in a one-point-based partitioning scheme

    A global optimization problem is studied where the objective function $f(x)$ is a multidimensional black-box function and its gradient $f'(x)$ satisfies the Lipschitz condition over a hyperinterval with an unknown Lipschitz constant $K$. Different methods for solving this problem by using an a priori given estimate of $K$, its adaptive estimates, and adaptive estimates of local Lipschitz constants are known in the literature. Recently, the authors have proposed a one-dimensional algorithm working with multiple estimates of the Lipschitz constant for $f'(x)$ (the existence of such an algorithm was a challenge for 15 years). In this paper, a new multidimensional geometric method evolving the ideas of this one-dimensional scheme and using an efficient one-point-based partitioning strategy is proposed. Numerical experiments executed on 800 multidimensional test functions demonstrate quite a promising performance in comparison with popular DIRECT-based methods. Comment: 25 pages, 4 figures, 5 tables. arXiv admin note: text overlap with arXiv:1103.205
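    The role of a Lipschitz constant for the gradient can be illustrated in one dimension. The sketch below follows assumptions of my own and is not the multidimensional one-point-based method of the paper: with $|f''|$ bounded by a known $K$, every interval admits a smooth quadratic minorant built from endpoint values and derivatives, and the interval with the smallest minorant value is refined next.

```python
# Minimal sketch of a 1D geometric method exploiting a Lipschitz constant K
# for the derivative f'. Over each interval the smooth minorant
#     f(t) + f'(t)*(x - t) - 0.5*K*(x - t)**2
# built at both endpoints bounds f from below; the interval with the smallest
# bound is refined at the point where that bound is attained. The paper's
# multiple-estimate and one-point-based partitioning machinery is not reproduced.

import numpy as np

def interval_bound(xl, xr, fl, fr, dl, dr, K):
    """Lower bound of f on (xl, xr) from endpoint values and derivatives,
    assuming |f'(u) - f'(v)| <= K * |u - v|."""
    xs = np.linspace(xl, xr, 66)[1:-1]            # interior trial points
    ml = fl + dl * (xs - xl) - 0.5 * K * (xs - xl) ** 2
    mr = fr + dr * (xs - xr) - 0.5 * K * (xs - xr) ** 2
    bound = np.maximum(ml, mr)                    # both minorants are valid
    j = int(np.argmin(bound))
    return bound[j], float(xs[j])

def minimize_with_gradients(f, df, a, b, K, iters=60):
    pts = [a, b]
    for _ in range(iters):
        pts.sort()
        best = min(
            (interval_bound(xl, xr, f(xl), f(xr), df(xl), df(xr), K)
             for xl, xr in zip(pts[:-1], pts[1:])),
            key=lambda t: t[0],
        )
        pts.append(best[1])                       # refine the most promising interval
    x_best = min(pts, key=f)
    return x_best, f(x_best)

if __name__ == "__main__":
    f = lambda x: np.sin(3.0 * x) + 0.1 * x ** 2  # multiextremal test function
    df = lambda x: 3.0 * np.cos(3.0 * x) + 0.2 * x
    print(minimize_with_gradients(f, df, -3.0, 3.0, K=9.2))
```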

    Index Information Algorithm with Local Tuning for Solving Multidimensional Global Optimization Problems with Multiextremal Constraints

    Multidimensional optimization problems where the objective function and the constraints are multiextremal non-differentiable Lipschitz functions (with unknown Lipschitz constants) and the feasible region is a finite collection of robust nonconvex subregions are considered. Both the objective function and the constraints may be partially defined. To solve such problems, an algorithm is proposed that uses Peano space-filling curves and the index scheme to reduce the original problem to a Hölder one-dimensional one. Local tuning on the behaviour of the objective function and constraints is used during the work of the global optimization procedure in order to accelerate the search. The method uses neither penalty coefficients nor additional variables. Convergence conditions are established. Numerical experiments confirm the good performance of the technique. Comment: 29 pages, 5 figures
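    A minimal sketch of the space-filling-curve reduction, under assumptions of my own, is given below: a finite-order Hilbert curve (a computable approximation of a Peano curve) maps a parameter t in [0, 1] to the unit square, so the constrained two-dimensional problem becomes a one-dimensional one along the curve. Constraints are checked in a fixed order, in the spirit of the index scheme; the index information algorithm with local tuning itself is not reproduced, a plain grid search over t stands in for it.

```python
# Minimal sketch: reduce a constrained 2D problem to a 1D one along a
# finite-order Hilbert curve, checking constraints in a fixed order.
# This is an illustration only, not the index information algorithm.

import numpy as np

def hilbert_point(t, order=8):
    """Map t in [0, 1] to a point of the unit square via a Hilbert curve of given order."""
    n = 1 << order                       # grid is n x n, curve visits n*n cells
    d = min(int(t * n * n), n * n - 1)   # cell index along the curve
    x = y = 0
    s = 1
    while s < n:
        rx = 1 & (d // 2)
        ry = 1 & (d ^ rx)
        if ry == 0:                      # rotate the quadrant when needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        d //= 4
        s *= 2
    return (x + 0.5) / n, (y + 0.5) / n

def minimize_on_curve(f, constraints, m=20000):
    """1D search over the curve parameter; a point is feasible only if every
    constraint g(x, y) <= 0 holds, checked in order."""
    best_t, best_val = None, np.inf
    for t in np.linspace(0.0, 1.0, m):
        x, y = hilbert_point(t)
        if all(g(x, y) <= 0.0 for g in constraints):
            v = f(x, y)
            if v < best_val:
                best_t, best_val = t, v
    return best_t, best_val

if __name__ == "__main__":
    # Hypothetical example: minimize a multiextremal objective inside a disc.
    f = lambda x, y: np.sin(8 * x) * np.cos(8 * y) + (x - 0.7) ** 2 + (y - 0.3) ** 2
    g = [lambda x, y: (x - 0.5) ** 2 + (y - 0.5) ** 2 - 0.16]
    print(minimize_on_curve(f, g))
```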

    Deterministic global optimization using space-filling curves and multiple estimates of Lipschitz and Holder constants

    In this paper, the global optimization problem $\min_{y\in S} F(y)$ with $S$ being a hyperinterval in $\Re^N$ and $F(y)$ satisfying the Lipschitz condition with an unknown Lipschitz constant is considered. It is supposed that the function $F(y)$ can be multiextremal, non-differentiable, and given as a 'black-box'. To attack the problem, a new global optimization algorithm based on the following two ideas is proposed and studied both theoretically and numerically. First, the new algorithm uses numerical approximations to space-filling curves to reduce the original Lipschitz multi-dimensional problem to a univariate one satisfying the Hölder condition. Second, at each iteration the algorithm applies a new geometric technique working with a number of possible Hölder constants chosen from a set of values varying from zero to infinity, so that ideas introduced in the popular DIRECT method can be used in Hölder global optimization. Convergence conditions of the resulting deterministic global optimization method are established. Numerical experiments carried out on several hundred test functions show quite a promising performance of the new algorithm in comparison with its direct competitors. Comment: 26 pages, 10 figures, 4 tables
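    The second idea, working simultaneously with Hölder constants ranging from very small to very large, can be sketched as follows. The snippet is an assumption-laden simplification of the DIRECT-like selection rule, not the paper's method: each interval of the reduced one-dimensional problem is scored with several candidate constants, and it is selected for further subdivision if it gives the best lower bound for at least one of them.

```python
# Minimal sketch of the "multiple Hölder constants" idea on the reduced 1D
# problem: every interval is scored with a whole set of candidate constants H,
# and an interval is kept for subdivision if its Hölder lower bound
#     F(mid) - H * (width / 2)**alpha
# is the best one for at least one H. Illustration only, not the paper's rule.

import numpy as np

def promising_intervals(mid_vals, widths, alpha=0.5,
                        H_set=(0.01, 0.1, 1.0, 10.0, 100.0)):
    """Return indices of intervals whose Hölder bound wins for some H in H_set."""
    mid_vals = np.asarray(mid_vals, dtype=float)
    widths = np.asarray(widths, dtype=float)
    selected = set()
    for H in H_set:
        bounds = mid_vals - H * (widths / 2.0) ** alpha
        selected.add(int(np.argmin(bounds)))          # best interval for this H
    return sorted(selected)

if __name__ == "__main__":
    # Three current intervals of the 1D reduced problem, described by the
    # objective value at their midpoints and by their widths: small H favours
    # low midpoint values, large H favours wide, little-explored intervals.
    print(promising_intervals(mid_vals=[0.8, 1.1, 0.5], widths=[0.5, 0.25, 0.05]))
```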