36 research outputs found

    Abnormal problems with a nonclosed image

    No full text
    No abstract available

    Inverse function theorem and conditions of extremum for abnormal problems with non-closed range

    No full text
    The following two classical problems are considered: the existence and the estimate of a solution of an equation defined by a map F in a neighbourhood of a point x*; and necessary conditions for an extremum at x* of a smooth function under equality-type constraints defined in terms of a non-linear map F. If the range of the first derivative of F at x* is not closed, then the classical methods of analysis based on inverse function theorems and Lagrange's principle cannot be used. The results obtained in this paper are of interest in the case when the range of the first derivative of F at x* is non-closed; they further develop the classical results, extending them to abnormal problems with non-closed range.
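
    As background for the abstract above (a standard formulation, not quoted from the paper), Lagrange's principle for minimizing f(x) subject to F(x) = 0 can be stated as follows; it relies on the range of F'(x*) being closed, which is exactly the assumption that fails in the abnormal problems treated here.

    % Generalized Lagrange multiplier rule at a local minimizer x^* (standard background;
    % valid when Im F'(x^*) is closed, and possibly failing when it is not):
    \[
      \exists\, (\lambda_0, y^*) \neq (0,0),\ \lambda_0 \ge 0:\qquad
      \lambda_0\, f'(x^*) + \bigl(F'(x^*)\bigr)^{*} y^* = 0,
      \qquad y^* \in Y^{*}.
    \]
    % In the normal (regular) case, e.g. when F'(x^*) is surjective, one may take \lambda_0 = 1.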

    Directional metric regularity of mappings and stability theorems

    No full text
    A stability theorem based on the concept of directional metric regularity of mappings is described. Robinson's stability theorem can be used to derive results on the quantitative stability of the feasible set, which play a central role in sensitivity analysis for optimization problems. If the Robinson regularity condition is violated, the underlying smooth mapping is not metrically regular. A mapping is said to be regular at a point in a given direction, the definition being stated in terms of the conical hull (cone) of a set. In the context of optimization problems this condition is known as Gollan's regularity condition; it is extended here to the general case, and in parametric optimization it is known as the directional regularity condition. An analysis performed along such feasible arcs yields the most accurate known quantitative sensitivity results in the case when the solution is Lipschitz stable.
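
    For orientation (standard definitions, not quoted from the paper), metric regularity of a mapping F at a point x̄ for ȳ = F(x̄) is usually expressed by a local distance estimate, and the directional variant restricts the perturbations y to a cone of directions.

    % Metric regularity of F at \bar{x} for \bar{y} = F(\bar{x}): there exist \kappa > 0 and
    % neighbourhoods U of \bar{x} and V of \bar{y} such that
    \[
      \operatorname{dist}\bigl(x,\, F^{-1}(y)\bigr) \;\le\; \kappa\, \lVert F(x) - y \rVert
      \qquad \text{for all } x \in U,\ y \in V.
    \]
    % In one common formulation of the directional variant, the estimate is required only for
    % y with y - \bar{y} \in \operatorname{cone}(D) for a prescribed set D of directions,
    % where cone(.) denotes the conical hull; this is presumably the role of the conical hull
    % mentioned in the abstract.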

    Necessary conditions for an extremum in a mathematical programming problem

    No full text
    For minimization problems with equality and inequality constraints, first- and second-order necessary conditions for a local extremum are presented. These conditions apply when the constraints do not satisfy the traditional regularity assumptions. The approach is based on the concept of 2-regularity; it unites and generalizes the authors' previous studies based on this concept. © Nauka/Interperiodica 2007
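
    As a point of reference (standard background, not quoted from the paper), the first-order necessary condition that holds without any regularity assumption on the constraints is the Fritz John condition; conditions of 2-regularity type aim to sharpen it when the classical constraint qualifications fail.

    % Fritz John first-order condition at a local minimizer x^* of f subject to
    % g_i(x) <= 0 (i = 1..m) and h_j(x) = 0 (j = 1..p); no constraint qualification is needed,
    % but the multipliers may be degenerate (\lambda_0 = 0):
    \[
      \lambda_0 \nabla f(x^*) + \sum_{i=1}^{m} \mu_i \nabla g_i(x^*) + \sum_{j=1}^{p} \nu_j \nabla h_j(x^*) = 0,
      \qquad \lambda_0 \ge 0,\ \mu_i \ge 0,\ \mu_i\, g_i(x^*) = 0,\ (\lambda_0, \mu, \nu) \neq 0.
    \]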

    Uniform estimates of distances to a coincidence point set

    No full text
    No abstract available

    On abnormal problems with a nonclosed image

    No full text
    The authors consider Banach spaces X and Y, a differentiable map F: X → Y, and points x* ∈ X and y* ∈ Y with y* = F(x*). For y ∈ Y belonging to a neighbourhood of y* they consider the equation F(x) = y, where x (belonging to a neighbourhood of x*) is unknown. They then study the question of under which assumptions there exists a solution x(y) of the equation F(x) = y such that ‖x(y) − x*‖ ≤ const·‖y − y*‖^α is fulfilled for some α > 0. The authors also discuss applications to optimization problems in general spaces using the Lagrangian technique.
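
    For comparison (a classical fact, not a claim from this paper), when F'(x*) maps X onto Y the Graves-Lyusternik theorem yields the estimate with exponent α = 1; the question posed in the abstract is when an estimate with some exponent α > 0 survives once this surjectivity fails.

    % Regular case: F'(x^*) surjective (Banach spaces, F strictly differentiable at x^*) implies
    \[
      \lVert x(y) - x^* \rVert \;\le\; \mathrm{const}\, \lVert y - y^* \rVert
      \qquad (\text{i.e. } \alpha = 1)
    \]
    % for all y in a neighbourhood of y^* = F(x^*).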