131 research outputs found

    Bounded perturbation resilience of extragradient-type methods and their applications

    In this paper we study the bounded perturbation resilience of the extragradient and the subgradient extragradient methods for solving the variational inequality (VI) problem in real Hilbert spaces. This is an important property of algorithms which guarantees the convergence of the scheme under summable errors, meaning that an inexact version of the methods can also be considered. Moreover, once an algorithm is proved to be bounded perturbation resilient, superiorization can be used; this allows flexibility in choosing the bounded perturbations in order to obtain a superior solution, as explained in the paper. We also discuss some inertial extragradient methods. Under mild and standard assumptions of monotonicity and Lipschitz continuity of the VI's associated mapping, convergence of the perturbed extragradient and subgradient extragradient methods is proved. In addition, we show that the perturbed algorithms converge at a rate of O(1/t). Numerical illustrations are given to demonstrate the performance of the algorithms. Comment: Accepted for publication in The Journal of Inequalities and Applications. arXiv admin note: text overlap with arXiv:1711.01936 and text overlap with arXiv:1507.07302 by other authors
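    For readers unfamiliar with the base scheme, the classical (Korpelevich) extragradient iteration that these perturbed variants build on can be sketched as follows. The operator F, the point x_star, and the box constraint are illustrative assumptions, not the paper's test problems.

    ```python
    import numpy as np

    # Classical (Korpelevich) extragradient method for the variational
    # inequality VI(F, C): find x* in C with <F(x*), x - x*> >= 0 for all x in C.
    # Toy problem (an assumption for illustration): F(x) = A(x - x_star) with a
    # monotone, Lipschitz matrix A, and C the box [-1, 1]^2, so x_star solves the VI.

    A = np.array([[1.0, 1.0], [-1.0, 1.0]])  # symmetric part = I -> strongly monotone
    x_star = np.array([0.3, -0.2])           # lies inside C, hence the VI solution
    L = np.linalg.norm(A, 2)                 # Lipschitz constant of F

    def F(x):
        return A @ (x - x_star)

    def proj_box(x, lo=-1.0, hi=1.0):
        return np.clip(x, lo, hi)

    def extragradient(x0, tau, iters=200):
        x = x0.astype(float)
        for _ in range(iters):
            y = proj_box(x - tau * F(x))  # extrapolation (predictor) step
            x = proj_box(x - tau * F(y))  # correction step re-evaluates F at y
        return x

    print(extragradient(np.array([1.0, 1.0]), tau=0.9 / L))  # converges to x_star
    ```

    The stepsize condition tau < 1/L is what the monotone-and-Lipschitz convergence theory requires; the perturbed variants studied in the paper add summable perturbation terms to these two steps.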

    Projection methods for solving split equilibrium problems

    The paper considers a split inverse problem involving component equilibrium problems in Hilbert spaces; this problem is therefore called the split equilibrium problem (SEP). It is known that almost all solution methods for problem (SEP) are designed from two fundamental methods: the proximal point method and the extended extragradient method (or the two-step proximal-like method). Unlike previous results, in this paper we introduce a new algorithm, based only on the projection method, for finding solution approximations of problem (SEP), and then establish that the resulting algorithm is weakly convergent under mild conditions. Several numerical results are reported to illustrate the convergence of the proposed algorithm and to compare it with others. Comment: 19 pages, 8 figures (accepted for publication on January 24, 2019)

    Golden ratio algorithms with new stepsize rules for variational inequalities

    In this paper, we introduce two golden ratio algorithms with new stepsize rules for solving pseudomonotone and Lipschitz variational inequalities in finite-dimensional Hilbert spaces. The presented stepsize rules allow the resulting algorithms to work without prior knowledge of the Lipschitz constant of the operator. The first algorithm uses a sequence of stepsizes that is chosen in advance, diminishing, and non-summable, while the stepsizes in the second are updated at each iteration by a simple computation. Notably, the sequence of stepsizes generated by the second algorithm is bounded away from zero. The convergence as well as the convergence rate of the proposed algorithms are established under standard conditions. We also give several numerical results to show the behavior of the algorithms in comparison with other algorithms. Comment: 19 pages, 4 figures (accepted for publication on April 16, 2019)
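    A minimal sketch of the golden ratio iteration in the form introduced by Malitsky, which these stepsize rules modify: a golden-ratio convex combination replaces the extrapolation step, so F is evaluated only once per iteration. The matrix A, the point x_star, and the box constraint are assumptions for illustration, not taken from the paper.

    ```python
    import numpy as np

    # Golden ratio algorithm (GRAAL) for VI(F, C): the averaged point x_bar
    # replaces the predictor step of the extragradient method, so only one
    # evaluation of F and one projection are needed per iteration.
    # Toy monotone problem (an illustrative assumption).

    PHI = (1 + np.sqrt(5)) / 2               # the golden ratio

    A = np.array([[1.0, 1.0], [-1.0, 1.0]])  # monotone, Lipschitz with L = sqrt(2)
    x_star = np.array([0.3, -0.2])
    L = np.linalg.norm(A, 2)

    def F(x):
        return A @ (x - x_star)

    def proj_box(x, lo=-1.0, hi=1.0):
        return np.clip(x, lo, hi)

    def graal(x0, lam, iters=500):
        x = x0.astype(float)
        x_bar = x.copy()
        for _ in range(iters):
            x_bar = ((PHI - 1.0) * x + x_bar) / PHI  # golden-ratio averaging
            x = proj_box(x_bar - lam * F(x))         # single projection step
        return x

    print(graal(np.array([1.0, 1.0]), lam=PHI / (2 * L)))  # approaches x_star
    ```

    The fixed stepsize lam <= PHI / (2L) shown here is the rule that requires knowing L; the paper's contribution is precisely to replace it with rules that do not.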

    A hybrid method without extrapolation step for solving variational inequality problems

    In this paper, we introduce a new method for solving variational inequality problems with a monotone and Lipschitz-continuous mapping in Hilbert space. The iterative process is based on two well-known methods: the projection method and the hybrid (or outer approximation) method. However, we do not use an extrapolation step in the projection method; the absence of one projection in our method is explained by a slightly different choice of sets in the hybrid method. We prove strong convergence of the sequences generated by our method.

    A novel hybrid method for equilibrium problems and fixed point problems

    The paper proposes a novel hybrid method for solving equilibrium problems and fixed point problems. By constructing special cutting half-spaces, the algorithm solves only one optimization program at each iteration, without the extra steps used in some previously known methods. A strong convergence theorem is established, and some numerical examples are presented to illustrate its convergence. Comment: 11 pages (submitted). arXiv admin note: substantial text overlap with arXiv:1510.08201; text overlap with arXiv:1510.0821

    Self adaptive inertial extragradient algorithms for solving variational inequality problems

    In this paper, we study the strong convergence of two Mann-type inertial extragradient algorithms, devised with a new stepsize, for solving a variational inequality problem with a monotone and Lipschitz-continuous operator in real Hilbert spaces. Strong convergence theorems for our algorithms are proved without prior knowledge of the Lipschitz constant of the operator. Finally, we provide some numerical experiments to illustrate the performance of the proposed algorithms and compare them with related ones. Comment: 19 pages, 6 figures
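    A common way such algorithms avoid knowing the Lipschitz constant is a self-adaptive, nonincreasing stepsize driven by the local ratio ||x - y|| / ||F(x) - F(y)||. The sketch below grafts one such rule (a widely used form in this literature, not necessarily the paper's exact rule) onto the plain extragradient iteration; the toy operator and constraint set are assumptions.

    ```python
    import numpy as np

    # Extragradient with a self-adaptive stepsize: tau_{k+1} =
    # min(tau_k, mu * ||x - y|| / ||F(x) - F(y)||), which stays bounded
    # below by mu / L without ever computing L explicitly.
    # Toy monotone problem (an illustrative assumption).

    A = np.array([[1.0, 1.0], [-1.0, 1.0]])
    x_star = np.array([0.3, -0.2])

    def F(x):
        return A @ (x - x_star)

    def proj_box(x, lo=-1.0, hi=1.0):
        return np.clip(x, lo, hi)

    def adaptive_extragradient(x0, tau0=1.0, mu=0.5, iters=300):
        x, tau = x0.astype(float), tau0
        for _ in range(iters):
            y = proj_box(x - tau * F(x))
            x_new = proj_box(x - tau * F(y))
            dF = np.linalg.norm(F(x) - F(y))
            if dF > 1e-12:                 # nonincreasing stepsize update
                tau = min(tau, mu * np.linalg.norm(x - y) / dF)
            x = x_new
        return x

    print(adaptive_extragradient(np.array([1.0, 1.0])))  # approaches x_star
    ```

    Because the update only ever shrinks tau, and the shrink factor is bounded below by mu / L, the stepsize sequence stays bounded away from zero, which is the property the convergence proofs rely on.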

    A Relaxed-Projection Splitting Algorithm for Variational Inequalities in Hilbert Spaces

    We introduce a relaxed-projection splitting algorithm for solving variational inequalities in Hilbert spaces for the sum of nonsmooth maximal monotone operators, where the feasible set is defined by a nonlinear, nonsmooth, continuous convex function inequality. In our scheme, the orthogonal projections onto the feasible set are replaced by projections onto separating hyperplanes. Furthermore, each iteration of the proposed method consists of simple subgradient-like steps that do not demand the solution of a nontrivial subproblem and use only the individual operators, exploiting the structure of the problem. Assuming monotonicity of the individual operators and the existence of solutions, we prove that the generated sequence converges weakly to a solution. Comment: 18 pages
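    The key computational idea, replacing the exact projection onto C = {x : c(x) <= 0} by a closed-form projection onto a separating half-space built from a subgradient of c, can be sketched in isolation. The function c(x) = ||x||^2 - 1 below is an illustrative assumption, not the paper's feasible set.

    ```python
    import numpy as np

    # Relaxed (subgradient) projection: instead of projecting onto
    # C = {x : c(x) <= 0} exactly, project onto the separating half-space
    # {z : c(x) + <g, z - x> <= 0} built from a subgradient g of c at x.
    # Toy example with c(x) = ||x||^2 - 1, i.e. C is the unit ball.

    def c(x):
        return np.dot(x, x) - 1.0

    def subgrad_c(x):
        return 2.0 * x  # gradient of ||x||^2 - 1 (smooth here; any subgradient works)

    def relaxed_projection(x):
        cx = c(x)
        if cx <= 0:          # already feasible: the half-space contains x
            return x
        g = subgrad_c(x)
        return x - (cx / np.dot(g, g)) * g  # closed-form half-space projection

    x = np.array([3.0, 4.0])
    for _ in range(50):
        x = relaxed_projection(x)
    print(np.linalg.norm(x))  # approaches 1, the boundary of the unit ball
    ```

    Each relaxed projection needs only one evaluation of c and one subgradient, which is why the algorithm avoids solving a nontrivial subproblem per iteration.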

    Strong convergence of inertial extragradient algorithms for solving variational inequalities and fixed point problems

    The paper investigates two inertial extragradient algorithms for finding a common solution of a variational inequality problem involving a monotone and Lipschitz-continuous mapping and a fixed point problem with a demicontractive mapping in real Hilbert spaces. Our algorithms need to compute only one projection onto the feasible set per iteration. Moreover, they work without prior knowledge of the Lipschitz constant of the cost operator and do not contain any line search process. The strong convergence of the algorithms is established under suitable conditions. Some experiments are presented to illustrate the numerical efficiency of the suggested algorithms and to compare them with existing ones. Comment: 25 pages, 12 figures

    An inertial Tseng's extragradient method for solving multi-valued variational inequalities with one projection

    In this paper, we introduce an inertial Tseng's extragradient method for solving multi-valued variational inequalities, in which only one projection is needed at each iteration. We also obtain strong convergence results for the proposed algorithm, provided that the multi-valued mapping is continuous and pseudomonotone with nonempty compact convex values. Moreover, numerical simulation results illustrate the efficiency of our method compared to existing methods.
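    Tseng's forward-backward-forward step is what makes a single projection per iteration possible: the second projection of the classical extragradient method is replaced by an explicit correction. The sketch below shows the basic (non-inertial, single-valued) step on a toy monotone problem; the operator and constraint set are assumptions, and the paper's multi-valued, inertial setting is more general.

    ```python
    import numpy as np

    # Tseng's forward-backward-forward method for VI(F, C): one projection
    # per iteration, followed by the correction y - tau * (F(y) - F(x)).
    # Toy monotone Lipschitz problem (an illustrative assumption).

    A = np.array([[1.0, 1.0], [-1.0, 1.0]])
    x_star = np.array([0.3, -0.2])
    L = np.linalg.norm(A, 2)

    def F(x):
        return A @ (x - x_star)

    def proj_box(x, lo=-1.0, hi=1.0):
        return np.clip(x, lo, hi)

    def tseng(x0, tau, iters=300):
        x = x0.astype(float)
        for _ in range(iters):
            y = proj_box(x - tau * F(x))  # the single projection
            x = y - tau * (F(y) - F(x))   # explicit correction, no projection
        return x

    print(tseng(np.array([1.0, 1.0]), tau=0.5 / L))  # approaches x_star
    ```

    With tau < 1/L and a monotone Lipschitz F, this iteration converges to a solution of the VI; the inertial variant in the paper additionally extrapolates x with a momentum term before the projection step.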