New classes of higher order variational-like inequalities
In this paper, we prove that the optimality conditions of higher order convex functions are characterized by a class of variational inequalities, called higher order variational inequalities. The auxiliary principle technique is used to suggest an implicit method for solving higher order variational inequalities. Convergence analysis of the proposed method is investigated using the pseudo-monotonicity of the operator. Some special cases are also discussed. The results obtained in this paper can be viewed as a refinement and improvement of previously known results.
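For orientation, the classical first-order instance of this kind of characterization (which the paper's higher order classes generalize) is standard: for a differentiable convex function $f$ on a convex set $K$, minimality is equivalent to a variational inequality:

```latex
u \in K \ \text{minimizes}\ f \ \text{over}\ K
\quad\Longleftrightarrow\quad
\langle \nabla f(u),\, v - u \rangle \ge 0
\quad \text{for all } v \in K .
```

The higher order versions replace the gradient term with conditions built from higher order derivatives; the exact form is as defined in the paper.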
A study of optimization problems and fixed point iterations in Banach spaces.
Doctoral Degree. University of KwaZulu-Natal, Durban. Abstract available in PDF.
Self-adaptive inertial algorithms for approximating solutions of split feasibility, monotone inclusion, variational inequality and fixed point problems.
Masters Degree. University of KwaZulu-Natal, Durban. In this dissertation, we introduce a self-adaptive hybrid inertial algorithm for approximating
a solution of split feasibility problem which also solves a monotone inclusion problem
and a fixed point problem in p-uniformly convex and uniformly smooth Banach spaces.
We prove a strong convergence theorem for the sequence generated by our algorithm, which
does not require prior knowledge of the norm of the bounded linear operator. Numerical
examples are given to compare the computational performance of our algorithm with other
existing algorithms.
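To illustrate the "no operator norm" idea in the simplest setting, here is a minimal self-adaptive CQ-type sketch in Euclidean space — not the dissertation's Banach-space algorithm. The stepsize is built entirely from computed residuals, so an estimate of ||A|| is never needed. The box constraint sets, the relaxation parameter rho, and the iteration counts are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n (stands in for C or Q)."""
    return np.clip(x, lo, hi)

def self_adaptive_cq(A, x0, lo_C, hi_C, lo_Q, hi_Q, rho=1.0, iters=500, tol=1e-10):
    """Self-adaptive CQ-type iteration for the split feasibility problem:
    find x in C with Ax in Q.  The stepsize uses only computed residuals,
    so prior knowledge of ||A|| is never required."""
    x = x0.astype(float)
    for _ in range(iters):
        Ax = A @ x
        r = Ax - project_box(Ax, lo_Q, hi_Q)   # residual (I - P_Q)Ax
        g = A.T @ r                            # gradient of f(x) = 0.5*||r||^2
        gn2 = g @ g
        if gn2 < tol:                          # (near-)feasible: stop
            break
        tau = rho * (r @ r) / gn2              # self-adaptive stepsize
        x = project_box(x - tau * g, lo_C, hi_C)
    return x
```

With C = Q = [0, 1]^2 and an invertible A, the iterates reach a feasible point in a handful of steps, with no spectral information about A supplied.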
Moreover, we present a new iterative algorithm of inertial form for solving Monotone Inclusion
Problem (MIP) and common Fixed Point Problem (FPP) of a finite family of
demimetric mappings in a real Hilbert space. Motivated by the Armijo line search technique,
we incorporate an inertial term to accelerate the convergence of the proposed
method. Under standard and mild assumptions of monotonicity and Lipschitz continuity
of the MIP associated mappings, we establish the strong convergence of the iterative
algorithm. Some numerical examples are presented to illustrate the performance of our
method and to compare it with its non-inertial version and some related methods in
the literature.
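The inertial idea described above can be sketched in its generic forward-backward form — this is a minimal Euclidean illustration, not the dissertation's exact method (no Armijo line search, no family of demimetric mappings). The extrapolation weight theta, stepsize lam, and iteration count are illustrative assumptions.

```python
import numpy as np

def inertial_forward_backward(grad_f, prox_g, x0, lam=0.5, theta=0.3, iters=200):
    """Generic inertial forward-backward iteration for the monotone inclusion
    0 in A(x) + B(x), with A = grad_f (single-valued, Lipschitz) and B given
    through its resolvent prox_g.  The extrapolated point y reuses the previous
    iterate to accelerate convergence."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + theta * (x - x_prev)                 # inertial extrapolation
        x_prev, x = x, prox_g(y - lam * grad_f(y))   # forward-backward step
    return x
```

For example, minimizing 0.5*||x - b||^2 over the nonnegative orthant (grad_f(x) = x - b, prox_g = componentwise max with 0) drives the iterates to max(b, 0).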
Furthermore, we propose a new modified self-adaptive inertial subgradient extragradient
algorithm in which the two projections are made onto some half spaces. Moreover, under
mild conditions, we obtain a strong convergence of the sequence generated by our proposed
algorithm for approximating a common solution of variational inequality problems
and common fixed points of a finite family of demicontractive mappings in a real Hilbert
space. The main advantages of our algorithm are: a strong convergence result obtained
without prior knowledge of the Lipschitz constant of the related monotone operator,
the two projections made onto half-spaces, and the inertial technique, which speeds
up the rate of convergence. Finally, we present an application and a numerical example to
illustrate the usefulness and applicability of our algorithm.
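The half-space trick referred to above is the key computational saving: the second projection of each step has a closed form. A minimal Euclidean sketch of the subgradient extragradient pattern (without the self-adaptive stepsize, inertia, or the family of demicontractive mappings of the actual algorithm) looks like this; the stepsize tau and iteration count are illustrative assumptions.

```python
import numpy as np

def project_halfspace(w, a, b):
    """Closed-form projection of w onto the half-space {z : a.z <= b}."""
    viol = a @ w - b
    if viol <= 0:
        return w
    return w - (viol / (a @ a)) * a

def subgradient_extragradient(F, proj_C, x0, tau=0.2, iters=500):
    """Subgradient extragradient iteration for the variational inequality
    over C: only the FIRST projection per step touches C; the second is onto
    an explicit half-space T_n containing C, computed in closed form."""
    x = x0.astype(float)
    for _ in range(iters):
        u = x - tau * F(x)
        y = proj_C(u)                      # first projection: onto C itself
        a = u - y                          # normal of T_n = {w : a.(w - y) <= 0}
        z = x - tau * F(y)
        if a @ a < 1e-16:                  # u already in C: T_n is the whole space
            x = z
        else:
            x = project_halfspace(z, a, a @ y)  # second projection: half-space
    return x
```

For the strongly monotone map F(x) = x - b over a box C, the iterates converge to the VI solution P_C(b), with each second projection costing only an inner product.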
A viscosity of Cesàro mean approximation method for split generalized equilibrium, variational inequality and fixed point problems
In this paper, we introduce and study an iterative viscosity approximation method obtained by modifying the Cesàro mean approximation for finding a common solution of split generalized equilibrium, variational inequality and fixed point problems. Under suitable conditions, we prove a strong convergence theorem for the sequences generated by the proposed iterative scheme. The results presented in this paper generalize, extend and improve the corresponding results of Shimizu an
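A minimal Euclidean sketch of the viscosity-plus-Cesàro-mean idea (not the paper's Hilbert-space scheme, and omitting the split equilibrium and variational inequality components): each step mixes a contraction f with the Cesàro average of the powers T^0 x, ..., T^n x of a nonexpansive map T. The vanishing weights 1/(n+2) and the iteration count are illustrative assumptions.

```python
import numpy as np

def cesaro_viscosity(T, f, x0, iters=300):
    """Viscosity iteration with a Cesaro mean: x_{n+1} = a_n f(x_n)
    + (1 - a_n) * (1/(n+2)) * sum_{k=0}^{n+1 terms} T^k x_n, with a_n -> 0."""
    x = x0.astype(float)
    for n in range(iters):
        alpha = 1.0 / (n + 2)              # vanishing viscosity weight
        s = x.copy()
        p = x.copy()
        for _ in range(n + 1):             # accumulate T^0 x, ..., T^{n+1} x
            p = T(p)
            s = s + p
        mean = s / (n + 2)                 # Cesaro mean of the powers of T
        x = alpha * f(x) + (1.0 - alpha) * mean
    return x
```

With T the projection onto the interval [1, 2] (so Fix(T) = [1, 2]) and f = 0, the iterates approach 1, the fixed point of T selected by the viscosity term.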
Nonlinear Analysis and Optimization with Applications
Nonlinear analysis has wide and significant applications in many areas of mathematics, including functional analysis, variational analysis, nonlinear optimization, convex analysis, nonlinear ordinary and partial differential equations, dynamical system theory, mathematical economics, game theory, signal processing, control theory, data mining, and so forth. Optimization problems have been intensively investigated, and various feasible methods for analyzing the convergence of algorithms have been developed over the last half century. In this Special Issue, we focus on the connection between nonlinear analysis and optimization, as well as on their applications that bring basic science to bear on the real world.