
    A class of four parametric with- and without memory root finding methods

    [EN] In this paper, we have constructed a derivative-free weighted eighth-order iterative method, with and without memory, for solving nonlinear equations. The method is optimal, as it satisfies the Kung-Traub conjecture. We use four accelerating parameters, together with a univariate and a multivariate weight function at the second and third steps of the method, respectively. The method is converted into a with-memory method by approximating the parameters using Newton interpolating polynomials of appropriate degree, which increases the order of convergence to 15.51560 with an efficiency index of nearly two. Our methods are compared numerically with recent methods from the same domain. This research was partially supported by Ministerio de Economía y Competitividad MTM2014-52016-C2-2-P, Generalitat Valenciana PROMETEO/2016/089, and the Schlumberger Foundation Faculty for the Future Program. Zafar, F.; Cordero Barbero, A.; Torregrosa Sánchez, JR.; Rafi, A. (2019). A class of four parametric with- and without memory root finding methods. Computational and Mathematical Methods, 1-14. https://doi.org/10.1002/cmm4.1024
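    The abstract's key idea, re-estimating free accelerating parameters from stored iterates so the convergence order rises without extra function evaluations, can be illustrated with a much simpler classic: the derivative-free Traub-Steffensen method, whose parameter gamma is updated "with memory" from the two latest iterates. This is only a sketch of the general with-memory mechanism, not the paper's eighth-order four-parameter scheme.

```python
# Illustrative sketch (NOT the paper's method): Traub-Steffensen iteration
# with a "memory" update of the accelerating parameter gamma. Re-estimating
# gamma from stored iterates lifts the order from 2 to about 2.414 at no
# extra cost per step -- the same principle the abstract's method uses to
# reach order 15.51560 from an optimal eighth-order base.

def with_memory_root(f, x0, gamma=0.01, tol=1e-14, max_iter=50):
    xk = x0
    fx = f(xk)
    for _ in range(max_iter):
        wk = xk + gamma * fx                 # derivative-free auxiliary point
        fw = f(wk)
        if fw == fx:                         # divided difference would vanish
            break
        # Traub-Steffensen step: replace f'(x) by the divided difference f[x, w]
        x_next = xk - fx * (wk - xk) / (fw - fx)
        f_next = f(x_next)
        if x_next != xk and f_next != fx:
            # "memory": secant estimate of -1/f'(root) from stored iterates
            gamma = -(x_next - xk) / (f_next - fx)
        xk, fx = x_next, f_next
        if abs(fx) < tol:
            break
    return xk
```

For example, `with_memory_root(lambda x: x * x - 2, 1.0)` converges to the square root of 2 in a handful of iterations despite never evaluating a derivative.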

    A von Neumann Alternating Method for Finding Common Solutions to Variational Inequalities

    Modifying von Neumann's alternating projections algorithm, we obtain an alternating method for solving the recently introduced Common Solutions to Variational Inequalities Problem (CSVIP). For simplicity, we mainly confine our attention to the two-set CSVIP, which entails finding common solutions to two unrelated variational inequalities in Hilbert space. Comment: Nonlinear Analysis Series A: Theory, Methods & Applications, accepted for publication
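    The basic scheme being modified can be sketched concretely: alternate projections onto two closed convex sets until the iterates settle in their intersection. The sets below (a line and a disk in the plane) are illustrative choices, not from the paper, and the code shows only the classical alternating-projections backbone, not the CSVIP modification.

```python
import numpy as np

# Minimal sketch of von Neumann-style alternating projections in R^2:
# the Hilbert-space scheme of the abstract, specialised to two simple
# closed convex sets (a line and a disk) chosen here for illustration.

def project_line(p):
    # Orthogonal projection onto the line y = x
    t = (p[0] + p[1]) / 2.0
    return np.array([t, t])

def project_disk(p, c=np.array([2.0, 0.0]), r=1.5):
    # Orthogonal projection onto the closed disk of radius r centred at c
    d = p - c
    n = np.linalg.norm(d)
    return p.copy() if n <= r else c + r * d / n

def alternating_projections(p, iters=200):
    # Alternate the two projections; for convex sets with a common point
    # the iterates converge to a point in the intersection.
    for _ in range(iters):
        p = project_disk(project_line(p))
    return p
```

Because the line passes through the disk's interior, the two sets intersect transversally and convergence is linear; the limit lies in both sets.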

    Convergence Analysis of an Inexact Feasible Interior Point Method for Convex Quadratic Programming

    In this paper we discuss two variants of an inexact feasible interior point algorithm for convex quadratic programming. We consider two different neighbourhoods of the central path: a (small) one induced by the Euclidean norm, which yields a short-step algorithm, and a symmetric one induced by the infinity norm, which yields a (practical) long-step algorithm. Both algorithms allow the Newton equation system to be solved inexactly. For both algorithms we provide conditions on the level of error acceptable in the Newton equation and establish worst-case complexity results.
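    The two neighbourhoods mentioned can be written down directly. The membership tests below use the standard textbook definitions of the Euclidean-norm neighbourhood N_2 and the symmetric infinity-norm neighbourhood; the threshold values theta and gamma are illustrative, not the paper's.

```python
import numpy as np

# Standard central-path neighbourhoods for a primal-dual pair (x, s) > 0
# with duality measure mu = x's / n. Threshold values are illustrative.

def in_N2(x, s, theta=0.1):
    # Euclidean-norm (short-step) neighbourhood: ||XSe - mu e||_2 <= theta * mu
    mu = np.dot(x, s) / len(x)
    return np.linalg.norm(x * s - mu) <= theta * mu

def in_Ninf_sym(x, s, gamma=0.1):
    # Symmetric infinity-norm (long-step) neighbourhood:
    # gamma * mu <= x_i * s_i <= mu / gamma for every i
    mu = np.dot(x, s) / len(x)
    xs = x * s
    return bool(np.all(xs >= gamma * mu) and np.all(xs <= mu / gamma))
```

A point with one large complementarity product, e.g. `x = (1, 1, 1)`, `s = (1, 1, 5)`, falls outside N_2 but inside the symmetric neighbourhood, which is why the infinity-norm variant admits longer steps in practice.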

    Transformation Method for Solving Hamilton-Jacobi-Bellman Equation for Constrained Dynamic Stochastic Optimal Allocation Problem

    In this paper we propose and analyze a method based on the Riccati transformation for solving the evolutionary Hamilton-Jacobi-Bellman equation arising from the stochastic dynamic optimal allocation problem. We show how the fully nonlinear Hamilton-Jacobi-Bellman equation can be transformed into a quasi-linear parabolic equation whose diffusion function is obtained as the value function of a certain parametric convex optimization problem. Although the diffusion function need not be sufficiently smooth, we are able to prove existence and uniqueness of classical Hölder-smooth solutions and derive useful bounds on them. We furthermore construct a fully implicit iterative numerical scheme based on a finite volume approximation of the governing equation. The numerical solution is compared to a semi-explicit traveling wave solution by means of the convergence ratio of the method. As an application, we compute optimal strategies for a portfolio investment problem motivated by the German DAX 30 Index.
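    The "diffusion function obtained as the value function of a parametric convex optimization problem" can be sketched for a toy portfolio: for each value of the transformed variable phi, minimise a risk-penalised objective over portfolio weights. The two-asset data below (means, variances, uncorrelated returns) are assumed for illustration and are not the paper's DAX 30 calibration; the minimisation is done by a simple grid search rather than the paper's scheme.

```python
import numpy as np

# Sketch of a diffusion function alpha(phi) defined as the value of a
# parametric convex program over portfolio weights theta in [0, 1].
# Asset data are illustrative assumptions, not calibrated to DAX 30.
MU = np.array([0.08, 0.03])          # assumed mean returns of two assets
SIG2 = np.array([0.2**2, 0.05**2])   # assumed variances (uncorrelated)

def alpha(phi, n=10001):
    theta = np.linspace(0.0, 1.0, n)         # weight of asset 1
    w = np.stack([theta, 1.0 - theta])       # full weight vectors
    mean = MU @ w                            # portfolio mean return
    var = SIG2 @ w**2                        # portfolio variance
    # Value function of the parametric program: minimise risk penalty
    # minus return; phi plays the role of the transformed risk aversion.
    return float(np.min(0.5 * phi * var - mean))
```

As the pointwise minimum of functions affine in phi, alpha is concave and nondecreasing in phi, which is the kind of structural property the quasi-linear parabolic equation inherits from the optimisation.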