A NEW CONJUGATE GRADIENT WITH GLOBAL CONVERGENCE FOR NONLINEAR PROBLEMS
Abstract
The conjugate gradient (CG) method is one of the most popular and well-known iterative strategies for solving minimization problems. It has extensive applications in many domains, such as machine learning and neural networks, partly because of its simplicity in algebraic formulation and computer implementation, and partly because of its efficiency in solving large-scale unconstrained optimization problems. Fletcher and Reeves (1964) extended the concept to nonlinear problems, and their method is widely regarded as the first nonlinear conjugate gradient algorithm. Since then, many other versions of the conjugate gradient method have been proposed. In this paper, in Section 1 we derive a new conjugate gradient method for solving nonlinear minimization problems based on the parameter of Perry. In Section 2 we verify that it satisfies the descent and sufficient descent conditions. In Section 3 we study the global convergence of the new proposal. In Section 4 we present numerical results to demonstrate the efficacy of the suggested technique. Finally, we provide a conclusion.
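For readers unfamiliar with the method, the generic nonlinear CG iteration described in the abstract can be sketched as follows. This is a minimal illustration using the classical Fletcher–Reeves parameter with an Armijo backtracking line search, not the new Perry-based parameter derived in the paper; all function names and the test problem are illustrative.

```python
import numpy as np

def fr_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear CG with the Fletcher-Reeves parameter and an Armijo
    backtracking line search (illustrative sketch, not the paper's method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # restart if d is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative test problem: a strictly convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b (here x* = [0, 1]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = fr_cg(f, grad, [10.0, -10.0])
```

The restart to steepest descent whenever the updated direction fails the descent test is a common practical safeguard; the descent and sufficient descent conditions that the paper establishes for its new parameter guarantee this property analytically rather than by restarting.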
References
Fletcher, R., and Reeves, C. M. (1964). Function minimization by conjugate gradients. Computer Journal, 7, 149–154.
Dai, Y. H., and Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim., 10(1), 177–182.
Polak, E., and Ribière, G. (1969). Note sur la convergence de méthodes de directions conjuguées. Rev. Française Inform. Recherche Opérationnelle, 3, 35–43.
Birgin, E. G., and Martínez, J. M. (2001). A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim., 43, 117–128.
Hussein, A. Kh., and Shareef, S. G. (2019). A New Parameter Conjugate Gradient Method Based on Three Terms Unconstrained Optimization. General Letters in Mathematics, 7(1), 39–44.
Hussein, A. Kh., and Shareef, S. G. (2020). A new conjugate gradient method for unconstrained optimization problems with descent property. General Letters in Mathematics, 9(2).
Sun, J., and Zhang, J. (2001). Global convergence of conjugate gradient methods without line search. Annals of Operations Research, 103, 161–173.
Gilbert, J. C., and Nocedal, J. (1992). Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim., 2(1), 21–42.
Liu, Y., and Storey, C. (1991). Efficient generalized conjugate gradient algorithms, Part 1: Theory. Journal of Optimization Theory and Applications, 69, 129–137.
Perry, A. (1978). A modified conjugate gradient algorithm. Operations Research, 26(6), 1073–1078.
Fletcher, R. (1987). Practical Methods of Optimization, Vol. 1: Unconstrained Optimization. New York: John Wiley & Sons.
Raydan, M. (1997). The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim., 7, 26–33.
Hestenes, M. R., and Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand., 49, 409–436.
Hager, W. W., and Zhang, H. (2005). A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim., 16, 170–192.
Dai, Y. H., and Liao, L. Z. (2001). New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim., 43, 87–101.
Zhang, L., Zhou, W., and Li, D. H. (2006a). A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA Journal of Numerical Analysis, 26, 629–640.
Zhang, L., Zhou, W., and Li, D. H. (2006b). Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numerische Mathematik, 104, 561–572.
It is the policy of the Journal of Duhok University to own the copyright of the technical contributions it publishes, and to facilitate the appropriate re-use of the published materials by others. Photocopying is permitted, with credit and reference to the source, for individual use.
Copyright © 2017. All Rights Reserved.