A sufficient descent modified nonlinear conjugate gradient method for solving large scale unconstrained optimization problems


Moyi A. U.
Abdullahi N.
Aliyu N.

Abstract

Nonlinear conjugate gradient (CG) methods are prominent iterative techniques widely used for solving large-scale unconstrained optimization problems. Their wide application in many fields is due to their simplicity, low memory requirements, low computational cost, and global convergence properties. However, some CG methods do not satisfy the sufficient descent condition, lack global convergence guarantees, or perform poorly in practice. To overcome these drawbacks, numerous modifications have been proposed to improve these methods. In this research, a modified conjugate gradient parameter that possesses the sufficient descent condition and global convergence properties is presented. The global convergence result is established under the strong Wolfe-Powell (SWP) line search conditions. Extensive numerical experiments were conducted on a set of standard unconstrained optimization test functions. The results show that the method outperforms some well-known methods in terms of efficiency and robustness.
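To make the setting concrete, the following is a minimal sketch of a generic nonlinear CG iteration with a bisection-based line search enforcing the strong Wolfe conditions. It uses the classical Fletcher-Reeves update for illustration only, since the paper's modified parameter is not given in the abstract; the test function, constants `c1`/`c2`, and tolerances are likewise illustrative choices, not the authors' experimental setup.

```python
# Generic nonlinear CG sketch with a strong Wolfe line search.
# NOTE: uses the classical Fletcher-Reeves beta for illustration; the
# paper's modified CG parameter is not reproduced here.
import math

def f(x):
    # illustrative convex test function with minimizer (1, -2)
    return (x[0] - 1.0) ** 2 + 5.0 * (x[1] + 2.0) ** 2

def grad_f(x):
    return [2.0 * (x[0] - 1.0), 10.0 * (x[1] + 2.0)]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def wolfe_line_search(f, grad_f, x, d, c1=1e-4, c2=0.1, max_iter=80):
    """Bisection search for a step alpha satisfying the strong Wolfe
    conditions: f(x+a d) <= f(x) + c1*a*g'd and |g(x+a d)'d| <= -c2*g'd."""
    lo, hi = 0.0, math.inf
    alpha = 1.0
    fx, g0d = f(x), dot(grad_f(x), d)   # g0d < 0 for a descent direction
    for _ in range(max_iter):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xn) > fx + c1 * alpha * g0d:      # Armijo fails: shrink
            hi = alpha
        else:
            gnd = dot(grad_f(xn), d)
            if abs(gnd) <= -c2 * g0d:          # strong curvature holds
                return alpha
            if gnd < 0.0:                      # step too short: grow
                lo = alpha
            else:
                hi = alpha
        alpha = (lo + hi) / 2.0 if hi < math.inf else 2.0 * alpha
    return alpha

def cg_minimize(f, grad_f, x0, tol=1e-6, max_iter=2000):
    x = list(x0)
    g = grad_f(x)
    d = [-gi for gi in g]                      # start with steepest descent
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        alpha = wolfe_line_search(f, grad_f, x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad_f(x)
        beta = dot(g_new, g_new) / dot(g, g)   # Fletcher-Reeves parameter
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0.0:               # restart if descent is lost
            d = [-gn for gn in g_new]
        g = g_new
    return x

x_star = cg_minimize(f, grad_f, [0.0, 0.0])
```

With the strong Wolfe curvature constant c2 < 1/2, the Fletcher-Reeves direction is guaranteed to be a descent direction; the restart line is an extra numerical safeguard. Modified CG parameters such as the one studied in this paper would replace the `beta` line while aiming to preserve this sufficient descent property.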


Journal Identifiers


eISSN: 1597-9962
print ISSN: 3026-9091