A Descent Four-Term Liu and Storey Conjugate Gradient Method for Large-Scale Unconstrained Optimization Problems
The conjugate gradient (CG) method is a useful tool for finding the optimum of unconstrained optimization problems, since it requires neither second derivatives nor their approximations. Moreover, CG methods can be applied in many fields, such as machine learning, deep learning, and neural networks. This paper constructs a four-term conjugate gradient method that satisfies the descent property and convergence properties for reaching a stationary point. The new modification is built on the Liu and Storey conjugate gradient method, together with a two-term and a three-term conjugate gradient method. To analyze efficiency and robustness, we tested more than 150 optimization functions from the CUTEst library with different dimensions and shapes. The numerical results show that the new modification outperforms recent conjugate gradient methods, such as CG-Descent and Dai-Liao, in terms of the number of function evaluations, number of gradient evaluations, number of iterations, and CPU time.
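For readers unfamiliar with the baseline the paper modifies, the following is a minimal sketch of the classical two-term Liu and Storey (LS) conjugate gradient method with a backtracking Armijo line search. It is not the paper's four-term modification (whose update formula is not given in the abstract); the function names and line-search constants are illustrative choices.

```python
import numpy as np

def ls_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical two-term Liu-Storey CG with Armijo backtracking.

    Note: this is the standard LS method, shown only as background;
    the paper's four-term modification is not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (c1 is a typical choice)
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Liu-Storey coefficient: beta = g_{k+1}^T y_k / (-d_k^T g_k)
        beta = g_new.dot(y) / (-d.dot(g))
        d = -g_new + beta * d  # two-term search direction
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic with minimizer (3, -1):
x_star = ls_cg(lambda x: (x[0] - 3)**2 + (x[1] + 1)**2,
               lambda x: np.array([2 * (x[0] - 3), 2 * (x[1] + 1)]),
               np.array([0.0, 0.0]))
```

The four-term variants discussed in the paper add extra correction terms to this two-term direction so that the descent property holds independently of the line search.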
Upon acceptance of an article by the journal, the author(s) transfer copyright of the article to the European Journal of Pure and Applied Mathematics, which becomes the copyright holder.