Steepest descent method for unconstrained optimization

Teyib M. M. Babou, Amhammed A. A. Ali

Abstract

The gradient method is the basis for many nonlinear optimization methods and is also one of the methods used to solve large-scale unconstrained optimization problems, requiring little storage compared with its counterparts. In this paper we give a detailed presentation of the gradient method with Armijo's rule, and then methods to improve its performance. The first of these is the Barzilai-Borwein method, which supplies the step length along the steepest descent direction without the need for a line search; however, the Barzilai-Borwein method is not always convergent. To address this problem, we present an algorithm due to Marcos Raydan, which combines the Barzilai-Borwein method with the nonmonotone line search of Grippo, Lampariello, and Lucidi. Finally, we carried out numerical tests of these methods using a Matlab computer program.
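The combination the abstract describes, a Barzilai-Borwein step length safeguarded by the Grippo-Lampariello-Lucidi nonmonotone line search, can be sketched as follows. The paper's experiments were run in Matlab; this is an illustrative Python sketch, not the authors' code, and the parameter values (memory `M`, sufficient-decrease constant `gamma`, backtracking factor) are assumptions chosen for the example.

```python
import numpy as np

def gbb(f, grad, x0, M=10, gamma=1e-4, max_iter=500, tol=1e-8):
    """Sketch of a global Barzilai-Borwein method in the spirit of Raydan:
    the BB1 step length is the trial step, and the Grippo-Lampariello-Lucidi
    nonmonotone Armijo test safeguards it. Parameters are illustrative."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                  # initial trial step length
    history = [f(x)]             # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        lam = alpha
        fmax = max(history[-M:])  # reference value: max of the last M iterates
        # Nonmonotone Armijo backtracking along the steepest descent direction -g:
        # accept lam when f(x - lam*g) <= fmax - gamma * lam * ||g||^2
        while f(x - lam * g) > fmax - gamma * lam * g.dot(g):
            lam *= 0.5
        s = -lam * g              # accepted step
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # BB1 step length s's / s'y for the next iteration, with a safeguard
        sy = s.dot(y)
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
        history.append(f(x))
    return x

# Example: minimize an ill-conditioned quadratic f(x) = 0.5 x'Ax
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x.dot(A @ x)
grad = lambda x: A @ x
x_star = gbb(f, grad, np.array([1.0, 1.0, 1.0]))
```

On a quadratic like this, the plain BB step length is what makes the method dramatically faster than fixed-step steepest descent; the nonmonotone test only intervenes when a BB step would increase the objective beyond the worst of the last `M` iterates.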

Authors

Teyib M. M. Babou, Amhammed A. A. Ali

Article Details

How to Cite

Steepest descent method for unconstrained optimization. (2019). Journal of Pure & Applied Sciences, 18(4). https://doi.org/10.51984/jopas.v18i4.512
