Steepest descent method for unconstrained optimization

Teyib M. M. Babou , Amhammed A. A. Ali (1)
(1) , Libya

Abstract

The gradient method is the basis for many nonlinear optimization methods, and it is also one of the methods used to solve large-scale unconstrained optimization problems, since it requires little storage compared to its counterparts. In this paper we give a detailed presentation of the gradient method with Armijo's rule, and then a method to improve its performance: the Barzilai-Borwein method, which provides the step length along the steepest descent direction without the need for a line search. However, the Barzilai-Borwein method is not always convergent. To solve this problem, we present an algorithm due to Marcos Raydan, which combines the Barzilai-Borwein method with the nonmonotone line search of Grippo-Lampariello-Lucidi. Finally, we report numerical tests of these methods carried out with a Matlab program.
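The combination described in the abstract — a Barzilai-Borwein step length safeguarded by the Grippo-Lampariello-Lucidi nonmonotone Armijo test — can be sketched as follows. This is a minimal illustration in Python (the paper's own experiments use Matlab); the parameter values (memory `M`, sufficient-decrease constant `gamma`, the backtracking factor, and the BB safeguard) are assumptions for the sketch, not values taken from the paper.

```python
import numpy as np

def nonmonotone_bb(f, grad, x0, M=10, gamma=1e-4, tol=1e-6, max_iter=1000):
    """Barzilai-Borwein gradient method globalized with the
    Grippo-Lampariello-Lucidi nonmonotone line search (a sketch of
    Raydan's approach; parameter choices are illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    f_hist = [f(x)]              # recent function values for the GLL test
    alpha = 1.0                  # initial trial step length (assumed)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        lam = alpha
        f_ref = max(f_hist[-M:])  # nonmonotone reference: max of last M values
        gg = g @ g
        # Backtrack until the nonmonotone Armijo condition holds:
        # f(x - lam g) <= max_{j<=M} f_{k-j} - gamma * lam * ||g||^2
        while f(x - lam * g) > f_ref - gamma * lam * gg:
            lam *= 0.5
        x_new = x - lam * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 step length for the next iteration, safeguarded to stay positive
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonmonotone_bb(f, grad, np.zeros(2))
```

Note that the GLL test compares against the maximum of the last `M` function values rather than the current one, which lets the BB step's characteristic nonmonotone behavior through while still guaranteeing convergence.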


Babou, T. M. M., & Ali, A. A. A. (2019). Steepest descent method for unconstrained optimization. Journal of Pure & Applied Sciences, 18(4). https://doi.org/10.51984/jopas.v18i4.512
