The steepest descent method is a simple gradient method for
optimization. The method converges slowly toward the optimal
solution because its steps follow a zigzag pattern. Barzilai and
Borwein modified the algorithm so that it performs well on problems
with large dimensions. The results of the Barzilai and Borwein
method have sparked a great deal of research on the steepest descent
method, including the alternate minimization gradient method and the
Yuan method. Inspired by these works, we modify the step size of the
steepest descent method. We then compare the modified method against
the Barzilai and Borwein method, the alternate minimization gradient
method, and the Yuan method on quadratic function cases, in terms of
the number of iterations and the running time. The average results
indicate that the steepest descent method with the new step sizes
provides good results for small dimensions and is able to compete
with the Barzilai and Borwein method and the alternate minimization
gradient method for large dimensions. The new step sizes converge
faster than the other methods, especially for cases with large
dimensions.
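As a rough illustration of the comparison above, the sketch below
contrasts the classical exact-line-search (Cauchy) step with the
standard Barzilai-Borwein step on a convex quadratic. It is a minimal
sketch, not the authors' algorithm: it assumes NumPy, an illustrative
ill-conditioned diagonal test matrix, and the BB1 step size from [1]
rather than the new step sizes proposed in the paper.

    import numpy as np

    # Minimal sketch (not the authors' method): steepest descent on the
    # convex quadratic f(x) = 0.5 x'Ax - b'x, with gradient g(x) = Ax - b.
    # "exact" uses the classical Cauchy step alpha = (g'g)/(g'Ag); "bb"
    # uses the Barzilai-Borwein (BB1) two-point step alpha = (s's)/(s'y)
    # with s = x_k - x_{k-1}, y = g_k - g_{k-1} [1]. The test matrix,
    # tolerance, and iteration cap are illustrative choices.
    def steepest_descent(A, b, x0, step="exact", tol=1e-8, max_iter=100000):
        x = x0.copy()
        g = A @ x - b
        x_prev = g_prev = None
        for k in range(max_iter):
            if np.linalg.norm(g) < tol:
                return x, k
            if step == "bb" and x_prev is not None:
                s, y = x - x_prev, g - g_prev
                alpha = (s @ s) / (s @ y)        # BB1 step size
            else:
                alpha = (g @ g) / (g @ (A @ g))  # exact line-search step
            x_prev, g_prev = x, g
            x = x - alpha * g                    # move along -gradient
            g = A @ x - b
        return x, max_iter

    # Ill-conditioned diagonal quadratic: plain steepest descent zigzags,
    # while the BB step typically needs far fewer iterations.
    n = 50
    A = np.diag(np.linspace(1.0, 100.0, n))
    b = np.ones(n)
    x0 = np.zeros(n)
    _, it_exact = steepest_descent(A, b, x0, step="exact")
    _, it_bb = steepest_descent(A, b, x0, step="bb")
    print(f"exact line search: {it_exact} iterations, BB: {it_bb} iterations")

On an ill-conditioned problem like this one, the BB variant usually
reaches the tolerance in far fewer iterations than the exact-step
variant; that gap is the behavior the abstract describes and the
baseline against which the new step sizes are measured.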
[1] Barzilai J, Borwein JM. 1988. Two-point step size gradient methods.
IMA Journal of Numerical Analysis, 8: 141-148.
[2] Bazaraa MS, Sherali HD, Shetty CM. 2006. Nonlinear Programming:
Theory and Algorithms. USA: Wiley-Interscience.
[3] Cauchy A. 1847. General method for solving systems of simultaneous
equations. Comp. Rend. Sci. Paris, 25: 46-89.
[4] Dai YH, Yuan Y. 2003. Alternate minimization gradient method. IMA
Journal of Numerical Analysis, 23: 377-393.
[5] Griva I, Nash SG, Sofer A. 2009. Linear and Nonlinear Optimization.
USA: Society for Industrial and Applied Mathematics.
[6] Sun W, Yuan Y. 2006. Optimization Theory and Methods: Nonlinear
Programming. New York: Springer Science+Business Media.
[7] Yuan Y. 2006. A new stepsize for the steepest descent method. Journal
of Computational Mathematics, 24: 149-156.
Keywords: Convergence, iteration, line search, running time, steepest
descent, unconstrained optimization.