
Implementation of New Hybrid Conjugate Gradient Algorithms
Based on Modified BFGS Updates

Moshtagh, Mehrdad | 2012

  1. Type of Document: M.Sc. Thesis
  2. Language: Farsi
  3. Document No: 43500 (02)
  4. University: Sharif University of Technology
  5. Department: Mathematical Sciences
  6. Advisor(s): Mahdavi-Amiri, Nezam
  7. Abstract: We describe two modified secant equations proposed by Yuan, Li and Fukushima. First, we study the approach proposed by Andrei. Then, we explain two hybrid conjugate gradient methods for unconstrained optimization problems. The methods are hybridizations of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods (a generic form of such a hybridization is sketched after this list). It is shown that one of the algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. Two approaches for computing the initial steplength, proposed by Babaie, Fatemi and Mahdavi-Amiri and by Andrei, are used to accelerate the line search. We implement the algorithms and compare the performance of our programs with the results obtained by Andrei on unconstrained optimization test problems from the CUTEr collection. Using the performance profiles of Dolan and Moré on the comparative results, the effectiveness of using Andrei's line search method within the approach proposed by Babaie, Fatemi and Mahdavi-Amiri is demonstrated.
  8. Keywords: Unconstrained Optimization ; Global Convergence ; Hybrid Conjugate Gradient Method ; Modified BFGS Method
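
For orientation, the standard Hestenes-Stiefel and Dai-Yuan conjugate gradient parameters, together with a generic convex-combination hybridization of the kind studied in the thesis, are sketched below. This is only an assumed standard form; the particular choice of the hybridization parameter and the modified secant equations used in the thesis are not reproduced here.

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]
where \(g_k = \nabla f(x_k)\) and \(y_k = g_{k+1} - g_k\). The two base parameters are
\[
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^{2}}{d_k^{T} y_k},
\]
and a hybrid method uses a convex combination
\[
\beta_k = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY}, \qquad \theta_k \in [0, 1],
\]
with \(\theta_k\) typically chosen so that the resulting direction satisfies a conjugacy or descent condition derived from a (possibly modified) secant equation.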
