Scaled nonlinear conjugate gradient methods for nonlinear least squares problems

Dehghani, R ; Sharif University of Technology | 2018

  1. Type of Document: Article
  2. DOI: 10.1007/s11075-018-0591-2
  3. Publisher: Springer New York LLC , 2018
  4. Abstract:
  5. We propose a modified structured secant relation to obtain a more accurate approximation of the second-order curvature of the least squares objective function. Using this relation and an approach introduced by Andrei, we then propose three scaled nonlinear conjugate gradient methods for nonlinear least squares problems. An attractive feature of one of the proposed methods is that it satisfies the sufficient descent condition regardless of the line search and of the convexity of the objective function. We establish that all three algorithms are globally convergent: for one, under the assumption that the Jacobian matrix has full column rank on the level set, and for the other two, without this assumption. Numerical experiments on a collection of zero-residual and nonzero-residual test problems, reported using Dolan–Moré performance profiles, show that the advantage of the proposed algorithms is most pronounced on nonzero-residual and large problems. © 2018, Springer Science+Business Media, LLC, part of Springer Nature
  6. Keywords:
  7. Global convergences ; Nonlinear least squares ; Scaled nonlinear conjugate gradient ; Structured secant relation
  8. Source: Numerical Algorithms ; 2018 ; 1017-1398 (ISSN)
  9. URL: https://link.springer.com/article/10.1007%2Fs11075-018-0591-2
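The abstract's notion of a scaled nonlinear conjugate gradient method for a least squares objective f(x) = ½‖r(x)‖² can be illustrated with a minimal generic sketch. Note the hedges: the scaling below uses the standard spectral (Barzilai–Borwein) secant pair rather than the paper's modified structured secant relation, the direction update is an HS+ rule rather than Andrei's construction, and the line search is a plain Armijo backtracking; all function names (`residual`, `jacobian`, `scaled_cg`) and the Rosenbrock test problem are illustrative choices, not taken from the article.

```python
import numpy as np

# Rosenbrock written in least-squares form: f(x) = 0.5 * ||r(x)||^2,
# with residuals r1 = 10*(x2 - x1^2), r2 = 1 - x1 (illustrative test problem).
def residual(x):
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def jacobian(x):
    return np.array([[-20.0 * x[0], 10.0],
                     [-1.0, 0.0]])

def grad(x):
    # Gradient of the least squares objective: J(x)^T r(x).
    return jacobian(x).T @ residual(x)

def fval(x):
    r = residual(x)
    return 0.5 * (r @ r)

def armijo(x, d, g, alpha=1.0, c=1e-4, rho=0.5):
    # Simple backtracking Armijo search (the paper uses a Wolfe-type search).
    f0, slope = fval(x), g @ d
    while fval(x + alpha * d) > f0 + c * alpha * slope:
        alpha *= rho
    return alpha

def scaled_cg(x, tol=1e-8, max_iter=2000):
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo(x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral scaling from the ordinary secant pair (s, y); the paper
        # instead derives the scaling from a structured secant relation.
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
        beta = max(0.0, (g_new @ y) / (d @ y)) if abs(d @ y) > 1e-12 else 0.0
        d = -theta * g_new + beta * d
        if g_new @ d > -1e-12 * (g_new @ g_new):
            d = -theta * g_new  # safeguard: restart along the scaled steepest descent
        x, g = x_new, g_new
    return x

x_star = scaled_cg(np.array([-1.2, 1.0]))
print(np.round(x_star, 4))
```

The descent safeguard guarantees the Armijo loop always terminates, and for a zero-residual problem like this one the iterates approach the minimizer (1, 1), where f vanishes.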