
A conjugate gradient sampling method for nonsmooth optimization

Mahdavi Amiri, N. ; Sharif University of Technology | 2020

Type of Document: Article
DOI: 10.1007/s10288-019-00404-2
Publisher: Springer, 2020
Abstract: We present an algorithm for minimizing locally Lipschitz functions that are continuously differentiable on an open dense subset of R^n. The function may be nonsmooth and/or nonconvex. The method combines a gradient sampling method with a conjugate gradient scheme. To compute search directions, we use a sequence of positive definite approximate Hessians based on conjugate gradient matrices. The algorithm employs a restart procedure to improve upon poor search directions and to ensure that the approximate Hessians remain bounded. The global convergence of the algorithm is established. An implementation of the algorithm is executed on a collection of well-known test problems. Comparative numerical results based on the Dolan–Moré performance profiles clearly show that the algorithm outperforms some recent well-known nonsmooth algorithms. © 2019, Springer-Verlag GmbH Germany, part of Springer Nature
Keywords: Conjugate gradient ; Gradient sampling ; Lipschitz function ; Nonsmooth optimization ; Positive definite approximate Hessians
Source: 4OR ; Volume 18, Issue 1, May 2020, Pages 73-90
URL: https://link.springer.com/article/10.1007%2Fs10288-019-00404-2
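
For readers unfamiliar with the gradient sampling idea that the abstract builds on, the sketch below illustrates one plain gradient sampling step in the style of Burke, Lewis, and Overton. It is not the paper's conjugate-gradient variant (that method additionally builds positive definite approximate Hessians and a restart procedure, as described above). All names here (`f`, `grad`, `eps`, `m`, `beta`) are illustrative assumptions, and the min-norm subproblem is solved with SciPy's SLSQP purely for self-containment.

```python
# A minimal, hedged sketch of one gradient sampling step, NOT the
# authors' conjugate gradient sampling method from the paper above.
import numpy as np
from scipy.optimize import minimize

def min_norm_element(G):
    """Min-norm point of the convex hull of the rows of G:
    minimize ||G^T lam||^2 over the simplex {lam >= 0, sum(lam) = 1}."""
    k = G.shape[0]
    res = minimize(
        lambda lam: np.dot(G.T @ lam, G.T @ lam),
        x0=np.full(k, 1.0 / k),
        jac=lambda lam: 2.0 * (G @ (G.T @ lam)),
        bounds=[(0.0, 1.0)] * k,
        constraints={"type": "eq", "fun": lambda lam: lam.sum() - 1.0},
        method="SLSQP",
    )
    return G.T @ res.x

def gradient_sampling_step(f, grad, x, eps=0.1, m=10, beta=1e-4, rng=None):
    """One step: sample gradients in an eps-ball around x, take the
    negative min-norm convex combination as the search direction,
    then backtrack with an Armijo-style condition."""
    rng = np.random.default_rng(rng)
    # Gradients at x and at m randomly sampled nearby points; sampling
    # stabilizes the direction where f is nonsmooth.
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
    G = np.vstack([grad(x)] + [grad(p) for p in pts])
    g = min_norm_element(G)
    d = -g
    t = 1.0
    while f(x + t * d) > f(x) - beta * t * np.dot(g, g) and t > 1e-12:
        t *= 0.5  # halve the step until sufficient decrease
    return x + t * d, np.linalg.norm(g)

# Illustrative usage on a simple nonsmooth convex function.
if __name__ == "__main__":
    f = lambda x: abs(x[0]) + 2.0 * x[1] ** 2
    grad = lambda x: np.array([np.sign(x[0]), 4.0 * x[1]])
    x = np.array([1.0, 1.0])
    rng = np.random.default_rng(0)
    for _ in range(30):
        x, res_norm = gradient_sampling_step(f, grad, x, rng=rng)
    print(x, res_norm)
```

In full gradient sampling methods, the sampling radius `eps` is shrunk as the min-norm residual becomes small; the paper's contribution, per the abstract, is to replace the plain steepest-descent-like direction with one driven by conjugate gradient based approximate Hessians.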