Solving a Smooth Approximation of the Sparse Recovery Problem Using the Three-Term Conjugate Gradient Algorithms

Qaraei, Mohammad Hossein | 2023

  1. Type of Document: M.Sc. Thesis
  2. Language: Farsi
  3. Document No: 56328 (02)
  4. University: Sharif University of Technology
  5. Department: Mathematical Sciences
  6. Advisor(s): Mahdavi Amiri, Nezamoddin
  7. Abstract:
  Line search based methods are known as one of the most efficient classes of iterative algorithms for solving unconstrained optimization problems. Among them, the conjugate gradient method is of particular importance for large-scale contemporary problems due to its simple structure, low memory requirement and strong convergence properties. In spite of its desirable numerical behavior, the conjugate gradient method generally lacks the descent property, even for uniformly convex objective functions. To overcome this defect, several effective modifications have been presented in the literature; among them, the three-term extension has attracted the attention of many researchers. Here, after stating some fundamental concepts of linear algebra and analysis, we study the three-term conjugate gradient method proposed by Zhang, Zhou and Li. As an essential modification of the Polak-Ribière-Polyak conjugate gradient method, Zhang, Zhou and Li proposed a three-term structure for its search direction (a sketch of this direction is given after the keywords below). A significant characteristic of the suggested approach is that the search direction satisfies the sufficient descent property regardless of the line search technique. Additionally, to accelerate the presented algorithm, an appropriate choice of the initial step length is put forward. The convergence analysis of the proposed method for general objective functions is addressed in detail. Then, to bring the numerical performance of the three-term conjugate gradient method to light, experimental tests are conducted on a set of classic problems from the CUTEr library. The results are assessed using the Dolan-Moré performance profiles and depict the merits of the proposed approach. In the continuation of our study, we investigate applying the presented algorithm to sparse recovery problems. Sparse solutions of underdetermined systems of linear equations can be found by solving a 1-norm minimization problem, although the resulting problem is challenging due to its nonsmoothness. An effective procedure for the sparse recovery problem can be obtained by drawing on Nesterov's smoothing technique (also sketched below). Since the three-term conjugate gradient algorithm is capable of solving large-scale problems in reasonable time and at acceptable computational cost, it is employed for the resulting large-scale smooth problems. Moreover, the algorithm is accelerated by constructing a successive loop for computing Nesterov's smoothing parameter. Numerical experiments on randomly generated test problems demonstrate the appropriateness of the signals recovered by the suggested conjugate gradient algorithm.
  8. Keywords:
  Unconstrained Optimization ; Nonlinear Optimization ; Conjugate Gradient Method ; Line Search ; Large Scale Optimization ; Sparse Recovery ; Global Convergence ; Sufficient Descent Condition
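For orientation, a minimal sketch of the three-term search direction, as it is commonly stated for the Zhang, Zhou and Li method, is given below; the notation (g_k for the gradient at the k-th iterate, d_k for the search direction) is an assumption made here and may differ from that of the thesis.

\[
d_k =
\begin{cases}
-g_k, & k = 0,\\[4pt]
-g_k + \beta_k^{\mathrm{PRP}}\, d_{k-1} - \theta_k\, y_{k-1}, & k \ge 1,
\end{cases}
\qquad
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^{2}},
\quad
\theta_k = \frac{g_k^{\top} d_{k-1}}{\|g_{k-1}\|^{2}},
\quad
y_{k-1} = g_k - g_{k-1}.
\]

A direct computation shows that the two extra terms cancel in the inner product with g_k, giving g_k^T d_k = -\|g_k\|^2 for every k; this is the sufficient descent condition referred to in the abstract, and it holds independently of the line search employed.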
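Likewise, a minimal sketch of the smooth approximation of the sparse recovery problem, assuming the standard Nesterov (Huber-type) smoothing of the 1-norm; the exact formulation used in the thesis may differ. Sparse solutions of an underdetermined system Ax = b, with A an m-by-n matrix and m < n, are sought through the 1-norm minimization problem, whose nonsmooth objective is replaced by a smoothed version with parameter \mu > 0:

\[
\min_{x \in \mathbb{R}^{n}} \|x\|_{1} \ \text{ subject to } \ Ax = b,
\qquad
\|x\|_{1} \approx f_{\mu}(x) = \sum_{i=1}^{n} h_{\mu}(x_i),
\qquad
h_{\mu}(t) =
\begin{cases}
\dfrac{t^{2}}{2\mu}, & |t| \le \mu,\\[6pt]
|t| - \dfrac{\mu}{2}, & |t| > \mu.
\end{cases}
\]

The function f_\mu is continuously differentiable with a Lipschitz-continuous gradient, so the smoothed problem can be handled by the three-term conjugate gradient algorithm; decreasing \mu in an outer loop (the successive loop mentioned in the abstract) tightens the approximation to the original 1-norm problem.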
