Acceleration algorithms for iterative methods

Shamsi, M ; Sharif University of Technology | 2023

  1. Type of Document: Article
  2. DOI: 10.1007/978-3-031-41130-4_19
  3. Publisher: Birkhäuser , 2023
  4. Abstract:
  5. In this chapter, we provide the reader with three fundamental acceleration approaches that are actively used in current signal processing research. Artificial Intelligence and Machine Learning algorithms are attracting ever more attention from researchers, and since the emergence of popular Deep Learning algorithms over the last two decades there has been a vast demand for processing resources; it is therefore necessary to understand the methods recently used to accelerate them. Since optimization methods form the core of various machine learning algorithms, we focus on first-order optimization methods. Because iterative algorithms reconstruct a signal step by step, as a converging series, it is natural to survey series acceleration methods for speeding up such algorithms; hence, we study six series acceleration algorithms. In the context of frame sequences, we study two polynomial acceleration methods originally drawn from the theory of acceleration methods in numerical linear algebra: Chebyshev acceleration and Conjugate Gradient acceleration. We summarize an article in which a modified version of Aitken's method is employed to accelerate iterative signal recovery algorithms, and finally demonstrate its ability to speed up iterative signal recovery from nonuniform samples. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
  6. Keywords:
  7. Source: Applied and Numerical Harmonic Analysis ; Volume Part F2077 , 2023 , Pages 521-552 ; 22965009 (ISSN)
  8. URL: https://www.sciencedirect.com/science/article/pii/S0165168419303998
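
The abstract mentions Aitken's method as a series acceleration technique for iterative recovery. The following is a minimal sketch of the classical Aitken Δ² formula only, not the modified variant discussed in the chapter; the function name and the Leibniz-series demo are illustrative choices, not taken from the source.

```python
import math


def aitken(s):
    """Apply Aitken's delta-squared extrapolation to a list of sequence terms.

    For consecutive terms s[n], s[n+1], s[n+2], the accelerated term is
        s[n] - (s[n+1] - s[n])**2 / (s[n+2] - 2*s[n+1] + s[n]).
    Returns a list two elements shorter than the input.
    """
    out = []
    for n in range(len(s) - 2):
        d1 = s[n + 1] - s[n]
        d2 = s[n + 2] - 2 * s[n + 1] + s[n]
        # Guard against a zero second difference (already-converged run).
        out.append(s[n] - d1 * d1 / d2 if d2 != 0 else s[n + 2])
    return out


# Demo: partial sums of the Leibniz series for pi/4 converge slowly;
# Aitken's method extrapolates them much closer to the limit.
partial, total = [], 0.0
for k in range(10):
    total += (-1) ** k / (2 * k + 1)
    partial.append(total)

accelerated = aitken(partial)
```

After ten slowly converging partial sums, the last accelerated term is markedly closer to π/4 than the last raw partial sum, which is the kind of speed-up the chapter surveys for iterative signal recovery.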