A Stabilized SQP Method: Global and Superlinear Convergence, M.Sc. Thesis Sharif University of Technology ; Mahdavi Amiri, Nezamoddin (Supervisor)
Abstract
Stabilized sequential quadratic programming (sSQP) methods for nonlinear optimization generate a sequence of iterates with fast local convergence regardless of whether the active-constraint gradients are linearly dependent. Here, we are concerned with the local convergence analysis of an sSQP method, recently introduced in the literature, that uses a line search with a primal-dual augmented Lagrangian merit function to enforce global convergence. The method is provably well-defined and is based on solving a strictly convex quadratic programming subproblem at each iteration. It is shown that the method has superlinear local convergence under assumptions that are not stronger than those...
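To make the object of study concrete, the subproblem solved at each iteration of a stabilized SQP method is commonly written in the following min-max form (a standard sketch; the notation $g_k$, $H_k$, $J_k$, $c_k$, $y_k$, and the stabilization parameter $\mu_k > 0$ is assumed here and may differ from the thesis):

```latex
% Stabilized SQP subproblem at iterate (x_k, y_k):
%   g_k = \nabla f(x_k),  c_k = c(x_k),  J_k = \nabla c(x_k),
%   H_k \approx \nabla^2_{xx} L(x_k, y_k),  \mu_k > 0 a stabilization parameter.
\min_{d}\;\max_{y}\quad
  g_k^{\mathsf T} d
  + \tfrac{1}{2}\, d^{\mathsf T} H_k\, d
  + y^{\mathsf T}\bigl(c_k + J_k d\bigr)
  - \tfrac{\mu_k}{2}\,\lVert y - y_k \rVert^2
```

The quadratic penalty on $y - y_k$ regularizes the dual variables, which is why fast local convergence does not require linearly independent active-constraint gradients; when $H_k$ is chosen positive definite, the subproblem is equivalent to a strictly convex QP, as the abstract states.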