Search for: global-convergence
Total 27 records

    Operations research and optimization (ORO)

    , Article Optimization ; Volume 62, Issue 6 , Jan , 2013 , Pages 675-691 ; 02331934 (ISSN) Mahdavi Amiri, N ; Ansari, M. R ; Sharif University of Technology
    2013
    Abstract
    We have recently proposed a structured algorithm for solving constrained nonlinear least-squares problems and established its local two-step Q-superlinear convergence rate. The approach is based on an earlier adaptive structured scheme for the exact penalty method due to Mahdavi-Amiri and Bartels. The structured adaptation makes use of the ideas of Nocedal and Overton for handling quasi-Newton updates of projected Hessians and adapts a structuring scheme due to Engels and Martinez. For robustness, we have employed a specific nonsmooth line search strategy, taking account of the least-squares objective. Numerical results also confirm the practical relevance of our special considerations for... 
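    As background for the structuring idea (standard least-squares facts, not the specific updates of this article), the objective, gradient and Hessian of a least-squares problem with residual map r : R^n -> R^m and Jacobian J are

        f(x) = \tfrac{1}{2}\,\|r(x)\|_2^2, \qquad
        \nabla f(x) = J(x)^{\mathsf T} r(x), \qquad
        \nabla^2 f(x) = J(x)^{\mathsf T} J(x) + \sum_{i=1}^{m} r_i(x)\,\nabla^2 r_i(x),

    and structured quasi-Newton schemes keep the readily available Gauss-Newton term J^T J exact while approximating only the residual-curvature sum.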

    Two effective hybrid conjugate gradient algorithms based on modified BFGS updates

    , Article Numerical Algorithms ; Volume 58, Issue 3 , 2011 , Pages 315-331 ; 10171398 (ISSN) Babaie Kafaki, S ; Fatemi, M ; Mahdavi Amiri, N ; Sharif University of Technology
    Abstract
    Based on two modified secant equations proposed by Yuan, and Li and Fukushima, we extend the approach proposed by Andrei, and introduce two hybrid conjugate gradient methods for unconstrained optimization problems. Our methods are hybridizations of Hestenes-Stiefel and Dai-Yuan conjugate gradient methods. Under proper conditions, we show that one of the proposed algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. To enhance the performance of the line search procedure, we propose a new approach for computing the initial value of the steplength for initiating the line search procedure. We give a comparison of the... 
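    For reference, the two conjugate gradient parameters being hybridized are the standard Hestenes-Stiefel and Dai-Yuan formulas; the convex-combination weight below is generic, and the paper's specific choice of the weight, driven by the modified secant equations, is not reproduced here:

        \beta_k^{HS} = \frac{g_{k+1}^{\mathsf T} y_k}{d_k^{\mathsf T} y_k}, \qquad
        \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\mathsf T} y_k}, \qquad
        \beta_k = (1-\theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY}, \quad \theta_k \in [0,1],

    with y_k = g_{k+1} - g_k and search direction d_{k+1} = -g_{k+1} + \beta_k d_k.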

    A new trust-region method for solving systems of equalities and inequalities

    , Article Computational and Applied Mathematics ; Volume 36, Issue 1 , 2017 , Pages 769-790 ; 01018205 (ISSN) Saeidian, Z ; Peyghami, M. R ; Habibi, M ; Ghasemi, S ; Sharif University of Technology
    Springer Science and Business Media, LLC  2017
    Abstract
    In this paper, we propose a new trust-region method for solving nonlinear systems of equalities and inequalities. The algorithm combines both standard and adaptive trust-region frameworks to construct the steps of the algorithm. The trust-region subproblem is solved in the first iteration using a given initial radius. Then, in each iteration, the standard trust-region method is followed whenever the current trial step is accepted, otherwise, the subproblem is resolved using an adaptive scheme. The convergence results for the new proposed algorithm are established under some mild and standard assumptions. Numerical results on some least-squares test problems show the efficiency and... 
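    The accept/reject logic described here follows the familiar trust-region pattern. The Python sketch below applies that pattern to the merit function 0.5*||F(x)||^2 using simple Cauchy-point steps; it is only a generic illustration and implements neither the inequality handling nor the specific adaptive radius formula of the paper.

    import numpy as np

    def trust_region_cauchy(F, J, x0, delta0=1.0, tol=1e-8, max_iter=200):
        """Generic trust-region sketch for min 0.5*||F(x)||^2 (F: residuals, J: Jacobian)."""
        x, delta = np.asarray(x0, dtype=float), delta0
        for _ in range(max_iter):
            Fx, Jx = F(x), J(x)
            g = Jx.T @ Fx                                   # gradient of the merit function
            gnorm = np.linalg.norm(g)
            if gnorm < tol:
                break
            Jg = Jx @ g
            curv = Jg @ Jg                                  # curvature g'J'Jg of the model along -g
            t_unc = (g @ g) / curv if curv > 0 else np.inf  # unconstrained minimizer along -g
            t = min(t_unc, delta / gnorm)                   # clip to the trust region
            s = -t * g                                      # Cauchy step
            pred = -(g @ s) - 0.5 * (Jx @ s) @ (Jx @ s)     # predicted decrease of the model
            ared = 0.5 * (Fx @ Fx) - 0.5 * (F(x + s) @ F(x + s))  # actual decrease
            rho = ared / pred if pred > 0 else -1.0
            if rho > 0.1:                                   # trial step accepted
                x = x + s
                if rho > 0.75:
                    delta *= 2.0                            # very successful: enlarge the radius
            else:
                delta *= 0.5                                # rejected: shrink (stand-in for the adaptive scheme)
        return x

    Calling trust_region_cauchy(F, J, x0), with F returning the residual vector and J its Jacobian, drives ||F(x)|| toward a local minimum.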

    Scaled nonlinear conjugate gradient methods for nonlinear least squares problems

    , Article Numerical Algorithms ; 2018 ; 10171398 (ISSN) Dehghani, R ; Mahdavi Amiri, N ; Sharif University of Technology
    Springer New York LLC  2018
    Abstract
    We propose a modified structured secant relation to get a more accurate approximation of the second-order curvature of the least squares objective function. Then, using this relation and an approach introduced by Andrei, we propose three scaled nonlinear conjugate gradient methods for nonlinear least squares problems. An attractive feature of one of the proposed methods is satisfaction of the sufficient descent condition regardless of the line search and of the convexity of the objective function. We establish that the three proposed algorithms are globally convergent, under the assumption of the Jacobian matrix having full column rank on the level set for one, and without such assumption for the other... 
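    For orientation, the classical structured secant condition for least squares (for example, the Dennis-Gay-Welsch choice) asks the approximation A_{k+1} of the residual-curvature term to satisfy

        A_{k+1} s_k = \big(J_{k+1} - J_k\big)^{\mathsf T} r_{k+1}, \qquad s_k = x_{k+1} - x_k;

    the modified structured secant relation of this paper refines the right-hand side, and the exact modification is not reproduced in this excerpt.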

    Scaled nonlinear conjugate gradient methods for nonlinear least squares problems

    , Article Numerical Algorithms ; Volume 82, Issue 1 , 2019 ; 10171398 (ISSN) Dehghani, R ; Mahdavi Amiri, N ; Sharif University of Technology
    Springer New York LLC  2019
    Abstract
    We propose a modified structured secant relation to get a more accurate approximation of the second-order curvature of the least squares objective function. Then, using this relation and an approach introduced by Andrei, we propose three scaled nonlinear conjugate gradient methods for nonlinear least squares problems. An attractive feature of one of the proposed methods is satisfaction of the sufficient descent condition regardless of the line search and of the convexity of the objective function. We establish that the three proposed algorithms are globally convergent, under the assumption of the Jacobian matrix having full column rank on the level set for one, and without such assumption for the other... 

    An efficient simplified neural network for solving linear and quadratic programming problems

    , Article Applied Mathematics and Computation ; Volume 175, Issue 1 , 2006 , Pages 452-464 ; 00963003 (ISSN) Ghasabi Oskoei, H ; Mahdavi Amiri, N ; Sharif University of Technology
    2006
    Abstract
    We present a high-performance and efficiently simplified new neural network which improves upon existing neural networks for solving general linear and quadratic programming problems. The network requires no parameter setting, leads to simple hardware with no analog multipliers, and is shown to be stable and globally convergent to the exact solution. Moreover, using this network we can solve both linear and quadratic programming problems and their duals simultaneously. High accuracy of the obtained solutions and low cost of implementation are among the features of this network. We prove the global convergence of the network analytically and verify the results numerically. © 2005... 
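    To illustrate how a continuous-time network can settle at the solution of a quadratic program, the sketch below integrates a generic primal-dual gradient flow for an equality-constrained QP, assuming Q is positive definite. It is not the network of the paper, which also handles inequalities and the dual problem and is aimed at analog hardware; forward-Euler integration merely stands in for that hardware.

    import numpy as np

    def qp_gradient_flow(Q, c, A, b, dt=1e-2, steps=20000):
        """Forward-Euler integration of a primal-dual gradient flow for
        min 0.5*x'Qx + c'x  subject to  Ax = b  (Q assumed positive definite)."""
        n, m = Q.shape[0], A.shape[0]
        x, lam = np.zeros(n), np.zeros(m)
        for _ in range(steps):
            x = x - dt * (Q @ x + c + A.T @ lam)   # descend the Lagrangian in x
            lam = lam + dt * (A @ x - b)           # ascend the Lagrangian in the multiplier
        return x, lam

    # Example: minimize x1^2 + x2^2 subject to x1 + x2 = 1; the flow settles at (0.5, 0.5).
    Q = 2.0 * np.eye(2); c = np.zeros(2)
    A = np.array([[1.0, 1.0]]); b = np.array([1.0])
    x_star, lam_star = qp_gradient_flow(Q, c, A, b)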

    A Trust Region Method for Solving Semidefinite Programs

    , M.Sc. Thesis Sharif University of Technology Nazari, Parvin (Author) ; Mahdavi-Amiri, Nezamoddin (Supervisor)
    Abstract
    In this thesis, we examine a group of optimization methods called trust region methods for solving semidefinite programming problems. Nowadays, many application problems can be cast as semidefinite programs, and problems of very large size are encountered every year. So, having a powerful method for solving such problems is very important. The trust region approach presents a new scheme for constructing efficient algorithms to solve semidefinite programming problems. When using interior point methods for solving semidefinite programs (SDPs), one needs to solve a system of linear equations at every iteration. For large problems, solving the system of linear equations can be very expensive. In... 

    Two new conjugate gradient methods based on modified secant equations

    , Article Journal of Computational and Applied Mathematics ; Volume 234, Issue 5 , 2010 , Pages 1374-1386 ; 03770427 (ISSN) Babaie Kafaki, S ; Ghanbari, R ; Mahdavi Amiri, N ; Sharif University of Technology
    2010
    Abstract
    Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their account of both the gradient and function values. Under proper conditions, we show that one of the proposed methods is globally convergent for general functions and that the other is globally convergent for uniformly convex functions. To enhance the performance of the line search procedure, we also... 
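    For reference, the Dai-Liao conjugacy condition and parameter, together with the modified secant equation of Zhang, Deng and Chen (and Zhang and Xu), take the forms

        d_{k+1}^{\mathsf T} y_k = -t\, g_{k+1}^{\mathsf T} s_k, \qquad
        \beta_k^{DL} = \frac{g_{k+1}^{\mathsf T}\,(y_k - t\, s_k)}{d_k^{\mathsf T} y_k}, \qquad t \ge 0,

        B_{k+1} s_k = \bar y_k = y_k + \frac{\theta_k}{s_k^{\mathsf T} u_k}\, u_k, \qquad
        \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{\mathsf T} s_k,

    where u_k is any vector with s_k^T u_k != 0. Reading the abstract, the proposed methods presumably replace y_k by such a modified \bar y_k in the Dai-Liao formula, which is how both gradient and function values enter.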

    Properties and Numerical Performance of Nonlinear Conjugate Gradient Methods With Modified Secant Equations and New Conjugacy Conditions

    , M.Sc. Thesis Sharif University of Technology Abdi, Javad (Author) ; Mahdavi Amiri, Nezamedin (Supervisor)
    Abstract
    Conjugate gradient methods are appealing for large scale nonlinear optimization problems, because they avoid the storage of matrices. Recently, a new conjugacy condition proposed by Dai and Liao considers an inexact line search scheme that reduces to the old condition if the line search is exact. Based on this condition, a new conjugate gradient method was proposed that has fast convergence. Later, Yabe and Takano, based on the new conjugacy condition and a modified secant condition, proposed another conjugate gradient method. This method takes both the available gradient and function value information and achieves a high-order accuracy in approximating the second-order curvature of the objective... 

    Nonlinear Programming Without a Penalty Function or a Filter

    , M.Sc. Thesis Sharif University of Technology Abdollahi, Fahimeh (Author) ; Mahdavi Amiri, Nezamedin (Supervisor)
    Abstract
    A new method, recently introduced in the literature, is discussed for solving equality constrained nonlinear optimization problems. This method does not use a penalty or a barrier function, or a filter, and yet its global convergence to first-order stationary points can be proved. The method uses different trust regions to cope with the nonlinearities of the objective function and the constraints, and admits inexact SQP steps not lying exactly in the nullspace of the local Jacobian. We implement the method in the MATLAB 7.7 software environment and test the resulting program on a collection of CUTEr problems. The numerical results are promising and confirm the global convergence of the method.
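    One common way to organize such a method (a sketch consistent with the abstract, not necessarily the exact subproblems of the cited work) is to split each SQP step into a normal step that reduces constraint violation and a tangential step that reduces the objective model, each controlled by its own trust region:

        n_k \approx \arg\min_{\|n\| \le \Delta_k^{c}} \ \|c(x_k) + A_k n\|, \qquad
        t_k \approx \arg\min_{\|n_k + t\| \le \Delta_k^{f},\ A_k t \approx 0} \ m_k(n_k + t),

    where A_k is the constraint Jacobian; allowing A_k t \approx 0 rather than A_k t = 0 is what admits inexact SQP steps not lying exactly in the nullspace.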

    New Conjugate Gradient Methods for Unconstrained Optimization

    , Ph.D. Dissertation Sharif University of Technology (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    We discuss conjugate gradient methods for which both the gradient and function values are considered in computing the conjugate gradient parameter. We propose new conjugate gradient methods as members of Dai-Liao's family of conjugate gradient methods and Andrei's family of hybrid conjugate gradient methods. For computing the conjugate gradient parameter in our methods, three modified secant equations proposed by Zhang, Deng and Chen, Li and Fukushima, and Yuan are used. It is shown that under proper conditions, three of the proposed methods are globally convergent for uniformly convex functions and two other methods are globally convergent for general functions. It is also shown that... 

    An Affine Scaling Trust Region Approach to Bound-Constrained Nonlinear Systems

    , M.Sc. Thesis Sharif University of Technology Hekmati, Rasoul (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    We describe an interior method for solving bound-constrained systems of equations, recently introduced by S. Bellavia, M. Macconi and B. Morini in the literature. The method makes use of ideas from the classical trust-region Newton method for unconstrained nonlinear equations and the recent interior affine scaling approach for constrained optimization problems. The iterates are generated to be feasible and the bounds are handled implicitly. The method reduces to a standard trust-region method for unconstrained problems when there are no upper or lower bounds on the variables. Global and local fast convergence properties are ... 

    A First-Order Interior-Point Method For Linearly Constrained Smooth Optimization

    , M.Sc. Thesis Sharif University of Technology Ebadi Zadeh, Monireh (Author) ; Peyghami, Mohammad Reza (Supervisor) ; Fotouhi, Morteza (Supervisor)
    Abstract
    In this thesis, we study a first-order interior-point method for linearly constrained smooth optimization, recently proposed in the literature, which unifies and extends the first-order affine-scaling method and the replicator dynamics method for standard quadratic programming. Global convergence and, in the case of a quadratic program, the (sub)linear convergence rate and iterate convergence results are derived. The method is implemented and numerical experiments on simplex-constrained problems with 1000 variables are reported.
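    For context, the replicator dynamics referred to here is the standard dynamical system for a standard quadratic program over the simplex \Delta = \{ x \ge 0 : e^{\mathsf T} x = 1 \} (a textbook form, not the unified method of the thesis):

        \dot x_i = x_i\,\big[(Qx)_i - x^{\mathsf T} Q x\big], \qquad i = 1,\dots,n,

    whose trajectories remain in the simplex and, for symmetric Q, increase x^T Q x monotonically.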

    A Primal-dual Interior Point Method for Nonlinear Semidefinite Programming

    , M.Sc. Thesis Sharif University of Technology Hosseini, Morteza (Author) ; Mahdavi-Amiri, Nezamodden (Supervisor)
    Abstract
    We explain a primal-dual interior point method for solving nonlinear semidefinite programming problems, recently presented in the literature. The method consists of the outer iteration (SDPIP) that finds a KKT point and the inner iteration (SDPLS) that calculates an approximate barrier KKT point. Algorithm SDPLS uses a commutative class of Newton-like directions for the generation of line search directions. By combining the primal barrier penalty function and the primal-dual barrier function, a new primal-dual merit function is developed. We explain the proof of global convergence of the method and provide some numerical... 
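    For orientation, a barrier KKT point of a nonlinear semidefinite program min f(x) s.t. g(x) = 0, X(x) \succeq 0 is characterized (in the usual formulation; the cited paper's exact scaling and merit function are not reproduced here) by

        \nabla_x L(x, y, Z) = 0, \qquad g(x) = 0, \qquad X(x)\,Z = \mu I, \qquad X(x) \succ 0, \ Z \succ 0,

    with Lagrangian L(x, y, Z) = f(x) - y^{\mathsf T} g(x) - \langle X(x), Z \rangle and barrier parameter \mu > 0 driven to zero by the outer iteration.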

    Implementation of New Hybrid Conjugate Gradient Algorithms Based on Modified BFGS Updates

    , M.Sc. Thesis Sharif University of Technology Moshtagh, Mehrdad (Author) ; Mahdavi-Amiri, Nezam (Supervisor)
    Abstract
    We describe two modified secant equations proposed by Yuan, and Li and Fukushima. First, we study the approach proposed by Andrei. Then, we explain two hybrid conjugate gradient methods for unconstrained optimization problems. The methods are hybridizations of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods. It is shown that one of the algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. Two approaches for computing the initial value of the steplength, proposed by Babaie-Kafaki, Fatemi and Mahdavi-Amiri, and by Andrei, are used for accelerating the performance of the line search. We implement the algorithms and compare the... 
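    A compact way to see how the pieces fit together is the sketch below: a hybrid conjugate gradient iteration whose parameter is a convex combination of the Hestenes-Stiefel and Dai-Yuan formulas, with SciPy's Wolfe line search. The fixed weight theta and the off-the-shelf line search are simplifications; the thesis's hybridization weight comes from the modified BFGS/secant updates, and its initial-steplength strategies are not reproduced here.

    import numpy as np
    from scipy.optimize import line_search

    def hybrid_hs_dy_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
        """Hybrid HS/DY conjugate gradient sketch with a Wolfe line search."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:                      # line search failed: steepest-descent restart
                d, alpha = -g, 1e-4
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            denom = d @ y
            if abs(denom) < 1e-12:                 # safeguard the common denominator
                beta = 0.0
            else:
                beta_hs = (g_new @ y) / denom      # Hestenes-Stiefel
                beta_dy = (g_new @ g_new) / denom  # Dai-Yuan
                beta = (1.0 - theta) * beta_hs + theta * beta_dy
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # e.g.: from scipy.optimize import rosen, rosen_der; hybrid_hs_dy_cg(rosen, rosen_der, [-1.2, 1.0])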

    A Proximal Method for Composite Minimization

    , M.Sc. Thesis Sharif University of Technology Taherifard, Sara (Author) ; Mahdavi-Amiri, Nezamoddin (Supervisor) ; Soleimani-damaneh, Majid (Supervisor)
    Abstract
    We consider the composite minimization problem min_x h(c(x)), where c : R^n -> R^m is smooth and h : R^m -> [-∞, +∞] is usually convex or prox-regular, but may be nonsmooth. A wide variety of important optimization problems fall into this framework, and so far several studies have been done in this regard. One of these studies relates to the condition that the function h is finite convex and the algorithm uses a line search method. Another case is solving nonlinear programming problems using a penalty function, where the function h is finite polyhedral. Research has also been done for the case where the function c is the identity, that is, c(x) = x. We describe an... 
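    The workhorse step for this problem class is the prox-linear subproblem, stated here in its generic form (assumed to be the flavor of method the thesis follows, as in Lewis and Wright):

        x_{k+1} = x_k + \arg\min_{d}\ \Big\{ h\big(c(x_k) + \nabla c(x_k)\, d\big) + \frac{\mu_k}{2}\,\|d\|^2 \Big\},

    which linearizes only the smooth inner map c, keeps the possibly nonsmooth outer function h intact, and uses a proximal parameter \mu_k > 0 to control the step.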

    Solving Symmetric Nonlinear Equations System Using BFGS Trust Region Quadratic Method

    , M.Sc. Thesis Sharif University of Technology Salimi, Samira (Author) ; Razvan, Mohammad Reza (Supervisor) ; Peyghami, Mohammad Reza (Supervisor)

    Adaptive compensation of gyro bias in rigid-body attitude estimation using a single vector measurement

    , Article IEEE Transactions on Automatic Control ; Volume 58, Issue 7 , 2013 , Pages 1816-1822 ; 00189286 (ISSN) Namvar, M ; Safaei, F ; Sharif University of Technology
    2013
    Abstract
    The presence of bias in measurement of rate gyros is a performance limiting factor for satellite attitude determination systems. Gyro bias is usually handled by Kalman filtering methods which are mostly based on linearization approaches and lack global convergence properties. On the other hand, the existing asymptotically convergent nonlinear observers take into account the gyro bias only when multiple vector measurements are available. We present an asymptotically convergent attitude estimator which uses only one vector measurement and a rate gyro whose output is contaminated with an unknown and constant bias. The effect of unknown bias is compensated by means of a parameter adaptation law.... 
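    In the standard model behind this problem (generic attitude kinematics, not the specific observer of the paper), the gyro output is the true body rate corrupted by a constant bias, and the attitude evolves as

        \omega_m(t) = \omega(t) + b, \qquad \dot R(t) = R(t)\,[\omega(t)]_\times,

    so the estimator propagates its attitude estimate with the corrected rate \omega_m - \hat b and updates \hat b through an adaptation law driven by the single vector-measurement error.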

    A class of globally convergent velocity observers for robotic manipulators

    , Article IEEE Transactions on Automatic Control ; Volume 54, Issue 8 , 2009 , Pages 1956-1961 ; 00189286 (ISSN) Namvar, M ; Sharif University of Technology
    2009
    Abstract
    We present a method for global estimation of joint velocities in rigid manipulators. A class of velocity observers with smooth dynamics is introduced, whose structure depends on freely selectable functions and gains, giving the flexibility of achieving multiple design objectives at the same time. Unlike most methods, no a priori knowledge of an upper bound for the velocity magnitude is used. An adaptive version of the observer is also presented to handle a class of structured uncertainties in the manipulator model. A simulation example illustrates the low noise sensitivity of the globally convergent observer in comparison with semi-globally convergent observers. © 2009 IEEE  

    Globally Convergent Limited Memory Bundle Method for Large-Scale Nonsmooth Optimization

    , M.Sc. Thesis Sharif University of Technology Dehdarpour, Hamid (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)