Search for: gradient-methods
Total 49 records

    Properties and Numerical Performance of Nonlinear Conjugate Gradient Methods with Modified Secant Equations and New Conjugacy Conditions

    , M.Sc. Thesis Sharif University of Technology Abdi, Javad (Author) ; Mahdavi Amiri, Nezamedin (Supervisor)
    Abstract
    Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they avoid the storage of matrices. Recently, Dai and Liao proposed a new conjugacy condition that accounts for an inexact line search scheme and reduces to the old condition when the line search is exact. Based on this condition, a new conjugate gradient method with fast convergence was proposed. Later, Yabe and Takano, based on a new conjugacy condition and a modified secant condition, proposed another conjugate gradient method. This method uses both the available gradient and function value information and achieves high-order accuracy in approximating the second-order curvature of the objective...
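    As a point of reference, here is a hedged statement of the Dai-Liao conjugacy condition in standard notation (the symbols are ours, not the thesis's): with g_k the gradient at x_k, s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1},

        d_k^{T} y_{k-1} = -t \, g_k^{T} s_{k-1}, \qquad t \ge 0,

    which reduces to the classical condition d_k^{T} y_{k-1} = 0 under an exact line search, since then g_k^{T} s_{k-1} = 0.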

    New Conjugate Gradient Methods for Unconstrained Optimization

    , Ph.D. Dissertation Sharif University of Technology (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    We discuss conjugate gradient methods for which both the gradient and function values are considered in computing the conjugate gradient parameter. We propose new conjugate gradient methods as members of Dai-Liao's family of conjugate gradient methods and Andrei's family of hybrid conjugate gradient methods. For computing the conjugate gradient parameter in our methods, three modified secant equations proposed by Zhang, Deng and Chen, Li and Fukushima, and Yuan are used. It is shown that under proper conditions, three of the proposed methods are globally convergent for uniformly convex functions and two other methods are globally convergent for general functions. It is also shown that...
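    A minimal Python sketch of a Dai-Liao type nonlinear conjugate gradient iteration, shown only to fix notation; the function names, the default t = 0.1, the Armijo backtracking line search and the restart safeguard are illustrative assumptions, not the methods proposed in this dissertation:

        import numpy as np

        def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
            """Nonlinear CG with the Dai-Liao parameter (illustrative sketch)."""
            x = np.asarray(x0, dtype=float)
            g = grad(x)
            d = -g
            for _ in range(max_iter):
                if np.linalg.norm(g) <= tol:
                    break
                # Illustrative backtracking (Armijo) line search.
                alpha, fx, slope = 1.0, f(x), g @ d
                for _ in range(50):
                    if f(x + alpha * d) <= fx + 1e-4 * alpha * slope:
                        break
                    alpha *= 0.5
                x_new = x + alpha * d
                g_new = grad(x_new)
                s, y = x_new - x, g_new - g
                denom = d @ y
                # beta_k = g_k^T (y_{k-1} - t s_{k-1}) / (d_{k-1}^T y_{k-1})
                beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
                d = -g_new + beta * d
                if g_new @ d >= 0:  # safeguard: restart when not a descent direction
                    d = -g_new
                x, g = x_new, g_new
            return x

    Modified secant equations of the kind cited here replace y_{k-1} in such formulas with a corrected difference vector, often built from function values as well as gradients.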

    Graph-Based Preconditioners for Network Flow Problems

    , M.Sc. Thesis Sharif University of Technology Yousefi Lalimi, Fateme (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    Network flow problems are of special importance in everyday life, and solving them at very large scales is complex; numerous methods exist for their solution, among which interior point methods are the most important. In a number of these methods, a preconditioned conjugate gradient solver is applied to the Karush-Kuhn-Tucker (KKT) system in each interior point iteration; therefore, the selection of an appropriate preconditioner is a key issue. Although various preconditioners have been presented in recent years, the discussion and implementation of a particular class of triangulated graph-based preconditioners is our main...
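    For orientation, a compact sketch of the preconditioned conjugate gradient (PCG) iteration that such interior point solvers apply to the (suitably reduced) KKT system; the callables matvec and apply_prec and all other names are illustrative assumptions, not the thesis's implementation:

        import numpy as np

        def pcg(matvec, b, apply_prec, tol=1e-8, max_iter=500):
            """Preconditioned CG for a symmetric positive definite system A x = b."""
            x = np.zeros_like(b, dtype=float)
            r = b - matvec(x)
            bnorm = max(np.linalg.norm(b), 1e-30)
            z = apply_prec(r)       # z approximates A^{-1} r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                if np.linalg.norm(r) <= tol * bnorm:
                    break
                Ap = matvec(p)
                alpha = rz / (p @ Ap)
                x = x + alpha * p
                r = r - alpha * Ap
                z = apply_prec(r)
                rz_new = r @ z
                beta = rz_new / rz
                p = z + beta * p
                rz = rz_new
            return x

    The quality of apply_prec (here, a graph-based preconditioner) largely determines the iteration count.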

    Accelerated Hybrid Conjugate Gradient Algorithm with Modified Secant Condition

    , M.Sc. Thesis Sharif University of Technology Soleimani Kourandeh, Aria (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    Conjugate gradient methods are useful for large scale nonlinear optimization problems, because they avoid the storage of any matrices. In this thesis, we have investigated an accelerated hybrid conjugate gradient algorithm recently proposed in the literature. The combining parameter is calculated so that the direction corresponding to the conjugate gradient algorithm, while satisfying the modified secant condition, is a Newton direction. It is shown that for uniformly convex functions and for general nonlinear functions the algorithm with a strong Wolfe line search is globally convergent. The algorithm uses an accelerated approach for the reduction of the objective function values by modifying...

    Design and Analysis of Filter Trust-Region Algorithms for Unconstrained and Bound Constrained Optimization

    , M.Sc. Thesis Sharif University of Technology Fatemi, Masoud (Author) ; Mahdavi Amiri, Nezameddin (Supervisor)
    Abstract
    Design, analysis and practical implementation of filter trust-region algorithms are investigated. First, we introduce two filter trust-region algorithms for solving the unconstrained optimization problem. These algorithms belong to two different classes of optimization algorithms: (1) the monotone class, and (2) the non-monotone class. We prove the global convergence of the sequence of iterates generated by the new algorithms to first- and second-order critical points. Then, we propose a filter trust-region algorithm for solving bound constrained optimization problems and show that the algorithm converges to a first-order critical point. Moreover, we address some well known...
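    For orientation, a standard trust-region step (stated generically, not the filter variants studied here) solves a quadratic model subproblem and judges the step by the ratio of actual to predicted reduction; in the usual notation, which is our assumption:

        \min_{\|s\| \le \Delta_k} \; m_k(s) = f(x_k) + \nabla f(x_k)^{T} s + \tfrac{1}{2}\, s^{T} B_k s,
        \qquad
        \rho_k = \frac{f(x_k) - f(x_k + s_k)}{m_k(0) - m_k(s_k)},

    with the step accepted and the radius \Delta_k possibly enlarged when \rho_k is large enough, and the radius reduced otherwise; in filter variants, acceptance additionally consults a filter of previously visited points rather than relying on \rho_k alone.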

    An Implementation of an Interior Point Algorithm for Nonlinear Optimization Combining line Search and Trust Region Steps

    , M.Sc. Thesis Sharif University of Technology Khajuei Jahromi, Mona (Author) ; Mahdavi-Amiri, Nezamoddin (Supervisor)
    Abstract
    An interior point method for nonlinear programming problems recently proposed by Waltz, Morales, Nocedal and Orban is described and implemented [2]. The steps are computed either by a line search based on the primal-dual equations, or by a trust region based on the conjugate gradient iteration. Steps computed by line search are tried first, but if they are determined to be ineffective, a trust region iteration that guarantees progress toward a stationary point is used. In order to reduce the calculations, here we propose some modifications. The algorithms are implemented and the programs are tested on a variety of problems. Numerical results based on the Dolan-Moré performance profiles confirm the effectiveness of the algorithms  
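    As background, the primal-dual equations referred to are the perturbed KKT conditions of the barrier problem; for a generic NLP with equality constraints c_E(x) = 0 and inequalities c_I(x) >= 0 handled through slacks s > 0 (our generic statement, not necessarily the exact formulation of [2]):

        \nabla f(x) - A_E(x)^{T} y - A_I(x)^{T} z = 0, \qquad
        S z - \mu e = 0, \qquad
        c_E(x) = 0, \qquad
        c_I(x) - s = 0,

    where A_E and A_I are the constraint Jacobians, S = \mathrm{diag}(s), e is the vector of ones, and the barrier parameter \mu > 0 is driven to zero.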

    Iteratively Constructing Preconditioners via the Conjugate Gradient Method

    , M.Sc. Thesis Sharif University of Technology Mousa Abadian, Mohammad (Author) ; Farhadi, Hamid Reza (Supervisor)
    Abstract
    The main goal of this work is solving the system of linear equations Ax = b, where A is an n × n square matrix, b is an n × 1 vector and x is the vector of unknowns. When n is large, using direct methods is not economical. Thus, the system is solved by iterative methods. At first, the projection method onto a subspace K ⊆ R^n with dimension m ≪ n is described, and then this subspace K is identified with the Krylov subspace. Then, some examples of projection methods onto the Krylov subspace, such as FOM, GMRES and CG (Conjugate Gradient), are considered. The preconditioning of the linear system is explained, that is, instead of solving the system Ax = b, the system PAx = Pb (P nonsingular) is solved, such that the...
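    For reference, the m-dimensional Krylov subspace generated by A and the initial residual r_0 = b - A x_0 (standard notation, our addition):

        \mathcal{K}_m(A, r_0) = \operatorname{span}\{\, r_0,\; A r_0,\; A^{2} r_0,\; \ldots,\; A^{m-1} r_0 \,\};

    left preconditioning replaces Ax = b by PAx = Pb, with P chosen so that PA is better conditioned than A while the action of P remains cheap to apply.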

    Correction of Time-Dependent Origin-Destination Demand in Dynamic Traffic Assignment

    , M.Sc. Thesis Sharif University of Technology Shafiei, Mohammad Sajjad (Author) ; Zakaei Ashtiani, Hedayat (Supervisor)
    Abstract
    Time-dependent origin-destination demand is a key input to dynamic traffic assignment in advanced traffic management systems, and the result of dynamic traffic assignment depends on the accuracy of this information. One method to obtain time-dependent demand matrices is to use a preliminary demand matrix together with traffic volume counts on some links of the network. In this thesis, a bi-level model is used to correct the demand matrix, and the extended gradient method is suggested to solve the problem. The extended gradient method is iterative; in each iteration, it corrects the demand matrix so that the estimated traffic flow becomes closer to the observed traffic flow. Execution of this method in...
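    A hedged sketch of the kind of bi-level formulation involved, with the upper level fitting the demand to the link counts and the lower level being the dynamic assignment; the symbols are ours and only indicative:

        \min_{d \ge 0} \; \sum_{a \in \hat{A}} \sum_{t} \bigl( v_{a,t}(d) - \hat{v}_{a,t} \bigr)^{2}
        \quad \text{subject to} \quad v(d) \text{ given by the dynamic traffic assignment of } d,

    where d is the time-dependent demand matrix, \hat{v}_{a,t} are the observed counts on the counted links \hat{A}, and v_{a,t}(d) are the assigned link flows; the (extended) gradient method updates d along a descent direction of this upper-level objective.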

    A Filter-Trust-Region Method for Simple-Bound Constrained Optimization

    , M.Sc. Thesis Sharif University of Technology Mehrali Varjani, Mohsen (Author) ; Mahdavi Amiri, Nezameddin (Supervisor)
    Abstract
    We explain a filter-trust-region algorithm for solving nonlinear optimization problems with simple bounds recently proposed by Sainvitu and Toint. The algorithm is shown to be globally convergent to at least one first-order critical point. We implement the algorithm and test the program on various problems. The results show the effectiveness of the algorithm  

    Implicit Solution of 2-dimensional Compressible Flow, Using Parallel Krylov Method

    , M.Sc. Thesis Sharif University of Technology Ansarian, Hossein (Author) ; Taeibi Rahni, Mohammad (Supervisor) ; Sabetghadam, Fereidoon (Supervisor)
    Abstract
    Numerical simulation of two-dimensional steady compressible fluid flow on unstructured grids was accomplished using a fast implicit algorithm. To solve the complete two-dimensional Navier-Stokes equations, implicit time stepping was used, which results in a large sparse linear system in each iteration. To solve the linear system, the biconjugate gradient method, which belongs to the family of Krylov subspace methods, was used with an ILU(0) preconditioner. To accelerate the solution of large problems, parallel processing was used so that the linear system could be solved faster. Two upwind methods, namely Roe's and AUSM+ methods, were used for spatial discretization of inviscid fluxes with a MUSCL algorithm...
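    A brief sketch of how a BiCG solve with an incomplete LU preconditioner can be set up in SciPy; this merely illustrates the solver combination named in the abstract (SciPy's spilu is a drop-tolerance ILU rather than a strict ILU(0)) and is unrelated to the thesis code:

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # Illustrative sparse nonsymmetric system (a stand-in for the implicit CFD Jacobian).
        n = 1000
        A = sp.diags([-1.0, 2.5, -1.2], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        # Incomplete LU factorization providing the approximate inverse action.
        ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
        M = spla.LinearOperator(A.shape, matvec=ilu.solve)

        # Biconjugate gradient solve with the ILU preconditioner.
        x, info = spla.bicg(A, b, M=M)
        print("converged" if info == 0 else f"bicg returned info={info}")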

    Implementation of New Hybrid Conjugate Gradient Algorithms Based on Modified BFGS Updates

    , M.Sc. Thesis Sharif University of Technology Moshtagh, Mehrdad (Author) ; Mahdavi-Amiri, Nezam (Supervisor)
    Abstract
    We describe two modified secant equations, proposed by Yuan and by Li and Fukushima. First, we study the approach proposed by Andrei. Then, we explain two hybrid conjugate gradient methods for unconstrained optimization problems. The methods are hybridizations of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods. It is shown that one of the algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. Two approaches for computing the initial value of the steplength, proposed by Babaie, Fatemi and Mahdavi-Amiri and by Andrei, are used for accelerating the performance of the line search. We implement the algorithms and compare the...

    Solving Nonconvex Optimization Problems Using a Trust-Region Newton-Conjugate Gradient Method with Strong Second-Order Complexity Guarantees

    , M.Sc. Thesis Sharif University of Technology Javidpanah, Fatemeh (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    Worst-case complexity guarantees for non-convex optimization algorithms are a topic that has received increasing attention. Here, we review trust-region Newton methods recently proposed in the literature. After a slight modification of the main model, two methods are proposed: one is based on the exact solution of the sub-problem, and the other is based on the inexact solution of the sub-problem, such as the "trust-region Newton-conjugate gradient" method, with complexity bounds corresponding to the best known bounds for this class of algorithms. We implement the proposed algorithms and test the programs in the Python software environment  
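    A hedged sketch of the Steihaug-Toint truncated CG iteration commonly used to solve the trust-region Newton subproblem inexactly (minimize g^T p + (1/2) p^T H p subject to ||p|| <= Delta); the functions below are our illustration, not the thesis's method, and omit the extra safeguards needed for the strong second-order complexity guarantees:

        import numpy as np

        def steihaug_cg(hessvec, g, delta, tol=1e-8, max_iter=250):
            """Truncated CG for min g^T p + 0.5 p^T H p subject to ||p|| <= delta."""
            r = np.array(g, dtype=float)   # residual of H p = -g, starting from p = 0
            p = np.zeros_like(r)
            d = -r
            if np.linalg.norm(r) <= tol:
                return p
            for _ in range(max_iter):
                Hd = hessvec(d)
                dHd = d @ Hd
                if dHd <= 0:
                    # Negative curvature: move to the trust-region boundary along d.
                    return p + _to_boundary(p, d, delta)
                alpha = (r @ r) / dHd
                p_next = p + alpha * d
                if np.linalg.norm(p_next) >= delta:
                    # The step would leave the region: stop on the boundary.
                    return p + _to_boundary(p, d, delta)
                r_next = r + alpha * Hd
                if np.linalg.norm(r_next) <= tol:
                    return p_next
                beta = (r_next @ r_next) / (r @ r)
                d = -r_next + beta * d
                p, r = p_next, r_next
            return p

        def _to_boundary(p, d, delta):
            """Return tau * d with tau >= 0 such that ||p + tau * d|| = delta."""
            a, b, c = d @ d, 2.0 * (p @ d), p @ p - delta ** 2
            tau = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
            return tau * d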

    Conjugate Residual Method for Large Scale Unconstrained Nonlinear Optimization

    , M.Sc. Thesis Sharif University of Technology Siyadati, Maryam (Author) ; Mahdavi Amiri, Nezam (Supervisor)
    Abstract
    Nowadays, solving large-scale unconstrained optimization problems has wide applications in data science and machine learning. Therefore, the development and analysis of efficient algorithms for solving unconstrained optimization problems is of great interest. Line search and trust region are two general frameworks for guaranteeing the convergence of algorithms for solving unconstrained optimization problems. The conjugate gradient (CG) and conjugate residual (CR) methods of Hestenes and Stiefel have been presented for solving linear systems with symmetric positive definite coefficient matrices. The basic feature of CR, that is, residual minimization, is important and can be used...
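    For reference, one standard statement of the conjugate residual recurrences for A x = b with A symmetric positive definite (our notation, added for context):

        \alpha_k = \frac{r_k^{T} A r_k}{(A p_k)^{T}(A p_k)}, \quad
        x_{k+1} = x_k + \alpha_k p_k, \quad
        r_{k+1} = r_k - \alpha_k A p_k, \quad
        \beta_k = \frac{r_{k+1}^{T} A r_{k+1}}{r_k^{T} A r_k}, \quad
        p_{k+1} = r_{k+1} + \beta_k p_k,

    so each step minimizes the residual norm \|b - A x\|_2 over the current Krylov subspace, which is the residual-minimization feature referred to in the abstract.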

    Solving a Smooth Approximation of the Sparse Recovery Problem Using the Three-Term Conjugate Gradient Algorithms

    , M.Sc. Thesis Sharif University of Technology Qaraei, Mohammad Hossein (Author) ; Mahdavi Amiri, Nezamoddin (Supervisor)
    Abstract
    Line search-based methods are known as a category of the most efficient iterative algorithms for solving unconstrained optimization problems. Among them, the conjugate gradient method is of particular importance in solving contemporary large-scale problems due to its simple structure, low memory requirement and strong convergence characteristics. In spite of the desirable numerical behavior of the conjugate gradient method, it generally lacks the descent property, even for uniformly convex objective functions. To overcome this defect, some effective modifications have been presented in the literature. Among these, the three-term extension has attracted the attention of many...
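    One well-known three-term direction, often attributed to Zhang, Zhou and Li and given here only as a hedged illustration of the idea (not necessarily the variant adopted in the thesis), is

        d_k = -g_k + \frac{g_k^{T} y_{k-1}}{d_{k-1}^{T} y_{k-1}}\, d_{k-1}
              - \frac{g_k^{T} d_{k-1}}{d_{k-1}^{T} y_{k-1}}\, y_{k-1},

    for which g_k^{T} d_k = -\|g_k\|^{2} holds identically, so the direction is a sufficient descent direction independently of the line search.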

    Online undersampled dynamic MRI reconstruction using mutual information

    , Article 2014 21st Iranian Conference on Biomedical Engineering, ICBME 2014 ; 17 February , 2014 , Pages 241-245 ; ISBN: 9781479974177 Farzi, M ; Ghaffari, A ; Fatemizadeh, E ; Sharif University of Technology
    Abstract
    We propose an algorithm based on mutual information to address the problem of online reconstruction of dynamic MRI from partial k-space measurements. Most previous compressed sensing (CS) based methods successfully leverage a sparsity constraint for offline reconstruction of MR images, yet they are not used in online applications due to their complexity. In this paper, we formulate the reconstruction as a constrained optimization problem and try to maximize the mutual information between the current and the previous time frames. The conjugate gradient method is used to solve the optimization problem. Using a Cartesian mask to undersample k-space measurements, the proposed method reduces...

    Nonlocal and strain gradient based model for electrostatically actuated silicon nano-beams

    , Article Microsystem Technologies ; Vol. 21, Issue 2 , 2014 , pp. 457-464 ; Online ISSN: 1432-1858 Miandoab, E. M ; Yousefi-Koma, A ; Pishkenari, H. N ; Sharif University of Technology
    Abstract
    Conventional continuum theory does not account for contributions from length scale effects, which are important in the modeling of nano-beams. Failure to include size-dependent contributions can lead to underestimates of the deflection, stresses, and pull-in voltage of electrostatically actuated micro- and nano-beams. This research aims to use nonlocal and strain gradient elasticity theories to study the static behavior of electrically actuated micro- and nano-beams. To solve the boundary value nonlinear differential equations, the analogue equation and Gauss–Seidel iteration methods are used. Both clamped-free and clamped–clamped micro- and nano-beams under electrostatic actuation are considered where...

    A modified two-point stepsize gradient algorithm for unconstrained minimization

    , Article Optimization Methods and Software ; Volume 28, Issue 5 , 2013 , Pages 1040-1050 ; 10556788 (ISSN) Babaie Kafaki, S ; Fatemi, M ; Sharif University of Technology
    2013
    Abstract
    Based on a modified secant equation proposed by Li and Fukushima, we derive a stepsize for the Barzilai-Borwein gradient method. Then, using the newly proposed stepsize and another effective stepsize proposed by Dai et al. in an adaptive scheme that is based on the objective function convexity, we suggest a modified two-point stepsize gradient algorithm. We also show that the limit point of the sequence generated by our algorithm is first-order critical. Finally, our numerical comparisons done on a set of unconstrained optimization test problems from the CUTEr collection are presented. At first, we compare the performance of our algorithm with two other two-point stepsize gradient algorithms... 
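    For context, the two classical Barzilai-Borwein stepsizes from which two-point stepsize methods are built (standard notation, our addition):

        \alpha_k^{BB1} = \frac{s_{k-1}^{T} s_{k-1}}{s_{k-1}^{T} y_{k-1}},
        \qquad
        \alpha_k^{BB2} = \frac{s_{k-1}^{T} y_{k-1}}{y_{k-1}^{T} y_{k-1}},
        \qquad
        s_{k-1} = x_k - x_{k-1},\; y_{k-1} = g_k - g_{k-1},

    each obtained by asking a scalar multiple of the identity to satisfy the secant equation in a least-squares sense.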

    Two modified hybrid conjugate gradient methods based on a hybrid secant equation

    , Article Mathematical Modelling and Analysis ; Volume 18, Issue 1 , 2013 , Pages 32-52 ; 13926292 (ISSN) Babaie Kafaki, S ; Mahdavi Amiri, N ; Sharif University of Technology
    2013
    Abstract
    Taking advantage of the attractive features of Hestenes-Stiefel and Dai-Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods following Andrei's approach of hybridizing the conjugate gradient parameters convexly and Powell's approach of nonnegative restriction of the conjugate gradient parameters. In our methods, the hybridization parameter is obtained based on a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported  
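    A hedged sketch of the convex hybridization referred to (Andrei's approach) together with Powell's nonnegative restriction; the specific hybridization parameter \theta_k in these methods comes from the hybrid secant equation and is not reproduced here:

        \beta_k = (1 - \theta_k)\, \beta_k^{HS} + \theta_k\, \beta_k^{DY},
        \qquad
        \beta_k^{HS} = \frac{g_k^{T} y_{k-1}}{d_{k-1}^{T} y_{k-1}},
        \quad
        \beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{T} y_{k-1}},
        \qquad \theta_k \in [0, 1],

    with \beta_k replaced by \max\{\beta_k, 0\} under Powell's restriction.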

    Estimating the four parameters of the Burr III distribution using a hybrid method of variable neighborhood search and iterated local search algorithms

    , Article Applied Mathematics and Computation ; Volume 218, Issue 19 , 2012 , Pages 9664-9675 ; 00963003 (ISSN) Zoraghi, N ; Abbasi, B ; Niaki, S. T. A ; Abdi, M ; Sharif University of Technology
    2012
    Abstract
    The Burr III distribution properly approximates many familiar distributions such as the Normal, Lognormal, Gamma, Weibull, and Exponential distributions. It plays an important role in reliability engineering, statistical quality control, and risk analysis models. The Burr III distribution has four parameters: location, scale, and two shape parameters. The estimation process of these parameters is controversial. Although maximum likelihood estimation (MLE) is understood as a straightforward method in parameter estimation, using MLE to estimate the Burr III parameters leads to maximizing a complicated function with four unknown variables, for which using a conventional optimization such as...

    Improving response surface methodology by using artificial neural network and simulated annealing

    , Article Expert Systems with Applications ; Volume 39, Issue 3 , February , 2012 , Pages 3461-3468 ; 09574174 (ISSN) Abbasi, B ; Mahlooji, H ; Sharif University of Technology
    2012
    Abstract
    Response surface methodology (RSM) explores the relationships between several explanatory variables and one or more response variables. The main idea of RSM is to use a set of designed experiments to obtain an optimal response. RSM simplifies the original problem by fitting polynomial approximations over small sections of the feasible region and then locating the optimum with a well-known optimization technique, say the gradient method. As real-world problems are usually very complicated, a polynomial approximation may not provide a good representation of the objective function. Also, the main problem of the gradient method, getting trapped in a local minimum (maximum),...
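    For reference, the second-order polynomial response surface typically fitted in RSM (standard form, our addition):

        y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^{2}
            + \sum_{i < j} \beta_{ij} x_i x_j + \varepsilon,

    whose fitted coefficients are then used to move toward, or locate, an improved operating point.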