
The Partitioned Group Correction Technique With Automatic Differentiation For Large Scale Sparse Unconstrained Optimization

Posted on: 2009-03-07
Degree: Master
Type: Thesis
Country: China
Candidate: Y X Ji
Full Text: PDF
GTID: 2120360242984445
Subject: Operational Research and Cybernetics
Abstract/Summary:
Nonlinear optimization plays an important role in many fields such as scientific computation and engineering analysis. Newton-like methods are among the most efficient methods for solving nonlinear optimization problems, and improving them is an important research topic. Recently, partitioning the Hessian into groups based on its structure and exploiting this partition in the iteration has become an active research area in China and abroad. Most such methods require the gradient and Hessian of the objective function, usually obtained by symbolic differentiation; for medium- and large-scale problems, however, the cost of symbolic differentiation is very high. Divided differences reduce the cost of evaluating directional derivatives, but they yield only approximations, and it is difficult to choose the difference interval correctly. Automatic differentiation (AD) computes derivatives exactly and efficiently, and has significant advantages over both of these approaches.

The first chapter of the thesis introduces large-scale sparse unconstrained optimization and the development of its solution methods, then points out the shortcomings of existing approaches and outlines the contributions of this work. The second chapter reviews classic methods for unconstrained optimization. The third chapter explains how derivatives are computed with AD, expounding the basic concepts and techniques; it presents the two basic modes of AD, the forward mode and the reverse mode, and compares them with divided differences. Building on this, the fourth chapter discusses computing second derivatives with AD and combines AD with the idea of partitioned groups: based on the structure of the Hessian, it establishes direct and indirect methods for evaluating the derivatives of the objective function and compares them with each other. This is one of the main contributions of the thesis. In the fifth chapter we establish the new method, which retains the benefits of the PGC (partitioned group correction) method while also obtaining exact Hessian-vector products. The chapter concludes with basic numerical experiments implementing the new algorithm.
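As a concrete illustration of why AD yields exact Hessian-vector products where divided differences only approximate them, the sketch below implements forward-over-forward AD with a minimal dual-number class. This is not code from the thesis: the names `Dual`, `rosenbrock`, and `hessian_vector_product` are illustrative, and the chained Rosenbrock function is assumed here as a stand-in for a partially separable objective with a sparse (tridiagonal) Hessian.

```python
class Dual:
    """Forward-mode AD value: primal part `val` and tangent part `dot`.
    Both parts may themselves be Duals, which yields second derivatives."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    @staticmethod
    def _lift(o):
        return o if isinstance(o, Dual) else Dual(o)

    def __add__(self, o):
        o = Dual._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __sub__(self, o):
        o = Dual._lift(o)
        return Dual(self.val - o.val, self.dot - o.dot)

    def __rsub__(self, o):
        return Dual._lift(o) - self

    def __mul__(self, o):
        o = Dual._lift(o)
        # Product rule propagates the tangent exactly (no truncation error).
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__


def rosenbrock(x):
    """Chained Rosenbrock function: partially separable, tridiagonal Hessian."""
    total = 0.0
    for i in range(len(x) - 1):
        t = x[i + 1] - x[i] * x[i]
        u = 1.0 - x[i]
        total = total + 100.0 * t * t + u * u
    return total


def hessian_vector_product(f, x, v):
    """Exact gradient and H(x) @ v via one nested forward pass per coordinate.

    The inner dual level seeds the unit vector e_i (gradient component i);
    the outer dual level seeds the direction v.  For the output y,
    y.val.dot = (grad f)_i and y.dot.dot = (H v)_i, both exact.
    """
    n = len(x)
    grad, hv = [0.0] * n, [0.0] * n
    for i in range(n):
        args = [Dual(Dual(x[j], 1.0 if j == i else 0.0),  # inner seed: e_i
                     Dual(v[j], 0.0))                      # outer seed: v
                for j in range(n)]
        y = f(args)
        grad[i] = y.val.dot
        hv[i] = y.dot.dot
    return grad, hv


if __name__ == "__main__":
    x = [-1.2, 1.0, -1.2, 1.0]
    v = [1.0, 0.0, 0.0, 0.0]
    grad, hv = hessian_vector_product(rosenbrock, x, v)
    print("grad f(x) =", grad)
    print("H(x) v    =", hv)
```

Note the cost of this naive version: n nested forward passes per Hessian-vector product. A partitioned-group scheme in the spirit of the thesis's PGC method would exploit the sparsity structure of the Hessian to group structurally independent columns and thereby reduce the number of passes.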
Keywords/Search Tags:Unconstrained Optimization, Automatic Differentiation, Hessian, Sparsity, Partition