
Derivative-Free Methods For Solving Nonlinear Equations With Linear Constraints And Their Theoretical Analysis

Posted on: 2016-12-18  Degree: Doctor  Type: Dissertation
Country: China  Candidate: P Wang  Full Text: PDF
GTID: 1100330461985593  Subject: Operational Research and Cybernetics
Abstract/Summary:
Optimization problems arise widely in industry, agriculture, national defense, communications, and many other fields. With the recent development of computer technology and the continuous improvement of software, solving optimization problems quickly and efficiently has become both more important and more realistic. Among the many methods for solving optimization problems, the line search technique and the trust-region strategy are well-accepted tools for ensuring global convergence, while tensor methods are numerical methods for solving nonlinear equations that achieve a superlinear convergence rate under reasonable conditions. For constrained and unconstrained nonlinear equations, it is often difficult to obtain derivative information about the problem. In this thesis, we propose several families of derivative-free algorithms for solving such problems.

The trust-region approach minimizes a quadratic model that approximates the objective function in a region centered at the current iterate; by updating the trust-region radius, an acceptable step can be obtained. In this thesis, we propose a derivative-free affine-scaling trust-region method with an interior backtracking technique for bound-constrained nonlinear systems. An interpolation model is used to form the trust-region subproblem, which is defined by minimizing a squared Euclidean norm reformulation of the linear model subject to an interior ellipsoidal constraint. Combined with a line search technique, each iterate switches to a strictly feasible interior point to generate a new trial step. For nonlinear equations with linear inequality constraints, the trust-region subproblem is defined subject to an ellipsoidal constraint through a new affine scaling matrix. The line search technique avoids solving the trust-region subproblem repeatedly, improves the efficiency of the algorithm, and ensures strict interior feasibility. The global convergence and fast local convergence rate of the proposed algorithm are established under reasonable conditions, and numerical experiments are reported to show its effectiveness.

The Levenberg-Marquardt method is a popular method for solving nonlinear equations. However, computing the exact solution of the subproblem at each iteration is expensive for large-scale problems, so it is often effective to use inexact methods that find an approximate solution satisfying appropriate conditions. In this thesis, we build the Levenberg-Marquardt method on an interpolation model. To ensure a good approximation between the interpolation model and the original function, the interpolation set needs to be Λ-poised and the interpolation radius must tend to 0; for this purpose, the new interpolation radius is defined by the norm of the gradient of the interpolation model. Using a new affine scaling matrix, we obtain a search direction by minimizing a strictly convex function. A line search technique is then used to find an acceptable trial step length along this direction that is strictly feasible and makes the objective function decrease monotonically. We study the local convergence behavior of the algorithm and establish local superlinear and quadratic convergence under local error bound conditions, which are considerably weaker than a nonsingularity assumption on the Jacobian.
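For illustration only, the following is a minimal Python/NumPy sketch of a classical, derivative-based Levenberg-Marquardt iteration with a trust-region-style acceptance test for a square system F(x) = 0. The function names and parameter values are hypothetical, and the sketch omits the derivative-free interpolation models, affine scaling matrices, and strict-feasibility safeguards developed in the thesis.

```python
import numpy as np

def levenberg_marquardt(F, J, x0, mu=1.0, tol=1e-10, max_iter=100):
    """Sketch of a damped Gauss-Newton (Levenberg-Marquardt) loop for F(x) = 0.

    F : callable returning the residual vector F(x)
    J : callable returning the Jacobian matrix J(x)
    (In a derivative-free setting, J would be replaced by the Jacobian of an
    interpolation model built on a Lambda-poised sample set.)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx, Jx = F(x), J(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Damped normal equations: (J^T J + mu I) d = -J^T F
        A = Jx.T @ Jx + mu * np.eye(x.size)
        d = np.linalg.solve(A, -Jx.T @ Fx)
        # Actual vs. predicted reduction of the merit function 0.5*||F||^2
        pred = 0.5 * (np.linalg.norm(Fx) ** 2 - np.linalg.norm(Fx + Jx @ d) ** 2)
        ared = 0.5 * (np.linalg.norm(Fx) ** 2 - np.linalg.norm(F(x + d)) ** 2)
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.25:              # good agreement: accept and relax damping
            x = x + d
            mu = max(0.5 * mu, 1e-12)
        else:                       # poor agreement: reject and increase damping
            mu *= 4.0
    return x

# Hypothetical usage on a small nonlinear system with solution (1, 1)
F = lambda x: np.array([x[0] ** 2 + x[1] - 2.0, x[0] + x[1] ** 2 - 2.0])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
print(levenberg_marquardt(F, J, np.array([0.5, 0.5])))
```

The acceptance ratio rho plays the same role as the trust-region ratio test described above: when the model predicts the actual decrease well, the damping (and hence the implicit step restriction) is relaxed; otherwise the step is rejected and the restriction is tightened.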
The numerical results indicate that the algorithm is useful and effective in practice.

Tensor methods are effective for solving nonlinear equations in situations where the Jacobian is singular or ill-conditioned. In this thesis, we develop a derivative-free tensor method based on an interpolation model. To ensure that the interpolation model approximates the original problem well, the new interpolation radius is again defined by the norm of the gradient of the interpolation model. Our algorithm produces a more accurate tensor step than direct tensor methods by means of a curvilinear line search and a block-2 tensor method. At the same time, the algorithm provides a global strategy, namely a two-dimensional trust-region approach. Numerical experiments are reported to show the effectiveness of the proposed algorithm.

The last chapter summarizes the main results of this thesis and proposes some further research directions.
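As a rough illustration of the tensor-model idea (not the block-2 or derivative-free variant developed in the thesis), a rank-one second-order term can be added to the linear model so that it interpolates the residual at the previous iterate, in the spirit of classical tensor methods for nonlinear equations. All names and data below are hypothetical.

```python
import numpy as np

def rank_one_tensor_model(F_k, J_k, x_k, x_prev, F_prev):
    """Sketch of a rank-one tensor model M(s) = F_k + J_k s + 0.5 * a * (s_p^T s)^2.

    The vector a is chosen so that the model reproduces the previous residual
    F_prev at s_p = x_prev - x_k; a derivative-free variant would replace the
    exact Jacobian J_k by the Jacobian of an interpolation model.
    """
    s_p = x_prev - x_k
    a = 2.0 * (F_prev - F_k - J_k @ s_p) / (s_p @ s_p) ** 2

    def model(s):
        return F_k + J_k @ s + 0.5 * a * (s_p @ s) ** 2

    return model

# Hypothetical check that the model interpolates the previous residual
x_k, x_prev = np.array([1.0, 2.0]), np.array([0.5, 1.5])
F_k, F_prev = np.array([0.3, -0.1]), np.array([1.0, 0.2])
J_k = np.array([[2.0, 1.0], [0.0, 3.0]])
M = rank_one_tensor_model(F_k, J_k, x_k, x_prev, F_prev)
print(np.allclose(M(x_prev - x_k), F_prev))  # True
```

The extra quadratic term costs little to build, but minimizing the resulting model (the tensor step) requires a dedicated solver, which is where the curvilinear line search and two-dimensional trust-region strategy mentioned above come in.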
Keywords/Search Tags:derivative-free optimization, trust-region, system of nonlinear equations, interior point, Levenberg-Marquardt method, tensor method