| In this thesis, we propose Newton-type hybrid methods for unconstrained optimization problems and for systems of nonmonotone nonlinear equations with singular solutions.

For unconstrained minimization problems with singular solutions, when the objective function is nonconvex the Hessian matrix of the objective function may be singular. The Newton direction may then fail to exist or, even when it exists, may not be a descent direction of the objective function. To overcome this difficulty, we develop a hybrid method that combines the regularized Newton method with the steepest descent method. Specifically, when the direction generated by the inexact Newton equation is not a descent direction, we use the negative gradient as the search direction instead. Under mild conditions, we prove the global convergence of the proposed method. Moreover, we show that after finitely many iterations the hybrid method reduces to the regularized Newton method, and consequently it possesses a locally quadratic convergence property.

We also propose a Newton-type method for solving systems of nonmonotone nonlinear equations. The method combines Newton's method and the gradient method with a projection strategy. We first solve a Newton equation; when the Newton equation is not solvable, we use the gradient direction as the search direction. Unlike classical globalization strategies for Newton methods, the line search used in our method does not decrease the value of a merit function. Instead, it is used to construct an appropriate hyperplane that separates the current iterate from the solution set. The step is followed by projecting the current iterate onto this hyperplane, which ensures global convergence of the algorithm. An important property of the algorithm is that the whole sequence of iterates converges globally to a solution of the system without any additional regularity assumption. Moreover, under standard assumptions, a local superlinear rate of convergence is achieved. |
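To make the first hybrid scheme concrete, the following is a minimal illustrative sketch, not the thesis's algorithm: a one-dimensional regularized Newton iteration with a steepest-descent fallback and Armijo backtracking. The test problem f(x) = x^4, the regularization choice mu = ||g||, and the line-search constants are all assumptions made purely for the illustration; the Hessian of f is singular at the minimizer x* = 0, which is the situation the abstract describes.

```python
# Illustrative sketch (NOT the thesis's method) of a hybrid
# regularized-Newton / steepest-descent iteration in one dimension.
# Test problem: f(x) = x^4, whose Hessian vanishes at the minimizer 0.

def hybrid_regularized_newton(x, iters=60, sigma=1e-4, beta=0.5):
    f = lambda t: t ** 4          # objective with a singular Hessian at 0
    g = lambda t: 4 * t ** 3      # gradient
    h = lambda t: 12 * t ** 2     # Hessian (a scalar in 1-D)
    for _ in range(iters):
        grad = g(x)
        if grad == 0.0:
            break                 # stationary point reached
        mu = abs(grad)            # assumed regularization: mu_k = ||g_k||
        d = -grad / (h(x) + mu)   # regularized Newton direction
        if grad * d >= 0:         # not a descent direction:
            d = -grad             # fall back to steepest descent
        t = 1.0                   # Armijo backtracking line search
        while f(x + t * d) > f(x) + sigma * t * grad * d:
            t *= beta
        x = x + t * d
    return x

print(hybrid_regularized_newton(1.0))
```

On this degenerate problem the regularized direction is always a descent direction, so the gradient fallback never fires; it is included to show where the hybrid switch sits in the loop.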
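The projection step of the second method can likewise be sketched on a toy system. The code below follows a Solodov–Svaiter-style hyperplane projection as an illustration only: the line search does not reduce a merit function but locates a point z on the ray x + t d at which the hyperplane {y : F(z)·(y − z) = 0} separates x from the solution set, and the next iterate is the projection of x onto that hyperplane. The test mapping F(x) = (x0^3, x1^3), which has a singular Jacobian at the solution (0, 0), and all constants are assumptions; it is monotone, so it does not exercise the nonmonotone theory claimed in the abstract.

```python
# Illustrative sketch (NOT the thesis's method) of a Newton/gradient
# iteration globalized by hyperplane projection, for F(x) = 0 with
# F(x) = (x0^3, x1^3): the Jacobian is singular at the solution (0, 0).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def projection_newton(x, iters=200, sigma=1e-4, beta=0.5):
    F = lambda y: [y[0] ** 3, y[1] ** 3]
    for _ in range(iters):
        Fx = F(x)
        J = [3 * x[0] ** 2, 3 * x[1] ** 2]          # diagonal Jacobian
        if min(J) > 1e-12:
            d = [-Fx[i] / J[i] for i in range(2)]    # Newton direction
        else:
            d = [-Fx[i] for i in range(2)]           # fallback direction -F(x)
        # Line search: find z = x + t*d with  -F(z).d >= sigma * t * ||d||^2,
        # so the hyperplane through z with normal F(z) separates x from x*.
        t = 1.0
        z = [x[i] + t * d[i] for i in range(2)]
        while -dot(F(z), d) < sigma * t * dot(d, d):
            t *= beta
            z = [x[i] + t * d[i] for i in range(2)]
        Fz = F(z)
        denom = dot(Fz, Fz)
        if denom == 0.0:
            return z                                 # z already solves F = 0
        lam = dot(Fz, [x[i] - z[i] for i in range(2)]) / denom
        x = [x[i] - lam * Fz[i] for i in range(2)]   # project x onto hyperplane
    return x

root = projection_newton([1.0, 0.5])
print(root)
```

Because each step is a projection toward the separating hyperplane, the iterates approach the solution set even though the Jacobian degenerates at the root; on this toy problem convergence is correspondingly slow near the end, consistent with the loss of fast local rates at singular solutions.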