This thesis focuses on the SGD method among first-order stochastic optimization methods and on the BFGS method among quasi-Newton methods. The main contributions are as follows.

By introducing a definition of the virtual gradient based on the computation graph, we propose a virtual gradient descent method for solving stochastic optimization problems, referred to as the SVGD method, and analyze its convergence. The new method has the advantages of low storage cost and low per-iteration computational complexity. Experimental results in deep learning show that the SVGD method converges faster than other common first-order stochastic optimization algorithms, indicating that it is more efficient.

By combining a dynamic subspace technique with the BFGS method, we propose a numerical method for solving large-scale unconstrained optimization problems, referred to as the Fast-BFGS method, and establish its convergence theory. Experimental results comparing the Fast-BFGS method with the BFGS and L-BFGS methods on the CUTE test problems show that Fast-BFGS offers small storage cost, low per-iteration floating-point complexity in parallel mode, faster convergence, and a wide range of applicability.
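For orientation, the sketch below shows the plain SGD baseline against which first-order methods such as SVGD are typically compared. It is a minimal illustration on a synthetic least-squares problem, not the SVGD method itself; the virtual-gradient construction from the computation graph is not reproduced here.

```python
import numpy as np

# Minimal SGD baseline (NOT the thesis's SVGD method): minimize
# f(w) = mean_i (x_i . w - y_i)^2 with one randomly sampled term per step.

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))            # synthetic features
w_true = rng.normal(size=10)               # hidden ground-truth parameters
y = X @ w_true + 0.01 * rng.normal(size=1000)

w = np.zeros(10)                           # parameters to learn
lr = 0.05                                  # fixed step size
for step in range(2000):
    i = rng.integers(0, 1000)              # sample one data point
    grad = 2.0 * (X[i] @ w - y[i]) * X[i]  # stochastic gradient of f_i
    w -= lr * grad                         # SGD update: w <- w - lr * g

print(np.linalg.norm(w - w_true))          # small residual on this toy problem
```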
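For reference, the standard BFGS inverse-Hessian update that the Fast-BFGS comparison is measured against is
\[
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k \left(I - \rho_k y_k s_k^{\top}\right) + \rho_k s_k s_k^{\top},
\qquad \rho_k = \frac{1}{y_k^{\top} s_k},
\]
where $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$. Storing the dense $n \times n$ matrix $H_k$ costs $O(n^2)$ memory, while L-BFGS keeps only the $m$ most recent pairs $(s_k, y_k)$ at $O(mn)$ cost; the storage and per-iteration advantages claimed for Fast-BFGS are relative to these baselines. The dynamic subspace construction itself is developed in the thesis and is not reproduced here.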