
Smooth Quasi-Newton Methods for Nonsmooth Optimization

Posted on: 2019-09-18
Degree: Ph.D
Type: Thesis
University: Cornell University
Candidate: Guo, Jiayi
Full Text: PDF
GTID: 2450390005994358
Subject: Operations Research
Abstract/Summary:
The success of Newton's method for smooth optimization, when Hessians are available, motivated the idea of quasi-Newton methods, which approximate Hessians in response to changes in gradients and achieve superlinear convergence on smooth functions. Sporadic informal observations over several decades (made more formal in recent work of Lewis and Overton) suggest that such methods also work surprisingly well on nonsmooth functions. This thesis explores that phenomenon from several perspectives. First, Powell's fundamental 1976 convergence proof for the popular Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method on smooth convex functions in fact extends to some nonsmooth settings. Secondly, removing the influence of linesearch techniques by introducing linesearch-free quasi-Newton approaches (including a version of Shor's R-algorithm) shows, in particular, how repeated quasi-Newton updating at a single point can serve as a separation technique for convex sets. Lastly, an experimental comparison, in the nonsmooth setting, of the two most popular smooth quasi-Newton updates, BFGS and Symmetric Rank-One, emphasizes the power of the BFGS update.
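To make the phenomenon concrete, the following is a minimal sketch (not the thesis's own algorithm) of the standard BFGS inverse-Hessian update paired with a simple backtracking Armijo linesearch, applied to the nonsmooth convex function f(x) = |x0| + x1^2 using a subgradient in place of the gradient. The function, starting point, and all tolerances are illustrative assumptions; Lewis and Overton's experiments use a weak Wolfe linesearch instead.

```python
import numpy as np

def f(x):
    # Nonsmooth convex test function (illustrative choice): kink along x0 = 0.
    return abs(x[0]) + x[1] ** 2

def subgrad(x):
    # One valid subgradient of f at x (taking sign(0) = 0 is an arbitrary choice).
    return np.array([np.sign(x[0]), 2.0 * x[1]])

def bfgs_nonsmooth(x0, iters=200, c=1e-4):
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))              # inverse-Hessian approximation
    g = subgrad(x)
    for _ in range(iters):
        d = -H @ g                  # quasi-Newton direction
        # Backtracking until the Armijo sufficient-decrease condition holds.
        t = 1.0
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
            if t < 1e-12:           # no usable decrease along d: stop
                return x
        x_new = x + t * d
        g_new = subgrad(x_new)
        s, y = x_new - x, g_new - g
        ys = y @ s
        if ys > 1e-12:              # curvature condition keeps H positive definite
            rho = 1.0 / ys
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

x_star = bfgs_nonsmooth([3.0, 2.0])
print(f(x_star))
```

Note that no stationarity test on the gradient norm appears: at the minimizer the subgradient does not vanish, so the method instead stalls with tiny steps near the kink while the function values keep decreasing, which is the behavior the thesis studies.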
Keywords/Search Tags: Quasi-Newton, Smooth, Methods, BFGS