
Prediction And Correction Method For Variational Inequality Problems And Its Applications

Posted on: 2020-10-09    Degree: Doctor    Type: Dissertation
Country: China    Candidate: X M Dong    Full Text: PDF
GTID: 1360330578474036    Subject: Computational Mathematics
Abstract/Summary:
Convex programming and variational inequality (VI) problems play vital roles in a wide class of problems arising from mathematics, management science, economics, and other research fields. As the connections between disciplines grow closer, more and more problems can be characterized as convex optimization or VI problems. With the advent of big data and machine learning, these problems keep growing in size as the amount of relevant information increases, and they become increasingly difficult to solve. A large number of iterative algorithms have been proposed, including the proximal point algorithm, the proximal gradient method, and prediction and correction methods. In this thesis, we study monotone variational inequality problems and nonconvex optimization problems solved by a class of prediction and correction methods.

In Chapter 1, we introduce convex optimization problems and variational inequality problems, and summarize some important algorithms and their applications.

In Chapter 2, we recall nonexpansive operators, the properties of the projection operator, basic and crucial inequalities for VI, and some definitions. Among the many classical methods for solving monotone variational inequality problems, one of the simplest and most effective is the extragradient method, which is widely used thanks to its simple iterative format. Similar to the extragradient method, each iteration of a prediction and correction method is divided into two steps: a prediction step and a correction step. The difference is that the extragradient method uses the same step length in both steps, while the prediction and correction method selects different step sizes, and the latter obtains better numerical results.

In Chapter 3, we propose a prediction and correction method with Barzilai-Borwein (BB) step sizes to solve monotone (pseudo-monotone) variational inequality problems. We adopt the BB step-size strategy in the prediction step. To guarantee convergence, we use a line search technique and relax its conditions so as to accept BB step sizes that are as large as possible. In the correction step, we use a longer step size to compute the next iterate. Finally, we present preliminary numerical results showing the efficiency of the algorithm compared with He's method with self-adaptive step sizes.

The congestion road pricing problem with capacity constraints in traffic networks can be transformed into a variational inequality problem with linear constraints. In Chapter 4, we consider a class of such problems in which the demand function is unknown and the system acts as an oracle, so many classical methods cannot handle them. Although the exact solution can be obtained by observing users' behavior, doing so is expensive, which makes it important to compute an inexact solution instead of an exact one. In this chapter, we propose a modified inexact prediction and correction method. Under the mild condition that the underlying mapping is strongly monotone, we prove global convergence; in addition, we prove local linear convergence under an error bound condition.
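To make the contrast drawn in Chapter 2 concrete, the following sketch compares the two schemes on a generic VI(F, Omega). The choice of Omega (the nonnegative orthant), the parameter values, and the particular projection-and-contraction form of the correction step are illustrative assumptions, not the thesis's exact algorithm.

```python
import numpy as np

def project(x):
    """Projection onto Omega; the nonnegative orthant is used here only
    as a simple, illustrative feasible set."""
    return np.maximum(x, 0.0)

def extragradient(F, x, beta=0.1, iters=1000, tol=1e-8):
    """Extragradient method: the SAME step length beta is used in both
    the prediction and the correction step."""
    for _ in range(iters):
        x_tilde = project(x - beta * F(x))        # prediction step
        x_next = project(x - beta * F(x_tilde))   # correction step (same beta)
        if np.linalg.norm(x_next - x) < tol:
            break
        x = x_next
    return x

def prediction_correction(F, x, beta=0.1, gamma=1.9, iters=1000, tol=1e-8):
    """A projection-and-contraction style prediction-correction method:
    the correction step uses its own step size gamma * alpha instead of
    reusing beta, which is typically longer in practice."""
    for _ in range(iters):
        x_tilde = project(x - beta * F(x))               # prediction step
        d = (x - x_tilde) - beta * (F(x) - F(x_tilde))   # correction direction
        alpha = float(np.dot(x - x_tilde, d)) / max(float(np.dot(d, d)), 1e-16)
        x_next = x - gamma * alpha * d                   # correction step
        if np.linalg.norm(x_next - x) < tol:
            break
        x = x_next
    return x

if __name__ == "__main__":
    # A small monotone affine VI: F(z) = M z + q over the nonnegative orthant.
    M = np.array([[4.0, 1.0], [-1.0, 4.0]])
    q = np.array([-1.0, -2.0])
    F = lambda z: M @ z + q
    print(extragradient(F, np.zeros(2)))
    print(prediction_correction(F, np.zeros(2)))
```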
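For the BB strategy of Chapter 3, the two classical Barzilai-Borwein formulas carry over to the VI setting with s_k = x_k - x_{k-1} and y_k = F(x_k) - F(x_{k-1}). The sketch below shows the step-size computation together with one typical line-search acceptance test; the safeguard interval and the specific acceptance condition are assumptions for illustration, not necessarily the thesis's relaxed conditions.

```python
import numpy as np

def bb_step(x, x_prev, Fx, Fx_prev, variant=1, beta_min=1e-8, beta_max=1e8):
    """Barzilai-Borwein step sizes adapted to the VI mapping F.
    With s = x_k - x_{k-1} and y = F(x_k) - F(x_{k-1}):
        BB1: (s's)/(s'y),   BB2: (s'y)/(y'y).
    The value is clipped to a safeguard interval; when the curvature
    estimate s'y is not positive, we fall back to the upper bound."""
    s = x - x_prev
    y = Fx - Fx_prev
    sy = float(np.dot(s, y))
    if sy <= 0.0:
        return beta_max
    beta = float(np.dot(s, s)) / sy if variant == 1 else sy / float(np.dot(y, y))
    return float(np.clip(beta, beta_min, beta_max))

def accept_step(beta, x, x_tilde, Fx, Fx_tilde, nu=0.9):
    """An illustrative line-search acceptance test common in
    prediction-correction methods: the trial step beta is accepted when
        beta * ||F(x) - F(x_tilde)|| <= nu * ||x - x_tilde||,  nu in (0, 1);
    otherwise beta is shrunk and the prediction step is recomputed."""
    return beta * np.linalg.norm(Fx - Fx_tilde) <= nu * np.linalg.norm(x - x_tilde)
```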
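As a rough illustration of the inexact setting of Chapter 4, one prediction-correction iteration might interact with the system as follows. The oracle interface F_oracle, its accuracy parameter eps, and the fixed-accuracy error model are hypothetical, introduced only to show where inexactness enters; the thesis's modified method is not reproduced here.

```python
import numpy as np

def inexact_pc_step(F_oracle, project, x, beta, gamma=1.9, eps=1e-3):
    """One iteration of a prediction-correction scheme in which the mapping
    is only accessible through an inexact oracle F_oracle(x, eps), returning
    an approximation of F(x) with error at most eps (e.g., estimated from
    observed user behavior at some cost)."""
    Fx = F_oracle(x, eps)                               # inexact evaluation at x
    x_tilde = project(x - beta * Fx)                    # prediction step
    Fx_tilde = F_oracle(x_tilde, eps)                   # inexact evaluation at x_tilde
    d = (x - x_tilde) - beta * (Fx - Fx_tilde)          # correction direction
    alpha = float(np.dot(x - x_tilde, d)) / max(float(np.dot(d, d)), 1e-16)
    return x - gamma * alpha * d                        # correction step
```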
Although convex programming problems are widely used, many practical problems still cannot be characterized as convex programs. In the last two chapters, we therefore consider solving nonconvex optimization problems by the proximal gradient method with extrapolation. This method can be regarded as a prediction and correction method, with the extrapolation step viewed as the prediction step and the proximal gradient step as the correction step.

In Chapter 5, we consider minimizing the sum of a nonconvex function with Lipschitz continuous gradient and a semiconvex function. We study the convergence of the proximal gradient algorithm with extrapolation under an error bound condition. We show that there exists a threshold such that, if the extrapolation coefficients are chosen below it, the sequence generated by the algorithm converges R-linearly to a stationary point of the problem; moreover, the corresponding sequence of objective values also converges R-linearly.

In Chapter 6, we consider minimizing the sum of a nonconvex function with Lipschitz continuous gradient and a proper closed convex function. We propose an inexact version of the proximal gradient algorithm with extrapolation, in which the subproblem is allowed to be solved inexactly under a certain relative error criterion. We prove convergence of the iterate sequence under the Kurdyka-Łojasiewicz (KL) inequality. Furthermore, the convergence rate of the proposed algorithm can be established when the KL exponent is known.
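A minimal sketch of the proximal gradient method with extrapolation discussed in Chapters 5 and 6 follows. The constant extrapolation coefficient and the convex l1-regularized least-squares instance are assumptions chosen only to keep the example concrete and runnable; the thesis studies nonconvex smooth parts and coefficient sequences below a problem-dependent threshold.

```python
import numpy as np

def prox_grad_extrapolation(grad_f, prox_g, x0, L, beta=0.3, iters=500, tol=1e-8):
    """Proximal gradient method with extrapolation for min_x f(x) + g(x):
        y_k     = x_k + beta_k * (x_k - x_{k-1})          # extrapolation (prediction)
        x_{k+1} = prox_{g, 1/L}(y_k - grad_f(y_k) / L)    # proximal gradient (correction)
    A constant coefficient beta is used here only for simplicity."""
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x + beta * (x - x_prev)                   # extrapolation step
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)    # proximal gradient step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x_prev, x = x, x_new
    return x

def prox_l1(v, t, lam=0.1):
    """Proximal operator of g(x) = lam * ||x||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

if __name__ == "__main__":
    # f(x) = 0.5 * ||A x - b||^2 has Lipschitz gradient with L = ||A||_2^2.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    L = np.linalg.norm(A, 2) ** 2
    sol = prox_grad_extrapolation(lambda x: A.T @ (A @ x - b), prox_l1,
                                  np.zeros(5), L)
    print(sol)
```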
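For the relative error criterion of Chapter 6, one common way to formalize "the subproblem is solved inexactly" is to bound the subproblem's residual relative to the progress of the iterate. The specific test below is an illustrative assumption, not the thesis's exact criterion.

```python
import numpy as np

def relative_error_ok(z, x, y, grad_f_y, subgrad_g_z, L, tau=0.5):
    """Illustrative relative-error acceptance test for an inexact proximal
    step.  Here z is an approximate solution of the subproblem
        min_u  g(u) + <grad_f(y), u> + (L/2) * ||u - y||^2,
    subgrad_g_z is an element of the subdifferential of g at z, and the
    subproblem residual
        r = subgrad_g_z + grad_f_y + L * (z - y)
    must be small relative to the progress ||z - x|| from the current
    iterate x; tau in (0, 1) controls the allowed inexactness."""
    r = subgrad_g_z + grad_f_y + L * (z - y)
    return np.linalg.norm(r) <= tau * L * np.linalg.norm(z - x)
```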
Keywords/Search Tags: BB step sizes, prediction and correction method, proximal gradient method with extrapolation, error bound, linear convergence, Kurdyka-Łojasiewicz (KL) inequality