
Study On The Optimal Estimation Problem Under Sublinear Operators And Convex Operators

Posted on: 2021-04-21   Degree: Doctor   Type: Dissertation
Country: China   Candidate: C L Kong   Full Text: PDF
GTID: 1360330602980903   Subject: Probability theory and mathematical statistics
Abstract/Summary:
In the classical probability theory framework, the orthogonal projection theorem tells us that the conditional expectation of the estimated variable is the optimal solution of its minimum mean square estimation problem. It is on the basis of the orthogonal projection theorem that Kalman [47] and Kalman and Bucy [46] first gave the complete filtering equations for the linear Gaussian system, thus laying the foundation of modern filtering theory. Bensoussan [8] and Liptser and Shiryaev [51] further gave a complete account of the theory of Kalman-Bucy filtering. On the basis of such a complete theoretical system, a series of stochastic optimal control problems under partial observation (or partial information) can be solved in different fields. If we replace the mathematical expectation with a sublinear or convex operator, how can we obtain the least mean square estimator under sublinear operators (or convex operators), and is the least mean square estimator still consistent with the conditional g-expectation and the conditional coherent risk measure? This is a very meaningful question. Recently, Sun and Ji [74] studied the least mean square estimation problem of bounded random variables under sublinear operators. However, their result has some limitations, so we need to generalize it to the integrable space in order to consider some meaningful problems in the field of stochastic analysis.

In this thesis, we study the least mean square estimation problems under sublinear operators and convex operators, the filtering problem for a signal equation (or observation equation) with an uncertainty parameter under sublinear operators, and the filtering problem for a system with an uncertainty parameter under convex operators. The dissertation is divided into seven chapters: the first chapter contains preliminary knowledge, and the contents of Chapters 2 through 7 are as follows.

Chapter 2: In this chapter, the least mean square estimation problem of unbounded random variables under sublinear operators is studied. Under some mild hypotheses, we obtain existence and uniqueness theorems for the least mean square estimator of unbounded random variables, together with some basic properties. Three examples are given to show that the least mean square estimator is different from the conditional coherent risk measure and the conditional g-expectation. The innovation of this chapter is to remove the boundedness assumption of Sun and Ji [74] and to propose a new proof strategy for the least mean square estimation problem of integrable random variables under sublinear operators. Existence and uniqueness theorems for the least mean square estimator, as well as some of its properties, are also given.
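To fix ideas, the problem of this chapter can be stated schematically as follows (a sketch in generic notation, not the thesis's exact spaces or assumptions): given an integrable random variable ξ, a sub-σ-algebra 𝒢 of ℱ and a sublinear operator ℰ, find

\[
\eta^{*} \;\in\; \arg\min_{\eta \in L_{\mathcal{G}}} \, \mathcal{E}\big[(\xi - \eta)^{2}\big],
\]

where L_𝒢 is a suitable space of 𝒢-measurable random variables. When ℰ is the classical expectation, the orthogonal projection theorem gives η* = E[ξ | 𝒢]; for a genuinely sublinear ℰ, the minimizer need not coincide with the conditional g-expectation or the conditional coherent risk measure, as the examples of Chapter 2 show.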
Chapter 3: In this chapter, a generalized Kalman-Bucy model with model uncertainty and a corresponding robust estimation problem are studied, where the model uncertainty parameter θ mainly affects the signal equation. We find that this robust estimation problem is equivalent to an estimation problem under a sublinear operator. By the Girsanov transformation and a minimax theorem, we prove that the problem can be reformulated as a classical Kalman-Bucy filtering problem under a new probability measure. The equation which governs the optimal estimator is obtained. Moreover, under some conditions the optimal estimator can be decomposed into the classical optimal estimator and a term related to the model uncertainty parameter. The innovation of this chapter is to introduce drift ambiguity into the classical Kalman-Bucy model, so that the corresponding estimation problem becomes a robust estimation problem of inf-sup type. According to Theorem 3.1, we find the optimal ambiguity parameter θ*, which allows us to reduce the two nonlinear operations sup and inf to a single inf and finally to obtain the filtering equation satisfied by the optimal estimator x̂.

Chapter 4: This chapter mainly considers a generalized Kalman-Bucy filter model under model uncertainty and the corresponding robust estimation problem, where the model uncertainty parameter θ mainly affects the observation equation. The construction of this model is based on a dynamic contract problem considered by Ji, Li and Miao [37], in whose model the observable cumulative output equation contains uncertain parameters a and θ. Our model can also be explained intuitively: different observers measure the signal process differently, which leads to model uncertainty. Therefore, it is meaningful to consider this generalized Kalman-Bucy filtering model. We again regard the robust estimation problem as an estimation problem under a sublinear operator and obtain the characterization equation of the optimal estimator. Similarly, under special conditions, the optimal estimator can be decomposed into two parts, one of which is the filtering equation of the classical case, while the other part contains the optimal ambiguity parameter θ*. This result helps to explain how the ambiguity parameter θ influences the evolution of the optimal estimator.
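For orientation, the robust estimation problems of Chapters 3 and 4 share the following schematic structure (generic notation; the coefficients, the ambiguity set Θ and the filtration are placeholders rather than the thesis's exact objects): an ambiguity parameter θ enters the drift of the signal or observation equation, and the estimator is chosen against the worst case,

\[
\hat{x}_t \;\in\; \arg\min_{\eta\ \mathcal{Z}_t\text{-measurable}}\ \sup_{\theta \in \Theta} E^{\theta}\big[(x_t - \eta)^{2}\big]
\;=\; \arg\min_{\eta\ \mathcal{Z}_t\text{-measurable}}\ \mathcal{E}\big[(x_t - \eta)^{2}\big],
\]

where ℰ[·] = sup_{θ∈Θ} E^θ[·] is a sublinear operator and Z_t denotes the filtration generated by the observations. The minimax theorem allows the sup and the inf to be exchanged, and the Girsanov transformation shows that, at the optimal θ*, the inner problem becomes a classical Kalman-Bucy filtering problem under a new probability measure.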
Chapter 5: In this chapter, we study the minimum mean square estimation problem of bounded random variables under convex operators and obtain the existence and uniqueness of the minimum mean square estimator. The innovation of this chapter is that, considering the limitations of sublinear operators in applications, we extend the theory of the least mean square estimator under sublinear operators to convex operators. Since convex operators lack positive homogeneity, their representation carries one more penalty term than that of sublinear operators, and the main difference between this chapter and Sun and Ji [74] is how to deal with this penalty term.

Chapter 6: One innovation of this chapter is that we combine the theory of backward stochastic differential equations with Kalman-Bucy filtering theory and generalize the classical filtering problem to a robust estimation problem for the signal process. This problem can also be seen as a least mean square estimation problem under a convex operator, and we finally derive the differential equation (i.e., the filtering equation) satisfied by the least mean square estimator of the signal process. In addition, whereas the previous chapter studied the existence and uniqueness of the least mean square estimator of bounded random variables under convex operators, another innovation of this chapter is to extend the corresponding existence and uniqueness results to the integrable space and to obtain the definition of the conditional expectation of integrable random variables under a convex operator. Since convex operators lack positive homogeneity, their representation again carries an extra penalty term compared with sublinear operators, and the main difference between this chapter and the sublinear case is how to deal with this penalty term.

Chapter 7: In this chapter, we reconstruct the robust estimation problem of Chapter 3 from the perspective of stochastic control, so that it becomes a zero-sum forward-backward stochastic differential game problem, in which the cost functional is defined as

J(a,b;θ) = E[Y^{(a,b;θ)}(0)] = Y^{(a,b;θ)}(0),    (7)

and the state variable (x̂(·), Y(·)) satisfies a forward-backward system in which K(t) is a uniformly bounded deterministic function and (a(t), b(t); θ(t)) is the control variable. The admissible control set is A_Z^{(1)} × A_F^{(2)}, where A_Z^{(1)} consists of pairs (a, b) of Z_t-measurable processes belonging to L^2_{Z_t}(0,T;R^n) and A_F^{(2)} = {θ | θ(t) is an F_t-measurable process such that |θ(t)| ≤ μ}, μ ≥ 0; if (a,b;θ) ∈ A_Z^{(1)} × A_F^{(2)}, we call (a,b;θ) an admissible control. We then define a Hamiltonian function H(t, x̂, Y, Z_1, Z_2, a, b, θ, l, n_2, ·) (the last argument being an additional adjoint process) and the corresponding adjoint equations. Applying the technique of convex variation, we obtain the following maximum principle: let Assumption 4 hold, and suppose that (a(·), b(·); θ(·)) is a saddle point of problem (7) and (x̂(·), Y(·), Z_1(·), Z_2(·)) is the corresponding state trajectory; then

E[H_a(t) | Z_t] = 0,   E[H_b(t) | Z_t] = 0,   E[H_θ(t) | F_t] = 0,

where the subscripts denote partial derivatives of H with respect to a, b and θ evaluated along the optimal trajectory, and (l(·), n_1(·), n_2(·)) together with the remaining adjoint process are the solutions of the adjoint equations.
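As a reminder of the underlying game-theoretic notion (a standard definition in generic notation, with the estimator controls (a, b) minimizing and the ambiguity parameter θ maximizing, consistent with the inf-sup structure of Chapter 3), a saddle point (a*, b*; θ*) of the zero-sum game with cost functional J satisfies

\[
J(a^{*}, b^{*}; \theta) \;\le\; J(a^{*}, b^{*}; \theta^{*}) \;\le\; J(a, b; \theta^{*})
\qquad \text{for all admissible } (a, b; \theta),
\]

and the maximum principle above provides first-order conditions characterizing such a saddle point.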
Keywords/Search Tags: Kalman-Bucy filter, model uncertainty, robust estimation, minimum mean square estimator, minimax theorem, sublinear operator, convex operator