
Properties Of The Nonlinear Expectations And Their Applications

Posted on: 2013-02-15    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Z Liu    Full Text: PDF
GTID: 1110330374480473    Subject: Financial mathematics and financial engineering
Abstract/Summary:
Peng (2006) introduced the notions of G-normal distribution, G-expectation and G-Brownian motion, constructed the Itô integral with respect to G-Brownian motion, and obtained the G-Itô formula as well as the existence and uniqueness of solutions of G-SDEs and G-BSDEs. The G-normal distribution plays the same important role in the theory of sublinear expectations as the normal distribution does in classical probability theory. Compared with g-expectations, the theory of G-expectation is intrinsic in the sense that it is not based on a given (linear) probability space. A G-expectation is a fully nonlinear expectation; it characterizes the variance uncertainty of a random variable. We recall that the problem of mean uncertainty has been studied by Chen-Epstein through g-expectations. Under the G-expectation framework, Peng has proposed many interesting methods and obtained many interesting results; more importantly, the framework raises many interesting open problems. For part of the recent results on G-expectations, see Denis-Hu-Peng (2008), Hu-Peng (2009), Bai-Buckdahn (2009), Xu-Zhang (2009), Gao (2009), Li-Peng (2009), Soner-Touzi-Zhang (2010), Song (2010) and so on.

This dissertation focuses on G-expectation and related questions. Firstly, we recall from Peng (2009) that the quadratic variation process (⟨B⟩_t)_{t≥0} of G-Brownian motion (B_t)_{t≥0} describes the variance uncertainty in the following sense: for each φ ∈ C_{l,Lip}(R^{d×d}), v(t,x) = Ê[φ(x + ⟨B⟩_t)] is the viscosity solution of a first-order PDE whose coefficient set describes the variance uncertainty of the G-Brownian motion. On the other hand, the generalized G-distribution, as an extension of the G-normal distribution, was first introduced by Peng (2009); it generalizes the G-normal distribution in the sense that mean uncertainty can also be described. How to find a process that describes both the variance uncertainty and the mean uncertainty of the G-distribution is therefore a natural and interesting question. The first part of Chapter 2 solves this problem and obtains the following result.

Theorem 1.3.1. The function v defined there solves a first-order PDE, where D_x v = (∂_{x_{ij}} v)_{i,j=1}^d and D_y v = (∂_{y_i} v)_{i=1}^d.

Secondly, it is well known that the Neyman-Pearson fundamental lemma gives the most powerful statistical tests for simple hypothesis testing problems. However, to the best of our knowledge, only Huber and Strassen's work on 2-alternating capacities and S. Ji and X. Zhou's paper on g-expectations have studied the nonlinear probability counterpart. The Neyman-Pearson fundamental lemma was introduced in 1933 by Neyman and Egon Pearson; it gives a sufficient condition, in a hypothesis test with null hypothesis θ = θ_0 and alternative hypothesis θ = θ_1, for choosing a critical region with a given significance level that maximizes the power of the test. Huber and Strassen showed that if the composite hypotheses can be described in terms of alternating capacities of order 2, then the minimax tests are ordinary Neyman-Pearson tests between a fixed representative pair of simple hypotheses; moreover, the condition is in a certain sense also necessary. S. Ji and X. Zhou then studied the Neyman-Pearson fundamental lemma under g-probability: under convexity assumptions, a necessary and sufficient condition characterizing the optimal randomized tests is obtained by a stochastic maximum principle approach. Note that g-expectations can be seen as a special case of generalized G-expectations.
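For reference, the standard characterization of the G-normal distribution underlying the discussion above can be written as follows (this is the usual form from Peng's theory, recalled here only for orientation; the uncertainty set Θ below is generic notation, not taken from this dissertation): a random vector ζ is G-normally distributed when u(t,x) := Ê[φ(x + √t ζ)] is the viscosity solution of the G-heat equation

\[
\partial_t u - G(D_x^2 u) = 0, \qquad u(0,x) = \varphi(x), \qquad
G(A) = \tfrac{1}{2} \sup_{Q \in \Theta} \operatorname{tr}(AQ),
\]

where Θ is a bounded, closed and convex set of nonnegative-definite symmetric matrices encoding the variance uncertainty; in dimension one, G(a) = (1/2)(σ̄² a⁺ − σ² a⁻).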
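Before turning to the result under the generalized G-expectation, it may also help to recall the classical lemma that all of these works extend; the statement below is the textbook form with generic densities f_0, f_1 and level α (notation is illustrative, not taken from the dissertation). The most powerful level-α test of H_0: X ~ f_0 against H_1: X ~ f_1 is the likelihood-ratio test

\[
\phi^*(x) =
\begin{cases}
1, & f_1(x) > k\, f_0(x),\\
0, & f_1(x) < k\, f_0(x),
\end{cases}
\qquad \text{with } k \text{ and the boundary randomization chosen so that } E_{f_0}[\phi^*(X)] = \alpha,
\]

and every test φ with E_{f_0}[φ(X)] ≤ α satisfies E_{f_1}[φ(X)] ≤ E_{f_1}[φ^*(X)].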
We then study the simple-hypothesis Neyman-Pearson problem under the generalized G-expectation and obtain the following result.

Theorem 1.4.5. There exists a randomized test which attains the minimum of the testing model considered there. We also prove that the optimal randomized test is not unique under the generalized G-expectation.

In Chapter 2 we focus on the G-martingale representation theorem in the following case, where b(X_·^{t,x}), h(X_·^{t,x}) ∈ M_G^1([t,T]) and σ(X_·^{t,x}) ∈ M_G^2([t,T]); we suppose that the solution of the above equation exists and denote it by X_s^{t,x} ∈ M_G^1([t,T]). We then have the following theorem.

Theorem 2.2.5. X_s^{t,x} is a G-martingale if and only if b(x) = -2G[h(x)].

The large deviation principle, which deals with limit problems different from those of central limit theory, is a very fruitful branch of the limit theory of probability; it refines the law of large numbers and has important applications in mathematical statistics, analysis and physics. The theory of large deviations goes back to Khintchine (1929), Cramér (1938) and Chernoff (1952). It provides a good method to compute the probabilities of rare events, which have great impact once they happen even though their probabilities are very small; the study of rare events is therefore necessary. In Chapter 3 we introduce the large deviation principle relative to capacities.

Theorem 3.2.5. If the random variables {X_i}_{i=1}^n are i.i.d. relative to both capacities u and V, then there exist rate functions I_V(A) and I_v(A) such that the corresponding large deviation bounds hold.

In Chapter 4 we consider limit theorems under nonlinear expectations. First, we give the definition of weak independence, which weakens the notion introduced by Peng (2008).

Definition 4.1.1 (Weak independence). Suppose that Y_1, Y_2, ..., Y_n is a sequence of random variables with each Y_i ∈ H. The random variable Y_n is said to be weakly independent of X := (Y_1, ..., Y_{n-1}) under Ê if, for all measurable functions φ_0, φ_1, ψ_1 on R^n with φ_0(X), φ_1(X), ψ_1(Y_n) ∈ H, the defining condition holds.

Under this condition we obtain a central limit theorem under nonlinear expectations.

Theorem 4.1.6. Let the sequence {X_i}_{i=1}^∞ be identically distributed, and assume that each X_{n+1} is weakly independent of (X_1, X_2, ..., X_n) for n = 1, 2, .... Under suitable further moment conditions, {X_i}_{i=1}^∞ converges in law to the G-normal distribution, where ζ is G-normally distributed under Ê.

Moreover, we obtain the law of the iterated logarithm under weak independence, improving the classical law of the iterated logarithm.

Theorem 4.1.7. Let {X_n}_{n=1}^∞ be a sequence of bounded IID random variables under a sublinear expectation Ê with zero means and bounded variances, i.e.,
(A.1) Ê[X_1] = ε[X_1] = 0,
(A.2) Ê[X_1^2] = σ̄^2 and ε[X_1^2] = σ^2, where 0 < σ ≤ σ̄ < ∞.
Denote S_n = Σ_{i=1}^n X_i. Then conclusions (I), (II) and (III) hold, where C({x_n}) denotes the cluster set of a sequence {x_n} in R.

We generalize the law of large numbers under sublinear expectations given by Professor Peng (2008) and obtain the corresponding law of large numbers by replacing the moment condition with a uniform integrability condition.

Theorem 4.2.4. Suppose (Ω, H, Ê) is a sublinear expectation space. Let {X_i : i = 1, 2, ...} ⊂ H be such that for each i ≥ 1, X_{i+1} is identically distributed with X_i and independent from (X_1, X_2, ..., X_i). We also assume that Ê[|X_1|] < ∞ and lim_{N→∞} Ê[|X_1| - |X_1| ∧ N] = 0. Then for each φ ∈ C_{b,Lip}(R) the law of large numbers holds, where μ̄ = Ê[X_1] ≥ -Ê[-X_1] = μ.
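For reference, the conclusions that Theorems 4.1.6 and 4.2.4 extend take, in Peng's (2008) setting of i.i.d. random variables under a sublinear expectation Ê, the following standard forms (recalled from the general theory, not copied from this dissertation): with S_n = X_1 + ... + X_n and φ ∈ C_{b,Lip}(R),

\[
\lim_{n\to\infty} \hat{\mathbb{E}}\!\left[\varphi\!\left(\frac{S_n}{n}\right)\right]
= \max_{\underline{\mu} \le v \le \overline{\mu}} \varphi(v),
\qquad
\lim_{n\to\infty} \hat{\mathbb{E}}\!\left[\varphi\!\left(\frac{S_n}{\sqrt{n}}\right)\right]
= \hat{\mathbb{E}}[\varphi(\zeta)],
\]

where μ̄ = Ê[X_1], μ = -Ê[-X_1], the second limit assumes μ̄ = μ = 0, and ζ is G-normally distributed with variance interval [σ², σ̄²].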
In particular, we consider the pricing problem under uncertainty in financial markets. For this kind of problem we mainly consider the maximal and minimal pricing proposed by Professor Chen (2005), and we apply the law of large numbers under sublinear expectations to it, obtaining some interesting conclusions on the limit properties of the maximal pricing.

Theorem 4.2.6. Let X_i = W_i - W_{i-1}, i ≥ 1. Then for each φ ∈ C_{b,Lip}(R) the stated limit holds.

We also obtain an interesting limit property of the maximal capacity; in particular, the maximal capacity is no longer absolutely continuous with respect to the Wiener measure on an infinite horizon.

Corollary 4.2.8. Let X_i = W_i - W_{i-1}, i ≥ 1. Then (1) for each -k ≤ a < b ≤ k and (2) for each c ∈ [-k, k], the stated limit relations hold.

In the last part, we briefly introduce a mathematical model and its application.
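As background for the pricing and capacity statements above, when the sublinear expectation is represented by a family 𝒫 of probability measures, the maximal and minimal prices of a claim ξ and the associated capacities are usually written as follows (a generic sketch assuming such a representation; the symbols 𝒫, V̄, V, v are illustrative and not taken from the dissertation):

\[
\overline{V}(\xi) = \sup_{P \in \mathcal{P}} E_P[\xi], \qquad
\underline{V}(\xi) = \inf_{P \in \mathcal{P}} E_P[\xi], \qquad
V(A) = \sup_{P \in \mathcal{P}} P(A), \qquad
v(A) = \inf_{P \in \mathcal{P}} P(A).
\]

In this reading, Corollary 4.2.8 concerns the behaviour of the maximal capacity V on events generated by the increments W_i - W_{i-1} as the horizon tends to infinity.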
Keywords/Search Tags: G-expectation, G-Brownian motion, G-normal distribution, generalized G-expectation, Neyman-Pearson lemma, large deviation principle, capacity, law of large numbers, law of the iterated logarithm