
Foundations Of Statistical Learning Theory Based On Complex Samples On Uncertainty Space

Posted on: 2011-03-01
Degree: Master
Type: Thesis
Country: China
Candidate: X K Zhang
Full Text: PDF
GTID: 2120360308954081
Subject: Applied Mathematics
Abstract/Summary:
Statistical Learning Theory (SLT) is one of the most important theories for dealing with small-sample problems. However, classical SLT is built on probability space with real random samples, so it can hardly handle statistical learning problems posed on non-probability spaces or with non-real samples arising in the real world. Uncertainty space is more general than probability space. This dissertation develops SLT on uncertainty space based on complex random samples. First, the complex uncertain variable is defined, together with its distribution function, expected value, and variance, and the Markov inequality, the Chebyshev inequality, and the Khintchine law of large numbers for complex samples on uncertainty space are proved. Second, new concepts are introduced, including the complex empirical risk functional, the complex expected risk functional, and the strict consistency of the complex empirical risk minimization (ERM) principle on uncertainty space, and the key theorem of learning theory in this setting is proved. Finally, bounds on the rate of convergence of the learning process based on complex random samples on uncertainty space are given.
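The main objects the abstract names can be sketched as follows. This is an illustrative rendering in the notation of Liu-style uncertainty theory (uncertain measure $\mathcal{M}$, expected value $E$, variance $V$); the dissertation's own definitions for the complex case may differ in detail.

```latex
% Assumed notation: \xi is a complex uncertain variable on an uncertainty
% space (\Gamma, \mathcal{L}, \mathcal{M}).  A Markov-type inequality bounds
% the tail measure by a moment of the modulus:
\[
  \mathcal{M}\{\, |\xi| \ge t \,\} \;\le\; \frac{E\!\left[\,|\xi|^{p}\,\right]}{t^{p}},
  \qquad t > 0,\; p > 0 .
\]
% The Chebyshev-type inequality is the p = 2 case applied to \xi - E[\xi]:
\[
  \mathcal{M}\{\, |\xi - E[\xi]| \ge t \,\} \;\le\; \frac{V[\xi]}{t^{2}} .
\]
% Complex empirical and expected risk functionals for a loss Q(z, \alpha)
% over complex samples z_1, \dots, z_\ell:
\[
  R_{\mathrm{emp}}(\alpha) \;=\; \frac{1}{\ell} \sum_{i=1}^{\ell} Q(z_i, \alpha),
  \qquad
  R(\alpha) \;=\; E\!\left[\, Q(z, \alpha) \,\right];
\]
% the ERM principle selects the \alpha_\ell that minimizes R_emp(\alpha),
% and the key theorem gives conditions for strict consistency, i.e.
% convergence of R_emp and R along the minimizing sequence.
```

The key theorem and the convergence-rate bounds mentioned in the abstract then concern when, and how fast, $R(\alpha_\ell)$ approaches $\inf_\alpha R(\alpha)$ under the uncertain measure.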
Keywords/Search Tags:Uncertain measure, Complex samples, Empirical risk minimization principle, Key theorem, Bounds on the rate of convergence