In this paper, rough fuzzy theory and statistical learning theory (SLT) are combined to generalize the basic problems of SLT to rough fuzzy samples under the credibility measure. First, the Khinchine law of large numbers based on the credibility measure and rough fuzzy samples is established. Concepts such as the rough fuzzy expected risk functional, the rough fuzzy empirical risk functional, and the rough fuzzy empirical risk minimization principle are proposed, and the key theorem of learning theory based on the credibility measure and rough fuzzy samples is proved. Furthermore, bounds on the rate of uniform convergence of learning processes based on the credibility measure and rough fuzzy samples are derived. Finally, we define the rough fuzzy structural risk minimization principle and give asymptotic bounds on the rate of convergence based on the credibility measure and rough fuzzy samples. These results form the theoretical cornerstones of SLT for rough fuzzy samples under the credibility measure.
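As a point of orientation, the classical (non-fuzzy) counterpart of the empirical risk minimization setup can be sketched numerically: by the Khinchine law of large numbers, the empirical risk functional (the sample average of the loss) converges to the expected risk functional as the sample size grows. The sketch below is a minimal illustration of that classical case only, assuming ordinary i.i.d. probabilistic samples (uniform on [0, 1]) and a squared loss with a fixed hypothesis c = 0.5; it does not implement the rough fuzzy or credibility-measure generalizations developed in the paper.

```python
import random

def expected_risk():
    # Closed-form expected risk E[(X - c)^2] for X ~ Uniform(0, 1), c = 0.5:
    # this is Var(X) = 1/12.
    return 1.0 / 12.0

def empirical_risk(n, c=0.5, seed=0):
    # Empirical risk functional: average loss over n i.i.d. samples.
    rng = random.Random(seed)
    losses = [(rng.random() - c) ** 2 for _ in range(n)]
    return sum(losses) / n

# As n grows, the empirical risk approaches the expected risk,
# which is the convergence the law of large numbers guarantees.
for n in (10, 1000, 100000):
    print(n, empirical_risk(n), expected_risk())
```

The paper's contribution replaces the probability measure in this picture with the credibility measure and the i.i.d. random samples with rough fuzzy samples, and then re-establishes the corresponding convergence results.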