The key theorem and the bounds on the rate of convergence of learning processes play important roles in statistical learning theory. To date, research on these topics has focused only on probability measure spaces, and the samples considered are assumed to be noise-free. In practice, this assumption does not always hold, owing to human or environmental factors. This paper proposes and proves, on credibility space, the key theorem of statistical learning theory for samples corrupted by noise. The bounds on the rate of uniform convergence of learning processes are then discussed for such noise-corrupted samples.