
Model Selection Criteria Based On Kullback Information Measures

Posted on: 2008-03-18 | Degree: Master | Type: Thesis
Country: China | Candidate: M Miao | Full Text: PDF
GTID: 2120360218955485 | Subject: Probability theory and mathematical statistics
Abstract/Summary:
Selecting an optimal model from a class of candidates is a critical issue in statistical modeling. During the past three decades, a number of model selection criteria have been proposed based on estimating Kullback's (Information Theory and Statistics, Dover, Mineola, NY, 1986, p. 5) directed divergence between the model generating the data and a fitted candidate model. The Akaike (Second International Symposium on Information Theory, Akademiai Kiado, Budapest, Hungary, 1973, pp. 267-281; IEEE Trans. Automat. Control AC-19 (1974) 716) information criterion, AIC, was the first of these. AIC is justified in a very general framework, and as a result offers a crude estimator of the directed divergence: one which exhibits a potentially high degree of negative bias in small-sample applications (Biometrika 76 (1989) 297). The "corrected" Akaike information criterion (Biometrika 76 (1989) 297), AIC_c, adjusts for this bias and consequently often outperforms AIC as a selection criterion. However, AIC_c is less broadly applicable than AIC, since its justification depends upon the structure of the candidate model.

Recently, model selection criteria have been proposed based on estimating Kullback's (Information Theory and Statistics, Dover, Mineola, NY, 1986, p. 6) symmetric divergence between the generating model and a fitted candidate model (Statist. Probab. Lett. 42 (1999) 333; Austral. New Zealand J. Statist. 46 (2004) 257). KIC and KIC_c are criteria devised to target the symmetric divergence. In small-sample applications, KIC and KIC_c often outperform AIC and AIC_c as selection criteria (J. Statist. Plann. Inference 134 (2005) 332-349).

McQuarrie and Tsai calculated the signal-to-noise ratios of these criteria. They argued that the linear penalty function of AIC makes it prone to overfitting, and obtained a new criterion, AIC_u, by strengthening the signal-to-noise ratio of AIC. In this paper, we devise a new criterion, KIC_u, in the linear regression framework by strengthening the signal-to-noise ratio of KIC in the same manner that McQuarrie and Tsai used to obtain AIC_u. We also propose that KIC_u serves as an approximately unbiased estimator of Kullback's directed divergence for nonlinear regression candidate models with normal errors. Moreover, KIC_u performs favorably against AIC_u.
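For reference, the divergences discussed above and the standard penalty forms of the criteria from the cited references can be written as follows. This is a sketch of the textbook definitions only; the exact small-sample forms of KIC_c and KIC_u are developed in the full text of the thesis and are not reproduced here.

```latex
% Kullback's directed divergence between the generating density f(y | \theta_0)
% and a candidate density f(y | \theta), and the symmetric divergence J:
I(\theta_0;\theta) = \mathrm{E}_{\theta_0}\!\left[\log \frac{f(y\mid\theta_0)}{f(y\mid\theta)}\right],
\qquad
J(\theta_0,\theta) = I(\theta_0;\theta) + I(\theta;\theta_0).

% For a candidate model with k estimated parameters, maximized likelihood
% L(\hat\theta), and sample size n:
\mathrm{AIC}   = -2\log L(\hat\theta) + 2k, \qquad
\mathrm{AIC}_c = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1}, \qquad
\mathrm{KIC}   = -2\log L(\hat\theta) + 3k.
```

A minimal numerical sketch of how such criteria are compared in the linear regression framework is given below. The simulation setup (polynomial candidate models, sample size, noise level) is purely illustrative and not taken from the thesis, and KIC_u is omitted because its exact form is defined only in the full text.

```python
import numpy as np

def gaussian_criteria(rss, n, k):
    """AIC, AICc, and KIC for a model with normal errors.

    rss: residual sum of squares; n: sample size;
    k: number of estimated parameters (coefficients + error variance).
    Uses -2 log L = n * log(rss / n) up to an additive constant, with
    AIC = -2 log L + 2k, KIC = -2 log L + 3k (Cavanaugh, 1999), and the
    Hurvich-Tsai (1989) small-sample correction for AICc.
    """
    neg2loglik = n * np.log(rss / n)
    aic = neg2loglik + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    kic = neg2loglik + 3 * k
    return aic, aicc, kic

rng = np.random.default_rng(0)
n = 25                                        # small sample: corrections matter
x = np.sort(rng.uniform(-1.0, 1.0, n))
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.5, n)   # true order is 2

for order in range(1, 6):                     # candidate polynomial orders
    X = np.vander(x, order + 1)               # design matrix incl. intercept
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = res[0] if res.size else float(np.sum((y - X @ beta) ** 2))
    k = order + 2                             # coefficients + error variance
    aic, aicc, kic = gaussian_criteria(rss, n, k)
    print(f"order {order}: AIC={aic:7.2f}  AICc={aicc:7.2f}  KIC={kic:7.2f}")
```

Each candidate model is scored and the minimizer is selected. Because KIC's penalty grows as 3k rather than 2k, it penalizes extra parameters more heavily, which is consistent with the overfitting tendency of AIC's linear penalty noted above.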
Keywords/Search Tags: AIC, KIC, Kullback-Leibler information, Linear regression, Nonlinear regression